--- gammapy-1.3/.codacy.yml ---

---
engines:
  pylint:
    enabled: true
    python_version: 3

--- gammapy-1.3/.github/ISSUE_TEMPLATE/bug_report.md ---

---
name: Bug report
about: Create a report to help us improve Gammapy!
title: ''
labels: bug
assignees: ''

---

**Gammapy version**

Please use `gammapy info` or `python -c 'import gammapy; print(gammapy.__version__)'` to check which version of Gammapy you are using, and put that information here. If the error is possibly related to the version of Python, Numpy, Astropy or another package, please also include that version information.

Are you using the latest stable version of Gammapy? (see https://gammapy.org/news.html) If not, can you please upgrade and try again with the latest stable version?

**Bug description**

A short description of what the bug is (usually 1-2 sentences).

**Expected behavior**

A clear and concise description of what you expected to happen.

**To Reproduce**

Steps to reproduce the behavior. See https://matthewrocklin.com/blog/work/2018/02/28/minimal-bug-reports

If you can't share an example that lets us reproduce the bug, it's likely that we'll need to start a guessing game and ask you for more information. The next best thing to a reproducible example is to copy & paste the code or command that caused the error, and the output you got (e.g. the full Python traceback if there's an exception).

**Other information**

Any other information you think will be useful for us to fix the issue can go here.

--- gammapy-1.3/.github/ISSUE_TEMPLATE/feature_request.md ---

---
name: Feature request
about: Suggest an idea to make Gammapy better!
title: ''
labels: feature-request
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.

--- gammapy-1.3/.github/PULL_REQUEST_TEMPLATE.md ---

**Description**

This pull request ...
**Dear reviewer**

--- gammapy-1.3/.github/dependabot.yml ---

version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "monthly"

--- gammapy-1.3/.github/workflows/cffconvert.yml ---

name: cffconvert

on:
  push:
    paths:
      - CITATION.cff

jobs:
  validate:
    name: "cffconvert"
    runs-on: ubuntu-latest
    steps:
      - name: Install eossr
        run: |
          pip install eossr
      - name: Check out a copy of the repository
        uses: actions/checkout@v4
      - name: Check whether the citation metadata from CITATION.cff is valid
        uses: citation-file-format/cffconvert-github-action@2.0.0
        with:
          args: "--validate"
      - name: Convert CITATION.cff to Codemeta metadata format
        uses: citation-file-format/cffconvert-github-action@2.0.0
        if: success()
        with:
          args: "--infile ./CITATION.cff --format codemeta --outfile codemeta.json"
      - name: Fix codemeta.json
        if: success()
        run: |
          cd dev && python codemeta.py
      - name: Validate codemeta.json
        if: success()
        run: |
          eossr-metadata-validator codemeta.json
      - name: commit changes
        uses: stefanzweifel/git-auto-commit-action@v5.0.1
        if: success()
        with:
          commit_author: GitHub Actions
          commit_message: commit metadata files

--- gammapy-1.3/.github/workflows/ci.yml ---

name: CI

on: [push, pull_request]

jobs:
  lint:
    name: Run pre-commit hooks
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'
      - name: Install pre-commit
        run: pip install pre-commit
      - name: Run pre-commit
        run: pre-commit run
  ci-runs:
    name: ${{ matrix.os }}, ${{ matrix.tox_env }}
    runs-on: ${{ matrix.os }}
    continue-on-error: ${{ matrix.allowed_fail }}
    env:
      PYTEST_ADDOPTS: --color=yes -n auto --dist=loadscope
    strategy:
      fail-fast: true
      matrix:
        include:
          - os: ubuntu-latest
            python: '3.11'
            tox_env: 'py311-test-alldeps_noray'
            gammapy_data_path: /home/runner/work/gammapy/gammapy/gammapy-datasets/dev
            allowed_fail: false
          - os: ubuntu-latest
            python: '3.10'
            tox_env: 'py310-test-alldeps-cov'
            gammapy_data_path: /home/runner/work/gammapy/gammapy/gammapy-datasets/dev
            allowed_fail: false
          - os: macos-latest
            python: '3.11'
            tox_env: 'py311-test'
            gammapy_data_path: /Users/runner/work/gammapy/gammapy/gammapy-datasets/dev
            allowed_fail: false
          - os: macos-14
            python: '3.11'
            tox_env: 'py311-test'
            allowed_fail: false
          - os: windows-latest
            python: '3.11'
            tox_env: 'py311-test-alldeps_noray'
            gammapy_data_path: D:\a\gammapy\gammapy\gammapy-datasets\dev
            allowed_fail: false
          - os: ubuntu-latest
            python: '3.12'
            tox_env: 'py312-test-alldeps_noray'
            gammapy_data_path: /home/runner/work/gammapy/gammapy/gammapy-datasets/dev
            allowed_fail: false
          - os: ubuntu-latest
            python: '3.10'
            tox_env: 'py310-test'
            allowed_fail: false
          - os: ubuntu-latest
            python: '3.13'
            tox_env: 'py313-test'
            allowed_fail: false
          - os: ubuntu-latest
            python: '3.10'
            tox_env: 'codestyle'
            allowed_fail: false
          - os: ubuntu-latest
            python: '3.9'
            tox_env: 'py39-test-alldeps_noray-astropy50-numpy121'
            allowed_fail: false
          - os: ubuntu-latest
            python: '3.9'
            tox_env: 'oldestdeps'
            allowed_fail: false
          - os: ubuntu-latest
            python: '3.11'
            tox_env: 'devdeps'
            allowed_fail: true
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Set up Python ${{ matrix.python }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python }}
      - name: Install base dependencies
        run: |
          python -m pip install --upgrade pip
          python -m pip install tox
      - name: download datasets
        if: ${{ matrix.gammapy_data_path }}
        run: |
          python -m pip install tqdm requests
          python -m pip install -e .
          gammapy download datasets
      - name: Print Python, pip, and tox versions
        run: |
          python -c "import sys; print(f'Python {sys.version}')"
          python -c "import pip; print(f'pip {pip.__version__}')"
          python -c "import tox; print(f'tox {tox.__version__}')"
      - name: Run tests
        if: ${{ !matrix.gammapy_data_path }}
        run: tox -e ${{ matrix.tox_env }} -- -n auto
      - name: Run tests with data
        if: ${{ matrix.gammapy_data_path }}
        env:
          GAMMAPY_DATA: ${{ matrix.gammapy_data_path }}
        run: tox -e ${{ matrix.tox_env }} -- -n auto
      - name: Upload coverage to codecov
        if: "contains(matrix.tox_env, '-cov')"
        uses: codecov/codecov-action@v4
        with:
          file: ./coverage.xml
          verbose: true
  sphinx:
    name: Linux python 3.9 sphinx all-deps
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -l {0}
    env:
      PYTEST_ADDOPTS: --color=yes -n auto --dist=loadscope
      GAMMAPY_DATA: /home/runner/work/gammapy/gammapy/gammapy-datasets/dev
    steps:
      - name: Check out repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Set up Python ${{ matrix.python }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python }}
      - name: Install base dependencies
        run: |
          python -m pip install --upgrade pip
          python -m pip install tox
      - name: download datasets
        run: |
          python -m pip install tqdm requests
          python -m pip install -e .
          gammapy download datasets
      - name: test build docs
        run: |
          tox -e build_docs -- -j auto
      - name: check links
        continue-on-error: true
        run: |
          tox -e linkcheck -- -j auto
  conda-build:
    name: Linux python 3.9 conda-build all-deps
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -l {0}
    steps:
      - name: checkout repo
        uses: actions/checkout@v4
      - name: create and activate env
        uses: mamba-org/setup-micromamba@v2
        with:
          environment-file: environment-dev.yml
          cache-downloads: true
      - name: install gammapy
        run: |
          pip install -e .
      - name: test conda build
        run: |
          make clean
          conda install conda-build
          conda info
          conda --version
          conda build --version
          python setup.py bdist_conda

--- gammapy-1.3/.github/workflows/dispatch.yml ---

name: Build dev docs

on: push

jobs:
  dispatch-docs:
    if: github.repository_owner == 'gammapy'
    runs-on: ubuntu-latest
    steps:
      - name: Dispatch Gammapy Docs
        uses: peter-evans/repository-dispatch@v3
        with:
          token: ${{ secrets.REMOTE_DISPATCH }}
          repository: gammapy/gammapy-docs
          event-type: dev-docs

--- gammapy-1.3/.github/workflows/dispatch_benchmark.yml ---

name: Run profiler on benchmarks

on:
  pull_request:
    types:
      - closed

jobs:
  dispatch-benchmarks:
    if: ${{ (github.repository_owner == 'gammapy') && (github.event.pull_request.merged == true) && (contains(github.event.pull_request.labels.*.name, 'trigger-benchmarks')) }}
    runs-on: ubuntu-latest
    steps:
      - name: Dispatch Gammapy Benchmarks
        uses: peter-evans/repository-dispatch@v3
        with:
          token: ${{ secrets.REMOTE_DISPATCH }}
          repository: gammapy/gammapy-benchmarks
          event-type: dev-benchmarks

--- gammapy-1.3/.github/workflows/greetings.yml ---

name: Greetings

on: [pull_request, issues]

jobs:
  greeting:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/first-interaction@v1
        continue-on-error: true
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          issue-message: 'Merci! We will respond to your issue shortly. In the meantime, try `import gammapy; gammapy.song()`'
          pr-message: 'Graçias! We will review your pull request shortly. In the meantime, try `import gammapy; gammapy.song(karaoke=True)`'

--- gammapy-1.3/.github/workflows/release.yml ---

name: Release

on:
  push:
    tags-ignore:
      - 'v*.dev'

jobs:
  release-pypi:
    if: github.repository_owner == 'gammapy'
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Update tags
        run: git fetch --tags --force
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: 3.9
      - name: Install dependencies
        run: |
          python --version
          pip install -U build
          python -m build --sdist
      - name: Publish package
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          user: __token__
          password: ${{ secrets.PYPI_TOKEN }}
  release-github:
    if: github.repository_owner == 'gammapy' && !contains(github.ref_name, 'rc')
    needs: release-pypi
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Release
        uses: softprops/action-gh-release@v2
        with:
          body: |
            Gammapy is a Python package for gamma-ray astronomy.

            See [changelog for this release](https://github.com/gammapy/gammapy/blob/main/docs/release-notes/${{ github.ref_name }}.rst).
  dispatch-docs:
    if: github.repository_owner == 'gammapy'
    needs: release-pypi
    runs-on: ubuntu-latest
    steps:
      - name: Dispatch Gammapy Docs
        uses: peter-evans/repository-dispatch@v3
        with:
          token: ${{ secrets.REMOTE_DISPATCH }}
          repository: gammapy/gammapy-docs
          event-type: release
          client-payload: '{"release": "${{ github.ref_name }}"}'
  dispatch-webpage:
    if: github.repository_owner == 'gammapy'
    needs: release-github
    runs-on: ubuntu-latest
    steps:
      - name: Dispatch Gammapy Webpage
        uses: peter-evans/repository-dispatch@v3
        with:
          token: ${{ secrets.REMOTE_DISPATCH }}
          repository: gammapy/gammapy-webpage
          event-type: release
          client-payload: '{"release": "${{ github.ref_name }}"}'

--- gammapy-1.3/.gitignore ---

# Compiled files
*.py[cod]
*.a
*.o
*.so
__pycache__

# Ignore .c files by default to avoid including generated code. If you want to
# add a non-generated .c extension, use `git add -f filename.c`.
*.c

# Other generated files
*/version.py
*/cython_version.py
htmlcov
.coverage*
.coverage.subprocess
.hypothesis

# Sphinx
_build

# Pytest
.cache
v
.pytest_cache
.tox
.tmp

# Packages/installer info
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
distribute-*.tar.gz
.eggs

# Other
.*.swp
.*.swo
.*.swn
*~

# Mac OSX
.DS_Store
.vscode
.idea
.settings
.pydevproject
.project
htmlcov
MANIFEST

# notebooks
*.ipynb_checkpoints
docs/_static/notebooks
docs/_downloads/notebooks-dev.tar
docs/user-guide/model-gallery
docs/tutorials/
examples/tutorials/analysis-3d/crab-3datasets
examples/tutorials/starting/analysis_1
examples/tutorials/analysis-3d/analysis_3d
examples/tutorials/analysis-3d/event_sampling
examples/tutorials/analysis-1d/hli_spectrum_analysis
examples/tutorials/*/*.yaml
examples/tutorials/*.yaml
examples/tutorials/*/*.dat
examples/tutorials/*.dat
examples/tutorials/*/*.reg
gammapy-datasets
temp
gammapy/gammapy.cfg
docs/_generated
docs/gen_modules
docs/api
docs/sg_execution_times.rst
.coverage
dev/crab

# Data files generated in docs building
#!tutorials/images/*
#*.png
*.fits
*.fits.gz
*.root

# Linting
.trunk
.flake8
.isort.cfg
.shellcheckrc
.hadolint.yaml
.markdownlint.yaml

# sphinx gallery
docs/tutorials/*/*.md5

--- gammapy-1.3/.mailmap ---

Adam Ginsburg Adam Ginsburg (keflavich)
Alexis de Almeida Coutinho Alexis de Almeida Coutinho
Anne Lemière Anne Lemière
Anne Lemière Anne Lemière
Anne Lemière Anne Lemière
Andrew Chen mealworm
Arnau Aguasca-Cabot aaguasca
Arnau Aguasca-Cabot aaguasca
Arjun Voruganti vorugantia
Mathieu de Bony de Lavergne Mathieu de Bony
Atreyee Sinha Atreyee SINHA
Atreyee Sinha AtreyeeS
Atreyee Sinha Atreyee Sinha <32677370+AtreyeeS@users.noreply.github.com>
Atreyee Sinha Atreyee Sinha
Atreyee Sinha Atreyee Sinha
Atreyee Sinha AtreyeeS
Atreyee Sinha AtreyeeS
Atreyee Sinha AtreyeeS
Atreyee Sinha AtreyeeS
Atreyee Sinha Atreyee Sinha
Axel Donath Axel Donath
Axel Donath Axel Donath
Axel Donath Axel Donath
Axel Donath Axel Donath
Axel Donath adonath
Axel Donath adonath
Axel Donath adonath
Axel Donath Axel Donath
Axel Donath axel
Brigitta M Sipőcz Brigitta Sipocz
Brigitta M Sipőcz Brigitta Sipocz
Brigitta M Sipőcz Brigitta Sipőcz
Brigitta M Sipőcz Brigitta Sipőcz
Bruno Khélifi Bruno Khelifi
Bruno Khélifi Bruno Khelifi
Bruno Khélifi khelifi
Bruno Khélifi Bruno KHÉLIFI
Bruno Khélifi Bruno KHÉLIFI
Cosimo Nigro cosimoNigro
Cosimo Nigro cnigro
José Enrique Ruiz Bultako
José Enrique Ruiz Jose Enrique Ruiz
David Carreto Fidalgo David Fidalgo
David Carreto Fidalgo dcfidalgo
Debanjan Bose debaice
Dirk Lennarz Dirk Lennarz
Dirk Lennarz Dirk Lennarz
Dirk Lennarz Dirk Lennarz
Dirk Lennarz Dirk Lennarz
Erik M. Bray Erik Bray
Eric O. Lebigot Eric O. LEBIGOT (EOL)
Fabio Pintore fabiopintore <39336833+fabiopintore@users.noreply.github.com>
Fabio Pintore Mac_di_Fabio
Fabio Pintore Fabio Pintore
Fabio Pintore Fabio Pintore
Fabio Pintore Fabio Pintore
Fabio Acero facero
Fabio Acero facero
Fabio Acero facero
Gabriel Emery Emery Gabriel
Gabriel Emery Gabriel Emery <35919817+gabemery@users.noreply.github.com>
Hugo van Kemenade Hugo
Jalel Eddine Hajlaoui Jaleleddine
Jason Watson watsonjj
Jean-Philippe Lenain jlenain
Jean-Philippe Lenain Jean-Philippe Lenain
Johannes King Johannes King
Johannes King kingj90
Johannes King kingj90
Jonathan D. Harris JonathanDHarris
José Luis Contreras Gonzalez contrera
Julien Lefaucheur hijlk
Julien Lefaucheur Julien Lefaucheur
Julien Lefaucheur hijlk
Julien Lefaucheur jlk
Maximilian Linhoff Maximilian Nöthe
Kai Brügge Kai B
Kai Brügge Kai Bruegge
Katrin Streil Katrin Streil <60615019+katrinstreil@users.noreply.github.com>
Katrin Streil katrinstreil
Kirsty Feijen Astro-Kirsty
Kirsty Feijen AstroKirsty
Lars Mohrmann Lars
Lars Mohrmann Lars Mohrmann
Larry Bradley larrybradley
Laura Olivera-Nieto lauraolivera
Laura Olivera-Nieto LauraOlivera
Laura Olivera-Nieto Laura Olivera
Dimitri Papadopoulos Orfanos <3234522+DimitriPapadopoulos@users.noreply.github.com> Dimitri Papadopoulos <3234522+DimitriPapadopoulos@users.noreply.github.com>
Léa Jouvin Lea Jouvin
Léa Jouvin Jouvin
Léa Jouvin JouvinLea
Luca Giunti luca-giunti
Luca Giunti luca-giunti <47325742+luca-giunti@users.noreply.github.com>
Luigi Tibaldo L. Tibaldo
Manuel Paz Arribas mapaz
Manuel Paz Arribas mapazarr
Matthew Craig Matt Craig
Matthew Wood Matthew Dunseth Wood
Matthias Wegenmat Matthias Wegen
Matthias Wegenmat Matthias Wegen
Matthias Wegenmat Matthias Wegen
Matthias Wegenmat wegenmat
Maxime Regeard MRegeard
Quentin Remy QRemy
Quentin Remy Quentin Remy
Régis Terrier Régis Terrier
Régis Terrier Régis Terrier
Régis Terrier Terrier
Rubén López-Coto rlopezcoto
Sam Carter <43832342+samcarter@users.noreply.github.com> samcarter <43832342+samcarter@users.noreply.github.com>
Sebastian Panny sebastianpanny
Simone Mender SimoneMender
Simone Mender Simone
Stefan Klepser klepser
Thomas Armstrong Thomas Armstrong
Thomas Armstrong thomasarmstrong
Thomas Vuillaume Vuillaume
Tyler Cahill <83472373+tmcahill@users.noreply.github.com> tmcahill <83472373+tmcahill@users.noreply.github.com>
Oscar Blanch Bigas oscarblanchbigas
Ellis Owen eowen
Ellis Owen ellisowen
Roberta Zanin robertazanin
Roberta Zanin robertazanin
José Vinícius de Miranda Cardoso Ze Vinicius

--- gammapy-1.3/.pre-commit-config.yaml ---

repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.7.3
    hooks:
      - id: ruff
        name: ruff linting
        args: ["--config=pyproject.toml", "--fix", "--show-fixes"]
      - id: ruff-format
        name: ruff formatting
        args: ["--config=pyproject.toml"]

--- gammapy-1.3/CITATION ---

If you use Gammapy for work/research presented in a publication (whether
directly, or as a dependency to another package), we recommend and encourage
the following citation scheme:

- refer to the Gammapy paper as the general citation: A&A, 678, A157 (2023) -
  the BibTeX entry is given below
- AND cite the version of Gammapy used, with the DOI 10.5281/zenodo.4701488

Recommended BibTeX entry for the above citation:

@article{gammapy:2023,
    author = {{Donath}, Axel and {Terrier}, R\'egis and {Remy}, Quentin and
    {Sinha}, Atreyee and {Nigro}, Cosimo and {Pintore}, Fabio and
    {Kh\'elifi}, Bruno and {Olivera-Nieto}, Laura and {Ruiz}, Jose Enrique and
    {Br\"ugge}, Kai and {Linhoff}, Maximilian and {Contreras}, Jose Luis and
    {Acero}, Fabio and {Aguasca-Cabot}, Arnau and {Berge}, David and
    {Bhattacharjee}, Pooja and {Buchner}, Johannes and {Boisson}, Catherine and
    {Carreto Fidalgo}, David and {Chen}, Andrew and
    {de Bony de Lavergne}, Mathieu and {de Miranda Cardoso}, Jos\'e Vinicius and
    {Deil}, Christoph and {F\"u\ss{}ling}, Matthias and {Funk}, Stefan and
    {Giunti}, Luca and {Hinton}, Jim and {Jouvin}, L\'ea and
    {King}, Johannes and {Lefaucheur}, Julien and {Lemoine-Goumard}, Marianne and
    {Lenain}, Jean-Philippe and {L\'opez-Coto}, Rub\'en and {Mohrmann}, Lars and
    {Morcuende}, Daniel and {Panny}, Sebastian and {Regeard}, Maxime and
    {Saha}, Lab and {Siejkowski}, Hubert and {Siemiginowska}, Aneta and
    {Sip\"ocz}, Brigitta M.
    and {Unbehaun}, Tim and {van Eldik}, Christopher and
    {Vuillaume}, Thomas and {Zanin}, Roberta},
    title = {Gammapy: A Python package for gamma-ray astronomy},
    DOI = "10.1051/0004-6361/202346488",
    url = "https://doi.org/10.1051/0004-6361/202346488",
    journal = {A\&A},
    year = 2023,
    month = 10,
    volume = 678,
    pages = "A157",
}

--- gammapy-1.3/CITATION.cff ---

# Metadata for citation of this software according to the CFF format (https://citation-file-format.github.io/)
cff-version: 1.2.0
title: 'Gammapy: Python toolbox for gamma-ray astronomy'
abstract: 'Gammapy analyzes gamma-ray data and creates sky images, spectra and lightcurves, from event lists and instrument response information; it can also determine the position, morphology and spectra of gamma-ray sources. It is used to analyze data from H.E.S.S., Fermi-LAT, HAWC, and the Cherenkov Telescope Array (CTA).'
type: software
message: "If you use this software, please cite it using the metadata from this file."
url: "https://gammapy.org/"
date-released: 2024-11-26
keywords:
  - Astronomy
  - "Gamma-rays"
  - "Data analysis"
version: v1.3
repository-code: "https://github.com/gammapy/gammapy"
license: BSD-3-Clause
contact:
  - email: GAMMAPY-COORDINATION-L@IN2P3.FR
    name: "Coordination committee of the Gammapy project"
identifiers:
  - description: "Zenodo repository of Gammapy"
    type: doi
    value: 10.5281/zenodo.4701488
authors:
  - given-names: Fabio
    family-names: Acero
    affiliation: "Université Paris-Saclay, Université Paris Cité, CEA, CNRS, AIM, F-91191 Gif-sur-Yvette, France"
    orcid: "https://orcid.org/0000-0002-6606-2816"
  - given-names: Arnau
    family-names: Aguasca-Cabot
    affiliation: "Departament de Física Quàntica i Astrofísica, Institut de Ciències del Cosmos, Universitat de Barcelona, IEEC-UB, Barcelona, Spain"
    orcid: "https://orcid.org/0000-0001-8816-4920"
  - given-names: Juan
    family-names: Bernete
    affiliation: "Centro de Investigaciones Energéticas Medioambientales y Tecnológicas (CIEMAT), Madrid, Spain"
    orcid: "https://orcid.org/0000-0002-8108-7552"
  - given-names: Noah
    family-names: Biederbeck
    affiliation: "Astroparticle Physics Group, TU Dortmund University, Germany"
    orcid: "https://orcid.org/0000-0003-3708-9785"
  - given-names: Julia
    family-names: Djuvsland
    affiliation: "University of Bergen, Norway and Max Planck Institute for Nuclear Physics, Heidelberg, Germany"
    orcid: "https://orcid.org/0000-0002-6488-8219"
  - given-names: Axel
    family-names: Donath
    affiliation: "Center for Astrophysics | Harvard & Smithsonian, USA"
    orcid: "https://orcid.org/0000-0003-4568-7005"
  - given-names: Kirsty
    family-names: Feijen
    affiliation: "Université Paris Cité, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France"
    orcid: "https://orcid.org/0000-0003-1476-3714"
  - given-names: Stefan
    family-names: Fröse
    affiliation: "Astroparticle Physics Group, TU Dortmund University, Germany"
    orcid: "https://orcid.org/0000-0003-1832-4129"
  - given-names: Claudio
    family-names: Galelli
    affiliation: "Laboratoire Univers et Théories, Observatoire de Paris, Université PSL, Université Paris Cité, CNRS, F-92190 Meudon, France"
    orcid: "https://orcid.org/0000-0002-7372-9703"
  - given-names: Bruno
    family-names: Khélifi
    affiliation: "Université Paris Cité, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France"
    orcid: "https://orcid.org/0000-0001-6876-5577"
  - given-names: Jana
    family-names: Konrad
    affiliation: "Astroparticle Physics Group, TU Dortmund University, Germany"
    orcid: "https://orcid.org/0009-0000-1257-4771"
  - given-names: Paula
    family-names: Kornecki
    affiliation: "Laboratoire Univers et Théories, Observatoire de Paris, Université PSL, Université Paris Cité, CNRS, F-92190 Meudon, France"
    orcid: "https://orcid.org/0000-0002-2706-7438"
  - given-names: Maximilian
    family-names: Linhoff
    affiliation: "Astroparticle Physics Group, TU Dortmund University, Germany"
    orcid: "https://orcid.org/0000-0001-7993-8189"
  - given-names: Kurt
    family-names: McKee
    affiliation: "University of Chicago, USA"
    orcid: "https://orcid.org/0000-0002-8547-8489"
  - given-names: Simone
    family-names: Mender
    affiliation: "Astroparticle Physics Group, TU Dortmund University, Germany"
    orcid: "https://orcid.org/0000-0002-0755-0609"
  - given-names: Lars
    family-names: Mohrmann
    affiliation: "Max Planck Institute for Nuclear Physics, Heidelberg, Germany"
    orcid: "https://orcid.org/0000-0002-9667-8654"
  - given-names: Daniel
    family-names: Morcuende
    affiliation: "Instituto de Astrofísica de Andalucía-CSIC, Granada, Spain"
    orcid: "https://orcid.org/0000-0001-9400-0922"
  - given-names: Laura
    family-names: Olivera-Nieto
    affiliation: "Max Planck Institute for Nuclear Physics, Heidelberg, Germany"
    orcid: "https://orcid.org/0000-0002-9105-0518"
  - given-names: Michele
    family-names: Peresano
    affiliation: "Max-Planck-Institut für Physik, Boltzmannstraße 8, 85748 Garching bei München, Germany"
    orcid: "https://orcid.org/0000-0002-7537-7334"
  - given-names: Fabio
    family-names: Pintore
    affiliation: "INAF/IASF PALERMO, Italy"
    orcid: "https://orcid.org/0000-0002-3869-2925"
  - given-names: Michael
    family-names: Punch
    affiliation: "Université Paris Cité, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France"
    orcid: "https://orcid.org/0000-0002-4710-2165"
  - given-names: Maxime
    family-names: Regeard
    affiliation: "Université Paris Cité, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France"
    orcid: "https://orcid.org/0000-0002-3844-6003"
  - given-names: Quentin
    family-names: Remy
    affiliation: "Max Planck Institute for Nuclear Physics, Heidelberg, Germany"
    orcid: "https://orcid.org/0000-0002-8815-6530"
  - given-names: Gerrit
    family-names: Roellinghoff
    affiliation: "Friedrich-Alexander-Universität Erlangen-Nürnberg, ECAP, Nikolaus-Fiebiger-Str. 2, 91058 Erlangen, Germany"
  - given-names: Atreyee
    family-names: Sinha
    affiliation: "EMFTEL department and IPARCOS, Universidad Complutense de Madrid, 28040 Madrid, Spain"
    orcid: "https://orcid.org/0000-0002-9238-7163"
  - given-names: "Brigitta M"
    family-names: Sipőcz
    affiliation: "Caltech/IPAC, USA"
    orcid: "https://orcid.org/0000-0002-3713-6337"
  - given-names: Hanna
    family-names: Stapel
    affiliation: "Université Paris Cité, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France"
  - given-names: Katrin
    family-names: Streil
    affiliation: "Friedrich-Alexander-Universität Erlangen-Nürnberg, ECAP, Nikolaus-Fiebiger-Str. 2, 91058 Erlangen, Germany"
    orcid: "https://orcid.org/0009-0005-7886-1825"
  - given-names: Régis
    family-names: Terrier
    affiliation: "Université Paris Cité, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France"
    orcid: "https://orcid.org/0000-0002-8219-4667"
  - given-names: Tim
    family-names: Unbehaun
    affiliation: "Friedrich-Alexander-Universität Erlangen-Nürnberg, ECAP, Nikolaus-Fiebiger-Str. 2, 91058 Erlangen, Germany"
    orcid: "https://orcid.org/0000-0002-7378-4024"
  - given-names: Samantha
    family-names: Wong
    affiliation: "Physics Department, McGill University, Montreal, QC H3A 2T8, Canada"
    orcid: "https://orcid.org/0000-0002-2730-2733"
  - given-names: Pei
    family-names: Yu
    affiliation: "Université Paris Cité, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France"
preferred-citation:
  authors:
    - family-names: "The Gammapy team"
  title: "Gammapy: A Python package for gamma-ray astronomy"
  type: article
  doi: 10.1051/0004-6361/202346488

--- gammapy-1.3/CODE_OF_CONDUCT.md ---

All Gammapy community members are expected to abide by the [Gammapy Project Code of Conduct](https://gammapy.org/CoC.html).

--- gammapy-1.3/CONTRIBUTING.md ---

Contributing to Gammapy
=======================

Note: this text is freely inspired by the `CONTRIBUTING.md` file of the [Astropy](https://www.astropy.org/) project.

Reporting Issues
----------------

When opening an issue to report a problem or request a new feature, please try to provide a minimal code example that reproduces the issue, along with details of the operating system and the `Gammapy` version you are using.

Contributing Code and Documentation
-----------------------------------

So you are interested in contributing to the Gammapy Project? Excellent! We love contributions! Gammapy is open source, built on open source, and we'd love to have you hang out in our community.

How to Contribute, Best Practices
---------------------------------

Most contributions to Gammapy are done via [pull requests](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests) from GitHub users' forks of the [gammapy repository](https://github.com/gammapy/gammapy). If you are new to this style of development, you will want to read over our [Developer guide](https://docs.gammapy.org/dev/development/intro.html).

Promoting contributions
-----------------------

Even in the open-source landscape, promoting everyone's work is not only *natural* but also a duty for the parties leading the Gammapy project. We take care that **each contribution is properly credited in every product of the Gammapy project**.

An [Authorship Policy](https://github.com/gammapy/gammapy/blob/master/docs/development/pigs/pig-024.rst) has been settled for each type of product (releases, papers, conferences). Note that each code release (LTS, feature release or bug release) is published with an official DOI (through a [Zenodo deposit](https://zenodo.org/)) that you can use as an Open Science publication. This publication is associated with a list of *authors* stored in our metadata files (``CITATION.cff`` and ``codemeta.json``).

In order to properly build the authors list with you as a contributor, Gammapy uses the so-called **Developer Certificate of Origin** (DCO). This is a lightweight way for contributors to certify that they wrote, or otherwise have the right to submit, the code they are contributing to the project. The DCO used is the one from the Linux Foundation and can be found [below](#markdown-header-gammapy-developer-certification-of-origin).
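In practice, the DCO is certified per commit. As a minimal sketch (assuming the standard `git` sign-off mechanism rather than any Gammapy-specific tooling; the commit message and identity below are placeholders), a contributor signs off each commit like this:

```bash
# -s / --signoff appends a "Signed-off-by:" trailer built from your
# configured git identity (user.name / user.email).
git commit -s -m "Fix typo in the spectrum tutorial"

# The recorded commit message then ends with a trailer such as:
#
#   Signed-off-by: Jane Doe <jane.doe@example.org>
```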
The practical procedure for accepting our DCO is described [here](https://docs.gammapy.org/dev/development/intro.html#Acceptation-of-the-Developer-Certificate-of-Origin-(DCO)).

Gammapy Developer Certification of Origin
-----------------------------------------

> Developer Certificate of Origin
> Version 1.1
>
> Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
>
> Everyone is permitted to copy and distribute verbatim copies of this
> license document, but changing it is not allowed.
>
>
> Developer's Certificate of Origin 1.1
>
> By making a contribution to this project, I certify that:
>
> (a) The contribution was created in whole or in part by me and I
>     have the right to submit it under the open source license
>     indicated in the file; or
>
> (b) The contribution is based upon previous work that, to the best
>     of my knowledge, is covered under an appropriate open source
>     license and I have the right under that license to submit that
>     work with modifications, whether created in whole or in part
>     by me, under the same open source license (unless I am
>     permitted to submit under a different license), as indicated
>     in the file; or
>
> (c) The contribution was provided directly to me by some other
>     person who certified (a), (b) or (c) and I have not modified
>     it.
>
> (d) I understand and agree that this project and the contribution
>     are public and that a record of the contribution (including all
>     personal information I submit with it, including my sign-off) is
>     maintained indefinitely and may be redistributed consistent with
>     this project or the open source license(s) involved.

--- gammapy-1.3/LICENSE.rst ---

Copyright (c) 2014, Gammapy developers
All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
* Neither the name of the Gammapy Team nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
--- gammapy-1.3/LONG_DESCRIPTION.rst ---

* Webpage: https://gammapy.org
* Documentation: https://docs.gammapy.org
* Code: https://github.com/gammapy/gammapy

Gammapy is an open-source Python package for gamma-ray astronomy built on Numpy, Scipy and Astropy. It is the base library for the science analysis tools of the Cherenkov Telescope Array Observatory and can also be used to analyse data from existing gamma-ray telescopes.

--- gammapy-1.3/MANIFEST.in ---

include README.rst
include LONG_DESCRIPTION.rst
include environment-dev.yml
include setup.cfg
include LICENSE.rst
include pyproject.toml
recursive-include gammapy *.pyx *.c
recursive-include docs *
recursive-include examples *
recursive-include dev *

--- gammapy-1.3/Makefile ---

# Makefile with some convenient quick ways to do common things

PROJECT = gammapy
CYTHON ?= cython

version = dev
release = $(version)

help:
	@echo ''
	@echo ' make targets:'
	@echo ''
	@echo '     help             Print this help message (the default)'
	@echo ''
	@echo '     clean            Remove generated files'
	@echo '     clean-repo       Remove all untracked files and directories (use with care!)'
	@echo ''
	@echo '     test             Run pytest'
	@echo '     test-cov         Run pytest with coverage'
	@echo ''
	@echo '     docs-sphinx      Build docs (Sphinx only)'
	@echo '     docs-show        Open local HTML docs in browser'
	@echo ''
	@echo '     trailing-spaces  Remove trailing spaces at the end of lines in *.py files'
	@echo '     black            Run black code formatter'
	@echo '     isort            Run isort code formatter to sort imports'
	@echo '     polish           Run trailing-spaces, black and isort'
	@echo ''
	@echo '     flake8           Run flake8 static code analysis'
	@echo '     pylint           Run pylint static code analysis'
	@echo '     pydocstyle       Run docstring checks'
	@echo ''
	@echo ' Note that most things are done via `python setup.py`, we only use'
	@echo ' make for things that are not trivial to execute via `setup.py`.'
	@echo ''
	@echo ' setup.py commands:'
	@echo ''
	@echo '     python setup.py --help-commands'
	@echo '     python setup.py install'
	@echo '     python setup.py bdist_conda'
	@echo '     python setup.py develop'
	@echo ''
	@echo ' To get info about your Gammapy installation and setup run this command'
	@echo ''
	@echo '     gammapy info'
	@echo ''
	@echo ' For this to work, Gammapy needs to be installed and on your PATH.'
	@echo ' If it is not, then use this equivalent command:'
	@echo ''
	@echo '     python -m gammapy info'
	@echo ''
	@echo ' More info:'
	@echo ''
	@echo ' * Gammapy code: https://github.com/gammapy/gammapy'
	@echo ' * Gammapy docs: https://docs.gammapy.org/'
	@echo ''
	@echo ' Most common commands to hack on Gammapy:'
	@echo ''
	@echo '     make help          Print help message with all commands'
	@echo '     pip install -e .   Install Gammapy in editable mode'
	@echo '     gammapy info       Check install and versions'
	@echo '     make clean         Remove auto-generated files'
	@echo '     pytest             Run Gammapy tests (give folder or filename and options)'
	@echo '     make test-cov      Run all tests and measure coverage'
	@echo '     make docs-sphinx   Build documentation locally'
	@echo ''

clean:
	rm -rf build dist docs/_build docs/api temp/ docs/_static/notebooks \
	  htmlcov MANIFEST v gammapy.egg-info .eggs .coverage .cache .pytest_cache \
	  docs/modeling/gallery docs/tutorials
	find . -name ".ipynb_checkpoints" -prune -exec rm -rf {} \;
	find . -name "*.pyc" -exec rm {} \;
	find . -name "*.reg" -exec rm {} \;
	find . -name "*.so" -exec rm {} \;
	find gammapy -name '*.c' -exec rm {} \;
	find . -name __pycache__ | xargs rm -fr

clean-repo:
	@git clean -f -x -d

test:
	python -m pytest -v gammapy

test-cov:
	python -m pytest -v gammapy --cov=gammapy --cov-report=html

docs-sphinx:
	cd docs && python -m sphinx . _build/html -b html -j auto

docs-show:
	python docs/serve.py

trailing-spaces:
	find $(PROJECT) examples docs -name "*.py" -exec perl -pi -e 's/[ \t]*$$//' {} \;

black:
	black $(PROJECT)

isort:
	isort -rc gammapy examples docs -s docs/conf.py

polish: black isort trailing-spaces;

# Note: flake8 is very fast and almost never has false positives
flake8:
	flake8 $(PROJECT)

# TODO: once the errors are fixed, remove the -E option and tackle the warnings
# Note: pylint is very thorough, but slow, and has false positives or nitpicky stuff
pylint:
	pylint -E $(PROJECT)/ \
	  --ignore=gammapy/extern \
	  -d E0611,E1101,E1103 \
	  --msg-template='{C}: {path}:{line}:{column}: {msg} ({symbol})'

# TODO: fix and re-activate check for the following:
# D103: Missing docstring in public function
# D202: No blank lines allowed after function docstring (found 1)
# D205: 1 blank line required between summary line and description (found 0)
# D400: First line should end with a period (not ')')
# D401: First line should be in imperative mood; try rephrasing (found 'Function')
# D403: First word of the first line should be properly capitalized ('Add', not 'add')
pydocstyle:
	pydocstyle $(PROJECT) \
	  --convention=numpy \
	  --match-dir='^(?!extern).*' \
	  --match='(?!test_).*\.py' \
	  --add-ignore=D100,D102,D103,D104,D105,D200,D202,D205,D400,D401,D403,D410

# Note: codespell will pick its options from setup.cfg
codespell:
	codespell gammapy

# TODO: add test and code quality checks for `examples`

--- gammapy-1.3/PKG-INFO ---

Metadata-Version: 2.1
Name: gammapy
Version: 1.3
Summary: A Python package for gamma-ray astronomy
Home-page: https://gammapy.org
Author: The Gammapy developers
Author-email: gammapy-coordination-l@in2p3.fr
License: BSD 3-Clause
Platform: any
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: C
Classifier: Programming Language :: Cython
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Topic :: Scientific/Engineering :: Astronomy
Requires-Python: >=3.9
Description-Content-Type: text/x-rst
License-File: LICENSE.rst
Requires-Dist: numpy>=1.21
Requires-Dist: scipy!=1.10,>=1.5
Requires-Dist: astropy>=5.0
Requires-Dist: regions>=0.5.0
Requires-Dist: pyyaml>=5.3
Requires-Dist: click>=7.0
Requires-Dist: pydantic>=2.5.0
Requires-Dist: iminuit>=2.8.0
Requires-Dist: matplotlib>=3.4
Provides-Extra: all
Requires-Dist: naima; extra == "all"
Requires-Dist: sherpa; platform_system != "Windows" and extra == "all"
Requires-Dist: healpy; platform_system != "Windows" and extra == "all"
Requires-Dist: requests; extra == "all"
Requires-Dist: tqdm; extra == "all"
Requires-Dist: ipywidgets; extra == "all"
Requires-Dist: ray[default]>=2.9; extra == "all"
Provides-Extra: all-no-ray
Requires-Dist: naima; extra == "all-no-ray"
Requires-Dist: sherpa; platform_system != "Windows" and extra == "all-no-ray"
Requires-Dist: healpy; platform_system != "Windows" and extra == "all-no-ray"
Requires-Dist: requests; extra == "all-no-ray"
Requires-Dist: tqdm; extra == "all-no-ray"
Requires-Dist: ipywidgets; extra == "all-no-ray"
Provides-Extra: cov
Requires-Dist: naima; extra == "cov"
Requires-Dist: sherpa; platform_system != "Windows" and extra == "cov"
Requires-Dist: healpy; platform_system != "Windows" and extra == "cov"
Requires-Dist: requests; extra == "cov"
Requires-Dist: tqdm; extra == "cov"
Requires-Dist: ipywidgets; extra == "cov"
Provides-Extra: test
Requires-Dist: pytest-astropy; extra == "test"
Requires-Dist: pytest-xdist; extra == "test"
Requires-Dist: pytest; extra == "test"
Requires-Dist: docutils; extra == "test"
Requires-Dist: sphinx; extra == "test"
Provides-Extra: docs
Requires-Dist: astropy<5.3,>=5.0; extra == "docs"
Requires-Dist: numpy<2.0; extra == "docs"
Requires-Dist: sherpa<4.17; extra == "docs"
Requires-Dist: sphinx-astropy; extra == "docs"
Requires-Dist: sphinx; extra == "docs"
Requires-Dist: sphinx-click; extra == "docs"
Requires-Dist: sphinx-gallery; extra == "docs"
Requires-Dist: sphinx-design; extra == "docs"
Requires-Dist: sphinx-copybutton; extra == "docs"
Requires-Dist: pydata-sphinx-theme; extra == "docs"
Requires-Dist: nbformat; extra == "docs"
Requires-Dist: docutils; extra == "docs"

* Webpage: https://gammapy.org
* Documentation: https://docs.gammapy.org
* Code: https://github.com/gammapy/gammapy

Gammapy is an open-source Python package for gamma-ray astronomy built on Numpy, Scipy and Astropy. It is the base library for the science analysis tools of the Cherenkov Telescope Array Observatory and can also be used to analyse data from existing gamma-ray telescopes.

--- gammapy-1.3/README.rst ---

.. image:: https://anaconda.org/conda-forge/gammapy/badges/license.svg
    :target: https://github.com/gammapy/gammapy/blob/master/LICENSE.rst
    :alt: Licence

.. image:: http://img.shields.io/badge/powered%20by-AstroPy-orange.svg?style=flat
    :target: http://www.astropy.org/

|

.. image:: https://anaconda.org/conda-forge/gammapy/badges/platforms.svg
    :target: https://anaconda.org/conda-forge/gammapy

|

.. image:: http://img.shields.io/pypi/v/gammapy.svg?text=version
    :target: https://pypi.org/project/gammapy/
    :alt: Latest release

.. image:: https://img.shields.io/pypi/dm/gammapy?label=downloads%20%7C%20pip&logo=PyPI
    :alt: PyPI - Downloads

.. image:: https://anaconda.org/conda-forge/gammapy/badges/version.svg
    :target: https://anaconda.org/conda-forge/gammapy

.. image:: https://img.shields.io/conda/dn/conda-forge/gammapy?label=downloads%20%7C%20conda&logo=Conda-Forge
    :target: https://anaconda.org/conda-forge/gammapy

.. image:: https://img.shields.io/badge/launch-binder-579aca.svg?logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAFkAAABZCAMAAABi1XidAAAB8lBMVEX///9XmsrmZYH1olJXmsr1olJXmsrmZYH1olJXmsr1olJXmsrmZYH1olL1olJXmsr1olJXmsrmZYH1olL1olJXmsrmZYH1olJXmsr1olL1olJXmsrmZYH1olL1olJXmsrmZYH1olL1olL0nFf1olJXmsrmZYH1olJXmsq8dZb1olJXmsrmZYH1olJXmspXmspXmsr1olL1olJXmsrmZYH1olJXmsr1olL1olJXmsrmZYH1olL1olLeaIVXmsrmZYH1olL1olL1olJXmsrmZYH1olLna31Xmsr1olJXmsr1olJXmsrmZYH1olLqoVr1olJXmsr1olJXmsrmZYH1olL1olKkfaPobXvviGabgadXmsqThKuofKHmZ4Dobnr1olJXmsr1olJXmspXmsr1olJXmsrfZ4TuhWn1olL1olJXmsqBi7X1olJXmspZmslbmMhbmsdemsVfl8ZgmsNim8Jpk8F0m7R4m7F5nLB6jbh7jbiDirOEibOGnKaMhq+PnaCVg6qWg6qegKaff6WhnpKofKGtnomxeZy3noG6dZi+n3vCcpPDcpPGn3bLb4/Mb47UbIrVa4rYoGjdaIbeaIXhoWHmZYHobXvpcHjqdHXreHLroVrsfG/uhGnuh2bwj2Hxk17yl1vzmljzm1j0nlX1olL3AJXWAAAAbXRSTlMAEBAQHx8gICAuLjAwMDw9PUBAQEpQUFBXV1hgYGBkcHBwcXl8gICAgoiIkJCQlJicnJ2goKCmqK+wsLC4usDAwMjP0NDQ1NbW3Nzg4ODi5+3v8PDw8/T09PX29vb39/f5+fr7+/z8/Pz9/v7+zczCxgAABC5JREFUeAHN1ul3k0UUBvCb1CTVpmpaitAGSLSpSuKCLWpbTKNJFGlcSMAFF63iUmRccNG6gLbuxkXU66JAUef/9LSpmXnyLr3T5AO/rzl5zj137p136BISy44fKJXuGN/d19PUfYeO67Znqtf2KH33Id1psXoFdW30sPZ1sMvs2D060AHqws4FHeJojLZqnw53cmfvg+XR8mC0OEjuxrXEkX5ydeVJLVIlV0e10PXk5k7dYeHu7Cj1j+49uKg7uLU61tGLw1lq27ugQYlclHC4bgv7VQ+TAyj5Zc/UjsPvs1sd5cWryWObtvWT2EPa4rtnWW3JkpjggEpbOsPr7F7EyNewtpBIslA7p43HCsnwooXTEc3UmPmCNn5lrqTJxy6nRmcavGZVt/3Da2pD5NHvsOHJCrdc1G2r3DITpU7yic7w/7Rxnjc0kt5GC4djiv2Sz3Fb2iEZg41/ddsFDoyuYrIkmFehz0HR2thPgQqMyQYb2OtB0WxsZ3BeG3+wpRb1vzl2UYBog8FfGhttFKjtAclnZYrRo9ryG9uG/FZQU4AEg8ZE9LjGMzTmqKXPLnlWVnIlQQTvxJf8ip7VgjZjyVPrjw1te5otM7RmP7xm+sK2Gv9I8Gi++BRbEkR9EBw8zRUcKxwp73xkaLiqQb+kGduJTNHG72zcW9LoJgqQxpP3/Tj//c3yB0tqzaml05/+orHLksVO+95kX7/7qgJvnjlrfr2Ggsyx0eoy9uPzN5SPd86aXggOsEKW2Prz7du3VID3/tzs/sSRs2w7ovVHKtjrX2pd7ZMlTxAYfBAL9jiDwfLkq55Tm7ifhMlTGPyCAs7RFRhn47JnlcB9RM5T97ASuZXIcVNuUDIndpDbdsfrqsOppeXl5Y+XVKdjFCTh+zGaVuj0d9zy05PPK3QzBamxdwtTCrzyg/2Rvf2EstUjordGwa/kx9mSJLr8mLLtCW8HHGJc2R5hS219IiF6PnTusOqcMl57gm0Z8kanKMAQg0qSyuZfn7zItsbGyO9QlnxY0eCuD1XL2ys/MsrQhltE7Ug0uFOzufJFE2PxBo/YAx8XPPdDwWN0MrDRYIZF0mSMKCNHgaIVFoBbNoLJ7tEQDKxGF0kcLQimojCZopv0OkNOyWCCg9XMVAi7ARJzQdM2QUh0gmBozjc3Skg6dSBRqDGYSUOu66Zg+I2fNZs/M3/f/Grl/XnyF1Gw3VKCez0PN5IUfFLqvgUN4C0qNqYs5YhPL+aVZYDE4IpUk57oSFnJm4FyCqqOE0jhY2SMyLFoo56zyo6becOS5UVDdj7Vih0zp+tcMhwRpBeLyqtIjlJKAIZSbI8SGSF3k0pA3mR5tHuwPFoa7N7reoq2bqCsAk1HqCu5uvI1n6JuRXI+S1Mco54YmYTwcn6Aeic+kssXi8XpXC4V3t7/ADuTNKaQJdScAAAAAElFTkSuQmCC
    :target: mybinder.org

|

.. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.4701488.svg?style=flat
    :target: https://doi.org/10.5281/zenodo.4701488

.. image:: https://archive.softwareheritage.org/badge/origin/https://github.com/gammapy/gammapy/
    :target: https://archive.softwareheritage.org/browse/origin/?origin_url=https://github.com/gammapy/gammapy

.. image:: https://archive.softwareheritage.org/badge/swh:1:snp:11e81660a96c3252c4afa4d48b58d1d2f78ea80a/
    :target: https://archive.softwareheritage.org/swh:1:snp:11e81660a96c3252c4afa4d48b58d1d2f78ea80a;origin=https://github.com/gammapy/gammapy

|

.. image:: https://img.shields.io/badge/A&A-Published-Green.svg
    :target: https://doi.org/10.1051/0004-6361/202346488

|

.. image:: https://img.shields.io/badge/Matomo-3152A0?style=for-the-badge&logo=Matomo&logoColor=white
    :target: https://matomo.org/


Gammapy
=======

A Python Package for Gamma-ray Astronomy.
Gammapy is an open-source Python package for gamma-ray astronomy built on Numpy, Scipy and Astropy. It is used as core library for the Science Analysis tools of the Cherenkov Telescope Array (CTA), recommended by the H.E.S.S. collaboration to be used for Science publications, and is already widely used in the analysis of existing gamma-ray instruments, such as MAGIC, VERITAS and HAWC. * Webpage: https://gammapy.org * Documentation: https://docs.gammapy.org/ * Code: https://github.com/gammapy/gammapy Contributing Code, Documentation, or Feedback +++++++++++++++++++++++++++++++++++++++++++++ The Gammapy Project is made both by and for its users, so we welcome and encourage contributions of many kinds. Our goal is to keep this a positive, inclusive, successful, and growing community by abiding with the `Gammapy Community Code of Conduct `_. The Gammapy project uses a mechanism known as a `Developer Certificate of Origin` (DCO). The DCO is a binding statement that asserts that you are the creator of your contribution, and that you wish to allow Gammapy to use your work to cite you as contributor. More detailed information on contributing to the project or submitting feedback can be found on the `Contributing page `_. Licence +++++++ Gammapy is licensed under a 3-clause BSD style license - see the `LICENSE.rst `_ file. Supporting the project ++++++++++++++++++++++ The Gammapy project is not sponsored and the development is made by the staff of `the institutes supporting the project `_ over their research time. Any contribution is then encouraged, as punctual or regular contributor. Status shields ++++++++++++++ (mostly useful for developers) * .. image:: https://app.codacy.com/project/badge/Grade/9c32a21a915d4a28823f3b44a99a2810 :target: https://www.codacy.com/gh/gammapy/gammapy/dashboard?utm_source=github.com&utm_medium=referral&utm_content=gammapy/gammapy&utm_campaign=Badge_Grade :alt: Codacy * .. image:: https://github.com/gammapy/gammapy/workflows/CI/badge.svg?style=flat :target: https://github.com/gammapy/gammapy/actions :alt: GitHub actions CI * .. 
image:: https://codecov.io/gh/gammapy/gammapy/branch/main/graph/badge.svg?style=flat :target: https://codecov.io/gh/gammapy/gammapy ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/codemeta.json0000644000175100001770000003247614721316200015101 0ustar00runnerdocker{ "@context": "https://doi.org/10.5063/schema/codemeta-2.0", "@type": "SoftwareSourceCode", "author": [ { "@id": "https://orcid.org/0000-0002-6606-2816", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Universit\u00e9 Paris-Saclay, Universit\u00e9 Paris Cit\u00e9, CEA, CNRS, AIM, F-91191 Gif-sur-Yvette, France" }, "familyName": "Acero", "givenName": "Fabio" }, { "@id": "https://orcid.org/0000-0001-8816-4920", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Departament de F\u00edsica Qu\u00e0ntica i Astrof\u00edsica, Institut de Ci\u00e8ncies del Cosmos, Universitat de Barcelona, IEEC-UB, Barcelona, Spain" }, "familyName": "Aguasca-Cabot", "givenName": "Arnau" }, { "@id": "https://orcid.org/0000-0002-8108-7552", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Centro de Investigaciones Energ\u00e9ticas Medioambientales y Tecnol\u00f3gicas (CIEMAT), Madrid, Spain" }, "familyName": "Bernete", "givenName": "Juan" }, { "@id": "https://orcid.org/0000-0003-3708-9785", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Astroparticle Physics Group, TU Dortmund University, Germany" }, "familyName": "Biederbeck", "givenName": "Noah" }, { "@id": "https://orcid.org/0000-0002-6488-8219", "@type": "Person", "affiliation": { "@type": "Organization", "name": "University of Bergen, Norway and Max Planck Institute for Nuclear Physics, Heidelberg, Germany" }, "familyName": "Djuvsland", "givenName": "Julia" }, { "@id": "https://orcid.org/0000-0003-4568-7005", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Center for Astrophysics | Harvard & Smithsonian, USA" }, "familyName": "Donath", "givenName": "Axel" }, { "@id": "https://orcid.org/0000-0003-1476-3714", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Universit\u00e9 Paris Cit\u00e9, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France" }, "familyName": "Feijen", "givenName": "Kirsty" }, { "@id": "https://orcid.org/0000-0003-1832-4129", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Astroparticle Physics Group, TU Dortmund University, Germany" }, "familyName": "Fr\u00f6se", "givenName": "Stefan" }, { "@id": "https://orcid.org/0000-0002-7372-9703", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Laboratoire Univers et Th\u00e9ories, Observatoire de Paris, Universit\u00e9 PSL,\\ \\ Universit\u00e9 Paris Cit\u00e9, CNRS, F-92190 Meudon, France" }, "familyName": "Galelli", "givenName": "Claudio" }, { "@id": "https://orcid.org/0000-0001-6876-5577", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Universit\u00e9 Paris Cit\u00e9, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France" }, "familyName": "Kh\u00e9lifi", "givenName": "Bruno" }, { "@id": "https://orcid.org/0009-0000-1257-4771", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Astroparticle Physics Group, TU Dortmund University, Germany" }, "familyName": "Konrad", "givenName": "Jana" }, { "@id": "https://orcid.org/0000-0002-2706-7438", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Laboratoire Univers et Th\u00e9ories, Observatoire de Paris, 
Universit\u00e9 PSL,\\ \\ Universit\u00e9 Paris Cit\u00e9, CNRS, F-92190 Meudon, France" }, "familyName": "Kornecki", "givenName": "Paula" }, { "@id": "https://orcid.org/0000-0001-7993-8189", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Astroparticle Physics Group, TU Dortmund University, Germany" }, "familyName": "Linhoff", "givenName": "Maximilian" }, { "@id": "https://orcid.org/0000-0002-8547-8489", "@type": "Person", "affiliation": { "@type": "Organization", "name": "University of Chicago, USA" }, "familyName": "McKee", "givenName": "Kurt" }, { "@id": "https://orcid.org/0000-0002-0755-0609", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Astroparticle Physics Group, TU Dortmund University, Germany" }, "familyName": "Mender", "givenName": "Simone" }, { "@id": "https://orcid.org/0000-0002-9667-8654", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Max Planck Institute for Nuclear Physics, Heidelberg, Germany" }, "familyName": "Mohrmann", "givenName": "Lars" }, { "@id": "https://orcid.org/0000-0001-9400-0922", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Instituto de Astrof\u00edsica de Andaluc\u00eda-CSIC, Granada, Spain" }, "familyName": "Morcuende", "givenName": "Daniel" }, { "@id": "https://orcid.org/0000-0002-9105-0518", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Max Planck Institute for Nuclear Physics, Heidelberg, Germany" }, "familyName": "Olivera-Nieto", "givenName": "Laura" }, { "@id": "https://orcid.org/0000-0002-7537-7334", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Max-Planck-Institut f\u00fcr Physik, Boltzmannstra\u00dfe 8, 85748 Garching bei M\u00fcnchen, Germany" }, "familyName": "Peresano", "givenName": "Michele" }, { "@id": "https://orcid.org/0000-0002-3869-2925", "@type": "Person", "affiliation": { "@type": "Organization", "name": "INAF/IASF PALERMO, Italy" }, "familyName": "Pintore", "givenName": "Fabio" }, { "@id": "https://orcid.org/0000-0002-4710-2165", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Universit\u00e9 Paris Cit\u00e9, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France" }, "familyName": "Punch", "givenName": "Michael" }, { "@id": "https://orcid.org/0000-0002-3844-6003", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Universit\u00e9 Paris Cit\u00e9, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France" }, "familyName": "Regeard", "givenName": "Maxime" }, { "@id": "https://orcid.org/0000-0002-8815-6530", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Max Planck Institute for Nuclear Physics, Heidelberg, Germany" }, "familyName": "Remy", "givenName": "Quentin" }, { "@type": "Person", "affiliation": { "@type": "Organization", "name": "Friedrich-Alexander-Universit\u00e4t Erlangen-N\u00fcrnberg, ECAP, Nikolaus-Fiebiger-Str. 
2, 91058 Erlangen, Germany" }, "familyName": "Roellinghoff", "givenName": "Gerrit" }, { "@id": "https://orcid.org/0000-0002-9238-7163", "@type": "Person", "affiliation": { "@type": "Organization", "name": "EMFTEL department and IPARCOS, Universidad Complutense de Madrid, 28040 Madrid, Spain" }, "familyName": "Sinha", "givenName": "Atreyee" }, { "@id": "https://orcid.org/0000-0002-3713-6337", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Caltech/IPAC, USA" }, "familyName": "Sip\u0151cz", "givenName": "Brigitta M" }, { "@type": "Person", "affiliation": { "@type": "Organization", "name": "Universit\u00e9 Paris Cit\u00e9, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France" }, "familyName": "Stapel", "givenName": "Hanna" }, { "@id": "https://orcid.org/0009-0005-7886-1825", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Friedrich-Alexander-Universit\u00e4t Erlangen-N\u00fcrnberg, ECAP, Nikolaus-Fiebiger-Str. 2, 91058 Erlangen, Germany" }, "familyName": "Streil", "givenName": "Katrin" }, { "@id": "https://orcid.org/0000-0002-8219-4667", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Universit\u00e9 Paris Cit\u00e9, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France" }, "familyName": "Terrier", "givenName": "R\u00e9gis" }, { "@id": "https://orcid.org/0000-0002-7378-4024", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Friedrich-Alexander-Universit\u00e4t Erlangen-N\u00fcrnberg, ECAP, Nikolaus-Fiebiger-Str. 2, 91058 Erlangen, Germany" }, "familyName": "Unbehaun", "givenName": "Tim" }, { "@id": "https://orcid.org/0000-0002-2730-2733", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Physics Department, McGill University, Montreal, QC H3A 2T8, Canada" }, "familyName": "Wong", "givenName": "Samantha" }, { "@type": "Person", "affiliation": { "@type": "Organization", "name": "Universit\u00e9 Paris Cit\u00e9, CNRS, Astroparticule et Cosmologie, F-75013 Paris, France" }, "familyName": "Yu", "givenName": "Pei" } ], "codeRepository": "https://github.com/gammapy/gammapy", "datePublished": "2024-11-26", "description": "Gammapy analyzes gamma-ray data and creates sky images, spectra and lightcurves, from event lists and instrument response information; it can also determine the position, morphology and spectra of gamma-ray sources. 
It is used to analyze data from H.E.S.S., Fermi-LAT, HAWC, and the Cherenkov Telescope Array (CTA).", "identifier": "https://doi.org/10.5281/zenodo.4701488", "keywords": [ "Astronomy", "Gamma-rays", "Data analysis" ], "license": "https://spdx.org/licenses/BSD-3-Clause", "name": "Gammapy: Python toolbox for gamma-ray astronomy", "url": "https://gammapy.org/", "softwareVersion": "v1.3", "maintainer": { "@id": "https://orcid.org/0000-0003-4568-7005", "@type": "Person", "affiliation": { "@type": "Organization", "name": "Center for Astrophysics | Harvard & Smithsonian, USA" }, "familyName": "Donath", "givenName": "Axel" }, "readme": "https://gammapy.org", "issueTracker": "https://github.com/gammapy/gammapy/issues", "developmentStatus": [ "active" ], "email": "GAMMAPY-COORDINATION-L@IN2P3.FR", "dateModified": "2024-11-26", "softwareRequirements": [ "numpy>=1.21", "scipy>=1.5,!=1.10", "astropy>=5.0", "regions>=0.5.0", "pyyaml>=5.3", "click>=7.0", "pydantic>=2.5.0", "iminuit>=2.8.0", "matplotlib>=3.4" ] }././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1286418 gammapy-1.3/dev/0000755000175100001770000000000014721316215013175 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/README.rst0000644000175100001770000000024714721316200014661 0ustar00runnerdockerDev === This directory is a place for Gammapy developers to put material that is useful for maintenance, e.g. a helper script to produce a list of contributors. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/authors.py0000644000175100001770000000757514721316200015234 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utility script to work with the CITATION.cff file""" import logging import subprocess from pathlib import Path import click from ruamel.yaml import YAML log = logging.getLogger(__name__) EXCLUDE_AUTHORS = ["azure-pipelines[bot]", "GitHub Actions"] GAMMAPY_CC = [ "Axel Donath", "Bruno Khelifi", "Catherine Boisson", "Christopher van Eldik", "David Berge", "Fabio Acero", "Fabio Pintore", "James Hinton", "José Luis Contreras Gonzalez", "Matthias Fuessling", "Régis Terrier", "Roberta Zanin", "Rubén López-Coto", "Stefan Funk", ] # Approved authors that requested to be added to CITATION.cff ADDITIONAL_AUTHORS = [ "Amanda Weinstein", "Tim Unbehaun", ] PATH = Path(__file__).parent.parent LAST_LTS = "v1.0" NOW = "HEAD" @click.group() def cli(): pass def get_git_shortlog_authors(since_last_lts=False): """Get list of authors from git shortlog""" authors = [] command = ("git", "shortlog", "--summary", "--numbered") if since_last_lts: command += (f"{LAST_LTS}..{NOW}",) result = subprocess.check_output(command, stderr=subprocess.STDOUT).decode() data = result.split("\n") for row in data: parts = row.split("\t") if len(parts) == 2: n_commits, author = parts if author not in EXCLUDE_AUTHORS: authors.append(author) return authors def get_full_name(author_data): """Get full name from CITATION.cff parts""" parts = [] parts.append(author_data["given-names"]) name_particle = author_data.get("name-particle", None) if name_particle: parts.append(name_particle) parts.append(author_data["family-names"]) return " ".join(parts)
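# Example (illustrative, hypothetical input): the optional "name-particle"
# field is placed between the given and family names, so
#
#     get_full_name({"given-names": "Brigitta", "name-particle": "van", "family-names": "Dyk"})
#
# returns "Brigitta van Dyk", while an entry without "name-particle"
# simply yields "<given-names> <family-names>".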
author_data in data["authors"]: full_name = get_full_name(author_data) authors.append(full_name) return authors @cli.command("sort", help="Sort authors by commits") def sort_citation_cff(): """Sort CITATION.cff according to the git shortlog""" authors = get_git_shortlog_authors() citation_file = PATH / "CITATION.cff" yaml = YAML() yaml.preserve_quotes = True with citation_file.open("r") as stream: data = yaml.load(stream) authors_cff = get_citation_cff_authors() sorted_authors = [] for author in authors: idx = authors_cff.index(author) sorted_authors.append(data["authors"][idx]) data["authors"] = sorted_authors with citation_file.open("w") as stream: yaml.dump(data, stream=stream) @cli.command("check", help="Check git shortlog vs CITATION.cff authors") @click.option("--since-last-lts", is_flag=True, help="Show authors since last LTS") def check_author_lists(since_last_lts): """Check CITATION.cff with git shortlog""" authors = set(get_git_shortlog_authors(since_last_lts)) authors_cff = set(get_citation_cff_authors()) message = "****Authors not in CITATION.cff****\n\n " diff = authors.difference(authors_cff) print(message + "\n ".join(sorted(diff)) + "\n") message = "****Authors not in shortlog****\n\n " diff = authors_cff.difference(authors) authors_annotated = [] for author in sorted(diff): if author in ADDITIONAL_AUTHORS: author = "(AA) " + author elif author in GAMMAPY_CC: author = "(CC) " + author else: author = 5 * " " + author authors_annotated.append(author) print(message + "\n ".join(authors_annotated) + "\n") print("(CC) = Coordination Committe, (AA) = Additional Authors, (OO) = Opted out\n") if __name__ == "__main__": cli() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/codemeta.py0000644000175100001770000000434714721316200015332 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utility script to work with the codemeta.json file.""" import json import logging from configparser import ConfigParser import click from datetime import date log = logging.getLogger(__name__) def update_codemeta(maintainer, filename, setup_file=None): log.info(f"Updating {filename}") # add potentially missing content with open(filename, "r") as f: data = json.load(f) for author in data["author"]: if author["familyName"] == maintainer: log.info(f"Setting maintainer to {maintainer}") data["maintainer"] = author data["readme"] = "https://gammapy.org" data["issueTracker"] = "https://github.com/gammapy/gammapy/issues" data["developmentStatus"] = ("active",) data["email"] = "GAMMAPY-COORDINATION-L@IN2P3.FR" modified_date = str(date.today()) data["dateModified"] = modified_date if setup_file: # complete with software requirements from setup.cfg cf = ConfigParser() cf.read(setup_file) requirements = cf["options"]["install_requires"].split("\n") if "" in requirements: requirements.remove("") data["softwareRequirements"] = requirements with open(filename, "w") as f: json.dump(data, f, indent=4) # replace bad labelled attributes with open(filename, "r") as f: content = f.read() content = content.replace("legalName", "name") content = content.replace("version", "softwareVersion") with open(filename, "w") as f: f.write(content) @click.command() @click.option("--maintainer", default="Donath", type=str, help="Maintainer name") @click.option( "--filename", default="../codemeta.json", type=str, help="codemeta filename" ) @click.option("--setup_file", default="../setup.cfg", type=str, help="setup filename") @click.option( 
"--log_level", default="INFO", type=click.Choice(["DEBUG", "INFO", "WARNING"]), help="log level", ) def cli(maintainer, filename, log_level, setup_file): logging.basicConfig(level=log_level) log.setLevel(level=log_level) update_codemeta(maintainer=maintainer, filename=filename, setup_file=setup_file) if __name__ == "__main__": cli() ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1286418 gammapy-1.3/dev/codespell/0000755000175100001770000000000014721316215015147 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/codespell/exclude-file.txt0000644000175100001770000000005014721316200020243 0ustar00runnerdocker makeindex -s python.ist gammapy.idx ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/codespell/ignore-words.txt0000644000175100001770000000007014721316200020316 0ustar00runnerdockerdne esy hist livetime nd merrors unparseable compiletime././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/github_summary.py0000644000175100001770000002123314721316200016601 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import re import numpy as np from astropy.table import Table from astropy.time import Time import click from github import Github, GithubException log = logging.getLogger(__name__) class GitHubInfoExtractor: """Class to interact with GitHub and extract PR and issues info tables. Parameters ---------- repo : str input repository. Default is 'gammapy/gammapy' token : str GitHub access token. Default is None """ def __init__(self, repo=None, token=None): self.repo = repo if repo else "gammapy/gammapy" self.github = self.login(token) self.repo = self.github.get_repo(repo) @staticmethod def login(token): if token: g = Github(token, per_page=100) else: g = Github() try: user_login = g.get_user().login except GithubException: user_login = "anonymous" log.info(f"Logging in GitHub as {user_login}") return g def check_requests_number(self): remaining, total = self.github.rate_limiting log.info(f"Remaining {remaining} requests over {total} requests.") @staticmethod def _get_commits_info(commits): """Builds a dictionary containing the number of commits and the list of unique committers.""" result = dict() result["commits_number"] = commits.totalCount committers = set() for commit in commits: if commit.committer: committers.add(commit.committer.login) result["unique_committers"] = list(committers) return result @staticmethod def _get_reviews_info(reviews): """Builds a dictionary containing the number of reviews and the list of unique reviewers.""" result = dict() result["review_number"] = reviews.totalCount reviewers = set() for review in reviews: if review.user: reviewers.add(review.user.login) result["unique_reviewers"] = list(reviewers) return result def _extract_pull_request_info(self, pull_request): """Builds a dictionary containing a list of summary informations. 
././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1286418 gammapy-1.3/dev/codespell/0000755000175100001770000000000014721316215015147 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/codespell/exclude-file.txt0000644000175100001770000000005014721316200020253 0ustar00runnerdocker makeindex -s python.ist gammapy.idx ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/codespell/ignore-words.txt0000644000175100001770000000007014721316200020326 0ustar00runnerdockerdne esy hist livetime nd merrors unparseable compiletime././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/github_summary.py0000644000175100001770000002123314721316200016601 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import re import numpy as np from astropy.table import Table from astropy.time import Time import click from github import Github, GithubException log = logging.getLogger(__name__) class GitHubInfoExtractor: """Class to interact with GitHub and extract PR and issues info tables. Parameters ---------- repo : str input repository. Default is 'gammapy/gammapy' token : str GitHub access token. Default is None """ def __init__(self, repo=None, token=None): self.repo = repo if repo else "gammapy/gammapy" self.github = self.login(token) self.repo = self.github.get_repo(self.repo) @staticmethod def login(token): if token: g = Github(token, per_page=100) else: g = Github() try: user_login = g.get_user().login except GithubException: user_login = "anonymous" log.info(f"Logging in to GitHub as {user_login}") return g def check_requests_number(self): remaining, total = self.github.rate_limiting log.info(f"{remaining} of {total} API requests remaining.") @staticmethod def _get_commits_info(commits): """Builds a dictionary containing the number of commits and the list of unique committers.""" result = dict() result["commits_number"] = commits.totalCount committers = set() for commit in commits: if commit.committer: committers.add(commit.committer.login) result["unique_committers"] = list(committers) return result @staticmethod def _get_reviews_info(reviews): """Builds a dictionary containing the number of reviews and the list of unique reviewers.""" result = dict() result["review_number"] = reviews.totalCount reviewers = set() for review in reviews: if review.user: reviewers.add(review.user.login) result["unique_reviewers"] = list(reviewers) return result def _extract_pull_request_info(self, pull_request): """Builds a dictionary of summary information for a pull request. Parameters ---------- pull_request : input pull request object Returns ------- info : dict the result dictionary """ result = dict() result["number"] = pull_request.number result["title"] = pull_request.title result["milestone"] = ( "" if not pull_request.milestone else pull_request.milestone.title ) result["is_merged"] = pull_request.is_merged() creation = pull_request.created_at result["date_creation"] = Time(creation) if creation else None closing = pull_request.closed_at result["date_closed"] = Time(closing) if closing else None result["user_name"] = pull_request.user.name result["user_login"] = pull_request.user.login result["user_email"] = pull_request.user.email result["labels"] = [label.name for label in pull_request.labels] result["changed_files"] = pull_request.changed_files result["base"] = pull_request.base.ref # extract commits commits = pull_request.get_commits() result.update(self._get_commits_info(commits)) # extract reviews reviews = pull_request.get_reviews() result.update(self._get_reviews_info(reviews)) return result def extract_pull_requests_table( self, state="closed", number_min=1, include_backports=False ): """Extract list of Pull Requests and build info table. Parameters ---------- state : str ("closed", "open", "all") state of PRs to extract. number_min : int minimum PR number to include. Default is 1. include_backports : bool Include backport PRs in the table. Default is False. """ pull_requests = self.repo.get_pulls( state=state, sort="created", direction="desc" ) self.check_requests_number() results = [] for pr in pull_requests: number = pr.number if number <= number_min: log.info(f"Reached minimum PR number {number_min}.") break title = pr.title if not include_backports and "Backport" in title: log.info(f"Pull Request {number} is a backport. Skipping") continue log.info(f"Extracting Pull Request {number}.") try: result = self._extract_pull_request_info(pr) except AttributeError: log.warning(f"Issue with Pull Request {number}. Skipping")
Skipping") continue results.append(result) table = Table(results) return table self.check_requests_number() @click.group() @click.option( "--log-level", default="INFO", type=click.Choice(["DEBUG", "INFO", "WARNING"]), ) def cli(log_level): logging.basicConfig(level=log_level) log.setLevel(level=log_level) @cli.command("create_pull_request_table", help="Dump a table of all PRs.") @click.option("--token", default=None, type=str) @click.option("--repo", default="gammapy/gammapy", type=str) @click.option("--state", default="closed", type=str) @click.option("--number_min", default=4000, type=int) @click.option("--filename", default="table_pr.ecsv", type=str) @click.option("--overwrite", default=False, type=bool) @click.option("--include_backports", default=False, type=bool) def create_pull_request_table( repo, token, state, number_min, filename, overwrite, include_backports ): """Extract PR table and write it to dosk.""" extractor = GitHubInfoExtractor(repo=repo, token=token) table = extractor.extract_pull_requests_table( state=state, number_min=number_min, include_backports=include_backports ) table.write(filename, overwrite=overwrite) @cli.command("merged_PR", help="Make a summary of PRs merged with a given milestone") @click.argument("filename", type=str, default="table_pr.ecsv") @click.argument("milestones", type=str, nargs=-1) @click.option("--from_backports", default=False, type=bool) def list_merged_PRs(filename, milestones, from_backports): """Make a list of merged PRs.""" log.info( f"Make list of merged PRs from milestones {milestones} from file {filename}." ) table = Table.read(filename) # Keep only merged PRs table = table[table["is_merged"] == True] # Keep the requested milestones valid = np.zeros((len(table)), dtype="bool") for i, pr in enumerate(table): milestone = milestones[0] if from_backports and "Backport" in pr["title"]: # check that the branch and milestone match if np.all(pr["base"].split(".")[:-1] == milestone.split(".")[:-1]): pattern = r"#(\d+)" parent_pr_number = int(re.search(pattern, pr["title"]).group(1)) idx = np.where(table["number"] == parent_pr_number)[0] valid[idx] = True elif pr["milestone"] == milestone: valid[i] = True # filter the table and print info table = table[valid] log.info(f"Found {len(table)} merged PRs in the table.") unique_names = set() names = table["user_name"] logins = table["user_login"] for name, login in zip(names, logins): unique_names.add(name if name else login) unique_committers = set() unique_reviewers = set() for pr in table: for committer in pr["unique_committers"]: unique_committers.add(committer) for reviewer in pr["unique_reviewers"]: unique_reviewers.add(reviewer) contributor_names = list(unique_names) log.info(f"Found {len(contributor_names)} contributors in the table.") log.info(f"Found {len(unique_committers)} committers in the table.") log.info(f"namely: {unique_committers}") log.info(f"Found {len(unique_reviewers)} reviewers in the table.") log.info(f"namely: {unique_reviewers}") result = "Contributors\n" result += "~~~~~~~~~~~~\n" for name in contributor_names: result += f"- {name}\n" result += "\n\nPull Requests\n" result += "~~~~~~~~~~~~~\n\n" result += "This list is incomplete. 
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/ipynb_to_gallery.py0000644000175100001770000000437114721316200017110 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This Python script is inspired by https://gist.github.com/chsasank/7218ca16f8d022e02a9c0deb94a310fe . Convert a Jupyter notebook to a Sphinx gallery Python script. Usage: python ipynb_to_gallery.py notebook.ipynb [script.py] """ import json import pypandoc as pdoc def convert_ipynb_to_gallery(input_file_name, output_file_name=None): """ Convert a Jupyter notebook to a Sphinx gallery Python script. Parameters ---------- input_file_name: str Path to the Jupyter notebook file. output_file_name: str, optional Path to the output Python file. If None, the output file name is the same as the input file name. Default is None. """ python_file = "" with open(input_file_name) as f: nb_dict = json.load(f) cells = nb_dict["cells"] for i, cell in enumerate(cells): if i == 0: assert cell["cell_type"] == "markdown", "First cell has to be markdown" md_source = "".join(cell["source"]) rst_source = pdoc.convert_text(md_source, "rst", "md").replace("``", "`") python_file = '"""\n' + rst_source + '\n"""' else: if cell["cell_type"] == "markdown": md_source = "".join(cell["source"]) rst_source = pdoc.convert_text(md_source, "rst", "md").replace( "``", "`" ) commented_source = "\n".join(["# " + x for x in rst_source.split("\n")]) python_file = ( python_file + "\n\n\n" + "#" * 70 + "\n" + commented_source ) elif cell["cell_type"] == "code": source = "".join(cell["source"]) python_file = python_file + "\n" * 2 + source python_file = python_file.replace("\n%", "\n# %") if output_file_name is None: output_file_name = input_file_name.replace(".ipynb", ".py") if not output_file_name.endswith(".py"): output_file_name = output_file_name + ".py" with open(output_file_name, "w") as f: f.write(python_file) if __name__ == "__main__": import sys if len(sys.argv) > 2: convert_ipynb_to_gallery(sys.argv[-2], sys.argv[-1]) else: convert_ipynb_to_gallery(sys.argv[-1]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/dev/prepare-release.py0000644000175100001770000000153214721316200016616 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from datetime import date from pathlib import Path import click from ruamel.yaml import YAML log = logging.getLogger(__name__) CITATION_PATH = Path(__file__).parent.parent / "CITATION.cff" def update_citation_cff(release): # TODO: update author list according to PIG 24 yaml = YAML() yaml.preserve_quotes = True with CITATION_PATH.open("r") as stream: data = yaml.load(stream=stream) data["date-released"] = date.today() data["version"] = release with CITATION_PATH.open("w") as stream: log.info(f"Writing {CITATION_PATH}") yaml.dump(data, stream=stream) @click.command() @click.option("--release", help="Release tag") def cli(release): update_citation_cff(release=release) if __name__ == "__main__": cli()
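# Usage sketch (illustrative): both helpers above are small command-line
# tools, e.g.
#
#     python dev/ipynb_to_gallery.py notebook.ipynb
#     python dev/prepare-release.py --release v1.3
#
# prepare-release.py stamps today's date and the given tag into
# CITATION.cff; ruamel.yaml with preserve_quotes=True is used so that
# quoting and comments in the file survive the round trip.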
././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1286418 gammapy-1.3/docs/0000755000175100001770000000000014721316215013347 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/Makefile0000644000175100001770000001074514721316200015010 0ustar00runnerdocker# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest # This is needed with git because git doesn't create a dir if it's empty $(shell [ -d "_static" ] || mkdir -p _static) help: @echo "Please use \`make <target>' where <target> is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" clean: -rm -rf $(BUILDDIR) -rm -rf api -rm -rf generated html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Astropy.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Astropy.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/Astropy" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Astropy" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." make -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: @echo "Run 'python setup.py test' in the root directory to run doctests " \ @echo "in the documentation." ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.136642 gammapy-1.3/docs/_static/0000755000175100001770000000000014721316215014775 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/1d-analysis-image.png0000644000175100001770000052337714721316200020723 0ustar00runnerdockerPNG  IHDRw liCCPICC ProfileHWXS[ Л RH ^! $,*v*V@ؕEł.bCMH@}{'̙O*Cȗƒҙ:(/ h 媳UtB9d,ę9?U|Ro5@ij!֓!^*]3UpMRhPy`:A)XV l.48.: wOx ;Ћ  ! 1Gl' a!H($ iHH2"+ dRE"'Hryt"oO(RQ=E,FI84%h9ZDE:ڎD{0ib挱0cY `eX5V5*֎uaq"3x2'3xOWx7@#.a4!0PL(#l% gH$,sӈ눻ljmdDr"bIbٚ{4i>Hե:R9ԱTu u86-Fi%I-WK5KR^+:nCgӋeK.m6G=SRMX|:;t<% tn=11qѡGԳkOџ_D35,5gp!!! 2P Can놟FFFˍƎƓ67o(h}C&& &L6JMך4230 21[evԬӜa`.6_e~Sf1˙& Mvɖs-w[޷XVY5[u[[n]k}džlò٬9k6vms;C;]]={}$jkDC:ˎȱ$vZ60gdXTgssCh. .[O|_]=]\;"rM#޸9*ݮg7pz󋗷̫Ϋ;û&KZ:C s磯o>?rv=i7R8r9#T5*1"Qc,i:*rQblb$1 2~]ܤC #'Md$NHܑ.)8iiddErs =elJMԐ1bq815gLc:z-{cݸ)Ώ77ބԌyj^O&7* V :gYY+ggD]bB:'"gCm}yy53Jt%S&N&uK'NZ=[%*GzEaI0LLi8ugEaELç5O>gf"33g6ϲ5VPms:iSmV [/ZkBkiY~s}K.Z~qdٍ˷YQQ+W1Wk<6Qi/.o\kvLU_'Xwe} J7|(xkSjͅnIr/5[nM}{S555;Lv,Ek;+dWcsݦK==/f콱/j_~_m~:8PROn575FxՐgq]sF[7l%vw ޝ}p&w}wW![%>Ҟ=3Vΰ/Ƽx)}UU_gП-ݣ;^^Y 1620 Screenshot 622 2 144 144 1 #@IDATxTLJ(`A;.Pnlآ1*FiF{Êػ# o~q6e}wrwN9sfC. '" " " " " " " " eJc%D@D@D@D@D@D@D@D@DHAD@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D HU֏G‰Hw@D@D@D@D@D@D@D@D t.k$\':>t3f̘%HcED@D@D@D@D@D@DNڃ:Mh]ʥVfnB^͇L[[>'" " " " "0;gg+"PjR`k1cƄ{.t5iH;ws5WXdEB߾}üΛW^5G߇~(L<%ƤI_~yH ! GD@D@D@D@DAX1@exNV7|3xM_^ +`6(lᆦBX%8qb8g_|17|a„ kQ h_~e7n)ϦMz[l iˉ4i?>rdq:u>}^Gpwcse> s9g2eJXb%Zk:vl+뤨"и_gKIx/V]uO?F+P࠸)TR`R`K._>hgyGvm 4 ^b(Pqa9hw}FK/ _O'OӦOݻu3r-gzröpXD@D@D@D@9oSaÒK.hO:5t=V޵ a+/Bf.]~]w:,LiV[mfX9 Ѳ~]ZX-EV6:F(>Bt(ƌq1aZo/pGW^A2d=H#noK9*z+\z%¿% 0 5\aS"IE|np[ZzZcKY@!޽{[t(|*AɄuXKN8 O;B_x`P_~a饗+<? K/dF ˉ@K +gϞav P!GnF΅(>oA%kk]p J?#,鵖:ǭCy[[Um _}X{mDgĈ_W^yU8WD@D@D@D@g*>s  ={ 0L,7J+|'au1gaKtIa7qJ,d  ycdbV[UVYŬ^xpꩧHJ:HUϹ"rI0=N?ǥCfYH`e΅^h~+N3r*L]|ugڼv3O? 
ԛC^v~*-ֱG TѣGK P^G xiȫ]G+gIE'>TGKAw,>ӐdP"6PN]<*_䯛(5.rFK;;W3Zk4Thж{^6N}R{&=v;dUXbiĹ u7LZ # e+A'9d)tL0rU7S0isQ$W<'-4<̉Rr{lfRx ˑLOb 䫱sM.1xƉ(xOq7` `F&u$R9NLT  }hHx'/ Gɖ xQi" h e<X2$C=qn@&*` c!7iffP"# 7F2 A+J&CL\B˱<|3#sX_0٠رʀWLK,ej׮}U|!=K(@%OB} e ~O#(}w _@@Sv]i/}a@l;KH6@r-hBdSWuLCԅҾ/ʧ r.CN}N@픾x_1@zAhäzd Gx|6Ze-\,g<=rrһ~y dqkj\(y-{kW)U78RNϯvg6n)t7Z=򙴉h1 Pcp8~,ӈpsxgO?ڏ)7Sxa"i^N&=13㳕r a\C_f:^E+GX2y}~//?=OLY^HՐB#W˔;d80xNyuf9V4߰/s/+er,̐N<X#4mʄ ɢ*/`MυLʽ4{xłg5ze9g,o~=1^x~lɦ]F25Jj0qί`F;-&78x|DŽ{Ӆ3ӧfy_4B٩\T({~i2>a0~1yϯI)r!m1͠N90}ge˦݆q< z^N.92]i@IDATzOW@a<>,USev+ ˌgƇJ+?LB&8S4ߕXY:dQQQUᘝ (&g"d^ Ǎ`'T7C,gJ;Ja}0O1.j j j j jkBJq㴬 rCjeg-xbLJ+}if-4WZ9l4Ź\Yz̚f]&ۀ flqKK^lqe!^H\K͓mrgץW$N㪢Sz)n@@@ЀOp ➿(< l$e>?=>H {/ ز—wySXV<X2Xϼҍi>:u?G D D T5 3Be}a;kvڕ->R,V,7|/#U߾} "X>{!j j j j j jX!T?D)|QMKQ_>quL򫺬ȗWmhUNs ذ=2_f3[n1pٹ߮];g{ |U,⾧!?g9!j j j j j jt `PU_M+d[%?V7ѱ4uQ7ޔ_\\E5?&z ;98?4*G]Ӯc`|U.^s5^yӘ1c/6HڷooGQW]-[ S +^U}/j j j j j`UiTUUJڪ(7RcUXAFk.+mZ^Nַ͸C@@@UՀg/2 ԩS tܩA)?ү ?o<~'رcMIK-[)W\!O*Ҫ\QQQQQJ(?W {f+J/A%nW<3>  e;'/![gKO2vTs1SyWrģHjԨr+l˚QO< Rmf 0]T/X $ϣ|Cyx:u*@kv6`KG /PԩRO뮻; < eCvp!Wʓ~y*W WRKRqU|C_t3=?{aڲ'GdIe+(-M|y\x#e+J!O9ŸeJƇ!\4!ϣW<ۆYe2ͫNm.Lj׮-guVK_uju-VH7G D D D y':dI ϳSk`Vh<0F * DUѫLtBw5j j j jALO._妛nsچ;w.EtV}Cb]u饗b}"sKE D D D T* 6W^ +BLm@֩J X<~ltHal2V$MsJHr\QJ2|ydzNH4>2s4įLpCs懏TS}ӭyYd-/ff^lz!oYIk+eMڕ p~߯9Uϛ-=qap8\2먁p|饗VyWV7p|L /d<y`?Y/ K.Ԙ ĪJ:D D D -?ʹ'ȌEA˷'?]*_l}IyI+`0go,7xʨ`_˯/ LZOCe^4&|?td^˙jw^Y~q|Ճtll29MoazvZrgk* Bނ&S#:%(>]&hzYZG[o;&glm5ٸ>L7onr: 䣾v*'MdL8GVѣZ ۶mB2o<6lǨnР i{I&lL]Pƨ;iģ@f􆮨SW_}eoTC~E;tFm 'oc6m?K_[Ln-tKMgmymn݌ yIG-B5mfhxDV~D@&.o/ҨQ#/d䈑ҥktM-~ڴJSztYuf^(LeLo6E۠}M<>2Q7 }&wf\oAvH۠6c/0/x{7')uN3B,bh{g3W ˷:4hjyR]BmXE_F5h'eԨQҲ0nn;kC[d[5z2k6B>mw{*p555P4XHyXXico{'Cy$nM߁WqZy.Ox5555P4"ΟQ$F޹TӽVZTl`'"LgGNSX<~3ÛFafwqqƙfM}V`|9 c"ذIy O>8{7c|#󫯾jdb񍞘UPSO5cA}Qw<nю:  8149c,W}U ''qMAf6_x2sL# oo{hL>h[*Aat,^X=Pۻz۷y`MV~! > #-"_,=m./"ygmҦhh=/O?`\8휺cM>WN. H{Ѣ"}3 :(u~[_@ݻG(66ejJ&J*f\FE%)\sMXE gNӗ,]bm1y#r^x#ޕW^iN3555P4xH܅fpcg{-C0a}衇GxpLe~B^D}5555P44‚|iXX jאXگaRjHu>>A2Vr0?x{8 p  >ZB`tUWY~!&'tX!= A WQd^ y ӷe1 z ̽k=ƈ#z-7Zx`'x! 7hFF.U`38QIܣerȡDqfv W7yȋA w|L!+F* |Mr[&7F2, }!w}g?s̨k%1!n!36Nx kc1 / uh Hg `rzw 1d BxM1 s @8LwC|ƔB;cm҉ΐ"*M\Cvd6m*_y R` 7y3r@!'zA_m5^J^n[yLWO?m)\BWA'DEzIjȐD1VQ@;eH.bի7CzrKc5@[vOIe#?p{h.x ϙ{27e,xfZ*Zʄw)5[ }VMk*3pxa(aa遧 Nh۶ 7jƍͳot@~2| f09e+ s=g` /<Vx q26 [ ^0f?ۄ%011tȋ ] _ ȏO܁_X^G#NWL?'zD+ pk`<x5w1ǚL բE AUЉx<=3MG]S/t fifd#=|Pvi: . }v ` B^<&@=h#vqGK픶۵k7k7v~g.iS#z3.DCk>ug&m/)@xLk㿨* :M-kfPKM~ kZ$([7HCuQٞY)_h+wcC?<&D`C9 b޽w1=T0 cO @} F=F&ixjuYfOd 9>=8/Cm2bLc ˞KgaWp f#7eo/a u(EPՇ2FMe`؇lgu0s@X M@b^nxl`FLި7F@/tN؄`$. 
atP X@Y|-a6ygz%=<5;<;zR.hEb#d Gh|^'Х25hCz^N;Lx%&,~)byUtü=x4).a( 4'sҠvVO 'X=ǠG4i_>m#co\{ߥ\aH'ixU0"m: zh"#?J>zalϳǪ}`>mzNt1T^ H{-aWy@v о ڳS@#V5.Xhg0x:HsyTx?ϣ-^&yWz¼?ʮjGq/v>',YJFiwdҠLYn@G 22{`1.U5Yqzg̀b 56J`@91*yȳO5(Q6 K%XFZnNś3A : 5=<fܰ; LctcT/!A^f⚾nzs8\aB3#<}q.2'Sc-x')K4^,"B}3YHgeS%lm =k9>B*D ,@ڎtkZ[X_jpS FrJ,j`@ l?l 6SCzm@")[&dfAs"tVE1/~#XY*Z4R32R塍 X o"ƍgM, ZKHL"=`0D}&'@eb`p }-L(c~`51?=>,x`S7LJrdG&5;dCɄB@M 쳭r` <xlQY^G;FOdWLc2D>mx~{.{50 Ų9W}z3n0"#\-F/F&&##}` ^`qSeuX4@;Ӷx7؄~ _AUiޮȘ\Ůj)/xYqxL#xuх6 0|/ʨ }سN|g[y}eXXNݷ|VJg2ѻ{Ar]Z9V?VKF^ﻯlԱ䱱:L`!7k XnWU'F-$`^H =P"(28RD:CSUJ(J Zr;[^޵9g3.x¸([@ 2haT,J'Y(:I0.T%θk0NYwa\.V|WZ aaX la\ /¸\ xa\ص4bmaVFk<Ԯ: yiWv?Zc,V j?c `0Z-h  2sA6%Qhٶ?bT5ZsT4|#phhKiOyq>۩63*ZQIh0 ouƧFex|;Z=-| _\.gO4Mp~*0^-[62?.xxrJK7*F%cQFB:*cibxA6MxО{ ¨Xkclp^E;S8_nBc 5㣅¨^_3__oUXvaPic=Y̘ƙv%6/m7+(<~ ޕ|+m!H%3aBqX9;g~.Q/;₅|RQz+O.ګ.(R#K~f񬾤՗ēGmC>;ۻ2}yr)8mO')Mu7򲄂rGW)~8mPěQDy ~hީi'efr*g0xOz2ͫkq6 їtYK/8hhx)h]6>ҫ~,9%mx=xU5u-w2vHcE>5 wkmc|j~؛,}'FP/4+~}|2nm֭J, SS0 ?XeYXƳ{E:w{މ-nѓg‹8yg(%)g>Rn7]¯ʑ?:p+)-?t ҧ eNy]WxG^)/-鄛IiiJyZ4_)_~pL</ֻǍѫGW)+ZQ` ;i~є>m3'LW*x m~x%S0oZ#*6o[OǠl͏VY;a<#a,z?=B_~~;猹Xזjz۶AV`]({>~Kvk_2?M*< lwy=X3)+EcpPj SYL>x^kbuBX.+}eoDf!0 2uIӶ>+,i<ڧׇ^<{ [*}=/Oy4t=s)4|g衵>WN[`m?@!PfG1ga<}s<)c,6zyoe=ђ@=%B`! G(K,s}Y'ppQG-#r>YU@ F#Wnah(W=@M>Cp/*t{~u32TKolj7zӆ?~ir6~'&>]nm,#QZ-e }o+ʸ5hҘ[styε5˯Ͻxz~Wt7 c.\UƞGn?Ϲʲ:?N]&^=/iޯO\u:_)Ja+јZ8\KbjkyWGydK6*FV}Зg (X20.S5 t1bni("}qta3lU T &!}YQ44˲C1|ߢ7I| {0m`xN;bt8-[Yjh`a]t<ymOH`E$?-Yi(C#\«<Ɣ4B[K~.RHٖFq! 9#`Gʻ"Q$x]N|b!Ғv/6z/ݒ[gi$d˜/VW41gR.-Yh*nbVUKd˱X%~|q;_l8-'ԫv sػh;9at4[onم].[K#ONŕhŰҗ-y3k 38lj<+ :Uz+\a_. B( Bz8 ֻ)w{,,t[\.rЖmDmq0e7zF-v\/}U1qOEE}[-i1ri%xoLJyGISVLEَdX~`<:.=?I{X̼ _wdK'ETK5Ċ<╕*e]#l10멿,濘o_H~w"NmJʙ2/pWkcѯҶ6.AxZ*oʌ}W`-),,)D[;ݱ F狈w;յv='Jʜ+.Kᗸiabi v t'᭿-ciVԂӈ]b*x" ]_sL@^ą_Y.bM=fl/\i/~<]z|>;DB( B`?7o^\[.\ @a衇.Lj̳W]Qw(r f+_`J9QtΝ?AY?m2Vq5٭PguV,+xE[SX[sw9 nKT 3ZicE\Q0ٙO94 xGY{;UN}\w[).V0he_v Wo|/+r@!PDv\T4c%Sh_+`폅  UnY/\זj2b>ʕXX*sW¶9un$q'I<׵( B` ;nfrLU[ R`m@!PnBA9θK\˯'lg{^vL|OX.2@!P:h ¼}<8sMZ$޼t-^/iω7 O繮,J5 + B( B`"SF]N?ӟt$;\nu[w]?o~_׵wٟ]~|.~JB(S2`8XoW68?Ol\"Gq4>\}𱦋]b-~,ﲗl??8ˆ˵udկ~ /zы?|[jyXmvaˇO,O9=Oɯ@) @!P@!"Tܟzpp߄~ ngE^ 'tR[ "z%zԣ UB( (pr]$Vömۆm-oyKK(yk=woCsw }kO~mWjns rF˙g9< /| J!Ee~} '0jS=q/m̼=i[5ʯʲqx4>iOkth8kgJ+^@!PC + +_Azpw]yۿ&_JWjYxՎ Wg<Mu{ܣ}ԟB( My%g>XװS@J'~,}k7oiXa}y1yLi7a?rp^wY _f{[rxC:\2iRZx#~_oy}1їz=;g Ozғַ@wVWvs42#^W5%qSK|+(Z(֖*l!P@!P 8㌁~׆lN$ }wy o|۫@N4?΂7MeƫV֟B( _m|fSdPL2' ,iw\)+}c+}{g=mPREq$7W)8sy}ŕ Sx{ы^4|lq:OaFEC)񲗽l0^YDKf=y>tZR`f1̻Iҕl#0z=^#ߗizv7/ew׀GӺNGU~@q9>HXʆ>s^<+m|=} p~Y( B`Geɦ㹉?=)vL82W;(n|7k/}Kg\Q oms͖(Bw{S*}#YV`7 W6d(/6WՇ{My܈3ʋ /qhA'sgbr,̙w\K6ZR`ͼ$]0,5}5fpFa95W b>J1%^'th.,G /B( sսFs?sk. Ώ:ꨖ(z{^S6o~)| &r9L7ˑɾ=-r55]zצr|3֮W_  >)Y^od͕2?5yFӚ.w<2HEn#ߛfh&ݬFwur?g?ك-|g<ׇ+~0م,+nB?~{gxE%41$8vcqUB9@!P.WkceWfDFfe]@n|K_:ɟɲrlW/uK 7 /mޛo~cY̑f;ͺ,k!0 昹OzN/2F7jB.Rkh֐mE lՑ3W$[Y!NŽ`)}u #8bQm۶eˀoec#hevvJ g8S& 0Ot{%feQlx(\^g?mq oxIwmgA^zͣ@!PlyƭR@y%/yIMqN+oz8'+Aq=mh&7_ȯu-f!P YOG @AELu6>яFgyعIW͂+vYoFx56,.E~8wf_υ@!P#/|ּ`{G!tȅ-g < 4l8YX6j>=-NwjN֘ʌv gyFMgb9g>f(O}*lk9Z׺Oq؁@=#f̼]̢! $:ym3 `3Wbx8'xEj @qvs4 Ggr}{&̳/׹=cͳs,I)üUƭu[G3=0>> b3{ _p)49:ƍ9v ͘YN<ĦTMnC>Pc#[&1 aFZn kwj¬{r)*{s6{joᵙjcea"ۊ+Rvr>VĠ\.~+ƙʳ@!P_7 XKMKA ޥ29(^%Vk֢6R\%˄C|e+zm׿{J3V'>~p{|LB+Z`  ݃W<&V^֭RΕITX!U X{v3F2v<^_@'Bz?O_ \!P7(۶m3HY\{HO^N+rZX 8@!P@!P@!P@!6i־fu;)'tR3ꝋEcv^VUv(B( B( B( B` 38uO}j8묳[ ^ < @a2+OySW|sL:.b)v7E( B( B( B؄SΒя~ v4#WO1u?_78Ls!0R`C B( B( B( B`/RvH0y(z\3Z3T( B( B( B( @5ud8FsVuWGX󱩐B( B( B( B( b( B( B( B( B@)BS@!P@!P@!P@!PR`mZ( B( B( B( B("P T@!P@!P@!P@!PF@XB( B( B( B( k.4P@!P@!P@!P@!(F( B( B( B( B`. 
M@!P@!P@!P@!PlJjx( B( B( B( B@)BS@!P@!P@!P@!PR`mZ( B( B( B( B("P T@!P@!P@!P@!PF@XB( B( B( B( k.4P@!P@!P@!P@!(F( B( B( B( B`. M@!P@!P@!P@!PlJjx( B( B( B( B@)BS@!P@!P@!P@!PR`mZ( B( B( B( B("X ? (va}ig `̩q'hԵ( B( B( .%?@p\!оWj( B( B( B VN/`5:3O?}я~ >K_.,.{>ᓟB%*W1S.ϳE@!P@!PX^ɢ?ooÛxWjx6| _;xwoo_J6kZ8?ݬr}w 2J׾W]k_Ǿ 糟SN=^6ܾG@ysZ9ljyphȫh@!P#pz,zX)۽6m'ߧ>q{?|n7ۈy}U).y^7_p ' XNE@g?uQP~^f^.w7~7>{sTA 6EN.X_򗗏ԥ.5\JWZ;S'KnrE8F cƬg_[f5)vo~ׇk\o|7M%ß{^0[#[EmF 8a¸R(v/k*?ίOY]X/,Y ىT?ɷ}u^YqWzӡ)͔СcfG13=M#]k,>JYtğE/۬tyJӇ@&AySg^9_x7/0s~u\߻1/qo5a_3iEGؼgş7y̻΢僩+}5uf YȲ4~zZy]oגGgn{/+b K^ GswAN:YgiPjKfJmTzrr@!P7V6E6\U6':jW?-I'}] B`XUer'$8P{޳)6ݭKjvݔ2^a~'^ܔFCkt_i}އJt'4}]izyiq6|oʲcmWGi~q+iV;\h(>3τ :Nwgh'~mqe~-\-}\OOx]g#E़&Bه!||ҁU K7f_o l~mk^:Bɂu-\v_k -oyqc֦((X9OTr%l7ci,øK? d LV6i8J'`[4t]rOO~* &,"4-sU|3d<;HN} ?! ^MZFs$eA릉w]`~;% E: id{uͨ/y)"s  5 InMeLdi{,z̢Qצgh7эДU)6-n~uOl/}oE,ObahZ h4Sʠ!~(F8ŅQ²wڗ/{{oEЧޣ2h}j eHW]:xr}pk^˚G:G{"%}U~kںvAV'ԚK>q'ZA^Iyl 358/ym,f7|rdLF9-:V:+>:lQC#׵bqw8 <ƒS}^=v/&M O:Nc)3qCxD/ysߧZ]ie0hk6'㶱S||p3T?iKP6K~\z+MҬ%,ZAX5~"̋Spͩ_8眵YML_-b>ͫӤNy,)}|a\ K:a6uk3MSυVA }e~k_/|aS;t_*z!W%+XQeA-,-"MRN, xQ` 0:䓛Br8 //~g?m1FeahI㜠n>u/x F蕯|e]w!<{oZ*Tx FQˋ^3wwmQOOX~Vj&&  xF3 ] 6Q_6AXE x8qSF+.wKsޟ࠼ܔxV6eAM! [9yNLK){ޭ DO^ oxCo}v|8ez/pZ{/yKt>8{_V; ז)W Cfܦ3*rM0>Ï/EMCh4}۞yeVFR.؉cԷ04qSLhVОYe_ʴ4SږfF{5%^MωxmԖVO@!>LBgL"; ͸gb-^ngҠЧKG<MV56fN1,~=&W*Wݕ-$)zges8 MɇSԶ5QQ(,n-<9i,8B9y+^1)rDQzTS:K?cW%i_O7)Qk_xCS>]^WəL&KF5*e8~?yu )Q}w6~+~X)W0) Zܤlck{ċNY@9HsksŻ "&|\) ǘ#.lXP&W93ƅGyP WFuHH_K|q㙣L‡V}٥WE[vMrC3&%A`V6<ְ҅_+_nrôc<9eҦf-> tk 5œ澴yWUh '>iS\o΅\hUȷMWJ<.LImC=`d|ґ':ʝG~~o5p 7++}o0y`DSxUR>s3mNUυVC@?>=7X6l108KqƄ? aybκ7&c]uDE 3~N<Ie"$`Lu"Moo5k"5gPP$x`I;,*-J>m–@] ҬVtlJ iд'EPt,(mFm?8GO},zEAެTq!e% '?4(ʣ8,:,ĕm% PC\P^Z+#q_&R8ѡ#бĐtnAEyW8pž`&Ig1pH R&qK²͢q{׾)RvΔw坺8G4ӳEGBޔP>[x_?W^*̂K:rS*:SO~~i9]S3Dž=PXƱ"[$qW)% 8rk~FK_lA=fq,D}RǔΨe/=aQXYƋXb%\-bi!S>w:6?,_z.mD3f鋔/V}U_ڜĂbU_Hesl.?Qdsq[^ɡ/oozs'O{H2E0J_}M9zKO `%x @/70cƀyQi猍~1#8}:Ὗu1&^gF|yK>/~+9\А|ͻs̻-Oʮ)8j)0yVym=n2֤pL"Œr&!΢hʊEP&4>W(X @$IܻǬL+w׾m!:ඐy O vjZVP:u49:4,:%.Ls:?~8!o}k{ qIK` [ !Kp&f!(@)A0ܵH) >[0qT?`L0zpc6H .}Ԙn=a 6@cT7WcN[4Le4lsij"o;Yؔߔ)/IZXf}Y\|r%Vg# x!toP0q/˽Ͽwh.YODtm۶5ŒCWY-p9XIS^OpAV)/zmˀhh&zwgw8'M=7ނbǎv2%5T9BS=;m کW)9WsJ  FڀIE;RCSfMi?`@οJIk`J5uk9Zd e,b-j3oXơZ?6M9 w qx~hJyBi/.L9qW}?cK / ruGg JciE? ԣͦ>h[+A]-q:ӮiTKjN]O~bMSjwQoDLEGi3zZWUǼ/ ];jO-y*q_sТ/)[83S0ms;ڱr0x,Wԉ|X8Xla^axf,%'J"CJ >p϶q~O Jzst/PyWJVz/n^9ԩCiҷQ(7g\ڽrj3/[y%.iʝ85a%X0c{}̕җ-C GfNnxi+ăh,HCH i)MmYicVo?WcGe,ląE(WvE?OxOvf&ݺ`9jҏ 9V۹r^Q4YB :ag?c2cз@$^9EiyYyE`^\V?K>4vٮG6} Nv;/ w=m!"#=J NSgv͟)۴HKQK4eo!Ѽ@iSD@,FysK\~H`DCO!"1^ʪ)4Ԉ?ËӆݖݲYkON&C1alN`%ꔒi e4 galqS(5`QWx۽Ό+ƴCh7gu>zT@p]sJB!sLMo~S笕K3Z_S9.kxgwW>)l.go;/́ ,ELi7i|Mf5g&BFNs]$N4&J-Le#.+PcK^|@|tЄT3vA#N ?56FRÏ`mܰL1@{!82Km`1@XkHYl#4)/jR'?m0я/3ikbK+ TqB&mVmi?vބG;9vo) XisAݼ89vcaAAYY_]Vt?3ISiϹ7I_e`$#ےߔheIs _mF3_OgZ^[ц8c&k)-uvx7`᭼p[7;織Oc|Ɩ >/ 0^og)7R;.x2tmZY4w6~^7j\O[VG Âd8ӚƒϺ42@5g͍:d,-eJ9Lׁq5eu51Jei{2DOgN(߼·ܬpC(XE g d4glld,3XKC1:熒N+ Lƙ  B6B(5:Iog!!i&XF:C\Rs;(tAe4O(ڂ \$a Qgdռ녑>^+3dV\Oyğ`fF7Qx }y-V+JهW+) " |(A ֫хE0WR1hڃ< K8_)4,R&:}X~[+ow8+&h1#OSje<ʠY?__Ҿm׈s̚| iWr$z/| ۂOl{Z&m2P*A&y&~',qW㗉+׺ M+ѕX;C }5x_[cmd+;)j8`nҒG§gA6jum]! dc!<%t\}>oB( Z=kboZl8K[}2*rKC8'Cnc,|W=<7NWQS V7 q]('Y.4j^~ G/|,GI[ h?lg6OUNŎ699s'?ty[n#^#W)'t}&S:ˤk:vYh:tq1Ϲ7q)'/&)2_^^r:rI Cg0#<|YNwj]DQ'bhьҊEƬ89*ˠtx2я~Z1:OEi TOMڡ \.s۶mm<}ue+ŵAFZ7N nv"MMӵ~)07ʭo!? 
OhLkih&};%NjQoœ|)nၖes Ћ OV EZvRk^'}E%Uflg<\Ïݵ,w}_}iY,e3Şbe*N9 ޵] E^6㟔38;$YWm<2kC=^]X#-v[)Ba|7"E]m~Q`/4H`^Xi 5unQ\&<E>CЄ:W.&>|gkC 'pdEvP4s^kv0gdx;ơ3~MnfIKtY*G\O_x{_["gZr~׻x,vښ NyuQ\r3,y+WO[}GD:9ܚsSVyr![ڨ~'sm6a _YR4tÄ>Myc7VzGQ,r 씛y/kr_=>織]];b)OyJSzN['ջC;)c[ힲR(Ƌ]]y/YL6͢-YM'?MhkҶ#>+Y}>ڬ6 s,95gY> #ȶ^OVq>OP,:o - ~0AƔ))c'>G>O SSMc#)W49OQ« K#;rxxUT}w,ӺokQ1䧝EH#eg~!2U$Xל۽yE?%Aoz51e&|y ?;IDAT@1Pv'LP:Kcgc1EӘEf⹢ENѦG3l9wڥ:-. -~.[ԯ|aKr9VVgkMxE.t2G/{yP΢,ŵl^iT2~lXC , +nu- ~9c1?N* 9RKxyK?Oب*j}86BAN56pBg^^;fo۶YP8zrFpJCY | {Úb}< lÄ?yXx( x(b Ĺo֔|Pb8ˎባՇ3}t`Jp1)|6DG‒ώn6K;Xrg9iւRPj!X= Ӟʋɂ|3k#Ure31zJMOi#K* \7AzAYYzNْ}}ݏhYP-[!m)S|3m,J{lz^-ڳ(}.Q_:7Ir>_OP>em(+|yi_ڠ1ɢظb|sNט_,-9tV6ls%㜚Żz;,sa~XK3 !_qkk%vU:xxE(')՝|׿ ZIoa>;jV+We8G~,K9eѝʕk UکCW(,$,Nԋ7S>uD)קq//v|c^_7yLfP]ŗ8m%}wW/vI W_泘Xu^eL_4ı0%\ SmOʽdb,EQ+/8ݗazq$CyX\_/_?L})zuXEzeFiO ysˋSnRRZHS}E Ͱ^ G: z"/I*,~mUB=i֟vJ^MXt}8bԽRNu{S,)%퓲ߢ ].?*vJE eZŋyr|_m.@_,&[+gp @n!O9DI,ڽ|IQ zCX 7hd%.+`?pPn(%_rs;}/h|Ф$7P̥H4vF]'A)8 (X->B xx>ENx'-Х57iʢͨmYWf^}ˣ9m1>#2g5?' <|24q!Sy3kϬqOmo{7e˜Dq,SRSDk>/W#@G}՗Ehs~g<0#w~gy%x#i~%zix%?s1~dA~@"/ol۶J^@5IGF0s}gUyx>M6Dž' 6? NY>4{^Zݬ{){(X.IW^YbkbJ:΂B=S<'LQSj#6 ougm mח)67+' 8|XQPpi/vS_)k+G"ș/4Y[q,o_W5:>ęбSG9N3~S`Wy}[Ze i¾H6Kiiw2QqzW򑭜M9NN;)=kwJ4}*<zu_lebsE56Cl njSst|Șlk6"}}p'rO2%E|rWcQJ>mk%Z kyzޙ4gW҆Ʈ\f 7KIkh]kw+.~KgW*Zkƛ߮EvDj1;gA兺GJI*- ©[-|3Anű7#A(YѥQeOA%-Lѵ]g^H'!:8q_vcLKx]E }8($5tܣݴO.l)RӲ%-&nKLzBKz?v\D]7X\\bog:%\YX\|{_ƙLaz)LBY(]ՖC_RG)wf]aLʧpM]#8Oٱ6 BθP _'ԧ6kXsq"E9e;l{c~lygg 8+ q屩e֦B S@!Pho(g`6+l_8u; >'y+QSRQSMnG!ys!P UUB( ]@ < tyI^akJoNçωR4 %e^Jy >y'n᫥Z$42&,caAcac12ɫo|^CBytC>}N<ו iͻ%]קk+Xa,,&-\c1MbumsCHe!M7҉Yl;sQ)w8VJ?cͯƙ>ԔPƑ8攼7+xH  `ǝuߧ^~.@R@!PlQzat=K7?W O\ω4|x+M۫ۍz2;OiN}X?ͻO>~᫥ZҬg6}n WbqEŊCqι紳z^ts⹮V?Zz:Y4qf=n=+ŝzAgӱ԰p bβpv j^[D%Fs&]h*sv7?JY۞r.B׿B\4rx0$4MF\{膆i>|\Rx54"&^]7\Y) B( -@^zkJ XH]pD[MUEW(' ϫ@w9BdE3 O_PT`Ua =Gһo}}ܙ_0'a znO?/ɛ =$dn}ZؾZu-z)|-mfϖ궜\Xs @;? fVtk nR:gj @?H`" ;Im}ϻW TUAgsBWD]{$UV=΍U Kwk_:f L# 5P W#Z`bثѬ'hH]qL ~u_@k1C @`[ll]ۭ}EL xϵ6 ~9  @ @``ͩ @ @x2  @ @zx @ @9@ @ 0 @ @ @ @'G @ e @ @ -O*lIENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/2d-analysis-image.png0000644000175100001770000075235414721316200020723 0ustar00runnerdockerPNG  IHDRw liCCPICC ProfileHWXS[ Л RH ^! $,*v*V@ؕEł.bCMH@}{'̙O*Cȗƒҙ:(/ h 媳UtB9d,ę9?U|Ro5@ij!֓!^*]3UpMRhPy`:A)XV l.48.: wOx ;Ћ  ! 1Gl' a!H($ iHH2"+ dRE"'Hryt"oO(RQ=E,FI84%h9ZDE:ڎD{0ib挱0cY `eX5V5*֎uaq"3x2'3xOWx7@#.a4!0PL(#l% gH$,sӈ눻ljmdDr"bIbٚ{4i>Hե:R9ԱTu u86-Fi%I-WK5KR^+:nCgӋeK.m6G=SRMX|:;t<% tn=11qѡGԳkOџ_D35,5gp!!! 2P Can놟FFFˍƎƓ67o(h}C&& &L6JMך4230 21[evԬӜa`.6_e~Sf1˙& Mvɖs-w[޷XVY5[u[[n]k}džlò٬9k6vms;C;]]={}$jkDC:ˎȱ$vZ60gdXTgssCh. 
[binary PNG image data omitted: docs/_static/2d-analysis-image.png (truncated)]
`Ha $ֆPSVVf{n+xU.LNa*m,ʄd}%d] ,.!6rT0-ʅ9Xkaa.FZ@@>r={Q!K˩]XmH|JB6 /SZ'#Wl֍Թ͸`kqlv JQ0v^ٍlDe-G^-CHv,J˳rKK03jz=X=%6!KgZ{v4i2 KK:g]8>Cc{%6Qbbf d%>._hZ+,Kn*ͱmՠ[%i*`TRB  qv<̑^[g_km1b=%q1ϩkr+$Ų)Q:Cz(G`MՑԷ^6I<KEXvI֪=UTF;!qGʲ6a]ٸأ{'<*̕UV"f8j+V.^}>|8ik6Rp~,mk3!DӈG_>Xh_O3|lK~lYnu <|X(//zzS4|^<]H;|hX6{;RW.xt¼#?$/<W64Ǧc%xc-\4B~zI;ɤM[A$hvqC-.yW !v_v:ハ_MWQry'O^*Z$2$=[^p^!dW5em^`5p %A$mF&s(иaES8RHYjZŭRu[1e`څoH&gKmHmLPrA-(v;'9)ے4`~M̳[_hvo}mM<ӶaymG=" VK@^n$VƢL;ȚC^{Зd? ełc$ě'㵝V2`r}|Gy=&'mQVYۚVz9'Y'9uxYMmvbPP e6Qe~+sl|sssLg\q=KZDzwO\LVq(EPrAl5GS6IR-EfwPzO.zfcե1WU4zxfwuZ @'n9TCo5^mKzo>VaKJ:ҡti,^Kr|ɏG+^-F /\trhk]/__&|F!}H~Hp~[ƟIf_u}c?Ry-_Xk+\//|-y_㉎Sh8b4R{i$t|[#eCH'/^篅clj/nӷvн[/q>v5b0'"1tN]x"{x6478ٴ=8: r>k;鞌~D$%1l$33 8TnJWa%:xV;ߋW'ɰ`a;a[d+4_>k껭MF%%>E&94Fge*U 4$c@AYq}9*;wZaG]/i\~UiW)S\ @6)ö*:oTEeYCHB#H -C,cn/M_j[x6t=;M®΁ᄐR799מmlBA%uHFN٨^rvEAwHurd$  .*b)yƢR+T`>Ӗ ): o9{Rɗ.nHV,uEe@xNuGղ|+BI../%x=N77K ]@_hX-Q;gBQ%e T;-׋:ct/lV}}90 taڮBhy8nNxzhN{'r . jY@P~Νk'Ovz)[vX–/_m>NKLHmʔ)vYgٞ={t/2g&Nz/m~]>y[z-^9w߶m;߹sy_~vEߵkI_nƍsοsٳs:KW3gO\C# 6,]~ڶnjzիlɒ%nn;ȁE\ydʕj_p͙= rQ.b+))qҏyFuD<裆^x͘1)` |C}dvz,[;߿ߞx Cvڙ6mk?|>c>ڱc민/cЁOI\;>|p/7o!77N=Oōo>Y#qz% l?OW\~z_sm9!ctW%ϹE8fp'S_x1Fy1x% ~GmgLfgFoͽ9Zѳ{}xߞ6~L;>d3<|ڼxΑ6Z>Mԓ`$i " i*&7z\cbqۡd-!fq"B*@`=c\,`KO xϮmnXh4E/fmh+_w*Z@U}{9X/+CǪnd3e )@(9?*ɲ{4S-6Y`nٯW;piPK}5R;.C IfNߩgp'klslHVPwhVe)&TP#%&w^XfuI_k[f6v؛gXq[juoh!{&%tc|55WzKrzҩM6E u.ke\q?DmPJpwb]kӤ:]-7H\q{VCekeNLۡK,UŇ ɬDq>Gg2@ǦkM0\W8}s%VxDa{*g{EqHb$魁VI&b',nfƂhnk֕wcϸ2‚=q^btX{n1Ƥ}m˖-/~nFhGl֬Y-P}.+s[^:Km„ ooQx`W]uk @@,?O#pXַIԀt?M7b 805?BLYl;@IDAT7b߁uozӛ"p_wnv0Ϸ~a2}?xJ&>O^zɁrUUUv=8 I?q_/6GNjG_җSQQqȄY_x… ;??9:gBw&>`~ЁZ}C>۶}ڧ?i7^0u:?,._ '2mbD~: ԥ g]weٿ뿺?&> ַ: D.ۭc^q#,%6(_+OO<Wѿ0|GćqN&tUoҘ%~N7)Wyk\ybvZ#aԧoݳ.Y~3qzihhp@E>pH}vH5Qx;`_~1;7~pww}&]wu=d}oHI;ܓ;@f@>at1@j8Otp&{s3Áp@ CӆgBa}g4~g|hCɤM?F#|{]&mtx{98~׼~̑'yg+t5d=tᏹ+<' ;xE"W4HR 2vx|vcX嚃:AeTfNZ#(SI@2\ P)W'^,~je4Y@J2GuEmT;ϕo[,ɍP VFOO*IglWī@u(WXةR1t]A@#Y-Ml[ߧ+͓sUVe D6YYkM[z99km\-WVwZrO%.Y:: 6vIY\leGf[YRBYmɍ;ƫ/"ȬQePT\Dkh嫷E `XQ& >vrKh: k:&,jXaYE &1X`D҄7XA1ca  >,cXa>yX1 :m[XRayW` hE)<Ȃ pޙ0X$t䣗e ),B@,<." "?.]{x e_@‚B'|[<,UrܸR_DvFXXr0hÒa|b5߰()-@HD_;7-@1Af brq/PB=V*wvzM:LLDcMF&VQBl| /{Y[o#GvFVxGV%,J[hbâ$0ۍoem{]?>?:Gl(@tYm:+ I>}X%w&5`no_ϒGD|:F>:ۣEgA2;{40 'k@ϙ11 ^-6܇^Ʃ `KW.6 2fhNW7HW KKN]?KyeDO-CpF z\k/K*# ZRc:duB Ѫ 2):ReA|Igpqj 0w\ u}taMR[|,kv֥>x^O)i ҸVֵ U*Wޠs*9U>f *n2%$G$oGQE4K')* 6[Q+.vW67~kñ9m֞C2ɸ\.-7ً$:& E\X,k?) .0&`LH XJgBD ,keX=&aU,pW]&Sа߹gAHbя[:c&+ ܗevkd` |? Y!XjVukEc(19OpGCVA>/WXyl߾͹5,X|cIiOy׮upztnAސ)%92LI ӧN) J6ȁ /=h 0ov}~sss,S`:oFtGo­h26 ]60﷿ΐ_P$f|_7 **}_=3䦿ctq?V1t4N? >%#Mm`Z st 2\ss&׹O7G1 xFloh3sYg$,-WJqY/aXAzEZZg-)}HK'}'@?9޼\fr{-LЋ=i4b]r{c9{mG;Jp<׿ߴ}(iZ^(;wj^G;r}?WӎmL9G]?g̻&2]y Pf 16 `AFpޤ$\G: `L`Jez]C죈 Ö5oW9n.:z)3v6{K>eV2.R[+d.FGDU 6As|]kWMHO)(Ł%3T>xeudV-WiD$YV`g#xV6W`ҀnJ%1W [me3l۾~Rlt\-͹>xPsb|n^XlNv V 񪖬Y/ްvKQ[3d=F`74Y]~L>_Mɔ.* hvKXYX:);O컜~pQs*\z/C1rWtD )}jTe11(C_%1[`y@u!XCkL &XXaBE-qV"{AXg`݃] ȇ,֡`, Ӱh`[!̟rX|X#7Xcp e< #\~X4 :s4 OR}y\*cvZtb5Ƃ0ve gN~ VZo{ۜ|.X$@u78K)LpK²f  ^M<,̂V1_]Џ7L<x2~ |# [č-x_b!?ud"\il㦅*5/p;'[lu@ +an8~iV m3 1NK:ˏ_cwk?Vf~#KOk܏)2ƽ +M(~X" pDbP%pcg zrX$r8YIF.ﰼf\s"WUU/GRquvQ]]~m,7Xtܫ31CA, `<,BIğ1W9Sb#W N3#R&VpHtM=p?O+y~9Dو~%͋-s_/? ɃyF~4n#HτǼb^9 ;#t&(!_N~@n}x8 5v_TFkT; e* HYR |rr\̛csR)ה"*]"BI*.K)Qv6 Zkl𴪩ǖůjhV[ैYKS*li2 ۣmC6MB1BOJN,[{uiu0nv Tg%,ŔUuROh}#@6H`'}85U&#j1Dw`7~yDZj"8j0/X ( &\ѰvTa2XDcu +b!XX<0,{pР-+\paR9$h~PBl! :\̂` w3\pP`̢(,o];S.pr)z_@xcA dE! #n;"?;X!31,Λ\[mu' 7mWx@x}HD=? 
pƵG+`/X>`FO-,<e[1(~Gx ?ұM4i#}S[Ot,:u PMa 8$Dy+mHudH?^h0` 0 ZO?O>`@1pUUUQ|Yџ$gGdD\g&dMwdLR,v̫T WUd:K'jA Q `t.0Xb h/[phz_nrX"j$ӦZԨ@& ,˪*:,ڤ|aA.Sg{ǹ5p 6œ`F\ 88V*]% (IrH>iuVsmiGzѦ%VWTM:˧u+鰚*/$v̖c?b]1TVXn OXq/ZtU9*0\U%iLh@/J.0rfRF @w#._+'|-ݹZq!˔@.AAƿ:/@@]YY^́ܬqe(+@ bŋ8vr ry%p~|"jy\^ 5/`(Gn .4RE|\u@$;|R_eQ'7@C| @ @G4_V,,\r)%Nz4dQ͓h+]'-{+*'m йO!|@G#3Ȋ*>)>i|?! ?|yZƹG\_ʍ|2,l߶=@_Wh2e}D0ѲԹ; dER':-'k@q^! YIЃ;IY@[rJwN?P^VC1h,`l lrdt$Ս;G,K f$@ 78;ι/5 1EȊL>qҝz]l>{d|@.yޑdUIրA5qh@d @2 tryD'Iyg<Ѥmf?{GJyF.=GĻ~0u\?FI?xs078Z@Sp({,gy t>+9)ҧ\SEY zIizDHIVsS*K C &ꢴ`_,Uy)Te$%,'8#,+آ]U9iyi`nzr `.>'=X(U Ͽ(-) 4 >08(5xӴ`x*>tk7UeWW3],H&Y;*וgW)(oqrV `d(#%ڟ682=+~Js8cN׳Uraf)z f&N?9W"tVnnN0Q2,ՙ%j-"oJm>1M^"M] "! -YN#@*&ZQ;=YfбXcn|42V 3c`%$%$Ȩ3&܊s5x_"A_8OM zX`u@’+,#e Xz`)/IUUU|bMuԩSmiah 婇e@(g債mom]k GфH=*6$|`MX( [\2ՋŝbIwmrkܷX&hxKbrx !,~0#!3ò>I)G\J75`+u|2$O GdB6D?#|Qޑ*z<>|.4FJ У~>zR:f !/C?C} #zd=:"kDX[#ʡK&2VilA=^g%c=8;:-5-|_;7y=7 }p?7OFYcnq~1Zy=>Z4zۃ:!C"(P&t+uw@ rVCoqX t +U̧]{%XRr7a{k\~WkWhG>Y&gZV_ۯ]"GJd3Fb7a#*Ѫ,H-u,P0 ﭒ%n{oG)v, lhjGP%MX\U];mEVn.}X-zz=:_iO*H|P_sKm?i`ݽ҅"NPzJȺi\'7@<瞉uAґ˪$/ NY~vd/g@Wل!<p*R#Y&H?Shs^>˰:}Fd_tNl5DRU/}ʒ %6- }֪0G{u.!8wXx7ɿiM%Cӌ=]Ļ/o$ڇm͑6Ic%-|XeB`:Q9Rz巑h+sF/)?#0Ÿ6yX غxyJXhYtX+OT8Ib$^ÑW>^^<;+HO1u>8;S,d]jx\;Us,թi0_~dhwV6(B8])֘&йBV{b~ *ZTjWT(~X# 7?@*#FQD* nBx,TXbruf8[4 *W ߢ](7H?2vwfZ[ω|.<{@0ءh5Z<~:WǜX@(W!=4XV;dv l[3y1 \#@"!1{R GS(|͊Ԭ]Xg"Mh p jб[D!Xߐc vhc/TP%@l+Z*ճ.{^Nk堠3Ee"6KwL{z VOl1,rz|@2]'f}'Im65;͒e= a%@q[9XY8Ћk\$鮁Hd!_؆XHtOf}M~G5RH:hˣ)6puczǒO2zHˎT&z,KaHGS/^lsHaZ?:($H4?Y)L;^GhRhxz W /2Nw8A_8⊣OGz\O>tR2lt 8syjz YLu~6.H%De)EruVOA_HyC;޳&89S̽fѦ$ pC*qZ6*C",Ovp50%qN^mȴk@ ){ވMTdس=6I2|&G}]*:rI3o )ޞit,JN|֧𱓥&"mXW;`.6?yٝ?X9 BkŷMɶsg=ɢKu>yq‹ڝ[:sewYv{&~GvM5r~T%X!'gs>cG_@ E0{"G4i8=qĈj8Y\:ri,h _B&|)84x !F[_ v{˖=S_rӟ|#6{3v3bf[;z.끝 ZȰ^" pR'b!]B(׸,]tZjٹ)V]HR !prRy"w<W E0$*Vmʛ*Qvv©nODg'W {;%V-4Ay2fhQZ:Ikr옷[ $D;??.o%(]EuUIg ˵ Ҟ ,J.:l$e`j WU,E'řGJ*um$noN*H NĎN[ >hqjgT '-$`#1H%|"pN$?wh XErLvIHz W&c9q,ˋ|ϴ}xNM=`,wBn^WW\;0q :ڀ3?SY10"b8@@H^H[@Q rY%)Vvc]-  [nb""c SQbڵg~<ڧm@1!F*7S.oR1gIU/2_XhOﲟ(\2]pG/K>vd 3=VOoUt"7K`F!qw\#`hiVjRr_9Gd'yLfq밹w;O4pݤPJm) =W+~XDyl_dA,ސY.m} d-b{ć8>&>ng/3Sb92ӓng\k}^p}KîZxfeeY~~!5%_D et9:NS_IW&yyyw^V&ټjfgfdZzϰCE"%4@Bx H@8yTw(x|{k^c7vLZEQ(6UM0t6(XKྶ] ÉԫY#x="oR6AE3eT1v4Hq(O:{lku(`LU/Uh(Cº9npXj^%kćfda,S\E`/MÝk|9B@^H'jX뚽<?X%*+}{*G!T)O_"nW)x}|5:@ P@@n| ϐ;s")CcH8/ K+cAQT%7OBS;La,(rMX`n_Q ]r+ &2N9 ɇXLaY]i,peHh SYjoX^T}=xEù z; k;~SO=erK:-,$`V25Xbqnx,> PE'x[NŁ:`st3UKݐqeG )m؁/[9;ro+{Nv)~U~eؽMQ[K"V+W]vUp_x\awUj+fէt: اr`(4xS[uhGE(Dl/1=,Sl\Վ2rՖ b"1&cRUx!6YIm) n?OGy*^9A':ވS%k/=sp!P@| jbp T}MT*ktQ* Kv\$3a9)"qD X&rpEVyP`c."#z% $骁[te{,:4h5臿mӹܙ&W Hh 4NU- xu|}\!͚5n&{ᇭX`zvl$HHXQGHh|< tǠk\@ 16_l*UmNp5&Mp6>ԑ.?}uqeL@ސ*'L4.9 i5 zY'bg @͋ߤS[4,, n=V+C}OֶfY<5Kē[,`n3,v=0CoJ4 <8ٖOwVW>$MV):d%e.x \-k_r XhcLw벋iծK ??Xh48XFp5w`, =$-F!/N_>|<_ޗe}p._7L/\ޗ4}eӏI#wڡ>0=QyRG7._};L=gl~o:-1=}͟ѷc#\>\ڞN8mYSaCNZ-S+%`\s,LJ/`6Xk1ܯ{~γ5wn<}*!8k^`v$nq>-@Yͪ\i-XdT/1ĵ}rWpA^QŲ`)R2c{u_؞f*^ :k,pscĸBz!x<@NnoT!}^:]kCc;;qâV_)*ӥ:nwG JV2$IOTp{fW] 2AKVKwHM⋚V:WP//RDJBS Jz+T@H-HcO`ChDyySR'/p0A RϏMQgdd=]ho>֧|F#$VX_9W2quw/I~<O=[_HGh>Gsm/X&0}RƏ y^^QsRxqnk댺ǛQL9Ow"ƹ'&}EL^~/:,Hay8-ONIhw`1~=m.͓O5 X; gʔ)cB{ԑw$={W. 
wҢE쪫|Ж,Y.a Xa` 1"+`V@f"X'G$#Ղs )f͑p7˹͓Kv 32HXR4k]2b׉Be ަ 7%~\zx+i6>;Łvk犝$5rjuvu3l\ˠԡ5'-g \T{;*"o'E@(W7kCZ}Qܲ4k:`-f/H&vEC!4XԊ+'ƆGTԍ4 WN2]J1X675˔Y8'^s 0򭨸moLPlx*,,tAE :nauڅ.ƍ.)&/EEE%[AAˇA> N^8@= <tk%%%nIǯ_V\RA[ 666Fy)..vy %tm1_چb:|*, ;ޑ\NwfK"5rp6rtuu , S6wGD"okKuvu 3Lx?O>?~||=}i:Dϴxmihhpd/1d r&@}8G^yㆾDv͸smڣnmmȋ }b!Lȉ#Q.ȣ?ۏx$5@!#H ccB=^ ::k.Xn"m 0^kF4|zz^ηo`?$ h M^ty>YJr{zzxwۏ~#W.C33gδG~NnyEKyh&8d5P1@<j2N92!S}2z|au Vr8AGjuxZPbAv q͛K'x,VR&1X.]$]V+Rq "5 }VH}5/\Dšwh[ SepZMҀuzܲ ۩ZZSr k9[@f#?;":5 jZ\Wkjns@ .ɖkWj2y\7`5}60JD]že +ad+V؎nD(: q8%CKu\Phn/}(K|gE *_QJ/*'DOҭ~& 5wIt@hɸ }n c<6/ŜS_BSVvٕ ]L|ЋtĤEOEpXUff/2f~Uܵ+W:,֮] O~2"4?/9-‚O}S;#Phc&L?._h~s(0{ڤ7tq/$+vcUzU4Wk/y/~Orm|39$8>7xc:2'%6}{Y4DxG 4ut@ "7m-oyGǽ71?˷h]zK˖-}kyQJOo^{qeq˖-&Ů_ǘ7+ ۧկ~5>{Ø}{_>hpOCg?sm|ߎzṅuO wG#Ю~cq6y`|xy']>yܧ۷ow>h'> W]#曣?x^Ӗc-ƖvHuxO|?gh';w{qis?=ڴO$r2vsM8G̳_'tB;N;gwߩ'Uk.>x׻TUU|;{7:?޵҉gdbĉѹk0hT .H &''\I dY  +I ,O *GeX0]IO nKӒY)8״`P/Lչ*+՝ 0 h_Yp>%'AJA&-_Q\1-U085DߗWM櫍OL.59iN+3SM!H>ߗKGGl֭[M/EY2F,X_:"Yz{^K_zaZ#V;X`GG5-]L,H5K/u>>;jlm&ľn ]Uu/ I CB!aAqmW;XV[imվZ'j[|YjV * 2G0$_}~._~7d:;s>{{Zw99|/̕X$T'*T.>կH*6%iFk,Jlӟ~R{EDʕW^Y>T^8*}M|sV# .(\JLU.:f_c\;Cּϓ-/ė`}}r1GW^-oy[g>:0-97Vd=#i5%:1cBr]m͠oxigPϝXLV-a0âm3(/ox=^hc$WGԼ\gF u5u03~HR5/oZ0xcy#WƄ_WY,/P3pm7MT<"#r8XT+({ߦ\hyj1aU{C߬)s@= 'L U\wF>7tVQXSD\(UU'sue]cZ(v" VW?jS?.GC Ì@ s,9l-kIT/=q աE˗}xY^尠{=*6奇PGn~ҤENPbъ{|zhW͞^&O_8ߪqfXoLoBOZZ}=u־H8**pSh.7Ojp[?L/o]-2<W,(GEDZ F3D&-Gmi- 9t[8"ڰJJ\y[$tusf/~ lb-n#LN9唺RQg 1ͣ Aѵ$ğKh,+̛ N<Ѓ/4Qhky-GuT,@M,[VBěrGV,~8tgT443˜`~[|,\l*&22gQ0QYт@N:Z&Wft̻TIAV[+4PBQٳgM NG* YR= `y{^^#uExLJ͝\\SkWy䥺PP#Nj6-zȡ/Ղb*/ 8([*? y*+ƍ6p0;`Mu%G} O,{$*OHƜ2CxҨ}{_6$&xJikQtuNЎ{&hax;Y.XVQlC/nm>#< d*˳J~}Ե2> HnLsf3l^c|_~XЦߴ$ `{ƘՇ=5_ X{HR*xh)dm0Pd^1!QI%>Xԇ]yZI޲ `e5=|4d2h[h( wו^DWZӢ /B㴚8#~Uc6g 5a 4 %p& M8f.\lа 5<ZX:6a6 7!-ͬ0;@ o66TВMMͥ  Q&~m)'ਏ & U#OӸ:'??[9=&5x/1XxlO*K{ 0 1[j*?7'K rֆ3]'ʳ2iT+S鍻[$;8$e\e[r1Eـ*2Wh)xچ??UUK{t$`Џ; 7H2K3 JN(!Р7GE7|A' hU(8/nii {iWp nek+BKh<˃/Bj+m',-vI+,i=-b/q`VD诈N{AuuKW[TхyVpry %d|黔9,-_hY]h^kqۆ:[٥9Kb=z8@;@ 6|Q{(<.{F:@x\u&lTC;B6`Hۭk]pЋزߴ8P 7y]&XnH87%Х=onXk`balqb3m- :$A.` joڳ:nT]۠Y2?)c(nmag'M#ޘ0 qlӟt^l-4lx-0=YG'!cdCi3>چMu)oN# lOF㴐?s+ŒY_Ff(6O@ AhOmDMz,}4-N3ϲA gȂ+|1UYnʡ)Ce"^-ԁ;ʤC4{Ă=ye͢"M9A4~'| xߑ Y1m4LȗV:mP&zesmpY @B_ i0c8 @$'z8RXU?WA<~œIiioocS7}YS,9IO̙3&9M-`}zYe/:W_hg0ѠR9 GiCn8r CcV_֡Yc՜yVk'?z'?ɪ}6lOsݜcM=[_|O)_la͔hϤ>xFsc t^峣_]2g'^>y6xA} ڏ3'U=mh%!$`XFu7yp/n:"@ڮqv[Ye+)}$p풮v-3E_<0O N 7Gӎy~hYQ}Yyc-=S^ױENO +xYۖ{ WJV!Rn\qmCCp}3%>m0Hu3sЙ_?13wᲲmZscxxfDM޺fRĭ*? Ye1[|$h3 >o^Tx_3 lj]y`K vqW` qF0O ݿ4h{hy5勓V9irm%֫ZD4| - f.7Ll/ͺT`fUSљ3gV6Pq~03moUeY8lx3RPV6 `Z3vQ76xr/\gzȇ^l@ ^-&:jH\ۨXEi#@3pPY'+xL 2zB ̍f ``xSdW210b? 66vJ?.ip^iH8b\2bXġOGчͿm4=*/e}`K]2Gy'maH zO xFxnIVBpH(t_c/|d,T'vh 7e7|TMi!kic 33Ƭ'} Ǵ˭#:6g͚5R~o9Lx&c u\ odLaM'y3.CEڠ)N_?@yam,x|%;,)h)@36is H8z_$6J%ĻfK̊UC聈]tݯ!n $ZFУqt8#vė#]Q&}/crfL=JҪY" J5 Ctg~ _xqY}d~]0KgRj``$?'^{aˑ@e}= mTkj)s LpVT'c ʔN $ fɗ mMc>ڀkd cH]RS3a|Gꫯ*=>m'}O?0 >ZodrꩧվFFڝF'L? x)B=κsA!>cĆ^TMk>@0(Wԥ #%<3oC4Qf0h6i/YѼ?^3YWz hkmb9='=3'}uaE+vD'Y{XZX):p 9tKmKT11\h03uc+auԠ!Ϟ~0[ڹCiB9,4. kzxw, >Q/>B}]? 
ʘYQN54 B)Qփ_]|W1_wDl|y:?4DDwhu›V _~0L* +ʧ ɪMq@<%^G`*ت@l\[6ST+ }LHsDVիh!ᶺ{<<۰H\ois}vX &0y92i:2nG 6\f  cm2je)k޼ia}92,~\C3˕ϰmȿ^7U\vȶqUWZ[įr<yLogYf|@y80N7Db,Z]80 4>w,Iiߊ#itՊU)[~h͊[9qB/bVh9,@^:_10c&Tuڹ򰪼hڶe]-}l T4a;N*޿jG킞t?Lx_SM{o]#a֨@heěٖF{T@\7M H;hs<~N ]ĭ$|o ZYeAh*nLaT z\R=iB~аc5A H ?)[js\V0#(6|5lV@B>` Id񶥔3&PLi(9AzJ!wy/ڹ@O@#@gKxK GI u\sO:ϱ,MJAVJ:=U&%w+3e*m3>e*M4A,&/Ң )2S\[;r~rC ͬuHxs?{L ʔ^Ps YhƊ/ kc JU6Ƀ Z~bOf:5K}qƸC/Co/_y?&ռ/sl6ys xO3.'C,r!fIq,/˵+jݓI^|˫N:H<ɛoR&x8e!tn6z|jKXxbf0 ^y晵< ^y1酩SI?)4*`.-py?^ļ5R) JP t\13*}qylX!ղHcǯ]G0U/[8c9˄ y֬^TV,YU}X 37(SE6|^9S&VWW k k&}+*` .;E]&~Lv 0##@2[ܢLu1{UE_϶ >2̏/*6/:"e}2{ڊ j|ї;> ,ج dy{QC9xE|prw㠟L kX<8iÑ~a &8B `=Y\Nww?&AǼ+gyT3}8 {7DK^ yOcaq^344gSX''SuOMޒw2LӯAr2$~H3(Hn2M/Y͸:Ic7ifO^ƍvT^gW'7[,V;fS_?+_|3xm/<$H7,_]%eK I#c+V6w }e8'}&xE<+0LڹVҞ6Vy3_+-4"`{+\0Cȣ l F͜蛴 r?iNƹ7`f Ƒ3II GrmhQΈu@ ? m>p>-9FBk G@GFYhM b.BijV99n H=\F遒]jbxbBmeZMVOM]⋊9*xwg5kG9}&FGJmً{}g}+˴/Cw @lJ:+êmïEp%FQkt,{L22$2hB9@R. Y˗U(!HDh-A:屗H3>i{{{u_$y'o3 ,|'LǤy\{lmMȻi5ihYyo޼co^7~2mN^7>yTVS#gQ Ltc+V i$2vr\|XN xlWW&IȔFq[AcJupD+)i7&_ U>SqhS.Paoy(k$xE;##ԸA%&,/̵Фbz0@>Hkh H}jB9bp^|՜'lwƽ<" >Q1&i& N-1y3:// vʡ&<ZHǏ8r]J/u.aGma?xE 2gr`]D#dqSa-pC&hO P qK8go+sc Y~4ƴ߬]:$cKgUU)}A d|[/?[qmmJ@+VZ h%H8?>nۥ?)G{LUW > * $}@?!mOć{|Dsw>YD[].4J`CJ {`i "b&K9grPe>tb.4|_ 苃ସ_0)Y"ELO?Cѩ, T Pv9|]KF~=#bp!Z>H6AftKI8(,[Q oѕ՗=مߦ,\^piY*RE w8\򳅕Nj ˓pu$?@1DpӢ2mehܙFΪe0Y<4_O]e/>@'f2MXhpѢ{$;@/>2jSb(."m,d@lyaLۇ:F2ãRZ^fy\mH5!yZe:cYYD{KhߍI-/Z <^͙3~؂VGMl0>Fa*Bqw" +guVp,ARⷽmU*5kV5Wf͸H)ki@MiVU{gNFCa]2A}F4fD)|JW-7?پu~p`M|78S#ܔj8C3o 9-6i@O8_Iď2}/+*޽•U;4Ts>R_=rj9j?_N^v H77L% `[?<:Ӣ9vlΕt(5[GqwL|ti՜4+v{A< D{EFF [la"PAYyzق$<oUg1s+U2^&_>ѷ Vs@*s#]Q5 -+?:Uº?he@}QY*-)rsPWŁ­E%eQy "іrtKdԉzRK".z_ vPcWV{,4hiw3'Y&Dޑ^:̘] "jEBϟ9Ȋr9|s1háH-@ϼ^CH@<< 0m%aӒ@X09K۴ZjlܚK--hkPiC+V6 ,\qevz ٍyӱk6}W_:uj9S??V>gϞ];Q5@IDAT9_Ѿ+'̙3eH|\P\Z ll0*i^2ez~Kh@9:"ZJFUq=)(   @k[@8w*%/ciLϖD@WS9؄y6eN8<)_YD9ŏ%qq{1K߃cNjEy V|aMOs>:^9["v[J 7ԞL765n&{wݦ:ubXՒmu>*RT~{C Ƽp% XS D]B=,LW0o^Z:Q'fAYK±{XewgWLX@+hFbb> GZM)[o `EL(}呃Zy?J%_c]D6l!ug6OA ;M۩l |fvYn:ܯƼ3cb{(lrނ{=eb9wP+dP6@+VWwyg]44sqPvTYt9W,^|61m :ZQD9{e*#>ˡoQ. +QYqEtբ- {0 Z3cn6F_'8<ڬ)eS9?_ rϴv֪Hw]Ck Hܪhmʷdg@O`z_0!0hex[<4=n9X#H] ÑyM4SXKeȅ■^Ws[4!TDo>i%Jꮻ/_4Gs؞^d~`SN @E/lc5AVA˹_⢋.*~˟韖SO=5V's>N?42ОuuhжbR(;44\+#)"~UմYRqO`|,wֱmS0-VT3ph͈KS$/Ђ(kj8͟-<6T\3#M}5= 4yZW͑uĮ˔@ܻj 3c^77ncUCfM(Mӹ*>: `3/ E ߏa-K35@ >;#, z綉߲HORA֖@ۊ |4/k|b$Fb>+-"Yŵ' {elMR`嶕@+V@j^h'^y:+&6xi+w}UyߡVQW򕵌,[Xti5?qCP3~-L¨j XT6K &^D*Xt̗će`l.B=q׎ֻ8;'\Zގ, pij?@M4elS&1aiQ2CjjWj`8>jZ-YDSE㈳{}c >L 0vQNxE0$/|ܓE(*̸ ϢEqTwW/O]tۿR`y $u75䦪Iyemyf9Mޚ4y>fa*+yQΚG!XdڔЏZ^zz1][0z4i{.]%͸u=oءw~i c_Gf'y[ 7mg~~q~۸VZ lJ^f'KǦN;nf?O{r 'T|\]wu뿊{_=z_>8xK_Z'G/2r/L[ lJN08@J#$\@T\q3JO䅮#@iߑs>+`8=i11r F"&ty.[,Dh00[ZRGy] afݮ23(bx/Ȣ%k~bLΪreuh?1#ikbwWha-*_b+ҢZfTdnjxiiA$#3S -8%P+ {F<22yڨcb,vqJ zYy:][Xeޯ]e| ޢ`@io~v򀦯Qΐ-xK3 :xRgco8 wH^zU;!H0&6i4eьo' qYt 7MK9h6ywt[4B3T~vײ2/Zsiﺨ/%6q$A Dpƕ07}h`$Jq֥$5u%/ƭ6lVn`}_Wl-/{qWnryUǚtPy^WK:K^o}g݀Q}3ɟ'˜9s7o}k~WZӦQ, \PK7Wh|Lg>no8M^',<Ɵ{U>2UW]U>ϗ˷y>-^}kU<Cŀ7kőGYZn՛*t`˪ <2wer76ysj*Wz!ձEW#_0O]^n|~[B裏7L2Pg/>ϮAkg@[Nn#>j!<)Jd-o_P~/Ӟ%,iŬ7o^$$k`13fߨhx{+CrH8Ư_w:~mo9ۿ'5%O}S[nV_;k!;?! 
_j8kh5{gE[|\P25󃠝k:oϿ{W9Hտ:ó;EES)q~?E"k`9f5S1U?!\ѲW:ߣpkʧɷ񆑺'Y <˭hy^qxh噏w[m*UK:zúE@/>{S_/Y>YڨDgֶ:Wnl_7%uWS]ypcNY3*{$F1|}@&oFD<%*^@39ҡ니4TM]/IC I?3l~R{E<#UT5~5h_ãzFaqf.qt8u;~4Ch~3^Q8-;0Xq>yN96nw@|Q~(%ʒO^3N+d)> S7򤁦ndE!^}#k-|'@Ѫ.uĦN@rO^p$u~&`A#H'6ߧ?>6#2\s7v~w//ǝv.N]q'N7w5NNKwrĵwx_u/Ąމi'6ZB3min:,aěNlN;Ǡ*O6ȫ'ƿMMD',jH(Qww\rI' -ej:|2y;SN4(G_}{k,o~+=tfrMmcޓ6k#r/~LWUnu ~\zg/cbމuҎci/Ov4oUvoxj+$vuqO;5IW:@N^\e9Mo;GqD'h|ow©mOC@P]$N>>A/*晜KǕѸ\|{&S@9uV'sP8ƥ Uxլ+M?Nܼ!Yg *;H״cgwUȺgنKtacQ]8^  1:`^'j@aI~4Z7Oi5 >`E YsLj ІvF.Px,GT;h k3_!Z,I.x3#͎__l㏦}h2/o%wUuf4VT'5=m)~'hxW\^iUioClQe3k֬瑞.R}ܜ|jIKk[m\]Ƨ6GcKX;u gU)7bh4hW1%юm&{ULIh'-WUVU-Cc<@lՀ$cR]߈N ~dt Ojf!웽OUp>ޠoƮ㥝3^NlGyLiZԏ_Miޒ5İdrP7n#YW3Bs5LsT3).8^ݬ9VoחxiKޤGz:rRƭoYrR7}%ς*-+r”F 5H)6辊bs#/6LFhԼΙʯJݜZO0ǡrE[y*#z!Wmh@Kzfr0@d8roM~zIg>+\ Iמoƭ}ui6y4Z $VN* ͗7I Sk  bfF4'@.lހ?6BnRmd}֢, }:85`SM:-lbaT(N@ M`kZBid a1 $irlZ/&A0=`7eؤ7<شZg~VuUG2 ~aV *@>Vfu1~66lS'r KHX64Ț)9&ڽ۾.`/q׾\sSqӆڕf> k|[G+<>T/|J7屦̣L2P~d@Y* '(`(̼K>ڎ\sM @çiyw"6ڋZP7uج ⏱)qb!6p[epY8} b_rI024C3r^='P[iФsL+F`9O6Sr۸KH BS@jS|qMG~=I1޴7yƼKÓ-gحs@-y /+dՂ?W}ܳœf^9`sՏg~=35y7yc->sX|GJK,23"oka,j:]Z~gme.F~94IDI Rٿ_x[z)ܞ,H_%ȽYv{l%%J.8e!xbra .be t2ycH &9`1w+_ކ*^>lm<-L6崪^fΜY76,h@$_qy9j,2lmmcvcqad"|'xA(B3̆&-|?q/ o4@% FImr=lεͳ~Pm27BC,82@ {g~6ഩAV[Idb3!7ZF6H}EM^X[~cu\A.Xr0N&\;+哻>/c*8ƐM 6i+",Д& l$%-c%3?ɷr\;ZXhӛcI>&rB{Wd:jS`uyu&@3әCl.jYۊty.&m72rlf@GQ@^*H@]|3Q[ƒ>i;m {˯iI*X3&+nP6A9&UIγonN|f@nAâO`}dmf&?E#amsm8HJYܟyh{@5gr}&XXw ~2O{r$}3u?zsܖ!W噑tc+Vc@F{Vbz4 5$B[Ɵ7ULjh\|D{k7'3=0:lM6 ^XЧb#g3kcHlҀTlФ?Tsdմ`@F0Ϝ0}`c?e)A`ksO80鴀_T- | av`C9BEk<h^ڤH8]W2cE+וnFMpbL2wEwش)ְ^g^Xۚ1k`7ZIF+eH R5k! K^{[dZ$`<ړhph:ȷ\ fۦ׿ѪU}jP&vMM$Mi4r 5@T$Ib3lCOK4m0g@ Ƅ-:w6*:6M9zqۘ i0JTlZ1éy9; VxnPxr3Nr7&֍:PFS;!KS+ =< `pXSJK%YU;2ӎK_J:BsE|i;}J_)!Wr$,~ɐ앭ƌEE7r E;f" 50 4 r񽦺hY[Ƙ4,T@o*MVZdû1fle/ W#"+- r/ 'T9k4iV/6~c |Te. h*2Zʇ#<rM#לkcŧ1&@smh54F z/0KpaJBo[Cٜf4;~]&3Zf0̝ a|O1?2lh]àmγ^Y/AVZ lȹٜJԏٷzZm/hQ(Հs1h^٨j996B/߇> Ē/˸J % `& k2y\1[K[ h%9K`, $;0(}~q/.^i3tohzWN&c4}KtIfнۤK4<׃x|ziOߨ[xG9i95OʣEgRڻZ $VBIϫ}MvJG3A3C ̺}Ȉ4di˯+KLH~E d?Is̙ŏ^uUdR!u{l%J`H )}/uvRgЙ1y=~--D6RmXVw-mKo%J >?'`viEJUW_}?4|,p ?EBI iX9/>C-4+O+ȾF{Ǒ|i>@ij[ h%YH`VNy,j+aa?lr݀1Ƣr6SGͽVEۜtPwK Z \%qߵ^[+h ~U4wʾ;k֬S- bo_mjtmn$L Jo)h%J`W~ba,X =aӫƱv m~!psEY9n{0 Q-)hk Z $`>\͟??,^xa׼5eʶ]͕Լx~=oL:+z׻h?(Ӓ5ӧׯQ|N^Vc@}K6lXpa|(q[VK`T+D9:$@VaMڸ{`h7ͽ1ou˶ںۦ%dmuoJ+/'8\_Estz;YEжuc ֡Zw#Hy[~v݆VÐ@ܫ fKcP$G+o~s׶N-GOFB+ͿuNkbO,{M oǑb=c|YZ<*cgrNkhiʶ4{߽\x& )י1Zf~ ڽ_zqu؍ӈ4^y *#ׅ;cHW8um ~ԵFgyVݾi{5ti8fR7LZ h%H!OU5^c%(e~s=Hȟg_|;ՏaXg}vy6*ȵ'AkӖuXw}uku!i%I`!8c350+{|<9] +L ֥^ZtȳG+V^u ty? uCY޸usf.!Wy imG˛ֆޠ45򺙯6?MZII.7-罱7|q闯<$}lM[Vk/՝wYoU`/xpe +n}\=|򓟬'>2{}ّrHj,flh%JI@s?,&;.:eJɔM1c7N-Pίy?7a2iRf=gws_4]Ϲ7dX8%2+3AE|SU\y_!C[]m&/ʔ>yoƏ&/tă ɃSod%OFdhӕo'o4~ıє SO1uPq4o47tw}pm}CY;iIYv{|%g]d0)'tOAf|) l̃qYcfk'}<цV[ {^06hnlWL9xFYgUM9JFu{]5AUr̼h]veմ$[ h%A$`2Gz;>|p=s;Pjc7xcy1S]ve宻>g̘Q}gT-@HZC!z(m8|3Y;^Y>Ҁk-|yӟ^\|u1`٬XtMu@!kΙ3piŸ/2Sn2[nm2E 1s*Jvh.^x‰_?O$w\/G?r|ӞV'2^\6zO*GqDmk;X,m'ʻIc* ^TΝ[yR|G7|<ǮZjIS'S:YiSoAB_S~1<mMگٮA-P8ë]>dsOJ' 'Pv[]Q]2IC{;\F\wuC)O}Sk4~R?iGh7Y/NƛvЧ }{M}ϾNS7ɬYj{&!za X۪LϤ?XJ(k2]9[x+駟^֜kO+Vd /j{q1WUuܜ+?u=}|o|'jmqVǷY<3 G"J!6/˃h 7iVE fΆPz '@ 񒗼 dPP/,+_DAY$@]`+š*AwET * * (J$_>=5w0=tӧf^:u_Ts},}Ҩ@IDAT9Cs[oϽ"N|:_]ʐP pobo,e$QdWղ@:f.8lͷ4Dȏ=9yN1n馲$wS}bepqN?馿Yꠠ7a@roA땯|eMEiv?яJ Vp/‚%lxы^T9l׿ސ殻Z1?OvsO.iOVst '4A͝vک dC;}'=rۖ A7 7%XM4pٺ5{kdS{q`zꩧ DY/}K_xÍGG$2(_] iV5~?>|9K?2K[t|ٳ9_8}_`:昣l) oƽ` "Ǹַ[}L_#p΢~86ςDh U*|xkC%/yIe+:$FcG|'c&2f·qIPtQuR8%.u]E6CG`^m[#; zKccN~ } s\lw>^{U|wVp?CYk/(~T]nD` 0XI}| (-]%>NG?ڭA>Bo~r\8dN/Gc| Hmwm xoǞqs4x#?вQ"O+mEBʴev}.ΪpX"J-`Ljmv>0b=f[`'J`=d)%L{9eT x%.Ǖ*(v4q` B ? 
$./.ڮҷN"u+1ਫN pk^8ËA# 1!)~`b.%@wx׮2mBW{ okW wP`ۯd"+ѪfX>]ɦk@>Zc5* }Y;t@ Ms= _²Km\ga*xE/vd};r ָW7LL% GWpC["ov *]vۭ蘠>N2$pe"`#䞣>! >x%ihhh& DOvKO7A($ƽ}٧萼08K{]7}<>lAsf|r]0SK+ȴ2JjF|1"*L=Ih>f[ワ~uD H,z&{$$vxٲ[BǠEb{޹&e?!.}w`!@8ETo( rzqx/eJ^N@w%A#C0FnθYRַU/~w<2DlxGjY<)8P.|8Kx<l|rѷ+N9GuT qrFmvh 2crx ~^y%8v8tSd]$x!_;e)xf *>SN0K~#~C>vsGG4XZ'ߎ!\#w+dOk?Yja\p®tUjq5,񉆀lx2`I6b/N| P%(J.>肫 8v :D0yeMV@|"FWGnWV? ؕFs[7h"Mmc9Yɔ~_]vR#DNe lh OQ`.#P;yrlY#d>Mm,=dC&w>e/+}`"Y.t:w('p, /xAmɣnxӃ ·u+PL' \'2+V3 nJ`q;ģfˇB8 ` ]Q`SFI'pGp$UIB`V_mRCȍCs|t$$~ىdR_d]zW;2P;[v\sOP~;uwfoZ`9Tù<ڣݡ?d?^U䛎G/V>ޜ b#Rȑ3ß./VJ AsI'L4>LO% t54yK^єI$gZzˮOR }><=I@kV(ӴE =qŸ_hnw,n"sd/C;Yx AVM'3|YLڰ56rHqq2hX< Ii2xFLbn^mS Oȴ ևڅŁ؉e57Wy)Hcq}אImR`}ڑ5~cOy?őmW/P3r8cN4q(w=@HF7#ĎF'A@;cSpHЃLVkCw&.T gOvڵ2YuL5 t3[,`gEJ` h8K1t6 c .j}Y.HB`_f g+d,cٽQ,z@ĕl2sq$$dǍwpD8L>ZjS1u (u?U҆';#m{wNh`=vC |*0 a%x:svgΧ˽G\`aQ7{z>z<(19A-mq7)3)TH-̲p&S' uY$m}!)VS%@Gq鉄~n?1dE`Fpn; }wDQW#L_&01=FGԣMv 8 9i߆nPv o)\vO)X݆6Nwc6 :B<'#J#:YOȏ߂xvAzמ.hA;ȐȁXcb\9;, &}:,P /D_ٽ%u]DqV=pi2mfgU$f $d.h#MMmcxl<7h&6OmLz` 0+u \GXc+;ƃhI%mwbJճ٧7Ř8/J230@Eʒ> 6mw9awyNjG8Um@aT!d4Qq pJaaK,0O$)y@`V j.C8~ dǁWc\p(H38)J:[1|ؚAvO^tt)e^:<#3]W8'׶|++{FC D;)v4^&p Vm:KmD͵z}w_]q``w`vΡ_\;:p|j^U6ohGG}uu4}p 7]S/;m0RO͗4Q(v5B=1!=U'.c}蹆/K}i+\dzR6ȏs/xI(vǹ{(o|fAଟs^ ۯM!Z CխEߩ/p  )_S~4L_ zڍɄ]ЇM$g!PQQi;s@}mKz@"M,z~;ɘ6k 611"+l)<ŽFC)2KG?P6 %<&K 96wSs) w;P0FCw"ʇs:snѓ7#/u::뺜GPbP+Ӗ~h7 .:~#?g,=hwM[}'mP>:~)>ԩ+&$]^(DX5Me}m<.O/}S^{u[{AnȻkh+Zrj39i*+|w]y<}!jO$_kx%:i7 :ᚺ/OG[:~\TqOIs(e /'_M'X?;)Ol5ծ+x%ӿua"0qTx=f6K$@"01,zEQO @iԓPb?z;=ɡ6 eQ|̜v; ,;܏q-sO-H"AAgꉲ%c'.;_޻z\73H6ѕ!D QvG4mGuG~\htn[]3ȟǙ@)#ĂVE>|29MӋ&HxAk_ڢ,\kn=JԾ^+|Г ;YY1kAwPbs#ʗ%"Z?&Dv`6G[Mܦx{ZP-!㙞 TIҏ|0LnD HD h;NpN{,Iy &y~}A{"0,d*i'@"$^1X%̙3gBc ?0SHD H]cU-G&E(oo$8)G8!'ţ73s榚%E_Wr~Mxݿklmw?,fWm/x;>\{K_Z艬3HDp+"#*\#L+&F ƺL5c<ڐcR~6FF9*T7^\\F޽S]V2p^g.yk)~vQn^jԤzWӞq2e,YF +-elV^[Qrei//B/W"hDZwƩ>/w۬"RjJpWW3^{;+KE/cV9HޘĝT<Ӣ܏z(3|l-XmuWMov ^/NlԻ7Ҧ2^+dόF31Kyq׬͋_%XsV$@"ۯ_s"6 QX#_F`O8<p[@[ =V}\v3*G}=5~冩F-vB\_nNq(/5+*Ư'rGz&yjLڌu_r ޗ _N{_UX 5^U8]a^y;m*o)d=3JQoPu^DG(/-Iq{[ѮJnHwK*/P~~/ lioZU3n3v·sRF|{z`+3B&;铑 ^Z.me }#03K +074r>conKgrmGFzzf6j^7{/wy͡n^l0knR/m~7p@ m#efD HD H"D HD H#;~7~x3wfwl̙3y{^G?g?|+_iđaVh``׾%xeG`XnYzjs W\qEvۍG$br?t$εM{8/arTosi.{'+/;zÈ\C[;/:ip[))=Aӟ/oN92;"xu7'pB5<1;b߸ dD HD Hf ZtA{\!|nC=}݋a LͮZ_җ9gO.#əw՞{|_lfj/7\rI'KG2ӧIlAsi5W_xG~9B0} !sLN;#x{FxG fFO%v:~P>?Kʮ*i뭷nj2;R pW7'xb};;cy1{w_y0]],<)HD HD`azˌ _+@Oe3`xXq/|xDm6k6xvL'}%.;NEcW^yeя~dM~Y3gNsG4x`#ݭ׼5ԟ2欻%GdIa HD HD`#x,AuѸo9>nyL`l,TON#Q|F{}q/ݟX&m5^ `hy"0~bLƢE^n:o'@"$@"4M>BRP,>EdD0cUhgh +Z>s҃dY<+us $X~ :E)ƘcQ?)?ȗD HD Hf+Z=6KD/ƤL@"`R"U?9uΛ@"$@"6|4:LWl3,fC߅:[9lV.%@"$@"$@"$uyGU2,˅uq]~^:yh+kNhm}ԙD HD HD wN cir ssIR̆+9}*31g嗟6'@"$@"$ A` iC$@"$@"$@"$@"p?F`w`ݏ۝MKD HD HD HD !dktT$@"$@"$@"$lE X݉@"$@"$@"$@"25C:*LD HD HD HD`"D HD HD HD Hf!l&@"$@"$@"$@"0[llw"$@"$@"$@"$3 `͐J6D HD HD HD dk|;HD HD HD H@fHG%@"$@"$@"$@"V25[{>۝$@"$@"$@"$ A X3D HD HD HD Hf+=ND HD HD HD` Qf"$@"$@"$@"$ `֞v'@"$@"$@"$@"0C d3HD HD HD Hي@fkgD HD HD HD !dktT$@"$@"$@"$lE X݉@"$@"$@"$@"25C:*LD HD HD HD`"D HD HD HD Hf!l&@"$@"$@"$@"0[llw"$@"$@"$@"$3 `͐J6D HD HD HD g{キ{f+)Fަ'_,S"sr 4$@"$@"$A`_,PZj K2%=,OwD HB B*'@"$@"ܗ,0v1qTm/o~_5?z7o"0zYg\uUE5łoZ7o^s7]tQB_jW23u3*7>x ӱsPto,BoPE>Ee&dgMRKD L8pok6{ls7?я+뮻F3*~5;S?/Lvxw=Ug<ݐ뮻yk_ۜq}i ؉'\s5>c1 !yͼ;dg`U+B ;d%L/ ~P;SS3AaT_`fa??(Ž\Ms|A'EGN;Y >._,rG|W\YuUVZoor}Vqf.x@馛6,| be'dL&aLӟԘK]wݑq27na0mX߿;?ͥ^ڐQX.2E1CCCņi^moƸ$%Ȳ[4CmLm#ZOd,rwޱ4~<';9yur?r' #zlםׅKh| _8S_1}D`:"Pv>5> ,nym@]cVɐD`W+mH]hԴҏzG;#?Z[Fo{ (49,HOݏs2s 7 ɑnن^7E"_cܫR.7~:Ѳi3n3l[Ԅ%Neȝ=+,c#jtCWLGYn{{"x_<ŵEt :uYКHzmE_ p<"H5%+x4ayjR;TJa7#Xcӟ@yj+#mp|f٩.p̙SƩЃy'mIܕ/Y*_'v&"0RuH1D=%5(xrAeh۾#!Ĺ45^%AC3Ak 
_Z](Yg67;W|J}i#Ô]ffo28ugD X1X1xG߿N!${UL*s-+.'Bҗt] QǸ9묳 7#̊PkYWU2KU-^|;ZZg\guJ`Hx"yW7 :&Z|Vw#xIΝeh]k~S]: rlG0Z!e/Ɯj% V9Y=VS}Ǘ{ѾhocaĈD_ d5x6cp?X# 0Ġs}e>K[{\wyͻBFY}C6E_Ze]i^k%)ۍC/dh"N9: ntF_-)V>PXZ~IV#qSHI8RS6N!sDcs@IDAT%dYk~.Q!N +#ځׂwsD#@6aj_ QƑ+Oi95?z¯GqGGǿw<83hF#So4<}!uG-%Zʍ꾥\;~]4ˇ7GuiD ZA(5}=^dzsy}嵮C<ݼ?ʣ>u?hҽ>\}}Bm=Wظyǝe,9L_yqTH:-.˯hV^e 8D%Cg$Dpx1Ǿ187_6߬{]h}9Dxۿ[iY[j饊rGY?N'} hoz% L@ 'PtHK݂Oؒ_oկ~uq-#Hy9>St^Wy:E2UBsZa& PcD֍?g?#c >G s~!OւxJ (Y"ςsszqߧڑKh'(Zh]_Ij\Q7c.56 ƫǸ)Sl-r$uV5c:2i6JO'tro >H}#h2V>M'Wc@ G8H%t9ȄyL9꫚︽q !ù?򘫍v;8ԏԘTFɛ/Wxy_7 $ڹ/1mdW%F/t:tX@2:b0Z]䐫23TɑTƷ9RÂO}8_#ASR&&%Ccm{ګ056~Gv g3‹q$ġ }G+Xs[Dh/|ګ9pP Bvk?PN_$~ M6 [x6W+.Thkaj3>~'C^k6Ƃ 7445 >?x LMk"<%i㪼jQ |E GO.ow *%KnGXx>gKrW4*9f/mly|wȍ6)kW~ >ګl С[y9h{C&hnG>>٨YNk[,D":K~o(e)O~J!]Ot<7ĵ8v恵滼R)P2UrhvUE4Xd213u1NFe8l&ZMj.kUN.83[w}GV9~nn"Ġ(9 : SWĘ[\81&a #kˁQM'"6M Qu8Sr2m G+`Pcq H"LJSmwQ$qmmM N0F]#ƍ Tv=aa|I#XLj[a\'m0 xt:_]TX+?)ǐH &/6I'TUvgA7 #FrꎾX,N@9 q4^x B;f5Vs1E]ȞwYe KQ#~t@O$On\䩏tnuY2]\ۂQiwq%@M"qLg?mwm[nXv;6 91:n68tp>98 :aX։I x`:B1+](NP/P$0T!lY+eꪠ 6N紻ɨ|AYŁzn|_2M8tvA{E-/kZ ).~8ՠDt̚[k![JBl*i'V5]0h78` #t_ty-2zO`6 Oeٵ3ԉ]7vm CѯpA.7e#P ¯㢱v3okC 86# X<W$.tiDxJn䎬gxw=֖Ѧ/sѷp4Ę(y9 -Q(2mߕn9sGY ]V=OQ{A}Mu1~ežq")| )_]{з\ɐD`9 !`r5A2BMfb" 4OL;8'Q>4n)QJ+0dk=>LfCyV2ۦyI'Gn$*ɐaQ8q߽_ũ ʀ;i(: "FV"/Lܗ N1 ^=pgq&<*'ɯ,KA 3qb5"Ep8[e'3#i/؀R퍶(H$0 a3Z츱'uqdGy}JL.[;u@r!*p&u-a@ yG_h bs}9yK0hknb В#ђtG3=BA誺ĸ'C^_pC砦aY,8կ vpެG2K;8w5!#FgA2Oq£e3kxysER|/|`dԟ鮾q18C|J}@}8*%{h~ Ia3FCY1"Az@` C~IP~e@Ն2\+CgkGB8Fi;cy&V; ѷr ]S{Ʉ]EIѿYb| dkQ6 /~bmZm֗LNFK^16v 1ѥQCnoczB~A-7;r\:\X2GO~(:?T%nosٚts/$"+26R=t/U›\KɳsrDw×ϧNAu-:-[c"0ZeUu~dR"O?kKGtX?s|/7X5ƞnby7,]4~v$]vveՏO9huڱ;\w:, BS옢x'/?qi[<6OFvD`QXCdQA^yf^)5?Bm" ZaLL \1Ť32ax-L1( HLi1A% .s)ŗĘs`#c#dvt162QFD+݋]Z~mHuYLCCC(984:Jʢh/C;(Q=h|ϱ3`8gжێzqb0~(π0A)uQY#)hg<2JɻPYЉC X)hG=&L`DH߻4](?qdҁxUe g0rd(wvHZ#J8=p 蛉-;㠃'sLF;ݏsF5R.:;>䘜phKAA{ǝF#9]x}(*w8Zu? K$PZ`d-ٗ|NoO|_,xoSݭ>g ? a2"1vBɟzX6>=|Q;>+-8b7Y?[.#ڠv3.Ȩ/hSeҭ#s( 3[XA`b<1 iߍl:k,E#va8'[/xpBg[̕Av" 'thw뚬39 =dCɮGV/ļ騟^f`Avc(u4HQ M2`܏;hGoC{8e!'X/`)+_ uKi 5p#O?rXϵgf~uD`<lܭ]+umYf?xG뷿<=rQ;}Qp㊹v:lMEmrQ[l^v[kSD`25E) @_ f X+XP + 6xT1#&0 c0M[hE=ߔ{>23\c40xV^|uS7@p^mZI42‡qz+:188}CV][`tÅQ gM Lq?GN_yܯiFьYTxc2cG{m^s<w7ᑞ_?Ϙ M jrNȫU&R`ޏ^H~DqAy[z\CzO)? AE&[ Lb]I7h?:0DŽMRGm3AawAvq gũ=(n]F>rs\`m tND 4GY_ ^`2I8M{>ېc;9]]ZБЎt[0'@?X=G5я}Lӏn~^lT}O .KdMhwO=W=bF'tǼ*jBk ?wxcyʆ.V6=ʏmwj{9P[~sn[k?|c^a{kism2٩};eLh2~'>^hSc"B4}CG}t128R?a$1`[(O99al)AT E*QD[v#Pr{.+E9L&`͹Z%GSك ND8#7ۯ?/ pXb P ΠcΣ<_|;0Q07Ebq#o\cuqner$]sڛQŨG[ow‹᠂ O}dU <;811hno֣VfMpǍ[v/S0܍V<}l&:n ~!&Am C/ 8ۜF\(0$ug| NhG>ǘǗ05ab>xB^=Sh~Xx y\ g<6#s 8wx&ud{/^ƴ73Nz.=OU6jk6#gd9F.D]QwxMv ^h\:s=(DC =|Z;,q>lGIc6=6{I=2xZq| `IS?˵?C׻+5c;M-U EHw 9mP=Hw[A;EjEqY=oE8~֭'1Lȡ'],8xZ̳lNi%bgݡQ=Z=Zسi6WK鬼Klu_ih@ɷRp5OG-1vzjߘW`evZe[?H5raD{"|??WjhK#I`!^aD`=]X ZyWAo@~ɃGXcP@mr>}}AFtMOHϣV<^e.S{|`-k6;:Cko3$^^΁~d!}_.;Ν[rmUv 6=]u&<q/v/Gk/vS=q_R̓`(m{k#yȋ~ q ^_S Nԛ_Q<;'JW[2mknѮU&-J&;LX(O<IKpdbP>({Y%;vMpJ|ӑ]~wA^;mЊ/'Zcݼ-o)J%0ۋKwzO7C#O<'^IO>2(QG}Ћ(X;y< }_ȴz`w]C%;#Ef霝WV;{B&r.`a,-12Ѷ#HfΔjZ5_?ɴl?C7]CϹ2qMHaW.ۺEȳq\s8f=l֛^v٭N gb4b qL;noG7*s.Q5G~8gis!]u˷J:ib˲_Kyw_sǯmǥLd!0$>j}([*눱B)٪".kHhLeG _ճKiw Qr y[ γHG(X|/`l( 3-7:%Ow|O8xtNn߷]8bCs邧N5G] rь:hNHau_DJ䗬+۠Y 2KcKw\G aʞ_QIpﴭ>h`(Cy̸/VD?G2iv /r=sn42D{ 7ڰ_04Nʳ ˷=m_Y=e h iRx2a|sFlۅ:Mh4ѽCo'>yl2'vٱň*;8qz82 y>1c.%<ţ_c:8[ ǣ@kǯ/#0.ܹs;}L䁉 LjFF2(j?y+#"(d6/[R9uBC;Rjh/k,|7~ d\E0b;=; 3^hyjW ;R<Ŷoq _hqFc";/1}!Q! 
>oqy#*@WƎs+{0 ^~~x{zN;h+֧?3 VT9)3n/ѬE>Ok?L8UǂJ8t׷ebe5d80Ip\tNA/iC-⅘x2_Q>8v ģ]]9mz{ ?6T0  ׿y!=80y & ;rlpW(Sb OFO;ի>=[ qF;*8K[@]Lq1smAKKЗ :'3D;>D=)sɼW>2{/rx4mu>{s *In%'tM9l(xNgH~̚5~_ܸ۟=ۛ;f0#@y/0bL.&v7Gl՟zac޴wJu}OhB-$_`AsSڔ } @RePFZ8SLObGBJ2p0/ :hp3Ќ<~Q7lG +*/eR:t础$L0"ǍaDDO3`Ba%/X" :sX:Tӆ t㹐߼{)ߢ4l:#䥼9u`Hyi]FN# {/gQ('/ E϶HS+OFHOYgCz".G^-`0$ WQeD'^)ϔ5ڊRtK-3'DHG|)<y#ϴȢ LS:΋=<vx<`yliS(a̤,XS0z/5)e^HG t0Q~8xQבKh_")mE$zc 5'У;2ЄW7_: & xyq?|k Ià8 AKyyS|pO  ^V'Hxn-dh Eh 0ﳨ&-z%c4y"12FҐǔ>. x&䉾XRvx#~2DWi@Hy%PfOYA%J58wSɅ2 `ZMڠ/֤9p?+<61 |HAqa|LFe /c>NK'A68ϐޭmn{eGY7=7y[7E6똿^hkh_h_4ߎ<eyѕW,7߉!d!gܒБzt|QˢS]qW`of VĻ?図8Pўzˇ fD}A>^d-JG X!BX M+u_J+W>>r<5*oi9ϳ}˳RJy^JV..rEqEgIKŕrqJGӢ0{!A#j\V&hҼp/1@.tAkJq4-zgĕ{iw"H1Ȁmf0bx=1B<{Ϧ՜)@ Bh/6,_A /<`Hø cޕ?|4b616m˜qoT M|@ t~x]; uUi8ܧs2@4>J`p%.ٿMxÍn3[ݐLsC1N0&1khĈ^zo@"fZXhCQ7{k}}}o/^b_sHK/YG.e4w%t|vTf> X.sֿ:y>=y^tif2#@kiAȧc>u+1ҤEA"^zi8OSg>mWOC'd!7u5!ד6埦q4>M75T/'+Oë4|ӉzdTKޞ9'4)RiB^Q|^6A^{jziJŅn!;)}ФB7a{yz M/2h 'd?`d%.N~Ki~Ho~-ś)]5z/ᵅ!"J>AH:1<Н8AQYH \WHϏF4!j]M^4o_ ,gfzЇY2O \ x2%K ѡK>R;+F"ej\r-27is;v4鞤V6ͽ.&0^QsnzRg̘iZdrhiF{o>4/[sV_uFe܍\n׵vY9S Wz]?ȧh3ɇ[;7ShO?Χ)uq"+|I=Yi>!Eح?Fn+Ah=7مFxUyt-jE*&JeZkQ/}-:56t_=i`wűI[*Mc5!d j7JZJ_.?6MtE &m?G p-w,UIǢP)T 1xr/dWC_ CSI~ >*Ɋ|RnW94բG>ƥ)}H" !hE׍L*:i\K>A8MXo O.{blƣ1ձ5pxYS=d71mjCWXW&R|Js!P3n j%k~ȘnVI[kM7~;m2#oq:=zdF(iS60qM4Ѧ8ѺnQv n| sec}loM];jCmO \{GUt5A-3|N7`Mwz(z=U#9F27[clw̛ksfY7Ʀ|56.t.3i{ox w6vonߥq mB@! A[rZC+?E^n8V8ؐj곫|ՋX#TLZl2ZQx_xl0=Hcв M2gwD =^{cbOb>|8ӎ6cXzϹ6lmW?omGMG~&6ǎsCN/Yb7^c٤YGWt7;5lݿ=:=Y|v+_kL>7*m\콏Mpў~Ě`閯[׃Y۔#;lsOٔygYg|>z:>&kgd͟דG箿FM<˦<'uosc57YuWZˡ.c~hܨ7{7͘wlvyOl~lϴ&RB)R@|B@!  MlȂxc,Ύՙ's7`8S!Mv?B`wCYn{.Y6ֶsxn^C}sz<|]FcjSIDAT xJyOVۼv5^nXx>g0K5v؅ƾȆoAmi6lkq7ˆ y` 5/B@! v_kX){#+ xch`΅h&sp/k7^鄗.vDĄYp]vbe X/㇫ ! B@>dӞre`Y4Li3ѥ!}φ1ƍkf4;pF){KkNaB@y+B@! /S UZ m)/@DcG[uKnT^<5΃vBԇ7B@! B@!0bza6-_ܽZ!0) X;mB@! B@!:RTsPJȀU ! ! B@! B@TdݩKB@! B@! @%d⅀B@! B@! v*2`T%\! B@! B2`UBHB@! B@! ;v*.B@! B@!P *!x! B@! B@ X;~ B@! B@! v-*cIENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/3d-analysis-image.png0000644000175100001770000105523614721316200020720 0ustar00runnerdockerPNG  IHDRw liCCPICC ProfileHWXS[ Л RH ^! $,*v*V@ؕEł.bCMH@}{'̙O*Cȗƒҙ:(/ h 媳UtB9d,ę9?U|Ro5@ij!֓!^*]3UpMRhPy`:A)XV l.48.: wOx ;Ћ  ! 1Gl' a!H($ iHH2"+ dRE"'Hryt"oO(RQ=E,FI84%h9ZDE:ڎD{0ib挱0cY `eX5V5*֎uaq"3x2'3xOWx7@#.a4!0PL(#l% gH$,sӈ눻ljmdDr"bIbٚ{4i>Hե:R9ԱTu u86-Fi%I-WK5KR^+:nCgӋeK.m6G=SRMX|:;t<% tn=11qѡGԳkOџ_D35,5gp!!! 2P Can놟FFFˍƎƓ67o(h}C&& &L6JMך4230 21[evԬӜa`.6_e~Sf1˙& Mvɖs-w[޷XVY5[u[[n]k}džlò٬9k6vms;C;]]={}$jkDC:ˎȱ$vZ60gdXTgssCh. .[O|_]=]\;"rM#޸9*ݮg7pz󋗷̫Ϋ;û&KZ:C s磯o>?rv=i7R8r9#T5*1"Qc,i:*rQblb$1 2~]ܤC #'Md$NHܑ.)8iiddErs =elJMԐ1bq815gLc:z-{cݸ)Ώ77ބԌyj^O&7* V :gYY+ggD]bB:'"gCm}yy53Jt%S&N&uK'NZ=[%*GzEaI0LLi8ugEaELç5O>gf"33g6ϲ5VPms:iSmV [/ZkBkiY~s}K.Z~qdٍ˷YQQ+W1Wk<6Qi/.o\kvLU_'Xwe} J7|(xkSjͅnIr/5[nM}{S555;Lv,Ek;+dWcsݦK==/f콱/j_~_m~:8PROn575FxՐgq]sF[7l%vw ޝ}p&w}wW![%>Ҟ=3Vΰ/Ƽx)}UU_gП-ݣ;^^Y 1620 Screenshot 622 2 144 144 1 #@IDATxUPA- ;X7T5M-{)FEłػX` r^.voom3g|sߔotE䄀B@! B@! @"бJZB@! B@! BA! B@! BU#儀B@! B@! D`B@! B@! jD`UuH9! B@! B@XB@! B@! XU=RN! B@! B@! B@! B@F@VUgB@! B@! o@! B@! BU#儀B@! B@! D`B@! B@! jD`UuH9! B@! B@XB@! B@! XU=RN! B@! B@! B@! B@F@VUgB@! B@! o@! B@! BU#儀B@! B@! D`B@! B@! jD`UuH9! B@! B@XB@! B@! XU=RN! B@! B@! B@! B@F@VUgB@! B@! o@! B@! BU#儀B@! B@! D`B@! B@! jD`UuH9! B@! B@XB@! B@! 9Z;)rC#K7cƌYU"Y"ЍB@! B@! "y|Az)Z>ɬMAȓyt4S|B@! B@̎c'B kD`e5 cdžg}6;ys9goBϞ=C^‚ .'!CO 1eʔ:~~"JJW #B@! @0~Ck,B6ȧ6owfk׮a喳ca32 ⩩$oé>5 [na{(q&L`ٴi ,Xb"Qd儀B@! 
h:ޮأ9b_)n9E]4,bv?sM~ @~g0u԰RKW_=tzKYWډK#J!д_gvK.VZiO?F+ p n X`5\^ ?p~~_v),ͪ0 )iAv瞻b_ƌҋ/_N}6}zܩW} {GX{\SNB@! @Gԟ}YaذK?cܹs kW}W/Bf뭷 ]֞H; z [O +rYrG>Zڏ@U)d%$(б64#Qbgĉû^}7@ WUxaVXatH{H#^!R 7 \|q8àA|Q?%!Ҩ<V[>WiJ ! B@A`PUݻw~ !|*F@2At1E裏p4obyÂA޽{ 5ba+񜴾ft?p/'*J#,%!`.]vmg@Bя܌ !>CF@`Z{n8p`I'=KY{1 6|sJ2dP;k k5*K_Up+B@!  V$>S qq]Fxie |CZy䣏> k<'|-Erdž_fXd  Zy8r6 +Y=N>k g@@VH%8eGT>.:t7bD++ȝ;~ NɃm5 s[:Sl^;̀ASO:Z;gm#E.3B@! h: Cζmcr>LK.$j8bm1+fҘnw}?돬f8@?uz+J+|C@v~m/Ok6EnD!sC r?,D# Bvi6ݰ~ĤuF,{0i$# QѣUlxXHe@ŰۨqVYex )^B@!QRΡR "@v:K(biW\qb2!.afw}& ?/3GX]19!ВjIW4_rb+W$X'x¦Afy۰-y3:3kBeZ^{ Dz:Fg! hPtбJm?pnpB? c/ھ~1r!Zp'k><N0k~-F\ef?,pȂ_ dok+n {W`$}ɏ_Oֈٷ~Zh0rHb;`B"oݏKcӤ13S<7mb>|gaԨQ*JN!Pml&aذa?o1AlAtU{go w6bt-.f[C1moaK>X,G;㧖K//X߄;TRJW!0+4`!H"'y')c]{2#F0W_,^fKe 8Z#RSnF!0k/2=裁):ZM{HXE;C5⊤-u7eW5:PtLmSO Kbj̽,~.a-NG}^xuϭk tPdx}aY~j*aRVY!H53VXa#Cq|l|cUԠ25Хxr!ӸYo!L:ޱ8|`7|~Õ;2cRi5lșo+}hx}ˇ5\ƍg#XdjCov{,XO[gMfAզij.ShJ3_eU(RǭXyÜ]*۵]'Ns0x:tZ>}t#P9E2qᥗ_2X/&mntbbp6RZ%kz׬sX:$!idXF!NQ,=LYBKĺXOM?||;% 8оy5IѵB`njL{b}$^M4)\ve6^Ad8] BK[ f+_d6knf{:tOXV9c&tj#uT*9! q&~XZ78Yፆl~ZXZX^}٧{M,(O*YFl,}wjs߸09VmQ`,H,C 73vpۭ{,)-8,v~ni+CD6f`_  {N肅WGY-o}{O / '~|7>q?X|sH*:JΜa͛z:+JZrPz5gF-4ϔ_ءY;mL{gINY%s\K7kͻs=9uR :XR-!@g ,ȃ7NUC~N!ZxW0E˴{yw^% j@@V5t/1pG ?X?[2Yg:W| S0]r%moȑa 74=lkA !в®yQ0͊ XmRv05]X<% {0OpِS({y<{`=З24{Ҝ}Cax<+)c~ ڿ3şq/}$s?)'&Kye5wiٺBm!ek\bu:MM<j(#\â30h[;tl[_RD`55| /p! VSBЋ'ͮ]jTxT <;C̢nPMi.B@<q447`[' QH,oi?gtyxh@Z;1qmjp^ .,[ g!j Gx+,EBvFRF.|Ig+FepLa袋l-Ҷ4H! Z:ŔsX a3BXrM7~FS޳!S EǞ逄{盲2:A2ʈsf=)vy))f-/Y`A袋ZLEgO` ';Bb#[`J>-m7A}cZo2ߊafg`ɷBHx ЏXױ~=\R4qT445[ XNG}tBBcS^-:woc=±kUtZK+@u e #Cv?>oՉ<,%^@"yz!)ؑoҤⴂx%UW]HaF)Ȣq=xZv'K.8?u|X裏~'DNg5STH :t` C<%cr83* K<ND; k=Ba FaH}U,Ď臮!|ZwumHo%~_/z:{9# =dn3?,0\p:fq,+`O^óQFVoY3,HS"R<B!qY|!@8|:XrQGY=)(@k# s@W8oDSqTiǢ#O<1lfG>lj`^nN:  cՃSiP5~ CG2GoT^o-3ݦ;+?0|wn%ntAdA(#La*Oj9sL#~t쥗_ SbzBla~Ҍ8kL*vҜtO>M+K.L-BrmTA: OoOw25rv믿n[wyR[o [mѣ,1 3x5܁1?Ou{f9ל*nvM`A!G2rȐ!ae-Af2t÷X|!!Xu,@Gz`!^1pte}ZyO bpB@67`ḧI՘p*@T+ 4id5Ҋ5G=~1]IC3g}jee l,:u!ܺ ^z,cEc;/ILN=)`;^.L|c%@A7=Aaӭk7jj'IyzzFDzi`ܣ:D' u喳qc-eDMh!!y`,0/>uҚk/'z!^_S~ H % xEB]+> uҹt|\3y&ķyפ }7xN;Yt|A!9|oR7מO?YHlgD%VTQ9{'7@Oc$ص&ɷHHrB@m(̚6q鼜dn[ov~UW]Ok꩸#кBg!! :8 ^:d3좊'yi|~nFLRl&a7=B@O(74M6v,Ădp j:믿Yȁr rƝw湧cV!pɓ\S:B$A`p(2u R3=)kv~Z  3uѣnkq ?d;rH=c8Af@AB|%ql6*"-/8 ‰x0Y=?8A1ݶw|?pw[6zn5iK~g|`亥Śi=3!{g0LߡcRk'x,b=6n4z̿A p-s?s}ɷ_.Sg! 6cL˧o]{=O뮻lZ;=[~)MWݷ"Z i04i@m;Fsh3 ! @F%wāIIvy:MZH?d ;z6)+3wpk%h$"\>aСCðalauUSg0n!:'g#s1qt{ˉA:4G8mVG~膥SڟL|?u8-:WsLݯ= 89d-,pXA^ GZF2?Mhv[tDh@ֺ3J:؝V,KZu,hxQNhLLg&騻% s:ʪ1ԃ4Ai%!\Gidp)6`6*2KLL aЛ,bgڈdQ#7MpDi?Yޓv0B'iї&{<^Os3#e =}bnF=T,HnEt\oEwN 2\xQws .俜/e=~wceIy|%˔PPDZYցpSE([@\`elOѽh."U%^Hb}9, P:r)u?Ub`$)8:AWNm'O#e$UP4Sܺ҃5(?phP4 (޸1$ @8'AQw/ I[ndb~@9X41-"}Έy\ϴAy깭A]w4po kb|X=B!@|Wu?L'o-G9D O9IGo׏I)grS18.pWDP" \*`lLbq34֬$W{!x$KDL#tUX\IJ#؎l5>XILQ aE88 H/s\Vsϔ_O>N?d<zTOhn|#]%E] 3^,`XtkI^ C%GR~qI99WJ G!|k{&n8:fb&x€g:㏣BL3e%BP01iN1R1nQkgb?7G$?C/(\wΔ5*G@Vc+Cv:묳ld{vs:#m*H%QM9йC߅vi*IyBM c槧1-iAq}Q۝P\S XXΌKØkFu&l7™ݤ: ovlqp`:/~:t#d;~{H%YM 5 G]KzH#P]^C܏B@?lo,ϰP?aĺi t(x {ѥ!{(V7tYn1pÚG}t.bdžջ 9)L!@J#&t~w_m NV[mߩ);1ccE*Da7#@@?M!! cpɲ - "xb: 9dqx'Ry4YWrk'vosB>-"rݣ'L4虆ںXUvndpiMG&ɣ/'ٔWB{ 9O9;~G#ܯwp-ܧB`[)u'-GN!^P]~VQ1@rQG7hHFKn,?xai]w lM~&%g P?T!ȁ1bon]m7 ,"i<Φ8"W\a o 'II޳& !iҸ.Mt-@E RHBl9Dgݭ(wGW.q[`SE둌5qCJ3zF^%' =ߒ]9H;ѣ 7/]_}jbx'Ӕvl==K:rh|xysv0?_O@dB BxΔdMAXĕEY!U7Sq GX2Ey2]wd&\ tvo: ! 
D2ݔ򄁑נrƤ2MߚU/ˌB@QD`F/ zQ1p8eJ #]vby@G┓O +osOŠAq6qzĉ_l[|dZἂc뮻vzb֭sq,@u , 列hmx 1bwaֲxn(,rͧXe&wi+!8p/go"G/>֦I?C|t| }٢=Di& EgOJ`%^ x>}dupǙ{_q-̰\#L @hH:2CZ2-h2yq7ߠ<"Ha u.񚅉!MIY<iw'78,~ܯB@G*:ȫN:z.@u#[xԚN=1M-?KaY ({wQA`gtԱ"J~M!e.񺜤>\;C\83'Ay 9ҫW|_ח56 e?.gv@ةܘTMHzqsar1A^=f]AZ>%Ƒ#G!-c5r"l@.rp'-,:LqUkw7Gz*\/Bށ-m'o<ǚ : ')?y;q@!$/?k75X>#h#m3I!|*y78 7! i ' D;dʺkG2,2?; RVRΌ3ƞcEB Ca(OrYa!yʵVZ3Ѐސ? F ku5gwˮBg1 m)ñ$Yol GiXAQ>.aD^dx\4  ,1-'d˫{! jJ)RC[E!);8A?9=C5e.U)krBVU9FMcB$N88=4~( CK: E;=o.DtйB/L ۴n?tht9$gN:$A.5C< {1+--FYN룳Ճe e I|):Nw$ a t!e‚$`dy y!D(ǁ> $0ub\+֢XyQF~!g ) 9(ωX2])'i~\3B{pY&EX.mf}D8A|9GH 8qK%ExtdM /H,'uXO| |CAqkdCbQC[,H++1g'Hi32!&zk밑F||O_+ї{řs&RCOKiF7!Hd]MmFSt#n?5"c1z.@"@K}5,eeb;8uuuVOPRnfmݬ)T$`P%AO\($2r!'jXsmDotN@/Ѱh/ q*#cFnb֭J*:)UbCBϱF ^wm8묳L'of!j".6(G,=4tiro?wN#0 ^qw ^&Qn2Ő< |=$⧌2d-NN%u#>dPow)98! ) !X/l\d%_ϴ>AckLtJB8%,$PzF4y':@MQyôt &r ? ;/r Ђb~JG6"NNCCY 5S刋B4e2KNGyIMF8ǁ/+{LkZAQNB@4*'ik;E먆  ҇r{ҹ_ 鄾餍4q&)ǯyB-ґ?xJʅd3DG]C :I8:gH7tN#oY8~_ul1RXA G}#%`8||{ăkHiBe@'k!yz&@𻤜ʵ_~.- eC6ZP~qPᏃ yY`7%!SN9%O C"^dЋᗿe ;椳`$z( 0z\{xSK' y:tIyb 29{MO)W4vX,@!מXT/"7oڕfjC4C]#ϫIg"B@! 'MŮ5_a:'uhȟ?5m-yxwB|D`$d@1Q~6!HB@! F 'HF^MI\ʹrRJB@! B# gJ,Jn#,Pȶf ?g1cƄ-"?oI! B@! *mv(eWWlA]xWPy%ul6Oa(v]p3]T9[@k"0 nj;l -V[mp]w.T27x#r!6k9! B@! .W]tQg}~v۰ۙ80s9, c/ ,WKȹƍ ;sꪫ# w}wѣGxLo\ 'A@X堧 [2Jt_}U8qb+kqz%WB@! B@>*{ g}v۷o6mZX:u r>< _8ȣ€q~an ǻvZRcG oyU:̈GΈA> $j>v 1swm kFXlf)7R7MBӧu)ۀ7I< ! B@ T9ܐ!ClF2 Hۖ(9Ôo:+,yrpGqmשS' XRV\;NZ-"SҚ@bYNAٻwo;ٮ]m#8ŕf 㦛n =X8Z >0DX( ! B@l 5\r_"0c&ly̳WЬ?vɥ 3! B@!P9L)dZ}:Tbwt;}_(\!?y;/uOs#P+aWO<ؐd: Ku@@X#3IWL9sT I$ !Nƕx5ۥ( 4ϢxLҥKl|q]XaLIGp/~x&.ʗ0,}=r&M]hL62ىMs\wh92[׃g8vA[hQwܹYK_! B@!Pq~k_c G6n m/ qY9w t9e}MH')w:ԣd'}} ?ǰ8 }B.("hV"YT>.hXu(@IDATUI^ċ56UǂF mQ߿,SR&7~i[q-Lz 0zw1[ ZTb#FQ;.(+;nBEB>bX lfyB?)ywm*vbK P]z饁G1^xa3fpZ?yXn,JH}f! ?3+${4w@; }KA.x^_"Il@?;\$,l$ \$ Y #O뮻calH4Hnܲ$P.VP㏟-{/ӟr.8h7Rs\$r\sMk/ nbaaH뮻j#DT.XHp3O:Xˑ?L.v@.fb7#B@! 2FۛO8k92g+X۟n$E~]w[omYWe|9q}|?2 #L"Ye\m3:}=?|yH\\y8 Ǚ$Mo~c8Q,88m>:lƁ̺ fJ >\WoI0[>b6syiX^#}HF,A"~i bQ,+x)SÆ Ӛki⨵s&#F, sX ],$s[UW]gНO,?\fAmgˤ8hH;Fp"9Ú,9Jt3TB<\47W'Na(⡧7 O,MN;- M׿[aYqWzq>nl"enj-9Q/ g4\B@! aivr>Z).aG$9Y\#sW_}u;'،}'ǵ;f`E߁HrX Gr_~e}t &OI{ c@W0!H駟n}Ղ~~ Yg}vX"9. ّ(g'$,8x@_yrl]53Mv.3,9i}d]r# cc!XC\0'970K`cQĕQqؠ?dt7}FB䣏h`QB|8q6cyIaKb؍! B@@(}fjȺ%K68VKʯjiGۛцf}_,mtHp?qAER-D"flgaC$fZ3ki6K/d33,i`M(08cM#}%T/aB?b"ñ&.'?0˅~]_/b73Bi]]]]Kڄq3~6UZz‚zN$ X`."8 V.z\ɦ% $l/mđynX^<\W* $pꩧZEFR zr ?c҇YC#G9cz܁ [B~8,?`y3/9&ǽ⢉T1g?7;4iRt"c=L<„zm L_Ĵy% -Ylzg"*] ! B@ DM!_hҮeJ!}`WapɁ[^ˢK%M8y>ˈ#xO{YǂAd=>.3iE67|JلO :,NXXsB~ ˭47A@VUjarDß_cC!VuQP&W*!d3MqVEs_ʘ~֣r tIT* B(r?H\0o!aM^ FLJ)Pbm:*,w'qGD˜{R&& 8l즌?. s`_{E,V`qjY9iVFt *B@! mVg]l}H5֍9 pN:o'r]:2p=$baП?/Nd/)ā ֛b68xGXfFU\H闸^pr!|OZݷfU4+ ,8OL~ٕ  /bc1:L!uVILC_t0f%\4o}Gpwt̢o5]%xыʟ УG#؈ eܒ%B@! UӧM7d`UՔ6t齃kS B;!DV1mq cߙ ?7~]?9g[nH8f`aqQ ,k[m ^vhl r*nX%3;t+ S bNHeD_v(4 ץk0`@;3\{6I$Ϟczs*/gl  _|тeLl< / cI#/@xBa]6b[$2-=G)\7/LLV懯>xaT蜅RpfOhbEMRQvkw< [b&IXG|ȳbO?Nu/B@! D6 PJ{3 QFn׭%,0L{:F9mmiџ{ȴ~_?3[罷݂9#ApvK∙!lVfO\8-º̀OA<霾n_ }C=ԦNdXRK.gp)T?Hc*k/ kCbźl-WB߃)ݻ/pd#58&uIKm1g4+u @ϺR,Oʃ!"ϱfw`ݺ柣zaUE9l2/lkPQ? sHKp] Xӊ8) ^~7xs@e@Hڼp?0zmco)l, } 8N>wK{>D2^itI8Opy.xd@ϓ qL0, !Ёo3Ⱦ?ܬ؁ f>ʊs!,q' '5i!U Dl]PGa8[$8zĚRL#iMp qd4[,uąK=#_, 9: VaKwQtB@! Ղ@-W1ObWH'6bϛ*#ε +e \hvc8, 2TP! B@! B(Z(4z, j -B@jFuYc0Kd-=}Z|妣RJQIٕi>ZB\YN! @# sP ! e!E1"%w*KnVUήi] $ ܐNQldM2"?KN];ŝ$#Eϟ'o򑝲;۝, W9CXw.]lz!'BX{7 jPB@!l>"XGf Ld[tȱZg!:׷K'*֢̊dB6zǒK._& f]"2!Tz)4uTho'|߫W/&>|$.H&`M~.th3F͆;Np\N! @-# suWC3@ ! @Y|gGF0%,RIg5 ?6y{6kT;5[V: ]y /W,A sFeiI_>ݺfn喳o0+|#'N4jFfEe! h"! 5GB -iZz( A#AL@,W.ۉ1Ț[,So1o~yzC}ᇆdE^&xdM^񍐗X &  S儀B@c[EA! ($yEPVQ(/)yL'&Ml\NxUCvx/֑B6zH is=wmAu+hVDn, 8/6g! 
hjh "B@&iqyE?+>X-X߉rY9$Uȫ3#"["by%WڵcY}VF*9B@!\d\N! )i} XÉfN^aU S"5y4,+ltcG@lg'Fȡr(|wЛ'.ܞ:}XHeIJW,UVHB@j@@XՐ A! (W 9:u)gYMIBg2Y y1 xLsu% cadd AnX?,;+2tC@1]Ɖ,0"^ :A1%fE.A"R|JN! @[FClؕgݖQڄB@F`᭷2"CY8'f = ZݐO?FmnLj1n4b`VMAd7 CX!|s[ K&,扖WYF>hbݺu0~֦"+6 YyN-՚l! h$0K2eJ+նUCm#U],Օٵioimy:{.?!t5r_ !=e SƍgIo´˸ cE;wN5&;r!.=[.YSNfx̀\$8CgzIʵqx3aV]]NR~ C,&7.y ׅ <=\H&t.eq.QXaٕG]Y<#By~7fT3EfHr{V3eQ}GF0-2F`91Vn%n饗l=-/y#oY` {/,v2D6$}J-QKf*z =rB@! : X$z;޴u@Ks`,כh,N9:aD tۺ@=vH?V*rB-!zF@W{Δł. v =i9YakbMH O>$U˗YȇToB,a=D>b! IYML ʁ7`{k^|`䩦fc#@0\y袳4kyTh'tC޵ә=-ߣI-;RMk|m@1ڳgOÞ ! h/4J`9Y5\^{=` \kKBp7,Uߟ㮳B@Z@@Vr)QԓX!P*гZ!5iIbh*niy"ܐ@YZ^agyU.LX%-ʉ3O?G.[Vr9 8ܕ72|$> Q㋪+NpBb-5y΃,E b|ېW*3WbBU,Xb%j,k:A>)ca+w{_H?!-CH$xO9߹e$mA1u{:4LXU~Äce37E.~]vǎ"4,aānXtVnYF.G.@v.&Iy9]&k/tdc,6ـtwrB@! +UG`y^ ?ls8"d3x[^\pAX{!1vXk 7  _|qX~Áh#G ^{mw}C=dm ҀbD#4!>gVFc[  V^ye_F/wuװɦ~wyg;lN;zpw@4D_W\w\./0n8Sf78?UW]uUjŽ;Q ߨ<:<7tt^i׿~$e]6~7n}5 {}YpӍ7^|CtPx8cr-g@N<İ[ڷ:rH'2 \}u s=m,,,y0|k^~go0l0+')N?@Vw-@)0m( q'gY!?+ (u.6ذB/NH{rg:!~(s)_e#:'}\FWKzZN^q{9! #<#h]tE!AKF}g}F>42(񙛑3k,\:4b.[|\sfmLyG56h=c2;:#h '`D E56frA?>ɤZtIwg{E^z{ ^pWv"+[Hg_|шȫ??o3"">oV#V_}uݩT,9w~Wz4yV3jYLJ.M6$@^{vUWNAKHM3+; b6{g8G 2?.YnI'YQQֽᤓN O> ` 9 +`ٽk#/{.d?5TX0ašC|u bL5fĶ@riS@(1euxY8EudMx@Q"9pYɦLt.0 -IYl)d % #r?[a@k4{9! %ܲkC=4?rff߱3j(w>Yo37bĈXǏQM:XBswqǙ%J#[m;"O6+o,Ѐ ϰš}aaAgau׵u ̳4o5027VӱGXӧ6!ya߁B 2<QuX!}b8 A hp?sKGH,}a|3Xpyg zDǃ2S-H;<|’ GvY|-ᰎ"=Y,#?,{ot:4 *rӊ  ѣGE> }7_}Ȫ;M4DƃVG8Q S/'ir=#@'*"aMpG4. ɀ֘wH3$) ~\nWV҉yW7dTi:;'ߓծi=_~W!!෎~3L"o#k82m5QD+EȪc=־a3 ٝd{Ssa9v%U9q`yH8E|;`I\7|cS!S󲆴έF !9ܑoᏰLiaK~emfw|+ !5 BT6ZOM}B,tݕ+2rA ?_f-1>ruFD e,3~뜼BִpYȦ~I˫,dCpR002\ 2|nj_-zq6Š_ !8[H;RVj2__Mjo}[Yn@ `k,9\SM΀#gʽ_3_hyC ګϙjy=n;>%?xƕzl6SryEs& V%C^c=o@G_dRice:l{uN- ݟNhxk3>s^?<331.ʏ 6:0AkM``ǿ`[h jc>FC yk^1b 8m2F io+#zu@'N: -U{Z@^0f6c 4H\|նW@ a׾J&Za|օ/Ծy^W@Z8ESlubUvkr򝔀NGO pƌLZ;mb, D16fbxtxTi:.Aɡ Wt<[ߌoFC}GF2@s붭gc/U[`ygg"jc,t4: t$I`fJ`C ~%N^oϦ3>k{Ͻ:ssXg^ɩI`&HHC q׶^UIkx+mF7pC04{P'@ ` *}hQU 0^Ux֟mW={WdW}1oMh|"h&}^uՠ"w$I i"e-ȓ G"; lR,c GﳩSa\z>Z39om~uAQg2mVj ^|z=i68\!s y0l#wQ֥Ϋ ͫMwܞ1rAk_VxWaK]$I@'NB¹)l m8=d5G)I@=lW]uW&}XT$PL6~{*do15` ^+ՏBhzPy&+2o Bosz[sڞ_Vv:-gY: hENmyd2emiWG@'N: <8-e@ݾܴ=b뗲6IqQz( X8[xqr> 続lp,X WHq~J3?Pg*~6h&@`)[զyx/hx浥yhSA x% *4Ȅy 64Sj?R7zlCۚᢋ.t߆L!`ֻ!Aw-Jv$I@'GF `Ж #Ӥ]^>!fvK?ꯦa@IDAT}vOЛ.{{=:܃,/d^|sd|UkyNrd)K '^M=uSӮ*+IiJc_Ow*[CXu1QIʯ.nώR{HL >su=\yz_smk=m43%k׮EBsY[`֢eLx]Y66Ї Tkԧi0" < Fsh= O0JPH.4Ȧ<#~Ʉ́bmch16dm@ڀ.鄩$4l&|^U_Oti7*[X[ût$I@'N>HV^=_M8 bϞ=er)#MP"scyЗ=A[\/uW=cǛ#AA aahqSv鼹TkD< Kޥ)?RxfWP+u/|wHĔRy"a (Z0<[IڒĽm1:ԇC),)Ris]Ko9=6s"}N D?mVG6c!{-\ _bRѿaRfvȟɟL[}w~gٔ`#u]}wfm{i֭]W6jWx[Ӽ'Ƙ~E *Ki$_ ` K[y8xU4={mN@@:FCY+mm0ܸqc^UtsVuMh ْ6'A5qZpml8ԧL?YIMt4`(5E%J#Y r|h2mƲVo Kʢ \C9 D(Oꞃ* $e'}$_/H䋡sihaHySܔO[r)|[2?]9D# Ф/`NԚ#hQ[0aq⎥sI(}-/qV*_ZfsؒEJ{wÁWjr jN6m}Xs i8` e-a/{\76Jw?A\Vg}7ukDS77{kL ,f' =|r-+_r"}*Ke~/4?S?|<$ߟ&Lycͧ ysJtugOCPͻ"*S: % ^gԖU X7S3Um@ Pꫯ. q=X[Z5;}LFGG AR>VzkY@oܸUDfͿSfd$D`^k#?ѥG xUe}cC|#m$pHjɶƾPZU5M9Z7AYiy];Cɘ^-WJJ4/JYd$tbVtrʨ[SH*ta1+ȃמ*@#g$~Ne`3Aa7s:~.M5K B2g.qL􀪀Z mr)I^Rȗ4k/FV7! 3^_*[.xrJslՀJT@O1rQ mʢ#.ԩz(zGv&/F6Y ZlV.qVV>:Y*S+miR.s(zLY$ӜŽG>bv }nG?zKV `ɈM0>W쥵+,^XLm:4r߶m k_Zi+~i6jWW+'䟔_i!wvQGs)ꪫf)w|w׼llZ4Vcwwhn,? Øk#S8=g&.M#aP5b ^ Jo96hӼWmѦidq65(*r?x5(]w~ ݚ`7Ǵ{Yig.2}9v% o׀U@ਛWql.K^qlHjM STvhhC, h^"&o r%3 b"'.mEi? /-)#V ,0 O@^u'e¤ͻN`ڟڨ,% SMTg m%_xx,dN!hg^=䛤=gr%D=W[4jh0 nrVDVz_1< ȯOhO|$Y寴j><4A.^dׅNV&`P*_yhYT̠ ޶[l򗿼Uoy[JUc]zW˯x;J{lnt,Z8oۛ15{K gl&oyc[46ۿ]4.X?E_ }󲗽g?SگOYV#I5Llvd3::ZL' m}7y$qΓx_qrK]$0s% wwx`MZy0QP

t񮅚&cLC>u.y.:Y>&#h] /=|Qnw6x+ j<)ǼhYukַ=/_5Z5i&6__k>я675/~'ooM˻//~|'3gO4-Ys6o{ <)Q7y{ g>S>_-"JthY$YA\-_kto EgV 6{POtkڔ:Δd[}uyr<dO >Y/\U޵h瑇9͠AHgLz6u4: <|vyz~ <@QA\NjWos|ݭ nгxy'/`P.~h,a_ m%ΎG+I~H^;zO٠\'@03uݬOO ôv-(<3G/ .M$^rs0+}u;kQ4ƒX)o"yRf_~@G}!3 q(DnhwWD1AR1qLrɳ2HHnD>EGk t8 R) 4g`M[viMrOOڬ]={2_@Tr%2_<'%: Wk\צZw O퓩iYfg @]/ t5@K^5{ 6-T}05?z|1b:ůM}&3|07tciM ʜ!#͛7i1pkx4a[\o~ Ǝgdj3i:{8M6iW=˪sy?y@ ٠|- 2д.oF|Oۤg0^mѮcD{LMۢ_Q\78Bqf"60Ϙ#.t*W`@hVi1,ƣUU0An| i=7`.UM(aX. :Nus <'xe=%*tdD^oxB8}?gBØ1B r?s|u-,ו^ƩkcH?! z\O&wHB'Y) Gݛ˖4_j{_]Lf2Ur>O1Lk eOjttyӟ^Lj;Y[Y?H٨V3e8oJ:mo>6K;} |`FhtJa˿e8|a |؏X,.th[@gX"3x`S3)D; )G P mDsRx0,VAj4sY{+m&x~ 5MZ,P}Pڕ.V@ ?t<ϖoe22Ӈ~Di7dm~6^0QwfzWхN3I.+}P\9x@K~ZPS%ކ`G6!i/h9az &H~V M /@zs/P>‘y͞b:&>mez^< (_\Z |^@(tI4`~h[;t{-j'ZӲc@9,K1?£|X4Q@S{NQ}]艜 A\8ed/ {v"<Xho/ ~ مc$^7TxW95H onţ^O3qwډ^EC@W9BYv@]۰z]Hbj гB'C+WQ޸qc m mnV5NE[#4ЭZm&s 0(}Zh D36PsW= S[CmZL1tCz :7d$x҃' 0Ƅ)% flm܆޵8hı+P|1` Ht@seA#e`Ӱ*ZO=`0p  9vaRvMZT6`H?_ʛhCБ 'SdZ@jS@{S? 7徟9-&.NjxP7[$I,,xm53SK (޸qc84DIfI{SUj*?L&5o~s:sb~cE6/q/<2$<0\t6j3~i"0цЦy6uAxUM)[[WyzbB8Vy:+;+ PSVe:$'ˠB|2ÀY~4? ȣ6mW?x l#=z{zSs!lB]r<6uCYJ3h|W #?K'R:q/?<2&1<7{l!|䓳''e @BOمNZuaW_U5ϴ3GJt9mpTmu*y3QZ4 s9jǩd߾r;ghtM⏮Mq&+[mC Pqn+l۶Xxnyj{;]ZJ`-e0 6ⴺhZ5Я EP䷥Յ7fihyܑeU VB7S6bG&m7 :Sռi廖gZv5kkO{C9]|xfh1dpk |ue/ x.9 A֊/sKrQajc_gOqRI&Ц2\0!eТ!S5 ,In6M ) ?mMr;⃆M 7~Z8ZXE  sf2W|:a9[0X |]%> [1h}%J+iq΁  Ɣ^R~$/Hiyʼn9:y3mKf](w_ʬKm[( )(D{S*/%nK<٬ rdPG=Ah\IGZb 0t;sH/]{|x?ȓ2ތ2 Tڪ='<7U|Ƭ+{u9*c]6.ԯ%ʘ\Uj B[^\ףD?wU7%?ݟv{mdHw қekv_`Ұ0dx:X-R7NjL2uȄe&ac)~3<}E> @~l>hЭ\*EAV&}훚  xeJNhA_d/"*@S_$]bn]Zv=S_sys]yO d TL7؈٦-'͡FwݗQEڒ Ǐ6PK- |oovkki]j|\LmH'oEH*kNso,P`*O#5K"\}+G-m2VpI7&h2ݙy8}Rl4*5'}Gm^&67af 6%d pi=%-{\; ]5ճ7 [11MǚIw [@?Tht 0`ր (3c|A۷o/~Av; LfN؅a (U%f|q-G/聫jli-M99Yv ]@rB$FPI}uAӇr7c^j^9. _}b1۠c@Z].|kY|wP >|Wռgse?Vͫ$p6jX7? )sxP+ $^Ŝvg$6.L(&hL|qOCa`u$9@b^ ^1+o_Ͽ4~}lhpܧ4P@iF(Шaz&m%;BJ%im*&5PzrV37)01g DDfENu;dg*7/m[̣uqڥ+7pP ?LkuߡfG: p_j 7'kBpEܓ9W?.\nπ #yENڣ4K҈8a{qǤrG'hJ)'a~1ICE ML7c'#M(r$-9 "mLJs;r4ZpZ`lUn6 uocO~%P{t6ճ'޷ 23=B'I߸qMY~+*ch:g `F'J8+mmޠe@+-)QYm%E 8$]Vh xԣU6Юed0e dԴAd.YEf=ڀ-9>(/3hm^%l4-r .L"@h^̫m:m>0ȀblśKm9=4QfbCϣP``1ďր x8qOXMsRvEsx0Ҵ}H!f^- ?aGR G[h>}cW~=Z_MrўnAs8&tmj=;m.NeҶ0v(WƇTNB\==TkvQ)'i y0xS@_xҗyQ#=ٛ$m)L)T|W:h9Q 4u(rEV==}҈85+m'+4ж uv_$0PE>߽qs,FDw̪I4՜QYNBsONCz8-%kB%pO 2C.8L+ &ݤE|+z닿q"jv?}5TjݽR:,u-Ӌ6miR9SyK[WT")WTɟK8eduْV4ї9УӋ)&hՊ&eTL$^kF-sݓ[(&2DtJSc)ӣ_nNJ jk2ر.]VO+d^uHj5~FMt]vx׷'qhzo_^ϩȯPZR_IdSmԝ9 ڃP?Rry5y.zJ]6/\,jg eK,u"ܓ;)#^Bs"&*%ῗ*d$j@5Q1&YphCk.`﮶wgЗ8nˤճ 3:GD b@6j[`8,8X-[6}F'JVW=pE B$;]< #L@u oܻޢAז^:ʏ ?޼/-|g>0F=GGG uЀgSmJDcO_t]hLvL3* 3fW 6tm{,)CK(vV`-q@@F">;C0t{<)\|UMeh(r۬ BCGZ J8ed"<BqY8R8f.Ye/0DM1+c; Hvb'1Gd?T:1qL/ +s}sLFr_!1O,_h4w|{¢b p_1˒ ,ܟ;c:ˈ;ѵ=)fAh%7hOh~]>S;p;ܝ&eP^84$ie@҇*xDQT ͯ9dzLN_7DW/f'?-fi"D6GF끈Wu}´PߊwKc ~^o枼noѼ7hj olO$@n ^]hOɴ=y>fSilj#9fZ]]?ϴ9&l{`w6@wl˗h1 ـSyPֆXh;|UؠAcÆ'%"tϲ~AɣkV)W[jVۿm7Oj39+^~cmmh{4nc:X? ݛraԥ]9&&}|s|PdQ@׆?7 X벘r!p hR~"x֦H4҈1sdw"β1{\.m:m"p 55/ؚ恃r4ܽN3#`.K9Nh`ؖfT0"NG% h a< S  3Bo3<EOִ?3AuMV&o++G`/-Y 0V4}u(?|k_k}qM75ͼI'MO)څH7:QEguL;ShU;9sD嫀4>⣉I⼠*[,TlϮH]R/- ukfh6m(8m|W/kKSڴf:L{R|v%H=s̎Fj8##br/2ю)?u -}:0?[%ȉt2\^P͔c/J8Xx7c@PD}˓wy*4i D-+cӸkx- iJeCk?@,x3oo42Pr{qIU = .t$I@'NRql Y>[tbKnʜI< ZcmflWW'56L8?lNOl)]6+ ۋW76}݅3q+(7mpLhVϔUh YނM; ص^x2N5!-cPYtCx+GL?=ˋ>~B6Vh_-?믏_]N[@\ۅ7 M`XSj36ꠙTLyE>"6+/t63k@d4k~{%क&Кz{[Ir>J.߀Ӫyu&eK%SWgc0K늖@@dkm9%` zNӕ8m*Kܖ9Ȧʁ<[o>h `bƤc u[XH9!ͤL -!@p+Bv<G.&e liꠃ'aEр1ƛ"hs,8H}0IX\")E ȵ. 
\ kg C#R;dW.J=DXhF `ɳ0E DKu; ̝305 c71͉2V4bv<|A6[w5[-S=oWi]N4vYG"K MwG `Ǘ7W'8U(e$p|] 28/mJ[+R %;~ adtݲ勃wEP3Yu1hj x"u7/ҟ|mNϬYmfm~8RP4*ݳ3I_z3?3+_/|aXq,HyoP0ybDw8}K^l}`aa2mټP7"6WEEm7Yi+  i ߿j>6W_}u>n!WR' ?o~yӟ^~<[5P^;Wyذjmn~wh>/}s[68_LMl\7|3[lBmSm?<)Ol55yܸ)-(q}˿˲6']wu1я~'6{]$0Un,92=S=}}m[Z;IoK2|xd6^a~m9J6Sclcx5>)ޟq.p6o|S>}IG'>k31^Xuwww|>MO$Oum ֏V5H(ĸ&czc. ReX2߫6I,fS>kyEsgӻ;5S8wfWucr!:FCѾ{fOzlf'tۑB x=ܘ6@ZMn Rk,`G<\f~.`˂(oбbguI@2Hoɱ.th|5RUʆ|UzU`u$`3nWp5n 5k@5Mt臗oo6w5V Ck-O~ʓn'>YpQrɟIkeG6:dmsSp,";Ɖ?\|uk}pNu!7ʻ/Eecڦ;B.x|!6P QAiӶ<Lx'm PnܸhOg4ɛnkkrw9г%]9^ueӼB=.]9D@2Rma⫹#=9m/ȱj]N*\0\$<)ǧr( , _L{b[ssh{gϼ򃹿6>< /C}B (4O\>9h 9na6.3_YI. 9zir5( LF8q'yTC E9=X0 ,qJ>%n4]>ի)N @|Nw=0:ڦۛO q?߳n;duǗhp] 룑|`ǛB$`L`eg?]_җʦ9yN4ɦ/Yђ-.0@D0pf a ??k7"UWB?٬Ӏw/xA5"symV)Kw~w Ш 7y"_V k_E*G=QeNf o ac=y̵)y"21h&S|W?G~GJ~ɟɇگ4{eoB'mA6噲[|8 xր]N}lSʼ=-2X{W6LG<4v\*z?U4~~2oT_:6haFyF4T4:Ԍ gD>-a_ʳHC~J*D+6W]uUk&x{(s9X5 kK]gh tM+0]4cmj՘g6mTGs`Ban+xyCX*}?B˧^ȑGƩZ@_ku.l+]el3-m}Ry@!9/?Z ]N}(KKءx&#Msg4_h9l ڈȚŗ &30ճgֵ&ZE48_]m(ؐbdr)6ǣ0CWC`4f~6_ݞh ڌ{gޙw})3E*93BwLa@?ą:_shnU}eg{8#<;ݓd܅N_K,ǯʯ4/}K"ߘβQJY\h6«_<*>U,<`^ _MY2`ω+@_Uvַ0c,:M!M5 ͷ',cfjW˳EM;NtJuhu+DUV6"t@]b]wwYUMZ(s#xfhM|`CӨGmcxUi#n,_Uu>4.hWu-js]mjZ?3UӡWˠQ+󺹛O4x4|9 k?|擻#9 cbxkQmQZ=sc~ӖCGөivX#N5HٵFƛ1k3t߅ƤGflDm`8u%jGZ:_~ Iٳ<#yI'! z jk]!-i3XHІg!@tTl+GͳkZXdOs0f0.6? b]Pdc4XÄ 3a#4e昖 `gjSCrgAaQD|Yp{{#g&ί6;mc-ab u% B-V:::Zb6H=]U7/h Iq@+rx<>3J<1A T>&vOyipyȔJO@c6͕`<Ұ?A"Bv7 H2hn;}c \O= ݟsFsƍ˸kys սe/@X5-u6G Vd 8?=MPUW,x<_ր\#Պ94@v`4ͷ_8 ͗+}˂ 7Rs[)s+5WoXؼ޿zp,_8~jagf[exyyֲ!%yƎWAU( -(ҖtBe2g8۲-ٖ%[?Ϲ:ʵm]I3s}OW Ak_#)oׄ. UOPr-DIZdgثM7жԏKQ?)džިf]!Fr">Gf?ȹ}gwL0o4DČ35M~Ӝ a;2&|{"亍B:2AkgX6f|g1\S~xv-hqt7jcB*V+ePhR_慁үNzb/}c'. &1`p* W4*N.$'rs'_L AjyMO'ӥ>8&:!l%%cC_|դ;!Mu͎1$TuRرǢfp+V .( eep!I扂!ZmB!5uE! )>HLyP]+} "ʠtȯ'tk^rMߌ45Sb&?,~L}=-ձ=s 4POaɧy)CU#["V3BǀF~}efI gB ^9wLʱscW\hY7970{sQoVJ r0ݙЈpCӼsvICݕF` [AL7;sw0Oy^y6p+*b5ӇR8ȿ+s#}Zφ8 Bsogpɸ+aF#ǓYEEHbhe |t 6_ܼ|,*#@5\D~i?S Eo*Q ([G68 9O=G=5#憫G\W&SfA>-N0 'ד^6={aQ1p7wHG-h#aAд̇Z,A4]muF B4Þ޸Ӻ/U&`7(oG$6vMurP\t[5VS[lA~~M+^ \}DN OR6ۏ=ohɱ!T3+IǪ!k{[odSJ2}]y^ibKN`uA|j;3)⹄Wj^uㄩӦg2WBp_G>Gݳt=*s1W^ «\_k(kb?& \wJ#!Q.?wSO5,w2<|,pޑ+Y[o۱{wekߓ%Uec;!/I27N/~ .ڗhg2)/ LDDZIjBԴ* 5ד`7.rc30k@$~x3(Kq o Kj"|L |AA5.ݻq͹ =Ԯaӻ0fajhv<n eM 'ٶڡ<6/C!@Ӥ 7^ͮ~^ sүSCL zm+z;;IA-\k!"bjxyV ڰACOxjO Lu6"*἗UDc7 HGK`X½Yo:MpvWK},$Mj3nl&PP/Պ/42he PEC'׆Ns2/[,:=WL314(zFjئ},[JsD5wFVڒ%Q 'aM6Cz.,BFC]< >_IM#FX:fӋ/ wF\4+Q̠I]m 5WTY,A4yVf_sWLjcR욯OARe| gVWWo=CS-i +'-]Kx;mk^}[.Ζpl+5<@Lb*>^Qg5N\<,jG wYCK%0}Ǿ W;ӄWB1#ݯs:3 ^ Ny H4d. Et%ju\ix}Q KPBF&!y5EMm@?P}# hjjh'颍GPm[aRia!ұXhm>q+"5&4STsR? Hpssui8vuԠ料 h^-aUiN W/yh+RGM+]U>)ܻAaD9̞]o^7(ok퇓80NcW sa!cO|`BT츾]('zfo<ے\Udu oXc۬sum\Tf%)Dž:A6uK`p-Ռsr)* 7OZȷ؂3L8 \|E\G^>A]xhE]G9ϢnU /h:nY (X }<5В_,54gt !hckׅa6ǰ>hgm_g-/4;B5gɾtKcʃ8V,s>룊Ͽm?&?,ϲԗ_5}?Ztf](&~3es2XN'i.|SȞH>Oav1QSNB)(״VgC'ҥwl~f)nuuqkvmhVi1]$ld}e-?ڏi[_&[?, K]C2pq¶&7hB<Sy xLg^ ϫ\&ZFYs2N<`gh= ~|=D:|/0R־O{4?4Vx%tؗ/1Y< W)\_- gi&ϗ%}V&1EZ;'@H?}\VWeÀ$|@S=qjIiQ5ZH^0p" qPZȏm:@a<}]ʳ2೗:{fU# 8-l!qtCcq1EaPPӺsOwzR% 0 au#dp'ˉ<-ALWa>wi#30g,jG¶ ɷ RZR^Cvx/[86Yw$\1"R~4&Ս]q%i2yqhA[^uvW3wP'-%tQChPmhy][qLuv t>G! 
㚖a9iK]"5gL\<^A3}@A \܋u22?`GØxf׍症3uhG?yi *Hsͦ?wQ"r 2ԓS&W;qؠ2WB- >ŒNa0)W#ASQ4QUjinJ &hTsQS!03IYB 6sS0i&r3`~c47dٓ!Nq?tr=C$II龟'g?Q\ťR!uz[t!5z:~:N{'R,kh<tߴ<7~fϾO3:H2~u?Qz mov)MLԾPD\ ǜJMA)^1oXm3A AHX0Prd?jJ)8JxsFKʬrcD4tfٿy5瓾%}!M "⊰|T 9\@fgv[hpрHp\eθ4ӑr>JHqFA(SG& CJI']罸s6rpREwuvLiIL-xLZ]Qh%]9,lvLܕut`]|<|L[EaS{Ox6{a!\\81|M C2)ׇg4UuD4L̯ ;Źa12S,e'tWW>Nݮ.-c^F:n2j_}PGH ƭ8i8,\_4+4_ǢZb:3.-{XGSKc2)?54PL6gS~x'=۟eg[]Kuǝt?WǡOG)|?9 $x5ojIm4U޺uk'|~$_Өi06Հ-W*VJv%z.&ji:}^eOQ|ռRZv0Mxe?Ls0)$УٞJ v^NIWBSR=iswN-cD=Dv3:& yx$*%4s-hb4f÷;Ge+8B%UC֠ոT"n5p!0q&4uzDC0QTZ4]߸^GEwϮ\7a G ԭK *³־pݽqr=a1{;\VVuǶ,@G|s f9*,]Y]Ot>S>0l [vz!Tl*ˆd̲0 彘w˺=kb/ vEd6i`Um}ahKp>?t4Exa[oO:>Vd…׈V PHPKƆBha _&)`%DQx &R48Fr4z×t,ʒ`kw6c`Y%u. A:>Gj%s?Oo5QP*Rp.׼'`[7ƪe5Gf/D11} ;Ԁ> t$+D&>QXj^uH9+"Z]?hT+>ϡ3GJHѤ022_?%@^I -?yK~ݷo_Ș;wn((bY-:roݎ;W[0!ɆW[=x"o5«曝^yXw̾w&AJ/`+*;|SpY54GL')ݩ|Q\+}^ĪPcfvu6ZW~թEf.݉)uh0{]δC%*_jtNWU_+*`_j|Yq\3DŽјUvvYcý;;<@Y?W%\4n~lOmSOonZo(+ w|uTh fy:zx=M8nZ8 U]@IDAT 3 زVh]^QF.4"v>{$,VZQa[?,m+|[]!1ܲbl(w? ;9kJY >혱 ͥS FWKeYx(GOm;ljx[=܍kH شEt/$\7Rhf K7~'Ki.I[wZS5|dЂkw  5A 6ǦAx5AHmK檎jn9],3ZaB[{C4}=L"͌ONc[Rжga$ GF঺T) \ FVbD\eQ|m4 j8 kw}Ƭw><'~>oM3^XHB|ajǩcE>NSI;Z/IxZX g b &`$L?L)9) 4S㪦f$1y"<>L"oRnޚQe/0!d2n))2I> ӽԆ4}Ҽҧ&x3SZDO`n=Sjkk Ӧ0gW:[\<ʅ4_yv&Bַx.5s6 ggB!5s:Km8"|M;I]Ļ`B~j76rqY91SF;(owtbtG5z+aR_^1hu'ajی&on\;l8Cx'T h >VpM 4axvaZ W~fg'? GАp?H(ͭ$^184(|@OO^Х{6[D{ZN8joHKumFզe*@pic ‰ȣ ^ @a,GM'D(eۀrk(q2[yle7F?\2VkȮL %IS&MTAfR?Dj ,  qiWɘ;m*0l TO+wE{hc՝>3擎E;n7ρ둉ܿn)'ZViZjc4B]|>wcV !F7u0)U1zjH8N6\O7CJc:,lᕟp`9QK /}酌hsv  ގ&*i/^H5;XY )N@z[t>4`~qq2Hig}Z<8| Uq3T€|ӵٗ촩N'CLP 1@~d0`m=4S);o?Tv)m*H1M&ೕ}Ͳ @6Xqob91S4N5Im= 1?Cۛ⥼5Lhc ײӧk朮edz ys>8MC{v~xl .wՐˉ2GN>WOf]eύo4ۓ7gD?Vh)LAibڡqt >H`rq|J^hnpc&[]Kq<ͻ_iUu8.5%Ej^La.HoYs:ap Pkk_]k6(ALXO'O4jtﮃ.{qΰ:gƒͭNWle _@c/(X L'.hhCfRd8xh WYPNSWO?]"_d^۶miBAÿ\z _r(5B6oZX ɥ<;aa.u/iG {]vLsNÕ%ǝ}g*o8uΧ}qK@5t,B:+{re7nfMf)<|7C2V#,磽}]81osǷ+ZQ >wj\uw@C0r>tBUpuR۹[]hF'wwg!@L,E[23]m0;}XBQcqj~|f}Σa;b})#<.X]ڌ_M=a?Xn^?,&yԩ+<_PEs"fw"oCDK{Fk&C1C#4btu3ߞl g>tAdƷ_v>;z|蚺#l5Qp)뉧fMBfӐR[o5}Rk;}+̧//}m^u2 Kb-&Mܝ{1+2r=B瓀}EJ2x/{. *pbи o .\ËCmYQF\jimc>[\jSNP5GH ^.ϼCLjU^9F܁1cg\Z&NmEsw'J{ AR|pNi%O)K#Yg:> @ *_/k.ŹKj{cMX X炤 ' 5Or؍n;p9ftrw-jB٘0ԣ?kga~ #< =BJ! Wa#R|S_UОzsegcȦR@Q  CưjLW;0jrq؊* < t&O,A˪?lwwSc"ͫn-@3"pZ~9'@qOo, 3G/::ܲ9GԉA,=kL8 p VpCgtL> 2̾`3 `jjy1=⪰3 t(FrUP䫏/XYRW"9h$uh6jJZ}oo *}? R鷫G˵yi\B/ Nzk:)$xN:Zb:ZSmC@);Y5C-OՍ1հrNawqEN܋Jj}m`d8l۠;=55h8\m=^&n;jظ43t}q$2߉6Z[_]7\[ 0\toi`F Q\\Ӯ]S1`neu׎0A4UT7FO3<zGs?;>tͤPx*H,p74j<®Ƅa(UanRq+W|^vɄu_Gش3Je;Pi~ m 0]=*@ 3cxHn#Cm"A7G>Td֔[Hŕ?r5a}-4В#8$8s~Nb9vQHX@:Apjz}`h8ٖ9Gͬ1bӛT!)PIOZޏCET FL|$7 D蘷}X1@^y %@^y %pHJM13VUP|u7v}gXd>,#{G8N$44J5ao pH$~k錒 3\k_F`>L3~^؅0-}= 8(毆*!@ w4oHht] 9Dl_;e@X]ʂ_ GU r!z)MA!j(@TGzVNVPQq?v4 GEX(p z>8fF&{}Niur4;Xh$ΘhqcDž0I"j\vrzLp0;9iIoIo6ǿ>䘹+  L̅F>Q/r)5{d0;zRc˝1Ǐr1.PFϗKIڄ^SQ~_a9|O;863%U)S~ vp?N1d]^>%@^4CԄRxyK5KjGYXT}*ޡS)4)&+}15IRv>΋KitkePJgglB N!Q'Ý:в,; t>NZ}y PϢ\.t3ɫz|A8ى]ӕ&tcL:<]JGoKgQamO;fU8LKBT`bʧa4?Qm Z<8CH^:tWCE0Hб܄q&J£h=uX|*qWm8ϺK p$P7:hr(lG;p%[ ASWΣ=Qa>oj[{ep\`;d¨Spyk`ՇmiR5sBB&ṟER򭺎}QyNSn]_ϭJD9َ'q#oPuLelzegԧC˷m=.o;<_Re99p<ĺW\iǔ҇4iM!Dbyɶ-?u*-gܾ^'jGvT#iHrLm$t~&|4OۑVǎuH4Cǃu7m6ryξ-%ǿeO}5 eHO%@^|G ܅RxyéN&"߯6ue˖?YSn~4{TK I {}츢z4AT:.ǰŹ'[Sg[ȏYv*ѡ G;G,hό≥gpJx,|p4/lE q8BpM( ). U?;\*P )hmcwXܡ_aUKm߶ýa1-B7fqC~M̽471vH@S "zkg(s%|MN %pu#̭5ta`t2yY-Mn>hhJ"cnjN@j˱Kht%Q ʐȘ8KxjNi,R8~GW3iVIqz·KE+:\qaɒ%׮;lݺ5,_<կN(/~AŠ_7pCx÷(7ŋDž޳>7.^Ƽui܍7Rͯ97>,c 7w\rI,I~SO B׽ua1o|q<\|Q=nx'C[ۣU'1. 
[binary data omitted: remainder of a PNG image file whose tar entry header appears earlier in the archive]

gammapy-1.3/docs/_static/galleryicon.png
[binary data omitted: PNG image]

gammapy-1.3/docs/_static/gammapy_banner.png
[binary data omitted: PNG image; the encoded data continues beyond this section]
sZIʱY_ɕӏZr s'wkrg+ZIVƼ^ӝQmKKPwWݢ2q#"mYm*9wJu?UUW mdie(mTnxɱhz2oZ{EX۝Ieuotomj`ln -Aca:|7*o2IhJY&>e_l6i؄߿~]!;?AlWFK5쥎{Kjs%:;M.+5IWT}L꘠¶TyݴRIsF4 7B{ٍnKuRnv9Oo^7_۸[/k!`]u/0^@&J#Uzx^A{3Cw~%+A~~dxհ;W9Z5 1ձV7 pMLxA=zUc~gr-o z h7"+Y])ې?J1ט4}놫y=1F=uk^c- w"mGǤ᎙ܭ-AW~W~gs6Vʕ+]+)=?#iu\mn}:+vT6m}unkXt~IsQUGj:0&,7^۬טyuԝ:7A+q#XHo[i6Stttٍ&@_6 ZAĝhWP( %yS*^mzca>9TVu?C7XgeGA@<1ou MXw{ aY\u%gTD@' vNK\;2]][G:j8A^4ZǮwK䟣uV-sdT:UIGBtB~taKvZvאBՉ{+4 Vg?a߫mzeNT#vz֍r>j6s#cϵV.(nnrL Gk[r$K/I^dvH/RůꝒkh2E*eҩWߗ tpHo>aү_t Lr9}g]A1~\a;/԰:.6t.OWj{\):.\>*c:dܹC>ǡZuv۴')UMlU,y@hn}ƻ"O7K? jl`wN? ho֏BU^驓t25L+,IFݱ~ZV:/:8R͠a:FN.nqQ3iuyZǮ{Ҿ]s#qTW<+nʙx3I}.IŻkrꝕ4}ø;E wU^''|ڶLWs{ʍ7vW˶r~Uh_k&ŮbFt}UC׾yr/ip5ߒBxݪN+ ZRf{VΏIF*myy9:/G:ye8Cq3S iԐ떀oq~[ vغ|nj&LH]6{'ڮ>)1mz ;E) \D+XN?q^NӖs}lP[=q S;o똦ʵc'*|zy%뜷:0wK_G =Ǽ8U#ܛ81"pAnU0L/W^ݝt2.g9CDIn:H{p`v\87'Ф~̤qiyh(/h5%v/ _iuvr'd-qt b\K7*:.\)|0`-f^˷LywM5p7Q'>u\{ў5it 9u\tѲՀ:^4ps[Lr[I_VeB:.sW ֩r]|>B5{wL곫i^͜[|2+u$.3ǻR sO|^Jgٹʧu u\:+|M6Y[&5xӅ\sd=4;.jmӱҖ2:.n*~=s*@OuҶat:Fgy^܌_ciWmUʾι#pWwrorDVfsä ">&K^=n,#/z}ݎ'~Lb**ۿ+ct?TF_vC˕w)я!쏡;8Ǿ M"8;a!5nm' IWҪi t@XV^n QPPߘ~LnbMSCwhw+ mKKOR|(sյ</1:pI4iTdExD54nDaiQEvE [:@ 1v7T@m3ӵoX qo~Q@teZljOj~? %{Ks3pe3&giaam7D m񵞇}]iC--?{&:H+F{rl5pK;jZi=Ԑi$3Ç_mlXB( ظyfmK4w^k~puLF&b*elz΍e|W!ZVnMIe}Ya茫m]g^ EQ~cg*h|>n'%mU>mܼRg~7*N*GjR"ҭ}CQ|[[~#$mtTkt:zSSW:oB0,  F'M[FMKYxܣQ!q.]:N?W䲕˂ZV'Vd\^׾{GT }_ފnAt1lyuu*@ zirWٺN `o*$Q\VC{Hr\OI +*kvfp)PvH`4aXm߉nӖ;{J/q5jVW-OϖNm3UA#/n^\wM Kyus]NQYAǝutMoP+2UKO&xH+\+ۘ[7:K4t w;:R\ۃq^ G~rJ׵[ nyGmSyQ{n)$B! H QA}gػgQ*UI EA@Hy w{;so>ٝ{93s4> zMy%`4/ E;IvNյNJch|3yR5=rA9¦9h,'՗(EPh5;:0+KDFY*K7U{ɫ (6{p:gA-H-[LE6Uz:jz53uPF =ݝûQءdFGP3UA3W6jM>708e_7geMх'-prDv4廿 G{p[u`3gf"+ O% _!c^O}{b!Vec8 =%67Ҿaq\~$Kmoܓvc/era:9E# kwۺN6tO4gPx>Gk']7wm T_:4򉌾QFRS«MT4/ >pdR!ǬߨڔG鬻+vczGMDZ6h~OF`/YA9^G %dzva$bKy\^o^"K{ǝ^ah:/g7\'g-eR=|+T^t \3.|g6\c\uW2Aq=j>^n׬_LQ[9)(qȑ\,i'PFY9g#eX: +'ޫ43m_z{gY~@{(ɚNƲ(`0x-I2]ȿQʽ˫hF'کg̎c~VsnL%u]Q|(ϩ OCzy^ h4 ^ R4.FIEDTFnhex7H[=Ǡ|5e$/Ⱦ˽V/"y6._S^q1@A^pxO}˽n)-?˜c^ugH{[ G(Wʋ{ѳ Fm@΃"P l:&()5#TF~1#e)_)7+xPZU2h!{/Eb=5UE>ٹ-MHhV{Ͻ,%Nn!P6sz=&ʒ }Q}4.?mR̟pa:=GfkKȋad,c 3T/^APO,虩-jybC>=N=eR7?'e; k {kэZHoƚ3arɤI"d#!`90<;}>qHcXu*KY6"hGbQ#Գ#W#T ?t`+}csxI@>j'd о~gYQG!cۆ -:( pa@Mqv3EOǩ N9OEA<[,'ސ0y:@ؒA.ߒ T>'=E ɾcTޅqm:sL p.F羏37fXv*>_ ˥^J0{nݺ:kߪu̘.{v㵅'KqOРA(ip*>us9r `buy#kJ[G]lwqm-5dܰ W@bVH7{\ƭÔ)SvgȍoѰo~-n[ȃ8D^6<&ӗ0 lVƋLz$//:%`Yg6>%|FbSz-~L=~JK0!mz-:0 dIqeN5/2WdR8fNyй{c5謹8j>o>f( `ؘP?Np5i^HU8bekՃeߚ|Z'Ì "ɖeXcߠ2 ,c8IG_:z=?#lYNg`4^³F4\%x [ь"ǯS9tYKmbM8lm%xReu◖ QI}őAqG-@d{ZEʍp] $2k%i&Q mohs}sV7> lyCJ-61Q跙MR3}:!^@BHl l޼ڧ,*nңݷuzBϞ=_` FH'ֳt-@E H18<;t:u[C;:RC&H@~ˠHi (F2DÃOJd=*V… ^^!pӞ3kQ`+:xpı5 (y<#Ox .tC1ˊzKa>o褉+J GhzGyA:f]C䙇,B-*E-s{lz':bgٿͮkrQbx1X\ƠO*Dxqw-wIo^(Z@xZ geYtIqw] 'UQ)ٴAV2xY!2hWKE5;0pm|IOq'@Z|ų/iӦwO Sf^οg%4(L$/¹K\Yf޺u뉰Yg6 f'prSh 4_e{מmކ,{,D2F!u3?+m߶ARKlH"$-":.|&)u%Ľu5^AƋYꊶ&P۔~xvDz>6lp~{pY~-dNjtu 3$jgn7iu=zg|?&΃=p呏]yV^IW\ql8e;eud-P<ᗿ 9!%% u<v=*T.;xz p+u4ssZW\8҇/2!Åw*vq30Y`9'SL~?r+@O\RV͎\&Fnau3ݤQC2%|Ib#u5@ *bv/g{bXsu1uP- m\(u/qĞ[ICaB Tƭi,cKPY zZp$0 ̌T"mny_ǽ˳(?ۘ=rA 7uCB/0nP,ԳU>u '8/h4E^NDE-#nLR'N r?uj.L'7M^O @]ԵCh{xtЌ "4&N i+l$aXJ:|ͺT|X)c=6`~% :,6;*:4H0n/; Y bs_@k.r1̞]𿁁P7k3صٌam^a(cܳ;`T"L>XeFwF! 
FUمqjY̙0|ڞ<<;(ČW6{$ g=J[G{9Xs12^TSE3\@uprpEkeV :dT i9 ^ԑNA+ C=L=]'M gx'N|cg[YF5J@IDAT=ߛ6m+l[CGFw_Y8fyS2z>2ylY(d|+ϩ%Z:%cVMep>-@Z]c$BkƋ.g3uVlab e #LŌ/me~;gi 3|x[tq%Qe9{ϐgN1b$ ,y_(<jF9r^]/^ku p;KԙMmQgMbB7%^jե[ZۺضmۄkG89UK@V2"P]=vK'dž>g+V(/c‡[&M::Uiy%?J)ة'Ħ}h֫}ǯ*c[}?||*uxo>4iY}JA Ƭ-B=F,W*w䒏5=TJK{tF55`TV8G pԿ0>嗔~QeS~b^zzTa7-YFa,7}|o9c^jY~_oX?Eŝt/EB=154ȡo6mnadf5eʶ']yϖt]}2^T A -i'ws~R%S ALݗF=>LcO/0N؄K'O5k3#-xcZRw'lc8}*Z¿M ]B x!<#g>'rR悐&~wO@ݩ_@뽥WGܓt#ӣ߿r%PD/YAHݛ2bDh W ýs^t~)Yl?"ýZ:³ƏհG}#*_ ]JkĈZ%ό3^IꌧۓN?mu2bɨ NCң48߈@):N )Yx,/}a^vCh:jzXMU Y55FK%g#¬#A[EuOJ:p!^uc> +.lzܲCÙ_GAlhllL< Rw ^awjiHqD ہƛԏ>K4VZA19+]e Oz'F|C/9䩑Nx%q [z™QtA9]>~>5W,MMM,WPƳb4|؇}(g[Jy >:͘w %8B$ I{QFj7@}ЇcI{MGt{@L?#hs\kb8[YcTU?$J%YF(hKAFw{~ڴ(:(nE ^vMĨcWk3o6~/x垄3#{/%‹ayӹv_{yW}>1`3Zڧ }]5Wnjwܱm? e06g+~LzQ3L>呤K|?RC,f>6nN z:4"uCKFj%FF"tf.x[\μ*'-] gF]"~GW"@jLȹ\v{dAнù,':by;ʒ]ٷkc'^E N+!Cf5OϘe/k lIsW`ڛao/B7d.t߸垄ƍ?܃b#/y˖^JOķ sWJ&J@LpL_Expa$4<x| P6~lW ~iL>J{&/$x&rGxɆF"tS䖊x0ppj߾}N7G@Ƌ<3N q y k,e /1/q%bhf4/y|+%xQ R fۨ#3縯@Ǯ蹴dsa=iUz#ʌnc%ܑd:BܓSwgdtX{_r^yI(5Q_ = 0-N%s:aۯx㍞K**̇)J`^_<ڦѼSVlt(QJE>O~O؜iQN%s./4=(6qcYKjr_lu/I)/PȆ }#{O*\Q'ykwe_{yԋ[:cmK1 ԍF/ϝ;wܪO@Ƌꗁ4vsX u͵d#n>L#@y-(DC xdu#˯Vx| yqBGƋ2?}~&w" Erleq̿ё`ru)y939/}ly2:ڢF}"ƭ⼦;3 @4|lґ:6:I?m{#ޝq;?cYk0&vuV3\}߉n-1` /DlGcâG-183Po/үnշṈz_QX%kQ]gEJ!iC { L]>zhwQaJP/V8Y9{)hDU+ 5J _҅~aۦt\gGdP5rP޷Hww:;3}c^|k|*-^O&}>QN6'r;}cٲefne+e:fl:: bK?4ˌEvȶ WpPt3:c?({oӞG63@HG}NY~^r˛7u_2^E`|4oQŇB3r Klذ֖Fi>$8aOk팕t@umRO㪀"W)@>Oe;A{nWkKqsϟ9[':68.4qT 1 /Y&M7?3fI/lj}r/t+O Jύ[ &?-tyTuttۈ78?.w9=fyvӶo|Ҳ}~=Jnڴi)^}-k:lɖE:ku{.bϚԏ/p>ѧ gcl__o`I& ~> `{;ui3ݿ3+=.% OIF/Hl)2;}/Jx/GJuמ ---QNg,fY/==dHOYH pK#xjo =d͛ρÐ|q0c OS6b+a45="^>J8)#oG%"tuEV?F]/{h;&hhc6:f=YbS$R0賕 SBm9:yl¨:[&f~W:ΐ$w"{|{t>~1:L r?E_[| }7:| a:d^>=T""*J6շhl?jzDV> If;E̱>A{oK g,H:-M;0C,SBi }$eFD@tIZ/?j'uxg6a F'?7ɿmΥطnFp#@4lFH:t;K47(?q^kͳqç"A$3fv=cKSq)8L<;zŊ_Qu3~zNt/nyuh\6h<Qbe踗X>0pr]vypp&a8NןC\{ᆼ0MuT+,9N)*|~rwy}.5.:w9~ ߷QMB[hO6 ;s3|lxwQa&ΏpAsT@><ڨ('ߩH~֏s.G_E_{Pڐ>WA[鼽RϚrt *~ &wzc| wxC7mC/Wi0 fO7/ g`xčO 14񋋟gL[ۄVnu?f ybH Q})oa*Lڻnu3,-UYJ-%"wI6"WC|5F|Y<]2amXCԱ]䈫Ε YÉuٝfZl)YV@m7eHK. 1[XQ jFRr>U^ 7jԨSdB.ayvʹo9Xt>a|hbJY+F [Keu+N7|ϸ6Ni>ނf hse|r~4]jC왩@0''}vs >vCdCٻlx>6R34ˎ9,ԩO7ߩCX:8p#aFfi_9KZ'b6#fDmeT2S5;"%(C`2ԅ:xmocA)QRjo0ͦb +o*{t. ؃:vdI>@֋;S7Q7xn} Plp m(RΧ" -"ϑ0MԹiɺ"2EFFMlr`a=.dtBk Ow~2Ӎ嘆Q0Q̙?r6!'{=|7ia9Ztr(_Vu3ou \ؗgs}~8v<{3'1싌QE(Ob1^f4Er$O%`%m峕: Pm $ E M*G'77gs/xfqfhm}[`Lg9E_kirUX/(k0r[ pߜG.v _/+nX>|m`%>~r| *q+GFnc|t3r0~:#:fP*=Gt6$ƔzGԫpQGdbK]=:4;mݤiܹSDmt?YORr79NoW\Qo)%-#G>'WdYǽߌI=|F-bW͏+3y[Nc{zv8K+d`?s7 a/užE0 į/a).w=Kl7Hwgm|dNgvA0U9 C~'QFsβJK+Db?m~~ar8 _o{gMk} 릞 }h{󰿹;+Q. 3äEy|o8 >#J{qJT?Ma9ڙY/6~$WfC `֝߃1sʖWv"6?OlMBz9cƌuc[}"ŶN/?uؼ@^|>˖;륱uQ+t]P{8 t& l֫uj/tc9ߌ{a2yX,0ob=s/Nn+vL> HdH1j^Xee0otW3hGnJR =gn2tRXb%t$D d[gFtúr2Zz[2^T.wJ)ND5ѧC"`K"a{9 |0Z858b"A+Խ8./Ml_D@D ||Yz}~l>LSQLBkg͛1 |`돿,u8}DkMBh0^f}}_yC3C:C]D;PwVیҒ"$"Ѝ =0BM7r(FFE2\o /-/p≀$I626CMk@ǒO?k/wQ Zi37Rcܢa4ft*<9t%km\u,Ƹ~L>yݨw0šQN)%PN[RեVLJM)hwsֆMٜVG(/i,;pkcFQݳ^ /&JD@*L|_6N5d\E43^<74*,D@O@3/_ҠFx{. 
[&nM/S HcYtZpk\*b"oFtFbx"P/X勴G/o%Nt@`ӦMmÆ #*.d3RD@D@"p1uwd >#x&WQaf%3Sf^D" `}]X7ĉ;vU| % @*m.mn/[H)9ryZPW1ֺMa3Рak<^Т, aP#;|Eכ " "֭[hѢKQam=6k+m6~NQ!cK,YQ1f?@nxg9*=kLPw]D@dP+mBϺ`oGNUYF 㔰"xc"2H3/::3W))"mW}|-5 ILٳ7@/pƎ$38y,Qui5z ΝOn" "I@ƋN_[OIQ+n8?bb EyK+T9MD >b3f[.Ah/Y赦w"kjNFͼ¯FZ2z`F>#ϼ`4z`<4aʛk:ʜ@ ?It_0Ȫ?s @z.7oKE%" ;xPq:(KFX]R&#H^FE(9,yF" "PMcs&Lj꠴E@D@D@jwayf2mъB-q1e x_`taҥ]D@ kUroeExc{ܹZ|N Ҭ 됯.ڎf]Q"D@F#GY<ףVJXD@D@Df0> ]³!ocĸok9Ũ܀ tؓxfa'a(!C\/K" "PgU,D \y;[RVum^}EEϋ;oƶ2QeY]D" 'ZdoPxJPD@D@D;,{3s̨x670{bϵ𝗾Y]`6q˷v@2^#K?JYuOEgY/G[TFw#R S0O8񨖖x' n3w oCD@"-$ܶ0\&۠agHKFzjılڭYk ,{6ZqĹ &<*QOFyag@]@ ,X߼, [-\)AX¹s/^`:-"PdU^#ӣ,eRSJ`ެynpy[?hР S ~'Oe˖}bȋ1@-q5rr˗/(>-s%~@i S\ |^~~Юk%3!2^D):˒QoE峞j~ :"!EL %FD@D@D@D@D@D@D@D@Ƌ4tIÎbb*E k]Se5tٱvzTQ?u(Ƙ"&#" " " " " " " i" EJCJmd)DuZ2'%ބ9UFo޼pkcI/PQD@D@D@D@D@D@D@" 0:šmcqȑ%QǑ;l]l 22^Q$CD@D@D@D@D@D@D@RF@Ƌԉ;aUC\v/3fhdą1d &<ԋH#EKI @u nJ]!ж=їqgk6%5kL3~yR]tp"jQ|qyQ\sr,r-Ce\XzAh>~Z6* 0," $peŤ@|;9WKFs.GG̺8#qIƃ`+.=%E\0%GD@D@D@D@D@D@D@RD@ƋT@\770if^&q.23~+_4t-" " " " " " " 5B@Ƌ)Heu`?_ѮF/!/Z2j٬Yk뙋xѣG/^g+2^LQ*#; 8nlƋ\+N'0vc0^^2^x'" " " " " " " )& E G'< t(9eB1Ҹde@N,g^>eȃK2^JI*;M K3f!G)W/?dZ P3'q=ǝ7dINmhkksɨŵE?7ihhpT,4"ͥ#݂c]770X$Ѝ yᏧn}&NF2?6.qʀQQ*)(\KFs퍏ģO,KҋQ7F2f^n@f,eQKuTyZ E-lǶY7p~vЊ@>6|uE؍=zs,b%TtH#uTS`a{:n}"S7}ك@ssd6+CE`SBV}z UD@D@D@D@D@D@D@D z@  mk:1YFuG.Eę+f fݛ6m:({3$ST:}_ N΍s(hE RK"&L3%Ό+OŢE꩞ždE[[[S\e"9" " " " " " " " ! EzB$@2Xz5<REQ:ʩ0`5[뛺9\[J@ .sG9wH\y3%yqɓ 8sB=kS^eV" fPJ;C'" " " " " " " D@Ƌz*k{8P)OO`ܸqǑ {54)/4"%$D@D@D@D@D@D@D@D ]dHWyH06ĺ߅xj: /3tPg]>ͼI8/jk={#ve׀̧8y'Ќ˚2ed1g㖙vyp;)eH䊀@u xQ]J= "DWO9#LǺA#ّ xjEU&eWD@D@D@D@D@D@D>xQ\stghtrG8Ws/[@ ,Yf5)>3 [d MHX&f 2^##\4*x~̼3kV{2%+OC}֞%q ØB3/⮥'" " " " "ԩS{mܸqw8˖-{]n" " t;ToNJvVxW8HEdKA@x~i(jvwnIP]tLxA:yT\/^'(V ={q{UF Һer9r7aFWzU9]=4?>9w޲i;A9=هc?۫3oz7羌Kpyɹ/%>48_'k[Nq۞:p83>?{0K{ycFNioύcWط㶁`/2א3_.it{,hm+_׽a%%`UOq9&37{|7{p6憞wv\zE3gر#N]4ctVhm WnϺ9n޳g ,x֑aÆ۾:7{>6۞^Ye˖=D:OgFDHG|ztV¶Mh߰ǡ1Mk׮B*6pp^viefed'޹Ǜn^p?aͫW>eL;o$e|K\?1ȱ ^=0:?eٞVY׭7W9!}.{d8rsmll?`9{3 F>BnDCJk@  @IDAT>EG7Xcuw-WO6n JJ[0WwQR.ѼzRXi)pOkpN:'Y#ܜЉB\ۦCopm=Ni~75zsc&փ<k͹wq*ƙ? ߶cxC=zY  Вk~PM8~+{ҡ^}Rp0~n, =z@96۠;c=/?w:;(n9tsm~asWZIqJ_YwK箺!Z??0*D /5؅WoOMbcM/gH#NiwݏȆwɹQǹngipatZ&n!Qҷ=2lpo+:Z >tFI$~^X?#$>?dz]S<^nБBߔ5dɒm$h?w}Y~K+WND {fRw RNy^N8q@KK˷bD*{ͪؼy{? WĝWCtL_Y{,_^3f[ f|s7GqW\Z*|?^oҴt/<п[He:`v.mVLKPː!C>"9c85Ψw>-N 4HNR[,P:4,D@wTBGX֟DZJ#[})Ϗi,l.xqgc183_F #g7S0^M=9fno_z!@G7w?08-' W{Wd:79w8 m!#?q'Ja: ^C1X|<_J]pe9|v(B^(Y-3X{9h_&a]Xr ORN6Df<>8J PL Vڑ(gXt[9!$K `k2ĸe3/2\M5;h?XmU 4'ѻRQ#*Fڿn{e\ 9_Ƕ<O4 tn {64z1wA;@Jw̑pX/L.^~t-m=a}},? Z?pё~{;?u'8g4:cF܋w[B #kuYEɢPoۋOn5ρķpa3".më逻~aX܉ʓt 6\Xr^W [ǒgk~Yp6Ϥ1eʖ*θ sZ]:o:H_`iq.,;V?A>p/2ʄuVffdGE6\OE9]KNw+E3Kq.LtOYf)͡^ 1atWemuMAgKc8XuH{hp֟ 3/a*-")j2^TtS I]ykt2?v~t:r˻Z[GxԠqzq-mİrzܲgDFˉ#ooQçKx^tymKɼ8G+MݻU+66h޾m%:0;L~-NΒQ<#=gДҳTj[u^,MT:n!όBg)3:cǃ4wTdw9I7Dy2H,'0<]3HlzaB w3b?̘Eʋl-&{ v ]|!h?SWXNEa{4ܧo1tܴi4Y9UcOI@) )ڥ2^ 5#h5egP0VDA%R1;jwQN|݇8ϽZy]2nmYgɪ<*^dOfazAZF*hvXeIoи»'zq~ϖbjmm[HqĶH>egxR>Mhmogkz&OHcsW|;͖rn0JKgiĵ=S.r^ɖF'Ӟ< (Ο mdcY:Vqmwko)8}½L:L f>?9YOu ܿE7p]> 8BYn7&'aŊSvmYDlv+C+1b< R,`~L>JF 6a=~#\۹8{}RG~IHzIͳnxYY(SLٝM=ڎٝBi:D%޻}旹zGQ4+^"ܗh?tP؋I|{ޝ]AO_uSR vAoǐp=oZLg#]v%kOtnؽFZ4 i87>$ݰI746NjgR$avuqƭ,#;2`xg"sDe+ Ssa*|5mlL۞+C#嫌 k4p0xic燉[gۘ嶻ewwG6-$T}Myt7PosgY I}qnXN>D~tn~-p -Y{~[:sx$~7ޫSˆ|7.yn/÷pĽ)ύͳVtvD-[u4Y8|W%R:hmm):(i}&w r ӽ+mM+)[3, CIpM,)"Z߄?ipܱtND:"ZY ܏prM yl>?dȐ ZiȻ i5d_M]R#ӏ`{[9:?pv/ *%G v c2uH].CGNp? 
o(J`ȱ.:һhIC첤3GCh.:Hu|^!`#1\\Q ÅQz4ZGBE:EBi7\}ZWm7]+rо_CB6\X.bG֯rS٨;t㖆X>pQFFj4ȃ!Bv;Ye +-QpA OmyGTÅɷto@n%FkO z=9t&F8˅렿t/kn ;3\,<' +?bl($kϟ _Åpa UJV1"2ކO ٙz9.Lc/”_&UWMo]h#{z5 OP;a Fl6{;Γ~oA4pPw(+ wD{ػ첋ȾG~"8;?A/i:uOIB)7 E`S.5ߎ+ $xTӲ9TZ~X|e{mfuNif(չNC,*?5 >|CJ`גa  /֝r$v-Ρ8MF&. \'quMF) !wmky~JqXeo@a.y>A+iׯ֑ @^qѣ%7P?3yY,]T[^r0u{+c!j:D> WimHP4:}",|YvBߔl7q}&Q')?QkYyvi.3gNM>.Y6~s݆~8{i`g$sf\rm}McEG'k( 4Lcc;QP%rM @á'/Е@# rGU,1%ێ1gơrsur{83 7<GQenmn<8 wեsڱGy(i|gtܾw,_AgE-@vL @Hκw#0bTuZ\uXGzomVh rmi7lk ;}z^:"6<=:t-J"ie Y1o,H @i(e?Q}uyv_ftt:!ns`gAh%H'y${M P^+k.y=K ;=ONKxNeexv{[ȵ%aw a93! UBϹomx+VevevDWYistim-% ZDΉ=c]t/"n} dW{F_FύћӖmI~$Iz׽"_"P5a}0yK 9ڹ+y&ؖ ;Zk@:YH"dggQRα,1NŪ4w}èz'{V?6.Wf%\AXiQ(ߝyTOGK 3GMSѴ|/J.L-kO1 mtQ"Φ;r hlȝ4iyDC3'^$IErFc =5y`S3Үݤ}+aZ@&'F%8ӽ&> ֳV눳 Nw٘:T܎H[&>Ɩy‡ׄd:ށ|>,I$^`?gy[=gu-B^(t{}rS}RlWa1OG5\M6bp~פKpљQMN#`q;KbZefcXfJ})RM fr^Bō{8k[a_DV3qÍaG [$b5^^4/D .QafF ̠"*ۀ2az}}w=UuΩ֭[f{<_7E<70tux?q"eʪ #",;:ʹQe%[pl`ɜl=I+X#PY+U}oyx?%-qMLJƔOf1@({/ITվQzhZƏ?:*MZYG1TBB[d@`uͲ>P9ay{VVt:Ωg75>(\G|:F.уODEFcդ)Ls<ok… \~SJt*$J[ID ={\P__r$M7]yL9ߥ4+X0\i]~Ԋ =iŬs0ꋥar?fbHK =_*.0iҤgk|/3@HHK0ԇyK9iqb%?L Hue5V_TAQA8ucSx$yzڏuavݣ_^%ի4.'ơ{B #bl3Y3|W@Ψ^RZ35xm#/A:f 1\|UYc` x5ݍkD}Vpg3EڧrK33lHP E6Vn!XQ{ gVV#0Ɖx&M82o0(Ut¹Kzve ++>770S .h^u\.<눯mU/*?/'ԅ{O:3t&3_I<_H eOE`2]/A'FihwA_ ٺ- {Zo}25 ZӜƠS2bPwFcX}dlD(V|l}Ve r KQVQqI9q~$412e,N֡CW[5^}Y .2^K=:'^:d/\fI!\HGOƋ㥙}d8N_F 9 d!⽫wzӦu).Lh߼!ִ5d}cBv׀Ō|jDLZi`z]+ e bcFZe%0OnCî]ZڃTr#&\r'A&l)]F$!i0S33KzV J]ċ7(:6|<d`.`uuQa4u0wΒQ0׋0܌@Me%$5i PLo0S> t9~+j̤JԍwGu<-yڞ1U'-qg0ҳk||7T}+vZ3.f4눪JGb|Bݢ[c&蠘uLC#KQ06Xvf'ʃ ͉F'p>%0sÀL6h1M#,[{v*DhMJLYdOda&iaMÝId #ʠPHB2S;AE}T KL"v i9uj0Kd = Oo7x o :Z/H*2MoD{&\-@Y*cjʚ;HJ>6WLv' =#Xmp톷 s!Z GZg nDZ~4ZU֗SZ8qݹs =(;۷o 3Y˿{TA!S},M5@pųTg[H a5X2SҳUDW=SxֿL[W]Fu(sKt$0@,G?MO F{]aAtֈӊd[>pؕa6^ )0<NeWl</NL|9qjH=4t|I2pCt3"-^x/d0mDFlFک_}%yewoFAe"}[n&YIezR\ܱ#^p\?۲BO2VxF%aHKCCTI#X59) ]"Hc;v`N%D@lօFlm4ז;/ A]CQ،OIFl*.D[cJ'-f{t]6.8|C•HLc[+mbTǂWL%nN$6Vo,.\O aLL+ü[% Lnfk>uP"_ϞG 7ޯ#.A=i6Z`gs+~ffں„א"I:A7a I84@9jm!.9aE!lF@ANtmH 3r>o-Z谑46Ƌo&AxAe>ZQ7LdAzbz_Dj cقC} f=< KgviH^Tx!AHEO1v 6L NVcy2 -Q!k[A^~4^dDI04=;q\e_/rݣB̾reU˧P]ICC {iHLq+suRgwpN[:"솇MLC%[r Z b:7ݭ:k7^|hfxL)A] >G'hCW[/~] /d k,ܵpK,ђD9ɱQPR}^rVc-&H bÌ܋|oS\a^fn <7ip[u5ʖl ?0.yЮmP?oP~͒42^đ pPM}Md4q)DǑ*`?j!}.&d(SfH.u .^:W֧d8 řzHhW٫%s\/N{WU8Il] sͰ, p4Al\++AmhmlVgʤ"Oa-l[e 7bEdF2| q@kD%22,pWHM `cjԥ4:;VB>3a5~e{8d@haFJ% m] |`?VrMd K j [ sef1y%p!|'6 FK.89y :ё`:gJG;n-z0MDdhxș# oV5RIeN\ ჾn3Ө1w[e ϮhÔM^Ym+u$ԅ3 p)#1^OW̸:ɒ?=ecK. @gvW%?Ob{ۀY15i KRj&ҥ5Ij&Lp*Es I_I @2A-R 4ʬ i7-?wLer&g NA`C# tfڨb8'`j;%}@)(1)]ok}-̠l^7[Rvoge1=)+x0 \Spg7E޽yCm-0>Q/!wiDoD:*e[i[W16_LDK*fĶ'}r|:f+f;rk u juL!DE46 l;aLYo;wyI#x|awN-p y#ɸ ZJG&j辧2k5^9s\ħz\"{<_W%G{2[FoM#Ek4:O/;p:dvlK̼ w@ni'^Px.n 9^Pu"S;((YB2٨#F. 
YLS2D25Ik ݅KeN}8tSVTL^ Y{{6R'@udLCC#(a K]h[GԩL5M?TfM+n.`";{^c=q߿ fA;63%ע=TBoC"E8&8t!vv5؉lwjjj~ಮ-ssY3jЎL())KkU|H5& V^0`(CE3/ Eýc&c9;^ cB wنÜ|\>s gzv]v&cQnjzɥ:"o-a-V{<v'F<>͸9bCy.MrB&Ŋs |0;oEXV4lfw,՘;w >iy[򢢢,,;yL:=S45qp3OTfM+/2apj]3}f1v3fl;~tL/> Gޗ$)Ec*0 \sƍC~.eyY!NT:ڵ/t(7dUv QM.2 oQQ盎LM!zvc*>\Gh0]4RVe[?(O8x'YĬap&Nq`f?0G /`97#Og-VQ9N@Z)83VCRЗYF:Kߌ /Alryԭ[tٽ*#\tBt/xaV0-u[f6 .&Q~dyc08d:<4pv*k!iKQ!=S4F7i2 r2s1S7N2,, n%h6J 2'D yG/"˽n녎efL/.#>(Ty IyS4#)i?tR s6ZT01a 3JM'foc"+Tp6FkIut_HK{؜q<Am1jX.6ƁτseEf !bŸ6v$Ҿ ,~춪E.]9\kECw/zV+^ʂmPe^3Jqww؁4I F2š`Hn ꀥU Yóz2Cab l͜ekhhe fgz¡ᅇ Ţ` ;:πbbx鋯Bc t #XS8SWTljn>%%zv蹲c jxxt5/&] ~f̈ӅF ~z>P]s#DnӤ:R%Ry`L-a4YPgoMftyHo.C$Bf3 6١c#+<4 J"^I/KO fJ((2^d #Eex})\yPv[Ft>` g*}oT}?c,p!zv[LitZ|#l^ՑP๰OdKfu<rLG/&IygJwz $g}#,UҬ Af"SN21y[I(c%&Í\d6=e8>,YxnLe*9- Q~Lyqk'z@[$}I"&4:[&gjk~j֟Cn|W_l[c2k[sj^zxa}dr; 9vIuBbi# bWaƊY̧hkYYuCtEȗgq)>o߳ AvLRGd3#ܠR~[xb5F˵ .44pcЧ~}T0իWy{4lq*N<u8SMt;ߓO4b? vGҏEw&&3e|q\L 2'W'`4_]"t~!!aE[p8|)]mY 'Sݓ":|dBG#N-Es4H;lŋ?^g`8C WFC; skI[[nXu!݄/HUz2ЮldO6"/n^a+VRGd(h^ٛ|,\5pC:>sH(W+"cL|$ o,[K`.43@ݹV+`ݕؤ5Yx23`uXL`tPt)2,K>sd֒_c"Ժ+nĵ/~k,zyPҤK_P!`7|x9nL߅ݠzߘ1cơQ{3h{XxwC02H2g|:6Z5D\L48hO3s;Ց8N|vIFʔYu!`1bkZזf*qIEO*o#ax~"Kc29@CWpŽ;d:OjBn9eB>Y&P]x>EV$̏=->"!~h{y'('4ˊ`xa,HF1#.~;YT J$<`7SS8/ؼdxЈW?403^01)bcT3zE@JE)z0< Ցc74x*V$:^2.'s)@tAet'iy O H6mv(| 96wץKJ$ݔړ .NFYI H,\2.JDЙ|id< enӓx!+@,ŠvHߣn;`\!ޘ1cF,CMWTm+61R`hTxge|DT>jH=kd U".-"/D.v#>@> @n>r56X$ysC}C 9A^x^ddFf&]z .4gkd*/b3ثW 4inlj=F$*-4ǂ3.փ+1E"@.Rn ;I" cvJyWK^8qbGYq/1u;kW? 1oB p7jr\, O1Ohb0^0>+#Uq,Ց3M6;/+4я֚0{e 0P<:G৵ɷG=Dr"bp;q4 cӤ|p׮]^[4<,`0Ci1dd5hV|O㹛.x'ZhꥑB/Rb1vLe1WS4b!2^+2Kףs7JU>OSq7﷗~_>tSiSDؘ\FccP(!гjgEhZk?}xw%{J+: `FR.?} G^FxiIKq WI[DLClZQvVX-Zt8>@)y#Zf_X~>d_ AYX>Ҩԝ_sdPG[:3`Xs9+fp"FmE+1W^`*I<@IIޕYހM6iYuGuݕLҜD XQV6cZ{ nQ.Ń7p)^\9 E_6> ֫\gz"_>1lDU@.Q߆?'Y?~I0,5|DE@b:eJfFesX̙3xݫGb6x6kQFiQ(m3T0If_ױfS:Wعs5踜0t^e˖mCo4I &QN!A٘K9"{Y쟖e)9/xtu(kCb1Åei- 0 Ko!!OгOm.TG`|[5gF jxuDd7÷`0iV: "S0%V4!d5xva28#!O죀>+8P#ޑy4 ڵ ֚5`};&\р'ҏYy7~O2;]YG? .IZsD 0!!B.#[˖oo'OR/fL F3^+1+(iZf"v } ?'PaPgyYb<gǾnԋ"F]TG";a_*{iM 7ȣ4X$H{+G ߰*a!j wY#ǯH/[IJknbwi5CZM_8H;T93BV\؀~-=ꋬ?JKKa&dFf:Uٲ,QCz۪ w:Y{3MxZt WoxM2GO :.xJxZ^j!$\ _p/\ę`mʆ4! fr.L7՜T< \#}b4#׳iiK`p,h!z6*3!;ex˚ q锔>W:}_U ~OLꇎтL|j,Lj#௬YE~)]cd ~87~gipm5l LpP\C<n&J6D!zv-dH[nZ*}KOnb+$wN-2ʛg ڬh-b_K1`,+ޮ'  x&L 10:OGR j-**@2Ċ8;µ8cN ox ?Bʠϛҥ{bCiw~ }ח?e?6Z4] +l aAp- 0@y~{yA]ނ}$[è;񊊊>8~7oa:k V NdR8Y.ff2dȐxƵ&ζKӔ]B! C\, `?akn4yi Q_2+\FY褮ԨOŨWg˕y VV`pxw3z 8;'PQ >>^lƌXG)cuv7l|4izIw#^г4ϟH2xb} S^fMϤoUf`: _DE?;8`  /Ā9wbBƊY"^*cbV{a1:Az[ V8ܞ!qqٷo_z\oo^[Dw6NJ+ݘחNidžݻw؝&|9ň{_0隆187|qNEǍP^fn3qfu3v *3 v G^{;vj8qԝ+`0{e7'9:2#[Oe@3K*3#=7CxKAK. 
{7X&l.C#p?O@t.Tw\SNNOKեK [a2-q[ښg\#,98:%=QNH[t10Y֖bc"gXb} + \7tP'cX>Y4`R R܃h'WB7E"_b N*KH+7-.\( #oEuu(2;F Q^Nn0#"e1/)23E"wH5 %K4J h:|tSuN9Ly] 'NN-ϫX*faf֛LH2ԫ#̉j,!@Emc/%0hܪgXc.QZ u)n.+]ƙ F"Aʍ0`[# LggLU vnW$ @-LTk#M%y39D;Bn:rN?k;SWJD€ѩS3)|_I !?1/ 5&WH @п.\[QǹOBYQB}8 YYDvQL3!k2'Y>3yMx.iq:K+ fP42P*뫬+F#Gl 3mmaDLW]r/Di?U@E24fʺ*K<@[a@{VΡS$FCN9> NipDMȓ%BޯOosSR:vM _)",?שf Cy 4qɾTXRzy)gpZ c3`)9;_c*313o5i"vt#A .x=y]7&gЦC4BڎN|Zya:F}:'8|q$N@PWHa([)A68cTLN§AT_bH%V/Vbe,hIa!zvAwsi,.܌]$[,оl処%x+EVL<ԙthxiKL̢ut}Q/7^3$\ُxbMGQ dwɓ'wohh8r2靛58ݎ&0vj4pbZHg_4Og#Sml#Tf6>ÉY3S1n垌VУ\k>aN3 !&t1|3 4G!Wdƨ~N 㔗J%;u ;^FG{`b$^ kfn,(d=JSw3ы@jM+*wj0e!L7>fXc~~.6>uC_+P#oA oEl"o=1y&N|/o x/ t^9PNlܸq=PCz(ܶcuŎҔnZ&9(YCx] HQkx#xt;V%RzBD/Q1k򁙫{Ղ ,8Xuh 32 o1K2 /M(Slbg l6u:>p :>vh'%\ӯݫ x!ݬ_c11mIU!@@!^tET-2/킵Zk* B 1u^8Zo+7bF@>,A)x ȕ+ pߗi ר5)fnB'2^^2 #ڰȑOжbVla%K3~l&!B r^:>?B B Sp= ٳ5i}ZN [õm3U/-{ 089x: ^t)@΋0\>3e.mr*\kuޒ4Sa_L3!@y@"?yUE!@!`3SL{^ӧxݺ.JhzX/d8t4|h֍_,7 hZ6ޅv2OiFlGl&,Xum>(K:+n B%ij:3ȡ'c)c1MKB B 51$ދ %m@ )]2"7/O.:P|+G  qWt!=#wG1QC:ֳ Cn(yhC^D"hh`k$ B 7RLQ_W'6UB){_ @ņ64!B ;NBL>pH;5Wyټ23?]aœ!汼}1}H`iB#߫STRcpU{iK BQ09ɯp`+`UL\µ5E!@dq]wn͚5$']H`g)"bN {k)PP"P^^~:LAgϞd!|Y!EǏ!}H;tzD` 1ϡ¿7.fK@ >s㈢hx xtM]ْ%@!@Q8fMG B B@h1<@n4P'sc9>Uc qpD4)_}[UXjU5\Gm"gdZ1C}bփC-o?\P<6N*8tqԉn D(Sz'oĢrcM+ VlQ:B B@gc"!@6 0f̘)p=);v|"{2^؉&҇c'bH_\Xx8Ok9AXs3-0r< ?ఘ3i#QO#K`dpEaEQJ\PA?Y3p+_pu(dwbF!@!`|wVPzB BHF`ޠlTz͋fyI}B B B BE bEX+!*hXe5T(,w h)v_b0{*?!ѣ .s,ko׮kHB B B B Bn!Mrc{v6y %Y*[ޗ-5jdv&~Vs+V[EG`رb }25V,ȇ!@!@!@!@d2^dty>v?qe|mmlV*X젃P &tmll|Ƈ阙_`a@ŀ?IthzX1e$rތy(| B B B  @ )Bz18={a\spʝN\QEzSo roC~3վ'P W[3f̈y#!@!@!@!@Yxa9JG@ķk.6x`qF޳gO>w'$%lAu9 { :? !@!@!@!@!@!@!@!@!@!@!@!@!@!@!@!@@V !{3PIENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/gammapy_logo.ico0000644000175100001770000000670714721316200020150 0ustar00runnerdockerPNG  IHDR<<:rbKGD pHYsHHFk> IDAThov{.^fx1IS'V 9O -@kӾEkm6n+[R(Iͅי݇5$DzdKAsĐ{f}l+5Ib/Mx:b$9&2A|%^ X0a>sc0P( l ,,G>WW˽8`i0@@x "D:Iĺvr P NV+9)NA_0!4z]}`0!bj5hjPpuh D! 5ӑBQP /*!A~~^NRV nބX(C@q ^#pQtUt!/OGނ=v* İeWK88fGq˽n^}od ^ r.2t|O[?]a6d. chqRn%M!3`r@Kɉ&)ʫ a,(f᧧9s`@KF0 u4pkY9C4g !,KE4+KvƁٵ +PC6'#Cq(n{h:'+"_"/)9>8<W%a:TjP9cRױleZfp x P*A  X5d Ԫ +Mqb Y0NP‹ߕj4|ͤȘD8@>/,_*A9W =N:|݀  7'k(|\TAa0ɞGy<.BaIܖP. ;D.*ܸR5_K$2a\Va9,M L0HLG G0s9qw N^E4[>fSQ@6zXJC$FcFT&d$E&{I-~ kk^` @^0q 0L+@,o S++|iYEOb2h,YU 0 %(P(Y+?XbX] Ĵ!WP\h"[㤄Hf*ϋ%rifZⲚ*-F,kJ2$fݷÌ!u[y$%()Y%1\fGu-){?qEI704@sSXZ +C a!&.mZUBuZcɁ``%gxM GJJ'ƔƑ}{{Ç28<\VrWWWw}XY(aM$)u nڶTA֖?'NA, CJ[ߴm>YcVEt9D\Ўp È/=#ý=8&\u%h sP1Mv kyQXӔ?7\ 6G^w<}dc"6`4TTt: u]<ϣ\./]VVV.2ӎ#ǵdrr ϫeRʶkzFtW>JR&zO?tymqrUk%^6.d Z 7Хk{lXARgfhq-n߾<~[of}}Z-r- @,oGd]Rg<{qK%2 JvsSyB gggJW_}ŝ;w߼ySeR|7lnnFH`R2\_A] JCk(]bmmk7Z]Onj<* ܹsq<$IhZg=ʴ.Cy5i9s4n:dEmmOnPe%8&.LdFB 'bZ-"c|QS5>jRANdloo `۔J%GϭL&hC|Kʲ1 M6mT Tr!rN<. 
CztN/MDL#iR6l؂.Bj$)3P<*6 ѣG}=k.tݦ,X8x"'20ml22Xq}Ɖ ŋ?頁,+(dBh?\p5ID(d%tx%%zqF"yjhR$t]&{{qֱԻQ/%qٌ֑~6~j4>Q(E6-xCfqL?pHkR',|uϕm?`t[3Lz쳭:Nvh٤Z& +++bj !H.THnQH˰xH/RK-/VmjN%jaT*|}|::2W^iYN Tkw[CM+E^S/ި|ܿlvbyo qi\E&su%gCrON¶m++4M޽ƩƆ{.fS5ܺu3(4]]_!4@"뜙CPRǘ۷T KNVZk\޽{jm2 _~!p\ Q0ʹDt(ɽA:S!nzu_ik.1ӓ:=Uz@|i}|j/y iWg.}}Zwy._4za%tEXtdate:create2018-04-17T12:17:55-04:00%tEXtdate:modify2018-04-17T12:17:55-04:00:IENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/gammapy_logo.png0000644000175100001770000006461714721316200020166 0ustar00runnerdockerPNG  IHDRqK AiCCPICC ProfileH wTSϽ7" %z ;HQIP&vDF)VdTG"cE b PQDE݌k 5ޚYg}׺PtX4X\XffGD=HƳ.d,P&s"7C$ E6<~&S2)212 "įl+ɘ&Y4Pޚ%ᣌ\%g|eTI(L0_&l2E9r9hxgIbטifSb1+MxL 0oE%YmhYh~S=zU&ϞAYl/$ZUm@O ޜl^ ' lsk.+7oʿ9V;?#I3eE妧KD d9i,UQ h A1vjpԁzN6p\W p G@ K0ށiABZyCAP8C@&*CP=#t] 4}a ٰ;GDxJ>,_“@FXDBX$!k"EHqaYbVabJ0՘cVL6f3bձX'?v 6-V``[a;p~\2n5׌ &x*sb|! ߏƿ' Zk! $l$T4QOt"y\b)AI&NI$R$)TIj"]&=&!:dGrY@^O$ _%?P(&OJEBN9J@y@yCR nXZOD}J}/G3ɭk{%Oחw_.'_!JQ@SVF=IEbbbb5Q%O@%!BӥyҸM:e0G7ӓ e%e[(R0`3R46i^)*n*|"fLUo՝mO0j&jajj.ϧwϝ_4갺zj=U45nɚ4ǴhZ ZZ^0Tf%9->ݫ=cXgN].[7A\SwBOK/X/_Q>QG[ `Aaac#*Z;8cq>[&IIMST`ϴ kh&45ǢYYF֠9<|y+ =X_,,S-,Y)YXmĚk]c}džjcΦ浭-v};]N"&1=xtv(}'{'IߝY) Σ -rqr.d._xpUەZM׍vm=+KGǔ ^WWbj>:>>>v}/avO8 FV> 2 u/_$\BCv< 5 ]s.,4&yUx~xw-bEDCĻHGKwFGEGME{EEKX,YFZ ={$vrK .3\rϮ_Yq*©L_wד+]eD]cIIIOAu_䩔)3ѩiB%a+]3='/40CiU@ёL(sYfLH$%Y jgGeQn~5f5wugv5k֮\۹Nw]m mHFˍenQQ`hBBQ-[lllfjۗ"^bO%ܒY}WwvwXbY^Ю]WVa[q`id2JjGէ{׿m>PkAma꺿g_DHGGu;776ƱqoC{P38!9 ҝˁ^r۽Ug9];}}_~imp㭎}]/}.{^=}^?z8hc' O*?f`ϳgC/Oϩ+FFGGόzˌㅿ)ѫ~wgbk?Jި9mdwi獵ޫ?cǑOO?w| x&mf2:Y~ pHYsnu>iTXtXML:com.adobe.xmp 5 2 1 2@IDATx Ed$deNL"#.EQ{wyOE.% 7@ @֙ 왙]_Of&wfnWW$sowS|oUHHH|>ʲhԬcG k ~BH}B.>oR1v(nKw,!Y @0xU}-ﴹKPJ,<\(5J 1ДIRe4KJ*kX~CU|  3-q*ށ3T%ד A&KK,Z"F?%R,ưR  `B6wt?u~'5F{,!o#{@]$ >W͙coLk`e6<{HlƟ-a!Tsӄ/  xYhSj $TXɥ{A_$@$@$ܮ$7,lw1d[M؆zeWP.] ԑ/j胅y΂pA CrEޒ@ G&{DQD0g%ox>Nщ ^X?9\eN_Dὲa @0xM\?(]UT:8'~fY5pF.\Ig_zS@;-K|X.oiBu>cUUeH%ۄI5Q7&W-&CFo.r@[ݪ^^ +]$/mog_:{ .J]ns#GDV'+]mj<+'@5ahn~]VLh-1 ‰' _1[(J:!@.Uwd[r"1F]QBĐM"޻奔)qd)D$9< B[t9]V؉.6,qmay-NC xCSV#8\kQCV~ɱ< G =ކNi.5{91H%8|]'p6l\"?2FtZnsӔ-c5-펀489 OU uG=^F>CG*|O'[Za܋Pk\TjKRnk˿~%f9HLjܐ]Yٰ %`(?7rӱrRrL|:LR\:g|7:!@f-vWqxv?Kʉ؊e?r, 1K6mRdxB+HH[G]|?qMlˮ YhIoTreK49$HK!sߍGOiz&_eWRH )R0}݁WuR@Ӳ ge_0PϪrY ~GՃ^mʲ< @,uog ( k\ރ* ԩOõ:A=%I IRw&g*wY)`7v-=N;Xw{}?}vyZxB/ %xPFh{AǤlyLݕڨ& ;E(aHR\>\Y4JKZ(44ު%K! J `\%|ti k'XCu{-¥%W BtZt>'Xm]9_Y'HȃgLpTL_x j~L'בWW3zI݆u=TX<հܪ+\/Z,qmayd)D$*j]Y$QBЪ ^3>9:m8_Cq~3dKbdX^+:Or^Nu\$@,xPྵ ܘܼpK"h؈~h)Jo kWjskezaE5T9w$f f?qm:v#S?QG2$@# -7%%pf-r>mw{ *pƇ ٢,c2Kyxk_RH " Jj ʸy1H5j1vCOZ)n?î_XY-߇ǺHOEXm"@M۶e+!u i@N#  4+$@%jσIs\nl;Ki,[8PeyE>CxrոatUt@ U'4$hXXDP\9I3_դ"p7SU8n:y%#K tm MvCzT"VA.kzy 0Sr=KF_[Eg}R)z+WuKO',`0=ω_ aQW;u~ &\t3n W®#]yVgm!F$`Ҹ`Y_m:쟰"д|AI55n聸琰Ž\5L޳~gXuM?] kp3%0O pσ7^*-ؾc&3pp @a o5{:~5vԲW_MZ",7O\&˲%^Jdspi˥i#57 \զ7B Dz%NC#F]|-$0e'6˶/64zM3#n37îvw?d[ ;$%!rZ-r"uX|oq . e.US1!ܛ+$ƶn74[ϥx6TܖuD1;|(%oẍ́hys FUE:r {T]]A ϦHa)Գ7Q5(|p&rݷ>8@GHH`u0n| pqXiI#CeR\ ]z3,> S_ ZsC$ʐ $@Ew#B_:nDC\!aWBB͑s渶?5a"}OLJuޭKi 'PQ]At-QlXO)1|H)(eQ:44[}#K 5x@Ѐ _ +cF䉀կ꟱F|V}F$TI񺋸),E^ڲ4CAL6 5[֮;Eޯ*_ڻ՘g?numUY7RH _z ǹK\FlTg\K _wL5qizɔtS].[,eHGAͨ#EsrٚAP@Wy-wәXZ|& masQ,\""mn)IO3? :(K] xɓl LT . ٮ*E UReHH#mm}&j0M~^)^g:DIRd/TwQ(LQRܠ$@ P6xP㇝N [0 tC@.]POݜ+K+.(Xo2*0i$@!(<x[J[w;UU%o\7CjsS8^9u6-Y $Ɠ<(rySCnH%MOc}A*D5v-DUP$@$p@ 푺8p3veh9k]+(e}/ޚh(8e. u ۓ|)M1 8 xp݀CV{.DxW-/hZf*NVо6~ ]yʑ&@RS5+K}-Er㲝 @A#Q*)2$oOC. 
MЕBg%JJ\ڼ4 Y @ {VSCDb醢_9'p (I)u]W{ T~#G (G=xػ[ x HPP*_$VuA{^$u'u('f]VYʐ @9-pr;֞(Yn<@\=>S?J;zTHdԬc`iUB$@ lK[uN_qEPy4{A#ܷ|AU}zfMƊ  TH`DlK^_lo*+cVMZH &VUϺU*YCLݫr$@=,yJD&f-F9ʐ :(>Ji<FOw*eFA$@7*b#4kپcgͮ}s-9 &(>U zރ#[އDI;~OIH Q:EқqO M DI!խr+[¶IH#i} XJ4ϓD+rNb8ʐ "!> @*a )ԉaDI^\0mn (GC9*ڄ{jֱ슏hRfY,d)D$@0x0jCTk ]6u /]:ԯ.:,fr,O$@ 0x0MO@Z+{eց"wekRHL``&uei<;0Yfp p!0NqTxtM7pσJ bƅ:! <&J}) PВ_C1KN(imsHL``$e6XQe: Q=zrU*% X 0x?+w{0@f()\fz)H$@ 0x0 RB@ Y [i ]x+JL$xrEIp!Rq A/xRk+Jj%Jb &%m#kszPR+xp5%J g%HB,A$e녫QR#JyV+۬~X 9sX˗/?qA0e4{.>c ik{3VR Hפ5يۂw B ׹ W@<5=2C} (Yl> "0{իW XGF{<^px34nҝ M`5gϾł Z'M7 WSSfv3Ze?d|JDIi[Uili'6>ߛ!0eʔKm'#8Ȍfj!twUUU_Cl4chq%Lf̘1dעGv_gH:PƗ-Z:gCG|7ތ}:]q΃eezreOXr`v;[  ?mDoTrTx,ax ըGHu?^Q+Jvg'N|Awt,A C'ZNqhorVTͽ0̜O$@:}L5?U68@`uh-}ݝCj$Jziӎ ?EU "$@~{u= D9GfScDɫ@ZZr9BWjL 2ÿkr>% Irqآ)qKԜafؔ' OLDW3U!OO_#'a}V+~%zr{ lJʿM:yVDn÷8qC;6>GHM6]snw@+x{8(IMqدau΅_rӧ^fOFi 7+~fշJȲpVDr}QcpBF][$&/Λ7okW9lѕ?@߾|!*tDI u=^g[ (U6 *rX'uVC9*<j-0j!Gǰ!/|1)'ҘQ1[e/A0"+H|X]Cڒ" oz KT;?|jzƳCS63bޅԃpq@ cK, Q3.Ãwٲ~#e]T̋W B,] # nߎvp֊X,{wvĊ HBu(ݑP=b:퉒JMт%e{9 7;). @8n =ig#T}Oo٪ŝHC\$\[Ћ?硝HU@uv܌3`io݈0 @&"pxd%X&r0^DI,lHum֬Y5;vCG-$@@o3ˋ/fק(8R7QR[w8øK @ ISN=oƞ0[O@)D+t{+J *]S2fR @\y*[}Xi%O?_ |#@H @ QbtS/o}i.# I7U/; ݌c-܊.zͮB*wЧLjw-Ytx|)5j & >F\;-݌7C]{6<{h@wC;N)n}>{y>ڧ͸>>̙ۗBK O@eSP-%XtL/?梍AȰl<Έ%L@A nboZ`bxd?---wZ/_l(`|]aXCI<˧!SvDzY~ѧLʍb (U{E_<=Xؽ{a8F(^Ѐ8lZiӎGy2{=퉁 Q%_ð]2ȣ;zҐIxH䀜Xi2,^@b =9I8~.2X~P߿~TI^?,<^zvϴ!--E;SK@JPZ*4/e˖P<ԥ0,CexBRWy2|h .+P|Y ,B<jSCooǔeio pxhxxAV_gCFN GS<)ԓNG p%2ɓgF0X⡱z?w'c Sd{! =!ڔX#bnVbĔ“0Ga~cѣ.\pݒ%K¨"q:O~4vf9SBo 0aX=Fxɑ7`בz2bUu ɢ_4g^I`2M~Vil5lF$J5}37k]? y`x|ÿёUիWoT2,^! QSS_,X:sm{ܛfk,@t_/^ sz)xHe`lq|˃>>SD3x4I*V,U-lp3 =7ёKkM7W.Mq+i12xHkn3c^5'+m_'"8q z+VX4ż<pa<,4ag}VژCRܕ41\Q6ސ 0&Ygh9PHk--*;Gg p1x2`dVP5u<,|n",]WG||Xq f~}on}Wv,*Cec((=/>?ii\|ݺ $ rm۶L~/ LSN4 O@FH6| b/K ]1+9̜V<'ARJ@Md 5d [~24Θ1cF~U0xKSE@ͨ.): x!:ZZZL:x4 ^p: tQ&=q3 b0~ 6y/m k%He~Ijj2p^= <))LO3~wy0yįCm@ h 6I&ƒtS<|y0yŬC PGQqt4M<`j0EH&J|<3h^wsh1|<U!Zl|qgŰŻMC!:Ge#C|x6?ߴyv͙T? Ӎ&z( -zs <!28U_ocMjo< ;lS?1TZx(E=DЂ8!gtVigaMU@crL+pŞ|ch{L_cP'J5?d!pܤ91aT$ BNɜxZC1iҤq𬐽|ykB#C'8s 3pR%wI :DjMՌ䅺2v$<H>ڽ<$[x{K_n\qH{2x棺4?D=!Z3Tݑ}9֌` ~G./Z[[3CE6C;^qёF<˗ڃ(_/a$]!hۤPzab1zR?z{8"CDYqCwPb$W!8^@xB 0<1e A `ͩahۅad,},y|=$!}̽ {MrC?ΫtsM[š6ʥ^4M{xC`gG`BIC[}OE;1̙֑!Rܬ7K㱾C/ EcL{AqUV!8C[mH<jppw@f1uԐ&'N{ft}U :x,Vwo`$nSðɒbz3"`:tbo :5A8<ÕZPG,2|Fٰ~U@("L8z} #G,X<SN="[}b=GC(ɒe4 3J`̙@G*Ez)ӱ xGQ!*Ҭ"jֱ5XἊ ({X>Ν;uE7Q՗zeWΌW!M28]7q#C_Y'0{l/_憈?g7:P]l#ӱ@HiC[-6;1 7K#/^ Glكח0uug= BKzp'I,elnV3̜u]7Eڹ!yftc]m. gGU :X 9s,)XF@z`j2azd<Ό: =)k3-u-ފV!5R7܄Ap ]/2sq{< @<}.7C8\UP,%9_f;_Gm6/:RߠAb 69e3 CPR 'xE5l4a|[*{zrIn.{Nvs: 8tHh!OY"; ke }9z᮫<s΍+@PEPqA gP EO@M~8F␅yӈ\ ÃިLa}iӦ YC[5>93. -|0&I'z x:<{3y'>x81EñZBz תg<b X{q![,fyR&JGWb]F||7Y&eU'50`Cĉeq~1x:D@N頁؂8&YɓPx5,=,z$59iDidIo3#/K Mz0d $H"-Gµ$ؑTnc!ƘCq4/ȡ0'A[~OD$$,! bԤᣔPރ˲<0eXxdymVtB ?IƧ'ķH`)nVvR<.&1)7P+3fDZ3{++SI񅉓fZԢIu9=݇&pE---¯wVdEB^дD14tRT5a&%U__o "_ 555Iy8 ^,+! @Ps` q%,hjjz Wiw2M !d!f `(ATt\+|FR<ܖzx~)."߅5Z!\iS{-p)\>\zGyL Iyp-]<~CZ,kR0,&G_|GrPҒ$NH%ń!! G35< a\<̺SL8E0m'&i8\W U~${d ~EPjrZ]Sl)ˆ&oPkĢPCv~) ?/f7w*9hĬvR[[;lܹ*e&Y]CrX]4lÈڵk <7sDf' q̚$maLp]avQ;q窒&'A:0T{rC&ڀEqڐrY}yOmJI D4P)qI,KnRu%P ,5s ÒzqڗB C`>RhfQuCQO2u5x)nRX\ H?Yb[4I,-5B@>JHqeJ=X,GMI u-Z\bkìK|w2]x [d8F+L5kV kM u#`7<̥}g  hiW!l#x32b.رr<6u6d)QXB J_V9w*jH$E_;2e‚ Ir dOlaVʚJ ,Uɻ?^;?ۆ ןęvgVʘJ#M$x^XEgLwK,9=qA7<C _BRmԩt,!ϭxijKƈ*I`ЯdqkȤ |%{. 
Ρ)%t…L>?r쪮NK h'оTR|WJ&V_ېom2-/Xz9W"8#2xHC %kJy\<B e8|˦ʂkq A>~^i -r.7cH+;pCܛٲ-*jjj.UC4)DJy\ћW xB&wnW6XǁoxWLWg!8CjI|ICM~,f=|ɖh<+>B#0tWtwqwBLe  @ dUoO*Bժ.\ 9W_9!zz ގ˱#9AX7&a9'B cˊEo0/6KZ7<Gp+C%| ρx$X1m }۷y{Z&Ľ/|ӂ%j8DU|8q&s}`KK-j (1/hO4i*F:ohW!MQ+`P'"H2= k]}/݃iM8n{(S #v.?= kak.7&  ㏗/IENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/gammapy_logo_nav.png0000644000175100001770000005167714721316200021034 0ustar00runnerdockerPNG  IHDRqK 0iCCPICCProfilexwTSϽ7PkhRH H.*1 J"6DTpDQ2(C"QDqpId߼y͛~kg}ֺLX Xňg` lpBF|،l *?Y"1P\8=W%Oɘ4M0J"Y2Vs,[|e92<se'9`2&ctI@o|N6(.sSdl-c(2-yH_/XZ.$&\SM07#1ؙYrfYym";8980m-m(]v^DW~ emi]P`/u}q|^R,g+\Kk)/C_|Rax8t1C^7nfzDp 柇u$/ED˦L L[B@ٹЖX!@~(* {d+} G͋љς}WL$cGD2QZ4 E@@A(q`1D `'u46ptc48.`R0) @Rt CXCP%CBH@Rf[(t CQhz#0 Zl`O828.p|O×X ?:0FBx$ !i@ڐH[EE1PL ⢖V6QP>U(j MFkt,:.FW8c1L&ӎ9ƌaX: rbl1 {{{;}#tp8_\8"Ey.,X%%Gщ1-9ҀKl.oo/O$&'=JvMޞxǥ{=Vs\x ‰N柜>ucKz=s/ol|ϝ?y ^d]ps~:;/;]7|WpQoH!ɻVsnYs}ҽ~4] =>=:`;cܱ'?e~!ańD#G&}'/?^xI֓?+\wx20;5\ӯ_etWf^Qs-mw3+?~O~ iTXtXML:com.adobe.xmp 5 2 1 ڢ  pHYsnu>? IDATxIu ZꞱ%d\KAEQŅADތE$"r*JAB%`(&q{퐋,=󺻾x,_f]$;6*٦niwtݥOhg/OV=emȓ6\#󔝪b #Uaoj ~ }8{wvvmKkU]{*$h*~5ݙ[vNtVEo#-cD6 6~U1oӫWGo?!L]*¯*~nh$j$.+K[-~Ů¿ n=sێ4?,QvcBM^j"Do{ܚ8&'ěQʧ[=6XVe]&75uK{ٯ؈ %~'Pcqf8``\4|]%暈8ur+ }˅]/Da Tq+'VP2`o4Д7 *َ<* Q]ةF>]=vX9mrSɾ.z#|xtnaazmc_? kUU'*A{sSJ 0" FYQ ">EH}BHBDYyn̢?5ӻ=Z왵ZgV8Bvajr(u>:}BZh-0Ǚ;Ѳ|"3bBCR &$,s0MQ~-PY}B@0+d4b*B>-AB]!?ϟ`=]o#}),{BdBL<؁L2z(l`A7V2tl1[!D&`¹XCR7NoGbԻm{ls+;xIq2{&$i*XţFC?p>d/HLPOH_a~Vjg^ǖxߎбEqϗB\'o<y?~:9 9ЫXXf/HA6U}mE\ޫ!taqrrLtYl{ [Taԝ3KzBF3̾wJ[SQa뫰bM7I]Scyio|"QdOFWi_AB׎er~k0fضLzס~w H0 _~Op}+l4a_϶RKlۥKiK !W%`^cHvB7@p n"v6mxm|E# ^v߀h3 C{>dV)h_Tv [!D` vLxo~;vZD ] QPi4" فfJL]+i^{Il7zA#}>NBH `Yw!eBag xݤcKvi3CaG=Ol=!W zp)*5ϚŶI xnP#wh3c/H LJ l%' {@=R#t)N Beۣxlmvl ň0|NL4 _!'vh#Ce?_uFV0)_5S7䡴aǶCۀ&硴 !ueѯ>ϙ0fBN Y{_OWUmڈ}ߌ efdhFmĠ0+j!HhPP\ 1 ]J?آ&+Yh*cvN{w~p}w{nV`hΠl0?,Ő`%|^;v*tsAJ3-(26:.BDTvy992v+t]fbA*aӘ<\:Г;?=E[Zd!Sx(,It*fNcO^FJu\ZA_Xh[$d/ODi/:h=F9˪cʜZCGtN˘8:qYR 7'ٛ1q`{1R>q e #"n޼DYqjt DT;G ׉3N#gkYr2Y bM-w'J6qo,CG轴'mKD\}pYk1|'"n79+HD~9 x2 z45''+ߢovOD}4`qYFL#oz>ĞFǓgHuǤA* 䡑wбh[XoJ@^謹sAy< u5:~Ђ{YxD^e|k.^gwp:Myt%@IsC-hoW~˧Fc|==ʖ?xt7-9WiGW~%o;W!ov39XBǣPnoѢ2Xux<0Xslv3֙b WHL8Wދ3㠒kW=y3%Rt qD>D9WމΡWy1ѱ(dt\1Y]ɥzFe,?B {&ޣs` 'unm-{}kt ?/; /k󷖃9G!;{hWk9?h1:in*9?-T&#jOS[vE Z\49-HUt@?:ߣ_btǔc%P-,/c3U<.}:ߜۨG05$>Dky7/Tt Ӓ )?^:^]|7'ͫ*@he(oG lp4Fߍg0Zd 1Q}Kk"o4&X`ɱ(T 1桖il؞ `cB͟kJ. ~Owu8r[)g?JZעsn;q66o\}9m_4ȳ@yT}0X6_`{&''J&*6 <؞ @o:'Zjɱ3 g`4]E㇆gcOO o,؞ `;hQJpl|BRD,`4(bh- D"bpE`H! I"͝شsT ?"kbRyW8LqȎ2;fvofUgZnKEiۅr{"7=\/32SbfgPU9l#o &se|I1>X<ܮ6~0Ij}k#erCӵ"ȝ\fy/ i8~ RRe,Y"!drJW4 k/8?.z`e*%71$[}3uvP;ޫ܋;E sʂAbm屚Wko*.[]i=e s"haxCIe3r9.KLZ]ZemEeWjڈ' ܮl]H?(6^QAPAY&EQs:e xE\fbe>*kš{R5"Z[D}) BQ( F|)JGZMi)4R0ZmcO>ܳw09'wwܹ#m=I +Kصx% TCHhS~^]#-'D ʂ7JEm6cmǁFˣh4.Ӯ{H;KŴ`v_ᘶ߄d×pFZY/@)b;XKieTm]|\l(^ЎvO- c?1 )bA~hlHg߄v_a.֎($){vY`{,9x\o ⁧%BZ0{ZuZ;V_`,gA.(+v ͒3rTm !I}<la!׎JAiq+N&&pl.өk<i$Ie.A rMoBH`b|!2Cqzc0ϸA;N 6je5߄0k0qp7/P)~K^iյ~Bm,j}fv.ءP'P^Ҏ'RG8~$IS@yi !p}&f]RTNFQ܁p8_iǧ=fX FvmdRoBH@3! 
8^W$s&|`0yBw ' '& NaP;>W~0Ն)&IUvHw Ɵm !bote s&2߆rtHBwg y\BH΁xdա`Ў(k311 i7 \Y;!$P >v"=62m1d!ܞRTk.u0{::M(px# A[;ZTp}m0|Cdvk +6jǒG߷ Z;N$㑴y[;!$`sxXWS玉nU8}{6rsX ;~8/l}Tw ]UUwVDA̹wH-a^=a*}D0XC%FiFDAY4>DBT}=Ճ"Rb#Q9]™֮#ଽ9e^auZgo",z2gn;q$ɕ֞yfĥn/Ek'FwE{MQk#8$;n0CnBqx8xhqZAY8ȹ}NH\~P{b7mth1O2u_\g$ojc;{ !fnk@Drh1n/w 1%Zo3HQ&GճXW7=B3h1 Ծp+B;?BO&<!f@VLLL,7jaHY'[{B~x˗&ԈCvb|-D3sף=zQm&F)F{HMoûfXB"EмCsG"{1{34Z?!8RhYC'÷8T~Bqx;3*Xh6!;hQ+D MU,DMRF|` DZE A Flh%`#6 0B|fۈqɜ;G 9f(*Z.f'w`ޣ{ `q,e-2˵3@0NYP"|Œ ?s$c(ΐ:)Xy Ț0CPwV%+ _vkQyi5~Y W;|}ysrm˦!03@~Pq1]% D%sSz 4O1 iO?iiig8ǵ3r~xmH;(t˯,N + EyOC̽Ov iUw7ʺE<9 PWis7>Αz[D%3z`477=# % pp(;-`h D\v4н*Pk [[bkVhzS"wj_8 d58f]]; d , !IM~,ؚ@v1i~D~9 sKhqaX% t kY <xpW; d =poΒY<\;M. _jg!F`>ΑzNY9< Dykv6\]hbC$k$=;2l;wB$4 WjgICvD\5]ܯJμ{jdM`*zi5}y 4;|Fk|%-`yzk^kglr$Q,ipεIlim*}ڙ qY3*v.؄ ,zj-Ufmڙ qY+—OJ,z>Q; *+R<]4? =z%-dyh43@"k>ٔ$IO0%Z;SOUe?b.ꜙ;FPhӴ 0hjQQ,kiR &riDK\h"x"T90L遼}}}v{/J="]Yr%U+5Vt&"Jna<~FKQ^eߡi>a#:%NK$K;+!Gʹsx T1\\KYDiŕ\-(*|o8<\@¿h8<|%Nй(Q)i)t&i\~;LNNީyN&و(QZ/Z;TFgmem£MQO.K:EKX th]C4و(ApU%asQ\gL] oy و(Au)_Y\EghQ`YzaAgfcxСQt6"Jaљ(.M<9}@NQϙ< 1t&GƧsw4oÃ'و(1uU!MOt.=ZP;ե3|D;-qEM{Kkx|DL, q2tzt.m|D-_%ϡ3Q<.:;aQDAN˒Ya҂XrY~>o;:~t AhuET& ]BAB5EjE%Pm hjҋ^ bb RR*ڂz.COi)lj?I/||?:.y;_˒GѹOgÂ+ݾCZzsQIJ7wVK}ذJh+\v&5UհFXEQ|mXp?yzx/B$iO,KEXinZ*7-Qs9(b:<\7Ρs~s=hYnv9tF-[tN"TxižC# t6bXl~C&4ɲ$ߤYjv{Ըfxp#=\#y[wʺO늢,Ns%+M=蚋FbxȮ:~"&Vȏ ϰr=h?տ&~"$nCQ(_^o>]߭Vk]w{tDIR֙w6cbbbeAA5 CMCEv rABӍFce=;f^Q!YcaP֢k#;(ў,~5٘C(^o61::TC*|:#w@( Sၛ*LBO^Cl4_E }TRtOh4lfj6`EcC=!{ZQ 4WtJ7Yu"Glg݆`¨ xUףBD_Q/!iڟJCB ܲ/B煈:*fS{1:EH]'SP#t^l)>ex*tLK柢]QmD"o\i7jSOy!|L+^9$";tJrK tsBD%sFElCC;G7M8-ÛQ*]EB\݀n&={ pK jd-QM zL/XH ܙ%&9:۳?sAD]pCEQ LȪwcsAD]K04͵ 5 7V Iq{,P\a4hD|Ӫyև7*6G9ANZQT n5]-sqn~xçzC$ Ӆ멣SJon5yPРX65IENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/gammapy_maps.png0000644000175100001770000076206314721316200020166 0ustar00runnerdockerPNG  IHDRe]: pHYsdzIDATx}|\?d*i60JjHH $ '!| LwKlɒuy3޹9[no{g{3vM w|z=rQܹsl(OUlxZ [ Oc~!灴?knߑޱc"uq1HN'~`(y%үY a| H $pu{LϘ_.z_F0?g||20)A< #xҳzS|27 $3$ !"?q&q:c&N{C['8) tF.Ϙ 9Y03u1ePPv!T:JC$~".?؀MEwi4< 4@TL# *aVt 3+6v) ,ppDC pU(5"(FO#-XY JElVg ߼Y|ًgKKK6YQ? #ʠ ᶀn TLpcC>nڬ³j`@W-K8믿?'Eg`v(pjK+Vx!n'!{JSNF(lNfժ*,r_&׵S,X0yrj7HܑL @u` +G\b0(8 |+؝|Ѣ`Fz5H?(>(%q#$eOW{W ,chPQFI8 [kC .VZ4t[RrC#&udz%Y]Q~EOƙyjŎYsPzHGM)>5(BLA:VcΘERзq'ܑ9GdpBR>e 2+H\""T?@9s]{Wdkj8_AUtmMO .ә at#IjHPT Aj6l喋߅[;?_ #6d}”T;T.jalcpuS?1v? $dQHTIW+ʊ'jCGE]7&; -jHN<SlܯGx~X_C ͺ%,/*Std5kU) 60T\ ܤD#$w$#ag!(JŪVvBx5|N[?)ҹJF\E݅9yBu45*پ.a8Pnaр=1.E0vx|&|}i,$6LV9How:gt0|k)]("D9?lH+"BglŢ~eBt9rLvzr+Hƣ= d o`#2@;6?xF`?ᚅμ/en~cV\S' bA ؑ~Q~?D@Tl|lJJdY UiMW)xNGL]M ;j/8P=[zr(kR MO iY aOC~{ fFFSԟNmKl9lam[413$PGVH!uxv Y1Dm?¸{GnsM}T K9ʁ[lPW0n\ 5ÜpŬ$U0#jb5&KQz{/az);^]`OahY&^[fa\ C@ :8FtGE ;.%i@qYl= Q;QTX~;cm`zK+++Ț0V$l!ٛ%/gؼFU(OG(4wUIa- u>T<B$ܼGR{K'F":̦>W8qzϸ ٵq&Fc:+Y6jg{4#¹eӝwpo#hikESMӘxn%$x%fKVc0ԯeq1CWTw106Qw[pV f/M[}"-H G}f5Ęzd~' s2 Sl-l#1YƦxS􀱔C)`g#S1\v~IJI)PBXg(VR" `KuDxж&4\8ǮFWo9$VdF dEB+I̜tQ܁mV!*Ԓſ\+~zuq\ux2V-Y)^Uz]osa|n|ֵK7|c#*]@rVQQd^DhD}>s糺( CbY e$#w$[~N? qѨwwN~ݕ/W?ȹMVS~H j1%Y;I$tYGNڂ)Eeu) ǎKD,>b__~EEEYm_2gGbq||{u˅* /ž#H\,Ak*/{W9sȓ]KsUυ `]_KzT _I^ON?O~OΝvwwwņ YC YcVn+͆_? 
g $qnλV%KKKl 2U .$^J 1HEIp69;mޏVM\R(:88J(HvCu~J0#z?_-袋{/+u]'pՖl<bKh;f7{"C_ק2/?XCϿ6>)?ownn)S'C@V)y ?X|#HrG8 ܑC@{$b1gO0]Qf>ՖyL.34窶ncg5uq8u5l:/%!(eP 4y`9uSӨ'h`$lX_5g i AL|L#2<;&IW#I1Ϻ qmu |_ܫBɜO"˜hUfg4!5!n31ʴVAa.jGz-qcΘyFM|vȶ%?췉pY2ISvOLv0vW*xa8t jjF$PRrx5 s ̹*o:ɮ `Jz085q@&g-ybQCuJmG&)0@0GRkZKK3gZYGbPPYT'M⧔8]qq&0b6!OAnF{ ~'}gI4م r Ep4; V%*,Hoq1(t7@PU¼,MT Bs^;{52rNjE #($u;6(BU˕ʯ(BbMboL&(>"r; {5Be6 nb_^ py`8bZ%ě л(1ui5%_If*B-\MLD%|HQ Z`1-#&L!<"->Dxʬ2xyS ~e|f3 ܱ T~,=p OU5зnP%}{9={ʢiܠ"'*z7K\K WY^-`ސB[Ub%Y[o}W\Pw ;=ς`qk{ԑ5(9H߬{Wwn/cHlm7wC&x2()^[t#1+w_{j [J%SB9qܸ~/O*'Qm~MݝbN! +f5 _}DjtjC#DЩJ>|%KW-uϧ^V̟R8ƞeXzo0 Jm4ݯ9rҝ9ATK sy57Yd!hqqs91H u%<\6^%wudd"V%鍋!bl$!1zfHdq_`=o238DtT 1p_?sPm"2i><# ӓ)p +6,"yY"o-X0////|9bL9\#)j]֔P Zz[7!"_q,Z.IX[2 :N\q"_%&VX9Lj7*+ jB S[YLlb_39p¦ʃ|pz`/Z7—iŠY3P1[=0svCF迉x BA[xSwIOX,ڛ vi-d.!_ zHs‘('JQuO&U0Cٱ^](W/GW \pQ`{{Y߂%0._f1hX+`rSBl0X|: exw4 KHX ,!)!9Cx}uPrF=yƢ {u}k'<uH,}Ɔ0XE>uU)ka[.ujN@_F ReCdPM.0&'Q  pn~hN@Q>SqNˆ2Am|߮h]U$ =uԸ/ F<Ʌ[+9ة2*WsÛ׭Q]_9̈ d9$[Bn%*F!hCzM)wCKP?ЙZ7uEm݃QƔ? jÑCJR߰<aqΌo4MѸ3bF=ᙱ>1}hQFQ?[+q+Ż5UHo'2J+Q/GV9jL-*&LgGwԩtmM"NVI!c p$ܫ ewuq!}uԓ;`WJ\sT>,| =7Sh0$N3O. Qd ѕb@ 3Oi eRZ5m²9@"9ک?c%›O{ؽ:LƛZrϽEy1lzW U*Z ZP^7$*Xv) ;EFڱ3-+xCD' +"%M[3Y-%Fm<^0r7/hi_t2WMbkBڻ5(Al/$bWFLps]K4C~D2.%_̂5YA74;qV@R$cd@梩Km0;F0-DLQ7Å 7킥0,+Ɠ,4ًG͡~cwϿ6H W Y5Lg>ΜFX8~ظ ^]N58 IsBU#\4ӔHBʛK^on֯N#|){ea7q([mt/͚5kjM *cwWY#>kנ CF8@H؉ҷB8k.78ΣBAqoF"͖+|+PTeM_kO􉏛n6TLowР" >ƞ*Li?BĖd,$X 9=(W8p4*?iLen*Gqc>'A.!yh}R<Ǡu0QAG.q0peJ+={]ƒ=/_E|V.Υ'9oL 'ݫif_X9%Ze]׹_I>66% ;:w$P9##w_}ŋ% @Ȕy<7p oezv9Ye}x_4YrV[fwN09V97ZzƊG _LBvbeI E`jCKU.3nwl aȬd;6sፑ2?XB  0|$#i"陷Ȭv!CD\Grf yS3f>/ztf>VAPAڇځ$x݂`9@r7 铓ܒݒ)1Ku5:I81\! F=7EnHIݑԈ|(eƦI NM|2q60HrS`#ǦUHl&_$Hx%\J'a%r /, 4wgbRÍaZ;B iG&a6Ўð!?>؝2̦T~1ȜԜS*6U\Q$&b,˦2 q1!{t|Y| 98Ǎ^`HH.V%ALW{{5'D}u@`){QU%nӢ!X(mHҒH,Pd% H*ݗV,1J>`dGKV^ڃ~(U FYlBS]W]ԖE_pef6rZ_NȬG+̐Z %6%MopaҮX j Q[  h ROh OK@yT!zq~E8jjVhF#yQ 4MDr_ݪk*(<*^oi-dfǷG0&"=|)1r pH'2"p=.aAHR"{#~Vh)Yg_x hhKawQ(C]]?$c)IU,/VTTRRO4a$],D-&bD$+R:EI`αlYJzB[^F,[xRHK 0.SXI\>UI&n#'%PKc=Akjr%ԧ$^'i @1^tAC,`sP (hid#BdۨCh#Ø4ctO8eD$k8&ҮL>[8Լ7Zv@VuAf=e#͓kIZS]uq}>lO0* D"bDCz-:A2%+!%fWRS ɉOyGV1-_CD8KJ)( v}wX,Rϧ0 `wE '`hNc)U]kR[x[zZ& &MMVR%%lXѹuK{%O x@Me6JLF;pH8$҄#=KD9%bΠk5`$qV)L)YȦYYS?!|< jpb>-\a{͂. 
#${018+ {4atSFΤȌR0ATagtpRjBVY6dZ)bLs0xNQM`"b1sM @I$#CRlAB}j9qѠpA {EL?(B0&&!Fr^+#IS`ሚ 2kqwʲ̬^:1[b$ M`v4CMm6I@9>jdUDY9vD!h Nƕߓ&mckSJΧ4BLe0U (Vo߬'St<j!=Ť 4CUhWNhUT&Cʼn!&S蹤 peTwQUh]&Sp*p{Xj{q`Vlv{8dpnrYLks;/R߼ޞK%ՏUG*)%ٸI%!1OU~]pFΙP,cdڞ7Eti̱n9%&S; xTohL!,%Q#GG8-Dv8c̯Q8GC}Pc(ԙgI?"+55mā=b` r p &C1I4.r!%V!l䂳ڋ6|ƌH4M6K7a|N"3Cɬ)|s6Lé*#sge%d^MEiNh'QAIr7Bl^q z'}jT^j%EIdΰ0:0fw"9 *=V ԌTI$Nl&1/cQ^9!I,?l_TĚ._ k3mLB{&nWAa y a(qX0Sa@a}9ẓ~91hb(B CA,!GƎ*N7NDhb9[`, F,6\5yb-n=WA8CB)#Hl<IjMV-`,H?^V(A*25Lj(kv<5#P2a$1&T2ʮDi~s~h)|Z[QpbѹK6`=44Esmk7R%*Pɂ/l hKHo7 BvLjXv2EB0vq64CL@La6#44KI%qZ̈)pBV^Na LEU9kyZ}l#5o!gLBnnu st-# /Rb6aBڄsk)5'M`,m#<g%N)4ʧ\L.A0OL&*긄L9!3X`'[+i0|>b6r>9(IHaDXKKxSDT㈐oÈ'w.G{RDsب$b(KT}ӄÈ'„ 纆F+C1YeF=ǓN2XrfƐ/1HЄ:$BF-BX< IaXN~gB=]뫈^!9z;;tn 3WKވW/@ij}NIPD5@Q~xU]f2J_?ET& Q||0s¬`(I1FKMUOL41S8= 1b‡3⹜8A>՛B+ buT;4>/ QV7%ᅠ[jUdBuSD$ Uɇx/4Eh&<퇝hDBNq^;~C*+IALnzoQ5QSdS0}m ncdmwIL_ FX]RMMchRcشiSN F%#%$KN HE\?!1 }Ac9Og+k4Z]gD;?op&Z2LCb5R)wuL 'p`$m{0FQU$JzL+!d'EL>+4ؔFAIkn~g· "ώt$~͚5K.x'YOqh k'QN׏s;jAfD7S.zQJwUCv@3ROѱEqmݰ/Z^]0Qn&b/aMB13z4a~YLm'io;v?q"v>Bi֚K#7=U'(?;6{O9U.JAQC2ͼ =ۭXClr='Mo{2Rh ]u zdJ8NŐ3.GIUx6ʼL~nm;I yN4)$,y)D0r 0CII.O=D`JA,K:$!rգ@ID+R%BAjQ ;"vY ƘЛYtbILI˷CAowF$81 w#g 0xו_8ּNB.x{^0.(LڋyϿe\ ݏƷVI }Zki;i}Œ ^ݳ!7;zĈָ@lX\9H3JQ |\;Ȕ`5`2DaȟPF'4<x8goíƒ}x^NZ)m%῟tW8!ܽ.[G ߁37n Âס ,ߴ΀'{(.C _~akn8>{yt6nڥ< av~n k5M|WV/J1sqǥٕ4!IڍaP>giIhv";>[C do?+5<DΫ ƹ;C 4tS*‡Hgp6iA Vs`!r}=öP$/FtA%Eup]/?AӭUj T.^ X_\VTlwxϾVaؔ2g֥0EgY6 8}W8pGV>$O f!8| xۣ$D <|>9GoÌk7]j~EdB8™RZ;1J_~|KnHfKPи]/Ku6Fa"tU\vGnuJU*tã'A&ZI2)cN7U $)@L'vȭ4)*!|$ ۟c* LOAF0N)E-ajDV6;4aDωrF p?m[6[OXS =`U*8$ۑhѳ4o%p#v"h(đj;Fg+z C8{} A"a &cNyvWmS<&566I?d"9z ?X"=0'䉤SĦx'I A@+]ǬO"nn)fBڍW4C=RI_-MKMF{)Xx rQ 'ݿS&;aEC ⃪ּ֋ɥf0M;f & so~ a į4=r's pLv$ZD8$Aa69F(;f#V<{x>Ӭ*$o!:9|H9㺉 3Agr~wd49m0qg/KstXklw&ảmHF0d$+\{d!3inS(IH$9sl=a2*D7V068Ȅi1lMe5@Ha3Ih7M 8ۮ' GG6{K7vxB{8$ϯp1yC0' F]&w<f"ם~{,pC iSe 2;3yMDl0_ c5 <'hmI0rjCbD!b>AfjE 7F4I/ho"7 WD9ܺM&Idez^1|1Sg "֓qGv 0XҮb)aoԷ"!j$T8Q4lYƐ;DT{<ViWڭC9I B`= Afq6EɌ0Ҙ܅rZ :=ƀPܣ "\ʛ!3̳&2TA77_*g-&&s./Xy<\HșxYT/̭hiF}(Cx ˤ%%o?~Pؚ1S"v+ $A2ُi/_[l}#i @I")iè}8~)iVgN"fA l +Dtqaf)B)Wb'oD3P?= ^={bRJʃ ~Qj918 Ʀ>#1U#bKrVoa4r{\e&d[*cp6wRҏ7nǡ"e Io<ޡ A3@-9mD`I&v(BR)PJb_,@ Q3$nRa&Gy^X 5oj/pGfAӶ$~W1a&DAy*~+={v\rh^ GMs-Ki %jdEab ]r'{oK] 6 ugG 8+4=lrj16 SQ$(xN#0\:Z?PrWYtvu(}bՒT;Z2vUmh$&a` oK D*GpI(~G!uW_^U/IpՋ}j!"0"&;}z M3HWv|+ҲVRp̛d8ufFm-#.?zoS(_nOX1p&,LYBe/_Rbz\`ZEL8f+)9jv5湟{{ۯUv̙b“'<+wI,$#7LĻBM_Js]t$)Ek'!Ut{oq1iS{ɝ2/qR-W,ϡ7:iI8n3`9"~|sO9K_+ə2랅J|1>_W8W";p![̰~Af`Vro9.NFU>R#E }d6!F,w'VE óD <"Hډ0p:Ń}pE䥁}Y8ڳL+(\gc0axul1-=;-xf!W]P,s>cW`ME"DLq7o"'뫶w78<&2 QϹt/79XV DA@5rbǨUe~VN3*4$2Ƃ=z4ǰQe{Lp#";p/֥ßkd .d= ~F0ǀNHePLY 3?OOͿ%C4diKYL*KRdf uxmBh0%BƱFV͂^z8iʩ߰a'k^L 7_>}`ڻ >xy3?4Gr1![HT `$>.UJvCz>>8xTI"fNaZEF([I;p |Q06e,(ʹ 0 ib_@?\usN"s羍\I{6kOOO>>gw7.[P')L{ VYP/Ϲ9s W5_`ݧ}S{P}iJmiZ.efrG1@ E厜##w wܑcܑ;r ;rGrG1!C5Sm޺]4)f83^JŌxGzcPAA`97nHR̫~-r&HV>dO(ʼn]D,,z fQ>|8 " d2E/L,$fU⾚ڶ| U .>DlatR(*3Ec AMmى-8AmZ1`i„,FR;n"p f33cY#\NSuμXA ))AI1s |q, *ӌ$tWT$$LNZ62K04;\E&Ja |ឨ6L}|RR>?Y-/0'esk&0`;"06{\%e&c_DM -7 <şM[]_4S v 3Q!rE?EpQS,*YV!| )g#óցd(7Mf &Z:vajXHI3r戨Эԋ;eyJlIs9t &lD n`M^LP %%ٲ12Ġ.PL7)aDE\,(YM-4Dz I:hcOD$mSn\!Atp>˧_ xF-+EJCffEvD +fbtL*~Lvr'3!6hB9I;I'1k}-t{"%U`"`*ӄbqQVSe7*-r+˸xcBpʓ,YQ|E\B#\!C%΀Irbځ?n<)"6v&YWjHPKyHY{ƣFˢ,}H0+n]m<2Y9r jCidfE3YX |`@1FQ&L^"FtudIp.|L5.;i.um2;Њ{1^{Ph&&iP;MM>&Z'&2?6jiiW_MteqlBbՁQYE!ϥ$?[Uc v| J@Cb4 }!p{l2A>٬2&b"*űr.H-\)F[*?Kef6MZ- #l%1 o9D斬uAP!g!iR`#.Uz̆=`+ĻAU @<&M,8a'"].p#sC;`5)faFt ( K]hߎ-fLJǬ2?¼&n6 Hhz ' 䝰\s@.#}8!B^b^eE9*62iRdD @rC B}(rA*G(,6X?.: @Ieq(&Ҏu7d.4YYPhZnjy &b2C9WD P%M8Q!,X@ijs1M0>}Ҥ dcCt`IsPU${r;b' 8mJ?H4bX, >F 
xcƝ)c("K>(ߥkLLDd\a,CnnR68*U@0YPab^[c!S~M␹._sMĻ-BD&vIn$(z l7b;Qve[$,t[/b"4[D*s0Lf:ESNtHADIa$Q >1d/Em`TGLβmح/N#hH9? P@(D) ƤQPː0Q~ ${ԗ yhDo։fP)eGvS?D:vR^ W%TzbBBw UĘcG;=Z1>tb+%B Fem `Ej `0/~j}}K_ZBK=&k4L9pY' 9^>PQҫ)ZT~:qTpB/R׊NһBq"%xЮ*F "H,*Lj0Lؚ~b_E峻581ݥ?H%둪qưl^dSDP=h*)-piPTh ^b:&ة gUh )GY&F;\-Ö\`0= vIAa۫U,X0Oϳr9*2'"pps 0%d ӆwfqTz"|N@̴ VVB $~|߸?4lEC0T(;kK`}WLiMU 8F e D?{fWUn[$3LzO PB'`h(by*"{X'E"R&H Fdf2׾g-κ'wD}7{)oN45~oCwl0Ե;HS$R;\6>S|IKB:M}%RȰdeRلzH"!v(>h[ZvT9$2;=0ls'moAK 2c01*a@f;s Ws6،fP5 Rq6t9 :B2طZ'E0]̫%Fw 8R ׌c kA$r:FPq2vA31yMq?ҺO~ 7Ů'),Ǖ7M'a5)YN u0N^_)/I7I@ix?|WoԎrB&2SVMZ\\?}شoyƽD:/]{+A\ǎz&arhe r 7P5/D$,EcĈBD$ LLGso,G 74F=`hG9/zdN0Vuq ]WOf â9k^އZpMfbv-h Q4irWvC1$&N1@RbAAa s8Y+"Jj=.WvaSI,a¦̱I&9@nF)kMR0"b0z@2dc4C&iY$@H]$Qm&6l6a(zVc;M#x2ŽFTD6Ls0>,bV LNX<{.W|}1d코>Ic=014B3Acȁ7vBČ$ḭߐ "t&a TfsrE¼`9BĖ0`*!4xXN6/VIciOjb$'M&8ZvaE`QƼB0AnbiaTJ &]*+Er1zSD_ [RUL㔦s 0Za?Ѷ`+|q .qdG&@&y Iq?M^R.W>~3R9H)kzuR3L6]>L~!YY2е_I:Ӂm/./f:<Igr az?}7WOaDS*f tG}i%Lݥ}:@Ig0=҄ gJ"4_b%C':5XxJuX_j"TH}Aܐ?LۜNb7L I$˳&No ]nb&6ݫ9wo89yՑO{Mo/phaqO%C8\Ib #;|Ep?~Sp8,ZT`˶&h@ {  W{t"^@!hRʧa@\@p$I hk?TzAb`3 rw@>x D-,P.% ~=h× Qb WFYqeг/Pπ)5م_= ^HdKrw*Ic%]T,i`d`!1f[#]b@qaFu HWz=—au RkUiX8; ;K4IŬ412MvLA0*ӡYG+(vMO*7`(L=t'A1A _z"}w9ї3hKE;KHp ~N'`- E1N߲'M@mv «" ̵' J\k=A/45>;Zo_ҽ+eq/Q6ZܖS;r̥LxaBzM(/MƄx (=b"xv{( _ Zv(@>0ʠ Ud*| 1wF:E\@2`1~kA&Re 3f~| B>ؽ= ` jl{ym pJ!8w/pzr#U z: B#5A|#RT/O6{`bsO$4~ I!2tMUJrK(M|d*cM0 0M Ϡ }?n׉l^/}'<~'`xc<"7~_? " {f9;4H<<_Žat7DfA;la:BOw]Z.d8έ3 Hף]szq֣#TAr(#EjI=cd7vOՃ`6kG|Ll:q͢2S3t͛Ԑyh&.m(BV^WjץR0eZv5*lO _ztb.F1AApV/mC箇( ^LAqX͠H] &}jn^3Ljwuu$ϻl&Qe[0^h3aM{vJნI6(fsʃ84L(l*L{ƴ8܋6h;)0b+HsqFh!)֜/[RKc12f QHK-R S||D" ؖFяn/D_˚}]j<0ǒ:ѕQL%";NFH<~W#Y*4#]]"k yy\:62 i71r`?vHT6XE y`dDdU\AE$!R`*NNڭ(?LQgGlSK(oǙ+zO+O@'2.tΡ*l6 Ic85ջ BVqTyNaم#J$QgUsLB@d ӉAR`?m%X6}&cAp@\,>Mp&~PǼe$|aCk,vyG9BqgskZvE_?t jx9![u;)l) PC|tLvٜ}([) ^W,JBj^bVߋ!b Q%ZOSq+#鐑 B)Uy)63p&-8Ͷ[NE#7mP&<3Y0\xSڀ)~V<p9(G8~cz)|h8m0éeV3:q? FvjPHɉ.؅h?LѤd~0r̦wI; I8 V0Dt!Vsp`Uc txҡǘ 2CK4P7x<kBA, au[Ce.Isv'\ e`̩] WXJlpt HO! ϫk%:g@q%kmS&G#FF&k8mI/Ngj Pob&0Avpj#4pD/Jod G`[s<1BFLH2@iB*S'Ѓot1:Cc=H!h%c3k<= t ΄朴י0]x̄]NZgpBhNRke·^~q]jDwr_&*6Չš A(He `GZ7ytb09𐴮#CHh<'zi2. {ړgf~bV@0j+~{ɱK=%n` I1cdMH>N#PMD,93F+N6p4F%0v;Qd>\v~1TVK;% PWTAܝ_D; D FAHsg/={]Fq2E;TL6"b:E|GHoK "mƾS18BIʆԵ{UӺ6#@ lE1wZmw!*`b)JǷĄ㭂ڞ"v Mc vؾ/OWπy,R/ڄ]ĤL"h>MƗ(u+BDDEϠs?693z/˫sSX-x*,oX@ aӘ)?v=;GQoL#|(P! QdžQP?Ǝ)GLTORՒ}D'y8DlX'c֛7[m)6ȕLaif7t3cGࣧuove%nع/w0m*)؏z|, e߄U`hZA) LYBQ5!]M" ) 4nxGNpHG.8`[ù(r J9-do: ꠏ s a!R̽_w V ^q-Z8n+wPVo=x%,<WОǾOAXcGB0\weg&g 43X 2)/AS :$|)c1:ͨ67Elmwh k +2!E,mMONtIC΅ A, (]%u-.~"exs.7`Pq'`(|)PQ]Gϛ^_ ]|x88G`cex\Z5GUKup3q^h\R $d ]ˑ[e3(* hާ|N&c bknNJIpIÔk]$& ,Tx+.i= >4||v;{'5emE O\Fgzbni ¼/@%+0#!q!>o ?*|'V@۶=u\' \y-ϛWOgw/8,c)-l埩~~ .o§N26,ܭ:Sψ{Xqpԧkƪ(Z & >GqCm`Z#E+)μh%Hok7O)*1X ]0S6؞ul5ta_߁[w' ~xqk;\}0^xc<W_ Cpᬲv(}]1^x XzS?z!0mj3pυ_Gق<0㎮+:aG΀[VdW}>AF}_v5Ʀ DĴ&Gܿ(u ~tM6.g$r\PSV&PBcxquң`o7/z:?KHN՟3?ZcpȐ}p:jqsQ(X /qk[ozJ>;KN w _qz 2a8tCC2{15. *Eh˼~/\؄` z.C68pWH09OX`[D`0v;q\FUNؖ~$\x4!-Rz4%6޻^GNUNv%S'OG_g+ifn ZkĎ|/* z& =WS?FaL'g#9I w3.G) ӝn=(MDiLxx#_3Ԑ$IqB>-5@I+ >̔7Ͻ ⍤zf0jϠ s-VtiжVߍ> K~@ Y-[HgF:~f|5}%]-A9(z7' MhNC?LݐsjOb^~06(d|FU:r|HäI}}U~=2WNXMGḦ́H‘L*aTUumaEAy Jcx7!dO s 30:#`3. 
T' kXgGj(ij落O h&գ4É\p4Mg3bY .^L  ě޸WR&ݟseeC,Z{8:)|2 FZHLd37h24A|}-!D\_#&ʯ8ΟFEC0ɦª8_X?B)0Zt?5ï$wϻW vRl )%i(]?H?| ƝY(/a0Ɓ%n IT`,ܟCɲ+|jUFc7F:7<􋹴bdLj}l AS䬎br9ͺJl_)30g_ uCBHEQ]vѮAx_O Eu8 ;_HW 6M #(; ;3{Bto$ 9: 1G9TN`-8;(4P d22 BߖCy/!N0{sq%Ab>g{ӗWv_U<ܣgv Y,=]0gs![ψ4`~~X5CO [k~7t=|hݏZ D/{09CZM ҩ_wIp:Kv6&fQDa-c2&nQ5"FY arƻٜd&(Y"eudX~+R!2Wnpԧa'Pdӡ0J73Wg {pO"m$f$ކ?}p% ago#l*`lpgvW14dsP AhQsd$⋦R9O&],JAAMJ&g;uq(Œ3'lλMdPڡr47SaEp?Y b{%MxK-^* o4H}UV*i0ϞH1K\{0HV9V{m(V)v;txcR <ޢȼk=wIp ɼH8ܸ|KHadXP@Dڈ9~;I-E Ӄe" ]DΝ"Hժ*WZZs6ͦQtbFl/d^;;R5E"u"{u5fG pJ_N)AZ/!KB;VC(M  fc:s&-'+TD-?4Є fu rM-ZuBnMB~ߣ,~bB/=w#+61; "7l,$s\P>F҉u=љcݰl\{v8|) FEY.DG} BΗ0D)%EWKy$kUS5dn6ql+w+j>7.ذAg[IǛDȝPV6DTR Oxp*ܱUn,9)nAf1Ixau.BnP, `T7фp3ATIEx/EUC{"OD{9 FE"WPl^> h k6Ne;Vs$wm\an6T}w mʅ\|b ;{ϼz_h{ȵZOph7~+׵\>Ocބ5dr#?`@Hr2& $I>Z|kL yC]懩gW7R.䮤݊m2ϜD'#lΝ\hg7 N d*Ї4kW2?6M ,E$t.ri!>J59Q0ʮC-v{ 7,XATowKO٭MsƔ `lcߩnin!xVM8r۹JBE."hU) :|Q)z}4b,4\9Ns%22/RҶc4g_ ڣ0OqπZ/LB㺎yvE/v_7j@u$`F&)EhҎ BHM=b͖"c#}™Hw8M̿/NcKvBѿ Fʱ܉$&8&x=7 &sJ5yʜgey 4Q _xc0OK:!"B>4_/soB::Im!aŕҖ8|caCG)p>͐.4O8idth"ny5D`qcM2۔ETC=^ }eő 9n}Ͼ|?b{'P(N(G  6A\SWO nhoowK/;21QVܫV@|WWM!]ܶO}Oݹ <.o&lHQ!&Du?y^o}0aao<%U'P9*Smq[oח#.Y}Dhc1zEIֆ-Shϖg~nHCg?=)XQ[Ma2{(})@8g 2 jT#5⭦--]D JN|rJv*KkYqi0c| }C7E{8DI7Bd6jJxw&TFHcRM_If *$3; 0#i2Q+ԫ͛7o[ (>ܘH>_pǃu٘wDP:W{˘"#QdR(^i 2g.Ae\t2ۏ|B0}*~W&QqQnk.8\# *:IgIDҵpNoBX| ٢VB حqM(& _,Z19t %۩;{NBm/Ҧ`:Rՠ[eQW)yi8Cad\q_#`a )F?V63[k0r֓FI]ۅYcv;ұ8IcoL[&O.4THܰ/, ܲ%=o݄DmtSޝIX(>0PQS5k ]3B1_=t7T"pe Q _jޢoT&?Da1 t<&| P8ϟUﯬ2Bzڒ/)e}c_|(8lP%рf?o`Bahn1^"-tL1q¼"t4E5%L}1yqi͂XA>r@/Z2؁g_J6tt*cG! )Nx50|.ׁwac޷D\iHք99DB`nܴf ArddD;۷od}2 c xDžyk~K`P0VdvK+O?J+#{h`'>A Abh7,Y@ UN陗5@$@y]\!S{ x' W|!<^H|*)i"=)ew݉_*{A!e25wtXFЍw /uV/yW:>9WVV+M]w/]7T EgJs_ŬȠ'ڋ}  Qʤ 0`TzkQ4OҼA60R()pX'_;YG}k6iZa!&ThIbޅ }:~}{_1=JW!if y)pWoyK.1>Mo}[?-ylG/^۷o{W_vuĽs&hvtuw=ݳ'hܜw3ֹ3g@nP~"aHmA;vXF{~OS.- 7}h "6<s v  2S zkK3J/<tjE0<}z54 a@DwQ}o!N;l!wݑCLW0={OL)?T&x}c?a{#ܶ͟kodYMrpY L]L`WD1mw =_k8mj[G_P,?35;2W1[ 9w"%4#p"|`27Ke_H;zE`1ndG0ܛ(Ar 'OާPC!g?.>TT//l9uQ{ë~pEܦ?x8@·qw`wD QcSڷ % Pg~y,oWD8"Bq.oxA /cd&O8]A T1x POby:lٲ^8s͚5K-[ҡyldms|Xt*Ouw9kƷϿrc$hX߬ImL(оQF;d=`7KF (+H J򃱸 M'IܰcmN>b :# " ц 1HY 'T*6 }،Wέrr_Peppi<5JPrw~_'eKw`VCMq=dHPWGQ}T8\UHƚ|ğvho#ì^o"5}瀸}P]]s8V Yj @1ї̞Ҫ-R>⳵=:E53Ͳjhǔ*M'o\QOmE|R_Y@izA۔i7ORu͋iJ6EGzo}_ZyQmiX m׊ϩ8={ ?FO|[k}scI ^lp"~>\s.}sώ9n٫. d=~#g#^^G z'QZ*cMjVjVjVjV;8hfլf5Yլf5YRVլf5KXjV,`5YjVլf5YRVլf5KXjV,j`h'cjU BEQe"Tr0sktw U-j 7Ayc ]{"NWLf8~Ur{;X q0vބ-\J h t Zs"At3ڽF7e6a{yڎ#<Ҫ~b)[ a(,$]Ġ1qu $c#H`7kTߒ`l3&Uↀ;L~跈X)PxdMi+=MjX0vh Ņ yAqtTHd~!쥽M؄"LQ % 0 E16[8g_qJ\ʒwyQ x`s@J)u-^FLɣ$$S`C!n4sHO;}wp a}ׄJ (6=KB!xh]4V!fbX k0F,!|kH el –}FX` A )T )Vl2F T sE 7EF|@VfNKˊ^ϗS(g9V<=%זǺg讄I[| 5 LX*jˆL YRVDh`$&I'Ġ 7‚dV n+;ҙPuw7xvKL9vQ* "Cd͏ <j<o# ΛB!9U&8kwᠼfeotPJ3ys^qnd MOD+f($<G%rg㠌jilLLs9%\Z"b s`vĥ4nMa cfU> 5xqsCxx[7eIf >QpOrY:qذ5x2BxDž'$bw bBtʘW)1L$a9I6Eq*!6=cD<שyYRV~TL Mntѣb`#lPǑ˻4]@(W>p~3KQLmI8 ڇb DM v+o9>ԕR5 =1YQK5 4 L#U4y*| hBX*ws_2,C&H+Ί^D@)yT2 4)CySx)etA1i 5Y5-a0Y t sp F=;KHx<}L£Anj@k訕1 ke-d5OhP0zƄV;Z?1 2nUB&2Bfot*{٩o0].w P-k9Uwӹ`$Кݤ,j~]|WN~_-/ړ'P׍H,eA@VWPpv807%ƋFʉxMD0 3nP0Qisº-o' .œVC(+Au%^dbb'Z(x|0%(hGBPx=PAiu i&lPRUtR4q QJfdxMVvV. h aȭ EBN"P)9AnC>vRJ%W=,+LŞ4yYҳ(c-rJC6=C0 a P_e8REL:Js0͉UjR @z1O11C AZ?VL"8 1 񢣈vK#=FN@J F:6`-vf") 8Qt:Q`0 ]Pj2q<6A!JB?$c l-x>~ HP@ isNA#=JCӯ3VBIIXr- Osൊ~KfMA_$AHR i3$tYLsi8͒)<ѩgU S41" c dϱ"ǯ^h,di0!FP( 6nd`NJ#{H-lf( FVSY b&!#ym?!w/aݳς&H}%1J/fy {)IB.l⒵\QP nw`JF 04)(ب7 nW4S-DJf8R6$A >ODP9Q xc @=n۝1` ]3ۓ0D ~C<5t,8Z&//l`$L<#5^ u6zRqȟbVW$!7e/"|. 
,/Λgl?BJbYu&Ax]ܹK S%s0.2ĽЇV~O(񐦧tP}d:Q}SW 3ƪS- VjpڲJ!l*oy-.Ӡέ/0kie W)Gc'ٻ\WH{(By*t<4 #H@c}E9`^lA_u4=&hA ""0F2IR.{MtUUb@ PlcMlu&F܍Ӹʊ11Ұ o.`=y_%b`O 4V(1LD0>`d0.V-[ r ;2RZwŻ$vFt`;vb3 IxRȕi|P*iYcvཊh,.4"tr@XP5BB +)K PAZ-eXe*#ixgs V'8tF N⍏tb#RMr|p;JO%a;QZ5ʖM@v!xR&*HCQ⨥)`R )ZbDt԰x^S%{U3LsN ZT4 IjZ:ch\vʮ=&qSAv0,7><hXU~^<$ލ0ysF0jN/_RrWn0RU9! y&eQ4! * aW-5jΤ+SJP) ! I89 FNkߡsS8ZPUɮ0<2YUM<n8qN卽z-R!{78l0ޏ^JZ3޿;0ضF3I?r{X vQQ*f)?[Q5A +c䖫OW(FMRI'm@ sP7EAd^i[-rz;j􃐛&EAOdX휡yM0![\c Yfa\)GD j`2 Iؒ,CPFt="#9zZf6ڿ4"cMf衈v7LV+wr<1th5؈l^8<J=кOBqj )*? 6uQSZ9R?' Mr@HԳ{s\$4'9`& [ģ2mWջQ"^(A7m0Zf+Qxz[rv VJZ$5莫ͼ7_)_Bmd" $5dRBsqQ樀eKe]`K c,E*9D41A4Ť(Ra1k(B>1PFVҩ>yȍ*FkCCCe* YL10VʍE@XMH!fb8&:cM@OMCKUx|*@ၣϫXwa{dj.(R;^Y^ O]7CA\NM/I Nպi{u Љfi8^]9֔A0Vf"{%Ȳ 8`(y4P< 0|UoϘۂ&:LӃ~Wrv}ކD4gqp7 1>#?clvvg1Ͱ޽`bzN H[@lA1!g܍_G >$3ׅ򀑎jq9nyac,d08=}Xv@WgS17X:,pSF+5kW6`d~L+9sA*bBQQ{1n]Ĝ0+^z^cYԷ 6vr5reTkY#b̥10V\ D#yk,b!NjxaZ0MWtWXD{yhΟE4'_eeCcYB1̤.8;9&tl3K K c|YRcntAcO[Xa)b>7sH%;跁1j\Ap9(^BJ.B 2 f'1f6PI$(% T-<COF<ԧ| j¨Y'A4Y.%L=b#<ŸS 9:,5t}H-5>b"nw'h4|63{M**<.ZjW7L̙:T1bCHrK0Qȗ.06  ZRR$ IPy)"f\p`I`^N#V()G0ĸ7:Ns s?N͂)#'%$9Ťp`Z'{Ż(%K?ƭuso+Qv(!a g5-@ })(DKQjN0УzyM!0LR-ޱM!z2R3=ԯ]>01$KepFQ{r޵Hon#W ZxAD@V; L֑O%IZ 0&<F sy;W3[O¥3Ĭvwd3v]D̒$abg@nI {?e=B ՛]\$48)ޱaC_@nerIwȐ](״$c݆*#ùrN 2e7 ].+^B%ƅM !NI4S^F@n ë#`|6{ќuT)' ߴIADı ]jfrI+.~`KgBdc* րI Bed2CnS*H  %iA$%OVx"8'c'1A0r޹U׫*3NC൜ 2ށu SH|Yb" F$RPF{BMq4aj\t/ MJeX<1m=˄KeaɂWK,N(lf^&y"wLv F)|c;븪%X]e&8f7H]J`/,mم6eB !H!q؉wE%KVシ^jbvOmsw9#nnD-Mjd`kV0]1S`JvJ3&x NskkO!!Z\+,1oc0>N-})v5;a ~8q#9)]I;5K/Ңo%cKA{ aP`=Yƙdqϳݢ;Q\kh2,Tm=~odڢV0 mWRKe[I-!u=& /&Cn4٦=+G]i K :Sƥ ŠVdMVh jܠB&ayH#0i|젣0O6G=kylR6[fls5+MVMX gK>ueWHŽq!9AxmKx]Cf#icY|42D'Dc6"*U 3j9QnHyyxD5sz\L\)%8\sY:rzi?ě<rD]5.SS %Y/j'9hk `{P"NС  Nh y8U3RVY0q2 . x~jnxz-öN;K#_k, KW k#uctѣpj4M^.xjbi[ߑ8Nܡx*Gx]FmLeX&hgrԵ1umĘ˓Hy6.X^=zz#)'$DuJl&,$R/ˇeF}^8= z5Q9q아Jt0\ŜJ}՛3$%2\ 41L^ypj'Tꥁ&w& *B׼fb80Y9BJ3ARZp{<@au5-c@i\V2@H_}څN0,SU}$iKPG43ϊ:*xH>rc赓' /(֓?ỈuH^82ja%i *IVU<=NZǷ a6aJOE@~90qagu7%!t`&~C|EY ®m ̈Yr&r105M ~JT1^?_%{$`7>I閃f;5rD'(|N'< !1͖c-u TXLf3fđ6Y"aB3q [t${SsgT\asO?B7]Bh!O@ 1#MP%=Ț Λ_E yIdek}Q`w_|{2(2x{#e~Џ$\̉ c끛 +Zv'.hVapj4I3 ͆g/8PL&S(㥈iv qTpɰ|Ut٬2plWCSXS}c[gD1'Qw5n\#1d;L5O풷. Znmq$k:%@"vG.+k<dãǒm(`|H?NTBii$~+j7dGuFNYkVp}VLP ː={/x=Q|\uⵉo`#h,YQaTQM o/ķ^(D[мAeJ]DPi<^M 1I2jI. 
[binary data omitted: tail of a PNG image whose tar entry header precedes this section]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/gears.png0000644000175100001770000004603214721316200016603 0ustar00runnerdocker[binary PNG image data omitted]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/glossaryicon.png0000644000175100001770000006035114721316200020216 0ustar00runnerdocker[binary PNG image data omitted, truncated]
/( fkeA 80 nvmd_As( CgD17g/ҩIW,#5 >|z36%7BrΘG>M ]ak@]S0qq8ll|AS7t g@C vmDOGm3\oEqN^M og&ʁ.@HCz)O~xV8ea齀ںX.@pa 4>6~z9xi\E]N#ؽtx/<]vm oc-[8me6$(LHȋ-K)kBLi!$ b!ZfXP ;t?/~i<=yysyu.z M&K@Iꠜw}FC/{3Y`?|Ʀn`kKC #eZ6 rwI>Pf״~xlmI|$|ױ[xye5uH3/ﳵ%\ HCW /KNvug?k9Icxbw6,fkKbC &gAa>.V)C,׶Ǟ6$6h2YJR߲Ut`gk_8֖ą@M&;٣# Zkkq_<="I2$~d(IdkRXψ ˥S_u;a_4W c7[[z4?e] ;2jFfKuE7^xa͠ ڒH|!8D^Vɿ!Ǥ1Bfuq5ȋlmIL$d>gkP ]6i1unCNS|֖Ą@{Xr E/~P*^sͰ}3!5o# /bkKA &ՠ$<[[";n)tLϺJO׼/ -ڒxH,4tYx ^xدlT惼TAo5˛sڒXH|EP¾ J_ \^-='L%=lm`߼_J|O;b{K?N#5fkKbA &o@ "6g edO*Pu6%qEzPz 虇F:Tl?&2][ zoabiZb[[z_4U$_3@ _gelTn*ȏ^ؔD7_zo35ĥ_5 ;k4Wdk?clM=/gkKb@ Ɨ~FpTyBuOd[Of(?'lmI $ф I٧^ r=@: dEΞI.ekKC 4bIq6W`BYktkC\Nհz-=?LK3.tI9W \{KnvPVa]Pǀ=3Fۤh2yI&>eͷWc|o +s~5Fۤh2>Ƨ,gQ1`6Iv| ͳfI9$1fa@s @e<7lMA &̀$elbk+QI}s1d@M& GK'il}< wΏ2FۤKMQtEc]E b5GwD dtbq'^En F"噈BIl;圽wk;:7I y_*ml5F߭El'z,v%u-Pt3>O;ĉ^ˤ^R?`;_ U)Z$>$=X& -@ot*3軵MC/@҃e2^RF:C f?J8$MX&~fg m0nI\HL-Nv}@P3|7lz&g./0B_tst`MA/@eBOK>GF߭ lz.g.;lzf=ȥg=v6Iy7d?b>;Pfiwk ;ā^ dђ:V z1K(4 =O$$mx;%`^d)5O ,;ā^ eV}۽>߫'\z>,Fl%)LBa1`d 5͊t6Wvn!>L ;H ls[q1;ā^eºVg``s _ ^HH^\,,ƳهrI-8 ca$^ā^eVg튙7sH߿-ӿ2]rwֈ z,!~^Z^Lo%L-'BBߓGI<H~d-^:20]p5 laKA/@%3hygg+ x|a2^s> {-^@v$$Oa&hv ='b}p|msz0K\񲧵p+ z'q yr,v6}0w3y4o#\Pv%.$OX6fϵȤFx0Oq6MiFFҞMUpHAQ1D:B.܌ԕe "(mԕJE¯rcJL." ApQ}9}{.=vk1K1~)`?cVE@e@@cnz^/;7 ή MhKPX룚K͛ӐuNFFE@F4]3n414eKJ͛O|:vup) jzKعIٰF{tn>+X ٵ Csz.f&tpM: lŒ2Y:l5o X ug4ю ?k-u mf&uy[=IGMCIz4~ ϳszZ 6OYvnhhA ѼyONkl z4UA ;7 [3>L'JXlҹ Bv=hr,_9w kGguK{k- mA{?MxmEVqo4{a>"q7~M{M̆pM:7_5ĮH=h5'ٹɘ׀} :G^x`=h5=d~.|I2_h^zKhf-ѼLtr fk-r/H,w5WٹGlҹh^h^$z&KE4ZO\tK.<ʮȃ vnA͟OѼ>3XʮHN]lh;عu6 ؤ# LyR]E>|t ;u[[y?E:D mCWsѼ[`M:&LcZd>ٹuѼN7|4?>Ů=Kna&;.Bz~-`0]k&z"nEh[ٹuj5^oѼ>yp[,wK2Yk{?Cٟo&z"[oDZ}R!v[K$Z{p{&?"m 2K$nz85Y@EFx֋wW-]D A ŀquZdu3Z8VZN] (R z"X Ϧ9n.`.x.Y쳓zi0vRfnּ7=Fރbb}4y ޼1}d緬^K|YV=R&z"RZvmw[ ==V{ng-eSl͈ǥY&,7"dڈإ({%6LQiE"nJ M!鹊~9sw~?sH?,jճmӉ_uTוc" 0mvLؚAMK6q`?_~vd h-5{|Ee5vRM;axdk*#y ~?%ukYm GY&"=D=Mq`:{.[\u }o-|E!2X Õ6.R6z"@z4 '玴^:'vxŢf{K`OgGң 2$}-J߲dܾÿExŝzgރ{f'[%7RȠ,x0Ğ`V40Ƿ#>/3ĿnU:&D3 ^ ):8,fKn0' w[g$^E?k25DR@1)z}foWk'!Op^3dK0I@k*o@~s9Zo18.YOFX lC>z")!nqɄ.\d:˺:\G74:7U63_ g?]E@$5~0τIK W"~7l$[T8"G^ U!?Dk%Dd2(ObG9l_})7Θ6QKv^ӿHNh =!`.M}כցҼ-Hq0~"V[< Iev;H+]Yz"mB ,jnKmEPJ?k5|y٘[?|00ʬiV[Hf+̱.6gɖįg5My({-KC@D5mK>g9;,7ْx^zYMc-E#&[A8vcgnH,kN#qsg8JgJͼ7N0i'wa+Zχ 6ْaConI@Dꄄ{\9(}[3dKžme{-@@Dd.u^W),d +MM$!8 M "uh:)/aTYp «~wR؉)X^%qjn\x`-270 "AB\@efp<[2dKel|rH}W5dW2dKOTɊ?abDH}64T*3-ze]g->8{g0f_D.jR8=.aq:Oo`-[~;oi?H"R$SHI|OdKnհu]UUpT4Ze$I eEEA%RaHT h$5@!C9>HFEFRQ1 %}An޺=u>~k=瞳w!fA$o΋iøAI';^ ΧsR jX+! iG) =HZH`t^=f/L?u1].pX|'mzoJ΋nʎk~0s :/2#•4|^ڬ `?^@шJDfW;kst^:gm*`}Lk4`U{1I mT͈n†=`CVFE}?bXoog&? XBIWDnHF#<60;:/ O3%;?0I߳k}iT@QJ7 n}`C]EE|U>+`FzOzb@N~AdATľơ&CyQ0?-~z(` P/'7)vKM @ܣ8;:/̗E_ X'e{ (Qln@/9l\m܁s$˱O$k۹5ꥸW8Qk:+.uXt^Oa%lsKg$D~^n{5X5 /Z@DMgi[Iw[`Qt^ ?Ч[(Ge?л0}":/j/IOϒt[W`*˫Sc%jэ˞pC#yM|< .'ܻmWN!+ Q ;}st^DHzxNy$X%ͮ!7>D0_O51WwN6E?81輈ra΀e+$HjDEdH6If$񸏈S l~!Uyy>OΪqM-E7)qyyܢ 輈a_vW#'pn:h:Wunt^D0ϟ5^7?q Qm iq܈aevFs ZXbz=:I|Gl5o cH:n\y?aܕKNuТ z&v_t^AMI`[]V טqX; rb ZwhbEUa ϋaX ӢkD̍%'xn:h1y(:N$=>Z9l aqڑϿ[U!uՅvЋ)(j  ¢bH:L L:d7AB:\d%#SQHEyTtZ|׷^^Q|5jD! 
6UF8i;A_8iw)VDktzزF(P}{c=2wa:)X)|k0E 9_Yר@(PsxF?FJ/rj~8>\O>_?H0\'9QV:5H@u6-kb~^ԫC =Zψua$NM5Rp5x6Ihy)rBh}FY/q[ש0_rs9X?jkfyyyY\,eNd`}ƽY߉&ã>#_s@O< c=wܣN0{$h277ԜJve8KP%wõc2<y"Zқ^ej!lFIȭFf^P=`õRw0#{~ K~dأV87P܈d(z-Z.S{>X5E[/Ej_yt6ףMU~Bip# "" 44KEv)(A~n!j 6I4[ "A4io#\ξ5Xs콶 亜^,,_`Sipoa6s z %@fצiIsaJZ}Mzdżʍ|%nrMĞkU`3߯dצKq.ZhtK|%nrM.MݡA _^0mBuM# @'F6V3Ȝ%nrMŮ:ˮKM &Hc\fhN&Ȝ%nrM61>b z/bץ?ЗsbΛ2 @t>/캁zQ949ܘVPϧ | ͮV) 2_(C,tk~cæ f @Y:Ǯ@#fu8Ew:o \7&wzX `kMIl;ߥ%nrMv{P;5"`c;"`2?c:o \78Įr/X6*fݛ:}ꠟ߇nApK*p0yku1^%TcM?3T*-d?&ĂW'fPL=꺇]!H)>coE:] asVn1Lf!-b/zZ5O55HlBP66q%r5A$c̋!TNrvjv aq˙WPbO$S^/K` 6W-|-qi%lV%P>aVc[l-\O>vB^oK/vk0o hLQI'f@=?aBb'wl$=.颛ײsRK &6$ׇʭ#ɣl>%,zaB<3Ѩ2"Q* \E"hjinm0cyzfr=W.K5AAs-sl ژ2/r& @J -z R aֱ Z(fƮ@>f %$nrMi6W޴k7b66 :>̮@Rbyzqoѣh7&0ЃCg'\ X3s:.z 9@P홃ʽNE-vk0\=O7)bϾ,TC4=ȮPU~N /GKsH r6WCl.],%nf@bk s!@gsy@ze6Wv0X$nrM`! oC{h~66V 0g7C=N< /B [UU$3(+SQd!].>TDhihTOI@}VؾtJR66x8_H9D(Ttm~grbS|_=Lz:@I0X~,"Lj!'{uAMG͎O'Zfg&=&پB}2',?r&pW zv%lo:!<a'Wte9,?r&PTOe{sd ]@aoۿb _I,?r&0loΜߛ #L;k `z@3A ؾݒtQ' ̪p#kܝؼ0Sȵ?n;˘,"Lc![:?i+۠({_Jca [ 0=Y Dș t)OAm` įtGȉlNl#=+4{1LOr&pzM׮u>O'Hyf(x3t7#+4Wȥ7~/ 9Dؑ>NB=?IEp#Ǥi{a<#qI9y JޓzҚRMT<}m s"/ 9DFi8]/p/#L={͋.GW%¯nk$-bvwf}Y@DșDYCJ#L'; @$P]B])l 9ksdvj"`F ,پkwwؾTr&ZKƮOͽPjxPlހ.a/L6'/.ЦĽKe gڗ"i 6^(5lޠZmzA:xؗ`yQY`DșD &Ϯ  R=|]:NTC7q3b^gHIs/7B@r{[f[ؾ`Txx׳} $@$@k閠>)lJ }\G@ /bҡJ 7zmꫤ`Vmlۡ];#=l_Ͳ3Aɞ4+ھ>,8v=VR|-hk5vyMA+8m]v=TGsH5>?WgjL "oT'}S3lm*"v6_rmOzTn[Xuu}}'i?:uaa͋~VMUU'Z|iH]50)S~0B RDʢ"èȈAf΂JJ"e\TW*TР@KAZgwk3yu={YxH Nv9 0\*4j/H9` `mgefIjM21bvv^XxH +\ DGpSJ.׳zB4DxN#/btv^XxH K\C2-?2-\vFz/GaM)UF|kZ?+OHB,NkBM gtRBX XFvJ1P<%ٹa u(ϲs5̾T0 X$e뱕D$>S2n8z`> /DxE@'Ul%T'1P aȰ_cE TEUʫl?Math! aA5D-رH:AعZ:ȽL0;ûl5f*8m[c u(^7`'Aqӳx`uZk2X>q7s " @Qdj`3$Sճ/2~ 16$~KQ#elϙvAX$yH fvӳU][b=Wd;+f26:!cE QN6U},࿘"q RM+}nvaXDyH @U7] :<Q7+_HG!gWYߵ\ð:A\-:T5F̯vv:ׅ;sG.N] 2E (ogr;K`^GA#ٹ` u( ٹ Οg=;MEP8\| i^R'0G ^`5:|p_SɁ`wE 3xCvIk8YR:+Y?_(ۡ)vXyH ELaǕeto];w:}-q^R'p2Ddr^>yذ%hҷ{*#|,oW5q,fouXyH<ҳ6vsB LCA#w_MVUhkAdD4A\*$&AF)$BP яADQ ~HFEE .缃 w~쳖E8&4qwhyg-d1u[?BR9w\,5[H*߼ d. *\Zģhk1{hTr@Ο֒ ߵqYG$)MiyZe'^a By;#,͂E@&IE9X yES9mDoiq#{eA︬XhkY;o r[swT-&,֝ـ4wxe" @[y|Ό|r8} ֠ۡd i︬XhkrS p-!lYZew>'ۅiaw^Ako4mMt&,sH;_k!g# lEۯ+1.KPEDɺ<\B[*uNt&lVx猌;>XD^ŵ$`$:i. 1a9$iil pw>j h>r/-)w~eg_pp H$ɊfS{mai|;X8Fg @u?#׃\{{Ԝui 8M:şͣsNin/!if#7WgYC7-}g7"= s4yAO7e'e4i` X ? /  4y4t'|=(9_ "܉[]$? jw\1I@Hp4NBuDx\X%]/$=H3vb`lk?_g`A0~ci*y4t @sK]#s5IB]O1#q"( !0ztT#Klm/ 녋kuw\J@H(̸.mU}:h|+B*^1 L A(2^PQER^(`5CP /*>/~[{ZKsηWLa pؖUr'5Gg>'~kDgos- Dڸ{/ixᚣ-I}Mu?~S\aJ{=\Ɋᚣa,6Zϥ! 
D:s v>Veu_ߠi\b@OAj9'5GSKpok4YUyfVW-1f ں5(@ 8Xs/0Ng U1 DPuЃ!{  v0i=o={b @ U &G߃[9zV9K6ekEczySha)O_o˕p͑!9ROCß -f_Ľ n$p!5WHP(:}oXϓg zq&-~+ j!f³QS3^b5W7a5zO 1C@ g fMuXGb(@(~vZ=s!XW5P!e *nr8,;p냱Vt[,1C@ sPwVjl#ӺeDa_};⯭>8$%f `NZ`.쀫 eb(R K<k.{Wb `Ԉ1˞A*S0<*0P0S0 ӂg6Wn)S(TaFl } ̼{QϋqY1SYQe!ILTD)Y(IY)[iB0lԔ,~%R|fsK93ssΧ;wYx(ޯ):I@p<9%Oېiyk[o&X~CVv iCWwMb6@h}> P߉O`ԝh"C_ (Jԏx4z11x ?>v P/hyG DZ˽C@ZmJ@I[^F{[lUyl1Цt3稏y=$fDȔIENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/hgps_map_background_estimation.png0000644000175100001770000106004014721316200023727 0ustar00runnerdockerPNG  IHDR csRGB pHYs.#.#x?viTXtXML:com.adobe.xmp matplotlib version 2.1.0, http://matplotlib.org/ 1 AX@IDATx YUuć( d 20M:b{A1jlOQD餑#mQUTݾy?gsυ{Y[kٯk.D6mSJ͛SoX`ذ6,p-߻!ǎY؀ lX`ذN>ҋSolЇڵ^>ny#x^מtG;4uўv(a lX`ذ,p >|}|M;3Z6 ^׼_u{ғܶ:u77,a lX`Xw'__߯Fرcy2J\X oq][/Znʓ>曺)޸ɶIZc2z9ȓ| cZiS~778ǸB;NyǓ4&yq78ZcU=cct&iL7y+j[ D11q$1nFk'7CfZ6dw.ϝ[ 7z^p+%7W^yeo}}m׮]+b` ۳=a;^>|_ȇb8Eyx6uȑNk5}n( "@QbqݸZGi\Hy '[7$θ}}SF~lСT$$o2E:csi5ߘNhMkYU؊5+z ޸ \b|85t75w|b6 IJ[ʡߘ^7t'rO<4c:iKxcr~-RFcq2-5h>vY:XܽkG۲u9{}[o?m~g̵;%Y's8Y{6;Y&LwE]^׷ /6y hG>r[nW,x;ڟ韶+bRQnvw,*~p$7@&4UW]Ս{5MN -7xc9 1, _Bo z0MvFOB'ꫯ.eT +L%+@˻\Y%C&yDВM!{n@;~msm{=$=}a4ƃM>϶s9}si _ ?]!cxg,cI:6к馛:?&3 ~cjgwc'r;ߛIdD z0d?/8m̠%.eZσЋ\f+> uЗnmzcB#hڿQ4bGtelGe뮻]z'?@? ^ O}+cDb".v۾ۋ6=ؾm.G+*åWݻg{;gζcvgO} .n]|Aǯn\kNdE9U}C.i-.:gW;A ;eqcsyIt9996~yxw B 7= Oh??|WLbz}w|kܹ; Y_(W A?] (]Ek_O|T}M9;Mz/|I x<&GZLl7o~G.~ I&L\+K@NQz2޸!7>_c?cjR."C}p@vWgq=bmo{[{+_~~YoIG%6l-loš= Ɠ8<]zWxޢ9ݿw9yN{cmۢ:vO}_kk;b+c@= nOO+>D KZƠe/{Y%~ }9coP-~~OIy|LGs B\E/!)\?}"EST{^W4f1/RsB?':֋_v]Ѿ뻾*YzxXPw7WvW JxB]wˁw,"`έ wGvƎ:jv{K˅c`?g'6{ݝZޞg>}Ùv|Dpxc/V3J_vrfryӞ^򗷟~m?~aЎhuvYx*?} h8jS?&J]|EzY\$:',Hd]|%fr}xr&W֙ ձ=S-B *Dx['ö&GHf] ;9[kNv ̉y,j'cೕX.YInWeY,V~ŕ:O>C .UE?>~|A$}Y@bSigY[Ih?{֎;ۭUm5oj^ p]5غes۹ڱmZW}57kW^{[C{um_uֶkS ,ţ]`waC>7 /qu{7.dGZ"߁%^ܞ7ˇg(0^E.A`Ic l[-Dr?Gd1u{6,bI^_KM90 EdBSd@/tǓ|OE. B#0Od?qm/_a6)E@dH>-<9>۲+PV(=vpјq-͉B'6d/ˁ7&ADGu}֑FȐ\(Ÿd@wh94]}Ԫ|m9}׷+o?>\jݺQW GjJvsXϩ{mlTlڼwvTݖ8*C;X(;kxT:.ݵvK/=>.ϵg?'==usuftbm9PC'96[vrtbs%H>S`\F9`~c]¦>KwI}7s 3M.d iyd2y|w}W{K_ZQxȡYSf%q61;E,(p:e&h}m菎K.>m=<:e,gQ.& {Vx㜭9%:(uw_1rb<y\9E wYб90.bǴ ac nz3Ѿqּ~mCC:LnE"/\6Stx᧍ 8$kp%~&5 }>?wwNl(я^awNC=PWj3hj!+{v֕ڳv 9W \}C7\W"._ym˶UqG'^־)б|ǘao_U]bQ9q]sd e雽yr9P2AY(>AZz m??ѥr]Sj1<1Uo@2,k_@ۀ61ǩpe87M dD=2%98&_rR6Q&e Rx d6Х| ۦ3JMGg?)\9'+A +OfS!!<:xYT?hoȥN]}h -Sbf%%yCfD/y؟Wև?Arcc|L%>Es6$e@G-IWit[9}ߡ}>{]ewg6}-B]gbK*~uV|bᅗ?Vs62Ƹ߼2|/|dğrH;9"q4O z1n+& >|%%GXd3f=#EϦޏ|'oi?w{{E, YkzM ,:[ܵVP]&8TgUEtW=Ys덆,8iU}JD ;}ˎ/<l矵]W?nκqej?7:Ti#GNdOko}rD-ַ|d+fBFw-@?d$;&} We~u<Al@G 2y뫍-Np$t%eEe>c7tc Ǹ?sO!'7N|/~A6#xy9nLF%GL>9VGG\[AS pؔh!9t6UZe1ы#m#uGZpm VްM7>iV{F`=װ*"jo6gرUF`jßUSns?xsO}oۿsww~u•]赟lW]wGx+DMَ?d $Gҩ:Uko }۷}[7  5v`2!<rΧ7SiBץGR +L '&}MzNUof~^3 n|Rk"6afUGqǗiJd&&,cxPϡ+GʙtV:>Q?f~ޱYh&'eҿW#""v??Ӗ1jtb&Eoy7: Ƹ/Wt:i2Ǯ_4dW:o oXWOtGc\1{{~MWonV}pS=m;si^;m.v[K`1ais{{s=%N$l\WmÉKtHm}3\5bg~gL쪹vEg|Զ|k|ҷv Y9{!s<.٥zERGy(.JsՁ>t^)ǴT' em@x|,/ -ؙ8/1:7?.qqjH_|L+.٠lMnY4Yᣝr%CKl.] H|:.:$/VI+j4 Zk}g٨! 'Rfu"OWI|ߒ++$=OkCȉ>?whgyM{v)og cyi3Moy Bp[>I?>;PÇjwl.n+5cvna߳6q _l-Pzsv~Mv9K}l1;"XkZ+Qjb epɱ^2id11IWl\ {h߂+;zM\of&Tܤ3\Ch$0Tɢ^:|rɵWzY4ВkvU,(p88x' ><4'<@G?t~"jCSؓl EqȖruzr=:PVO6Mԯ†h긐.~iS_a)o9ZMHzdqS]~6 [ɂ}8ɧ? 
fg%#)1V}R}>u%O^[z}Jv=Geom?i1?Hhlp^Ebi]-w ".50VA>q9ro&%-.}"YSNZSȅ;NA N?e*mlF>1q$]C'8h&]ɂ~@ m.np$N'n誇 RF[^|,PcMB'탣/0yƮY̧ y9[XtlsVGR'W^Օu[aW=XXo&zpyE{:ˁhҁ#e^õ!ÖnNm|Y_È269ׯR ,grM9dztEӷ^>|knmhcA1-=2v$@{C8 22ɍ֥μ h t6]6{:֎':̀@LZ!xX_INL얅A衞 I_>XhY$gle8髿2D)OҠdwP-4h4OO&!ɛ/ \DM~,fc^Zc>hDɪO z$'D~Ѐv>ZA_'ȽV~cROϷGc/_APlٛ,nG.oa:1JmcNoXdNQ(*ܳwk-z^74~s߰ 4d yr=f@fXmR4Eg -g%Vp$8J}:Kq8 WN+s8 זz M[ڳ}b8n7ܯIxE~Eh\С4tuƪܛN#>@k71A9 "K@I(%xd2p#|9U𲰭eO"GꃗIr9fX 9SeWN 3k7N[|1г8JccE8lVx(ÑR?ɺzrHqؑ@_zP^]4go|Ȁ&^{dt} |ࡣ/Bq ;A7@R/g}KHG˱KѝVgmt YmfK~΢s=4[7o]rA)YwXnyW[3:g"1j^%d?8FdcckVe?Hȷ~U?HuڙՕz'ڃ/?}ӣ.x|39V ʫ!ưs aP'X tD>xhރAm=*߄ Lk]#d>x *x(`O ձ5<] ߱amY\R}bz[ S+e3#Gl:44>#ȃMߔ#1 WB/r1~@C_uiT_BMzGslu3JB a_ӓ0}dk%u%)k-|8&2gA|OtgI}dծg.#%wկk>^ϩW -=xQv^[k)Vffc!Tb2,5 [m= B\?l[WIw>|dx@~k|l|ỿ_jwԛȲڃwA=xؓqeI{%$NjYW4@z<rs5F 1(@ҢdǝAّIݩHȞO&8ɇy _V5ɾӎ#)KxzCL4:GVyʡD ЗdN@=KhM.nȃ6@o5;sL[=deWZ1Ix?=}؍z}q|i$Ga>ʂ5n!2ws?~״-q/²T'->,8mcG,R+Gchc8v6Tw,!KlsbSoipëb,df_o7\xeh^yU_WX*|G(Зa/2BI %q_;#Ns $x],1Xp <{X4 OcM ZȀ=O+#IЍxg"ebCkbs<[L[1d80M%pLwN9вH?gx_jG>gb=yRW`@;~J7F?r$V"'\'4}  ͉>9'6`$2hz~xϢÛ\x&We?65qH^Obñ4 /4bVIhzeǶMI=$َW7rbsn5\TK~>RyV# B-Uǡs79kx Ynt+_Ay7 Ⱥ}WJWȰcsN1mT.?=nzUzA4,i̇ jW?2Gf3Ҧ}y#8&!;1ǯЛ|7+Nl&W7y¡nėޓq,?gAȦ-miZkvu{k]ACnAO}p;~,uxcWxM_8t_XηćO.׼Y&*`JW6jcj°9(?*n~r yy}mw=ϟGyvH?10 v5I96\?=wrn/ &c&ƀ1gBlr%29H`R!ˤc tࡣj r:ɸz.d#Vl4mĮpc?e<'N5nҞpМfu1їG3)$WZpGLj+J'#Zr2_tBSFf|Lx7d|%}B^cgL'q:i]fTqҧu{M~sǾ}ys=F6 [gɮgVM`׶N>|SW( ~B+JXph^/:|wj17zw~->ϠLLh@x uЁ7Od7/}K$f`960`Μ2dn,ހc11dO!P!6׍ BjbBE3vK]hN~r ,}hHSMWr"|zDMxt.k'W>}T؍v4fri qM1n5Xoɗ/dxFrmcI\^'9\e;0O?wrG͂f᠓9"9cez}7].=Βk꿴X|w4ĠBq\VrlA'+y@{/ʞ ˦x{sI; ^]~ގv+;O]CqYe2qdѸvuz?v%"qR+IMw ?f`.d m`“-J` $g“ E$/}3xBEO'u#;Ɠol?(\U 9Rߏ<6RGڏNZDG?>gmW_`]ݞU[lXo񇾸K^=W-yTw B2j6ʳ@vm̀bl YxE9a gXRd''|ɓRN@Q'B5IdbxSV @?&@=<:]"~hEnkZC+:S^[XJśϰDȪ3x=8$F `? CYl=>hwK 8#z&J z2^Y2г@R.gN 'YѰQJP%+y&wo5<ׂ_V&,2fDv 6V't?4  _rLKpJko}벴 mj{Ε"/)s5ݵw~pkz ycP>tVh-$&[ĊpghY+)W'֔qL _4z{r 9{Yo|]={ʆt+xGg3\|agFÂ:h#kA?^?ާՒXÕ8$4*{È6d+7sKs|jdIHxsv v}~_S?&un}kok!Pt{k>p2 w0 dㅱ;2]2j=7^&I~dC _ Rzk1)(g0${\"x"Ȅ#|.c٦FWerU I@HϙN2{5gA`xbNZ Pw[{c[..kǯ+ihv]ugGر7v3CRb?VD3Gƃ&o8LV@'>PN,B,YOnC?sW~䈾^'8Ƃ+Ur cpN5lЊ0t8/p=P6oxmv(K17Me.52?\ ۄo-  /W:ү6[ɿ[_37%_E`VgQZypfmꪶUjM |7Ak_f >%ga6b/@|Sb(}$ỏQt^ɵ'g!H+h!@Y7Of>9_T_|{|껵[^g<.R'~ Mi6  P=cu$*٥G A=rп }1я-эNMExP#[8s#@d6 e<$ڹN".|][]Mhmm?ZA"1"ybĦlΝvSa(K 1\XW&ntF{ g=W׶=q5OojW86q=دSQ}f4ϭ&Ol1cBnL7?1+.IciJ{dI.߮pJ ?~}1=Xd_OB6c6xKmJ7(+{\PWUAuSTSnStզEЯQ_h<͙+6:Kb>lC_сӿ@66lp{/*u"B'f9Hw&nN ie`evr 0 % 8d4!d3ɢ/|傌l= J,|}=U GqRpFSp~:rG?_EkK4<,YY|/w6N>_JHu%#}dM;Kbf@ 4m-tfE?mڅޫ)Nrh{^.`oWh f'8l9R0O.,n ԢWo͵I#]+[tRA x*[Aq0Qǁegzk5 K:\{FW$7sۖmkُ8On6(Y 1fĝrxi?{n3A c Ӭ N7 qz i* 0gZ{[joꙁ/ekO~p Defl75v1ȘH6M7"7!g~Z{["O$lj-Hb, lm֯MJlBYl9Ǯj&FᠥL+'ə1: o-ud.dDǸcSԇn'|}ϭW15f՟pFoRءVׯ9M@r-o%N4\%%k֦&>g iQ^LoK.'}6mIRB'Uc8OK@$sn~Y.W{Q)H'=p3N^q^ eщn&Л\e`ɛ~& 4 Ǹ7'ur@ٔIŖcv78tG}Ϝ`Yg?.f'*YdKI#na϶&%&GТ\G]lH1J&^R]wI|yzhOk5"12v8:OrovGڅwՃ#wYx6:g(1XP twԾpW3wL'6^Ίx qm"lOo :Neh pE:i]pjk,u'2+ô13"'ۙ?sWWp;4I2|6K@IDATIo >>b$ F''4>> Ͷ &'# -%|}"rȊ)39 HzNP}[_8Sk_ޠ&4ROe#Hh3cͅa$OYƶ~etQyyug讦1ŏL>x [Nۈ m}Wr~;ƉuLW~zи3K/HZ|_zl>4F.'9k9kЛZUSr$ $ з_sAB'ܮ @{Iq-qkGwuO%m6@uQn|^?șh^ɶSIjw8-̛H2~YNzxs ϕnVSYDL|@&"H+ CՍ/Ia" d?s: 5tĖڲt5zGϱ%0|&@skOzp% 3tc+):*XOI7tg{Ibf 2;,#WaFۊ޶ v cآ? ۀbb=6dtSS჎_gx 44yu#mu]vF=[ezw)V-YrC=7? 
6u5`qV0ޯ)X/'/9; T$ ٨,mV*ĝ1 bcNo7~*wwIVtŤ0l(vY[I>߸m}FҘ],z%k`TeZߩp A&<yt |o+,S'2:N+ 3EO=9R6Q$_@;:_m,*ڱŝ\qEk?kn'T irz@Ǘ=өE3>?$eR_Ŷo*6Ղ]:jb Mez"_J=הT1) Dk~2Vd8qE q&ݱ`շ3skA3v Um`Lo̲M?,Amx[m]52^-@:+nY/f߆回3MuUnoj;$Yo3jAg/3Cga8USxrP.gu^xkoT+K`*j7DOB*S.VŤD4ܾZUrŕL|-&}Tۺw;ڕ71A& F)P8m1",:ocFh&6  qg`Q}MhL1ə@5x$`҅'_?&Yddo#z]N|O0 4Rw:r|NcPn>7-~ugMJ^&kt@|Nhq<&ƤٝOG!'۩ȃnyЊ\e|eYL6eڀZH_U Ϯ8n7lxZrt9ƲCŇReKgKh~ۡ]Z<]82~K(ˑGB}+P<:K:! U=l(NнqA^Vh߻~[j3vP hNЖc61|/nƸq;Mren# ~郧P@cH2ME.]W+$S:.Ya\^ \Vta+)19MZB5z}іv:dgEzlMhm~뷶瞟ٶ^mynGjӗ~undAgJ5 dq[`·<$}2Eް'7N^X/6^aMD2ݾP{}8~eKϯۅNÚ77lo>!#O-.SiK1Gbźڣk Iq9-y/]tk~Nd-r#县ԿLLW>] eK>TG KdŚwՃkn:?t6V1mS69R1N3k.c'g =f@5cxA$2赏[&'W$D ~ `6JʱWT NO6,:.L_,'r8Iz)˵. "qKnb5}?ر9׶3cK[sg-X]Y}׎v ^>%Obcl/y1kO US6Ӈ`sm-[A)ցvb쌎[ ݟU?}>U5ٗN8utce~om7?έ/O^~~@i:R}֏] :[;?jmU]tOEg %W#31$`@Ado0ͣr|O8MHf T` "ePWt+2J్|ҙi] SM/BĆ2HrWKk_97y-,puA$k~۾_ۮo~z7韶ͿmW,k+1Clb0lQJ]rlj;kQ.VuFsfZ Zt aKfmuE6*gW,tm6Ⱥ?gC1lg0`SgbIABm}P2Wl%4 : Y:߉QZ?d BzcvSnŷ<^siqsanrNWc<;sI6~ŦW!zn-E4cC#@$9p|*_ OsZ}G^}Љ v=t4C?v돖>cmrНuԾwſFik'R0,ڃ".MmBZtyp/~J:b],?DY"( WeRo#re}&CV\U?/o{v 6~*r6N"8rƒce9HLpkx녩Ssw&SF m ZYi=}R.} |ܓ2,$d7٘2Q)G_[p,X<|8I豷{%'yo_t$wΰ6x=/YK'}%/~~@-V'Ɩ;cLf%,Lh?>+<)xtS־;^Co|^H ؆䱅|-_8w|"%6ShMz𓟻}^6gmk謒{)~Ro R[c3Ն[JXX0KW{U+] Lq(JVNN`dKs2yjG+ h l҇ ɧlD,n^ Jox^^H/)#voG~dB;3lS6^ [ l؋QYJtWjgGl'[9Yή_R^@_rr6< %)&fpǘH̍'ݏnC}W~~7sZ_O{+g~ب"> ^}|b6Â:Z{$xvoƔqv'ohLs ߭ضQ匙X@> z9ߋG c!aw9pcrvmdւ:8Mp n'[n7EdaA?6 ;: yvJ[Sux}t /xɕ>V#ןߴ_'|[{ыN LH{l36 J+7쌗 rtxce$sĿt=)Sqġ|K7z{ѽ;;'O>v>y<]p[8g&zZ] boXq(Rq]Sg ĉ@Ξ+J N7mx/fo)<mؗ_KFcePC|gkp@l0o~++˗}}A^}/{M@&)>g~]w]WocI9WglZ=f! AY1dw`*Ԝ,xΈ7dL灾,7`hxj'\?|@yCK;ܕ@O};Ħ-ǺiAO&J/x+A mjZ{ӆrv$iX* gyaqq=|Oa 1|bO&z[yH~̃G9]FcY:>_ SD6߭6F3a M_z~09onCyO^U'j;`p]#.%/TPџ?(ojk'{z{ę%z*AǍV*VA؛ٜIGwKdm6hvx*z^~}T v[R86IC%wi=; a"2yd\֤ݙrXyXWJ ʘ ?8G^wE!1cgrl'P \4K+Xhe`b5l =`Cr~Fz;I1׬6?rn`C_=Cٽ'mh3puƒ !$Vh/V'x%~흖1ǹ"ܣ6 &|6Wsߕ#DuhJ mtKIgD?g<(O]'iG&j16M=Ɏ@` /Z-tј KkǭZ{srQX/؂.*WySB1ͮLFL&i3/3}WlZ78&Mȯ,B6xJ]`nmzXgٳί&:gHQuKDmMէ?9]XɂGq;ָ;2tӽTY*y !$! }@ګ[ןlrU)A#6}{joM 5ow=g'?4O? AWM[d NL"{g B )pG{+}Vip9 !&89Oi9y9֖G8pdRLdUgGO#CL[Բ@%xe/hJ <6J>ҳ9?iЪ_ G? qR &c4pc8!4L:O2_uRV{#Gbgb@_z6WUʉ0Ofk)Nbmܘ!IՓv91MzsSOb%q}ݿobS-uZ̧]0>KI~;|?Wn/ Njlέ?~:aqե'K47FlAN*s:># X@KOt'c|.ԓ^&[KN\̙S{ds|rvN66i~+9Gz }ĆҾa [Qbo<&Sx!W솖 W$][&1PA\K^6 NLm6{ϊ#%UTȜKΦc}*evboge@r7ˮCpv s͛|X B[Z^8^j#߉/QMJO迩lAcF!lsrG G}B?μڔ _^!>l.m+.ߠk& { d\a g e &&zL)&'.B\ 8M^q@&%5yt3 .e%~'g(ԃLNt2O9#eÛ";١:Iiy1O]U3yهr ![[E0; e/zw-4/#ȡN,3k O9[2]Ha1Wqxvmvsڦŀ|pc[)'x_pl_c߱6|dz}/[ ?;O}X]|ZNA=)RpU1qc%4L Pdp6<'ƨI$Uds鸌6U׼m}Ukv\R8K[=79{MƂzqE̱??L,ڵ\\ֹp?^yǃqP;)Ǘ d^1Oܭ$n3vgbIA^BN 7$l"]oŸr,l(Rlʖ|vY5o8~hO~y8vo>y.ngL+'d /#(3,C2" 4p >\)&MrQ.i'x҇fjG3R?}n_l#Zd7qؑ/H{D9}ѽ{v}$ ~u5bkC,0+Wuo)Z-WDFo.:bU;JBkvC+>3.^~x߆?{woUև17  d AE`E *EkR(H[+QQVQ@q$B2Ϲ箻9{oug^^$ރ^?Ճ[dz%g_]zjagݔxLݟ'~m ,n1Xs'YN'QL^D@2FQӤitF\r|ɖAu-zKN4ȕ7WZᵖ\գQFL0KLjh2ld\v?}֋ uсWg{('6R=ދY=svzdG3WNgrm~5%2 W'g=][[ѡ1K W~g$t!㣜}|Pٜ.[hzht{_n?X'c ,^ 9?8@l]l|OjpX_[O{1Ƈ2N<$ỊbbAl,}Qԃ؏%bkg܁mj%JZ]vU;>);O~NvN^m yMneĨĢGMHg(ro7VEFݜh%;Vpi`( qcOyw}Fd?Zڎ<.)5>|9:Vc}p}NUQu= qǛ߬ctnqpYg,cxȠ:Nc`-Adq0 Ip-(OG(26hy GST1~. ǾE풻߽QrFZF2Vl2H >$|Ș\?1Y:$shؖ^mlvECJzʱÚ&Gλv돪ɘ=&r|-&7NveC\}_}+T,^Oqǩ^i g ΚwyɁv_$Tnju)=>uu#E7mOzuIgI3O>63|sǟ]f?<+w˧k}i|/E]'qͰq@Rr`b`'4t,Ea}0eBcN6[]{(ϠXMh58[ߺ g%(ꪶZ{׻ڦO;t姳ټk>Ӭiɗx]YT0r,#WUݖj{QX8On׽PGyᄇC쩭:tC O;VvZ |/tl] !4 i,L} Ƀ"sCT~KfOVT_ch+Pdo.GhI~\hzA۫}jvUSwbiԥVfsQKNO(+wt?_e˰?89`L2dc k/OGGp& eqFlγrIq, /ڿ׭ RgVSS6ēn=! ?#;PΎ Nu[9Í|f0zM+~#9TV'ssT,#fqO@s\sN# R:9Ź̽#V>c 2cX}rkvLPk_bH@K‹}W Ь%˦2BY8&6M#5M$vS44ycMW_E{J-U(>W|T{jB<h;OzRr꩓$ vlxn*JWw[ %:l:XgQ޻ݭ]s?׎>ᄺ>UF~xB$.%clGV)er|`cW'V;q %;Eq? 
dϖ[ΦUҳPK oVkge$`1{%%Ip=ۏ$YŔkns[o\>n# %HOtCn\{%ߵO7}e ~bsms-,_i[۞TkF?N7$' K9=~BAC7:1IՖL/)Bbu9o _<6yX=K~9 ډR ׷=hk7n5]A[+-KEw^kQ [0x l;]yfE&/nz=%8C~dOIzzo3ݶTS7]S nK ;^>ig6WiD{h0<ov+b4l3VWzO'r?yp"'5A_`Y$.dc!yq;r&7'=ge]cv;&oQ,yO#D"*ggO1+b'ޭҖ|}A`.d{S^ hYowzQxKz𥭮(; ^tu}?ߩs+q:9<%z؇!&A9?&ֺU\ʼ?uAj.1k'F43~ >Ñgpq"GҬ3;?/ALM5>g,qL&SZG Ou՛ևt`)\~6xщ}؋X":<=A'ů *Z{kZ({ZG6yD٪B~I^]cčwIov]]-Ҏٟm^vTfkopvvAĨ}v6rI@ܯߍA _%2>x9VdvL)ZY,>Mqi,؄=@;kkO{W}mS}_/n}؝k}8 OB7U?K "ꫳOٸFF|cy|+#WaWHcv|22O=\οvLԥ7;|D{ɳnm: ,J$ܐ+Xܱ?U_T9h+WGe`qT,jp hƘ-9`xe脮oU6ڇ\ΤhA97lc}{%9kV/;݁'-'+^b ?>Fr"!5- ?=Q;ݩP #~71m{-tIV +[Ԃ.̮Zro 7\>?}Zm:5Οxi^c}{dJ@x\c}sf8(wRvܢ9Rng, !c/cg9Z <ԡi@LXi|zYOrʣ%AO#Dӗ^sęw 6b ?2-QUd/o{ڮ %Ω_+rz 3#ehq&F>pB2{ܣ)97D~I,z$/2$,)E?:}vQѿ..kQp}N7mx[Mjg1dMK>Wnldx"dν ˰5 OqE7[xwnSKppdc]2j.PÔmi"TgȎ*, D/k}\)Ή^qǖx6*A~u(=vTE-ҹ /ķmV:%Eqc[ctz;vdZShwIbb'Zz/BvU~>V+\":nVQY8bkѠlz׸\l]_hˠ[Oދ߰='?Ǫmtq}Cp.&6[;㴇9kE-[7aQmz6} OiO{ZiVXE& 1B?vg Ýt9l|^7'䆙}*XCW&x|6׷Vڷ~kk_Rm3h3ͥ[Y&299ۡ]'z_S—sjGx|$| f@dQdBab,jTO\\򗷣|g- h8lɔN|A 8?jxQKBF8#Sh8v^Y [_OKfޒcu~3On~ح>jp%׵Wa8SڭOU]L|BW[M,>^^HͻՎmk`СΕyzrKUr~?za-;z~Z~W\<wd`A.oD`ˊau({i=Ws8'-f`0e;/|a.g?/^җ}Z? r#NۃABZf7){D(1*1( @2H&"†@l ]dQG{au <}E+Yi/bkE%2߄~h}?M~m1^‡'D,Ŵ_z&0vKH}476~zv@]v/6y,rYK 4eЄM{Ue*g;mFݺ^S;zbfbu|ȎB2'f[araoڻަ{hO iw>*7 _oE qgN1~{WC}jf@GTW,Ŀ8S/o;oh<_/EtCxܡ=.'yU|į|,fy~.ğ2xI4 G\IqHOjpәl5CXG??ig}vn=A)OyJ{nLzu-S[dOO!K34mʼ2q< ROu* 9K{8 &z.&@\8@t sSZtjk#}bdt^M|CXUwϾ]:y8Ek6]x%.(ŗpMYp06(11); Z''/'q<ʦs#ft ]z*] 8gigRG_Mw_cCv2>Dz:lJ>@q~s,ͮkjN~]˟o|{k_svLM.9o7##_lRK8:yaKKuMhN.<:CLE ' zӅ$9lY (} W.ݹ=NGmڽrb;a1vb>fSgvb$4Sţ:~GMYgy!)02搜~/hkf1HLLtRmB?;XΓg`r|Agl {l$pɋȏ ?2 tunUa=|LOW|=WB+ Ocybt"P%JǃGGN71);qgq7>x#Gl)13%~ m9z2K17E?㔅fM?mG>> 톇>irj_`@]QdaMv<Ԁ?9/kKܰvdWN'C}gB<&ݵP욝+>? td;G= GViW!LC,M6d!믻qOʝ+oK7_]Y_Y05옼Oz*Ӆܾ۶;޾uͿPqoYҩx+]UZbM٘bkIyoǺ[`,~ğth- =VΑqZ :zYvigcMFşl,Tw(xFu|#20CO9W,hΚLp ri0ʬyw ؄X'\x`}@MynReqݻ cGCNruK\7GX7']NncF"Z ՅKwV}NlFv5>[C5|8}מΪRI_wZܺw}:;C7ڈXO|cWف¾zA{Smo{[.7\^XbmalDxzW5(24 Z[ujrGhТ2MmiXzufq@W࿨ kZ^4yNF_ADrw^8?Ye3tq,i O[8tEC96xbG9n\A/%_.qx:wYp9v۷=?-?SmS}cgW-AO8v S`;h_vic;ߋ6&G־W^>Ϸs>wi;kg.}5_ZNYZ(5'1 "Q>=GEm*kŵS@tX޾v NkgM]8^Nvl·hI[w}1:> T7yʗo F>z .xK^/5ncq-B_p j6 d/xdF"gdN=pu茞 3CQN>2,H؈̒6|", BӁ@lR:bh\> ;:]-MhXkw0c1[;tFM6Ӎ^@IAY""Eb4=>2uBG9z1~mO7)p׮Ily/-b'-{Rfe`$xcl:}1lW4h SW6=*y蝻6˯oԄh ].vW=.!t].8vOpյbGT |#NUOoN8nrlv$YLd'&]6bWW۲ƘV{o)> "y睷r'FC,` z1vr]3j0Nܠ3i l0@_"9a9c}'@̣C‮jM_!!؋Rvx1n=xBɵ"1-]W>1K[sxGk^!ϱ[XWvb4s r~lgd>KUTk^8 PM$A#_=6@>ԶGۿm|);(DO ɘ~/8fgE79>d`?G6Ymʏڮt/t_)%uN9_T{&/0"v׵ˮq:/ֺ]yuzCE79{vx[([_[<^}g9b]0sv7F\|Т[Ў,@۱G?~^@[};.VB9SB9O|Mojwd"|S 9`|RN[1*g>8۹+l`"r4䌬+m`:D05+㬜F }|#~>?H%Lb#nڬk/ɏ8(: g"|D^oh{N?}_b;s9s\ܼV'.?ێ÷m*g3ee`|vVf=mW7Hz۶?jPl9>8/'3}>K!eذӆƖmIVAC{mmLTځOto1Po}T̃k2>~I'~>REl''w/H ul׽}ɥ;߹]/o]8^jG׍/ylrb^s]6W˞z?K֌G&k ;=K{F> l_ƖL#IlwMȽ @=ȁlQ͔[='5)Gx=oPA= nZɆa_F&8oll;.#ѝ-Vayw}e Os;|zիgeA#1_+v X[u1"/جHSp2(@ud,%@oDtЇrsv =m <ϨogdnEB:>'0:!2#qeImF~\xlЌ x'nhFUhJOtbߒcK鹹OGڷ^p.1$NЎ9ƁJSbbێܟw7'1y / W9=hц3Jui_݃ka][ǟ .å..{JOUP_ <1iw mޭ}l]}5Fr_[nE0mR_2Ʊv2Q~|p !;KJ;2 @>g-hH腎sxdIqLw':b я-j:+}MCm>&Z'Yn -oŰ(B x2*Cr'6$/|x,8Ё"2&Ȣh#|L32aE^9= &3>1w:YBPЇ pƓou{!9OYldD:G4odRzW_&@ &$z$>7?Duvl-^Zoa]ׯ7 ɡW?&կmې콽ϮXyUMvDOv}W}e[Mvϝ_f^|Aj\κ9 owWm߳v,\No2Co~R6j?cگϴ\;~8O)1/!H?Q.دT:>ێyCx(sF, r!h1!21,@K߉ᱚcMɪmd=G敍i4 R??=F& 28Y/0eL8Ocy9%ġ%Oԧǀ#sĹh@~QRgx#mwTꋍl O??6|= R+f{!d3ڒG|8fR6NʦWzBOڶ/`@[{'τ朮.$g1$`2tքxBmY?.{\<ؓX{KlgJC'TG?]Vx2\V{~7?Li onguVO %gmbzz3!q gL&1 t4lee>k7\FKp"ZR$b5~H٘,i@b{*n 1EW`a0;3f,86+Z 8Fe,Nx.OƘEer'ђ8\~sQ&:bj\BEu|JgD^ճ\=A_=v7rG:)K1 ڮ_H@? 
֚6Zw0ɇt#R/d{w=uw|w{W߾s!yK@r.d}Mgqc?4K KB pCmM߸Fޮa;{1dC-(P;eܥ=.S|v>]gtxjZ<2XL~bs-&: ,7?k}GExcyx'ǡ9'Fesc?/}cŮE' я y:1>DB2Fegfq|hDٞ~h>Sn1 wc>iBuD AMuݬisn^p3|s~\r~u"Gc3=uL<:KڑG~t |6!U&׽)nwݤ1-":[An[-iwpW⳷|539s}Ox4_jBZ䭢sj q?b :ENMwəصO,ĎՏ )xN+kag m[j?{^vCjcwcUKz Y_휺sN?ս //'>3o .u#e[8{ԥW,$(WRg*,)ࡳچeUh8NRW1dLt=)H SFȠz_82ڭV>jx֡b[ɣΓv^x !ɯC$4&@ X1<1#36'{*48I }uߑ=~ Y LZьnpA"7|;oy鲥nk~ ek<O=(I49u층&n}cM8@# {܏=YUw.@817|M< K~iF,mWSqp_uS1V7P,{a(}PwU:`c mZLݷ.[>wo~j1]O G=iBz$%t 1@d^[6G4mdߍ&Z_O&1H-_i#Ct Et[N^|㕱: `?JcjNj=ZCTGe@x i b ._ZڑAZAO&m8Ky|A~IZ܄'kZvAt G'9}3&@A茶}sz>;7 䐴7rɦٴݱ/o~yNS?F}#|%A/ :E)w?vԝߵk{Kj\fsꩃ,??l}kCM1uS{ l}{vo,Ň-Iyj+7!7^#C~C#ЂŻk푏E)mu.@QZuL$X %[ Ou'Q_0ܮ&37bk|퍵0ys]w£QGSz6᫞nuq }:mtl KgW"#9;B:X6{_چct!cOm@Sʱ|%UMk]oVl߰֋mA3|D>$lFݖ߶!inODm+l~c5ݷy߹=q&Es9u RF~t@CB7{.Xb#i8QDZ/?qFTvc^bug!FQ:蚾 ۭpN+XXܿO]}#Y~o <7U} yWD:K2p'vb ~;7Lf/ -82Yb''^f/{>ʱVѕ\Bb#rei-v,FtʪepǶ(ء>^8 h:`50mfu&uǁ"ީ3`眞3XQLf/D'9N/?XC^rDnV_^[xj~=d'g| bGmaoGck=oB;Q kJO{ hmG?;ݩ/ хl@6&|Փ|ev.DZi(GipMNzu%G1jJd6SU l)}[Syz^s;]‰^%2~KJ+8.GaUïEڹ$jgkw῕_%{SڼK®h;tgi<;:O$@3d ]mÑ_LΆ#/oȸg5;/CKC}#;1z2_8 c3pc˃\kէL%^Vhy&94 g ~p|7Xv:)i+H2Oz0ʌOu0Ȕyf3I,X IZ{mZc\5|}O//͡GS ,E O6p}#ؗl굥OtT/g5>뻾}{p` ȵȶazL?>ؕyĘ]ţX=:l|,Kݡȣ3-E8}~t{ lll|qԬ˰.g,˂@dЌΥ# @XϘ_@Fe\ 8E $<&SU|CvVwd؇? n%O~ͻKWU?MU/yZmG??& Zdd!~ 28c9o6dC #§X4T(}х3'dGof^On3G6m"{Id- Z-XUW}ܻY$ zu]ضu=7"K79V'ƥĎ, a+^%n;3\M/_&\l>}9W!S%0i5 Li4 ,dW> o5:؄h{ )oS$vW䪭(dYm5*Q/yPuSG_!Kb<Lzb0wڱaBX~3e }({J^.P/d0p4jCEtŰUGguc'&ދ^Q|ͦ?Vk~#Qoj6 þb&cC$G1\i%%C/>֞cl(zƷN`F]\L#Ir[}O?Bvd:=^$.Ms-6dxoaJ~ŀ㔏yϾ:Q:f:͈(HA)|h #G6zl3.СGt`]쑉Il;U7yzh%ZsN{Ski ~m$&VO\#VyebbOLkpY؂M%u@jSɐměMz8C O E|) ݶE{";uGWUW(!az"N}~{gl;sznf~K^( a-A96 ]>&A1.\ }8OUJKl8\O~B&"H9^xЙ׆vQ?V2WRP#1ΦQ D:Lq1q 3'`L#ed|@N؄w V|sړ\/?a"5]rs>x‚>lkOt86RW㫉[pTmNo-;b[">D>F^6B׎{JYwqDM7)SJ/[2 w{ ,˥$A>ޠ ڏϨ5ИK^Z|X+LRG\9E D-@nCNB7&wotc<Lܮs"-l*?8 SAdG-8e+~@旦f('3. f29(G7 Ȉֱ6,LPk!/E^t!󲀷L_Cy[h, @m}+&rt9+R.%osՉI&h{ [{(xDy{^cq3iYb5=S.!|ДG?+;_EOEj|G9W;Xul<1Dv1w hL1?e+2[ACǸؤd@QXπ ɨqHqr)NKq9d"t>Z:rujh @K bC l 1e( `7z|c<pϺ즹 Fno5䘵XY 8'\1cJ=g?mwW]Zz%4f>5YgrB_OWnTɹyFhA!_$yM7fE/R6W{ol6GA^tfi=_ִuٿ'=Iw5<[{8(XyV,6쒔hqxd;3 P#t-+40cȭPYl(O]'K\8h;L}>>ukvvYKXO|0V(gd8NǐqL$h GA ?%Mf6!odg^P2ȒgV'e:AcC) z L[Kʁ7~FM {N=Ž&c7m\ ^Xy󫂽ŏWz`oMVPQumS=]ZFV 'K`1:hgrRba >u ?t}}stIuGۄ&D8 |`ז(} wޔ y` EF0oICJY +`wߔ\=<;^G=v5\Gܲ6)G9od?Dž2-r㴏ԧ>#1mA] zj#9 $R#en]/ÚQ$I)WKG//X` iіS:k_ 0 `$)RƁIq%OYhmq' ᢝJ9'NfƠHGgh/1O=t8NG$DVv`1R«^ԣkRz0~ڬPEKSזJгM9gQ^m; |e Uv:󫙮]*^Ƴ`GZ>u#\b/O l0'?XOXhC$^@9ISkv;WVv=z۞]ezӭtdo8"\N2) 8m ,1>x8/W8N=%y*z (Ł_dkQlm#?iKq>fY}}[q(ClFX#/?feMwgJ$efv,Mme،-ƌaS6/g89 pe \`0siG>/™.exg EF,lN`3]R Y؛=х d|hɾ>گW~ۿ=||/r1]moMxgGgآ~UaK|!L沇6!=cؑ6cf:Hʭ Sl'?/u=ѡE Ѽ33 G6cMBbNeJ޼ROO<)>y DOeoi@S9<q`ȂNx9H<}RtdRl,2JD䯌Ab1M9ܤ8C31[Xh#iMrВȓt! VCd-B t,:/CsZ+=Ј.GP/qCx=~{msAgtw8wc7|ruvZ`Nj-GxkCUv\]=Ye)[ym uɄ X9+? `WPt=ځDbNw?=J'27lؑfN.ֽzWUMM[ }o|t`Y=SW>0'H" C.$fK WYDNƲ6Z{,r0 A9:P4<C@sN AQ2y $ȫ( ^OF 5WMg;wI򒣄kq|HƏ͇To{n{q}~2Pw.'92v|ٟMTͯz!эvBmb+>YBo|b[YSiw}3޷}}mgS A99ŏ8pNN; ԏv na,t/Б;]90~z\Aݸown/~nӟ~|B1]_%hMh-䐢$1.?+7/x 7 = m^by)) ȸV̳/zL 1d:4HGH'I&<9@+vCbU #ho?fK_ _TcZeW;ԯnu8rCe#z*)v^_&iKo[4 G??i;vb'~'W#}9ᱵeB,ėsvo'C_ĂR>.m s1)>s ͜8e}meD ɠ Ɣass#Gp^ z:Hc2 DU,c :8уΡK!n#*ndPrzyA7L>St3e:^_®/_?T?Pd[&:]֩<{2YJt]tuwT Y9TdA&'͡3? O ![Fȑ6s+*zYK[=+ 4}7_|iϫھmoQzK}1u8*~A]'G.1t <*MuXbΫ[뗯WZl؁uOġtXKAIisPW@#s q<n貭~2.լ1zFO} }ލ4å]z:<08}G~<@г>=o?3?t}M=m=U5mC#>V޳|K (E cy/!\'pS1_;:_[Ϋh[ d{NZ̈́=I) E>sSs-7Q Ӑ<uknG+𝵍 yL)W@wY\w&Vm{+v-{:fsO +`{=iOkկt<66 +tzfœ=$h;y@}v\q* {?wjEmzu.(˒Ȃ%ZTE4.L}ﶭcs8nO:r?SƜ АÕڡf@|ד^hPi?z˧vdG.كp~   |28IJ~ {- |qN7YK-T!,$_s2J:@u." 
WTer*GS{m@؈-pDbIYR\e#8PӅܑYNYy {~u__/<|A]v&n+9*|]sdMgHT]E<[h:#5';gu2 f Et}KM9қǫB9*'Wc,޴-Eogَ]%}$/F~ c(cGvJ׼?JdZO_ ?|>B+-'.[7`_m7cQ@Hg q׷>RsN1nS_;V~=Q`FL3dpnفDZAf@֠?y(2 #gu ;'_ ǞI[o#|=/i mI]:wZM%h4JǨ1~$/Q#&'b56/@g b"ڭ#ݷ;sНSkU?Wkժ^U>^_tڟܭtx^%Wړ%)_yٶ_(kv_l{mqL+%;o? B"Ķttk?"nBp9 zC"'s=?i:=m-vQeżO.gɨ`\h8j{&~zc_w]<l]&-ks5״GׇF43o$r,xyLAb"S]- tM7׽u# _Z˿_QGZڞ:<ց3y\ muX䄝iP bAVKw%\Q:&טؒny_ Ȇoz'yϨ_vD&w F&M3w_"~_ЮzVPZ\vgwV&S.Rdbub|!- ҧdHlMm*\ڌP0nԝc=9хmGGXGmCUW?s&#hYs7|nyW@[{~\v46ፏquts1Dc;'{{1[elߍ/)n 'ܶX)+ WWon}n-Gkַi\'ϭuKûNZb캈9 l^S7Ayw3ºp7;v)*i'!n|dؤ=cv $W/!+s>n#KlJ?Kv[SQ;ytofuV_0\vW=UtȥM7︈Pl5~&EULvg ?>MbUn&ڒ]yd[_8A![{~ߡ"`F-tJ-j:lt$ |1\dHNLP+vtr|"\L hH|&c`^pЏ/Ɓ 䢏N"߮\~]?@XW&EO_{,;]=EXƗ>S6?_RmcF}2ڹCu.Eo4rE~QC=cɗnVb,t"MK6郩 CP%YͩQ5l#?}@<%-3I']Y}tħ{W֓SAY7\G':c9ɑ5[~˷|K{T>!oNz-I$qXNVg{&e9^|| +~+q!8nA3Y α,kwľN> 1zщ(ԕlwQGI>NRGuKt:]9[ff_Ȥڏ^s/zыڃD\U~O nysq@?cdx]6>T4B*ks }wsn:x}NtM?Гrɒ-00Փ1CNqqLA6R;/~U/0eIY_ůS gWRMCϫ6298\ ힽpߐÌ.?oQ\'1(8[:V7ՍϭG[HW"oMKk_Χ5ĘȾ'ٖ>}61C5R-^~,[:A`(qqc$2J۝:nNc,+z9/4yd˅sQ(@ $#e:z! p=wYEG''~b;l@ֶ\_⳷:]ɔ#A D+sX|r *9wdM ;~7z?^#nJ67?=kO.H<}6<]GMڨ׺I@Vrz.lMz:i~r"E_*.!MO1m%E-U9{W@Fv"+N@'+|n.|WƊ-ȵ7 q 0ցOP:mO%Gn{]i:"^dG@#a`q4y%%mrTc/>. c|,y#(_//_m>DFSz71=!]SOȥrvXn["+?| !hujcL;?]r+[N~QTNxAg/Cc+Nu} 9P?r2 t8Y t@~!: Ȁ\' $:G_zuqA?#3"3;qumE_E؃kJ]ukxO)mTC,81.l'M%LQTES"S /(t>]ϿYwׯ8îx|:p|e|?H\LҾ#3X֫ /ʜ_͜WiPt}|?("h=__gmd@JeO-M፺6'fDZx 6rrKDeGw|wfp&uX&I?9l/h.3>}ܛOkYr> ILdMd^$7=y{or+>znzvd3r|u@6MC.>xc_2%xY%EZ8GAmE// ?wrʭ߯ٿ2믿8p#TRx0w;AEs-C{!o=o2Dvӱn!9Hڐ!6A/}[Uv}]p 累ʯce.Bw~}=uuw_KxTٜ$i/vyu~BAl;W۽E_^HF^l\w;֮w^_-IwT+9Xۮ鶕p'mĥrիmm+ֻj5:R@|c$uc?PL$pW]DU'C66fOeQ✎d2fuǷc1^䈾=\7?vKeoЍ_ D~<n8O U9hpRudmwc69}]`s\5zb+ WLbfEo;!<8Fᑱ(3A˱p cvdJOȍ3> ȟ~E7סmo+Xj/~q:L?^1sxFzY}7sЋ3 04s ça9{v2PO:[c^9"gSgKGi=2D91`].L]=x|Y}.lˇj޺Ǣt;K%|~$熥dۄp eE;>W3nj}&Yķ=A@/d>:dߔ-SDZcbdEI]^@[2%28s }ݦj@`XDJ"s?> r݉qDۨ7Oډsw}soMX'?6;Fx29:e?l=ugK6bfqE9{d:~c69vnSpx8Д%<;iMΩ;]uWrY%DㄶOYָ^ķxc= o졲M]6^l8v|/1&؆N>q7`xΪcaܜ,!V—"el-%~XMɓڦ7dܡC㱲΂݃E):ˁم%2:ݗ|ɗtRgAl~qumTU./;ķݎUnLTw+EIF4OI@nǵjjks2~V0/kdLlgcL8ιv#=>N[mĊ \k jQ}Nn%uua~K MXv1/'zDQqO\{GL1lw&}ƂEܝ>'gnX?/r6bBH|'r~Lfo'M"YAsvp|: nWxymVrl`n_ZZT{Q78|n6S#a;nGM2ewy7/[_,!<`|}F~)K=zK :#'y^I|4f={?3VwyQwώ*V=>xm:zPc\D:g6%di;2$洉dqӏ!wM'q4әg-|\Y`}lƝK۴Yj)?^ɼ.zbxC'f&<,o90( K;Tx: ? ~8F6- !+9.S:vW:[$'JЇCM[5⟲ў :;OxFI . 
PvJt#x:+6JE$ ,{~z]nxkRuc/Ꜿ}zgYȲ2fwZ@wJmn|x g+mgb+?\]!9ɷuuΰV?vC{O#}7mA8(ϊAf$h&]X X :.bkUJG/qN?\m0pKs, hxw_;96֎ Д` ]]^a;%_sr7 κm}n aԣ kvlG^ۣ[qv[ܶM`G j+APkM$J2:3(^Fy[w;9NGGW o9Y;>ʶոeŚޚQoyljz,spL&;i@da G?8(ȵ |ľ@2$4)p;krО#^6A|Kk+^Vz7f\O`ӔOrM]G^~Ǐk ; }L%%m#2;-71 YVuB0k tH:I6cD7xsy۠7^ű %GM28%7mF]?)) ~t:&wN 2"E P6ٲoU^%^C2WҴV[Ƶ =ˀ -@mkwJ.+sm=pne&o忴zN$A,ҝ,eEpv_/O,,8R+k@#$dKR(z\9Q;iKFcLXÆQ-|З] VG}KNUT'n@!'КwZ {dgd^$]F6pbbBdg7cI'a눊_h |q_=W٥0 J9 QxȤs0r MGGt8NV?&@O2:+#C^5L9; %ƶ‰]ÃNtcwJ4щc[hG]l#t"8K]$=yn_YKQG P6kl[-1>֦삮WCksYg 1_Q|RVZ{+[{3o/|c* :-[V /_t$Oh^cBjBwG*9%:d V$X`xd9|M=vdĴ@}.6<O/?叫ŢE6?7?3?Tt'_ %=ߎc8/{Njj2V3\gA$_oV (aywe߆n:?"2rg,E#Ƥ.NtNPsAo %<П4t.w@ zF' Ѕ]ENtɚc2(GoI7`byyyw^ɭ~*yyG}MBqO7|G Qa$*24IX`;E Ň-tj:ԤxkI8|NƆ.D3*NdK?S,&l4h芏6wB._2Avt Ξ>[ wzիh^κ٘B&3δ\]όelčsm@?g+4ѵFC45OoH?`Ђ:σ>cKf Xp(7SX]rI)WgUN^$@& S &7$& #ᝀVC+f:"7^//ӏlHB?H[);"5 ^eMvhY_p]᠘XXh/eV?mFZ{ZW3ۛ4M,O9 ٞSZ?ވccIx D cuFb6GQlA2{.mBcј0?%p\HƤLrkoŐLp'mn׽u]+Eϸf?ƛE]T2rk`Fej{ho3M[F& k$6՝2^@]q?R_;XڒrfHB\.&Sg2O ^έ焉Bn^,|_׍MV.ZKc:=2.=Iyy+'dM/v Ml7Pޘ&>2ƩW)/c_ \ZOm~_4|Tͮ ( ڧ}ڧ??o|q .!{Zۈ:G r>BDc!s&ڎsc}egڪKNJpA)Kev@FC?8t3~Ugz_`Ti7jON^m7N]t.El<&>E3r:bU[:t;eѠ&:}WvIy '_l2G:zo7';G Ɠ\fc@U|e2/qvr}["rFȣAtԣƙz{#/G9;˅qDf|L42Apvh6$o=]umoYN?{Y|^fvuHp\wu}o|o"`yߕ& 15\7)V'xhIrqpF\FN߉b@td'GlztZTL[u Hd/CeI9yQ=K\&BYmj鋊z젻Xp ͔\) BE(bՠ+s-[VGjV2yk~>=).%` 0alxng]D+Z%XaXc8Z:ⳓ>[xҮ]Q_wu|%9Sv^g?2ȕq qB'Y"k džxo5^ycѹ@Db%D,cmM}#ع\pR\]dw2V,i,[k06H`E:B]zvqG?}GG9P&K)$8N{6rNN.sZf#^xo7@ A|y ^GF/h'ePn:r磾p2Nd ^ŝOC $wL3([%G)4AMz6|X6qAsq:Zv÷) ^Q-Ļjl.zߴǛqMz `D_75߉ _r_Ik?|E?} ]WZI +n-_7+ؗ#`G4G;*ӆ%c ?>n`U>kB?MV. cx>Mcch/ vS[>2`U{cIڨ_@"8pJ[` B վ>^&@= VELR~ロK:\@UƄ'i/Q#2b; iXgaowh*C_dt GQNrFi0چopb2+eM 7Eyh 9萄2<]$cd-ӡ=믻u1[1"mw ~ŋ|WdrN\z5m?iY=Zkf0K:I*۞[r`FI@V _B~ǧɗJmn{}6~ǖd|F 63?Cj+~BoN,H 9rl)_zU-ci_;,V"S2f!|=SVG^eC6؃n3:1 塓v:@NVkQz[GbBj֕;`Qi?3Z'̺mv4^lgu.:lUf> [O=ݬ]CJRx]|/[w[jpA@IDATπu'aw vzaoUSojgGZ#V~S&&-6(c>ftݔm&Eb'BeDqhf_):Ҟ6y_cR v W~t;\׺LWEi{IJi;L+X}loEO* x0:߽[ۄ$ぶb]@ѡPu)[}Q q`+>x7ywk+."0d;CX.ivrf ]|eGDlglGI`VW֢"kӁoDӾ x5fd't̓2t< b=V[5>=﷎|A46v6:hD.:drѱ&kӨc5zcG{KР%/=6 t6 |]9֢&up%/r5qjşA*:y-aJ[%V|OhU7?^k6'22..Љ 9ZZc&) )ׯsaŖKRW/7|׫x92Qȸ3ާ+'W|gocۥO2DFcrɱ.ďD"zG< چ:m֚ DXB[ְJ^{mMꫯq} A_GXtMEel q"#k;s眑~'ɂ@\< %"+\US&Gl@ nStViklj(sLN~<ǖK b/ d0Sv*+GC=.+Gt-ET^{W~~®vzx}I\%'K]~}w>!p'ݎܛ}dC3@gxQWt>AwF_ЖSw`& =tQ-n~!08rEh) 6`c'm"28:Ƅfmcw([kReu'ˇzS8RHdyVjf~F?;&[=>ɴ I;mBaB>Ϩo&ȬxX#j)8LEjĀĠ]$0v( ԍp\ʱtfe(!VU9B.t_|..XY}[]hݧ oghh])mKz.vR]WUmƷXJf'u:7#nԍN$+ۧ=N k6iEnx)`Qu_>zc|u\TVwlӶa4kM9y|9=`.S>zxє=hwl?W9~~nx{ Ś`j/3r/̷I܋8J7%x<C^'1@7}7ϫŨGXS Mzmѣl'@\ mos9lΝwD#t3'q8r:Z4_[vyVr@FiROG〱h16ˈƸpDmtLqyɕ6|yݔVCWQ^Bԡ Hkx5L,C"ࡡ.WR <\2wZC}V; Sq)懻?C?Է>N.SE(r%2턧،hK/5aLubȋ> e1#Wg}/@ZDdQؔ livۆƲȖ2c4 4C.pE.%/G.Db1Aeffn#mkvrEGm&<6Fy҇~F1m{M\tMD8(%EE xSG L]fWpmRrD7ߦ]c8 HlQD Uw& \lao#OG:k^Rٸ/:/:<Ўal54xpq.K4C_{\9VR,7+t Iϝ 79:vǮhSFv 9bKCMmCl;2<ЎMM ^\EUWV_|(6ݪt#+](7@Gt ]B'=*;*ΊoY~08ݜWϬ$t._X)w|,6/:Y#.p=H|G[@v>mU-X;6͜{ *h^cZhT[Rk*VNWKUF*Fߖ]5&F}+Rƀשb˱/wM#sisF7tvtOC;v9{C..~yyhvH=.uq{{ۭxSѾ^Av=.u(ygk]pV{綇<^W}bF#0w==rO!>/_t6vr" \=<]q9lQj;)CX=uwv[u@.mUf]HxQ$C Scq(1Gۯ7mdINf x3K@fa㣝t|* tvzc?cW}W4̞'ґ]wO٢ ;5iázԮP8Udk?[svoݭzqg_~AK/j?\өva_#|Rƾ9Pוea$ Fϡq \>d56cHovљ/(F#3CEvMVc%?2@3iщ9Uzܟq@-9gΎ` B+l,[L==i_>7"ot:BZFf!o |>A|rO˰@oٿo\x|8ۏlœAܝ:fu|\s Cx̡IIdJzJ=ޒzʝF6iM|Q]?"NĶ vig.y=5|6@qo|v2|d;x<_.G1cT\h7o~C{n-W1¥BQWuPD2޺Twwu15)Kx.AAuz݇?=½)_.nƮ=}'!N.$Ċ<Ɏ#61~x\Z#.O~r_4h?22Mј<9]qcFԡbdȩlW#KN8(K=:-^td9%:Ixu' h ;yL/gT_Z>Ĥ?0NEٻlbvr}> $~^?J&-yR|!e^DQb3|Q 2v;EpmN(WFOG?A{l\Rm 8 n_.Rx[([胎ɘSB}U_l_kMuEb<^hyܜ%m;M䐏zcG8&QNIrelvOcL.V@mrO|3;N>nYwn]Kb_}G{ş}>캀O;j R.?viM?9{uHλ;kⵞCuC;kcZ[B/M.޷]tzP 8E'=L>RF{vHw͋9ugcMvr_; y}Ƨ(7V䀿l/mr`i[Ku }J>j~؛^t'l>h/ Eri7EEU6Ogc #zWwz,Vhѝ 
wOПegWgjOI݇[kgiviyP}iMN^a1Z2lEvﶰ½TC&:wq=rƘz+=5!pWd ;jxϝ}5aD5}Λݾߞ>یNl7?|8BcO1_TMІIy#5ALI!Eo^΂WY@P 0? &7'q6d.m lյ^(^{m_Q}~~oo}qco_gӞ~z38ICntHW zH]pO9l;fwY*½ugNJlP铅"umuvMX?w_{CC=;u︥k}XKo~G+C/m _1 rvOLZC#>[vu`!N!̔?>#"0s V!풧}ڦ|'yh0X$KxG9wL5T<Ǝ[9ftU>* Edα4 Bg CP_WN reLGI6b+$pC}}pѮY_lsƘ–/23 1J#(\M/+;813g/'_|jB-?hUmz t-}LC8}L%e3vٍԣ\G$6%9lyFx?hh k9]9yͱm_^i)Oi7x^[.vS}=]\ouzpPvww^nYL<Gjj%6zHM4#e0R4Ú2ԸPEtv?on5^& e{o^yݍٟڧ>aw];77tKgRpZ1V7kԫ[j;=e)O;ԏ8cu mk7Cc'n(hlZ3(eHr%Q/q8Oi^]7ߐ־Ovcf#W ? s>-z96HBW2bؗN|6#srD ?\mbٟo^=n[gP<>綧?#ۣ?uhұt]q/>/~B:.oM6s:\KGuN_'^{%ovS;2XyEo|O}oh_+ڕ+kl 1mNL7Bչ V#ޢB}=b]%06+fa  ~v9%)X;:&Lp'Ƕ#Odڐ<''?m&2 tq:2m^hnڲ;zdGg]ЖAx$WwlyȨ.x2trGߘ{DF36| >pˑvW/_~ً|5m9:>oS/CuQ-z&G} cGb آ];nr ]' x8mϪu![Ly9\zѹrE;p9w[mIwņM?W=}S ί8Jʖ/[K>v,Ӗ^]r6mVہW|lɇuhL0u<;71~Q>OxpA{1XޥvUW-zѲPҷf _fr;N)rI[97oAuc=Z9C\}2C,FW\5&oa@e^6qmV 9Vo=M׾OJ3;~%'no_#_ڄ"Z!<=$ZF[;NGu-WΗ7aG)җwn*_fE2 Bƛ x>\DN>"+,`'|8~95ool?]wۍ׿]tSj}i=NxDkZ<,c "ˤw] ]g}LZPlivIZ'6nlar[Jhd&yI \ kƽm~K{H;2_;1y~﹣}^{g i3?klj9u9|aU@/R/| ]|ś =힗 rfVLO>>ЄesQ` ~g}V9 yL+AY8kbɁ!zuIq6Ѕv5LD&FgqOǭz͠sd;[9 hriN ONlW)csF'L՗HkV:Yq[¯$u*+>cl;'۵w8MK%9xoo1pMw{k?DŽT+UWMkq&w`=Ĭ6#NpW&gBSѾ9C1q#l㏉yHXsv-׵kKjL|cd(cFq;g*q;V2lU2kwo70vEW>Q8|•qN{s9v]]{;/?>xퟴ5Ol9M%'}3lړyc ~{KxEB ;?] 0ATW^ً(2:p;-jK߃ͱrv3H ȭ_@Bx$e`vٛ}kٳFlV_i틿W~wBٳ5N3$~vʦD6S[b#{j8Kbf̪L18 ~U_B6yЗͭ$w2<^IF*Џm#H?D6I "gOp幀72ʏ59 ~d-r:\ήwq5?<>ZmP%EWY=n?SjM}rs q~ưG$cwM,q"G_vQz&U_1o2x掄Ai""%FwmbI[͘օ6Yz$M"tq&)iMno?JY]_wڣvn;8~#6i>޺>wۙ V^4"HϪ %`ylMl9m1$//t@uʷŢ?p{om[Mu ;][*}6xoPKe@}]\ևdZYzo_.uLr,t5 D>*5׿dM sy{?)o~AP2Lv vۍ<Јzu㔁^VszVOTkz״.:e)s&lʷAh531D6p٤$=miD-uv'xRk\/rk0C}v;Z}odߵ Ddf~.&]( Pf p!.)hIl?N"zt]TӎĂG'4!:I\$sk2p{hWjeM5.}G^>Shz pQ]mO{?ڠ(O/ct) nCl6]el-w.]*-ߖ<}I,Yd:^jmmvxg?v]/zLyX/,bL"'c5h#Q.mKZJ?0ij# 1$MmW@|pӭuڷ;oi」޳gX_|v}윺Spr*BCY49y{}d?=ҳC/>=I}~x$aF={W՗%sM*b;gt?!Ϯ p x1Af؝#(rr +=j1O-̭[{k[{Zs݄[6*s%]cn/cm:'Ď~wl+?RSZߴk!lSY2[=Xj[~91'SW,t#/?zCt<ތ9䎳s%OĬ \\t3fsY]XNRwB{֭ڂqjyH{Ƨ\7>{Yeç /܋=uK:Wr KN2p7@;a1ٿQ?j/cMk֢ p"H{zIOG]?j jcȿlm/I9ugw$y;'='~X>]=c{q&lĆׅ-'!b%WS@֕U'2hyVTv`}wpC{knhoM첚?⼳b\[ 6vˇJ`O]M |̖V^XwQ5 ?y\{xIm.V;՞結Q=<=.npv{+oo_eW}=D@U͂]?vC;iP{UiѨ4#ɒ,KnF M 'I;pRCHlcpŶl[n,YͲUF3[]߼y'ي9{޲^{^{ zŒMTyᑇA }oءy(H36 >y)GS+}M\񥭛#_T"ΨXOx"˝ޑ邲@w43 +PaJ6mp&\RSÍ U6tR( qt5MW.& Qo!vA<h ~ŝoa3p΄ܡ w /pA'`_pvR?gRftn+W8&3˽`v[n{삋tyENpCQrHg_ 'CT&||b&5iߎĹ$d{zRYnb\k3{mzh3f8%䋹9yeow(r R(ndC@WhR"\ciP8"Wŭo/=dLkHɈ89]ĻӇ 2Iz:i#P# a֣qL&-޹pqb Cv )q'<x9m <ǑW]7PF~[˦׋[eS8 } "/}}4E vigg !RE9#I>3LpbRa8 1<7|8Tܩ3HwDs@:#adt rܭ&( M^ݶn[}Xg9v OQi -hlpEQ@g[`p!$-Sa\X`˿t}[Mv+Mשy[8Aq4؄:ɬ'-T@T$gS٥:Yti˲!#`~MA6x 9dF*R}+=)lby4*{UR8 5JN<'hL={cR c\d]p,;y4kn%i :r$o9 (LqtW(|. 
wwDJ{Xo'i /;"ЯG - ⢧ +9h(E |p8sWxSy(Ԟ(ǡK:2XIxQSy"};8 >r*NEk6{\ ܉ 3ϐ{Gw` 8OD;H ?.\O¢/Y~DM'Eĥ8X8CDa[V\xmOع\0eg4 USe <¤|'x$  r4.w$ py3j0cDHAgj)Sv5\"(ܶ??h]1.?̙z${ P&Hp7$c(?F(8yQ` E9 \0af(!b_9;Y'kݰ^3G=-ړem҄:/ETTrp`ːWIsS kxb@y!áxgK)\ێ5w:ڳOĪk}Ra61"2.C:4|{hn|x}mvbyEmAU ݡIQ2( -s2oX#-Bi3`«B;|?[wp23I/@IDATIl\7/~Qiv_;4Ld(9` 8XA#bc"Ay`HeWTVIU|{ٻmک-}\{'w.r3`Lx~?Ÿ䅋xp\# ZX;o?[uPk!PB9Нa8hho]f7nF;uh+2=Kh~A0 s9̌Ͽ>#ib8 mk5T ;WD"]HcߥҙW{ڕ&ة'LTLljehGKe%@:CsDC1cnU~ʘb6xuΚF=ЍWfYc.1a ٠|?Cݶac5m 9,Z!^_lVmjgԢPTPP(HÑ((|ˊ+8Q>ewIrOyLݷvDܔ;*),;tv= c"CΨ휵3ţ2`%Kޡ/%=U@kZRFbmf7iR5 la-}#\xPIrOmQF8a$*g8kx(?CBoLǕǏt~ˇ1=3\8x.thrݦjԺߡ+OjNs=2M| pr6wٓ=h?x-;BY# Rj*.Ҫ**ಧ:` KD>oҊs ,h9 GՐϋh|-j(ogtvC^L,\ 3F.Kө/ĖLsd,ne'L-UiQ~);Jb o?=^g܏%A%8%((I֩ wno4) ) GC W5]Պt4(xMS6!yrq4*az\Po_j7/Dô4hI8(䄷) wKc;wk]𺽶YԳ4>vٳ/ۻ/ky3rP03 /( Z}="M \>S/`,<@MW4yGmF@ǡE60憎p"NWUjS-nؐOzTl"!$A=*`+twc,"?QAE&=**W xЕ+ކ0 ?\#Ɲ 1qH/҈|oz~zvC;5΂q{jןsvFw#9߾2i-Sg8nB#opX}ϾoS$RZ/(Nu[^9Ќ1/t82HYQ= %[l ~^{ދ 9X^#}9Nۧ?o~eI&28<]2 u[ZZl0=oq+ &4AB ygwusf3mC"%KrjFXZ ڑNq&ਸ)t,[/# +GV~fΚ+vGz|@IJ?< qLlӖs$g-;dOa>vi  ڧ=.ׯl3h /E y97 3\ *#>KvJGHKFE3n%uݸp\Yҭmk٘ZJx8/Q(2WqpN= e} &Ŷ:W'H $\t$q. q :qGwD4 *kc_qW *0C5yaz<zvk헏=6 NsdFž[+z!oMtl>>7{Un$0qu9χz*v _AC1>>|W^Ӷș>#~ O€gPu2Jopx]I3h~HFH:X*WzCx LޘEe:m]kc&g[ P^8SjrC4ߞ~-?CJp%!p7ur_ItM~bO;S;OoEp9&AB &A~}|,qѤ"N7zlv58`3w;*(1Paą <%dX.Y<}/n}mBD1zI[Y\K8V{mv>K4aOW?hzeK$lIw?<"L(a6#\o:ACArhZ:V5C4q]n@ 3gڀQD6-YB#"HbBB ŃO\1J&Cܽ*_x*b$0:k'0uƝ+2[FB4tCv^+L.z 0"H47W'.⅌G~y'0Q)z&A35e;Fy 9xA7\3$/њϽ<#&yP󆁟kI4$]q=LZgy!4^hG]̆y?k;i7<[jƗDD˭wL[Ԩ 8(?,7hj;缋|Ry'lSKdC1’nk$'E'B'P9{h76dg#aގⒶRLӓ7T<+ˤ?!pޑ8δWn5>!0C÷۶]%{':L@)@L*sO ˿O8Nm?ҥKqn Dx9U}GsT ĤWL`oɘ&z|ipMz!˖F)i4)~/qPe'`}y9ɮ$Zsxd\5֮nH\|V /(1Jo\oŅ.Z]qC<W$0i0ZVqE.{@[7hZm=\{M.{M#2ޅ\ GµO}Ȁ”ɸW=BG0j*$x޻_5dP!2Ν#rl綾ZvҮ'/ȅ1C33/ziT71Bގ*,R(*1e;pq#IySѾKWm+\ > P,dr 6-mc}dwH+ϼ[[Qs ԰YAj!q4YVCXȼ(7JdG6s/n sׯ3qx$GnԒWe^ 0+Xi?ʅ)8+t#,iyoȠgop/ť (aXV ?81is ms!O?g4K2 n%ݝr'|LjGϻPWJ1<& sQ 8 xR9/̝X ~.Hi&{4Ф{|^ŃmͮHB*ܥƐ }1-H5;/Ψ\@NȚ¦*)m[7/nPwғO1 lǣq|x <[996IK:uR4}w7i)}5H; w ?B;{,.<#l>Æ'_\ K.| zԮO=Uktj8\&pQC'NSr &xV cs?3GǴi֭yV Uq# v$JNYxBq 7&=Tָ!q2m}dc]>$/4Ii4+57y  o!hCT\ZQ&8`a%ZjO=YM5#vbk^hÉ9 [uShn6umSsv-Yl.: 0RQ֮ShsX^]}5>.Yvwa|χmyp8QQ.9QiX] Sr?!_!dc 8H0<#^Fk$u?`h]+!Ƞ\V n0̜MOgUw?!ܜ.@j|n#R x\式cT(7 ^ LKjn%`N{v|p!͉>j&"֪N9p˯}-F>}]˴\dƅsC?Y"+0- 4/x(A݋p+(% Zn^k {,P'sgnXG'M*Z|G xd+K 戋Qʫ$7qR/$.K=j[*Wp5ߦQJGM眧qY~[+p"ӒKƘ;T15O ӭJI՞u2G'ѝ¿j&%H_xJ!6' p%7LC74z];)oJw2J}̵PnG)QNI9>SWu7D5^8gOmg+&mzTSs{^z84mfv620`P_W-y?}+`kyR|ᒌāN+#cM=-T 0\rD zUQL ?ȨWT WL ~ 6>,$=F2$uM^ =D*B΀MƬQ„uޛRJ% ̣˝Gr ǵez{R#(;apI >nK4 SJkuZcrPth~Z Y;c3;;}2 p0^~D[:o&F(]K3f%8lڧ‚Z csXZ[jSAG>{*D9VHbpn]wFRwWmV,P !S*W4{lHR-@Ҥ7erUP-Dg*êk5rܡ?U r+á[< imV^h!{k 5L8V)+dZł]~>a21r(8!A/7J +h˷ \8bٚ5V}+jEJM59!6E{9!Np/pLr O1=wsTVkꝨS/mgkzaS>ll/Olۧ/uFhnrӪʤBn_Ps͛`g.fsg6G=#=R1)xƙ+ }6NZЌrϏJN䂰2aϐs[/Mtb2|6Aɶ@DGGݣ})dG!z\A"Iz ;'{8c>ЦVƼNx0$A]U+IcC 9S3o y|ŝ0SxH"ǟh@@©Zn)O\xPKH{:TIdLڬL_s8-s`27z-q%a3"'La#ƒ+_:uEdYH% Nc◴Elt2^v4&HE\9?(DD&,C cIZ_˞nu߰Wi{aҫ]&a&b㏊FA ƗάNh$dq&8-p`Uũ")=0 $2WYY>!*7]4aU:?Hi+r4U*u]rl(evc{V5P!!x>萣H.Fنsk*7`E.H߈䚊2tTJ`D_U]PT^)>==z?qєFtVqq',8D>v|!B 8AR<~أ%> ֙}3x'SoZZ5@Ћz+ g7ڒo<(ՒB9$~R.jg}gd5Qazs=V{2ƚ7g;^KWmSځ9۾sOP&;=y <-i@p귽ǾwnO=͚(%L\'#easF?HA-]t} |ؔx&#D+]o ryp %҈4H!$3lC3WR"%gƲ~_ib}_H*?OA7ks&@XJBQyB.2*S=׷ۃezRz]&vSTziG$0+Wȥ%&%aRBh'i1HtS^q̠W֭&%: =<| T̕jqg8j<29+TapD-xKOf?_oD߽}] SDL.GS[(Rx,2Ow8|˜ <8Rv{;p_}$QQ jݙn.WЊ缁]Bӈ?a@po҉pwנZSǾ;~yfmmf,ˌ ʇ2 q⊼8>w.\i$@GSv=%5[* T2\p]0!`4g!pEP3%0d)/y6҆\Gۗ:gZ&-P'xYF.=Y*XVTׁ o,xyVv[j^;4s-v.3 ?Ѵ]Pa#2ijV\dwm-&R?VH 30's3yhA&PO8/+ΗƊ)< 3he1|{t;i]6~yNOס=*Z+@[ O~cjwj͛]c0Mcolx0nR+xmB[뙑rN-;DvD8cR{G+ORo+3W} |-E)T(#Fb駉(LlOa0Õ PLhJϩ%w YڙBlw`#bb adX〻*-&$9Gy~?=4YY{U#V _z%Fqш!O~D!NE!%?<AuB>Rws/zktfU4Ok`A*uR>Zu 
>wm@=[+v&ĭF"򃁀#H4lÈ"įM{.۠^%KOun薲^tCEeC wVظ=SQ9pz)ϔ?h'(Qv-8FaI:! G=u@L^=X1)hiW'nx\Zy#=V:41ދ;wts &.\Q=a38/$~(aw`~*S yA71˖Y.u j \%:pǍWtB#IG.t5ƾ}P-bU;SJ MlZmԩ޶>:UJ8{ILg@VuJE3g>C/H`|{)8~x͌ 'A}~t@F8tȺ']|oS Y+L<*#/WR9x,O2DK *̀N^ Cڐg{ś.m]1,dpBAx?w4Cs~}:=:6҉bFM^ԬٺuZr(%lwMkYv !"`Gp*~1p4:OP RW&0h tBXˤ4YNۡ{E6m,mٮZ]AX{7xBTTjod&g Q`B)^hV3NTv) kGd" O5R+y ԍ=ZE.BB/āa"H&}S" @YgH!åM8hKF#9 WnQ4<LZ٬&jk_ fNỵͺ[Pu2ثA 2 \N,;(SЎ{ҷA'9\e'fC RQ['ey]p6RVɍ*T&;{7@qTO2"*<"TIH?;k_vmi9>tQ: |iN`XhFG>imڊժi wAW{/IdD1uâ`AyNX=W >E4=ǂr'мGw3bS?X`?.pŀ@^H\ìw̉9!-'ħ,+эXzbdBPC0D AgvzZʔ>^Ɉ/j?6e62`! ZYԐaɇT0XrNV}/vnJX1eh dsyGVNW?켰ac{oIkd `9(W99z+ch]zF0KzWXDTBPKQ$5!K[Λ!iS4k6~glSWZ#UVs\=3!/C勖p-`h#83_8TT x;>ĘV,_/aD?2nZQU!C|/%?x-+5YO=S#IhsªU֤ շZJON.=K! w1$ɖjG4@/Y9]}q=d|e)[U&rN,ryVx/FzDZ?e_#H ?E] IX9[Զ{ph$kR* Ǡ9ibx44שi֯YUq [+FCw8벩A=p4 ዞF8+A ,cM9aGt't ?.{iE$=|D-MLD#ģuKk/-PH!~ fr]x.ԋ*'kڿa G#Z(Sh09Wc_cHpeOr* -mU^_ y?͚3_p>+\!V~\0r훽)5(9ð /Buթmvy$-}_b-ݨMSM=LT 7#]d4!(>Q F ’&WăTxFTO>&9b?xr +( F 0po-{yFŽ8\NP/UkUWh9D,j G+8]9QhM<+#}V!;]ԣ KҸ!:7AdJod?J&[*pȃL , <`"|zKc~ }~c.%?UtRr uUvz nzd5kH[k-V\ 4BAG3y7 ^pŸ1Ԑs䇼D>!&ŷ÷p,ѻ?C@'p[P.xX%lir`WHފ W;iF6|# 07  WA++?|I[tV -t=bQV^.􍥲w.h=@ C~R@GGG:HW|`|#l VLN9 CX(;gS>x*D1 d; |Hbك \@2OиqRTK p|!*]f B>qlPb3x 7+yT|2i K<.aD|x|&Fk|t&]>.|F^x5z*͑FAi ~q5 d+d8IO*[-X_At5+@<0A"6 < Gn!hɍzW\vODw@} W ;ϟa-ftr#}TwkiTFHG8-OPF1O#L .*U Z0fwhe3m\cs4y YU~/WWOOy|/;5mNZ[2 )w"7/f8*Rx(٠@ 'a3L:߈+~#a(_":<<ʟd|G~R iLR,܄J5uڧd}zR 6ϨBH#ZKL/A(D Wis|8,aiC+K!1`eCL C|K#)C p(c50 ~\ !2R GF0\xs?vdA^&8JvM۠ptNXP8A "s |1ò `%'Zp`C_V;{$K_7wn/* vHi͍/9s-X'թCC$m%V|~#Єg3+gAOs1[EtCoj ɎLwc4OS@Yҥ4i)e Z!_p!Aߺ.GB9"$Smx*O9{r*١\ɥ2x,y#dA4>O~W= IIimȃl:S]|Iᥖa(R`W]%= з;YȔV< Ke&w@uv͕'On_}6Mj)Զr[[>.9e-۠0)4LB&t$ p)#|ܩ# $Etۣ2.p;.n|p91~qFЇ{1_@ =?$JeG4l3[k3qb%n@̗}v~(]eL+#/y:x B@m9\I֞)|$#nj qgi6Pc.h1ۦOШ&7b&7$ ΁/`o> ~rR| `{dR3N<+S9zKUS^!e"K!@ryM:k>d3fuKV$jC]谼L*w{zY|zIpc6T yzJa|x b`dyeBnWGy=UJ?Uh"'ߍ*+NP:g h=Q; ?% B&c+Roя~tHEAQqGXzBl}3֬CLi+P  .ܢT8_o/y<@.mӬ-2# 6OywF0rh&s4qxh0.k;!螌enPq L4 0AwL0|XC1es/aUpG]geE%JXcw{Ji 0 |#Xa'0΢v)==Aqpw`E#1 d9ݪԫ0m_L{ HT1J-0 .iy\8pj9qNb=׏$Ī4^~j76 U0]-._`KNӊti_'l Ss ?4&z( $$h|f ^m-ƣH]|Yke3O;\L\ro=ueȗ#ϴ(I:01fy )6m~wzL"}V~o!}Y$S€d^xa@SL]E93"=ګnu}N\I_=\kOO/Νۨ Dђ# k.BO;⎅ȱOYhAvDXiL]ډzֻT373==*g78@IzTRRa yGx.W yAHAXaxɧƕ%w,*O7ukvL;hBGX"-q<ȿ N7t17k?{ . h_<[2~XW5ɩFl4@}vS3&I2ǩGA|&\/-d)`N|FN'qx9_y,C6[>pτ oKÁ[<30g #.q>Ra9GFf¥eY}r&vѯ uD- LpE@8O Oy$<{> ?kWگϰ뺚ԃPacF_'ڕ+t1 _FP8&-%3_Y<ZΟ啠-\o`\#T˭}Ciu&J)^ɁF|m[ӶS 4-B‡Nڧ+;ƏMW% pLbʿ;C|S& Qtm( wxǨ zSZLQAC;ϤmF&87>Q]|٘6(qEkkk9ѱ_/9 qIŁ; l E=`|q^Th&(GպGénu#1(O[U`+mPBbe˖" LxS)m%>{0 ;xIhߖMAPUhH4Z|P(R|s<_IT8#݁˜#&OCT*w ' ix.*x)ai&K6{$G|yUIoka_}\W`%8Kzqa3 =0:7"L7"i '"'O)L ]Q=8*Gw%0Ф!;^wM/es-L]b6xtR(X0-2;q#?z8.#>`BJ(xuJ8>P]( *#/ߨ<ո =R8xæ2I^J>4I8G!O%Q8a`)&:I+AJ|Q:n<^.4\~/d;i!sy7'dH$l؃ '4sf'dwX,1_!!HbU/J"sbH.Z C>^XKP`=[O0 Ÿ[yCAda8\@|0|¢8K.p@\wcSYl)?EJ z=PUO I]qpA S/oOjU=:.Jq50QP|{nbo%]70ђ;A^#swˇfOi|* ,apV9H \:/pÑn5\.WkD\̙O×Y\e~ˡ`k.VTh=vud<%CⲜ҇^+p,,JO*ӧۛlexZۧ.9j-$[OdnNϫH[sFԓ2qCDJr‡mvT3ɐ1 ԭ<)o^77BRʙ4}`6uRV[HI+(k;4 aqyaQ\a#ˀ hBs>f.|#l:Q X"|#m0X!qy;|6k y5Wi/ggi%tdicn,@zU#8:DJcиħE1ɄZr'@asہJ;8d4Mx_=էqUF-\O66h$sZ۴jMXL+#|*K9>X"_.[62{'}㿮?6I2iTHE;e8!  HSe ـy.) &t/sfj? ?gO:rT\1prO82јwami`iũ3ugӴθZJ/,REZ#/x@عfVhxtyOy7ֹtM#,&zG`BDZmaQR&0W14^;ɮ}CmmJ}M_x]. 
fٜ:s&~l"m* d te) ā(ݠ/;#,τK' ;x^#pא{Ixn\d*V;~xTGV($DGN ,.\(k҉H;=?+oQWfhWڂQPpDeUj%wK~J;qF͚#e TCWMǀ+lIV!p,lG'+Dh<󴠧zX4VO'l MEwƹR0f/GCQqHCҷA#}u%?!I-D2r 8=| 7Y:~#h([q1|F wf #N.n063 x; p9O{d)xMSoN |fϞ:-).s= a) ]ב QPCNfZ\t\,Y0#z1:|Pt`gB[펷\8_gj&@}G=2LǑĥ1L]omh+UuM_qW8(!/.@ DdUso`N ,^vҡAY/eN2~i$Z^yE<7kco)vwqvK[` rTʌ'2+F٥e'#!7.|!8چ^BtsQv=q)gx[DQLFi6/WZ)[#\MQ087wg9zkP8 [n P> .?7 Z$]t,$]҈tf4"Vg Ooc(W #M֚?ESYLiOݙMIVl(U9_J#̝J:b\%nKEYN:QolWK|O;n2p&WK6n2ζ\qX0{4XZu78yݖOj!=K֖\m<>񭧴X\~v=0k?gp7AHpMN۝ i)#`eΙ|t1͝!0q4GyShHK)~e\Y|mQUӊ 7WrSCU't&fX\÷e7GB{Ƨ{?m6Q.peG_,h+fio$p"5ߖW]`VJ C/(wqw8*]Ҟ bQHh)m=s>efS@\V L$C _ ^ԕq=gwM=FLgP #[ 75T >ou+}K6='*e;,#1YaV $XM(485E^#V-?5n}Z՛ Q`a{TaK2.+/΂+Ou'47L>2ε< e(˝4N:e NGUߍb8tܳ2D3C O`m,]|^y0gowX,g9%i♫{5zɕaWqwkʋvsuX՞x~cHa|'kAԝ< e`q)<ҙO8ah/\<}Т&(3MuWƳ GTrV>քmۖ=GHrt9+/g.<8d~Ԗ4fU=VAp$ yH(?)KI=>Pȣ Ѱ}KJ+uO:*.'8|rƦMⓟdNy?o{qN2FaZ,_Q&|[)S8fJy)(m]++~@Ϲ/C)MLЄ0+޸ C )CP6z$<,+S/xK1#z!,JԺ~{cBS YQ'g:nןkpOu| ̤GMH| a=1V,B3v f [&I &.m :3~Lc$?4*Ӂt*۹C`S: @,FWE;Y3<ޒPXӑ+DH[bzNcF9 {]?x$3W&pC=mhW|ߞВywTÄ́rCpσsnVtuz3`k?Q+{[L;4A+S;;ҩ4}Q2%ϥlиJwh XuuMN5 - KV:Vِۘ50SS%g~fAHA:l'+ʓ]he鱼|R  i~5Kpp(*,1J'KҠ{jD-t?!'r릘[^ҡi٦*nN@z2cş~L6>zÎRM>_ҽQ);4;ţD?k8*!{|+ , 3ڙm?/ӎiYY19ÿu+1! Olehj':=J/n(j8 Fe4jHL+M*:Ax*V$ ^pƆ)=FMCгMUaUBC=(O 3fj>-~W5zY[O_44UϻuOgʹo| ?dE73(qki:<,J 1B0zʹG~=O^@=gce&بNa ^BSܜē"B3X!:8 ] pڇ'|EAd?A Ҷ-#ߡv%?}S~N?A/A}6^H>۪ưt j½ҊűfIqe;ڴXgQoVoyǑI{l 6pj;doaҫ1ʼneHhS)I;m#<ؑ(mxJ'j6sf8Xo)3:Y*Y|$w; %1<Їy{,wF5"~WE_2bPlnlQ)#4"J)a `zGi,0V F>r.ϊZRcgViTȲ~Q80nx칐R ܠmPF[idÃu?`>|}0HyMjpFLH~qOGG9 #|Hyv\EI9ɃcN|Co6d^ף8+U_D] MM=M)S&MCgffG+CY)_/Iө:J ISޣq8V,sG LW,\yW1՟V+b&Ox5~]S7.3$ـDF guźs.r:MAݴS cWܾ4&p74@?0&ԶhkZ C^%Opu҅ ?K'm0 ϶ksQ"jka.UwQguL$+́ \'=O ْ"4FYי 7pf0|*vCHClAA&a!цgu,jt@>#ҡ`CeX(3>ECH#BTf аxxlgٜibV/if=䣞!F=tw3zZU썍ۆżZUelp-f)pXi?UQ$d![k;y;ֵ=i/ӊeɆ .96 oOOj_Y9MFS^$~sO:dqхc^G\|ݻp/fގx=Y2ݯ?w5;op/;}pM\spMe99?qpO|[>gS*-}o )Sr=xeP%]N hNN,9&w)΍z +<+~wX(ӥʧ֯ ӥ:ݥ_8w=g^6-G7D&BiDqԥ.ģ O([ʢ_dw£a@\ "\4Scة Sx 2t.^thjO|9顕#4nBo*F#zmԷW7?1_I]2n><ˉPq+$U2BX8|vZ4j꺁8y7i6pȺ1*h^SK2<J6Ω?~i[CC(āav|g,5|,t}G=\cp xձhJ-dHdpϮ);s\}_z$.hzv4#o~ :/%.iK^SKX<羽{ gqsp3KgɇtmG't|0cMޝ @zQ''H$X?Я)Y@ +NK5"svo#g[qP 4:-pO`8.ݐ0,j> T8h +LOhdp)T !2'4FD~p.pO/֬ ziw3UޖR#PPdȸP=^d 3/>Hԏ~rD td=A|rPuwI[bߏwQ &N@1fvCS5TǍ~Ʌwk+k57Egσt#:Ϛ58s?;iCȎ'5 XFc)\W>-HGKq0SJ!~|ԛL}|nSOFh^p&kw4j}OY;ms5kx*^q]/ GpE 5 ;C4ЇA]1%>ΊGo{턀nVC^NƋo"Щ8h4ny11?#>rml~3^{W 3q "-_i@ڦwwtr^0M hauZ8l~Ɇ-:,|ޤ~+"8~+3wD.6C]'A$6(+ )# !khnFzIqB>/@wMzk,^3M]jxZ%*2ҹ◵}s|h$F-Âb UaRZ)dbAGE4eKc@x!jcyOT#'~i+E'F8Az8XJsZY2B1(=,$XQBk阀r3 흮Sǁ4b|V Hxn"4rO~Ofz=?z[,T+MxL|{tt3$y9PsVṣ^ Gj?aQX vqR6^ F@+< Ij~iX#'%Ν;.O>=HfƒJ,˕oNwSjXq83K1Pjw8C!iG (_4t I4Jeep-fRWT&/brAB3c*Y]D/+qOXӕ& ?`SQǵ6dA\+2⋝ʙRGN6MTH  tʺ]0(B.,DKST0)yV9ÙRSTY#|F*59ha+pXLաƢSDq |RRX ~Wq(ƯNvh`#SZ_/ XT)]e*uF7 # mRP_3gvA$O hѸ8ڬ_j,4BFͽk-^u:POx쉭fxL%M׻R3UwwgQ{9}|\2[ rxjкp?״aDďzDe`s{EX۰]@9}]ta_L鐓P(-e^g w6:!Fk⨜|ҨD)#" vR^Î fC͹{%6r(Wo77_/'UnH()6 =fHӁȶ\wТ2 pd҉f:1y,Rlf ^ԏR#u\\'qܼl-q?igO;vNWnezw6w}~.C^vll1MkՅg!tr=*aqv5S7c-ʰuE &9۩ xt(3^f\Vy%uk9pD3 ԵkIMa?[åC\s֜ouz*" hS+Cf7ΫY1$?5=9/;#NXiGo=8jwI+Byh>3uq͔쓣*)e3oQOīa4F.D~CmwOжrxZ[cq}O e$ =E˻\-C18{x c4ˠX.zpC|1"%/nۧזxUg,X&#]wvŦmί=弅yH\}Ƙ}J^bf\\_ |ӈd"ۻ)yـi*IC6 r0HHCCqCxn9[$hE.>+o0x, Ld-gf= g0ZN7G5 9G=d>0* ]tQn!V\kvg ͆<8& A E5 (=!$I75MG{gY[fqcuO6ēc!EzÇQ%1|$t:ҶDi̝{e],N 8ڪ` yNH^ޏ:@΁G\tX6Q?{.SU N-l#-z.儓b!-g  aW퓶 OpQ^Q>;M(tڈp(B*o"EA臉;Lee|Ia˯kky|7^Oǵz5ۺ3:l¶`y3,ǁrBA[ *PQ>τ?ו\ӵFŻa˲.G rz'bh+ȟ>2G Pa<8x=Α`QcnrS۷Qc=o̎?rJ3 \By.hO.eհJBȆI⬿ %)Jo] ޵L7-ՙAfTuq܃`'^uG[ J\} eTpO7eYjXogKDx\|AD.>GiX0wymX,K2g-R,+v#BJ91 qC!)`'GiU (p,EW_t)S'+PQ̜{ҜC9.(t.t#[4t9;'pel_Amv4yᅚ0U .U`QcQxP|/zqe˃xBН01ԉ< 'tKXܼ[qLWBۣjն*֌Lj7\$/,;uttKXnw>+.ء(6 $%Ih" iCDqXp`eq 3<Ώd0Pόƃ5 
wl΍IҁL359hYIϲtI?R/ز]h`.H?{kSU@9PeYx$gj(zO4=3:H{FPY:i-۱w86q?A #.8ScTA 6=h1L^avi =>UIBS:bPÉ@=1l[֌84Z[yR7<*B=e8Z`qyl;j'izz-*Gl )ݠ]:@jźhpuor08n~Dwr@S6lrv;7x BpiE#P"꾴.\t?n%~,q\\+ƥȬ [xGHFtXa)VVA1>Sfw&%*C:^j ̵ڊ?KBLϔSp#pp$:KvF:Mu.ub5;<»H#qgH9[ٜr\eC!G>w ӥ5Ҁ4; [$ui脛xدق55;12E3L#c}b^ S=?~o]䰰Ko_ݚ};v hf / /'Y ((sFDШ yM2%oC3 =Rg6wL |=eL92\px͓Swj8'GtȌP9nB^8Iā2cH~+q0U6">cc|[A6p>RY0^ԕ׺}{Ӟ=zdhQf3 g`PXh̶H:]&TVúG{}/48Ѯz{aίxŐ@ؼ"IA[ *8N Jf-hѭ <6QF|ɜ7c*8]: ߅+=I#U:v HO\c_]iCͮ*cIӱ+81mley\'ff[ДlR1R{ۦŚ_cXFT|)@%(CWɟN/etB!p*d} xb~cK8H 14<0y X''g;wlppaJ:)$ _1C{{o|#ַ\K.53cALKn$ABQl^G42ctqGQhn[hx09 g1*,醨@~ 3Ҍ/7\zUf'.|568C9p̝OOn?,-M% TcI1JtzБ_$Apt^9] qr\<a՘٘i_כ-#+匋~1SczҲSEU]S><$RFC8&7@FTERxV]ԛ zyXm8fh w=:eE|"!m4Cސzl*ruT>967 }|xIo( !Y 3 q'3Ƀxޮ۶yLUe׿:I'"$> 2a (z7fL2o5]`s>=3OZoָPh`aГۅ.C[C$qڅViW*x%?D:2Sڧ2p/.j"-z3Gy'΁gƢ`6 g]L4Gً Y(Y.OVtO968۽O%E?gviwyN]'e%/|[5Ob]#t_-&lbaӳi^ ,#iP1KpzQҎU٤P#(fq?ՌO4Y\6Y6vdǟG:Sƈ)2㓟d\砼1Aф9s?A}r,su*!y H 9m@A=8O#rPЀ)yT bC2 ]Je̳)Q[/HR($te\V8Xd'-̌x-Gnzq)sc@ 9Q|2(\?.vVzOөB<1Ѕ0/YֆH8"t6q:G~+l q ]z.tNaju 0b4,4-ȦUڍ;vD @'d}iCo344#_gifolp1W v L('i:z (r$eHsԐrnos t뭋/wڠwN87夿%zv:8"x7'6<^)N:L&ߘS+hG=\_a-BaoRV Y 7h`bϮ_28~[/NGmmΌ:V𨇑ЀA9~;󟌕wyp܉eOo(mP'a%U#+6}xG`G#c&~ {e>X]>Y1݊fM`<2sb÷ߍr+rA5P=q%+'OZ5Po=+^<ѶHLl=w}w|_|v2=+9><^W ^++eCM/Ab ͆x饗:G)ad4DV8LI*Vj! oG"h'陴v\:n{DujMF皙.pX5R $*U^UE+)4V:F{33]jԙkĒG헝+x\í+hM7顜GGK[ح_Vu `~m"zCn|t88ЃE7ԅ>k%^dUx.zGx|JRI##'`q.ICx iK y=vfX&J],c骯< Ns LK>i9%{`plI#/7Ģsۣ N"N>t(g=tNq`h3)i)pYe ah}/ƵF^py|]'(g`\~ 1W3Q1]3:m7:hy/hVD/GejL5#^? v7; Qyl6 }}9Ѭ3]T>ș?_Cc02[Y^e_U' 0{QDv t#&Zt'c{ ;%o[^r=]h]W0*sPOSnc? yPZD@3BRN{PCӍw:MwǏ9ld({NMM,Gpԣh(ɱ#u8ZCBXb1]{)[^`6Sn%9z 6m޹{:> K0P葉hޢmNJG+ 洞ڠSBԼ(əud&(N9h7aX>#!_uSuF'ե/=r!q%UbAiW;|; 'wjfDfb EEv"ӌ#1D/,5E.iW>sm)/C<2< AäN NJخ 6vfQ0j]vzѦ kñx \ ;/] Zq%0-:Z{klntU\ō!gxbQA*\gq\1[Oߺ#7l=O{{挥g8~^᯶m応.f8rCV_YWIZRd[ 2h>w+c̊;wL%Ag]pˌG(N7h V/:#Bڎ x_!_^َFʈ_~}9=k_{M̾zO+_r|-؅邳uffnax^FE7„kZ9;f7e3^/?-u G2C&p60+1zk:o.]8 9 2-|K ~Ay!=6 H=NY~9 @rԔY/["Sy4 4Y3L `[a5p٩Egw?˪t+ 虛;cw-77nmej$SOR;/1,PRcRʶmhZV€֊yk<ĩG;Q;n..>nYoB38lroUt(:h!Zlea]/}u#@mxI]'F^]җ̊+vaTmdƦs͙ 6NF&9:qG&8…@u ǸA{A h5E>C ɂLGs+  #!X杮+|W(пRpMGͧ*ub]3tRXS΄y@{3w[fj)v. 7g=%$9rW1:nz,qLR8ivʵc]lzgM.)(W .K<$3 IπjZ'R/|;{oI0Wyde\ıxzV*̆n|Ƶg ۿ}H)%>vH\ G>Ɩn|ȗyT+N.$Z'i|6'Ű7ƛqq'ycF;#71z5qRGhаhwz}m(&N}IlK|#ŭr!+BxeE#q^\ܲ1 J qw!1ȩ鷼z6|ӗY?&inXV8/I2CξUvbiYєV p^\R2@GB$ДRGgsʕ\C 8N89R}U<)x ܢL95pQM$[ǽer.g:ҩG]*%p\ѯ]YW?^s|lǥԒ|?kܥ5 %oA#RI5vfS7fq:8çyEe3Fo Ag7&Ҷictt.*`;AH3\BMB _e.d,2UQ,'~)G2٦}~/k5Чڃ EH ´Q렽F˧d0T^6@IDAT0m3.0(e[2 ϼ&2,}߾6:6?ܘ%]<(O'A#MYN.=-Q>9@axTF@׬qS}RGOGt@k|pFlc=6>?Xxa:_\,?l3Q b|(gx<9NK6VawFJ 4OiVKUwb } ܢ]tk#(٤Q'ʳ#_:o)p]@cP1,+K <2)t hE x'G4`GgC2>(2*.pW7phQGT ޡkM#|=x;bûR;N_UNԦN\gyYkvȃ^XnFm)~Mc8…Ԟ !aKO<1^$oC&|' P=WuLC:M|o;@ye*N@Ayצg.{f#qi 5 {aV0Y݋.?T<.,Si).x[M2.L/js52g=-QSl us7Q |aȇa(l%_>+<+HX?㗼=I5s׼)*HV dvuEV<* 'Bʕ9s.D$Dz,@`,:,Yk,Tm 3_%#WEKG;gÞTEuQ R%'\m"|Lfs6~N\ XShᬝQ^;6K#\(OW,`Op8|fzxW~xp ^]k}m49,PB*SYQklp $`*=jwFrlx*>M+j׳̑aɥl ʽ_{SP?w G&U#Kk?P#uz+aA⭖ca^zβ?*ZÁ>:mhSH'4dڐ1#L=|z)|ix:*'UOĕb<$cAf ' 2*<Rm|ԢOq}mkbL >6[=s@Vڹ~:[Yh3C2+iPۂ34<1"$ϲ7G`."9GOԃі<͝`Q`D^7ۂpLm[YIQPo)J<tkLw=%G? cYfqsu[ut:[piCrr$>&j74f7f|^)_yOgXsGpW1V\OqfG(T:lnT{O7x$aIVdO$T4j8&#'ҕMh4t̕P'DN{gu T"DQ4|02+# A5{h;3.xB/ (q;8(Ӽcz`=RQȔL%sཫ-jQkiPEW*J \T5vt\04hT0߯)(U i8@Y'ĝ24bpv6Uї؃ov44>80oMvbap0D8bؓ_8{ԪuG)ʳiGr_`C'727v}.$פ\IKIGw HȎ~g ?}z}er>#~BVy5KE9x¹3;,2EoYF˰%KJ^o J#_|wGvqҕrH|}1}CD+xC~Bl*1b!l WTwz<7S yVeO򋶨H H\͋43Agg~v> 5jݠ_7Q;jx'Bo 9gr۝c02( ٭(/G0v]`pL鑵vAyf7T(и\ʇ X qH2tRRLzA>*!}%7$J[wDU$KpsDV(A`0 +uH9mԩ@6^>%jwaDLѹۤB6Fy1n-vdoz,_8,12apS. 
x+Y(eWxOUYguW;=5{̤hD|cd~臖39A22¨z;jLyFSp_<@&u[$^DkeTT…k:,=Ɓ2 ~R̩{1ʘȕ#ާP Δ1`)ƁOu14oXy"5j/Ne_QgXg0^#>$M1 9z&#V:"L\y=Aۄ.G4%lI|9ACiG>}\O1 $%aGOFЖ qBppz o9i;P?媏nyqu*FidδDK^1Gf-[.iU(\j*R#K$ܤ CG>14_Y/\zmOC/[lُ aw>Bǐ_4#)Y:$+!~ }{F-EźsKT3N=pH7"x&(Y+O1FR( :?DG i7§S%0H!TÌH{ғB0=0(;68#󡡐.׸oㄜ ӉMc*PuQ޶BG޳AgELQO2Cr7g# og c'V=er o1 No'6NUyظ[D ~`pQg2x޲,T{ёʇ͗0}!xHscHWp ظa1O%mțrQ8N6Vs`1m5H,]ZN(/F+C+^SVϖU/|{Pb/%eU:HTe#J$>N?Ӑg-(C-3(JX%eJz+ʆ9Lc0>?K~G O53CK|Gq" #FKFgrd EO6B:c L;Oҟ1/AC.V$E CwB TQa!?Q&%Na(-vBiGvk/ZX.ppu9GND uaհ= Jnlxa/yAZTn%ٴWt mpy1֔NtN&!eAAX ao  AJ; ;Bqh  J ƫy%<ݡ9/+XG=㚲VX܂_w Z?pUl5 Jleg ^}܆<+i ߐf^W.e"_=oB̬^-)֫5jGS%|UۢC´@geәSV-n3x'_dw|&(:ZdC'm24NQL.?KȨ ^ÜV#opP>Qpnɽ4a\>EMyR*Q52܊CRe2(z1By,{):xN juqI&BEϙKo*nT2U; t4D Cj'nB_q嫯wGˏKW򭫮OD_H Sk 2\`M)gA\-r#]:| ̈o ~ޣx|v)sWPuDf2odH'.ihÝGFct^}c?1z 3":1Dx|;˃Je&+HI{65K{ "@-A ڑ"3W蠤vz,w$4)Gv,䅒y6Dn\hCNNʺOxd1P4<h^w:8CiOv2rahPk::l F~6N䔒A@йߧQ!o:)(Xq[v<ԸĢZ:dp2 fЛ2@݈.uaG 0\FA:8~<8҂#~wPJO=YZwZ&x և-{ p1 *Y`÷~ex):9c1crX>,9`:W}\ d0Q)Ӧ(yP>  ̗C{Mz\]d\O_ ȃۻD7ionʠ,S'!8f{dd1hdyօߝyVp[̛ġpvyFmlmu2-(qOv~[&!3O,8!p.4E ~7sc!2!:hu $iÿryyjrx"#.Nz% 2Q?ZRh [2dKmE3N#"?6 4SsWCMwT5 1YHIށ()c}yjg4 k[㙡Ir9Ǵc:ZTׯGN>J'zo(g;*XQV9Ap+6Z<#隹:qhyE'y;bu4(#2gONҏ#ŵ0Q]PI2Bp.kN (MXV) yoW8t~`New_cl,&}SybD0(`5sf.cg|(,ĵJ#ДC(eϣ]z\=J88!8F4A >H0(mQ^pTgxp#|ɋ eXjfÀ7б\|!pYk꜃_G~rw(dx\x1(W0.S~jec >4 *P@[ҒvYl4"}w/+gϑ\K.7c2@GYM)#h%hF?L3 5(PݧMko𞯕uW[~ U[?W DK#@ Ak:ohr]R8#;gnGz_tIq22PC0cHa֬YÔm+/)gxFxY#>KL/C/Eok=tNsa,]0UG-kv YnT(e,~b {"D 򚜝 0) 4f(/6SΛy?)O+)0q!̙h#:QdnNhWNRb|2h'`=8r!i tYaRvԑ`A.& ? CN꙲PXf!Z/C+|td0ڪ]y H5\AN{8mx54^TQ !jeE*p*  R]xDFgc<\tF dy*_) l`cx21CDxtCdurg=|An>!I|3LgBp^JݺGxEn1bl@bJܓ2WQruYZ8igh:7!CNmDNg _=! JXy=UYw[e \?,]pqyF# D{4T=MH+ju,u|*C~dHϙ]d['FIg1o,]ώu gyٛᔟ?]FJ!0S't҇P-^Ui:XEm=VtC_GXP!|7l;ZM]ʮzuQ 8G9$ +av+`c#4^ cȘfY^v]ynWٹliX42:S6.t@l88G8c#u[8-v(CS٩C+`pΑ0oH7}CrD^ġ=< _7Љq,7'8h"8B2[]pJFKh dt)OtA7u $N[L<0jx7mi8V]ˏ#=^`]- MI3flsה'-1e۷=}No=:o4Y7@Gā@ϤL^줾`h΁dJUO^_}ibh8ϖDoկph O{3M1T7=;y&om=IftbxJV'rn,蟨SoQAhLo'X\B@v_AVI*́K āLOpxYu[zLrD_bJY䦴JrG~}i^Y}3=ORrLE>ce羣cw(prEseDEyq_ʃʆG˒rp/Rrk=+t)W'.^ОЉ2KϻsfM+~ix<'dQF( lƟS/G>G)fOi TbJ::M }rBIBl I LnӍҹJQսp>moQ.Y^sr:VrHjV:,?-'#);RV8K0,~'YK|˪|ɼGʷG)OmT]޸ /2%$=Y)Yh6nq|ޗ?SK .;KUyC\r0әcqv^4gB27 ''pSyP~Tg=HS{ ;YgO>^?]w@ Cw:GE~iRP#OFM_t:GDXm1>?BHO(\}`Gas{í.аzM'y]}^c<7\FhrZӣO(On>Xjٴun\F\eQ$\C .JfX%Bn"/ 4ym`EWD#SoA6=h5qXUuɎיSfINqQ_T9x apx 1 _gy~:4}ˇOעgw :R&F' 󧺳_S10vJ욷@YqN0v)S+wdmnVz`2\1Aƌ9r16nY[[)37n,njrK?|hXSΘz)gFtI7 p ֬le3ŧ\xhK5-tQwPL?'W.8W߬qLrڱrXOl++N)d4-oATH]{!oo)ɁʯBWӀP>AM)o{vvNQMS)o)\8tOD67*7]}pTcݾMy ƨ}'VW_hFTVϲӷ9z2O5j$t0 Ϻh$/tʈAh|t[wx|apum+w O8h!&)]9ƾ҂:1O[Grl2YK#HPV2#@'T#6f3)uLEZʈ+kc4yl?QjjߌF!VThe,<,4#@ @cͲ0QS6qHpׁeӾ%e~FF#wrf0Wh?2&ΞV.X@G(z^9↯P YN%NZ=@,rb *5đfOEB8Ee Oc@5Q1&AKS7kjW,﫦%F8NXW#(np<"{ʗD.2יn \NبQr;fvFtp 3xN94s"+oD},w,mxa\ pmե<@[#oP(t zoIJMWޏgm t+ Q  ;MoR@gs=QIA7 CHX/D< A(0BYx4Fׯ/e?uZܰ:Zf WQ!2m-*.96s4Yi2jP4ðPkӅ͐P3eTֆHp٧/9c-U %gKt2C9TiC t8q,w:*q>φMJ D{;GY؄fũ+E2n':t]tvkL>M CSF3^ylwdO0~awoPkwXe2^bo4p1.W˿mMzqժ2;gm0t!RbY=58f!T~γCn__0ذNL]^;^vrKlK;IYfh8 (V]ܙ޷7 Ɏ;u#ȅe%eFwfZ;n N4! pN<0Ix 8"ۈɴd*8HOSx*Gޤ󬅶;gqyp IC"`⏤Qv| _ e'}f`a-ݦ &=pB-1gQC\(,dDIcf)K".|m_)ێ%OF6x&tz΋ReG949''=+ <]2ޯ(5OJmEFQę(qE2V1tz+/whWдL[ʫXTxt+3я@GZ   ɗ΃|;^X )]%I,+Vp|p/F٬Ņb`Xʀ:R\HqFt㣜l&_uDSKba(C++%4E!nf.Ɏip6ns\tn,5MF֯z!}CcUA R& iJRyК˥M,ڝz(? 8Oҷs7 a,zO^ u:7T`ھOSoCCqN-ݤ…!M+2qNz9o <@y~a8D#Dze⁲@eVߣ򸺛2/ 6D4d*} XznӞoX~Ĵ,_2\x+G|"9{pRc,hyƍxEBToհ:\W#`I r[7ܝ|.}): СҾgjdzu(Z>lJW[2JnTfO+FH |$^}DaJK^8 2iOYV@js N8 Y<|FtRdh;!YqgvI+eFlOykL3gWV#mgm* 0ÀW[G7%|KX'WfhSOL~%s2e".q5I”,0L?y;A>]RFs&SewMO3&jH]_u ?Yu_TL8=&R4FkE *eV72zzWe$E4V`w:ob`[dN,Nct+ì0&1 |.18|'o>GyM)Jxn?Zg.)r)+\ӫ'Jqzga7`:=5ЩaB|# 酡{o)hE/.'e4-U3? 
[binary PNG image data omitted]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/hgps_spectrum_background_estimation.png0000644000175100001770000103100614721316200025014 0ustar00runnerdocker
[binary PNG image data omitted: matplotlib figure, iTXt metadata "matplotlib version 2.1.0, http://matplotlib.org/"]
䟚E m,(ܕ*5|pMƆ㑸'r^\tQmcv:'`l3 %; +<$K%ZO5V[+% s9n:, 0pӴ]M}0H'!2'.-U i0'Vg_ك[%^ ~5',L kR)s^KkƄrI;ck0'Viu b&̽ _U_ݭnRki,K-;by`;N@r'`g29  s C2s0Ʉcl8c;8xɛ]ꖺ[Ruu%'̄~^խխ[c:AF6 ʣ 6Lojp*a0x FcL'@(2V`> X+Xy&ݺ C#;cd<n>;h:##M˫2ϗ2|O( 费c"~sWHk_xЙppM/S-p2KӅOS)ҞaJW=` 2K N'H;rdUU]s_:lpKkM=0U锳 33l*AD;}ӹ`7ij}!M|c @AزʏN&Pr'y#u \(촁q:hP]f~I /.uL`G'~L#Uyp:pR֞l_>;3ŝikc;,rsz):2){ۥ9;'D C =jAk3@u 9ue>4 Z|Ot=9FP 9asCD09N޻-mb_P4*L1vGbj'R)CiTY5>(>RvG$ғNS FuFmX}PȨ夦D=ND 8&60/:~4t3^ ϖ535`^,?&OnDEX?Sj/ލ B('1+.rs%k@r~yߣ7Vo6sް^;1iOt!Glvitl,W+8|@<׾6\ R~hrc,ˀ,ܻC g7[O4zoN=8 +HF5gdF 4ʢQ/dwɟNdLx:v yd1p5PBh,N=V),H*΂T{jQQ^Fܹ:?;jr+* 4}AǗNѥA5κWL/C[0^c?(xՃD}":| wCgLKRq -#FRCXs J)2iW8`'E+g|pЈ>$4,\g63<݆K39]A3㚃kBsތCv'{}cBs+ g-#9#55lms^Eg2V>'8-]}.v| cN4oB]w>\®!N />$RPF{ r*oIǮJ[U*S:?=cNAsJqxv  ?n|$'gKߍ1Kҫǂ6v\ͅ@9H"R bqaUU $O^ěIe$;Y(;Nx*[i) N_X ;w#t-Xpm,B?X^ C(zV/'9V5*ۆ #0i4,#Ek*}Tc $gBL˰tmPRѨnim!ԥQphC4Ƞ8(8@s!y%!1-"'%GcpxРQghW- ",'M?6t5R<ѵ"z{~zo4]4 .M͈id?GN \JcȪx[Mg7~ݘh9TtJ T&rkww2TF2FmPg~7 F |Ty]冋g;JhǪܹWLK7 cT8u@3y@ʆN3t9gG /֭@|<eyG#$&p&L#:,?ɝHsa6NKo5ɝY͡Qy.d `xٹG_aQf :PeӈDdi]䡂OLhħI 1fq0ixxOp<]sb,զ0+qjJAgty=zޥ΄~O./C57^,_Z?|S^ymY BOGFPioZ\~|IK$tAF.9n@Rn<'<\es[F |gFXtĄ׬.\:X>2xKY}_Wzʪr.]eM'\vp m9&#9s ?y)o҇o:/ s JyK\ t8Ap,ٽJ G{0)!204n kv#~X?⍿1ʘ񴻩@Q*LjzTՈ@022c=x֬iT8ojaMkcRc( 0G#g 9%?j|h¹69_¹?g?5\y5PgsnFE K&z Γ'tpCY;G1fquu7|JB48\88ێYFV,grȖwlG1%\LFO|jk!۫%(;MK7Ϸv8YNܓ#?rXR6N|8#S6Nh؉,fry\cKB݇k[oeӦM'M{}.rt[S߫@6C'p\J#. RIҮF_ii"m#oi"ǢuD,QMO36\98-pOpų2(%Qx l8l"*[LM9gCK"ρWp"f z>#{qlb[+zxS5V X HbǏ+~z+zZ`O zWiQZ Rl1 =xxș GğC:y@#%/P8Bm iԟmSnm|,7^8X.=O{X`&r]xBK>oG:G6m?cd*;+Ƌ>[{ "DN拔%+:NSjjtO̫x 'A*xk6 (PF\([[ě"Kr!BmU6ʙGEhsjy#rQtꏝCVo蜺OSPqہV9 0C>X,n_lrjS83ad/ n{]`#B8J8 4t .7˖D$oWA;W?ёpzAx#ZGϳ ~*6?7\g$pF>IΔ:q32 ̯B!bnWm¢Ol4dYV9W_]lܸzO=S(x&%O>%bBژƢSpS Ma)I.0&+ :xPɎ(V-Z@x *~̣c#GOC#ǵ[,Z}vPvI~#l{|hThӀ>2_s ǎO,.[_@Xa"4,{E|\h1u;+v.Z1[V-(+X4"Ljw,^8X@2,OG@:^+kW-*$HXe]JgUT 79)œޤ"SXVnFS~):_+%SlqA3t mrsac/r(]4ol/WlZOD/|cv(s d ț!eOt1G'|5a{΄č'#ܦ I?m!HxfvpC/|޲Z[+%+׈pxu5ڙyT(;ggTF{A<9 kꀧ";;vF-ࠇ=SG6,8I3ܞ Xw!5.cSmf y̫h wM̺ qh6 r8pS)qv0`LPE4Pc\=Y̬8Ws7 Hh9 +G XzZҧ8C2$=sCf`>Ag\v tVaV3P\p #?Hhq͖ֆz=rxE BA(h1DFBD# BUei> ɓF7/W.Z;OG+_xXiC#OZ OGwa4 Z\3eW/*w$SE)oŻҨD~#僟yꋘR/+w}o<^셃ȟ-;-׭Wn"a !x|™>D?!sn?epLot 8#ʿȽ@ΆʑzOM/[Z^%%Ct *|b+r ;IͶp& 1z/x 0y&Ώ8 mAʀ&[eeY{dy<)8lb38۱R Wo[i"/d:ěJk^ rvQuY(Qij&(&`E9 UG)چDGrSPh% !ToXPRF+*\C'bM'Ag -0ЉךiН7S:;BI7':o8aFB,ND8x Ll`7_~?EٳǏ2nZ.]Z.䒲~z+ l@/yt.ƀR^؃NŤ䯼Q9_ 0ݧR޵L0M^ufGc3w?qTyOٴD*l_n77PI#Ĩ.]\,/\iWk=ąeA'vx@IDATֲ[ϖrhPU Y&x EPiR'lLf6jH(&H=`~GYȅ򗞩P1Ӓ~ǽ ^b/l^Yן44N˫]; Tswm4\ʫ 8l9;'xy2CuNk*] $-pCO@@I\;$4'~路+l u_rκW_\w ɇ35Uޱ̃bk^ٞs3j<΃?БQbnɑ;tE5:Ҧ >xdr![u o+\\y<_ )p\Rmw!:u/;?7lM=[3Lf>G yTu$4,c$βKh4fyxR~s'y?aTv;toSg8ٮvHAGͻ Uo;XuW't|>M&|*ƆIL@z *B~|b8$K"y ˢtxuﺙiGyc0_0b _hf(jX(|7Y-7-(JIAsÔ(X >F4$Ӆ|V t0šPh>vhLXg~<Ŭq}1jcT@ *ڞVS<{uhq6Z 5m"H^O2?(=FѱrӚS9h'`nYw\Sp:ћn]i@=Z{٬W C|\o;qyiۮa; [$mQNĩڗm{l\ӯى ?c>}&wiˋs"Tko9ppXNFsAXگٗe߁a-ڜk7=C7UFlf6 :sfDQ`Б9G t㦃qWF: 72A9jtXNڷ-(׾|N?5OBtP96! NN %x+qOWG>p}5xpgQa1sNwW6<8-@|Ƒ?ж>S w[K+x9dI"$9$܁7(\gg*nxtg='xˬLh&˸ጆRtO ]0zxܡK,S d0vp ?Hc^*N&32VyR&;Fa"SLGY˵|!0 Sa(=\mA~mSځjWOpvb0+!Z_U)A*m#r”GqL0XߜpBlxd+# g3:֟ɟ??nk߿)Pt gc>OpH~7|s+?3?S^WkZٙπP KiJiW+ 0:HAȸCY2Si8`h QTYlNnD0z9 0* Ä6vujs1!f~)z0E.4;v|EҸZ84rB-k^>Xrs͐h:x?5D ci:zБڊ:I7N]e.VWr;!]N w>ڰ٘iب™Pݘ~HGu(B- 3cZf;@ȈJg+ͧ$T<2@7mz}eP3L{q񌁲8yqwހ'Ζ}ց;|!8D}@g)oxOO??)iM f7R6|^GU||{/n-.ټNhVnQU(;'8ę [FOPD]|'4j:P'ưdUxyQ3_ d>j@y L!^mi@({Q! %f=+^nX`CfkYsW8gK%yu?˵iB_y"iqƊ'h#?aT=5qTx[N]dd7Ϩnu -ۇn; '7?%vʸ\`kmKSBU*z d*3F @3';L3>n˖-OO7\xc CCC{:?tF7TT8ngBACýLХNΏ(T/09 Ց*RJBWH2+8Ndž@*:< H:)0!Sqf {swARk/]U֭,9e~q|⦆,N3gg e0E3.k5ʏyx͇oCM>0.)Ѵ;s ,z?iykK"8iA/fKCgt>ST_5;`/oQyX*l;鐗A$"Y_h? 
AfM]!EN QA'b4 ' }ܻUϤvR^5 k.o)_k)Ku0A'|UWE] |c tb42iv#GAwLi'ʎF(<:0O]\ G=*lZ#ˆAHFY"(}g0T-W^ye.x8xi-y¦=T;:u4ha\UyK"JJP`tڼyU{[ό`}AAxDzm6le({he1XBAUYl(EEI%]>u8}za"#b]Sm m&k -.矿ls|=aa.F ΎHA; .NJ݊g){E*Ȅ8 /8)hL%3m8X7\p 0A6}D!U`\tnp$lO|MAC'a^=r4Y<$S^^.jy?~ ؑ ~goGM9c80m!Bx#qcN_:ee .}D<Lz3~q^8)AQn;P?Q.k{ޯDnzrԡp"řpM{+3I1wl]ѽM]c f&$'17@E^;ЪͶ˯pfβ{&@#߲";#5 0xA}&wBʏ`T ($~Anx@5B#YD)xtt;37!ipkJa#t <H' gwd`H(| @ttWنn8X2ESj@zJv1![Opyd:Ŭ3d"+)ӒD 1ddyS wkipF>6*鑃@ PK5ĊTr"qQ#%1`9xˋz#Wձ }jï\U:Q댗00u+w?QRcfۡ5`"P.pη4EGZv==h~|ClX 9"GV.bCʮҗuUN`,_#{c;`w9kH#I+͏H%|bLeGLQ`-2Y)f4S,|: f\ƺO/H+~לD'$;Lp6EDځ""MU 侈'}#zUXΚր`0 }: Υ :gsZ@zv H ܷ;{褃0Sz;mxx&lpb lij"p\S{S,zPf\uzfP[nr_1 %3s3"8C hNw!˄+dxD~w4VPpt1v9OٰSmzryՋb2D'^hz{֯Sq8ua\;29 Y݅x!FQ ) 5x&oC捅ҕl`raUlB1о#@vqhxOt<z(W1tZ*JNhW0 {i aj%GEz[;)2a52I ԰z\gOJg'-k(Qoc='k/SVSZvH{Kh~=IӾY$gJs7:^zF|bjlaXؼ rġ͈6R]p̦sǶjâ#e./K,֎&wL;0R^<%;{.CM/-+[<0UG#~=۽Fx~U )9z< M^#Qx Ĉx3)vZŕBlhI6M0YM'`f !1 B8lso UjMv iYrO'IuY#Wx:S88țz!c@9[s%~۶mþ ڌ]v8y@z{Gp9hz-]z@52^~6^- YB ӭ: ]؄Tk:DaoMѷ^s,z]w6*ERQc$Mf=Fu43g(W8wAu²=H3gY\!rT܋G-Z&#g:!Ac r8ڏ!zQ`IoA!Fv.`/ ҔF|e;V*~''uU _os68[vBbԀ@[ΆIx ev fg[CvuO%vj̬ {aGk1H0 J y)?~*1|ՎR:4*$$c|GYprSZ9{GȲ㙯Z\uS1&`(5l>4CTRn[]nNٓQU%^{T'<%zpӅ9[qhl?'_#Vk֤QLB15IL/5`kW9T)أW uڼh?ƈ׬,~ۦ6A80ݢ 떗w欲qb5(J5a1hs# 2@^Z"\ۼ\n@ r4{#CnLkR8m6\9Rȼ!覨%, Gg/}H/syS5ےڿrHM#R-$|Pd^^1.GɃ:2[=@saG#K^B1 R6q'Oԣ6oٗ{.Wry~̴ɠ :m/SxQe{r>ծ{.P(3䧓<]H=Qud ۅD>uz:R;.mOϠ wyluc# rt!,v)T9`v `io)JgzT ߚ=8.j+ [vŗ댈iv22ߤNN3Mٺmn"wįyxōɟpҽyp<+P$f4IX Zݺv6crM`NX߻ݳ_fV<2ci~fu0__} ~~oR\xEۚ-nõP#Obũyi8 6lh*T6=w~Y0qzT/~MyNȍz5Oq9T,*" Ƴʫ .LKϡLяh4榳zUzL:O>7$”0$WUsV,;1K^6_P IY]LlG5p9ޑS~pfI ϙ<Qk;e$sF,^feI動+R~Оo*+VcUP #8ТӦgth`L޺/1Ǥ ݘMsqOFPF{B*ώt|t3;)U( * *t̢*h |8R1 93ZW+/|rd@'n4'0j0|1dV Q^ʐ/e߭H@- /|щw\æލG޻>zӁ.:zVQ~ytts@*vȞ4l7qa9I/;"ϧ 8??9zd g Q?؎|ٴ+V7o0+iEG*wlwaܥ2[|P^֥`h %C/}%XsJޚNg;J0D@yş;p %A0Du# q䧾pDBecINwnʯuğ95d)$d֌څ,WC w e\f:0 NĢ@|MW*O,:h*e.Rmɞʄg١ޅpQᷭWۅມɊ}R&5-򜐣]0ӡbR8f7|hVAq^w\veFnq\ޕc*8T4d'P`@fN(daLMƕ U?idȊ证pjd~=ˍ pNAD Ix pSѐ%O:4ȜtXs@>tc¦6;P'SfѢ$][v6т6094GѼ;{5|B < hd s([:^׸GhQv`X[-'N+}.[v/PgHm΍Yod"{N5ii)/)+63qAJt>p(]/ȣck*qgxNGp,ׂͺD~*pz_hFh c:-7fښYTkrpХnk&NЭrщKHȐxG^o*eOpSp4e/O P[ܐ\E8Lu%;nIx M_M0h20&#;v9х4tWuR`(.+oVx+Sim"tL򖷔[o3yk^MlL`e2%o3]3 O{Q6R3)ݡQ`z3P@ݛdEҜಢ\Kc%d@V>FUaHc{%`x`{ G4( J?℠ i\Ʊ:Wa/؃ovTl%Q0 T\ =Lx(l.m8H4!$Y!GOfoܨ6iO +{Bq=qx2v @6`#$/pϼLyO~ޞΝ3!yIxp"9Rē4IpG:8xUZv"k1}CD%.eˆ 1mtFo0ϲw^3,:(^:Gc̡z:d6+ m1l~=BE^C/hQ0/1) xǔ7"+!@1($g :ּ۠#3dEB J(G]j'gvUpZǵʐc*fqaʼnv spݯC9RVʎ@傼gR8\2ppj E.tsO0^}U^Bb $oO~2O97[@L@hAM 6؂o};O2Sg8IxRG?< 8]_`h - 5.Sekl8pLw=SzH+_so)x{ $%\ga)>6:ˤ3ZY`_95gځ 6 iǎ!w8| ,?rg|h $ 7#;)د%*iQz^f윤sKT3+׶N(T޳!O`2@^TS۞d-8*G?4&kCKdžH5#xぴc|:Ug̵WZ8SQ6 q^G > tGB CE! , e5nv4teV?tyJp *R~B0MdCIl Juh}rDNzd+eUKZ.g塁ål d 6G#-9w~tlt,`;yUiSNҹr 8 gzVw|LzJ=3o-xv@B2Ɠs '&ӡ GJ.&Q3\ywL3g"ēn15,vrcM=S9|{ngq80m IP6P8i3o.y?Ad@|`iSlHWMoomƫ BFd9ahAYcYn۱ZaSO 27҄ΙiAk,u+!Wo ʔ?=?iT3}>w{)1b#/q| ˔Rju&|2UwB Jd6b 6:rvOd5Vt%TtJtXf 14@@hX 4ttN{\uPtL 3GN OWģ M {RP>ՐKSNDpØ,1FfO IcD@.aPذY;x⌼s=$qbRJ3 ,SGdCgK!4K pK# dt΄F R`oipctve=’ƍ 5btnŰ 5"RefD0C~k:iҠ-p;Ȼb?[aVGkl:u=YԱ7 C*Np)Fv"[zAN!"yN&_got΁ ,.pOBnI5;g"6ҠA. Nzv##`^>@G:; X?G\04 TqX 4 199Iߑ󍞠z_ l+dg $!@x;QF2CeƐW\aGQz6 10]\*$P֭pK5pxmA4T2¸?1Ɩ# nH(? ; w!*JL3c"f>x䝧E{#O=7`\cɘm! G@#Ѫ#:!5Z90*l Szy+9F:=òGeeA20Og e$dYehr491Zm ~v5ڧ֫oҗk@=ȀSw82F683oUŪ`6B ~>Mo )/Sƥ+֏]Lد)Qf}l/eyO.뙺uv#q h[wS; UGh6VC=5u<34 f,JNmzfQlf 嚄-gQЀ"fS!4p l%lv~x1ŠmXvZ:;0N,ŇTb \~P??)?H)β[Gށf*NǺL婦; קtBUcZQ6Ǐm?QiW6v#FVTL#FKbtAQ$8=A@@EWZi p͋KcxE OH((\rGnMW.JKwGh{ tbCwtSזMg ,\ʁY#@ߝ\~ػYl0Z=ʁ<#?=*U>tF Ƀ]0oHyjHd*ucA"9`}jǎd ~ta7O{@g-wM'|eAz2qN3k`}-;lԁ22)Ҁ6Z+}f*Aqah(6}|)-γݫ)\ZF!v~n rhKZY  ǀ7hc9~C:3a;sEt|ceG +"'2jU/|ŽժGδM7&G.@fԀ#ޱ /!s$ D4Uus! 
D_ L=lV2etd[ 0Cl?ڎJ͆3r % AV x&d~vCx~g0xX#n aFYƊ,C}:˪=bwܣ9n٭/4.Hgu (2?8H}c 9J59LF&wg㑊 |Ûd{_5x`bC/\ к!Tx!sP |?Lըq9xz'X4<:`*?k:rt.q1aaBfID0DyF,/Av|Nh@GfM;8ՑD޳Q9orR}z@xg{r:d Hg9$_*mXq?΅9p#ttL!aə ^R% 23d̕JmׇviyRye /26N"*?Lسe}!U8K]<AIT.{t4B83c|F5x<ţ,7daڐebx:]` P@\ Fzq0LgvFzu1~хBP}.IսCm-\Ǡ}y ta-[ )R7gxPjСKF E@Je'6KAЭA@E C -2jAB*pܝty??o]օw^oIʖFIJI衢Q1˖["? " 3LLP톿O0w:F6 Ov_)>vhdEvG> Ç({jgM/JeA:oxv/:e׾ѲpA_З,հ:͎Gt>6Ĉ]1p+F.`xBųGʑcu9r9yJx%0:\t@B:V14#!I򬍻ol_Խ+^qs3S(p*h=zگcjH}G}G}O)B/=Ql2tQpQ&W(ctlIa8ҭG+! `e=lg*鴲c'OOT۟x"̓q!>lxasTl~@IDATmˬo?w6Uf f:]FCpJăNsEYPW[7u^I ѩ% *j X&sRU6^Swk!LWD.Si_'O dSMaxI;Rix ; !OPk`6Ng10!g}t ~ yEB́;B DiD$)Wd8"ၧz Cu_vVWY G)Er ~@wjc5]嵗#D}`|*?{²EPńx|)[JeW {#cƋ{Wl?r ~>>|`Ny+V/"#zÕNtHRE?/5X{J%VV*ך/h$ш'L@- PmBe:ovdQ!EWwQ^nͨNj˕1$o8tHgu<ķ6yH~sf đ#o貐1fh-/8#ik2=0)#甛zFع?54(!ŋKF[h6 ưyfC}@[;,e3k`DpT1٬\2[#243^ M?@sϟ7cHk:6m!qkAT)tA'bm_7OGϴ _iӎ3QCk j&q5J'Ux mD. \JqБ)XՂFMF|e>Ky7_ {~ܽwKٰR5w/+w<2\h^>Pߓp#hdREYQlA0 v{^Y(E]/iЏPHQxԲ3{B9 !|1>$F 1թkW;rww{!t:v'o+F683唝:x8N|c=iIXE|$mΔxÑ28'ʾCɗ-LD̫ĴM> ?yVhB"daUg.G%MOqLA?XD jD[8㟃 9+?T+9Wy5in-O_q]y5eed- 7\U?Fq;NlӱPdw;g*ˏ -6-ڪHrӹ >k-y\qZ{ լN x]|"2=3eKϪc 4|@+ Gr>qKΙltPHu¨݃@abXJlnufEiq9٨bŊ‘51ށ~(5oAr2`28mxYAslr3P'w\sRY`bTLJ;!,l<^g'VH7I#tmGxq8]WޤiVb[8uDp#eӞce>' ʧ-Ї{rf90h?See*?J<`ėT#ڵW&pST;gG@ :p0A|H.?yoyaT~xN;Wqaz-wӴ?=xۖjHӝ/QvJdZL9'Oif'ϹL;d|Y eVu.t< 8]!߲m?ta h]j\טD9fuV5rHԁF#C!ՙ*5g >X Ɓ%[d'şdjlHbY٫O:oaa=[^.sw,fͯt\9way?^29Y @ ^( Dz_ 6<25. ?AI.ܓ\~'W ׎mW+$U"C , .#NmytD U he݉=+B pmÏX,gPµtvL?g t3 #:H,/.4Y׺sNPGd15tӣ At꨿z+˓=R{(hQձhdg@| xDss|4Nv6zԇW]-G~)71A;QIzdRW*+[#O9B^q'qpIgA9X/~:x|?UztBgq(#ל皺#]EWMs0s6;9u,nYx`7=gexy>D^yM,1 1p9l|<ڥ֑~!&,<:Y FvBnW}β .+ ..N.*'ex[Y -$/ws4_ t;D(p≘(t["8%0CHOULUy5 Z׎eŪu[H]jkb]f6d ] E&-?Zn;PdXxt̼U/d}<8Mt1\uV0!io0E_0zW-l \R2Le$p t)+'`>CǍtޔ51St=3=g!bQ<Sя#PF-Fȸh\ExAఃw k znSg^$/ OۯFd ;v©Sm1,ɐM`Gq.:ӣl8&_pCаMl;F+܉A$BDǐ^:1Xr۔0zaG;{Öҷ֬Qɡ72Bltv0 X^XyitS~!d2J]1~y!#٧%K}FۈZZ:N)FH Dٱ:Fg\|諟/7SytʮPN`e>U#&H'1{=3?^|Ewˮ)e_Ï?Txi.ވL(kI fYS8)-+͕EtIG'LPG'p ˺W{G6`,mgF+H)ʭV'N|s/ϑʷ0J8 Fla6b;a3j':I0M#ZpbBp_(YK߲κ,2vZg4Xʸ>% +}Cd:9-w6yY݆eG2xFZ՗e)Jj%2)n*v׺n1䑭ȼlO>r,ܟc3替.,$$68ցJmcz{X^eWgw_S~w~lX qN!nJp+.X'6HvH;Z/ad6v n' If[9I 锜 W\ 8avu!H&!~c*wpYe˘?e7?e622ˎ1ۿ3z0p>th< sC̀IEYx,cpAq1#V^T17txz>yOhVaR9GKۏKyl!Agw);Ղ~vu%N] &jӥ# CY[+ytjal1⓸(\WsM1'hmP#=&^{3 )CҐT*U%% I2VYYAzzKoahѦh=,WfTꙣSJC:+DgE':1 fpߴc/Ӛ\Ⱦ:|]Q??g6ϴ<69IдQ:mz8մk1ٺ33[Q_> *3؁F c.|Da x[{wv~c4ur1xqٚlIN ʟi?SYdsֆKW֦HX̗/숒Ə9&_q_DlK4`#gWdix0"qѣYf(h%AC~:dYB;tGxe 5!M|{!{%-5xNtzz1L/ZFHt2Ye$lE'MEW%g7TVa[0SS{˷B*p"'!y$wPT<ƗZ[Kz|i&8˜BVD"913AHeC jRcH!%8݉iMk /ACxZ $/ѨuIzmsIԪKN/=l殲z$ ͻ\7;Xϫ k3/s@L*#kbc${kfjţ)vesڤ>NĖB^9WEtHwj /pLNgа(GBds0dܽѱ:aߴu_[3[kӨ^ϏxXcLSD٦zڤod %rjƤg;8UŬ`t7nUw./mWݼGp#.__kMWt%ofKݧSds&'"M猬s}Ykc6]lNK3^qDpwk}Q-a o:Z>]S`H- Nkh3!βg`dPq2.9Xã@^$rMTD7:s2M9]G ~ʣ+ז~pTNTN(͌+V#SY^(+?ӛ(<2U6tZ:)7Zs, NP,|Mّ1N?iW {fj_mlr2(l5uB}q_>/vZ R3/BF(2T}i &IUaSYЈiQ`Og:tTգ't18y̟Q'K0# k:xe"$"04@ Ӕ FxłGU ! 
e&3"U:4>xT:W׺sk^u84=bizfRJ-lL!-7\2,׫ҳe:w/^؈g˴iSO^8Cہ ~kȽ'6Ժ>"޾rRxkď8ʚ7 kr4͕m.cܑJxG6{hyX[=jDfp vjE6aCŗ550U28:ZU|S(([Uw8Hf)fT>9t[ _~}hiǙ^:Yd )SHt8Ө C:#l:njZսx\/0ȑ,rm Ԃ.S6c :{(iTneTUNSضGt6DM@~`v!*{z{[J Ǔ6ta=B)3Y8Wx&H6Ew@N߃+V5IoG/##@ 'EZ/: AjiD,+}1̧wGC*dmK8Ի|M m<āU ܆5ԣmG CR<3dN-fuP"%7y,dg 8~#WfQ=y֊J~҆1ȌͰuӈ@HuH6Yq[ RcxGGJF%gj?J6Fyo<:~^z|!Qi FCDm$Nץ:_;`X~ # ^gFtt('1 kȞ{l3UEtL;,K??^W yr~wYe6/\hK(ۀFܼ9wr /~ԑ(uT%ެBv>~8t8iDB5Ӹlݏ{`!&y7-nY_sܲxT=rd4s V`yV#hfK?h3@f>"قú^^098JSQwfdzą89uOu2VKJ$o:YhZyw[a6oЁxMӵ]ZR> ӦO+.ğ,^ɼ#ݞ%'OskF&(uBPCNj',6M+4ʝ}ːNkͨeʆ *y}c)2(;dff97Κ]6 -=Tyfނ2Uy݉rixSuKMtބ!Ezn8Pf :lٷwiTzei#1)1Zts_<7qBS揲r@Y&+Be=< `A9!q=Bܙ2!3]ve;W>`NM^:u91qƸ3SG߲B!\ u4T+we R|~N9NNൃk\Ҭ^5(m-K*pAyU;;.__93ߦrs3%(%|"D۶UW/* 5*7:Qg@9L1{$ y1N|fG>{+l,@0\5̼c߾)ʚ}eYN-;heCZqT["kz^3<$#33N*kitS|+iS+'+_ʈʩ;.>Eĺ8xe(g7bFEc3FnBGy CA >3L"tGy_R4oG,u;V{|ѯ1t mŔuIDcKζk^jI]=,@n:(C= S(sG UkJE,bBASPkA0Bi,Y37威Jlt[fJ u `w~0^kY:^w1pW(FNaԙЂe@~8q:AK''a>0dlGkx/(< .S3ά`g: xҥ>΄dtp4=Л3yF'F`FA0 \c:lV{ơpN˫:n1(x{Lq0I,L%B!d4$ O] ,\~9^:O߲`[d7ic1eP:Cxj$ /u~5x# /A0TNtG8yaT0K2E!c/Wnq=L5 h?t,gW'5g3F'Pv̙eW޹k_ٳ!u >m#SG)<=^9E)12}ϝmhᔆ+>9|:vY#-q^#?g{!ȧ+2tlqYUމ7g_R<.=UnS2fK]"tCig\+E:TC+ )|qENKeV>2:H1,{m wYrv;iG;BB{ #BpL`V{2o_qV\ƬXL3ר ਰ̨ flpr ,\f[Ϟ!UJfh(C!5N3Ո04Z3Ń0piڱaTy`HD52f7NV`Oث_TuK5_(I.S:G91v\<)CyrlYO(/bEr/fX;:4 Įr:&r36sA|C:;L#, .x#t4~yN mxO=7aJ28E8AbAXG+&PQ=BXg)GXhlD9H3)+e+I҆lꜧ?,X6Gu"F`$`zr)ut*I_ϨHPy NAt$e)nDxثg hAC7Fg(T9%Nc7TSv#U'jJN#?EYb-Az y+;A]B̋xc tx$_r7O/%\GGj3At8V#H]6l8qw.sƭ֛5gwGՙ*s\)tf9 n':aQEb_I+pYvf:"N&Τ;d x&;v'G$:zh/mDNotLKG:@7Pq&Mmӫ1AJ) v3XÚ! ޳:3c3%NZM+d)Ѥ7 F. )cP>w$@wg6FG8);4IGk7E|yBC (H/:7]% 02a>tҎ(?jB6l_ڰQEa75^'2QİL-#vFb[!Bku홲&_հzo#eʹJ@ պtPA|a m_u㭝R=40H(5V(Pl$z9qpt ^UG)Tܕbxۦ\;KʴDHfX%Sg =F2oPL%-fӠ~SM=Z<ɪko-PV888\|5(tvWa? e3Q%ȂQZ.F19tD\&p؎hd't[YQ,Sz#$s x:21ҧ{Ҳ<|hّ>ZiC&8#7>CFܤ`4K@=D]!& УY3:\xֺZ}f|ϾitfmxDm:"n-P #{ ?6!6xHCN p2ji8-]N]IP;t;θBUjMv9|dNTЀ97?`D m{Ks-8LΟDFYi|R3*DsBƔ03ghjV*IQ&FㅊԒWP1m(xNQڋCKE! 'Z8ާ#'/N1fL#N_t8XtHλ&PTDxTn|>8XV:‹VEKN5TtN ܋Q6hT+:8T9M2hxD{iPU+u0^Antŷos,isGvdn$m$Iɛv6ױy@~x3 ")GpgZ9|t~n ۳+'2~8Xl|A;gbֽxL+u!1=>mԩ,aU~H2&Q6k  mdž r1g:rR4Qb؝zizܴ/)C Q 15Tq\(  praw;`yd ^4EiZd,]=Sba8?.7 k%$`GE(ˠGOMrC-<0TΙUnyt`R_TpQy#O9 8N ,F?;D!K< gk?:92w$zFv hKZ춦Y:Be2Bt:re^7"]:^0_9-%uBڇNz'm+:cl4z۫^&:k2:/8:)R5q5NLF}Lz+S٢/-[,ˀ@ڗQ Dehʏ<:qƗ]geyg'Qx6b_̯ 4E3e"a3pȅnv%rZ #G;,:hQ uVk ^ޤ2D[ f>*ʦ{ ,, Hx[y7#'UH<( ZGx]-^{75Çوf~+R#bۋH\ED@<=e!&VLbx/\5Ķ}ϐvk-kgL,o?^0XV`ֽCEO}\sMg"AGN9gpAk voDϖopח )ui0 jL;i^w^(ng1ㆁ&}v Ph_ 7nS# 'ztaΊ?7bٺ҆Rtm,iga܉yʺF 섫Uq&yڲU {oet`ktH~n i#"_SjJs +1:sx>v۷n6l}*(\!BnQҀU,_vjMq{ec  #3Gzi)|fBv 2^=gp;4xG ~S%0Nx`FfgЇe+oޣF4RĠggȰO#ކ "\gQyA^i&vf$^4x+!;tEag*ÚR0ސ^7IIِalt˩R1*Nu<]_$;a4${. ,'LX)2i]>Fف"})OxZ4oA^?&*3 >Qټ#pޥ;Ȁ#}W@Y8č&a, 5[:.ѣ'*rRQ?wޱց=8X/VMcM} UtD:.8%`ʅf4y/}JOn?85nr@mue?Đe[_6]z87'kȜZմw^Gd.Cq> BnC%s|R뮷@ܦkj2B76?Y&l \^XてdYdV'uh@9֙?ܻ !UzorŌ6 5Y e&BuǏJMN@5u;a0P~Oo+aCr΋ I&;WGK b%4'FSW# Qh31B* NS8S?Fo9!H{2}}u1U{Rh*}aC39 (odX}vĵuG}vyvpF#Hl~`*@IDAT5@( < n#ε)idMS4}8;\ :Ip~eٌtzxkT18zqA&;v,<xHp 3=P}S}@FNN7n=LTJ1$ 1S4k;If AGEZYgDoy*;, JQ̐Fg#H(IK16Ҋ6tM;GE\Fظɯ(CsssMgtQ&R`h3Qn6L8]ٲryTK7"1#<,|"dɌ 33l,R^:ۦpukr#ףj*՛ xE۴z1Y͎ u^;sfxo=mv4.'֭L? >F% R EzXqvЛA"\pPvfݶ-e}_)-+,'|خ&$(+dy;AQU6(Kpq3r<A䯸{wZ8%!O͗|4N!tJqX?;*3B D+ gFWl8B"c9`rªɎf?k:ZOaǸ9+G-;~(ӝ͸/TƃN!_Olɧ;43皷 &iG}`\8P6dajڣVc e ,1ӞLjFO %"…W>hu>)Tp9"C&A- Š_FSx䢃NΖ!3g:΄1pgWܰ,9_DD9pbا+,N9w E1d}@'R/ҹ:}T烂c{ݣ<9i k^ȱBoP g,d Eo:ԛ.ܦm6fS43u#8 ֆGY2X/6|lb}RQǵNpu_gPj=7t?G0_՟+|s(l2Pޯ8soDŽ&wJ]ҡ"78x kr%gNWJÓ?o(1\+с'B`]a6Qoxhx:0^%xzg޺"}A5Ms! 
hW~ȗu :-4}lgt~JOa8C{GM{Z/ #/]ˌ'-ի7\~w؈7=T ?y]) }$SсwVTcx~H)^=(( O%"FIv;jpEϝ#'U10ya 3g7?Cep,rN0Rl G5;FmC ϲT .3xn=]o;O?rutR)C3QלNfӹL"۶S{/X& ֻxq9fM0CgY)κqL83{=p K &/e#__ d.vj4|,Ky_Y< &"Tgtp߰Mǘ;@KDyS`Hg돠dDs +0=m=zpyGگw\F.qeeF&1?u31&t1m p2]#w 5K ĵy OR6`͌֌A2aÆpBWHps/4gΜsϹ1uw"xsbY:jitS*Oc5trtx-/$ Z28:%ӡy#v~U6<>fOgOlФ ( zDVt~_\F•q5-bQCy <}w\wɒrxE;eαȑـtN <'!yM<o甜[^%rK)xG^ʗgP8E|LN>۲Ψ3>lヂ1 i_(KBE\ޮO9^ͮ hǴ<[:i\ ;Gz=m/g\ג%"{xӦE7. pGYSwMENI8u۔ ՛9^tLfes@_r'~q|}pm2I *bOwpbtFYvl8$B-䛼,bopȭ{˾=}qŪ b4YI ,f (BΙ6iͰ@Ș< ˣ׭-wxRWSPD:sjHQ\ i잯߁jwܹk2|k k׮bŊ(PS\te>(+O][ Ge( 5EtM=mfLS{йNySt )T,#ּFL8!GSZqCCx\`xUа!So4t R3/S>i~Zv﫲w? 2hZ{jEp !cLOD'wvt 7n>pH A;Hѹx<'#myf##D(y ƍ<Q)1'1޾0Te6>77~+$i8(.S YNڰgG<;߶5M O {$/~hf tS9GmG sLSU;=+_B?/rF1.8Q==w(z;~'ƒGv/Hl=i 5O.ʣ]s}\ԧ+,:,kEUDnYf1]trPZ INV.G˕ʀF_LnMUD9jv ͊u|>PO?^&7w|o" x@}&o' q2ߴ}W*?+ f^C u1@rK?  37^ /W# [1OL9wȥ&yO'Aɉ*v)G|_[\!pRZQ 2ƭIkf7= X5x`lpH2uNi#l¥L*/R>V֭[p~֯_EL5B~PfOJt *˔ :<8hQvEK( fl|03-х1kzձ F_T|=fmy8E#jOo?7hiH3#3܏2RYg?z4DfO8u˻reòoC}~ݘop~gT&ܱͯs7f~|]+xJzg' saNerQ((S"0yo SGw]\^)ח1=N1[4}'OR_fXYEB݉w+ooer9j >^h __JX G9h8mJog_,{y(SF:NtǑ: 4G2(rpZ(gtF]a\Q/j .(=sݥߖ"\E n} QLvٻP{$Ӂ;qsen![[Fʓk&E:S5iuTw}LCCHZqv%G2,=`wWl\rN-qn^t',$S/xB/sj?S2>&3(⑪RhOQH+{=j9My[q![ԅeaBDGZ[s^}o'5 3H@ A6_+Q#/EHG&O1 ,([ua|f6FpB<u#&3*җn=Rwљ Ou͈iCFl,/EF d%aƺ[ʠKɹNU[TBӖ)"GXk o0';^>:A], \hD;DǨ<( (-ަ)ysŗAxQ! 'EJq+H'x4^3Xnd_7G Xnfe6U| suSƮkZsprɒWmbhRokk@0=S&Q>iByyvpfڀGm4vh' g@#;ɡH@b񢚘g[}~3X-sryk5a)d1YxyS +> |ʠG_t_ٱ4Ʌeb&&+Ҕp^xlge!8,_x/ '$'Mbwl{Ew,]Zƛ*zq7h ^([T`\ENa) ]E|0dGq̰aByc-ϵ@ M[gx4B`{?^עq;m|;5\͐<0O ϟ)6[#ǔ/< Fe6FŤ_] hŎ`q^ZMͻSA85y| \.rL;4:O~']k9EɸU>iBYbaWo})=qf4ŨreHh74 y4+8r#3ǐ]:P|n[z~oyꋖHzQQքhP2PUrF>:' G{)9˖)d.2/"u=:သs6[!ԧJW*E,(* $5 k95p x-#,307hB)ǴqyZݯo/0h( $gORMy( 8xf@ Od#C/ >e=qxU,_0k J6|5x$gB7PZZt/Yx2=φtܚA=Q,?kNA߲Lɧ69еa7͢j5d1dBB8 ځnýnUhU#F|!4r`1)&:D ɉɒ:vBZ@` cD3'\fV֨Oh!9B84F@}?!\hgvj\$1WðJ<7RK(L=1h Oܘ!gapv[W|L|vڎikGCB 7&4uv˿g*/B;ΘȚi|H)3MVHreM/cwS)? 6~~Ap\' & tAMeo+@΀H#<egqNے=RV /Z9|e|^~~ӞrZtdՍ%HCzE͚Rŏ$sB()HtTzԆH`)s5~u3o2Gf^&BamW;9ԙ=(ٻ-h`c XGmmik]/֟e] a=4۴vnG&2&^Sh\Xqc:R'yY\g/`v݃7y|4SbzдP6tr3<Uݖ~Leg̘eH>JK>YLjCWva!־1"ۍ;f{a L(P7*xT 9E # :_6P2(G(FHgc%̀=# @Qc鬻r;/mxޯˈi7b4 4Yi]n$J6h~鄚7? Z2i+<#._2tɌ331x9ɵ^[Wj b#cyB ͋ iDh%yDG1IĄ05Uّ8€ӚlU&M $Z[NFrL+q?6•n*O3O@ 8 FL$sp?tKRZB᜙.̛ѣ;XnOٳPW(S(]XT tdgd >|"kk$1u m cꠎǴ9oÆW6J $-rρ9A]g䣎h6er޹ʻ,o]5wҮj!~ym${C ʃGJTeGeAG"n8jTP<8SV;Nۓg>ɸt"y*`L h15L9M"B34t<󝇜`2E* ]'nqF7 hk'#:S8<LtZ)]~N14ؔiWwe \WnC{Qi0 ڂzHÑOڢE\3:6xz#%@xX`CTDx.иR7?5NӅ_nܧ7u {l!FJ"P^)#X+|sn)7ZtyN[w]e+/BRW%&icIXaC|3MtɉW6OX*1_ i c-V{y܋ۋބG|i zpsݎ|5{1I;$'=׳g*;OȂ:<#D8#)z{]&Îzv.cp dk΍Fg;,qg1+JI{U3f {9?vVnX{zgd2S$ewHXO-@'()i!γ z@-HtEt*Wq:]L3'54-?1U#?iyt\Y?M@zaΌf|uq8c@6xkGM|3P $ugr"L2ٵs[٧/"o /2Xp!7! Ђ &E[If]6p 4U.4n\Y$a;JSlXXhB? <凹l=8ØB }WޭS;<_l$41;.V}^T~hxA=v؈eeC'Ud9K}zq^~}Ε- x"8X1Jڇ{r茲{%Eyam .3G)X6\}Mvyr^j^l?3k9*;HN bCqq:(Χ4sRKǁю!F}`IJPw-\[ղf^ ;tMa\9HɸÃ)>B,t|g{6bOB I> 0QۘK=D?,䙼8t ʿ[?ȜA9[t83dߨx\ }r%K?S>M8;MW.7|u $8*啥KQS./iSnqAѐImzZڢm e1 DI\iqNME{_sjG7)t"#g/Rνy1g0`tHDFY`9 99ҖrΪuv&4tC=El\(SlѤyU$"BuN~j(xyR\PheO`z"=PkȀoUtFutS?~LhPh١& w\1TQSIlA1](Aa܅Y./Nꀦ(+;*_=dŝ=|H#eŜ}7,{0ͪwD7M7CC31cɗ1OxIL&&>O h Q8D@f鱪kuSoWUW=kxTc{9Z\W8O@)N:ammOT:p"(%b-ebqQFG1@?Ge6|N+EVh8tB+z !֬{.JBȂZ`'eLhsDx9yeNҳ(r&(C?*(| X i#N/yH !u-eŸC\zE4p=R7tw\ڧ^q&\LQ% ա.qx%~ph7-i+jOOiaEYT;nj9<2 :~v 0jCF^?(_{d@,HkW"8=2P*, zMk3 (?yu0On vCPX'j 4Ƌ3O<ag^Sv{ۿw|ȡm(?IUs!aq2ytOZV%ȯ4#~I;i wS${4taΡi*vb4IO mUF&Q7Y2 4}q֚0ŢF\yUtrP,h(35n{y].lv2mW-g5JO+kS2C#*L>?)˲%#3Tw*? " ʼn{Q(U(.FGP;BX0 !A+,h!XRdd8|+\竴r>W4 h^}Z`whRt I¦G"UOԠ./_(Nv)===Q2|Q(xF3 eSO7bhh y_qbT|9^/~?(QjZr| jdt .8Ų%)IO6/`S\43u+kpg\9?xY\ 5-{k@Q1T(gh'X>yvj !sl_'~JӠ_ъÿ#GcQaQWsamx%! 
NLQ dS6F 7Q4)x4_s;P)L_joEfZ֮t)c}L89 Raɯ^Vh WEwUa0PDu9h=\iL M&^;qy,oimhAH$Ct̨ES(}{t[47tRF e~ 78O\2o),?Ӛ61 _<鞡|&Áwɋ+( H m| Ǘ>r_A)c4w@.AsvD#ȘcCyQf `RwQ2_R7x]9ءf(>9zi:O FOxh|ue%C`΋a;qȏ҉LϙoCnDe2[6O z.y\ʿ}E#gxfcOwC~ѰX>t&Lq@nv>YH<)8#q_t^y5r 7kḵZ0$3Ũ^bQ2zf9_es;qQ/FdMS-zآSI`.2hpO,$Vԕw%8/s i%nc 7>E \Go<5Nx;`reHޜW4rqߏowL/hoX5#`b;P/_tL@{>DJ^"(hP\qCL3 x3^ :Cxո?8?G3 _ƪTxn1ZQlml;h{~O'??/K.Rca97[{=EXH~F؏gu>,W^yA 2)#Nkg[*7dY"bl+WabtaE A1,{(^ތz'˱\7@@+)O(Bfa]k#'+~.}[ʱǟ=LPQ!nPx6Pfڂ*JSPI %*}e5,"azg\?ң095č3eLeM˽pY u8E_Pfw SVʯGDTv8ChS4F֮gՠ2ݠ@eL$=2ZL{ѥO,'ܺRpˆ4G0x'-8KDYi)i(_d,xnYNГe7g-G_Y7F}vܭ('}V;e4,N9{)12hdɗ#24a͛ oZӉv'n#f JpjC(* 4aOg{k=Xk-a9~ӁPʗNLR+p WISԁ4xrYrx9,4ˀxr` DŽwvyMG~S ;>bЉ'~Lgxy2XZUZy# UNKcbH/v*wF!z@@ChŸ#5\KX o;f`{$ĄB_} \PVZ\:bW/]g*k˝pԊ{Ɯ=ea7GCT[)'~(h7IJ82%`^˜'n*%+ " z>JE%ӂ- ߣy:zҗ[`r亣E aĂgU`mScs'Q *3Z^y'А!8xB Mݏ crr{|)}v2xP(z(Kz2(\\,ANb%PҰ~k"~gI73C:7ȌZ83CP4~Y CB1S;rJGLi8 .9dzYjgHD6udP GSd q;2@ +'\?>lT8GYW<ATtoM-[&cՍw\9pO=ӨBD."m%",#2n=ѣDž#7-\wGw"|u+,S6!K$B}_dE 9jzA-4SVĴb #|XBɖR9 [xHc]%av'9嬋ʢ%L8fp g}mNj#mxޖ=8, q,45 b 4уN~ņW`r{!eeT+`ʞa}`~y6_b :i0<#Ac*Śg'SÇOH4YxARLŝGy~-KswlVLxA^p6h5zd-[JY(L1̯Cj(ktG=~sSsW,Mұͣ2ʑFN!.`MCt80r¹SWCsD"1qIv4P;l3Ч\s3 ѧu<ۙ=hd$ieIȷtܬ[p laf:Ifܐ(ШSp y&B,4?9ߡzg<{>>@xgX&|h_/xBW-Uj*y-866A#LZɉ87l;eǭTy&?&r,`;ppK,IxwP2Mu~yN0͝G'$lB'8@Ҭ|3,Kq]r%;uܬҳŻL;9 ?{$-.o(hnFlD}I^*Jps.,M>g u)7 gcc1J@x:/?@* ;2kJʄ8ѡUM\gYz^\ѬCFWZ|iݍ+GR䤌YD,Fn{Kp2=KF^9JO!=4Z6R֬)}\Sokx/WeIBHhrJ#F`cL0 G1rZ8sS~nB }`W#5Z2+OFyshxe6c"MY.0^(xi\Q&6kTB}J~C^[ k8jnnT/];lG9 <oJz?ApI (^8>6noxAGt|>%3~PFti&Ho=ΆW>uYF}m@H{ï\\ʺ 4+Z)(0$JwcPBAiGᣗh;ZI8$9QY[ݛd~Q`CF 2P+V/S !2#!eD.\T.~e6n5p& 8M<\ ql"^!Bj# Cq).yFnq&°*NץQŸ ݹw򕯔.L:bX2Jit=N ty<[cu$ F%:bi9f<=`DIm?Vu2 C#OFSQUZ1 ›>B'FB[I ~O ^6a' Oʭ_zMށ<%s<.ƸHg֏9XwG]83U-24mZ "pWa?mt)L dN_ro_g?h՟ҡWy5M234$DO!0s6nhKgzkӸ(q-8p 尷f/xk<"p۞@@<8 j<_X2i`+ 4҆0g,Z#٧QnwPțG0 iUX,R`&tiǟZ~õ奯pj<4DžYq0Kyp#xnk|4oZg*nzw+|_Ċ{z @X/0?_fW5|U'ѐFAS =\``505?:rђQ㊳Oe>o ϖQ2j4i"0@ƍa|68FJ`cG@w^vВqa92A|)ߔ^Q#2ijxpJ/rꑋʧ,Y"yR[8tqey5N#iHF:Vz?*(.n7NaPu6Q࿓~ :;|o^R֤WU()Hyi(Qe LXƃ(T?{;i&t`pq:N|-X# X*-hp 0Xk;RMt"$^9{(6ƘJ3;4oiI2[C'xsw) 4 @nDKx6 ˈ WaB,ZוU[g-kk<(O0 J^12~E׌B9uAա63mT0-i(iҐi'v}W_]j4a֗o$<2yz=F lV`5<~}eG>✎vX}Ee6 UVĭ|g*>mQ*jgcUn2vmF77Ї4?]j.&M&$ 1x{'`iDk=ƴz>ק}h6N+l)ʃ/M#W-"Fw4>u", K)ީSýuC6 iX8xz)#ٵ޳gbЌ6_ AP])teNLljF&∢0aU^?=hdw2A??9ŏ|qytA~]1.Y}'{i>ŏ8sR8Qxƥ޽B1|:7xK @JIƁ+k'¡N|靡x|R٧S cʿs@У 4҈KW6> ;;ze3p1<A4Nxlu&ßePzPpCY~,z衲K 5nDaqz0.x>]24h}&)+(!'_#=r EC2.=OtXS}aȨ,ZqeKO=Q)p1Cj0Ƥass:wJZ6E;тd'>=w`QHȐ#J-K`ܽ`KBGXhIo*?*eg6ט zCjt+/zĭr+EfݾsdCjbL]h՞ԑ/pa`AW#-VڣmO蜂%Ge@F"}qqp2 詏j?PBqd{8NDCet ~eQ?iOc\ #@?^_ dY2cevG [F$S80{P}Rf^?>sʧG)]/zQӨ82oF)‰k֔uHƴ3 ߧ'O?]mRj@)$SeK_]X6oVNpA ){`Q0n0-"zz_@a0w9gNN{oqCeYgO\.e*%~ccO8bV/K\Lۍ!CWXLVY=k;wrӫGT@4/D~ RԻ5_# [΍ [lNDI6xq`Ge)ZG;d4W~Rd舉d$MC1j{\RFu"x7,/U|ZE]z>2淔o}Ky?]_NEZ;+ǯ?Xې#2m^G݆_(wp*?ld;e(uOwaБV~䋿&A+h]<`)a64"?ahkO #x:B u06"[MAaWtV]ÇcdKoYnXs܎81+q 6p8P H+]e?NΘ5M*ȞWp#p9b&0e8yhOO W8xVy3lMYw8/# qoiSQqY~)TВG:)kt=#O:yPYZ"P*=gŋl 瞤C"ˆW}v:)jH´5)2|(,_xD H9N~"<-x8z0_ Mert2@c-ѵqHd٧h _<֢RT4*KuYhhĔ`$ۿ">#Zf/|zF9vm)WŵiS) _˖N(>t9Vpb8?$zR4x YF%߳N{s)PwálǢٰ* p/zCzN|n|4Uů5 SWMx>p}uSt3 9T~-[#H .%'rrM@Y8O-`d: ?xeE1l cmO.!y.!ғ~jف/HNEWڕ4h0O| \SzGk$@`NSZ2a-;ps7?^gb[}V;1OH_P/.r#%D ŻҖ&RאS"MŘ/5.R ys qGj\Иo:i.D|&r(0LIxt v;sO2>Tlμgυ͸ W-]4[B+_Bhb>Wm׸pB0> p9]׽O dt>K}lZj/F\F>lP1,n#Ucx ;0YzJaD21͸G_JMl[וOU}7E[w;6=^8~ zl"0Rn]D3a245!Wke 7UeQ'(tFcMD]&oC='gb@c+Fd1}e]c^d$ yʸC^D\̗4-C(HƔKh ?&7Xn;w+ r=iT"t#KAegHh1k2^m~xx<=xt 7 1iF<Щ|'ؠ20m4 6L~,3I@\u7S˂кwǝɻ+' D4 z 4F~Fe+CP-a}aNnc H4FUHyѤqdaLa&ool5+qeB`m~]MnBA~xֹetMܡ<%,I a]>Kb1*EgmU t46Ύ r?[wufON%"rlEtav(<}yAO"9nuݐְ֚#AJzt GU/'6NaN!-0tfVzgj9haLZ'suNo(|\Цz:tٿokپcP)uKI!#4^a~Fh/L@T & 9?•ANXg:$ 
ye]3Q^\_h0a10k}bǐȷҺpiꃐ-tCc'#M\ē,T7P5~vx MڸbF*öt+tKz K?p=K8t9~4mXhl\ZxyO|yPVGY";Y2MF@Y|Bkݨp@mU.EWry2i3? =he ZAL RQ4, mexۏB7J\@ ?.\V8Lcy6iPpm?Z+CÏ?yN)HCOqOe÷ *6>G e#Ϛ3cǁFW@Ӕv`ׇ <_ c|g\/ϮZxo,߽ri\GX/Sקj0[``R!i(#x|{AzYGmsz6z(e,r"®0vd"Hmvs%@ ,Ջ|RA'E^ %!5Mh&DUc>qlSn}湗nGK{Baq?vRn-pD~y]q*\I#晈5:W*^/=S5đ3Ncʙߊ0錴&tSe *33ϴ d,n?j>RffJ+̽~) hB - ;W Ў'q( ӏ9׶P2~S'i@x,1΍&DϾ(lHhp7 FI6N(!X<ł%||u:|P0 *B.D %@/@IUNӇ7o~9KcYMl@OJՂQւ,dot;|:ϻ+^/`Y-W h|8c8uU/ChVeڃC hhp .EB(tU/ 3PHhLsX۽' L# -UD(ÐQqE` Q: * J/yE\*2 M! kM!@hxkXn7˧?ṟ'z6EEfxBaF|xcx?β=eF֩,/cU_L9OzGfHCy!38A( ٧O{q)FNr 4pcQ\ÐHX~yJ Ly\#PeT5T=R{N'b= Ǩã1-(ԉk+f/ZZi*{,gDFD]Q<E/muC*xqJ]0/ۈ56xo%N;~ʘӓv0rITM \$;`v&ܸ$i<1 OK = jUh,w#UP4=@;ܨ(rEeμsZ0&$ h+w Eq{3kxF0Q&)ANKJIC8j8Ӏpljnߦ󄞖,!+(ݻvWzp32-\Y}:]g+y)o/k֬.cُ*]wH#aB7 드p饗Z aDb `!|L|aH9kr]ϖKHG/g(eɊ=j ݈LFP0.h!_I+o?FyF?>T R.xl4%ň/4A4y1{+c`K^yOW/ %pRN5`p.)Tx'~mӖmN)[;.n!pȴf2qM82Xnv?Pr&xc8ʼBk A"r 8rА괝ꮏ-Vs|Б!c;%83 >,?d.\h +6ZyK|S/.g{ ,Xe}`Mu{K>Uy0EHBKܡ ^F(sr%lΆi8FTSɺFDa!G޲7)>YMޓ]U |4~!=uMٳIՈ?!o)۟yx5;u9"(+SNhm]SQN-?`H]4^a+BʻFbG߆0L~$aGM&R+`O^5om7 :Pm0*tf#0 }\ 'm^AE#_n >ǺW . 4Π%Wj4,d{ū` k4d|z^cD8h zKT8Կ̛_^wqP?8L'=:NdIWߢ‡!\+ǟ|^M(/_ikc4i|N16ꞣI!_cZ5Zvuj9,{F9][=WѐZwj=100Vpyo1(HOt rra][ '1$#rwE /u=ez\&reJF?ШpgYJ!\Ozttjꑭ[7NvY?M8hT.CDPBXsAR5~r 0ë~ !ȳ!4\$Йyp"Go <fMdKЖC#N(xQ[ۦCC;wͯ*%"@@#~B{e"tY.Vv޼Uo+҈S /`Q8,)+4s18(AM(ȳqюR/~\ϗιq#%T A"r#(^#JيR|;rcVloP (+$ _S.xO(r2E 029ceY7?݌v#W⃃ȻIamI^Dt˸Qp4x'l:D?z^tEL^i؁lOqA(`x䀟&<-NDiF]ʅcŖZ/y7#}U2K'wG.kW6h:WuE5Eƀ bK>]C3vg%YǰMh44NZà.e=!˨ <.%2(Uš D ݱ xrjx䠎<;˝},a$*gę8K[pZGvpM?u|=tsGݞN"8p=W+$J7~MFQSy#|c4tY|_+ ݮ3q0{˖- k׮m*SӾڌt>h#|E7LJ-(W4\a>YZV8=Y:|ee9q\É~O.wWY4yexOHC[)ϲ|xwOUB=+0p9ezx-HpB!X00BPl1'rà_P*m .ն_x{v'?38V>P9'Vɣ RVOcvP׃?yOX\tjk9Ini @C^O2q~.d\/zʋ̐]]?"ܸKIq<ˀqbފ>]1|9ZtFsC%x CH)L,`<q>%YH)䑞e||@^{Z9/W~llfI7=*2I^\OxG.q 6uyN" 3PzA驗FPXc9ii]h>|S>.;݃m(x_H^#i_EAFPcA.w" /E/܇"Q^Y?6eSSUy{֐|˟ҕI O u"Z'?ɇ,@IDAT_4Q?15 ˿`zmzITF+F8ߩÝmZ# 0k p|i1V~ESϸ78=qmokZQ9qШXpTȋ+cU$2לW^\1W[{-<%P}'T0ϑ{ -?=) c(f0(6vT_$=s5+xS=[sB M)e`9ˋ_ru_(߸W* #n8 e.1s] FtlڲΑ%sq}YiNѫP6_(xF mmYO6hwCF1z?^ztȫ^}8p͖l@5l^яB{k]2}1#g%O/`k'#"%ǔ;[<)kOU)Л0㗍ZzSA4W rmW>:@j׹Gcό"dP#rSvj0$/qY5928 r@(ttJn?)8x.y Dh?yƦPJx힖H3Ť!t$Cғ% ӽ6*o1+~[n.۞~kmp݌: ůGw;Ӫpp tԻޡx|Qrqk/|dz&ӄlyM{8oM+3—߫ϲ0, 1mD  l(m;8o`τ6`wMH)͖=#k_ĖOI;G.c(eeO!a|N7%68ӦGF=ʓtrL=;O‘zЎ,DZ|5!= WDazY 6+2n`¤z7+^T>e*YLd+V$o޼sXn]Sd~k׮nţLhJ$ TDU/,/9yE<LC= [䥒uJ%K _FuyOzb U|WxaÉV.ZlZQ]yeW!5_(Hp!X( `ƚ&8d SNԋa O9dϊ V.:q|?t đƟsךFlb wRfzlxM-tO}ć~xX\@޹ݸYжc:jş*/rjlP24vwIYW#>z4;%s^pwV{mfOp#URM99r#\_*+=B>ǮG'كFw$~cͯ=C+8Sɓ >AExйEgL;X!P3NDdNw8U:'֌دHDvZ)c8+m>Sqq睃<_)%>ēx% CL|+. +{wd+`9s>D3?zp{3qt9Ib؍z)B.D#|& \Ӏ3ư<=}HGl_D`S̍u<%_ Y *t |R$u95 &g@s|ū_W^וnNWqoKh鵑&L!u"ۓZ(RɟW?fQYq<rg(28#A{䁆|2 'r{cHXbDF7b4JЂ'!^PtBaoh=PCuc =bH0?c>ZCRnH[nŸ+;zo9崳ʩg:{+aahj΢ࡧtxgn8X#h V ?rny7g.ʓPP'sɑ?ûBw>:R!r ,N nѪ8X]}ocFD/4 Yr,D2 9^tI_e-[:fMSgU^v͛w ku,y3U;u7Oc7^2j0[S"O]Qq#_/C1K@Cb}E '=iKfwaT:f1SSO h #vN)6нYn P ^2Iy -I+==&a'q& DxPTDi +_ p"Lo|2#&,x<?2=EQfaxzY8zjZ:}eR7$#.+Yf3g&駋g!u7\d\;?4P<44Rd wn}p3e"<ӓ0Q(8g @:KHo+\B 4|%4T«#̴0e)+v #.?Q>\+ /|phՐ -yq@^F0-O}j[e1c-W'ek&k_Tih,圉>ߋJ⊢տ.ZSt$i c@t;gzE;le_i|{JY>B]ءyB#6]׌indZP0T6n-&|J7M>d$#qēAhz!Q}kA騵--o0,[5T_>L=OPݬlTX3tb2 0i׎g˗GS7r9#%>:]Rz+.Mt857swxɗ)8h7MezymFK[ W-E0{4Hi㐲p#/~Lha_OAR.V!hLAcA!?/hŚw5x+eUh5NSRM68hHtg$jo>!vFek>[>٫[5'?W.yc3(Ryy1<$:X9\^ur!KS>ky 2g7KyPIc/v}.m#G4Z(Ɩ kL%4rݥzk]){Qd"F_LP>8U^:$`-q31<*U ^d2 R@Q~wo.EBzOqi 3-߾{[yɽ>sFH5▲z BFpC#h #qa bޠZo4TF> zt/PͥJgNފ& \SxvLV䍜sc4`{7/o] 'zL. =ffLȉ+&m^ӿۆ`?Wiͨ1OC<@8_|0e~#9d7QǛgRB>[eq9-8g* 0LO`K恌Лt!7I茜OZ 20G#ϰPqiP ,ʪ6pGf|`@KNŖxP% I(Ɓ.ϝ LU8A? 
,VD8HAkU>pyæ{RFX(lLKϗNy|9q1QhfG Iy* ~)(裥rE_}+K)e.`l2< !A*dpcoaw"1`aU[hNy%G]2,{c{+|Ĕ#>5a,(]='B$gLyE~sFfkZnt4"OxPbm*-&YFY>X$.1-T(0Kpf`$~T|p]͞ZPB;tA9~oHz9"ǿ8(d^c_,| 3:Mi?Li \tYXN3fMs*G!FwC2h T0i ,d#ʒsvMzz> }(x {4hIh50#B,Y=ZhzYOo/@_Y'p‘4,ى|$КHg*GsQ8&G/e)c|1>̰9# M;ի(Af?;108Ë)er8,)>ь{Lΐ+yzEȏ!ƥo?6wsX):o—*5H(ذG-+;}Yʪe|fyY`\&4eNd[:4>PU(_^LQs-G4LJ[[rß |^dXIp\ݲyH;'gtM? }<\:oeAey+\$_]pBϐa8*t_hC};-ӖWZ}#eDQ^,xGn.32iҕeEGwn Qm4a}Rdb"DAL ៴Ph#42xJ&w q',#*>=Ti+N 'WcFNCa#;5e#8Pxܸp~ANG^(c"ly@d1(#1w3Aԑ,xdKLrZ&ᫌ}Z m\1ȁq_|>D_ڲʨwCհ\.SFtvLI !U;}%>_ޥ3rږ`U,YyhYQzorL 3L!gI2"̀Y\ *'~pE ,*&\ (AAP93 L}.ᄍo;鮮ӧOUlH& J_[(/fΔFmWlei'v,}(3(prPOg&U/[4p| [; :r`Qkf]s9sR&9r@oTu :PpKaR?˝,=f˫|9c^ 8Vd(g |`>`gn|Huj e>f ڙ>>łE#Qtqt"G_]w_,~6tZ\B %~{:6[=j4~D}NsX5ez tz:m*PmuZI:n!Xtax[eH;>5~^0T̵ճK8||-S~8tb n<<^xDkZY\|MO#vNx a-xEhd3@5TF_ƒ'"눪|apl3!wdCY]Iz[n|Z^5B[[<<eyIu \UYw _IsAg nG%ڬSd|eҦ/gÝ|n t0ٹ`!ĉ>՛mTڧ3V6#5[a8hs `j+:V"_OF>F(B + ;/ՔNIQz»RQ2 H /xh:ts<4 0?i%<&n5;DDT#iRvՅ Ho5t3/!1I[3^(`\0,Xy%05cxu`PΊ>%N[36Ļ8댏Ŕic-6Btd O__%ᚤ6 K,2gC)f;snyPg!0öSpe;O:ztҩɩ[X[~,6:CCFCJFIyQ<\"q?c)'Z#]bٯ8> x#B&>T#-}V Y] 7\љgF{lg,T糗ۍCV^ndF6Ɯc8XuӶF\kY)V,8)Cyڅy:ȃ]rKIz)|h`McqFm[ R4櫽+3jJ 4㲎=<7t x:^3WрR?3=?} d@[XA1R8ߴ-{reܢCڸ >! 2ϸJTz3xUq$gćel9Gk5u|K]ZۊsqAYwsԦylA_A0^YGOѣsU~떜3rsKvm9}C*Ϲ΂eFA:TsXLyf&A/{ލcc&G n,\,-x8HU#/%ͽ灏A;vo)EsvrH<3;51|BA} Bلt2I,tL5_Q"N:):t'?~R뢭GcR@gg0q`t c}ŮrXX1Bc ]{ϛ_by@kGp0DG>1pxG;b(@cq#-/*N1}^kw?Ex/okϋE+wzy=H4 r`g-P,9o8  DZeYU8*c}DIF.[7M해h>͍$νz>piP`#p@E8Z{ \~7[93,rĉX!)өpH303Syv~dAY'&4/<ҀN[;=W7͌)[MJِ&J4 mvFtN(Rvz/bT~ KKC6g.9-RV 18qѣ>Kzl9ir5GvUwKOYXhĤ-]SLS>iЃ|3ta #C#m΁5N-#Pe 䃏8IO]C-B:͓0ge0P_d.iʗ6ŬX>\Ps.y$XO|9j/eJ3">_8-\,bLQX#hl}6>hMx2.iNl6b^W:[xWNT[kwbv3 ib&Dyc4dhQPtIӾXQeOxq- :*Xf _=Ⱥ00W]ui\shxo>]uR8)l%ި(3 8 !(xӜYr2^eܮDzfɝ8TSr1 Ujz?N:YȲ}J~2Q_@niV2 yǟqeqSzJ< #lVmQR(t28:u?ciY:tils99r9yzPĊ/s8%o&&1_C,%hOO] Bt>襎4lai (!C7-.sygL5\;y`L<5:¿~4ƌ/K3/|uOǣVTw{tȴniOM Gi⍣KQ¡gDl{9%\ƤkK}H v,\rS|鬳b6r F!e|S8H8;uЖ2k@g@wԵUr J >NZ;џ]s£~#fTQˉáVl\ n\}S eQ3)+ 5s@hBelL}-Ԓ8)!q/2H+ `%FTQRFJR%U36.C{O91 l_.h\A/-U'o8Bר)ʰvm'MW^yqM+](C]ĄLSs@:r }Τ@ 8ː(s.C>88o3n& 3_Qė1e2Y.)S"d㨣bD^Ď^@lXu +d;AP}+c„R>hKF0Wk1DCU QJ[\Kf~;85*C؏x߂xZo7م}9r,hlV5s0jXtȠa-&T7^: }8,ksФ;O \4`S^S6!3Sꪎz䢊1Ս.J7NB&CEA|Q5S3"SOe`YVu[NG`SmL;+[gI]@FQ,󜦱K8 3iKR>몴SU7'Ѧ.r[Ӛ.A`judL"o9ASгZ5F+6D:402AL'# O4eq `S@Ҏ>|MZ64(]Ʌ+<봉ISi#>WKF9,e2!_ԁzV'?_bC>b+i}cc+4i6E85<>斝hEWRs$-:9u3q$ulOA>vEѩVi-2,^eCdiVq08ߺx1Ƚ*'@b_n̐Ps]2p-Ø8}C/>V.Y̨qU!MW '{{-8w;W|JQa6R`RoOG+ݴIb Y#ƹ_o~ĺ*o]$ c_/du1li)NMUvuk{I*o_9 d6%5;Y5#U8YUHz:Wk@n%CnUVl)E)cS8jb;7kn=c8'?-axL%wB0&8INޥL3fAy zI؅Avt̉#9O<^P-/nzčP q } @^X]l1VZl7I`rgo`\*R ϑoy{};mq g^@}k t06B> 'ٓF@,LC,@<>@aKg2J>'ꇲϹ'i-6]tоts0D &ON}Cu,Z8DFl`Ɓ _ d"krwLM@~ujtG!z?75k(˷Cnt]ESApg#!|kh+O_N;} Wi9sf_\8Č) \~Fx+p$VIZͣr'o +a^ьW~|T]Yn$CƣtۏlE8J3^[p44*<4ؠW,}E6J3 Лt6^-֛Δ®~8Ykh70yd`MAnA:3i:FlV,WLgH@LgA thi$?giiӠԿ玛y͛E sӎ!oG\3^U Sjȼ lmeA? O]juQ]6撧Lβm5Wf3>#=x͑jSwD]GigK%otTp`qMt& o C8KR2ЕM}%r)W.4ۣTg1FH)%>]w9's]h 4@VD "iSnhA#4i/2 =q=4.yi@hgZi+o}#Axp,-ěGul8 Ȱ'$ՙyVj{i5@ 9ؠ(wF( -}ԏ8[ԇ>B+鯤sY2;Wcu֟r\Xru:˾Ç 5 w9=GmL{Ӷ5GNP'4e-Bi%_~\% x8p,3-4,>HlL7a븳Vq-UJɴ>9QJ> 4%? SL$\ \DYAMfh6fT]ƒ8"lTaDM?w7>?x܄vȻ'>]+yۓz;:@o>#;QyJ:ye|#*8v> Ȗ@ټΔ]ͼp$(Godz1]m뢛Hc ࠗ uVo^|)q;ޯ72Mᲂq.y`KÊ4tiX0R\/1UoCPWd.3'T? 
iP@IDAT cgծrRr|G/#-]HX?w:bV([;Z$Lͪh뼪Ν9P!ў*.8i9;݆8}0gv,_D:vCF(zkSzݜzNA^ OKr^J\`= IuÂxScԈa6)veO|[I@a n6*>cF 1:'#G+i$ΣȻНeMȲoY.k4?.ɼz%Жt& ӿH:90M_IډwIpI&6X7hvW M}zfmG#m/w_-9.$En2L-r@6x펱f1@ܘO7gj#}}]Y~ NeA/~zqR}C7댚9` ~%<#&ȍء([3*YO8oۨWrQ~JY~s1zm Q} *: Ajbt~_2#@yt>Pt39qhF{}}$4[8nc7.SY8ԀtaNۗ?5& bXSo2\tEqGh7@/l謬8qqT`@]tE0u\|O7}1q4<7= V^~w}Zb`C a Ha/ZR:yyE3TLw85cH :?:$%e0"0[\%qnuW& xX?Q{i('N4/pjPox@Gy8/SG<Ӊ-S8o;N ׺Stp6RDHztuJC[4 );IP0>,uC^+!pS6it裭 %B,F|.zrpQz Vf㤁:(OrP?C/{vW{.Ð>s.h+۵7OF)KUB4~͑ŊJGe0f.~g΍-Wq2_5p 187)ge)Zx5(焒nu^.p$ IUɃ nPf퐑^eιҍVv^Og>m*{'Lm'fp ut[TR]›?} =:fHD"N;^$h ztZ6H92'n  } JC?y4BCh<>24RN}ش8Ǣeer#}(Gsg4}ɦN9 [S^~za+;DkCFV264LHshŔQD=蚞ة43:K^} nuLmL3^wce)oʸĽcږo= 'lm֓Wт2!}@-wdOYs?I^S7rY1m? Xِ\_Ş e (yQ ph<\WKcJ1Ou-i!NUcPNT ~=|yF%O`lN{#9d+*>ÑN %T>")'YGXzu[ ϰbs)wHC_s5񒗼D۱ \+-~3`ȣQ#}C6{_mă ''XzXp?}CX~3~XAo%]u wƖYҠ nd* w?ۯm@YQA+t$~b[ɈQljPiH9|KsW-nkZTK||k4%ctGOXiҦv?*wN?Vq*o{3 +~Ӹ fAeb[+mYт,Gqll0K`o1)SxkhwF 3W* ?/?ŀgFu|QqnT!:ߪpj $YZ̪SX^&cɔ:ThvNK8^ܠУBy\twvLzV0cǎm7pN|9( sE/3 RֶM/_ro<0{YwW5le`.|(ÍISX5P6jw:oQL2$)7f44sr̅qs): ZYK)#oSd~$1NiĴk萓lkcŠ{]*|qNlCR^ɘWN>nbP8Dlzn@h}Zt׌_DȱTLszunSLe@E)(>t(LC< ʾov!x~SNa:U;A; BQ&ɡ АŪ':<@Q@AV)2ґ_׷?$N+>@l9dzV CGqpDs&n 䗺9(N9dzJ5GO!Ri2xE1udYs?HkN.GM2/y<:M1 l~t qprOf^=68hqs9Z_oDI?|森2qr:4m ~]A9iEILHrU^ʅrܩ~?qigv;&hL,#Q1TKauNcSjơBА *ąNhNp)71M/_ qN-K*9ttQ'9vT-ND Гzjy8FG08-nHYޘ>*V0ڳ9 Ulze9j6zxu>QRq=:s6$^{/`SSקzhTs9hBҜFLӚEz 91@Ӧ[oQ|vǍZM*3 Xׇ0EYM` OC;ZD(Jl2bjk=ԲeqJzE[ֲūbU=Ǜ_CL4;A zaHI`[oe}}mLmg;ٳA8NYZژg9}[8ԑG^;C?@1*8Wʹ"ǜfOc_Y)KhN7;Tlˬ2D>ЙQ!u]tGpMWA\et@߯Ǜ%;Vc7<Ɋ @40*@ FÑYe]l?u{g_ㅋ$1%qāSbMFsgAM1(p0HR7+bä/ +(w.5Ao"L6ހ3!e#-ncZOq߬|-;+g7O,[֞G8)^V| \ur8$03L̜93/4׀>.eP&lg7[a M9.i6&& p-wx 4ɲX'':Lz{7c\vB҈91p|z D/(K>2?tOw6.^ܸ'F#9(8ϵWtn.b$0c"5. G"tJH~sAUbX_Osf?Wk~c9ϊZJ3,GwD#Q8יYDFs4.'~W\=Ziv|{ ([BRKW `d5xɧTBE)b*Yd4ïWmNG-|9)=DXAۧnnEk/ u]|̚5Y@!4 8 >ry=r6TF:4<"sLoR 3dVJԍfF'{>R›)|ѧėxĞ;lcFOmra!OW2vd`T]Ayɿ!i`/+y~.*NO,赶eQVq M3̔c]ǫ_4%2N[+3[:;kC  ׾oR/c p8ڋ=Fsg.Q> lYUwq78|З]3聬L.p@K_Y!chx.5|ClpMVhOr  o_uФF Bܫnw:2)&0S?ks͵JQX}1ߓ6֌~Qr+Exч}!nu[?b)zCI3dY+AxM%v L S("g>2w]*YU>xƥ\Ru"qתp}gRMV!m-"Qw*6MV:1zbYIN+ke,_!-y8y(k` j{^a[6l1l IP]q.7&PpWHޘ={ &lb`UFa}TfW0 L:8m;8?? mʣ'POw͎w8<.ƎVR(H/mB|B.;tոee9>4t0}W$Vѷ? N"yT)<5OXG>Aiۍ=BH5m^]4wuNn_:?x1gΜO?vnmE{ϔ#nh:(G։:\w fu}9H"ǩ7x$nhL:z:oEzӮܜsqnMi}>s] n ^ 8Xq._ 60"ⴋc}S6|5jSes?>tR⤋( BoxGCg8狟=vN7ĺszeq43((ŦisRn W Q'!қFFB t[Y7i)-9(*7(<ǹxF.+v)I(l$ "EgAᢎtzi É|9Y,´5,:t8\WjI̋SrT. %8?ל'n**Z^`K}el($|P4UM8eZ-&gQ~p2h+uv$+JC/̳ġ"+6Q'9r?Ezki𔾑c|hchH>\rI̘1#>OA,dƘs=s1nz>C*g?YdG$N2 ~y"|B} ZXLEcɃ⤹ѕ6z8mƞmָE P& ]!IM qcb˜1X9 ݀H40h!7mR@MLjM/6-pe<'^^d|5{o0œV@cG}}&ƾ;oMݤ13 ЋƏs?}gk=K/UG}lF/06/m.n`A0ROЁ+u}iEuA;ƒk`IKÕu']'m\R?8Rod rE݊kpif_jmz&*b"}m2 -qqCzT NhiYVUPZN2,Y}r~T{+)k~ԇ\_=8ä_zQ:qscM9@`k-/Yޫ,8Hs+\+9yi@ B/@p;jtWBo q!Aʙq;SyGIJpYC(0, .2'.ey(`^UUԮ #ڬXylCOSg6Z7x:ALX?17Yht-g?Ҁ%`гc<#rth A!Zm1@ ?DndKv(pオ>V41.<j=d$x.LϚ$(4ի  fLak=NJ/,[!X+bfX0P^9&|vs6#r 6=3/OӶǍ#GS.ȍvı;3PvWw;`59-Rornp<[>9@A@8Go1u%f`) ᅁ< ٌ74BO([ u"zf]S' -\C7ۥ'MY_ {a:ࣝ.]&|pyܥ;cf4~bQLrd-Jq_R ?dtjUw!~[c^cMǾ79^RrS2]W1XdVUb`cK<"`vw5 qX Uȶ!9:Kw9fhS x t )&5S!Ӻpv/Yo/ڑkҽ>K1.s(v9-3D0&I=WB~27JEa5N;C6ZZKѽ;n? 
I`ր0i$Ȧ5LN0OA!A9pM#ҋT %dsP 9'tlOϾ啻!Lkn|(.ŷOxv!1!H'kBu%7u6)3W1 D0IL &/ICK¥S6y&K*qڍA_FPg6ں$@ xktCC5Y+pRoEs !x!+* xve6G05F=z\':f [(<[;_M']H7_GUQO=$F G.E}w~7)q4(χfWNIPiw^Ccc R)Sozӛ⨣'xӱs8qb1`MD؞DQFC3 Q <4 |р } "e)(9,109?ۃ<{Knی=0/4[*[Xub>\ѓ 8+L( M>3 ,: ꌝ6o>^oSlS(%bB'9\:N!o6\vrYrwĹSL1 p)gM#C@7Fq<.| zYwCʠyp y:Kz':N>R0-@/oRN8|&g8#v Osuy4k0X_vsB\}B#[͸ݫYshafIqhnF,^*=?~(8ACzS?_GO6}GM1/1cc H#B]ԟy8n(_ӄt~d"dy]Y`a)|8r+x-rMaCJ\1ؖr7!е(2D"ȈnhRN7p:+f?xPt䜋?<2" RZv%oFn+=pOO-+!!;/!Ug,m[M1OшsW++G=(0rS I -i Лo:&6,s[O4!7٬}*73fZh$ۗMg̙AbpXL@4-C3 #_h@& yQsBcSh"o}'k覮yNm OM^l Ȝv>yM\)MjL?uD1VG]J;bEfU)ϞzO( =eZPm+Qدi':wx-׋ޔ@=PXPi=!Zh>@h0_t.O9[n+AxU?hchWuSL]yOI8qA:xU7a hpN-?ܥҕQ V2it@nEϲUtn}+G^#*}J*ɺ2=oai!$Ж2Ȑl{43h*h wd3InЎezh(ur1pʭKzq+A{Y/ n<~MoGf@S;4pHC=1 Khb0bJ¥{i : as_lC Inv<;`;cW*k'/0.7^g02]Nz {%ݲ53;ሰ%LqJ!թ'et#GhgPXqF1jf}pS.E|kO~}tf$FRqHȑ92#C%yyQtS<<*[~r}MO=>Γv O 3 1BpԍL8GW\dߓ>g>uocm&[>?o,Л wv(B1~@Ç0hАx{o^k'ϊgTz5Dr(A7746~9jQ:3 @3xi_2XGPPTԋ˻*#\JdRף  v:Vs*f7_oϯuJZO.{|c;``1^moeDk1&4&`d-Jd(GeN}|k,(8UlCG qkpuCRB%uzv`aE.TE3HCnqJn2u.+P ,l ;R8`d  Z1))ԩ5s`t׽9ƌg*gÉ>@%X,ErHcŦ+ʖFʀ\$, ~s)d`* Ձ1 bH{2eFse\N|%p? ~Ϧ<@So3*g (h#C`c)k0< ƫd9lzʣafGk,ȪPVCicmDC(;?lA3țy CK @liK oО[ |K}+{s}K')LsP7aO!o%$~:N 'ԑ d=;=ёiIxXkxGG>/~o쳏&0CЌnq-Oy 'W6ב6a{ Z9H#4WF?(,|r]-2ah@, "vb/?[l4zXiƔ8;h ~q=bp}،DMV{wqg/<#-XpUc4rΩt MpL.8<륞l'rW{$x;ygƱ^f2pp*@O.eehGdA;Dyxkv0zKW ~]?@+ 3 Y2P#aq"XC:]:q /oм|$b8yNzk xEߒsEgp˴+1i"G w|vg)ѸifxDĔȑW*O&V`-E_pA+ՑGj eCV7LVWkp*G ^g ස@QWNT]Rg+p#tH; dG/xQHdI=f ^ Iz.da^o LF<*5f6A8]W%tm裴)UV%[ G@1x08L@\#L#;K6sue3q=ySb8rFND*@`Ɓt@ ~Vgd:b7d'|wd:Sz i Nyz~QGׅ^a|ui:gsN^QGWDRI4Gʁ^92 |2:wF41:>nE=_YE[Fʔiz ]gM |"E3`dc=7ǸPGהb傾Tq+~ŐLӄA ]rs;>1CK"?76ykWf[th|*>t:':o}[U8.B&uYN"[_v' \H2=eχB)YOzsLI#/qN.c K&@=Y6i >0~8Oפ7RAڌygǾnbA~,gi?>iT3^xn1I;+ lHge5ʢ6.LL jHS>eTFxJ_Jq>:b4寅м q<1njG5`|M |qu 7^ Wc˖~0;mPwcCȣqs2䮋tSՅ236yX ygM QTN2zw;AӨ\#' HpP_{Ys?gq`@gp¶ AٹH} lM1 8NmV<屒 rJ8{ CC#L( ndN3z%ul -З8e>^M{[Ƨ.3v?8FꍆlU]Rk_>|=w| ~6,^MF1 egF?B䱊JN r8_8͊x rr<3O~ _?o\|Gv%(p wc\Wf+TOH} 81u!Jgpꋅ˖)pXgoGt􈶮&rXO]pgJU`i+En~!,N8;ԟ2ȲB8! !fOXzG=u2Y0 dtS6gU䟘|0ĕL|8ĭY]jiѷzrBc{eXWkw>05_[Fqqw/;׿%^cbIS8,h<}92N_Ƌٟ=~xM~[)c̅7CB*S֩Wk/9|pƧ{pfGvfڑnFӏT^azꢴ0TmMA\ v˹/yy(9h|mz28kL*C4tCx>zNXEz ߿m~,h^H}4$\-J>22kҺ5@y+8."?!Ԉ@IDAT^_  6Xtv{m*h@|2d:Q]PAo]qta:c +sbt ^GIY"gp=+|n;>>zqy7p9K{QzIVr[&v#,}@FԝwY'<guPZu1:F)7^GҖ~h[yN:xތdTzύ{x41}&c/qg?q)쾏tʹcGÓFUp@v/w6_Uv@ÓL`8]8]1Nf)f>4.ӱ] ezvaN0S 6^hW Wy0JǶ{F>"GdȇJ+̋ eR[Ɖ|ԝ r c6[u XU B t:̓uDUCW5!g%e^ru.7m^$uZ7 dG'!;6;ảYa Ir%<0KsҀ:A |+@_v nZà`_C'KI#\z cO!o&҉'wf}BhLPU>1VFvdlX0r J +fK:QA tXOlR&<u”yyA (e8\Izp8K,= !~V!;X?YK? RRSr'Lh# vqnbaOJz`\N1Rp A02UǠmm.} }8ןfTꕄ 1[2ppg~*Ԡ$ٱ2=G T9yyוä}]7cgbSڜy%#W7:Ly ȁY/v&lf̘/B~+_W׽_,|\7mt&9?'^_-̀N$~:(O?4'<:Y/y';y!ݐ#-u ГG¤L:vS&nIXxa{}qb?c6۹O-tce)(06?p,^Ƚ!wpM1~(!I޻6,7a 8`B O dE`^" ? 
J W܍lK.Ȳ%YόF3]s%3sv]~>MB%q~7]ƌMңc9 #u+:ǀէsQb:"w9Fil_pKY7>Fg9NXS0V ȑe2,_\klpˢA;RU+")=pgQ!?=nL.Ȅnŋ1r>XѭmTy9Efk!۩RהUuh&O:]I(y[sZay{N;=L@IsGH<h(TGI<Ǭg>g5 b9)v\8E:1Mc ^"F>7!w/LM0>xv[xV;ueK9pm8hCc .FG&ОOMFq2@4*;w\,Q!@/핁}1Ex,NJ푱akw<8A 3uz -Z>;0A5kJgb]X>}bޞ8y Ɗp&=ꋵǦݚA)-c%si!BymӖOqD_ϥSG 'ɣK$KܖZ:GC:@rڙ)w~k 濿ƊtVɤU ӑ(l@^OF#?ȅh!r j:O=hpWʹ]ZG+xnN]tQغCH2Y#ܭ-mv |Wh~S/ <p@/;9whD}!/@~YS^GgF C^{'o__ 9EwǤӫ3s"vBN | 6%qTaizDG2=ϹGޡM=zovdLâl>z898 u2v`OQyG{bv|cd!}@${#'NPN̔u'.gUIS0EJYE;5>=svqIш^⿬OCPUo΅B4GH'}1X9>N.Dw$'/9V)z=E޵{'Ċ8]Owwz|O\rb8%bF!>fħhd YQFGHqp8 8 46#c8ˆt:J#hs[a(r,z ݲsthgk2~|ȏ?L h]X ]І9Ys-qrz2n&8j?Ȇ]*_Csh 3vXNf|׿Y1|_-zK023O"#〱CL3BPiF.Fyc@YhsWc:{ꩧnzpf`* 3x5nDLjZ 6;OlQGEl܃!-LvKg'C q-dyդ#Zrfhޒ&i) :/#`,݄4Pw p RiԫiZv,H`meSg8%d6D4x9(F®TjDx <8G6JfG'q\ ձ"> vAv)de~\oGz#P=3}sv)C~e&2@YHs>v˙6rQz~ZFOL#;89XIe???-7)w#H"kr$@L^|izx3ݺI ꨗ͇$cf|@w<~2a tNĕxfa0J'Mo Y_* 㚂풼 *zehV@͚tt҉*x]2:Q sQđNuO\;v>[Su>5r^mN]FD:BYt6ilL҆C ,h:>tv52|*9 7K*FEi] 'gH^S#WkWBJ.ȄfE%O3O-+Ϟ?kwYV!8f8W^}q@23R1[]@}tвhL~(-z^i5]A"evׯz0 r s[Ī,t O/=₷xVMF51 Iv01 Pӆ"x{ܖ])5 rՔnTW9#42_*Td#>ŵ80yjsx5 gu߃>e7lt4iĐ~1s >*B8/zы}o\{ive$O}*^җˣ }][p~cpNyΑxEst0: i濝2<=LȈڪWˣGvw_\wu<-=q%Ē%KQ<050>ضGM/whbryˬiy-@s9S CḖSE>.t(W[q* 9.v#qC;uⵧOSOYƓ[kK\yެ0Vo<;<9aر|qXqcl"AnxQI?V+p>F*? Ӛ@cϟZ-oի}}#<#Uo+Xqt ^m[nYzes5_9PphNpۃNN5S3E:i Jy蓌.kA} jRG)pc]ԩS+Hw#8J,4E9XtF<{h26<.9  HkެwljGᎇGy'ziȅqNqӮ ;},̚E~y7T5uf' 0FS6;H9w˹]n>Gf\MsL{eѦeom0&=Cq7LIc!1V*!LA Z7r㞽ڸKNӝ1{ tqs2U-4k5Ad#E5kuHL7cZ <޸fͮxqS@3^qƴ-[f=DFO zǺʮ1i ?QwsfAgf,-,.\6{l\X[BU^=w?;zDu@fCG7ǖ;aL Mc\"/V:\63N {kk9sSCyx]ھB3?}2θ%G`f5v ;Nj描W{ (ԅk}rΜ[Z0QZ:z)r\Xi}C:ߵ_}6@~8AOi봱Q )CfQ*p0YsТHB),be'?ɜ^yOO%iO<.^wX~Wl֢C4}G=ٗw>r:fw;%z4㟙zqL}K H}!Ϙu2Bh1zt˖%z̘L<3Zd{AA±9;}T|載j 8S2H:>}XPds+ƒd5z .a)^cqo}Ŋi,3,}[ߚgiy.T  #'F:D g D 4vt]p6׈Qjp=g |eʬG_mۋꥪ#D 8p#Q|9|rTaѢEՕ7h4_[j]ϵeH ԛU^fUhj5|zbq~N _pa pʹAY3hE(^D]Ux:$7k +Rtۅ :nWe_S ːZ! n8Zţ0Aů㯐Vz# )ºG$`X(Pm0"?( 8tTʦQGz/ 9=!sf8?Gy "# b' vÀ.Er.kY|,«^?~^X<M2s$`^/# bX˂#OpBU^GL!Hmڴ)GT˼-oN;-vuu… S=V^ݖ S8INѫ_X yXv#cBX#yàx4:J6q)kԩȖr&y@dhs-wF/: "ye(xu?Ak&qsyPkIlcɃ,3"<(?{)"O]W5QZ (#M0cɊ9FAt^$ފwAjHg#Vh?3E?f }!o ǔ@ETʦ2HP|Zs{x1@$0;0u @$@*yK:e(KҐz`%t`:ۛc=o9Z|MY_rKp*`߬aa>x_OB%Kr^Pݲ<!3El'ҫxM^Yͧ[ܷr/dv/Ng,q?h[jzs#?^pm׼ v\?].82~ȶ8=z6>127ZFԳ+AE!OE;ۛuqSӨZ5[Vꠦ#cFi݀N|UPwB ^P ZdI2VAAhAQ~rv;>Փ)nj|w;XSM pgVssxe8dNOedr75Y^$]j@kȏ;#_ԹXJA#j~<flvސ/q S> Bi`啮j|;  p92.jF9I>dYv16nu'B^4I'(C:(COq <Iyxw=PjG/";7##N(0q|֎vܤk=dyhbÕ_A& ;G610o8S,]ު(C8\/~Cxemu ȅnH'zqt ל G!ǣa Ls뮻i}x .}hhi~Cu ^ yp&xuvQˁ>z ttVlzlgyM-9s$?C ' 7\еChg=p'Ƌ{pWkܵ=̈́f㦊׃%8 E: #1 sl$$( e*.GmWu0C=Ȁ GyX&3=Pݳ\B.td+S a4%5:V.]&XyOSrAFBDy.rTd p Ȃ3Rr3u@&1Ե? ]+N/JH^гtZsjh;C?u݂ucV3|05ΙYb %R3x"MB`>w ,0ʳA>q VX>g`K4rgvGHy$4_p9hrgb: z4 dLt֣50?Ӄlh5,_YCʘtK: #<LHnѩM3S౼܎wh>p3Wumy8m̀t|Tr̀ `DqKJ74`y|ݏl[o4mH YI]]eNɌ3'6 ua_2PvD?ڧW΍WtMۺs$zU>}DMhSFGG1x #k54UxU@4n,.9SNo3̩l3X:4@*p_p ck5g#Aza@\3 U( (Ty[C$t,g/Έ^p(lfh(~3"^GUVHy>8kqNNmcL/3oʈf~HeJihZYȉ ^ ĻthJ9Bgބe]ƚUyt]fy?SYgkɊxDL' 'VS}3kI 9t}sMA{iB23pg 5E$F'd}N/THFnH/`Qmzs 5GۑE^hCNl UyF@9cx##u#iiU.ϱsc! 
};V=?nS\$B21t8#_+py׷iG2 G;&xHt݊?dр-4Αww#Z4whQn{ZO9NϏ6Y/3apiߴŚa(\My,Q;8NuW0[|3VrcmO~ಕI_v!ݸ +ARəx#p]1#_GNIiCx2M8Kq\>^ɠ'D|Ȕ{&଀X0켕08!>QF[ /m hWc=G u@;1A Èpbqܢ*UkѢٽ+F):eܩjt9WyU 曃(~(ϫ:`2 Nx?JE܁F>p%qģ<.0"kiL?hqVe6t 2B ULs,:t^&?r +:$ȃ,8+#49GZ?7 j> 0'`A4xl,x$n0N'([?qE]QD]|  &o }k'qeYNG>qΑAֳaX%qEO((L_5N}+# \NEK/dC^(qh;O1Gw2*f|) $N w̘deNK-{`TKtmH}\0<nӗV` ~i#Uiw^02WM!skB Kد=ެI4AᏭ/eG9CҖpp1Y Ŭ4̭21KN42(I(ꅾbPk2pDku.ǔDx5B_$ '}ڹnGɑ<79Iyo~7N>VT+&|ʩ 28"01hx0HeovNRaALk6xLFB'AMco C4y@#4m` mD b]~&g:@' 9H*DP_c\2HpeRg̘ʧ5S%(gt˝6FjobޥGâç'On`|Wi>8^X IWokf==Y;9!/> tLc[#۠Gn*fϘٴ1ѣ?ىx2Tf)2^ZOG@?@9WSQ3VeYִБ[Dmfr/IG) ,hiEe03n*41b\01]df@pNZ 2xsBUFʒh1wŀF4"r`&޲huiSơ6ّZGt:A0v΋l;5slܼ|,Yݸ}S&*tp&sݛ?xO#Ayx!gz}D88 >)wzw8O_#9 _K?$}HݾzlX[DVߢ;H~mDySc-TlJsC+v,uauܐ8w?7Vk%ѣŎgN5[I?=6֫ډ[/]1971qÆ8gĸԙ*΂#'w7wX@z`&AZ+nZ3և01¢q} T<"+<@|?ą'͌O_xdky,{R}{bT~Qw'9GJHrǏu uѻڵxiK~t_ ";PlpL8M)ƞn̤q7 ;xwe@7'༤>@IDAT29NN@.tN9>Lhmy} ]FI@>1W^\Du hxsծGh@E0:0 <+ꎦW,A< z\WʓƝVTXgY!y YsSF3N+Z_949qsԋ8ɠ# i…N( AÕezϼ;Ӝ8'KH9N;8}Fq\fkwzM4)ǁ|->U[dVSyx^f:+$+ 2;'DzuJTY$tjy[;uz➵zKIy96ĕ/O@'02##[7mE9=+ү>w>4AecNukc>q'C'^8;(ek 9'SL?fwB9#yBoEՙ-;+CK| PBG o&M֒zkrZ3 (% ϨߒıHcQIe- tVp$ʢWҫ˂| WFK<9`zӘ>,w;@s7*?Eˆ :<iy(gCiU8'] :%t뒣[ ).bMB)gGpg,Y2W߷MSqJ!|Ox8T726r"2Fja]2h1|| cZ@ ` ao;vc ^s| 1CFb {{c8Y 9()M̩q6im)ZIz ڜKX3|Dkę˧]rmpu+*h}N]cc97Yruc HJ~ܧ|'̛k͌{x\kqo҄qt7}s6_QN{Y}:<t[;+?1z ^U.5qDn8T x \7ns]qFv>X31pn7R5MoYb[;;ZqxхuC#d$ G:זH薣_Cz=P\w\ᓀEĻ;#iB]V\;yrXmp{ψiJѿ+4=Q0(rU*ΖsR*_yͯ^˅2X8K捋iGPҶ5{0kLH ]gOJLz$o#p3=XHҺCɋcD݈OJs=Fo~PzcwAz|S$|,L$$M;bf&$f-P,+\̺0+m3&Q!7ଁEYYt t nhfœa, |֚2zTD;- gT:Gq[Oↂ ,w j+UI94>: qqt<cεR&΃7B`]:˲Uqr\ЂiZL9sX'HoW&aQ6^#4섴@+OOV'i'ܥSkt2#A8]Ёu? ~.` @9xܞHsg_[_='gHi)e82ę'x0 ̏JA\Y(ęWpStV8 Z<~B|@AWu]'G*u^BpO굑.)r҇Cfh d&-|>h= >8jʝʟ{\ݡO4]Zid8k5%h}{] 1:% #"gZA@Cŝ Ԙ們Ƚ6 <ƲycFpk#*(C7w>0 <tZvqx3gp:͸vOj~G:#N|Ww6a=pw[>0ZrM65eqcw3cuymqܩ1iuۊj;\sta$-H co9D֌/':Wn ]\;0sJ@sqc:3`?v'Ĩqn"Wrh# Ϳ'=M`qG޻1ӆI\?΀qfhLŃ9Ƣ{でcł8~fHUĝ쌩,i$RFVI٬`nMMuD^I@9Z$DZ *הP6A^n)5_qUO٦ uOx}A qtAvr[C#Hpވ/Xik8‡y!<+FTGR#.<{Y|5;3?E[ SƍK[睱(y?D.jE3BepA\(|4c3&A~hV`47cO fifk94.:m3틛T-\:gBb$&{_2Z1~n{c/^3߸imՆJy[uys&0[.wCYvN XgDrI"$,rQ,tp^HAn|]AsɊ]:# a ij$⽔QOx˟jWbp[㏲8G3amo4PE>@Gg皲rճ j 2` r~'xڃpE ox"3$ijxx2 YMD>1 ΁Ѯ3<<[\- ݍg|zzS=N'7?}) h Ŕ4;M;^ϓ/ Igf4VȠbo`%irSL t$~fk47#QyIA%m  ;uNʶcZX5 Az6)ɤITcQMnd@f5 :JE#+j:tppPP&)^f_|xj*\cL:f]IG; ?iv|8<20`:gCSEѣ<݉hl AvJCOS 1l B}Nl Z >0z CzV>Osg>G OF 9iCz49Gu9?:yE73ow'aSI}Se49'ԛud\!̔m _tp8@64On;>: {?scה;u;gÇל$sԮ1! 
$q]pkƌze6O_Kݫ1'ALd_Nc46IzBKB\8ʠK:m}'Ϫ95fjLg"C[69.鋏 \Bf=DpɔBH?YqrxHyOAɾ*gYNa@);dGޤ<[Q8FN>ESY @VEN(2ِ<;D`p 0rW]u{n\2@6=Z*x\}A/(<q4>,-+4J:w0g}v,]t]6&#:qm9sq8|f͚۲^^פ\N#_=8 wkҨ AiGzM3h4GHUR_by9t>E2njK:y+$~5-e/9y%:ʻfJZE f2Ux,kt(z32/&c@䣈;,kmm3M86(өYX6iׄ;2[ho!.149AЀԁdZrQ"ExϣʸѾ0Өqu@q:rIl\En/(]9$yA"^W`$+FҘV{G6J1/K>z#Ȱ40|AGe0y(ƾevNZGر#>n oqǧl9u o+X?O{WsgO::/|!>:k#IJe˲yfv}ڿ[UEp#uzs}2—üR_z׻2;7,d2}˿K7Y8>]4kc0~'h%+A/\[ĵj B%M>ꭆZxݪqڊ1sV8!eepUUhJDZlSnQ`mlS S'0_j_vϣ c=Ʈ3t$p>2|doI&.[7Urt7KZf9^ãs9'^ppkt=eyb+_Jo .?xޖ4Nj^hmۖ]]]#* ~ 4kt ]Nz*@/{AҗGrJ5e28h&i3->>঍q.-v-0XgBx=u8+{9Xo爧j@ w{9FF/uƐC.9I,4ԡᅲClD~0w঱յˤSg;`i# ޡq,($ߔ)Ł}rɟ1UN|?yĩTGtPVtZ?{x?:| Ҭч/h"ۿtPLYt$ԏP'Er»q&!~^02dP9g O*"J9F@qhT(GV-4W5O7G͗e7 枝 `v>]G<脸F~7pW3fd @Ɠ4B#bÆ W9N8:O<1qQ͠ZQ2xI;Op[ |8#u?km 82<x/EPHyW8~׺wc?|#NM]Wn[e#@9q ^͠#_csQp<_4Au)<+NhР^C yJ 9f\PqÏ,&ࡼ@cϗ)9`2e^ M8Go|N?MRE^z T2rJ/SՆѺT9 "ot$}u*Ę|fwߩ>BR+ghz௃$j8RCem_:zb<PG:T(Q A$+ K9pGV` 5^΍NA8;FNv^o>BXz}CF(_Ÿnc=ùshs<XiWzٲeK|;5;w |ߏŋ|E6f\򖷤 k;hE Kq:7GmQ]WqQ7/F }}gvcݡviTRY9@UۻO9۝ϺwڻjUkZ5ڵkG*ϨבWw޿o01ChTUn_lܧvЁ5{ Σ }Qe:ꚞ/j:_B(_@|X`&c$yň u[굻v6i]P5$&bx+Co @u4HHx/ԹQ9J僆8"g^nwýw T<ޘ`/^.wKݥ~~=[:!g}ԷQ6@*`<᎚͆`́}׽N8%n0 p߿10__w|Gcf@c` 9BO xGS#]9~}_n%70;:c9f y4uCAL'F3 zBMY |#|䩮Nz.U vb3R$_םtw;VWs-n{Dwۓ'Ty&fήM)T໥^5U mq 0bKu} uxT66ZTCX) !Wl"]&.-[4?uSSy(95]~ ` ;*ؾ f@[WJ.>|$OYmud4p-~hX7+f.KVrw/~?Q7*9睭Ef;r y3?o}k\=m*xsȲov>uYgYvJ<t6A НyI捈'?ݟɟ/*V+fR[׾58ǡF+>)yiTWH e _!ZN|yӤ~N~G+:J#CUCbc@lf'|ݫAwB}e@};d>>qU:Zg:0n~,#Nj׏;T6[ZWoi3vq4U966^UtM׊W语AB(bᡰvtMҏ-_3Gz7ٓmخ-OO>oeoe -p?uk5 O1O~eJ?2hiT6tEGn fLJXs{ 466 ;l8tn*O3 @zg ~4| $@'ttL0V?2;;J ?i+^tScmͳ*u݃}bO\X G ߯eݝObu$y I?BZ#8ɷw-WVa :jx󖣲Hy-RuՁ6QS4-/za?MW=dtO9H|<~lSzYF*V)`$AgS`V.Ɂf#Ϥ2?':z+ >{蔀 L0R~`7^WْP?sN;5` HsQ X/G~rCOll8P;y z{ӣ thp䅧șTӎlGɕ<2?UdhAwbGtNAy^~c;6Pׯ.l~`#9X7eTfuU3jOVͷ #*OGދM+BS+k`[Rv9O~- B/KSy]0 m[&rr_41uӳ J <ѷW=2C6ν{H}ʷ0 u=+[>F6nk@j``WTl:5FvO6/gF' ιCe8 8We^J%ْJQ1ZVy@uG0`!#ФNio]|"; ْ  qGa\ΓOBeH|4Yc0o<яn$Syx-38#Ց4Dl0~H3=E?sz;ωcsBԑ²N#\F;V g޽5+S83*$iW>***ʣ!Ոw\@ >L-SC9Ӝj$4gWڼ@}ҁeN"JG;8`TxxfE݅4ZY!#b4@^&LܐfxN;YСwHf ]|G~aYeG14OySھ ]xS?t~l+YY|#?ZS<6ptLK_E+ݘ3#C=71"z.ɰrq}vG;Gv/xo7>˻N<;~tCkpM]S+[RγgɦC$gu>LٞךMW9/+ˏڗ ah~Ơ,[g[xJ1mگ N|<=xjFھSߧ6^ʱ]4{_חʨ/ _K%03@@?^:SƆ)N; Г!rEwdI5oy[ښ oʞM "UOl Ilo[,zMoz!7yW?3:y+vE\:[ ]+"epn?Z 4Z L#W'tef{Cr%lo_tڍ~T3ݗkO-}L@?QrmZljG2ZgL<6ziMZ71+z*6 P|(dī}۟ho!s[dxo3|ia*m$mMɍFuQT^ħ2lie?2mhŴrԔEÝg (Ys8s( ,q!J)=ΕJkas3"І91; Qh2@CC5/kxxtI݉':_* ,*lx+_s;i/~cKp[W^GFyYgrpD^ l:,#z2ѩZ8#<`˕| \穜s6"Jߩk`0E6 l SΞ @W?KK\.*{>BxO>swj O4)giMӁ7_4xe&&D>O<'^@IDAT_o@f,f#Eq֬Eue݉Նia5gee[Y8*W]=ڃNmt1ӄl=y&k ̆Cyu⮮kj73v$2]+Bm&x4>an8mS:5?/֍1sGh6mz+ۑ-U:k6 лu/@ql㱃Gkk#&(?3@HH+A nV9W;; 1.'yz;>ig $ow:,mr$E(\ 2|ٳg?>s(:rZ |f76v8__hAe:N:餶QWudfA>?OmŶyw#Ȇ7;r{6fZ^?(S6t<X0Gfѱ!j5d?|EAyվ^W']ek%/|Bgp.ݡuvdyx329Ǐ=Akhw$$$dc'(ᜬY 8yCYٔ,ΕѦ5۽߭*;k_}+~CzMXJ-ѴQmҳp(O:,`Ni jj"@+p-}f_1%#6P,ی,ɈڣZHCf\_6noLw|ՅgE~lf*znQ}Uψo!5;oRUoH]ߖTLZ ?Rfl7+5znOc5h͔쮬*+G*L2qVkg|ƁRs sy˵FSJJ`a|O*\i8ه^|ld Ӟua'>BG*q TvZ`d181h1@ !_җ @tadRY_rt)8'SO8wd?V6sޱ,C#ib΅2MO/ᬁ #A$A䇞^74 ,f-;(;ZL]o4|˺ƃ:Q^DN@Hy՛$[k{] L x@mUQYDgɨ,}Hx scoZ5r8j~YRu^ھKWc9\[iQ ź/4,Bʑ4"_éJkRfK%.YG{LQy]O;E~exOmusj\φ x]j4 l:9[*v-^eЀT4J44$ u@WtlIɟ9ɁCf.Y0WH1RҁpZLq+M5l>`<ePb!><pHSؐC $~R\҄nD(\&ׁ*VGZKLJ39 oa3>]&G]͘ t_w:z?.[C6_BMe+ڦkТ3n#Y[Gd6A|Ԡ(Yt۬F@clɯ\g=EXՇ$_]ggNG'ݢūbڞ%\?:L)F4'tIrfȁ,'p"@"Cvx9GC:O_();| 䃿pNGYtIK~+,LSm>rw.;^yyTVCo53͔N?ҿچXIκi3@Ty1niuVBӬ7?(;Kc׫ΰ{FkB,\ݿ ]ESu5p\ovM+{+-KnG| ({[z b?j|>]`~mM^z4۪#F012Z9[3,c,y4;P)@a4 Ζ9AѸ'x= oqFfT4ӱO◸l ='4 ![9@"k5M싖Ap?d%; 'P=>1ݎ'A`܎tJ5,tD+tMvBV( J'\xr e43䓇'adX ೭z'0ϣOKc{ú_^ݟ>TtJ+ /;t7:, ::To "Ctjx.dM$ۆjro 6/_GmWuHy$1.:kpn߀kQfy , h] 2ݫp+936eNفѠcARaZ=cek}(A.{LTqr磱-a6v\C|4JZ)~e Ʌ4?NYM<^CYzNlL`'3@}c,vϝO=x/_ݸ^<$>ۇN;V}dq.CW9Y)?,[*/[[YMB+]"տFXr^e1}4,D!FQ73( 
U{]֢!"2"(l 9ػg&(SeO}l:fGil$ ou>A^ ÷2P@[0_,fXw*Pȫjh8WZ8\ti|56 -:,O '>d53 E7r'HC.( th; mBipGx #z~<ǯq[ddC="7]+ aC[͡kx' /vwy .s|? Dk d?"2Iy% [138r~|k}y_W6w'wj=M?Ԟ]SٜZԨlqA+;e.s] e?sW,4V/`u{rI6*tLȣ)i1#3X_щ63ͷ~o:t49~@QMZg^-2Ȧ ydvm w9Xko<"RL6irE^eVNYَ4N@8N/U+JFLF-@NMIkYK #649zJ!ӆ7_8ݤl3o+@+6wsP[8'ݽ_>+kp@ 6>';thww79^w _#d\u|/W gwUƅh{l:]#(6i%aO^rdo8M#:fR9 yOGGXu߻޼G޽}>wU1!ᰚQw]S {q})c;j#}qlQd}~ֵ-fs{7_]Z<]sZ__ mQU_:f6zmkd&m ?%xdNȲ|S W );7?S۠]ܷԌ>u5$k3MZ~Q j]&LDљU-݇P;~WRGowauTK:馼wokPa_Wj7_}ҫqe]izi'ǧ2}@2.<6.#\f챩i5N G+JOņP0y;yp 衁:MO|5Rt 4#EaGeZ>< ٷ 9"ۼ<1,S:ZZl LaY"Ti S(>y6Biyudt.o;v}=#a GY׆|$4cS;C ]06k ~RYz3ҡZCκM;zDo^}K 6wOP;֢@.kt  8 =d˭vQ ~5ܯ:}thuW]}]N\=n7~heܳƠYzԿVu?W`ԡm 僿m~YKP2`p0vHSFZHXq4~foNilU |)ګֆX?O;ŷͨ5#i Yv*ZaMh=}t8YJrDVGxf!/O#-MS+w~襁NG9nѸdktdaG<{ q!v/! ]k ˖tF2I[>wN;A]kw}]_;TACWpRctXn_?} ZT\9N᪚Q܇ b߾p~M[{Czdu6K஺ jZʢݡ7'?Iyeay9WmBku$߾Xڃ& N>3U-ר)X}MArx1OEL m ) K,P t&Z&9A]͡lm98·<[ WH~p8DqElWư~̲{htOr+nxݣ7>9v1DC0$8'n_“RFSB6Ǭ2a7^љ1u3 9ULSB/ߑȜGS sl2,|觬r8rӦ9;v8g/׿}> se:Cd%Dj՚q~^u_V۝r~-:;>\𽫻vxWػ)vFXdmM5|_uA:"An?@ѝ=ܡ7(|솶5ߦAhĎ Nm4 \Kf fռ[6+m 5b6J 6 *lƎ#$54C;h᛺TUN JVdGxNJ'#:;|r)WQ ~9^'YiRV& z7eu?xD$A7qM~C^?)-x\ .H횝B;bGg%#LGhotM/閃[=~J^ 6]& &}ItOēa˶D3򅿖Lq# qm-ۢ3n}[ߤ}w_^]T.hX-r@Gda}ݱG r`wkˮ>u|'}K&3 G׬>pAw#莽Azml :ޘxv~tQmKsA Aɗ VsC>p.!cc>*5Aٰ:4Bg<9`!s~֫{_ϚkQvVq+H[q7S`x>!MAU :18!O7x tHcvgFCCFf;+/NHg2wVAF^|NC=ɚrV"Y=A? _C3sryIzD ?ɗKOq^^u9nI8θŃC_9ZƵ"g @z` [צG>=7g3_jY͋JӖjǫ-Ty`mUiUf&f:{Cj1豛+wٮn+;yt򣓍Vˣ:#.at LaMzAI#=S%i* ~򂓆c<øa9K!27gm CߩD@},6-GE%T P|ʓeSMhـ.G>auf}xRqxekkgxX;#%rYvQ#TV* 5#,Oc+I&dsGU苎Χ55%속W'qB/w, O>HD桍B3I> f3e[-o>ǶBGL?IH9IsxӐ>pLG8 gΈ#'#kF6rr RFAߡ]Hҿ76m~>wa};MQoo:mg~t i@VSzUnl ں^'Gޣ=J5PAD:W) $;Gf*vi!Bmy(?uzXf&G"j`&aℸ=f};c6_%PqnFTl:1Js )p<+M`J0̳ ?z`Ɂ]O6|8hr^ u<#?u#?.+z?;؂?0"27eZR7!i8##ߝiC74Qى.ణCydMIst#^[ w_jl~'w%N~7$_i27/0=zG~ ^?ehZ^{yѶu<Qt*,jaq:}ayM o}V.\n9ybVwp\rn'>/{˦-(ܡgFoȿ9RUΟ5)_ 9&O\dKlqzxqpOv. ,≟qq'G|n*p4pa[^tY@ya|ȵc̶pte$.6w:q{g9mɋ?#%<|#ߎC;Cv4͠-i\;a27>nB#hɗMV 'wOygm.TߨxۿشAwm*iu6oνw@}ŷdӹБAGGs;/o n|Y6:o|U:g_LY|Zo}-r%S۳qCm?97lhTI:QHT&*A:i5 gG;`?(ȝ\bٍ, hr 7KI4䉜_ft@V|g|o=@䍟$d_\p4x[k?YM~7 qC#Z=t m(9ɵ02`#myUv9)e/lw5[G+/AUo`Z^mUsȍe.yl:#P]N%HIs [-h<P0˽OC|f?l=䒋w=}n{#k&d~;ϯ-s+o8*2_?۰V^%FyɭQ?bQX͍ >#5f+}7*ZG6ȨN1؜|h-ܚc:v[!Xf{!1cIv6+}ͼqV!ΒkW*ʩAK[$JsNTv*YtN'MElG4$| W'c|؆}JY#Zt!Ї.țz2a8gkQʞzHyo W~|^|IBdً߳ոM]ƣw->GE)_;F"->"9S{޺{nc#nTy!}ĽNntAoe9 [\i(A>mQ{F\FwKkA{3S~LUѱI ^{~zo_8x[bo6^B!!O Ir.~^S!ߕPhTXt5$骠ʋnOMXҕMB1xfwJ/ivw-ޑ@yx[$O2!Gd8idƒTeFX }<|S~*Jlp0lDfdM]凷cHx+6LWy!a8$'#BorǮC^ytumۨ<_w;_t䧓8vncK~f#ym@(#0"?=m Ϻy7TEh`TR޴ 1>z5Q-M-gdZ XaSUNVJ1M,¢WŁ> S \52rdǏ*s*DA:q^;Vb=kNH'\L`CŸh"GK|5ue6CMHgv忹&H'NO6@H>O"H\~r6$Ӎ-StN -93I8&AEsSo=IKjbYkc~ޛ>ם]ktqS٩=~uLD+60;= YbZ^; sw뜯TRBG0i ]VD8C3uxq2፟ȣL[)WEJ?+ 8F0+E Zed{PEA67,0tFذl9 $Zqmg[LC]ɋUAMozSwg J--'0~-dC8@~ KfkO^/. M8a`s9i ȥQ9p(]E&liYgu1 ~dlF=eBQ`Gti|ǁ.~&GW?w'tR;2X>x uCȴȟrgЗYaٰX[$^%=}GI5K8s ,x'L"/C͓ag^K~C_r?l`.D.z/| 3ꡁ280S.B3q eCßעL uc_pдFC%:D~}c.su|{kq|7]`*NCUT69#'7t68ڗM/wܩCy2$ߞh,=tܦWv˻ mJ$x_mDzU~7]}Օ,p|Y|_9YcIMӸ|َ}͑Oy̪G3r0w~Z4')*FW8WSSD^4xG_ BGd X ^:fz5p Sy+.99c8~G~d!TF:0G vns)!?rzo|{{֝rw?;6SZYqd#|]4"4ë=Ѩ~0P'M&!rK j6˖pedfp %/?7@c(8Zhr&=W1T6p+%/? 
[Undecodable binary payload (image data) from several preceding gammapy-1.3/docs/_static archive members is omitted here; at least one of them was an SVG (trailing "image/svg+xml" marker). The first entries with recoverable headers follow.]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/index_contribute.svg0000644000175100001770000000474014721316200021062 0ustar00runnerdocker[SVG image (image/svg+xml); drawing data omitted]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/index_getting_started.svg0000644000175100001770000000761114721316200022073 0ustar00runnerdocker[SVG image (image/svg+xml); drawing data omitted]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/index_user_guide.svg0000644000175100001770000001443514721316200021041 0ustar00runnerdocker[SVG image (image/svg+xml); drawing data omitted]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/install.png0000644000175100001770000004564714721316200017153 0ustar00runnerdocker[binary PNG image data omitted]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/matomo.js0000644000175100001770000000120014721316200016615 0ustar00runnerdocker// Licensed under a 3-clause BSD style license - see LICENSE.rst
// Matomo Tracking Code
var _paq = window._paq = window._paq || [];
/* tracker methods like "setCustomDimension" should be called before "trackPageView" */
_paq.push(['trackPageView']);
_paq.push(['enableLinkTracking']);
(function() {
    var u="https://apcstatview.in2p3.fr/";
    _paq.push(['setTrackerUrl', u+'matomo.php']);
    _paq.push(['setSiteId', '3']);
    var d=document, g=d.createElement('script'), s=d.getElementsByTagName('script')[0];
    g.async=true; g.src=u+'matomo.js'; s.parentNode.insertBefore(g,s);
})();
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/upgrade.png0000644000175100001770000004524014721316200017131 0ustar00runnerdocker[binary PNG image data omitted]
(}RA_0tT,nB{ $eKRIG@lnAZB KJR] 2i Uʫ;/%l;(͎Fe y ^E Z2:*gzAGe uʫ/#26uޠ\rA@ '3]X{ZAG@ElV!>۽{)A ([e{/tTfESD&P`N.;-a#6ޝAhvnO#B%eۻWJ؜ "ж|.7-a#6}A``՛kJ ~&ry% G?{VJꃁ:h5d̀()#?8?7$PY _Z\C$qU{#?{fJ}jH= Os1 ?nD^ BKxX3LjS[e̵i%O=@ H<>ߞ 3Rw aoHi-S}@ Hd= Zٹ-!f*u!5`V{<@>Oa|S[BϾ% )y6*u<2 @B/"ag`ޚ+ I?Piw/*}u_ !!&- iOx֦ϽBQJx2$h/ ŸE8%`C{|^10k 1~J5/ | u*o<7!ށyK6N_x|Lwל{(x-]> w [gFDp ڋտNŸEk +׺YKJgޔ NWywIghWB8mh ﵗ kNßE[ }}n,]1p ßE(S }@^o;g =YQgQ4 FȀn i0` b0j`F&`j`&("b("l;b}{h{z7?u':zUYN5۰;0 ^/w߸ /{ ^86\kҵا[GL=`-%w PIWе 5V=`.%:.O)?Dg5( ձ+8x#WozGomK orEpVTԅ](C`J(HX[.Ɵ2 VWԅ=(` rԖޞ VTԅ}(ʬw0rF.rT~? +_Z&kN`䎜-DR?z`8SXny;r;rІRKUTw+iLlp#ro%@{kGw;fw(bv d~o'=G\vy{C3 V:.fw q8^ri ^: !wraB*6@aVz,is3$<\39o/ȒlȅۥGY9$Dž2'?tsҘeQ*-@aV&QioTPiMCwͽ-ubn/0eގf+ EI{_jctk2PA #7̌ |_5 4c1kzK{c[#9 zʔj((} U7Αm4~? @p>fO G5Ug>@Y9J! r;@Bm0ڃ5}ֺu_ ŵk5 `41c6B̼fb>=fj(\(ʜwH'ʹ^Yۿ 煓@8ٻN_$f4fu&f|l̶f|w1+TC)@A!F@$=P I;@H_$!-:{zgB^dAoB"p =ak>y1JQ.X6tAn*,Le' vB`:p=ů?]1d C-oiDbm2b: c:L t0gU~À U$ ;(`ΪGA./I wPUNTM_(@,9::@'QXO?y@ t#?4}Nb?N_(@,9\0}N RonѼfbۀ & ͻ(`*DyZŵ5}@7QhEVqtDD;]@D. ;R>x(@4"?LO(ԍ<%. 1sUI;OWz KsUj)Ocvk'$Gn*@5nK3UnۙB(wэxJ|_Ggx"0W53u^ [F<]`x"0Wlϵ3(&'x|W`׃sNI`ٕ/5knK:>7q+W_3XnKi\/Z>7KqLW9aO \՘5@ѥY}Mx|Y`.I~i_7 Ut^t<YLcfD`j6@5~nS s5 ?ÎbD`j\6zA//"7c+4/C&$zj, n/z%P|c x=}2>p*|OC檆^45+}5n [ofjlVCf_P%o Խ4)+!Ucu(߬';D׻_fqYylwLvH81?ݱjA"0i$}SK hS6)"$tDM] AQD4 aF;3gvϹ$r>΍A׭ B{ziڏ.ZtAAiHwܔ z|3]p_fL aof R[я>OoܶtFЛSXp0M|6zH~@פ3°9S;9Fc" zU#?k`=З0z^j;ҐS.Y/Чsz&99>IWd~^Aj' ڼgf5 Df[,'Ry1w6ۙԖQC0nD7>@vwv P/}Z#ZE,'UA`ٮ_B$V;Ѽ}ol6R+ÚŲw&Weq*0@_mgoWvѮ&}a37pffkJb:PJ=]bxZXM*gps_vc# h)nOHlkdűvFyFE=܉> b}a[#H}vb? p?ч'ϋn~.מ BDCEJ|XmDF:vF?HQpRbB"C4)4IAc[DNrA-ʩ-"*Hiիs<>z,AL恍k*z;@e.v |_f/S1ahѻ3u>g 62[t.t֝}K =h\!؋ K;Gԏw e(A]M`hqARM+r 1K)>ig@G:q.ͯ6zfS@_^FTAglmn`*zfB +6;=QtNlp9ޥu^G9>APOoE':ܹMӿ(L:tŵ&9or|ZAg; }= (?*~ :\Im 'Fzg6ێs ~F@POt9rioe˙>>taG|z#:YF2NнvH&IiYW Ncqu.\nގp˃{4:0 J]lA&{́+~(Mзok`q :H6q8k$+]#k`'(09t֩/lpIs=pnc$'%pw$8@5eYC{[1~LѐS%'>zl']JY?@HPh?:I5qxk$1W\pw=s8QM 3Od[G车F2tŵ mz#аKPnD/:Hz5qkJ$^W\нat䐖 Fg rSy^S!YvZ4%tePrLAɡ<й@r"a H\q Cޛѹ?@r?1p'{$ .Mh+ɰ.t=i{Y48 A!-cG pӍ#{⚅2 :F J*:Iw z4w;*A!齈 j/t {$3ng;wÐԃνQ49~h|/.16:;Nso0#Mf^ 3+&:fݻkAp-,b UZ ",# EZT$``-FHI=g9ů=sfg^@ይ еxFzˠs" :m/o9 F6{^NA4״oAkF׿  A0 [Vֻ9 9˒X:᯦C~@_ h056A8mį׈H; XKgtbq)rr4 $ }Et? z'WjRqo@IZhe ,"tJko1_Izz_{J=)RP @$~MuL4[}CG Z EРpG=(_CVDV%y{r݃ h1?Ax΢PKI_M[}C{Hߞ/nvo Bt2wfn@vU] R;6{&xyN&kn%5.Yݏ>K#pĬ3?gCm: I@Xma(;[qwCsg[)3}D;Etϫ^CTK[{cqtMr/G{,BEWt8>AS~ VT{D5B 8q~S1aHbYG PwUݽ#n''"ҽXy'lGk@#-A4uڴUP 6Ժ U/>M7'x^y"}A^ SdBEDk;qa2 4e2ǂ K'M 4Zw ln52z4x , l^Hb5#lbKg?!'8O=ޕпD `;ˉoCdx0^YI<2ۄ,֨_+پ#&X:(O;aߙK2k='Lȃ2l:-p5",m% b^+ӯWEXXUӁf+VypO!fskAf1 ^;~ QtDY]~ ;WB1h "M,,ilB%b@XXPF2i`O-ň`!jgs(cuw@ !i|W*:޷Oj`!"؟tѐT҅=V2gUg{bE윊 r t>cױsZ z5Aa ,]o:ΫH/veO5:kM d,x+vnE:5 Bl45\,gL|#'#'3ϴM욑OϡQV4e>stW0b]Xs/ϲYڴ#ˠ;O2) 5U _XX˱Jw|v ӦL/sQ,y m]_lcmZ%5/2z2?4Ҙ;M-¯pNko8>lލ^Iñ-OP|[ă~dS#`xb{κwGg)3[Ã~gP}q_1Kh`tT̩RK0#6%eʽs{^_& [rƽk={G'ˏ{b;bCx¿0^*\w@wf̹ ev5+6bݽWh=ڣbϔz;WlBlÌ{{O%g F_l{>{CоE^I) G'/{18)W0zI {Rgq@D\bY$.G^v r@ny`\tvŽzZ]t4/{tFeB.˧Cdts½{tF\ Yt0x½ {tFgJn!dMwe+KSEǢkǏc@hɖN0@DSF4V P (EѝЂ'{kVn˷ tAt%:3/à19E7#b0,^*tA~хUݱKUaqADf'ZhuTH J5\\ I$"ll "}.+YD}'4Qy,_ XdFhpJg,ۖ 0iqLH>``YL 8/-\~ݰZv'#NRA$"q;!eٴZV>/ZX厼_,h6ˠe29}~/Ft7U hh˚eβxPYxHkQ@@X,[^k)uYrZ@AXf,;!_҃ /跔o-/hϲrVe^pR:pCc@[, GDҢ?"eIq qss>? 
Rx@IdBd] h\}9Ӂ|% /(ǸȲ|V k,\G(]x@դ|LHYg4º;{Ku~{~CYm9*iVՑ኏-=^PwjZc޼ȼ-&'4('>[>&>F6Vc]x@rSfi׬)T=-}׏eݏm֏Վ Й\Iy77v[ٔ)_:Mq;r(A͎ƶ榿ǚ05MxgT8IENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/using.png0000644000175100001770000002205114721316200016622 0ustar00runnerdockerPNG  IHDRx pHYs+#IDATx1 J@%o Cs<`/{y^<`/{y^<`/{y^<`/{=LaqAA{ وD!*ԈDh6 NDh$($lEl%V~w"T曾y7 nBE`]pc*zy]U<צ{x[U<Op^T~WtC*z򻣻xPXo#Cl6L_k;exPئ1CiU~ "wmބ>W~ Ow-`o^V~< cjpw n7oJ]`.$$gwca%ԭ+5@W uN/zQ4}3~tV )k[H7{U>@Zp>2qN\n5'c#iZnoT 7#i>7· |t i2|JH+Ѽ  I[T >@ZkqIڬ͵* iy>H6}.`x x>J6i.`-/$i3N9iM/}=5x ߞҊ@ԯ *  /YEeK\ ߣ=*  -K:uI<Ri)IK=e÷_!Sqg(f6d2N6L6d2 8r.g>'N|9%h/y||@K7q*ߩh9|Z qj!>?U ZfМ?ߓϕV oj%>!9୮+߮Vq^~P?p8|Z`8mN|Cp1*߱v<K[N|c|Z`?0YKlrQ8`.OZKltWf8`n*߶,}k+>s*߷X5,k->r.+߹X%,yRo]kv_ֵ8=*߻-Oa Jd=Fzt5_T4! Ɋfgy'su?'~0@+o hipR-^?]thcX2C|#!~f}|'!~b|'?3@~U`.*KF݌?0Uf2"~bx|3?73TL`n+N&`/dB&`*O(e/ܭKAa+j4jb0lf o6 f٨X,FET8,xfYVbj,Gy|ٕ{=@ƦR w?FS W?Yw*zSw*zhm*zhEw+zhw+4ѡ j ax̛g@?L߳H$w;Z$pЂe-8hb 1ZiEyg֍/<ez݋ oyࢀ<(Ϲ`M7_s,_dp7ǿW@6eLcOdD)02T7g%x, +NL@R?3Bo]*Û⧀rCg,1<@ex:ੀЈ@uc y0&w;<4Ʀd $P  h..YFe 1hTAaXDADFDDNmo-"(ZF$h`ac3!!}3s_޽;{xy]yPDgGH8ޔ,쎁B;"aJQ蝁C+kFQVg-O?:$`ojvSCÛV埤GkܛtF޴,,}pG~]e<?QvѾN@e'$>z5 y1ި?;10ssB?R3A9?፵OG í/' @埝ya<@c?\Cs0F4DÛ?9}20/,WF4BÛ^?xЫMo=60;Vy(xk'f,#Poz.쒁9bvߢ=C*ፇϞ'fa<@?TIzn`ݍ Q[g ӥ8Ri0@Af>4@)|:V- !&Pk6 ݮ~pd- P?Or1ٹ<@O_Ky^~ǖՆ<)AJv [V\ C0;^6- -VAqt‚ 6Q6"aŬlWɯ 6Wְzgs?>s}dxh!r-x砓޽1}l;᠗]]49~z6'$k!R/~vAO=Z fyDL?(@ 6sAW={T"`H?-會Q<@=zAD Q 0k}"ySg>V؃FE `/c;$PՃ Y:mׅXy< c2yDM?0'@'hAL  +6qs9$:obccA-y$ ~pͿ.uApku}+N:~p76x|__T y kA/L?@ޞ೨Vt#N>_C Eݱm0EQ v:$+xϐiRԦ TH r/9Op6vTc,=u5m8A;uަXo߶ y zW@^=ȫy zW@^=ȫy zW@^=ȫy/!àb7<л=w{@zл=w{@zл=Data Privacy. {% endblock %} ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.136642 gammapy-1.3/docs/api-reference/0000755000175100001770000000000014721316215016054 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/api-reference/analysis.rst0000644000175100001770000000036114721316200020423 0ustar00runnerdocker.. _api_analysis: ******************************* analysis - High level interface ******************************* .. currentmodule:: gammapy.analysis .. automodapi:: gammapy.analysis :no-inheritance-diagram: :include-all-objects: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/api-reference/astro.rst0000644000175100001770000000062014721316200017726 0ustar00runnerdocker.. _api_astro: ******************** astro - Astrophysics ******************** .. currentmodule:: gammapy.astro .. automodapi:: gammapy.astro.darkmatter :no-inheritance-diagram: :include-all-objects: .. automodapi:: gammapy.astro.population :no-inheritance-diagram: :include-all-objects: .. automodapi:: gammapy.astro.source :no-inheritance-diagram: :include-all-objects: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/api-reference/catalog.rst0000644000175100001770000000033414721316200020212 0ustar00runnerdocker.. _api_catalog: ************************* catalog - Source catalogs ************************* .. currentmodule:: gammapy.catalog .. automodapi:: gammapy.catalog :no-inheritance-diagram: :include-all-objects: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/api-reference/data.rst0000644000175100001770000000035014721316200017507 0ustar00runnerdocker.. 

gammapy-1.3/docs/api-reference/data.rst

.. _api_data:

********************************
data - DL3 data and observations
********************************

.. currentmodule:: gammapy.data

.. automodapi:: gammapy.data
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/datasets.rst

.. _api_datasets:

***************************
datasets - Reduced datasets
***************************

.. currentmodule:: gammapy.datasets

.. automodapi:: gammapy.datasets
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.datasets.utils
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/estimators.rst

.. include:: ../references.txt

.. _api_estimators:

**********************************
estimators - High level estimators
**********************************

.. currentmodule:: gammapy.estimators

.. automodapi:: gammapy.estimators
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.estimators.utils
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/index.rst

.. _api-ref:

=============
API reference
=============

This page gives an overview of all public Gammapy objects, functions and
methods. All classes and functions exposed in ``gammapy.*`` namespace are
public.

.. toctree::
    :maxdepth: 2

    data
    irf
    makers
    datasets
    maps
    modeling
    estimators
    analysis
    catalog
    astro
    stats
    scripts
    visualization
    utils

gammapy-1.3/docs/api-reference/irf.rst

.. _api_irf:

***********************************
irf - Instrument response functions
***********************************

.. currentmodule:: gammapy.irf

.. automodapi:: gammapy.irf
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/makers.rst

.. _api_makers:

***********************
makers - Data reduction
***********************

.. automodapi:: gammapy.makers
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.makers.utils
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/maps.rst

.. _api_maps:

***************
maps - Sky maps
***************

.. currentmodule:: gammapy.maps

.. automodapi:: gammapy.maps
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/modeling.rst

.. include:: ../references.txt

.. _api_modeling:

*****************************
modeling - Models and fitting
*****************************

.. currentmodule:: gammapy.modeling

.. automodapi:: gammapy.modeling
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.modeling.models
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.modeling.models.utils
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/scripts.rst

.. _api_CLI:

****************************
scripts - Command line tools
****************************

.. currentmodule:: gammapy.scripts

.. click:: gammapy.scripts.main:cli
    :prog: gammapy
    :show-nested:

gammapy-1.3/docs/api-reference/stats.rst

.. _api_stats:

******************
stats - Statistics
******************

.. currentmodule:: gammapy.stats

.. automodapi:: gammapy.stats
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/utils.rst

.. include:: ../references.txt

.. _api_utils:

*****************
utils - Utilities
*****************

.. currentmodule:: gammapy.utils

.. automodapi:: gammapy.utils.cluster
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.coordinates
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.integrate
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.interpolation
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.fits
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.random
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.regions
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.parallel
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.scripts
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.table
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.testing
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.time
    :no-inheritance-diagram:
    :include-all-objects:

.. automodapi:: gammapy.utils.units
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/api-reference/visualization.rst

.. include:: ../references.txt

.. _api_visualization:

*********************************
visualization - Plotting features
*********************************

.. currentmodule:: gammapy.visualization

.. automodapi:: gammapy.visualization
    :no-inheritance-diagram:
    :include-all-objects:

gammapy-1.3/docs/binder/

gammapy-1.3/docs/binder/requirements.txt

# binder install requirements
# for releases this should be a given version
git+https://github.com/gammapy/gammapy#egg=gammapy

gammapy-1.3/docs/binder/runtime.txt

python-3.9

gammapy-1.3/docs/conf.py

# -*- coding: utf-8 -*-
# Licensed under a 3-clause BSD style license - see LICENSE.rst
#
# Gammapy documentation build configuration file.
#
# This file is execfile()d with the current directory set to its containing dir.
#
# Note that not all possible configuration values are present in this file.
#
# All configuration values have a default. Some values are defined in
# the global Astropy configuration which is loaded here before anything else.
# See astropy.sphinx.conf for which values are set there.

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
# sys.path.insert(0, os.path.abspath('..'))
# IMPORTANT: the above commented section was generated by sphinx-quickstart, but
# is *NOT* appropriate for astropy or Astropy affiliated packages. It is left
# commented out with this explanation to make it clear why this should not be
# done. If the sys.path entry above is added, when the astropy.sphinx.conf
# import occurs, it will import the *source* version of astropy instead of the
# version installed (if invoked as "make html" or directly with sphinx), or the
# version in the build directory (if "python setup.py build_sphinx" is used).
# Thus, any C-extensions that are needed to build the documentation will *not*
# be accessible, and the documentation will not build correctly.
import datetime
import sys
import os

# Get configuration information from setup.cfg
from configparser import ConfigParser
from pkg_resources import get_distribution

# Load all the global Astropy configuration
from sphinx_astropy.conf import *

# Sphinx-gallery config
from sphinx_gallery.sorting import ExplicitOrder

# Load utils docs functions
from gammapy.utils.docs import SubstitutionCodeBlock, gammapy_sphinx_ext_activate

# flake8: noqa


# Add our custom directives to Sphinx
def setup(app):
    """Add the custom directives to Sphinx."""
    app.add_config_value("substitutions", [], "html")
    app.add_directive("substitution-code-block", SubstitutionCodeBlock)
""" app.add_config_value("substitutions", [], "html") app.add_directive("substitution-code-block", SubstitutionCodeBlock) conf = ConfigParser() conf.read([os.path.join(os.path.dirname(__file__), "..", "setup.cfg")]) setup_cfg = dict(conf.items("metadata")) sys.path.insert(0, os.path.dirname(__file__)) linkcheck_anchors_ignore = [] linkcheck_ignore = [ "http://gamma-sky.net/#", "https://bitbucket.org/hess_software/hess-open-source-tools/src/master/", "https://forge.in2p3.fr/projects/data-challenge-1-dc-1/wiki", "https://indico.cta-observatory.org/event/2070/", "https://data.hawc-observatory.org/datasets/3hwc-survey/index.php", "https://github.com/gammapy/gammapy#status-shields", "https://groups.google.com/forum/#!forum/astropy-dev", "https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro", "https://getbootstrap.com/css/#tables", "https://www.hawc-observatory.org/", # invalid certificate "https://ipython.org", # invalid certificate "https://jupyter.org", # invalid certificate "https://hess-confluence.desy.de/confluence/display/HESS/HESS+FITS+data", # private page "https://hess-confluence.desy.de/" ] # the buttons link to html pages which are auto-generated... linkcheck_exclude_documents = [r"getting-started/.*"] # -- General configuration ---------------------------------------------------- # By default, highlight as Python 3. highlight_language = "python3" # Matplotlib directive sets whether to show a link to the source in HTML plot_html_show_source_link = False # If true, figures, tables and code-blocks are automatically numbered if they have a caption numfig = False # If your documentation needs a minimal Sphinx version, state it here. # needs_sphinx = "1.1" # We currently want to link to the latest development version of the astropy docs, # so we override the `intersphinx_mapping` entry pointing to the stable docs version # that is listed in `astropy/sphinx/conf.py`. intersphinx_mapping.pop("h5py", None) intersphinx_mapping["matplotlib"] = ("https://matplotlib.org/", None) intersphinx_mapping["astropy"] = ("http://docs.astropy.org/en/latest/", None) intersphinx_mapping["regions"] = ( "https://astropy-regions.readthedocs.io/en/latest/", None, ) intersphinx_mapping["reproject"] = ("https://reproject.readthedocs.io/en/latest/", None) intersphinx_mapping["naima"] = ("https://naima.readthedocs.io/en/latest/", None) intersphinx_mapping["gadf"] = ( "https://gamma-astro-data-formats.readthedocs.io/en/latest/", None, ) intersphinx_mapping["iminuit"] = ("https://iminuit.readthedocs.io/en/latest/", None) intersphinx_mapping["pandas"] = ("https://pandas.pydata.org/pandas-docs/stable/", None) # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. #exclude_patterns.append("_templates") exclude_patterns.append("**.ipynb_checkpoints") exclude_patterns.append("user-guide/model-gallery/*/*.ipynb") exclude_patterns.append("user-guide/model-gallery/*/*.md5") exclude_patterns.append("user-guide/model-gallery/*/*.py") extensions.extend( [ "sphinx_click.ext", "sphinx.ext.mathjax", "sphinx_gallery.gen_gallery", "sphinx.ext.doctest", "sphinx_design", "sphinx_copybutton", "sphinx_automodapi.smart_resolver", ] ) nbsphinx_execute = "never" copybutton_prompt_text = r">>> |\.\.\. |\$ |In \[\d*\]: | {2,5}\.\.\.: | {5,8}: " copybutton_prompt_is_regexp = True # -- # This is added to the end of RST files - a good place to put substitutions to # be used globally. rst_epilog += """ .. 
|Table| replace:: :class:`~astropy.table.Table` """ # This is added to keep the links to PRs in release notes changelog_links_docpattern = [".*changelog.*", "whatsnew/.*", "release-notes/.*"] # -- Project information ------------------------------------------------------ # This does not *have* to match the package name, but typically does project = setup_cfg["name"] author = setup_cfg["author"] copyright = "{}, {}".format(datetime.datetime.now().year, setup_cfg["author"]) version = get_distribution(project).version release = "X.Y.Z" switch_version = version if "dev" in version: switch_version = "dev" else: release = version substitutions = [ ("|release|", release), ] # -- Options for HTML output --------------------------------------------------- # A NOTE ON HTML THEMES # The global astropy configuration uses a custom theme, "bootstrap-astropy", # which is installed along with astropy. A different theme can be used or # the options for this theme can be modified by overriding some # variables set in the global configuration. The variables set in the # global configuration are listed below, commented out. # Add any paths that contain custom themes here, relative to this directory. # To use a different custom theme, add the directory containing the theme. # html_theme_path = [] # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. To override the custom theme, set this to the # name of a builtin theme or the name of a custom theme in html_theme_path. html_theme = "pydata_sphinx_theme" # Static files to copy after template files html_static_path = ["_static"] html_css_files = ["custom.css"] html_js_files = ["matomo.js"] templates_path = ["_templates"] # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. html_logo = os.path.join(html_static_path[0], "gammapy_logo_nav.png") html_favicon = os.path.join(html_static_path[0], "gammapy_logo.ico") # Custom sidebar templates, maps document names to template names. html_sidebars = { "search": ["search-field.html"], "navigation": ["sidebar-nav-bs.html"], } # If not "", a "Last updated on:" timestamp is inserted at every page bottom, # using the given strftime format. # html_last_updated_fmt = "" # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". html_title = "{} v{}".format(project, release) # Output file base name for HTML help builder. htmlhelp_basename = f"{project}doc" html_theme_options = { "header_links_before_dropdown": 6, "collapse_navigation": True, "navigation_depth": 2, "show_prev_next": False, # links in menu "icon_links": [ { "name": "Github", "url": "https://github.com/gammapy/gammapy", "icon": "fab fa-github-square", }, { "name": "Twitter", "url": "https://twitter.com/gammapyST", "icon": "fab fa-square-x-twitter", }, { "name": "Slack", "url": "https://gammapy.slack.com/", "icon": "fab fa-slack", }, ], "switcher": { "json_url": "https://docs.gammapy.org/stable/switcher.json", "version_match": switch_version, }, "navbar_end": ["version-switcher", "navbar-icon-links"], "navigation_with_keys": True, # footers "footer_start": ["copyright","custom-footer.html"], "footer_center": ["last-updated"], "footer_end": ["sphinx-version", "theme-version"] } gammapy_sphinx_ext_activate() # -- Options for LaTeX output -------------------------------------------------- # Grouping the document tree into LaTeX files. 
List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [ ("index", f"{project}.tex", f"{project} Documentation", author, "manual") ] # -- Options for manual page output -------------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [("index", project.lower(), f"{project} Documentation", [author], 1)] # -- Other options -- github_issues_url = "https://github.com/gammapy/gammapy/issues/" # http://sphinx-automodapi.readthedocs.io/en/latest/automodapi.html # show inherited members for classes automodsumm_inherited_members = True # In `about.rst` and `references.rst` we are giving lists of citations # (e.g. papers using Gammapy) that partly aren"t referenced from anywhere # in the Gammapy docs. This is normal, but Sphinx emits a warning. # The following config option suppresses the warning. # http://www.sphinx-doc.org/en/stable/rest.html#citations # http://www.sphinx-doc.org/en/stable/config.html#confval-suppress_warnings suppress_warnings = ["ref.citation"] branch = "main" if switch_version == "dev" else f"v{switch_version}" binder_config = { # Required keys "org": "gammapy", "repo": "gammapy-webpage", "branch": branch, # Can be any branch, tag, or commit hash. Use a branch that hosts your docs. "binderhub_url": "https://mybinder.org", # Any URL of a binderhub deployment. Must be full URL (e.g. https://mybinder.org). "dependencies": "./binder/requirements.txt", "notebooks_dir": f"notebooks/{switch_version}", "use_jupyter_lab": True, } # nitpicky = True sphinx_gallery_conf = { "examples_dirs": [ "../examples/models", "../examples/tutorials", ], # path to your example scripts "gallery_dirs": [ "user-guide/model-gallery", "tutorials", ], # path to where to save gallery generated output "subsection_order": ExplicitOrder( [ "../examples/models/spatial", "../examples/models/spectral", "../examples/models/temporal", "../examples/tutorials/starting", "../examples/tutorials/data", "../examples/tutorials/analysis-1d", "../examples/tutorials/analysis-2d", "../examples/tutorials/analysis-3d", "../examples/tutorials/analysis-time", "../examples/tutorials/api", "../examples/tutorials/scripts", ] ), "binder": binder_config, "backreferences_dir": "gen_modules/backreferences", "doc_module": ("gammapy",), "exclude_implicit_doc": {}, "filename_pattern": r"\.py", "reset_modules": ("matplotlib",), "within_subsection_order": "sphinxext.TutorialExplicitOrder", "download_all_examples": True, "capture_repr": ("_repr_html_", "__repr__"), "nested_sections": False, "min_reported_time": 10, "show_memory": False, "line_numbers": False, "reference_url": { # The module you locally document uses None "gammapy": None, }, } html_context = { "default_mode": "light", } ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.140642 gammapy-1.3/docs/development/0000755000175100001770000000000014721316215015671 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/dependencies.rst0000644000175100001770000000435014721316200021045 0ustar00runnerdocker.. include:: ../references.txt .. _install-dependencies: Dependencies ============ Gammapy is a Python package built on Numpy, Scipy and Astropy, as well as a few other required dependencies. For certain functionality, optional dependencies are used. 

gammapy-1.3/docs/development/

gammapy-1.3/docs/development/dependencies.rst

.. include:: ../references.txt

.. _install-dependencies:

Dependencies
============

Gammapy is a Python package built on Numpy, Scipy and Astropy, as well as a
few other required dependencies. For certain functionality, optional
dependencies are used.

The recommended way to install Gammapy is via a conda environment which
includes all required and optional dependencies (see :ref:`installation`).

Note that when you install Gammapy with conda (or actually any alternative
distribution channel), you have a full package manager at your fingertips.
You can conda or pip install any extra Python package you like (e.g.
``pip install pyjokes``), upgrade or downgrade packages to other versions
(very rarely needed) or uninstall any package you don't like (almost never
useful, unless you run out of disk space).

Gammapy itself, and most analyses, work on Windows. However, two optional
dependencies don't support Windows yet: Sherpa (an optional fitting backend)
and healpy (needed to work with HEALPix maps, which is common for all-sky
analyses).

Required dependencies
---------------------

Required dependencies are automatically installed when using e.g.
``conda install gammapy -c conda-forge`` or ``pip install gammapy``.

* numpy_ - array and math functions
* scipy_ - numerical methods (interpolation, integration, convolution)
* Astropy_ - core package for Astronomy in Python
* regions_ - Astropy sky region package
* matplotlib_ for plotting
* iminuit_ for fitting by optimization
* click_ - used for the ``gammapy`` command line tool
* PyYAML_ - support for YAML_ format (config and results files)
* pydantic_ - support config file validation

Optional dependencies
---------------------

The optional dependencies listed here are the packages listed in the conda
environment specification (see :ref:`installation`). This is a mix of packages
that make it convenient to use Gammapy (e.g. ``ipython`` or ``jupyter``), or
that add extra functionality (e.g. ``matplotlib`` to make plots, ``naima`` for
physical SED modeling).

* ipython_, jupyter_ and jupyterlab_ for interactive analysis
* pandas_ for working with tables (not used within Gammapy)
* healpy_ for `HEALPIX`_ data handling
* Sherpa_ for modeling and fitting
* naima_ for SED modeling

gammapy-1.3/docs/development/dev_howto.rst

.. include:: ../references.txt

.. _dev_howto:

****************
Developer How To
****************

General conventions
-------------------

Python version support
++++++++++++++++++++++

In Gammapy we currently support Python 3.8 or later.

Coordinate and axis names
+++++++++++++++++++++++++

In Gammapy, the following coordinate and axis names should be used.
This applies to most of the code, ranging from IRFs to maps to sky models,
for function parameters and variable names.

* ``time`` - time
* ``energy`` - energy
* ``energy_true`` - true energy
* ``ra``, ``dec`` - sky coordinates, ``radec`` frame (i.e. ``icrs`` to be precise)
* ``glon``, ``glat`` - sky coordinates, ``galactic`` frame
* ``az``, ``alt`` - sky coordinates, ``altaz`` frame
* ``lon``, ``lat`` for spherical coordinates that aren't in a specific frame.

For angular sky separation angles:

* ``psf_theta`` - offset wrt. PSF center position
* ``fov_theta`` - offset wrt. field of view (FOV) center
* ``theta`` - when no PSF is involved, e.g. to evaluate spatial sky models

For the general case of FOV coordinates that depend on angular orientation of
the FOV coordinate frame:

* ``fov_{frame}_lon``, ``fov_{frame}_lat`` - field of view coordinates
* ``fov_theta``, ``fov_{frame}_phi`` - field of view polar coordinates

where ``{frame}`` can be one of ``radec``, ``galactic`` or ``altaz``,
depending on with which frame the FOV coordinate frame is aligned.

Notes:

* In cases where it's unclear if the value is for true or reconstructed event
  parameters, a postfix ``_true`` or ``_reco`` should be added. In Gammapy,
  this mostly occurs for ``energy_true`` and ``energy_reco``, e.g. the
  background IRF has an axis ``energy_reco``, but effective area usually
  ``energy_true``, and energy dispersion has both axes. We are not pedantic
  about adding ``_true`` and ``_reco`` everywhere. Note that this would
  quickly become annoying (e.g. source models use true parameters, and it's
  not clear why one should write ``ra_true``). E.g. the property on the event
  list ``energy`` matches the ``ENERGY`` column from the event list table,
  which is for real data always reco energy.
* Currently, no sky frames centered on the source, or non-radially symmetric
  PSFs are in use, and thus the case of "source frames" that have to be with
  a well-defined alignment, like we have for the "FOV frames" above, doesn't
  occur and thus doesn't need to be defined yet (but it would be natural to
  use the same naming convention as for FOV if it eventually does occur).
* These definitions are mostly in agreement with the `Gamma Astro Data Formats`_
  specifications. We do not achieve 100% consistency everywhere in the spec
  and Gammapy code. Achieving this seems unrealistic, because legacy formats
  have to be supported, we are not starting from scratch and have time to
  make all formats consistent. Our strategy is to do renames on I/O where
  needed, to and from the internal Gammapy names defined here, to the names
  used in the formats. Of course, where formats are not set in stone yet, we
  advocate and encourage the use of the names chosen here.
* Finally, CTAO has proposed a full data model associated to the needed
  quantities. And the community is working to create a data format for
  very-high-energy data produced by gamma-ray and neutrino experiments within
  the open initiative `Very-high-energy Open Data Format`_. This format aims
  to respect the CTAO data model, to respect the `FAIR principles`_ and to
  follow as much as possible the `IVOA`_ recommendations. In order to handle
  past, current and old formats, Gammapy should follow the `FAIR4RS
  principles`_ by proposing a user-friendly interface.
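
As a purely illustrative sketch of how these names show up in practice —
assuming the `~gammapy.maps.MapAxis` API and its ``from_energy_bounds``
constructor, and that the default axis name is ``"energy"`` — the axis *name*
is what distinguishes a true-energy axis from a reconstructed-energy one:

.. code-block:: python

    from gammapy.maps import MapAxis

    # Axis for a quantity in true energy (e.g. effective area):
    # the name "energy_true" encodes the convention.
    energy_axis_true = MapAxis.from_energy_bounds(
        "1 TeV", "10 TeV", nbin=5, name="energy_true"
    )

    # Axis for reconstructed energy (e.g. counts) uses the plain name "energy".
    energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=5)

    print(energy_axis_true.name, energy_axis.name)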

Clobber or overwrite?
+++++++++++++++++++++

In Gammapy we consistently use an ``overwrite`` bool option for
`gammapy.scripts` and functions that write to files. This is in line with
Astropy, which had a mix of ``clobber`` and ``overwrite`` in the past, and
has switched to uniform ``overwrite`` everywhere.

The default value should be ``overwrite=False``, although we note that this
decision was very controversial, several core developers would prefer to use
``overwrite=True``. For discussion on this, see GH 1396.
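
A minimal sketch of what this convention looks like in a file-writing
function (plain Python; the function name and its contents are hypothetical,
only the ``overwrite`` handling is the point):

.. code-block:: python

    from pathlib import Path

    def write_results(results, filename, overwrite=False):
        """Write results to ``filename``.

        Parameters
        ----------
        results : str
            Content to write.
        filename : str or `~pathlib.Path`
            Output file name.
        overwrite : bool, optional
            Overwrite an existing file? Default is False.
        """
        path = Path(filename)
        # Refuse to clobber an existing file unless the caller opted in.
        if path.exists() and not overwrite:
            raise OSError(f"File exists: {path} (use overwrite=True)")
        path.write_text(results)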

Pixel coordinate convention
+++++++++++++++++++++++++++

All code in Gammapy should follow the Astropy pixel coordinate convention
that the center of the first pixel has pixel coordinates ``(0, 0)`` (and not
``(1, 1)`` as shown e.g. in ds9). You should use ``origin=0`` when calling
any of the pixel to world or world to pixel coordinate transformations in
`astropy.wcs`.
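
For example, with a small dummy WCS (the coordinate values below are made up,
only the ``origin=0`` argument matters here):

.. code-block:: python

    from astropy.wcs import WCS

    # Minimal dummy WCS, just to illustrate the convention.
    wcs = WCS(naxis=2)
    wcs.wcs.ctype = ["RA---TAN", "DEC--TAN"]
    wcs.wcs.crpix = [1.0, 1.0]   # FITS CRPIX is 1-based by definition
    wcs.wcs.cdelt = [-0.1, 0.1]
    wcs.wcs.crval = [83.63, 22.01]

    # origin=0: pixel (0, 0) is the center of the first pixel,
    # which here maps back to the reference value CRVAL.
    lon, lat = wcs.wcs_pix2world(0, 0, 0)
    print(lon, lat)  # 83.63 22.01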

BSD or GPL license?
+++++++++++++++++++

Gammapy is BSD licensed (same license as Numpy, Scipy, Matplotlib,
scikit-image, Astropy, photutils, yt, ...).

We prefer this over the GPL3 or LGPL license because it means that the
packages we are most likely to share code with have the same license, e.g. we
can take a function or class and "upstream" it, i.e. contribute it e.g. to
Astropy or Scipy if it's generally useful.

Some optional dependencies of Gammapy (i.e. other packages like Sherpa or
Gammalib or ROOT that we import in some places) are GPL3 or LGPL licensed.

Now the GPL3 and LGPL licenses contain clauses that other packages that copy
or modify them must be released under the same license. We take the
standpoint that Gammapy is independent of these libraries, because we don't
copy or modify them. This is a common standpoint, e.g. ``astropy.wcs`` is BSD
licensed, but uses the LGPL-licensed WCSLib.

Note that if you distribute Gammapy together with one of the GPL
dependencies, the whole distribution then falls under the GPL license.

How to write code
-----------------

Where should I import from?
+++++++++++++++++++++++++++

You should import from the "end-user namespaces", not the "implementation
module".

.. testcode::

    from gammapy.data import EventList  # good
    from gammapy.data.event_list import EventList  # bad

    from gammapy.stats import cash  # good
    from gammapy.stats.fit_statistics import cash  # bad

The end-user namespace is the location that is shown in the API docs, i.e.
you can use the Sphinx full-text search to quickly find it.

To make code maintenance easier, the implementation of the functions and
classes is spread across multiple modules (``.py`` files), but the user
shouldn't care about their names, that would be too much to remember.

The only reason to import from a module directly is if you need to access a
private function, class or variable (something that is not listed in
``__all__`` and thus not imported into the end-user namespace).

Note that this means that in the definition of an "end-user namespace", e.g.
in the ``gammapy/data/__init__.py`` file, the imports have to be sorted in a
way such that modules in ``gammapy/data`` are loaded when imported from other
modules in that sub-package.

Functions returning several values
++++++++++++++++++++++++++++++++++

It is up to the developer to decide how to return multiple things from
functions and methods. For up to three things, if callers will usually want
access to several things, using a ``tuple`` or ``collections.namedtuple`` is
ok. For three or more things, using a Python ``dict`` instead should be
preferred.
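
A toy sketch of the ``dict`` variant (the function name and quantities are
made up for illustration):

.. code-block:: python

    def fit_statistics(data, model):
        """Toy example: return several related quantities as a dict."""
        residuals = [d - m for d, m in zip(data, model)]
        return {
            "residuals": residuals,
            "sum": sum(residuals),
            "max": max(residuals),
        }

    # Callers pick out what they need by key, which stays readable
    # even if more entries are added later.
    result = fit_statistics(data=[1, 2, 3], model=[1, 1, 1])
    print(result["sum"], result["max"])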

What checks and conversions should I do for inputs?
+++++++++++++++++++++++++++++++++++++++++++++++++++

In Gammapy we assume that "we're all consenting adults", which means that
when you write a function you should write it like this:

.. testcode::

    def do_something(data, option):
        """Do something.

        Parameters
        ----------
        data : `numpy.ndarray`
            Data
        option : {'this', 'that'}
            Option
        """
        if option == 'this':
            out = 3 * data
        elif option == 'that':
            out = data ** 5
        else:
            raise ValueError('Invalid option: {}'.format(option))

        return out

* **Don't always add `isinstance` checks for everything** ... assume the
  caller passes valid inputs, ... in the example above this is not needed::

      assert isinstance(option, str)

* **Don't always add `numpy.asanyarray` calls for all array-like inputs** ...
  the caller can do this if it's really needed ... in the example above
  document ``data`` as type `~numpy.ndarray` instead of array-like and don't
  put this line::

      data = np.asanyarray(data)

* **Do always add an `else` clause to your `if`-`elif` clauses** ... this is
  boilerplate code, but not adding it would mean users get this error if they
  pass an invalid option::

      UnboundLocalError: local variable 'out' referenced before assignment

Now if you really want, you can add the `numpy.asanyarray` and `isinstance`
checks for functions that end-users might often use for interactive work to
provide them with better exception messages, but doing it everywhere would
mean 1000s of lines of boilerplate code and take the fun out of Python
programming.

Float data type: 32 bit or 64 bit?
++++++++++++++++++++++++++++++++++

Most of the time what we want is to use 32 bit to store data on disk and 64
bit to do computations in memory.

Using 64 bit to store data and results (e.g. large images or cubes) on disk
would mean a factor ~2 increase in file sizes and slower I/O, but I'm not
aware of any case where we need that precision.

On the other hand, doing computations with millions and billions of pixels
very frequently results in inaccurate results ... e.g. the likelihood is the
sum over per-pixel likelihoods and using 32-bit will usually result in
erratic and hard-to-debug optimizer behaviour, and even if the fit works, in
incorrect results.

Now you shouldn't put this line at the top of every function ... assume the
caller passes 64-bit data::

    data = np.asanyarray(data, dtype='float64')

But you should add explicit type conversions to 64 bit when reading float
data from files and explicit type conversions to 32 bit before writing to
file.
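
In code, the intended pattern looks roughly like this (pure numpy sketch;
the arrays merely stand in for data read from and written to file):

.. code-block:: python

    import numpy as np

    # On read: convert to 64 bit before doing any computation ...
    data = np.ones((3, 3), dtype="float32")   # stand-in for data loaded from disk
    data = data.astype("float64")
    result = (data ** 5).sum()                # computed in 64 bit

    # ... and on write: convert back to 32 bit to keep files small.
    out = np.full((3, 3), result, dtype="float64")
    out32 = out.astype("float32")             # this is what would go to disk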

.. _dev_random:

How to use random numbers
-------------------------

All functions that need to call a random number generator should take a
``random_state`` input parameter and call the
`~gammapy.utils.random.get_random_state` utility function like this (you can
copy & paste the three docstring lines and the first code line to the
function you're writing):

.. testcode::

    from gammapy.utils.random import get_random_state

    def make_random_stuff(X, random_state='random-seed'):
        """...

        Parameters
        ----------
        random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}
            Defines random number generator initialisation.
            Passed to `~gammapy.utils.random.get_random_state`.
        """
        random_state = get_random_state(random_state)
        data = random_state.uniform(low=0, high=3, size=10)
        return data

This allows callers flexible control over which random number generator (i.e.
which `numpy.random.RandomState` instance) is used and how it's initialised.
The default ``random_state='random-seed'`` means "create a new RNG, seed it
randomly", i.e. different random numbers will be generated on every call.

There are a few ways to get deterministic results from a script that calls
functions that generate random numbers.

One option is to create a single `~numpy.random.RandomState` object seeded
with an integer and then pass that ``random_state`` object to every function
that generates random numbers:

.. code-block:: python

    from numpy.random import RandomState

    random_state = RandomState(seed=0)

    stuff1 = make_some_random_stuff(random_state=random_state)
    stuff2 = make_more_random_stuff(random_state=random_state)

Another option is to pass an integer seed to every function that generates
random numbers:

.. code-block:: python

    seed = 0
    stuff1 = make_some_random_stuff(random_state=seed)
    stuff2 = make_more_random_stuff(random_state=seed)

This pattern was inspired by the way scikit-learn handles random numbers. We
have changed the ``None`` option of ``sklearn.utils.check_random_state`` to
``'global-rng'``, because we felt that this meaning for ``None`` was
confusing given that `numpy.random.RandomState` uses a different meaning (for
which we use the option ``'global-rng'``).

How to use logging
------------------

Gammapy is a library. This means that it should never contain print
statements, because with print statements the library users have no easy way
to configure where the print output goes (e.g. to ``stdout`` or ``stderr`` or
a log file) and what the log level (``warning``, ``info``, ``debug``) and
format is (e.g. include timestamp and log level?).

So logging is much better than printing. But also logging is only rarely
needed. Many developers use print or log statements to debug some piece of
code while they write it. Once it's written and works, it's rare that callers
want it to be chatty and log messages all the time. Print and log statements
should mostly be contained in end-user scripts that use Gammapy, not in
Gammapy itself.

That said, there are cases where emitting log messages can be useful. E.g. a
long-running algorithm with many steps can log info or debug statements. In a
function that reads and writes several files it can make sense to include
info log messages for normal operation, and warning or error log messages
when something goes wrong. Also, command line tools that are included in
Gammapy **should** contain log messages, informing the user about what they
are doing.

Gammapy uses the Python standard library `logging` module. This module is
extremely flexible, but also quite complex. But our logging needs are very
modest, so it's actually quite simple ...

It is worth mentioning that important logs returned to the user should be
captured and tested using the ``caplog`` fixture, see the section *Caplog
fixture* below.

Generating log messages
+++++++++++++++++++++++

To generate log messages from any file in Gammapy, include these two lines at
the top:

.. testcode::

    import logging

    log = logging.getLogger(__name__)

This creates a module-level `logging.Logger` object called ``log``, and you
can then create log messages like this from any function or method:

.. testcode::

    def process_lots_of_data(infile, outfile):
        log.info('Starting processing data ...')

        # do lots of work

        log.info('Writing {}'.format(outfile))

You should never log messages from the module level (i.e. on import) or
configure the log level or format in Gammapy, that should be left to callers
... except from command line tools ...
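
To make the division of responsibilities concrete, this is roughly what an
end-user script (not Gammapy library code!) would do to configure logging,
using only the standard library:

.. code-block:: python

    import logging

    # The *caller* decides where log messages go, which level is shown
    # and how messages are formatted.
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(name)s - %(message)s",
    )

    log = logging.getLogger(__name__)
    log.info("Starting analysis ...")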

Interpolation and extrapolation
-------------------------------

In Gammapy, we use interpolation a lot, e.g. to evaluate instrument response
functions (IRFs) on data grids, or to reproject diffuse models on data grids.

The default interpolator we use is
`scipy.interpolate.RegularGridInterpolator` because it's fast and robust
(more fancy interpolation schemes can lead to unstable response in some
cases, so more careful checking across all parameter space would be needed).

You should use this pattern to implement a function or method that does
interpolation:

.. code-block:: python

    def do_something(..., interp_kwargs=None):
        """Do something.

        Parameters
        ----------
        interp_kwargs : dict or None
            Interpolation parameter dict passed to
            `scipy.interpolate.RegularGridInterpolator`. If you pass ``None``,
            the default ``interp_params=dict(bounds_error=False)`` is used.
        """
        if not interp_kwargs:
            interp_kwargs = dict(bounds_error=False)

        interpolator = RegularGridInterpolator(..., **interp_kwargs)

Since the other defaults are ``method='linear'`` and ``fill_value=nan``, this
implies that linear interpolation is used and `NaN`_ values are returned for
points outside the interpolation domain. This is a compromise between the
alternatives:

* ``bounds_error=True`` -- Very "safe", refuse to return results for any
  points if one of the points is outside the valid domain. Can be annoying
  for the caller to not get any result.
* ``bounds_error=False, fill_value=nan`` -- Medium "safe". Always return a
  result, but put NaN values to make it easy for analysers to spot that
  there's an issue in their results (if pixels with NaN are used, that will
  usually lead to NaN values in high level analysis results).
* ``bounds_error=False, fill_value=0`` -- Less "safe". Extrapolate with zero.
  Can be very convenient for the caller to avoid dealing with NaN, but if the
  data values can also be zero you will lose track of invalid pixels.
* ``bounds_error=False, fill_value=None`` -- "Unsafe". If fill_value is None,
  values outside the domain are extrapolated. Can lead to errors where e.g.
  stacked high level analysis results aren't quite correct because IRFs or
  background models or ... were used outside their valid range.

Methods that use interpolation should provide an option to the caller to pass
interpolation options on to ``RegularGridInterpolator`` in case the default
behaviour doesn't suit the application.
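
The effect of the default ``bounds_error=False, fill_value=nan`` combination
can be seen in a small self-contained example:

.. code-block:: python

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    x = np.array([0.0, 1.0, 2.0])
    y = np.array([0.0, 1.0])
    values = x[:, np.newaxis] + y[np.newaxis, :]

    interpolator = RegularGridInterpolator((x, y), values, bounds_error=False)

    # Inside the domain: linear interpolation.
    print(interpolator([[0.5, 0.5]]))  # [1.]

    # Outside the domain: NaN is returned instead of raising an error.
    print(interpolator([[5.0, 0.5]]))  # [nan]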
How to make a pull request -------------------------- Making a pull request with new or modified datasets +++++++++++++++++++++++++++++++++++++++++++++++++++ Datasets used in tests are hosted in the `gammapy-data <https://github.com/gammapy/gammapy-data>`__ GitHub repository. It is recommended that developers have the ``$GAMMAPY_DATA`` environment variable pointing to the local folder where they have fetched that repository, so they can push and pull any modifications to its content. Making a pull request which skips GitHub Actions ++++++++++++++++++++++++++++++++++++++++++++++++ For minor PRs (e.g. correcting typos in docstrings) we can skip GitHub Actions. Adding ``[ci skip]`` to a commit message will skip CI for that specific commit, which can be useful for a draft or incomplete PR. For details, see the GitHub Actions documentation on skipping workflow runs. Fix non-Unix line endings +++++++++++++++++++++++++ In the past we had non-Unix (i.e. Mac or Windows) line endings in some files. This can be painful, e.g. git diff and autopep8 behave strangely. Here are the commands to check for and fix this: .. code-block:: bash git clean -fdx find . -type f -print0 | xargs -0 -n 1 -P 4 dos2unix -c mac find . -type f -print0 | xargs -0 -n 1 -P 4 dos2unix -c ascii git status cd astropy_helpers && git checkout -- . && cd .. Making a pull request that requires backport ++++++++++++++++++++++++++++++++++++++++++++ Some PRs will need to be backported to specific maintenance branches, for example bug fixes or corrections of typos in doc-strings. There are a few ways this can be done, one of which involves the meeseeksmachine bot. 1. Add the backport label to automatically backport your PR, e.g. "``backport-v1.1.x``" 2. If you forgot, on your merged PR make the comment: "``@meeseeksdev backport to [BRANCHNAME]``" 3. If this does not work automatically, a set of instructions will be given to you as a comment in the PR to be backported. This involves using the "``cherry-pick``" git command; see the git documentation for more information. Release notes +++++++++++++ In Gammapy we keep :ref:`release_notes` with a list of pull requests. We sort by release and within the release by PR number (the largest first). As explained in the :ref:`astropy:changelog-format` section in the Astropy docs, there are (at least) two approaches for adding to the releases, each with pros and cons. We've had some pain due to merge conflicts in the release notes and having to wait until the contributor rebases (and having to explain git rebase to new contributors). So our recommendation is that release notes entries are not added in pull requests, but that the core developer adds a release notes entry right after having merged a pull request (you can add ``[skip ci]`` on this commit). How to handle API breaking changes? ----------------------------------- As stated in PIG 23, API breaking changes can be introduced in feature and LTS releases. These changes must be clearly identified in the release notes. To avoid abruptly changing the API between consecutive versions, forthcoming API changes and deprecations must be announced in the previous release, in particular in the form of a deprecation warning to warn users of future changes affecting their code. The deprecation warning must be present in the next feature or LTS release after the change was made. The feature can be removed in the following release. We use a deprecation warning system based on decorators derived from the one implemented in Astropy. Adding a decorator will raise a warning if the deprecated feature is called and it will also add a message to its docstring. Deprecating a function or a class +++++++++++++++++++++++++++++++++ To deprecate a function or a class, use the ``@deprecated`` decorator, indicating the first release following the change: .. testcode:: from gammapy.utils.deprecation import deprecated @deprecated("1.1") def deprecated_function(a, b): return a + b If you want to indicate a new implementation to use instead, use the `alternative` argument: .. testcode:: from gammapy.utils.deprecation import deprecated @deprecated("1.1", alternative="new_function") def deprecated_function(a, b): return a + b Renaming an argument ++++++++++++++++++++ If you change the name of an argument, you can use the ``deprecated_renamed_argument`` decorator. It will replace the old argument with the new one in a call to the function and will raise the ``GammapyDeprecationWarning``. You can change several arguments at once. .. testcode:: from gammapy.utils.deprecation import deprecated_renamed_argument @deprecated_renamed_argument(["a", "b"], ["x", "y"], ["1.1", "1.1"]) def deprecated_argument_function(x, y): return x + y print(deprecated_argument_function(a=1, b=2)) If you rename a `kwarg` you simply need to set the `arg_in_kwargs` argument to `True`: .. testcode:: from gammapy.utils.deprecation import deprecated_renamed_argument @deprecated_renamed_argument("old", "new", "1.1", arg_in_kwargs=True) def deprecated_argument_function_kwarg(new=1): return new print(deprecated_argument_function_kwarg(old=3)) Removing an attribute +++++++++++++++++++++ You can also deprecate an attribute of a class using the ``deprecated_attribute`` helper. If you have an alternative attribute to use instead, pass its name in the `alternative` argument. ..
testcode:: from gammapy.utils.deprecation import deprecated_attribute class some_class: old_attribute = deprecated_attribute( "old_attribute", "1.1", alternative="new_attribute" ) def __init__(self, value): self._old_attribute = value self._new_attribute = value @property def new_attribute(self): return self._new_attribute print(some_class(10).old_attribute) Others ------ Command line tools using click ++++++++++++++++++++++++++++++ Command line tools that use the `click <https://click.palletsprojects.com/>`__ module should disable the unicode literals warnings to clean up the output of the tool: .. testcode:: import click click.disable_unicode_literals_warning = True See the click documentation for further information. Bundled gammapy.extern code +++++++++++++++++++++++++++ We bundle some code in ``gammapy.extern``. This is external code that we don't maintain or modify in Gammapy. We only bundle small pure-Python files (currently all single-file modules) purely for convenience, because having to explain these modules as Gammapy dependencies to end-users would be annoying. And in some cases the file was extracted from some other project, i.e. can't be installed separately as a dependency. For ``gammapy.extern`` we don't generate Sphinx API docs. To see what is there, check out the ``gammapy/extern`` directory locally or on GitHub. Notes on the bundled files are kept in the docstring of ``gammapy/extern/__init__.py``. Locate origin of warnings +++++++++++++++++++++++++ By default, warnings appear on the console, but often it's not clear where a given warning originates (e.g. when building the docs or running scripts or tests) or how to fix it. Sometimes putting this in ``gammapy/__init__.py`` can help: .. testcode:: import numpy as np np.seterr(all='raise') Following a common recipe, putting this in ``docs/conf.py`` can also help sometimes: .. code:: import traceback import warnings import sys def warn_with_traceback(message, category, filename, lineno, file=None, line=None): traceback.print_stack() log = file if hasattr(file, 'write') else sys.stderr log.write(warnings.formatwarning(message, category, filename, lineno, line)) warnings.showwarning = warn_with_traceback Object text repr, str and info ++++++++++++++++++++++++++++++ In Python, by default objects don't have a good string representation. This section explains how Python repr, str and print work, and gives guidelines for writing ``__repr__``, ``__str__`` and ``info`` methods on Gammapy classes. Let's use this as an example:: class Person: def __init__(self, name='Anna', age=8): self.name = name self.age = age The default ``repr`` and ``str`` are this:: p = Person() repr(p) '<__main__.Person object at 0x105fe3b70>' p.__repr__() '<__main__.Person object at 0x105fe3b70>' str(p) '<__main__.Person object at 0x105fe3b70>' p.__str__() '<__main__.Person object at 0x105fe3b70>' Users will see that. If they just type an object's name in the Python REPL, the ``repr`` is shown. If they print the object, the ``str`` is shown. In both cases without the quotes seen above. p <__main__.Person at 0x105fd0cf8> print(p) <__main__.Person object at 0x105fe3b70> There are ways to make this better and avoid writing boilerplate code, specifically ``attrs`` and ``dataclasses``. We might use those in the future in Gammapy, but for now, we don't. If you want a better repr or str for a given object, you have to add ``__repr__`` and / or ``__str__`` methods when writing the class. Note that you don't have to do that, it's mainly useful for objects users interact with a lot. For classes that are mainly used internally, developers can e.g.
just do this to see the attributes printed nicely:: p.__dict__ {'name': 'Anna', 'age': 8} Here's an example of how to write ``__repr__``:: def __repr__(self): return '{}(name={!r}, age={!r})'.format( self.__class__.__name__, self.name, self.age ) Note how we use ``{!r}`` in the format string to fill in the ``repr`` of the object being formatted, and how we used ``self.__class__.__name__`` to avoid duplicating the class name (easier to refactor code, and shows the sub-class name if the repr is inherited). This will give a nice string representation, the same one for ``repr`` and ``str``; you don't have to write ``__str__``:: p = Person(name='Anna', age=8) p Person(name='Anna', age=8) print(p) Person(name='Anna', age=8) The ``str`` representation is usually used for a more informal or longer printout. Here's an example:: def __str__(self): return ( "Hi, my name is {} and I'm {} years old.\n" "I live in Heidelberg." ).format(self.name, self.age) If you need a text representation that is configurable, i.e. that takes arguments controlling what to show, you should add a method called ``info``. To avoid code duplication, you should then call ``info`` from ``__str__``. Example:: class Person: def __init__(self, name='Anna', age=8): self.name = name self.age = age def __repr__(self): return '{}(name={!r}, age={!r})'.format( self.__class__.__name__, self.name, self.age ) def __str__(self): return self.info(add_location=False) def info(self, add_location=True): s = ("Hi, my name is {} and I'm {} years old." ).format(self.name, self.age) if add_location: s += "\nI live in Heidelberg" return s This pattern of returning a string from ``info`` has some pros and cons. It's easy to get the string, and do what you like with it, e.g. combine it with other text, or store it in a list and write it to file later. The main con is that users have to call ``print(p.info())`` to see a nicely printed version of the string; otherwise the newlines show up as ``\n``:: p = Person() p.info() "Hi, my name is Anna and I'm 8 years old.\nI live in Heidelberg" print(p.info()) Hi, my name is Anna and I'm 8 years old. I live in Heidelberg To make ``info`` print by default, be re-usable from ``__str__``, and still make it possible to get a string (without having to monkey-patch ``sys.stdout``), one would have to add this ``show`` option and if-else at the end of every ``info`` method:: def __str__(self): return self.info(add_location=False, show=False) def info(self, add_location=True, show=True): s = ("Hi, my name is {} and I'm {} years old." ).format(self.name, self.age) if add_location: s += "\nI live in Heidelberg" if show: print(s) else: return s To summarise: start without adding any code for text representation. If there's a useful short text representation, you can add a ``__repr__``. If really useful, add a ``__str__``. If you need it configurable, add an ``info`` and call ``info`` from ``__str__``. If ``repr`` and ``str`` are similar, having both isn't really useful: delete the ``__str__`` and only keep the ``__repr__``. It is common to have bugs in ``__repr__``, ``__str__`` and ``info`` that are not tested, e.g. a ``NameError`` or ``AttributeError`` because some attribute name changed, and updating the repr / str / info was forgotten. So tests should be added that execute these methods once. You can compare against a full reference string in the output, but that is not required (and actually very hard for cases where you have floats or Numpy arrays or str, where formatting differs across Python or Numpy versions).
Example what to put as a test:: def test_person_txt(): p = Person() assert repr(p).startswith('Person') assert str(p).startswith('Hi') assert p.info(add_location=True).endswith('Heidelberg') Output in Jupyter notebook cells ++++++++++++++++++++++++++++++++ In addition to the standard `repr` and `str` outputs, Jupyter notebook cells have the option to display nicely formatted (HTML) output, using the IPython rich display options (other output options also exist, such as LaTeX or SVG, but these are less used). This requires the implementation of a `_repr_html_(self)` method (note: single underscore) in a class. By default, this method can just return the string representation of the object (which may already produce a nicely formatted output). That is often nicer than the default, which is to return the `repr` output, which tends to be shorter and may miss details of the object. Thus, the following would be a good default for a new class::

    def _repr_html_(self):
        return f'<pre>{html.escape(str(self))}</pre>'

The surrounding ``<pre>`` tags take care that the formatting in HTML is the same as that for the normal `__str__` method, while `html.escape` handles escaping HTML special characters (``<``, ``>``, ``&``, ``"`` and ``'``).
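As a quick illustration of the escaping (standard library behaviour; the input string is made up):

.. testcode::

    import html

    # Characters with special meaning in HTML are replaced by entities
    print(html.escape('flux < 1e-12 & E > 1 TeV'))

.. testoutput::

    flux &lt; 1e-12 &amp; E &gt; 1 TeV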

Note that if no `__str__` method is implemented, `__repr__` is used,
and ultimately, the Python built-in `__repr__` method serves as the
final fallback (which looks like ``<package.module.SomeClass object at 0x...>`` or similar).

If more specific HTML output is preferred (for example, output
formatted as an HTML table or list), it is best to create a separate
`to_html(self)` method in the class, which is more explicit (and can
be documented as part of the API), and let `_repr_html_` call this
method::

    def _repr_html_(self):
        return self.to_html()

This is very similar to `__str__` calling `info()`, with optional
parameters if needed.
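A minimal sketch of this pattern, reusing the ``Person`` example class from above (the HTML layout is purely illustrative, not a Gammapy convention)::

    import html

    class Person:
        def __init__(self, name='Anna', age=8):
            self.name = name
            self.age = age

        def to_html(self):
            # Illustrative only: render the attributes as a small HTML table
            rows = [
                f'<tr><td>name</td><td>{html.escape(self.name)}</td></tr>',
                f'<tr><td>age</td><td>{self.age}</td></tr>',
            ]
            return '<table>' + ''.join(rows) + '</table>'

        def _repr_html_(self):
            return self.to_html()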

To allow for both options, the following default `_repr_html_` can be
implemented instead::

    def _repr_html_(self):
        try:
            return self.to_html()
        except AttributeError:
            return f'<pre>{html.escape(str(self))}</pre>'

Nearly all base classes in Gammapy implement this default `_repr_html_`. If a new class derives from an existing Gammapy class, a default implementation is not needed, since it will rely on its (grand)parent. As a result, for specific HTML output, only the `to_html` method needs to be implemented for the relevant class. Convert a Jupyter notebook to a Python script in the sphinx-gallery format ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ The tutorials in Gammapy are represented by Python scripts written in the sphinx-gallery format. However, since they are displayed as Jupyter notebooks in the documentation, it is usually easier to first create a tutorial in a Jupyter notebook and then convert it into a Python script. This can be done using the ``ipynb_to_gallery.py`` script located in the ``dev`` folder. This script can be used as follows:: python dev/ipynb_to_gallery.py <path to notebook> [<path to output file>] (optional) If the path of the output file is not provided, the script will be written in the same folder as the notebook and with the same name. .. include:: ../references.txt .. _doc_howto: ******************** Documentation How To ******************** Documentation building ---------------------- Generating the HTML docs for Gammapy is straightforward:: make docs-sphinx make docs-show Or one can equivalently use tox:: tox -e build_docs Generating the PDF docs is more complex. This should work:: # build the latex file cd docs python -m sphinx . _build/latex -b latex -j auto # first generation of pdf file cd _build/latex pdflatex -interaction=nonstopmode gammapy.tex # final generation of pdf file pdflatex -interaction=nonstopmode gammapy.tex # clean the git repo git reset --hard # open the pdf file open gammapy.pdf You need a bunch of LaTeX packages; specifically ``texlive-fonts-extra`` is needed. Check Python code ----------------- Code in RST files +++++++++++++++++ Most of the documentation of Gammapy is present in RST files that are converted into HTML pages using Sphinx during the documentation build process. You may include snippets of Python code in these RST files within blocks labelled with the ``.. code-block:: python`` Sphinx directive. However, this code would not be tested, and it would not be possible to know if it fails in future versions of Gammapy. That's why we recommend using the ``.. testcode::`` directive to enclose code that will be tested against the results present in a block labelled with the ``.. testoutput::`` directive. If no ``.. testoutput::`` directive is provided, only execution tests will be performed. For example, we could check that the code below does not fail, since it does not provide any output. .. code-block:: text .. testcode:: from gammapy.astro import source from gammapy.astro import population from gammapy.astro import darkmatter On the contrary, we could check the execution of the following code as well as the output values produced. .. code-block:: text .. testcode:: from astropy.time import Time time = Time(['1999-01-01T00:00:00.123456789', '2010-01-01T00:00:00']) print(time.mjd) .. testoutput:: [51179.00000143 55197. ] In order to perform tests of these snippets of code present in RST files, you may run the following command. ..
code-block:: bash pytest --doctest-glob="*.rst" docs/ Code in docstrings in Python files ++++++++++++++++++++++++++++++++++ It is also advisable to add code snippets within the docstrings of the classes and functions present in Python files. These snippets show how to use the function or class that is documented, and are written in the docstrings using the following syntax. .. code-block:: text Examples -------- >>> from astropy.units import Quantity >>> from gammapy.data import EventList >>> event_list = EventList.read('events.fits') # doctest: +SKIP In the case above, we could check the execution of the first two lines importing the ``Quantity`` and ``EventList`` classes, whilst the third line will be skipped. On the contrary, in the example below we could check the execution of the code as well as the output value produced. .. code-block:: text Examples -------- >>> from regions import Regions >>> regions = Regions.parse("galactic;circle(10,20,3)", format="ds9") >>> print(regions[0]) Region: CircleSkyRegion center: <SkyCoord (Galactic): (l, b) in deg (10., 20.)> radius: 3.0 deg To allow the code block to be placed correctly over multiple lines, utilise the "..." continuation markers: .. code-block:: text Examples -------- >>> from gammapy.maps import WcsGeom, MapAxis >>> energy_axis_true = MapAxis.from_energy_bounds( ... 0.5, 20, 10, unit="TeV", name="energy_true" ... ) For a larger code block it is also possible to utilise the following syntax. .. code-block:: text Examples -------- .. testcode:: from gammapy.maps import MapAxis from gammapy.irf import EnergyDispersion2D filename = '$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz' edisp2d = EnergyDispersion2D.read(filename, hdu="EDISP") In order to perform tests of these snippets of code present in the docstrings of the Python files, you may run the following command. .. code-block:: bash pytest --doctest-modules --ignore-glob=*/tests gammapy If you get a zsh error, try putting the ignore glob inside quotes .. code-block:: bash pytest --doctest-modules "--ignore-glob=*/tests" gammapy It is also important to check that you have correctly formatted your docstring. An easy way to check this is to run ``pydocstyle`` on your specific file, e.g.: .. code-block:: bash pydocstyle gammapy/data/event_list.py Sphinx gallery extension ------------------------ The documentation build process uses the `sphinx-gallery <https://sphinx-gallery.github.io>`__ extension to build galleries of illustrated examples on how to use Gammapy (e.g. the :ref:`model-gallery`). The Python scripts used to produce the model gallery are placed in ``examples/models`` and ``examples/tutorials``. The configuration of the ``sphinx-gallery`` module is done in ``docs/conf.py``. The tutorials are ordered using a Python dictionary stored in ``docs/sphinxext.py``. Choose a thumbnail and tooltip for the tutorial gallery +++++++++++++++++++++++++++++++++++++++++++++++++++++++ The Gammapy :ref:`tutorials` are Python scripts in the Sphinx Gallery format. They are displayed as a gallery with picture thumbnails and tooltips. You can choose the thumbnail for the tutorial by adding a comment before the plot: .. code-block:: python # The next line sets the thumbnail for the second figure in the gallery # (plot with negative exponential in orange) # sphinx_gallery_thumbnail_number = 2 plt.figure() plt.plot(x, -np.exp(-x), color='orange', linewidth=4) plt.xlabel('$x$') plt.ylabel(r'$-\exp(-x)$') # To avoid matplotlib text output plt.show() The example is taken from the sphinx-gallery documentation; please refer to it for more details.
The tooltip is the text that appears when you hover over the thumbnail. It is taken from the first line of the docstring of the tutorial. You can change it by editing the docstring. See e.g. the Analysis 1 tutorial. Dealing with links ------------------ Links in tutorials are just handled via normal RST syntax. Links to other tutorials ++++++++++++++++++++++++ From docstrings and RST documentation files in Gammapy you can link to other tutorials and gallery examples by using RST syntax like this: .. code-block:: rst :doc:`/tutorials/starting/analysis_2` This will link to the tutorial :doc:`/tutorials/starting/analysis_2` from the tutorial base folder. The file suffix will be automatically inferred by Sphinx. Links to documentation ++++++++++++++++++++++ To make a reference to a heading within an RST file, first you need to define an explicit target for the heading: .. code-block:: rst .. _datasets: Datasets (DL4) ============== The reference is then rendered as ``datasets``. To link to this in the documentation you can use: .. code-block:: rst :ref:`datasets` API Links +++++++++ Links to Gammapy API are handled via normal Sphinx syntax in comments: .. code-block:: python # Create an `~gammapy.analysis.AnalysisConfig` object and edit it to # define the analysis configuration: config = AnalysisConfig() This will resolve to a link to the ``AnalysisConfig`` class in the API documentation. .. _dev-check_html_links: Check broken links ++++++++++++++++++ To check for broken external links you can use ``tox``: .. code-block:: bash tox -e linkcheck Include png files as images ---------------------------- In the RST files ++++++++++++++++ Gammapy has a ``gp-image`` directive to include an image from ``$GAMMAPY_DATA/figures/``. Use it instead of the usual Sphinx ``image`` directive like this: .. code-block:: rst .. gp-image:: detect/fermi_ts_image.png :scale: 100% More info can be found in the Sphinx documentation of the ``image`` directive. Documentation guidelines ------------------------ Like almost all Python projects, the Gammapy documentation is written in a format called `restructured text (RST)`_ and built using `Sphinx`_. We mostly follow the Astropy documentation guidelines, which are based on the `Numpy docstring standard`_, which is what most scientific Python packages use. .. _restructured text (RST): http://sphinx-doc.org/rest.html .. _Sphinx: http://sphinx-doc.org/ .. _Numpy docstring standard: https://numpydoc.readthedocs.io/en/latest/format.html#docstring-standard There are a few details that are not easy to figure out by browsing the Numpy or Astropy documentation guidelines, or that we actually do differently in Gammapy. These are listed here so that Gammapy developers have a reference. Usually the quickest way to figure out how something should be done is to browse the Astropy or Gammapy code a bit (either locally with your editor or online on GitHub or via the HTML docs), or search the Numpy or Astropy documentation guidelines mentioned above. If that doesn't quickly turn up something useful, please ask by putting a comment on the GitHub issue or pull request you're working on, or email the Gammapy mailing list. Correct format for bullet point list ++++++++++++++++++++++++++++++++++++ To correctly add a bullet point list to the docstring, the following can be implemented: .. testcode:: """ Docstring explanation.
Parameters ---------- parameter_name : parameter_type Description of the parameter with entries: * option1 : description1 * option2 : description2 * option3 : description3 over more than one line. """ Functions or class methods that return a single object ++++++++++++++++++++++++++++++++++++++++++++++++++++++ For functions or class methods that return a single object, following the Numpy docstring standard and adding a *Returns* section usually means that you duplicate the one-line description and repeat the function name as return variable name. See `~astropy.cosmology.LambdaCDM.w` or `~astropy.time.Time.sidereal_time` as examples in the Astropy codebase. Here's a simple example: .. testcode:: def circle_area(radius): """Circle area. Parameters ---------- radius : `~astropy.units.Quantity` Circle radius Returns ------- area : `~astropy.units.Quantity` Circle area """ return 3.14 * (radius ** 2) In these cases, the following shorter format omitting the *Returns* section is recommended: .. testcode:: def circle_area(radius): """Circle area (`~astropy.units.Quantity`). Parameters ---------- radius : `~astropy.units.Quantity` Circle radius """ return 3.14 * (radius ** 2) Usually the parameter description doesn't fit on one line, so it's recommended to always keep this in the *Parameters* section. A common case where the short format is appropriate is class properties, because they always return a single object. As an example see `~gammapy.data.EventList.radec`, which is reproduced here: .. testcode:: @property def radec(self): """Event RA / DEC sky coordinates (`~astropy.coordinates.SkyCoord`).""" lon, lat = self['RA'], self['DEC'] return SkyCoord(lon, lat, unit='deg', frame='icrs') Class attributes ++++++++++++++++ Class attributes (data members) and properties are currently a bit of a mess. Attributes are listed in an *Attributes* section because I've listed them in a class-level docstring attributes section, as recommended by the ``numpydoc`` standard. Properties are listed in separate *Attributes summary* and *Attributes Documentation* sections, which is confusing to users ("what's the difference between attributes and properties?"). One solution is to always use properties, but that can get very verbose if we have to write so many getters and setters. We could start using descriptors. TODO: make a decision on this and describe the issue / solution here. .. include:: ../references.txt .. _dev: =============== Developer guide =============== The developer documentation is a collection of notes for Gammapy developers and maintainers. If something is not covered here, have a look at the very extensive Astropy developer documentation as well. If you want to contribute to Gammapy, just use one of the contact points listed on the Gammapy webpage. Ideally, you join directly the #dev channel in the Gammapy Slack. **Technical setup** .. toctree:: :maxdepth: 1 setup dependencies **Contribution recipes** .. toctree:: :maxdepth: 1 intro release **How To** .. toctree:: :maxdepth: 1 doc_howto dev_howto **Proposals for Improvement of Gammapy** .. toctree:: :maxdepth: 1 pigs/index .. include:: ../references.txt ..
_dev_intro: ============================ How to contribute to Gammapy ============================ What is this? ============= **This page is an overview of how to make a change or addition to the Gammapy code, tests or documentation. It's partly an introduction to the process, partly a guide to some technical aspects.** It is *not* a tutorial introduction explaining all the tools (git, GitHub, Sphinx, pytest) or code (Python, Numpy, Scipy, Astropy) in detail. In the Gammapy docs, we don't have such a tutorial introduction written up, but we're happy to point out other tutorials or help you get started at any skill level anytime if you ask. Before attempting to make a contribution, you should *use* Gammapy a bit at least: * Install Gammapy. * Execute one or two of the tutorial notebooks for Gammapy and do the exercises there. * Ask questions or complain about issues on the Gammapy Slack channel, `GitHub discussions <https://github.com/gammapy/gammapy/discussions>`__ or the `GitHub issue tracker <https://github.com/gammapy/gammapy/issues>`__. We'd like to note that there are many ways to contribute to the Gammapy project. For example if you mention it to a colleague or suggest it to a student, or if you use it and acknowledge Gammapy in a presentation, poster or publication, or if you report an issue on the mailing list, those are contributions we value. The rest of this page though is concerned only with the process and technical steps how to contribute a code or documentation change via a **pull request** against the Gammapy repository. So let's assume you've used Gammapy for a while, and now you'd like to fix or add something to the Gammapy code, tests or docs. Here are the steps and commands to do it ... Acceptance of the Developer Certificate of Origin (DCO) ======================================================== As described in PIG 24 and the ``README.rst`` file, each contributor shall accept the DCO stored in the file ``CONTRIBUTING.md``: this is a binding statement that asserts that you are the creator of your contribution, and that you allow Gammapy to use your work and to cite you as a contributor. **If you are willing to agree to these terms, the following agreement line should be added to every commit message:** ``Signed-off-by: Random J Developer <random@developer.example.org>`` Four solutions exist: 1. You add this message by hand into each of your commit messages (not recommended) 2. You can sign each of your commits with the command: "``git commit -s``". If you have authored a commit that is missing its 'Signed-off-by' line, you can amend your commits and push them to GitHub: "``git commit --amend --no-edit --signoff``" 3. You can make an alias of the command "``git commit -s``", e.g. ``alias gcs 'git commit -s'`` 4. You can create a so-called git hook that automatically signs all your commits (recommended option); a sketch of such a hook is shown below. For each of these solutions, it is **mandatory** to correctly set your `user.name` and `user.email` as part of your git configuration (see the git documentation to configure them). You have to use **your real name** (i.e., pseudonyms or anonymous contributions cannot be made) when using git. This is because the DCO is a binding document, allowing the Gammapy project to remain an open source project.
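As an illustration of option 4, here is a minimal sketch of a ``commit-msg`` hook written in Python. This is an assumption for illustration only, not the officially documented hook: save it as ``.git/hooks/commit-msg``, make it executable, and it appends the sign-off line if it is missing::

    #!/usr/bin/env python
    import subprocess
    import sys

    msg_file = sys.argv[1]  # git passes the path of the commit message file

    # Build the sign-off line from your git configuration
    name = subprocess.check_output(["git", "config", "user.name"], text=True).strip()
    email = subprocess.check_output(["git", "config", "user.email"], text=True).strip()
    signoff = f"Signed-off-by: {name} <{email}>"

    with open(msg_file) as f:
        message = f.read()

    if signoff not in message:
        with open(msg_file, "a") as f:
            f.write(f"\n{signoff}\n")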
Get in touch early ================== **Usually the first step, before doing any work, is to get in touch with the Gammapy developers!** Especially if you're new to the project, and don't have an overview of ongoing activities, there's a risk that your work will be in vain if you don't talk to us early. E.g. it could happen that someone else is currently working on similar functionality, or that you've found a code or documentation bug and are willing to fix it, but it then turns out that this was for some part of Gammapy that we wanted to re-write or remove soon anyway. Also, it's usually more fun if you get a *mentor* or *reviewer* early in the process, so that you have someone to bug with questions and issues that come up while executing the steps outlined below. After you've done a few contributions to Gammapy and know about the status of ongoing work, the best way to proceed is to file an issue or pull request on GitHub at the stage where you want feedback or review. Sometimes you're not sure how to best do something, and you start by discussing it on the mailing list or in a GitHub issue. Sometimes you know how you'd like to do it, and you just code or write it up and make a pull request when it's basically finished. In any case, please keep the following point also in mind ... Make small pull requests ======================== **Contributions to Gammapy happen via pull requests on GitHub. We like them small.** So as we'll explain in more detail below, the contribution cycle to Gammapy is roughly: 1. Get the latest development version (``main`` branch) of Gammapy 2. Make fixes, changes and additions locally 3. Make a pull request 4. Someone else reviews the pull request, you iterate, others can chime in 5. Someone else signs off on or merges your pull request 6. You update to the latest ``main`` branch Then you're done, and can start using the new version, or start a new pull request with further developments. It is possible and common to work on things in parallel using git branches. So how large should one pull request be? Our experience in Gammapy (and the experience of other projects confirms this) is that smaller is better. Working on a pull request for an hour or maximum a day, and having a diff of a few to maximum a few 100 lines to review and discuss is pleasant. A pull request that drags on for more than a few days, or that contains a diff of 1000 lines, is almost always painful and inefficient for the person making it, but even more so for the person reviewing it. The worst case is if you start a pull request, put in a lot of hours, but then don't have time to "finish" it, and it's sitting there for a week or a month without getting merged. Then it's either blocking others that want to work on the same part of the code or docs, or they do it, and then you have merge conflicts to resolve when you come back to it. And coming back to a large pull request after a long time always means a large investment of time for the reviewer, because they probably have to re-read the previous discussion, and look through the large diff again. So pull requests that are small, e.g. one bug fix with the addition of one regression test, or one new function or class or file, or one documentation example, and that get reviewed and merged quickly (ideally the same day, certainly the same week), are best. .. _dev_setup: Get set up ========== .. warning:: The rest of this page isn't written yet.
It's almost identical to https://ctapipe.readthedocs.io/en/latest/developer-guide/getting-started.html so for now, see there. Also, we shouldn't duplicate content from https://docs.astropy.org/en/latest/development/index.html but link there instead. The first steps are basically identical to https://ctapipe.readthedocs.io/en/latest/developer-guide/getting-started.html (until section *Setting up the development environment*) and http://astropy.readthedocs.io/en/latest/development/workflow/get_devel_version.html (up to *Create your own private workspace*). The following is a quick summary of commands to set up an environment for Gammapy development: .. code-block:: bash # Fork the gammapy repository on GitHub, https://github.com/gammapy/gammapy cd code # Go somewhere on your machine where you want to code git clone https://github.com/[your-github-username]/gammapy.git cd gammapy conda env create -f environment-dev.yml # To speed up the environment solving you can use mamba instead of conda # mamba env create -f environment-dev.yml conda activate gammapy-dev # for conda versions <4.4.0 you may have to execute # 'source activate gammapy-dev' instead git remote add gammapy git@github.com:gammapy/gammapy.git git remote rename origin [your-user-name] Mamba is an alternative package manager that offers higher installation speed and more reliable environment solutions. It is also common to stick with the name ``origin`` for your repository and to use ``upstream`` for the repository you forked from. In any case, you can use ``$ git remote -v`` to list all your configured remotes. In case you are working with the development version environment and you want to update this environment with the content present in ``environment-dev.yml``, see below: .. code-block:: bash conda env update --file environment-dev.yml --prune When developing Gammapy you never want to work on the ``main`` branch, but always on a dedicated feature branch. .. code-block:: bash git branch [branch-name] git checkout [branch-name] To *activate* your development version (branch) of Gammapy in your environment: .. code-block:: bash python -m pip install -e . This build step is necessary to compile the few Cython extensions (``*.pyx`` files). If you skip this step, some imports depending on Cython code will fail. If you want to remove the generated files run ``make clean``. For development it is also convenient to have the ``$GAMMAPY_DATA`` environment variable declared. You can download the Gammapy datasets with ``gammapy download datasets`` and then point your ``$GAMMAPY_DATA`` to the local path you have chosen. .. code-block:: bash # Download GAMMAPY_DATA gammapy download datasets --out GAMMAPY_DATA export GAMMAPY_DATA=$PWD/GAMMAPY_DATA We adhere to the PEP8 coding style. To enforce this, set up the pre-commit hook: .. code-block:: bash pre-commit install Running tests & building Documentation ====================================== To run tests and build documentation we use the tool ``tox``. It is a virtual environment management tool which allows you to test Gammapy locally in multiple test environments with different versions of Python and our dependencies. It is also used to build the documentation and check the codestyle in a specific environment. The same setup based on ``tox`` is used in our CI build. Once you have created and activated the ``gammapy-dev`` environment and made some modifications to the code, you should run the tests: .. code-block:: bash tox -e test This will execute the tests in the standard ``test`` environment.
If you would like to test with a different environment you can use: .. code-block:: bash tox -e py310-test-numpy121 Which will test the code with Python 3.10 and numpy 1.21. All available pre-defined environments can be listed using: .. code-block:: bash tox --listenvs However, for most contributions testing with the standard ``tox -e test`` command is sufficient. Additional arguments for ``pytest`` can be passed after ``--``: .. code-block:: bash tox -e test -- -n auto Of course you can always use ``pytest`` directly to run tests, e.g. to run tests in a specific sub-package: .. code-block:: bash pytest gammapy/maps To build the documentation locally you can use: .. code-block:: bash tox -e build_docs And use ``make docs-show`` to open a browser and preview the result. The codestyle can be checked using the command: .. code-block:: bash tox -e codestyle Which will run the tool ``flake8`` to check for code style issues. .. * run tests * build docs * explain make and setup.py **Make a working example** * Explain "documentation driven development" and "test driven development" * make a branch * test in ``examples`` * ``import IPython; IPython.embed`` trick **Integrate the code in Gammapy** * move functions / classes to Gammapy * move tests to Gammapy * check tests locally * check docs locally **Contribute with Jupyter notebooks** * check tests with user tutorials environment: `gammapy jupyter --src mynotebook.ipynb test --tutor` * strip the output cells: `gammapy jupyter --src mynotebook.ipynb strip` * clean format code cells: `gammapy jupyter --src mynotebook.ipynb black` * diff stripped notebooks: `git diff mynotebook.ipynb` **Make a pull request** * make a pull request * check diff on GitHub * check tests on travis-ci **Code review** tbd **Close the loop** tbd .. include:: ../../references.txt .. _pigs: **** PIGs **** PIGs (proposals for improvement of Gammapy) are short documents proposing a major addition or change to Gammapy. See :ref:`pig-001` for further information. Below is a list of merged PIGs, i.e. the ones that are finalised, with status "accepted" or "rejected" or "withdrawn". The ones with "draft" status, i.e. that are under discussion, can be found on GitHub as `pull requests with the "pig" label`_. .. toctree:: :maxdepth: 1 pig-001 pig-002 pig-003 pig-004 pig-005 pig-006 pig-007 pig-008 pig-009 pig-010 pig-011 pig-012 pig-013 pig-014 pig-016 pig-018 pig-019 pig-020 pig-021 pig-022 pig-023 pig-024 pig-025 pig-026 .. _pull requests with the "pig" label: https://github.com/gammapy/gammapy/issues?q=label%3Apig .. include:: ../../references.txt .. _pig-001: ********************************** PIG 1 - PIG purpose and guidelines ********************************** * Author: Christoph Deil * Created: Dec 20, 2017 * Accepted: Jan 9, 2018 * Status: accepted * Discussion: `GH 1239`_ Abstract ======== PIG stands for "proposal for improvement of Gammapy".
This is PIG 1, describing the purpose of using PIGs as well as giving some guidelines on how PIGs are authored, discussed and reviewed. What is a PIG? ============== This article is about the design document. For other uses, see `Pig (disambiguation)`_. Proposals for improvement of Gammapy (PIGs) are short documents proposing a major addition or change to Gammapy. PIGs are like `APEs`_, `PEPs`_, `NEPs`_ and `JEPs`_, just for Gammapy. Using such enhancement proposals is common for large and long-term open-source Python projects. The primary goal of PIGs is to have an open and structured way of working on Gammapy, forcing the person making the change to think it through and motivate the proposal before taking action, and for others to have a chance to review and comment on the proposal. The PIGs will also serve as a record of major design decisions taken in Gammapy, which can be useful in the future when things are re-discussed or new proposals to do things differently arrive. We expect that we will not use PIGs very often, but we think they can be useful e.g. in the following cases: * Outline design for something that requires significant work, e.g. on topics like "Modeling in Gammapy" or "High level user interface in Gammapy". These PIGs can be rather long, i.e. more than one page, explaining the design in detail and even explaining which alternatives were considered and why the proposed solution was preferred. * Have a conscious decision and make sure all interested parties are aware of things that might be controversial and have long-term effects for Gammapy, e.g. when to drop Python 2 support or defining a release cycle for Gammapy. These PIGs can usually be very short, a page or less. Writing a PIG ============= Anyone is welcome to write a PIG! PIGs are written as RST files in the ``docs/development/pigs`` folder in the main Gammapy repository, and submitted as pull requests. Most discussions concerning Gammapy will happen by talking to each other directly (calls or face-to-face), or online on the mailing list or GitHub. If you're not sure if you should write a PIG, please don't! Instead bring the topic up in discussions first with other Gammapy developers, or on the mailing list, to get some initial feedback. This will let you figure out if writing a PIG will be helpful or not to realise your proposal. When starting to write a PIG, we suggest you copy & paste & update the header at the top of this file, i.e. the title, bullet list with "Author" etc, up to the ``Abstract`` section. A PIG must have a number. If you're not sure what the next free number is, put ``X`` and filename ``pig-XXX.rst`` as placeholders, make the pull request, and we'll let you know what number to put. Please state your proposal clearly and keep the initial proposal short. If more information is needed, the people reviewing it will ask for it. In terms of content, we don't require a formal structure for a PIG. We do suggest that you start with an "Abstract" explaining the proposal in one or a few sentences, followed by a section motivating the change or addition. Then usually there will be a detailed description, and at the end or interspersed there will often be some comments about alternative options that have been discussed and an explanation of why the proposed one was favoured. Usually a short "Decision" section will be added at the end of the document after discussion by the reviewers.
If you're not sure how to structure your proposal, you could have a look at the `APE template`_ or some well-written APEs_ or PEPs_. `APE 5`_, `APE 7`_ and `APE 13`_ are examples of "design documents", outlining major changes / extensions to existing code in Astropy. `APE 2`_ and `APE 10`_ are examples of "process" proposals, outlining a release cycle for Astropy and a timeline for dropping Python 2 support. `PEP 389`_ is a good example proposing an improvement in the Python standard library, in that case by adding a new module ``argparse``, leaving the existing ``optparse`` alone for backward-compatibility reasons. In Gammapy many PIGs will also be about implementing better solutions, and a major question will be whether to change and improve the existing implementation, or whether to just write a new one, and in that case what the plan concerning the old code is. `PEP 481`_ is an example of a "process" PEP, proposing to move CPython development to git and GitHub. For Gammapy, we might also have "process" PIGs in the future, e.g. concerning the release cycle or package distribution or support for users and other projects relying on Gammapy. It's good to think these things through and write them down to make sure it's clear how things work and what the plan for the future is. Writing a PIG doesn't mean you have to implement it. That said, we expect that most PIGs will propose something that requires developments where someone has the intention to implement it within the near future (say within the next year). But it's not required, e.g. if you have a great idea or vision for Gammapy that requires a lot of development, without the manpower to execute the idea, writing a PIG could be a nice way to share this idea in some detail, with the hope that in collaboration with others it can eventually be realised. PIG review ========== PIG review happens on the pull request on GitHub. When a PIG is put up, an announcement with a link to the pull request should be sent both to the Gammapy mailing list and the Gammapy coordinator list. Anyone is welcome to review it and is encouraged to share their thoughts in the discussion! Please note that GitHub hides inline comments after they have been edited, so we suggest that you use inline comments for minor points like spelling mistakes only. Put your main feedback as normal comments in the "Conversation" tab, so that someone reading the discussion later will see your comment directly. The final decision on any PIG is made by the Gammapy coordination committee. We expect that in most cases, the people participating in the PIG review will reach a consensus and the coordination committee will follow the outcome of the public discussion. But in unusual cases where disagreement remains, the coordination committee will talk to the people involved in the discussion with the goal to reach consensus or compromise, and then make the final decision. PIG status ========== PIGs can have a status of: * "draft" - in draft status, either in the writing or discussion phase * "withdrawn" - withdrawn by the author * "accepted" - accepted by the coordination committee * "rejected" - rejected by the coordination committee When a PIG is put up for discussion as a pull request, it should have a status of "draft". Then once the discussion and review is done, the status will change to one of "withdrawn", "accepted" or "rejected". The reviewers should add a section "Decision" with a sentence or paragraph summarising the discussion and decision on this PIG.
Then in any case, the PIG should be merged, even if its status is "withdrawn" or "rejected". Final remarks ============= This PIG leaves some points open. This is intentional. We want to keep the process flexible and first gain some experience. The goal of PIGs is to help the Gammapy developer team to be more efficient, not to have a rigid or bureaucratic process. Specifically the following points remain flexible: * When to merge a PIG? There can be cases where the PIG is merged quickly, as an outline or design document, even if the actual implementation hasn't been done yet. There can be other cases where the PIG pull request remains open for a long time, because the proposal is too vague or requires prototyping to be evaluated properly. Note that this is normal, e.g. Python PEPs_ are usually only accepted once all development is done and a full implementation exists. * Allow edits of existing PIGs? We don't say if PIGs are supposed to be fixed or live documents. We expect that some will remain fixed, while others will be edited after being merged. E.g. for this PIG 1 we expect that over the years as we gain experience with the PIG process and see what works well and what doesn't, that edits will be made with clarifications or even changes. Whether to edit an existing PIG or whether to write a new follow-up PIG will be discussed on a case by case basis. * What to do if the coordination committee doesn't agree on some PIG? For now, we leave this question to the future. We expect that this scenario might arise; it's normal that opinions on technical solutions or the importance of use cases or projects to support with Gammapy differ. We also expect that Gammapy coordination committee members will be friendly people that can collaborate and find a solution or at least a compromise that works for everyone. Decision ======== This PIG was discussed extensively in `GH 1239`_, resulting in several improvements and additions. Everyone thought it was one honking good idea - let's do more of those! .. _Pig (disambiguation): https://en.wikipedia.org/wiki/Pig_(disambiguation) .. _PEPs: https://www.python.org/dev/peps/pep-0001/ .. _NEPs: https://numpy.org/neps/ .. _APEs: https://github.com/astropy/astropy-APEs .. _JEPs: https://github.com/jupyter/enhancement-proposals .. _APE template: https://github.com/astropy/astropy-APEs/blob/master/APEtemplate.rst .. _APE 2: https://github.com/astropy/astropy-APEs/blob/master/APE2.rst .. _APE 5: https://github.com/astropy/astropy-APEs/blob/master/APE5.rst .. _APE 7: https://github.com/astropy/astropy-APEs/blob/master/APE7.rst .. _APE 10: https://github.com/astropy/astropy-APEs/blob/master/APE10.rst .. _APE 13: https://github.com/astropy/astropy-APEs/blob/master/APE13.rst .. _PEP 8: https://www.python.org/dev/peps/pep-0008/ .. _PEP 389: https://www.python.org/dev/peps/pep-0389/ .. _PEP 481: https://www.python.org/dev/peps/pep-0481/ .. _GH 1239: https://github.com/gammapy/gammapy/pull/1239 .. include:: ../../references.txt ..
_pig-002: *********************************************** PIG 2 - Organization of low level analysis code *********************************************** ----------------------------------- The case of image and cube analysis ----------------------------------- * Author: Régis Terrier & Christoph Deil * Created: Jan 12, 2018 * Accepted: Jul 27, 2018 * Status: accepted * Discussion: `GH 1277`_ Abstract ======== This PIG discusses the general structure of the low level analysis subpackages of gammapy. Low level analysis is based on the gammapy building blocks from ``gammapy.data``, ``gammapy.irf`` and ``gammapy.maps``. Low level analysis implements all the individual steps required to perform data reduction for IACT from DL3 inputs (event lists and IRFs) to DL4 data (spectra, maps and cubes) and their associated reduced IRFs. Low level analysis should be structured in a very modular way to allow easy implementation of high level analysis classes and scripts. General code style guidelines ============================= Functions or methods should be no longer than a few tens of lines of code. Above that it is better to use multiple functions to make testing easier and allow more modular usage. One line functions are usually not needed unless the line is very complex. Similarly, classes should have 3-10 methods. Classes with only two methods (e.g. only ``__init__`` and ``__call__``) should usually be functions. Above 10-20 methods, the class should be split into several classes/functions. It is important to keep the number of functions and classes needed by the user to a reasonable level. Modularity is therefore very important, since it makes it easy to implement high level interfaces that orchestrate the common analysis patterns. Algorithms and data should be clearly separated. The naming scheme used should allow easy identification of the nature of a piece of code. For instance, functions creating maps and/or cubes should be named ``make_map_xxx``. Data analysis subpackages in gammapy ==================================== Low level analysis produces reduced datasets and IRFs from the general event lists and multidimensional IRFs of each observation or GTI. The building blocks on which it relies are coded in gammapy.data (``EventList``, ``DataStore``, ``DataStoreObservation`` etc), in gammapy.maps (in particular ``WcsNDMap`` used both for images and cubes), in gammapy.irf (e.g. ``EffectiveAreaTable2D``, ``EnergyDispersion2D``, ``EnergyDependentTablePSF``, etc). Analysis subpackages are: * 1D or spectral analysis (in ``gammapy.spectrum``) * 2D and 3D (cube) analysis (in ``gammapy.cube``) * timing analysis (in ``gammapy.time``) Low level map and cube analysis =============================== The low level analysis cube package deals with the production of all maps/cubes and PSF kernels required to perform 2D and 3D modeling and fitting. This includes counts, exposure, acceptance and normalized background maps and cubes. These reduced data and IRFs are stored using the ``gammapy.maps.WcsNDMap`` class which describes multidimensional maps with their World Coordinate System (WCS) description and a set of non-spatial axes. The default map structure for most of the typical analysis will be 3 dimensional maps with an energy axis (with a single bin for 2D images). The low level analysis is performed on an observation per observation (or GTI) basis. This is required because of the rapid variations of the response and background. Therefore, all basic functions operate on a single ``EventList`` or set of IRFs (i.e.
``EffectiveAreaTable2D``, ``EnergyDispersion2D``, ``EnergyDependentTablePSF``). The iterative production of the individual reduced datasets and IRFs and their combination is realized by the higher level class. The individual observation products can be serialized, mostly for analysis debugging purposes or to avoid reprocessing large databases when new data are added. Depending on the type of analysis, different reduced IRFs are to be produced. The main difference lies in the type of energy considered: reconstructed or true (i.e. incident) energy. Counts, hadronic acceptance and background always use reconstructed (i.e. measured) energy. Exposure and PSF kernels will be defined in reconstructed energy for 2D analysis whereas they will be defined in true energies for 3D analysis with their own energy binning. A reduced energy dispersion will then be produced to convert from true to reconstructed energies and used later to predict counts. The maker functions and the products have to clearly state what type of energy they are using to avoid any confusion. The serialization has to include a way to clearly differentiate the products. Some metadata, probably in the form of an ``OrderedDict`` as in the case of ``astropy.table.Table``, could be used to do so. In order to perform likelihood analysis of maps and cubes, as well as to apply *ON-OFF* significance estimation techniques, it is important to have integer values for counts and OFF maps produced by ring background estimation techniques (on an observation per observation basis). Therefore, we want to avoid reprojecting individual maps onto a global mosaic. The approach should be to define the general geometry of the target mosaic map and to perform cutouts for each observation. This can be done using for instance ``astropy.nddata.Cutout2D``. The index range of the cutout in the general mosaic map should be kept for easy summation. This step is performed with: ``make_map_cutout`` * *takes* a ``WcsNDMap`` and a maximum offset angle ``Angle`` or ``Quantity`` * *returns* the ``WcsGeom`` of the cutout and its ``slice`` For individual observations/GTIs, the general arguments of all maker functions are: * Reference image and energy range. ``gammapy.maps.MapGeom`` * Maximum offset angle. ``astropy.coordinates.Angle`` The various maker functions are then: ``make_map_counts`` * *takes* an ``EventList`` * *returns* a count map/cube ``make_map_exposure_true_energy`` * *takes* a pointing direction, an ``EffectiveAreaTable2D`` and a livetime * *returns* an exposure map/cube in true energy ``make_map_exposure_reco_energy`` * *takes* a pointing direction, an ``EffectiveAreaTable2D``, an ``EnergyDispersion2D`` and a livetime * *returns* an exposure map/cube in reco energy ``make_map_hadron_acceptance`` * *takes* a pointing direction, a ``Background3D`` and a livetime * *returns* a hadronic acceptance map, i.e. a predicted background map/cube. ``make_map_FoV_background`` * *takes* maps/cubes (``WcsNDMap``) of observed counts and hadron acceptance/predicted background and an exclusion map * *returns* the map of background normalized on the observed counts in the whole FoV (excluding regions with significant gamma-ray emission). * Different energy grouping schemes should be available to ensure a reasonable number of events are used for the normalization. This scheme and the number of events used for normalization should be included in the optional serialization.
``make_map_ring_background`` * *takes* maps/cubes (``WcsNDMap``) of observed counts and hadron acceptance/predicted background and an exclusion map. It also takes a ``gammapy.background.AdaptiveRingBackgroundEstimator`` or a ``gammapy.background.RingBackgroundEstimator`` * *returns* the map of background normalized to the observed counts with a ring filter (excluding regions with significant gamma-ray emission). The background estimator object also contains the *OFF* map and the *ON* and *OFF* exposure maps. * Most likely this technique is not meant to be used for very narrow energy bands, so energy grouping is probably not relevant here. The general processing can then be performed by general classes or scripts, possibly config file driven. It should be sufficiently modular to allow users to write their own scripts. Existing code ============= Currently, maps and cubes rely on the ``SkyImage`` and ``SkyCube`` classes. There are various scripts and classes in gammapy to produce maps and cubes (mostly developed by @adonath and @ljouvin). Image processing can be performed with ``SingleObsImageMaker`` and ``StackedObsImageMaker``, while cube processing can be performed with ``SingleObsCubeMaker`` and ``StackedObsCubeMaker``. For images, one can also use the ``IACTBasicImageEstimator``. All this code relies on high level classes which perform the whole analysis sequentially (exposure, background, count maps, etc.). This approach is not modular and creates a lot of code duplication. Some cube-related analysis is required for images, creating cross-dependencies. The proposed scheme should be much more modular and allow users to use gammapy as a library to compose their own scripts and classes if needed. It should limit code duplication. In particular, it uses the more general ``gammapy.maps``, which makes it possible to get rid of the cross-dependencies between the image and cube packages we have now. The existing code will remain in gammapy for the moment, with possibly some bugs fixed. The new code is largely independent, so the new development should not break user scripts. Decision ======== This PIG was extensively discussed on GitHub, as well as in Gammapy weekly calls and at the Feb 2018 and July 2018 Gammapy meetings. Doing this move to new analysis code based on ``gammapy.maps`` was never controversial, but API and implementation discussions were ongoing. On July 27, 2018, Régis and Christoph noticed that the description in this PIG had been mostly implemented in Gammapy master already, and that further progress would come from individual improvements, not a rewrite / update of this PIG with a complete design. So we decided to merge this PIG with status "approved" to have it on the record as part of the design and evolution process for Gammapy. .. _GH 1277: https://github.com/gammapy/gammapy/pull/1277 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-003.rst0000644000175100001770000001607514721316200020447 0ustar00runnerdocker.. include:: ../../references.txt .. _pig-003: ******************************************** PIG 3 - Plan for dropping Python 2.7 support ******************************************** * Author: Christoph Deil & Matthew Wood * Created: Feb 1, 2018 * Accepted: Nov 30, 2018 * Status: accepted * Discussion: `GH 1278`_ Abstract ======== We propose to drop Python 2.7 support in Gammapy v0.11 in March 2019.
All earlier Gammapy versions, up to Gammapy v0.10, support Python 2.7 and of course will remain available indefinitely. User surveys in 2018 have shown that most Gammapy users are already on Python 3. Gammapy v0.8 shipped with a recommended conda environment based on Python 3.6 that works on Linux, Mac and Windows and can be installed by anyone, also on older machines. To support Fermipy, which uses gammapy.maps and still requires Python 2.7, as well as other users on Python 2.7 (if any), we will backport bug-fixes and make patch releases in the Gammapy v0.10.x branch as needed, throughout 2019. This change will reduce the effort spent on Gammapy testing and packaging in 2019, will simplify life for Gammapy developers a bit, and will make a few new features of the Python 3 language available for Gammapy. Support for Python 3 will remain as-is (Python 3.5 or later) for now; whether to require Python 3.6 or later for Gammapy v1.0 in fall 2019 will be discussed separately. User perspective ================ Most Gammapy users are already using Python 3. We did a "Gammapy installation questionnaire" on the Gammapy mailing list and Gammapy Slack in Feb 2018, which resulted in only 12 responses. Only 2 people were still using Python 2.7, and only one person suggested keeping Python 2.7 support for a while longer, until January 2019. In CTA, a "CTA first data challenge user survey" was done in August 2018, which yielded 50 responses. There, 40% said they were still using Python 2.7. The question why, or if/when they would be happy to switch to Python 3, wasn't asked. Based on these two questionnaires, and talking to Gammapy users in 2018, it became clear that the only good reason to keep using Gammapy with Python 2.7 is when using Fermipy, which uses gammapy.maps and the Fermi science tools; the latter don't support Python 3 yet, and the timeline for Python 3 support there isn't clear. Based on discussions with the Fermipy maintainers, as mentioned in the abstract already, the suggested solution here would be that Fermipy puts ``gammapy<0.11`` in their ``setup.py``, which means that users will get the latest Python 2 compatible version. Probably even that isn't needed, because the ``setup.py`` in Gammapy v0.11 and later will declare that it only supports Python 3 or later, so ``pip`` or ``conda`` would always pick the latest Python 2.7 compatible version automatically. If other Gammapy users need support, please contact us. We can either help you update your scripts to run on Python 3 (preferred, usually trivial changes) or backport fixes to older Gammapy versions that still support Python 2.7 (if really needed). Maintainer and developer perspective ==================================== This change will have a big positive impact on Gammapy maintenance and development. It is an important step to allow for developments in 2019 towards the Gammapy 1.0 release that is planned for fall 2019 (see :ref:`pig-005`). Python 3 was introduced 10 years ago. The transition from Python 2 to Python 3 has been long and painful, and is still ongoing, but it is coming to an end. Most scientific Python projects have already dropped Python 2.7 support or are doing it now (see `Python 3 statement`_). Astropy and the ``astropy-helpers`` that we use have already dropped Python 2 support (see `Astropy statement`_). Jupyter, which we use extensively for the documentation, has dropped Python 2 support as well.
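To make the packaging mechanism from the user-perspective section above concrete: declaring the supported Python versions in ``setup.py`` is what lets ``pip`` resolve Python 2.7 users to the last compatible release automatically. A minimal, illustrative sketch (not the exact Gammapy packaging metadata):

.. code-block:: python

    # Illustrative only: with this declaration, pip on Python 2.7 will
    # automatically fall back to the newest release whose metadata still
    # allows Python 2 (here assumed to be the v0.10.x series).
    from setuptools import setup

    setup(
        name="gammapy",
        version="0.11",
        python_requires=">=3.5",
        # ... other metadata ...
    )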
While these projects are still maintaining long-term-support (LTS) branches for the last Python 2 compatible versions, it is getting harder and harder to keep the Python 2 builds (especially the Sphinx documentation build) and tests in continuous integration (CI) working. Dropping Python 2 support in Gammapy means less maintainer time spent on this moving forward. There are also new developments that are slowed down by keeping Python 2 support. E.g. we want to continue and finish the development of ``astropy-healpix`` and ``astropy-regions`` to the point where they can be moved into the Astropy core package. This should happen before Gammapy v1.0, and then we should use ``astropy-healpix`` instead of ``healpy`` in Gammapy (see `GH 1167 for Gammapy`_). Given that Astropy core doesn't support Python 2 any more, it makes sense to directly develop those packages with this target, i.e. Python 3 only. Keeping support for Python 2 there does cause some extra work in development (see `GH 111 for astropy-healpix`_) and also for packaging. Finally, we want to collaborate with `ctapipe`_ in Gammapy, which has supported only Python 3 from the start and uses Python-3-only features. So to summarise: the extra maintenance effort to keep supporting Python 2.7 in Gammapy in 2019 would be significant and would also slow down the work on ``astropy-healpix`` and ``astropy-regions``. Moving to Python 3 only makes life better for all Gammapy maintainers and developers. Detailed plan ============= There are two ways to execute this. When dropping Python 2 support from the git master branch, one can either remove the Python 2 shims directly, or keep them in place and make as few code changes as possible during the period where bug-fixes have to be backported to the Python 2 compatible branch, to reduce the amount of git and manual edits needed. Many big and stable projects, e.g. Numpy, that have several stable branches that are sometimes supported for years, and thus do a lot of backporting, don't do the cleanup directly. However, for Gammapy, we propose to do the cleanup directly in the master branch following the v0.10 release. The motivation for this is that in 2019 we will have to re-organise the Gammapy sub-packages and move and edit most files as we work towards Gammapy v1.0 in fall 2019 (see :ref:`pig-005`). So it seems unlikely that ``git cherry-pick`` would work at all to backport bugfixes. Instead, the strategy would be to manually re-apply important bug fixes (mostly to ``gammapy.maps``, hopefully not many) in the ``v0.10.x`` branch. Decision ======== This suggestion was extensively discussed in 2017 and 2018 with Gammapy users and developers. This proposal and PIG was widely announced (mailing list, Slack) and we didn't get a single request to extend Python 2 support longer than mentioned here. Approved on Nov 30, 2018 by the Gammapy coordination committee. .. _GH 1278: https://github.com/gammapy/gammapy/pull/1278 .. _Numpy statement: https://docs.scipy.org/doc/numpy-1.14.1/neps/dropping-python2.7-proposal.html .. _Astropy statement: https://github.com/astropy/astropy-APEs/blob/master/APE10.rst .. _Python 3 statement: http://www.python3statement.org/ .. _GH 111 for astropy-healpix: https://github.com/astropy/astropy-healpix/issues/111 .. _GH 1167 for Gammapy: https://github.com/gammapy/gammapy/pull/1167 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-004.rst0000644000175100001770000002053214721316200020441 0ustar00runnerdocker..
include:: ../../references.txt .. _pig-004: ********************************************* PIG 4 - Setup for tutorial notebooks and data ********************************************* * Author: José Enrique Ruiz, Christoph Deil * Created: May 16, 2018 * Accepted: Oct 4, 2018 * Status: accepted * Discussion: `GH 1419`_ Abstract ======== For the past few years, we have had tutorial notebooks and example datasets in a second ``gammapy-extra`` repository, as well as other example datasets placed in different repositories like ``gamma-cat`` and ``gammapy-fermi-lat-data``. The motivation was to keep the main ``gammapy`` code repository small. But we always had problems with code, tutorials and data changing and versions not being linked. We propose to move the notebooks to the ``gammapy`` repository so that code and tutorials can be version-coupled, and to only use stable datasets in tutorials, which mostly avoids the versioning issues. The datasets will remain in the ``gammapy-extra`` repository. To ship tutorials and datasets to users, we propose to add a ``gammapy download`` command. The ``gammapy-extra`` repository will remain as a repository for developers and as one place where datasets can be put, but it will not be mentioned to users. What we have ============ We have the `gammapy`_ repository for code, and the `gammapy-extra`_ repository for tutorial notebooks, example datasets and a few other things. The ``gammapy`` repository is currently 12 MB and the ``gammapy-extra`` repository is 161 MB. In ``gammapy-extra/notebooks``, we have ~30 tutorial notebooks, each 20 kB to 1 MB in size, i.e. a few MB in total. Most of the size comes from PNG output images in the notebooks, and they usually change on re-run, i.e. even though git compresses a bit, the repo grows by up to 1 MB every time a notebook is edited. The datasets we access from the tutorials are maybe 20 or 30 MB; a lot of the datasets we have there are old and should be cleaned up / removed. The reason the notebooks and datasets were split out from the code was to keep the code repository small and avoid it growing to 100 MB or even 1 GB over time. This separation of code vs. notebooks and datasets has been a problem in Gammapy for years. Given that Gammapy code changes (and probably always will, even if the pace will become slower and slower over the years), the tutorials have to be version-coupled to the code. A related question is how tutorial notebooks, datasets used in tutorials and other datasets not used in the tutorials are shipped to users. Some related discussions may be found in the following references, see e.g. `GH 1237`_, `GH 1369`_, `GH 700`_, `GH 431`_, `GH 405`_, `GH 228`_, `GH 1131`_ (probably missed a few). Proposal ======== This proposal is limited. It outlines a few key changes in the setup for notebooks and example data that mostly solve the versioning and shipping issues of tutorials and datasets. Other related issues that may appear will be addressed iteratively, with or without an extra PIG. To solve the versioning issue for notebooks, we propose to move the notebooks from ``gammapy-extra/notebooks`` to ``gammapy/notebooks``. We propose to store the notebooks in the repository with output cells stripped. Removing output cells before committing has the advantage that the files are small, and that the diff in the pull request is also small and review becomes possible. On the other hand, the output is not directly visible on GitHub.
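For illustration, stripping outputs can be done with a few lines of Python on top of the ``nbformat`` package (a minimal sketch with a placeholder file name; the `nbstripout`_ tool mentioned just below automates the same idea):

.. code-block:: python

    # Minimal sketch: remove output cells and execution counts from a
    # notebook before committing it. Assumes the ``nbformat`` package.
    import nbformat

    nb = nbformat.read("tutorial.ipynb", as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []
            cell.execution_count = None
    nbformat.write(nb, "tutorial.ipynb")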
Note that in any case, a rendered version of the notebooks will be available via the docs; that is already in place. We count on developer documentation and code review to guarantee that notebooks stored in ``gammapy/notebooks`` have empty outputs, though we can also explore `nbstripout`_ as a potential automated mechanism to remove outputs from notebooks in the GitHub repository. In the process of documentation building, the notebooks will be tested and executed automatically, so the static HTML-formatted notebooks will contain output cells rendered in the documentation. In contrast, links to Binder notebooks and download links to .ipynb files will point to empty-output notebooks. To solve the versioning issue for datasets, we propose to only use stable example datasets. Examples are `gammapy-fermi-lat-data`_ or ``gammapy-extra/datasets/cta-1dc`` or the upcoming `HESS DL3 DR1`_ or ``joint-crab`` datasets. Datasets can be in ``gammapy-extra`` or at any other URL, but even if they are in ``gammapy-extra``, they should not be "live" datasets. If an issue is found or something is improved, a separate new dataset should be added, instead of changing the existing one. So versioning of example datasets is not needed. To ship notebooks and example data to users, we propose to introduce a ``gammapy download`` command. The work and discussion on how this should work in detail has started in `GH 1369`_. Roughly, the idea is that users will use ``gammapy download`` to download a version of the notebooks matching the version of the Gammapy code, by fetching the files from GitHub. A ``gammapy download tutorials`` command will download all notebooks and the related input datasets. Output datasets from the notebooks will not be downloaded. All files will be copied into a ``$CWD/gammapy-tutorials`` folder, with the datasets placed in a ``datasets`` subfolder and the notebooks in a ``notebooks-x.x`` subfolder accounting for the version downloaded. The management of updating the ``gammapy-tutorials`` folder after a local update of ``gammapy`` is left up to the user. The URLs of the input files used by the notebooks should be noted in the ``tutorials/notebooks.yaml`` file in the Gammapy repository, which also defines the list of notebooks to download as tutorials. For the different stable releases, the list of tutorials to download, their locations and the datasets used are declared in YAML files placed in the ``download/tutorials`` folder of the `gammapy-webpage`_ GitHub repository. The same happens for conda working environments of stable releases, declared in files placed in the ``download/install`` folder of that repository. The datasets are not versioned and are similarly declared in the ``download/data`` folder. As far as we can see, for testing and online Binder these changes don't introduce significant improvements or new problems, though a Dockerfile in the ``gammapy`` repository will be needed to have these notebooks running in Binder. This is a change that will just affect developers and users on their local machines. Alternatives ============ One alternative would be to keep the notebooks in ``gammapy-extra``, and to couple the version with ``gammapy`` somehow, e.g. via a git submodule pointer, or via a config file in one of the repos or on gammapy.org with the version to be used. The mono repo approach seems simpler and better. For shipping the notebooks, one option is to include them in the source and binary distribution as package data, instead of downloading them from the web.
For datasets this is not a good option; it would limit us to 10-30 MB max, i.e. we would get a split between some datasets distributed this way, and larger ones still via ``gammapy download``. Overall it doesn't seem useful; note that we also don't ship HTML documentation with the code, but separately. Decision ======== This PIG was extensively discussed on GitHub, as well as in online and in-person meetings. It was then implemented in summer 2018, and we shipped the new setup with Gammapy v0.8 and tried the development workflow for a few weeks. The solution works well so far and does solve the notebook and dataset issues that motivated the work. It was finally approved during the Gammapy coding sprint on Oct 4, 2018. .. _GH 1419: https://github.com/gammapy/gammapy/pull/1419 .. _GH 1369: https://github.com/gammapy/gammapy/pull/1369 .. _GH 1237: https://github.com/gammapy/gammapy/issues/1237 .. _GH 1131: https://github.com/gammapy/gammapy/issues/1131 .. _GH 700: https://github.com/gammapy/gammapy/pull/700 .. _GH 431: https://github.com/gammapy/gammapy/pull/431 .. _GH 405: https://github.com/gammapy/gammapy/issues/405 .. _GH 228: https://github.com/gammapy/gammapy/issues/288 .. _gammapy: https://github.com/gammapy/gammapy .. _gammapy-extra: https://github.com/gammapy/gammapy-extra .. _gammapy-fermi-lat-data: https://github.com/gammapy/gammapy-fermi-lat-data .. _HESS DL3 DR1: https://www.mpi-hd.mpg.de/hfm/HESS/pages/dl3-dr1/ .. _nbstripout: https://github.com/kynan/nbstripout ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-005.rst0000644000175100001770000002042614721316200020444 0ustar00runnerdocker.. include:: ../../references.txt .. _pig-005: *************************** PIG 5 - Gammapy 1.0 roadmap *************************** * Author: Axel Donath, Régis Terrier & Christoph Deil * Created: Sep 28, 2018 * Accepted: Jan 31, 2019 * Status: accepted * Discussion: `GH 1841`_ Abstract ======== This PIG describes the required short- and medium-term **development work up to the Gammapy 1.0** release. The anticipated time scale for this development effort is **9 - 12 months**, and it will be concluded by the Gammapy 1.0 release in fall 2019. The question of **API design and sub-module structure for Gammapy 1.0 will be addressed in separate PIGs**. The content of this document was decided based upon user feedback from the first CTA data challenge (DC1), experience from analysing existing datasets, as well as the definition of use cases (see below). The content will be **updated in the coming months** and adjusted to upcoming **requirements defined by CTA**. Current requirements defined by CTA are described in the observer access use cases (private link to slides_) and in the document summarizing the SUSS workshop of Dec. 2018 (private link to indico_). Releases ======== Starting in fall 2018, we will create stable releases **every 2 months**. We plan to release Gammapy 1.0 in November 2019, and to add a Gammapy 0.15 release in October 2019, to support increased user testing and feedback before releasing 1.0. Our definition of Gammapy 1.0 is that Gammapy supports the major IACT analysis use cases (spectra, maps, lightcurves) and the work outlined below has been mostly achieved. A roadmap and release plan post Gammapy v1.0 will be discussed and written in fall 2019.
One option we could envision is to continue to develop Gammapy at a rapid pace with frequent releases throughout 2020 in the v1.x series, and to create bugfix releases for v1.0 in a v1.0.x series. * Gammapy 0.9 in November 2018 * Gammapy 0.10 in January 2019 * Gammapy 0.11 in March 2019 * Gammapy 0.12 in May 2019 * Gammapy 0.13 in July 2019 * Gammapy 0.14 in September 2019 * Gammapy 0.15 in October 2019 * Gammapy 1.0 in November 2019 There is some flexibility in the schedule within each given month. With this process we aim to enhance user feedback as well as set intermediate **milestones for the development progress**. Meetings ======== We plan to hold **three coding sprints** up to the Gammapy 1.0 release. We plan to continue the **weekly developers calls** every Friday at 10 am. In addition we could start **monthly Gammapy user calls**, for regular user support and feedback (to be discussed). We plan to hold **Gammapy workshops and tutorials** at upcoming science and collaboration meetings (to be discussed). Projects ======== The actual **development work will be structured in projects**. Each project is tackled by a team of (at least) two developers. They take over **responsibility for writing a PIG document** for the project as well as **take care of its actual implementation**. The PIG will be written in close **collaboration with the lead development team**. For the implementation we recommend a workflow where typically one person works on the implementation while the other is available for discussion and code review. We have defined the following projects: Maintenance and Code Quality ---------------------------- Continue the clean-up process of Gammapy. Improve code, test coverage and test quality in general. Change to a more uniform code style for tests. Reduce the runtime of tests. Implement required bugfixes. Maintenance is as important as adding new features, but will be mostly taken over by experienced developers. Improve the Gammapy development workflow. Improve developer documentation. Define GitHub labels, projects and milestones to reflect the content of the roadmap. Documentation ------------- Improve documentation structure and content. Improve install instructions. Improve existing tutorial notebooks and add missing topics. Data and Observation handling ----------------------------- Implement support for good time intervals (GTIs). Simplify DL3 data access and simplify creation of custom index files. Implement support for event types. IRFs ---- Clean up and partly redesign the ``gammapy.irf`` sub-package. Implement IRF coordinate handling, unify axis handling with `gammapy.maps`. Evaluate the use of maps to store IRFs. Work on the IRF interface and data formats in close collaboration with `ctapipe`_. Implement support for event types. Maps ---- Unify coordinate and unit handling in `gammapy.maps`. Migrate the healpix code from `healpy`_ to ``astropy_healpix``. Finish the implementation of multi-resolution maps (low priority). Map Analysis / Data Reduction ----------------------------- Unify and improve integration of background and exposure maps along the energy axes. Improve performance of the model evaluation by using bounding boxes and caching (low priority). Add support for healpix maps (low priority). Implement 3D background model creation. Better expose classical image based background methods such as the ring and adaptive-ring background. Implement spectral points estimation with 3D analysis.
Datasets -------- Implement a ``Dataset`` or ``Observation`` container class that bundles data and reduced IRFs and is used to evaluate the likelihood. Enable joint fits across multiple datasets. Enable joint Fermi-LAT / IACT analyses. Modeling -------- Unify quantity support for model evaluation. Implement coordinate frame handling for spatial models. Implement full support of the XML I/O as well as improve the existing YAML I/O. Add missing models. Implement (hierarchical) model parameter name handling and improve the parameter user interface. Add support for Bayesian priors on model parameters. Add support for handling tied parameters. Fitting ------- Design and implement configuration and result handling. Finish the implementation of the unified fitting front end in ``gammapy.utils.fitting``. Fully support the ``sherpa`` fitting backend. Add further fitting backends, such as ``scipy.optimize`` or ``emcee``. Implement fitting helper and diagnostic methods to compute likelihood contours. Improve interactive handling of the fitting front end. Event Simulation ---------------- Implement an event sampler, required for Gammapy to participate in and simulate part of the CTA DC2 data. Timing Analysis --------------- Rewrite the current lightcurve estimation. Improve the existing ``Lightcurve`` class. Implement 3D analysis based lightcurve estimation. High level interface -------------------- Implement a config-file based high level analysis interface (e.g. as used in ``fermipy``) and command line tool. It gives access to limited, pre-scripted standard analysis workflows. Alternatively, the high level analysis interface could generate pre-filled Python scripts or notebooks that can be edited and executed by users. Papers ------ We plan to write a paper on Gammapy in fall 2019, describing Gammapy v1.0. There will be a HESS validation paper. We will support the paper by implementing required bugfixes and features on short time scales. Authors of other papers, please also get in contact with the Gammapy team and let us know about required developments. Project Management ================== This roadmap document will result in a series of subsequent PIGs, which are written and implemented by lead or contributing developers who take responsibility for one or multiple of the projects described above. Each of those project PIGs should define a list of proposed pull requests, with preliminary milestones (version numbers as listed above) assigned. For each development project we will create a GitHub project and list the proposed pull requests as issues under the project. Responsibilities, updated milestones and implementation details are discussed in those issues. The general progress of the development work can be tracked using the GitHub project board. Decision ======== The PIG was discussed extensively in `GH 1841`_, resulting in many improvements and changes. The Gammapy roadmap was accepted by the CC, after the deadline for comments elapsed on January 31. .. _GH 1841: https://github.com/gammapy/gammapy/pull/1841 .. _slides: https://forge.in2p3.fr/login?back_url=https%3A%2F%2Fforge.in2p3.fr%2Fprojects%2Fobserver-access-use-cases%2Fwiki%2FScience_Tools_Use-Cases). ..
_indico: https://indico.cta-observatory.org/event/2070/ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-006-class-diagram.png [binary PNG image stripped: the PIG 6 class diagram referenced below under "General idea and class diagram", showing the proposed ``Observations`` / ``Observation`` / ``ObservationFilter`` / ``DataStore`` classes] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-006.rst0000644000175100001770000002575414721316200020456 0ustar00runnerdocker.. include:: ../../references.txt .. _pig-006: ******************************** PIG 6 - CTA observation handling ******************************** * Author: David Fidalgo, Christoph Deil, Régis Terrier et al * Created: Oct 10, 2018 * Withdrawn: Oct 31, 2019 * Status: withdrawn * Discussion: `GH 1877`_ Abstract ======== In this PIG we want to outline an improvement of the CTA observation handling in Gammapy. The current handling has some limitations and misses some key features, like selecting a given time interval and performing an analysis on it. We also want to start using GTIs to compute the observation time, instead of relying on header information. Since specifics of the final data format of CTA are still unclear, this PIG does not aim for a final implementation of CTA observation handling, nor the handling of all types of data from various instruments foreseen for Gammapy in the future. Instead it focuses on trying to resolve current limitations when analyzing IACT data, namely an analysis of time or phase selected observations, as well as an analysis of an observation stored solely in memory or given in the ctools XML format. This PIG emerged at the Coding Sprint in Madrid 2018. Some terminology ================ This terminology and the definition of the terms will probably change or be expanded once the final data format of CTA is decided. For now we will use the following definitions to go forward in Gammapy: - **GTI (good time interval)**: - time intervals for which the instrument was on - the sum over these time intervals equals the *on time* - a GTI table will be required for each observation - **Observation**: - has a unique identifier, the *obs_id* - contains 1 event list, 1 GTI table and 1 corresponding set of IRFs (aeff, edisp, psf) - a GTI table is more convenient than a ``START`` and ``STOP`` time; it is more flexible and allows cutting out time intervals in between - has 1 dead-time fraction (``DEADC``); the live time of the observation is obtained as ``ONTIME * (1 - DEADC)`` - however, in the future ``DEADC`` is probably going to be `absorbed into the IRFS`_, then we should switch from *live time* to *on time* when calculating the exposure * **Observation library**: - the repository where the observations and IRFs are stored - it has a manifest connecting all observation files and their IRFs Status ====== In the current implementation we use the ``DataStore`` class to connect to an *observation library* that consists of two FITS files, the *observation index table* and the *HDU index table* (see `gadf`_). The ``DataStoreObservation`` is a proxy object to the ``DataStore`` and points to a single *observation* in the *observation library*.
The ``DataStoreObservation`` never holds on to the observation data in memory and only reads it when explicitly accessed (``.events``, ``.gti``, ``.aeff``, etc.). However, it does some caching of meta data. Recently a new observation class ``ObservationCTA`` was added that is capable of storing all observation data in memory. However, with the new scheme proposed further down, this class becomes superfluous and can again be removed.

Normally the user starts an analysis by creating a ``DataStore`` object, from which she/he extracts an ``ObservationList`` by providing *obs_ids*. All analysis classes take an ``ObservationList`` as input, which is basically a Python list of ``DataStoreObservation`` objects.

Limitations
-----------

* We can only analyze data whose *observation library* has the index table format.
* The current scheme does not allow the user to analyze a given time interval. Only full runs/observations can be processed.
* We do not support event selections before starting the analysis steps, such as event type selection, phase selection or time selection.
* It is cumbersome to write a copy of an observation (possibly modified) back to disk.

Objectives
==========

* Users should be able to run an analysis (1D and 3D) on a given time interval, independent of the time intervals of the single observations.
* We should support event selections before starting the analysis steps (event phase, event type, ...) and possibly allow for custom selections, e.g. ``MC_ID`` in the current CTA 1DC. For now we aim only for a time and phase filtering of events; a filtering on event type, for example, is more complex and is left for a future PIG.
* We want to support more formats for the *observation library* (like the ctools XML format)
* We want to support observations (especially event lists) stored in memory

Use cases / scenarios
---------------------

1. Scenario: 1D/3D analysis for a user specified time interval

   | **Given** a user specified time interval
   | **When** performing a 1D/3D analysis of the corresponding observations
   | **Then** get results only for the specified time interval

2. Scenario: 1D/3D analysis for a user specified pulsar phase interval

   | **Given** a user specified pulsar phase interval
   | **When** performing a 1D/3D analysis of the corresponding observations
   | **Then** get results only for the specified phase interval

- Make GPS survey maps for CTA (process 10 GB of events data and 3000 runs, testing the memory usage)
- Create an observation *from scratch* and write it to disk with index files
- Read in an observation from an *observation library* given in the ctools XML format

What others have
================

With *Fermi-LAT*, you always have ``gtselect`` and ``gtmktime`` at the start (see `Fermi-LAT data preparation`_). Subsequent analysis steps partly rely on "data sub space" (DSS) header keys for processing, which are stored in the output FITS files of the ``gtselect`` and ``gtmktime`` steps.

In *ctools*, every analysis starts with an "observation definition file" in XML format. One can then select observations with ``csobsselect``, creating a new observation definition file. A selection based on events is achieved by running ``ctselect``.

Proposal
========

General idea and class diagram
------------------------------

The general idea is to have an ``Observations`` class that is the starting point of all analyses and is passed on to the analysis classes of the 1D and 3D analysis (effectively it replaces the ``ObservationList`` class).
The user should only have to interact with this class, which makes it an **interface** to the other classes described in the following (``Observation`` and ``DataStore``), and therefore it mainly consists of *convenience functions*. The ``Observations`` class holds a list of ``Observation`` objects. The ``Observation`` class is essentially a **proxy class** to the *data store* classes. In addition, an ``Observation`` object will also hold an ``ObservationFilter`` object, which is used to **orchestrate the filtering** of the data, mainly the event list. The filtering is applied *on the fly* when accessing the observation data. In this way we avoid storing the modified observation data in memory, which is important for the last use case specified above.

The different **data store** classes are (this still needs to be discussed in more detail):

- ``DataStoreIndex``: This is basically just a renaming of the current ``DataStore``
- ``DataStoreXML``: This class is able to read XML files as used for *ctools* (maybe this class can be combined with the ``DataStoreIndex``)
- ``DataStoreInMemory``: This *data store* class is special in the sense that it does not point to files on disk, but holds the data in memory. This can be useful when creating observations from *scratch*, by simulating the event list for example.

All *data store* classes inherit from a **parent** ``DataStore`` class that names the necessary methods, which have to be implemented by the child classes. The new scheme proposed is illustrated by the class diagram below. The attributes and methods of the classes are not fully worked out and are merely suggestive.
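To make the intended interplay concrete, here is a minimal usage sketch (class names follow the proposal above; ``get_observations``, the index path and the *obs_ids* are illustrative assumptions)::

    # create the Observations object through a data store
    data_store = DataStoreIndex("$CTADATA/index")
    observations = data_store.get_observations(obs_ids=[110380, 111140])

    # filtering attaches an ObservationFilter and returns a new Observation;
    # no event data is copied or modified at this point
    obs = observations[0].select_time(time_interval)

    # the filter is only applied on the fly when data is accessed
    events = obs.events  # events restricted to the time interval
    gti = obs.gti        # GTI table shortened accordingly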
Implementation road map
-----------------------

We will outline the road map in the form of scenarios that we want to achieve along the way and that can ideally be implemented with a few PRs each. We split the implementation into two big steps:

* first we want to focus on implementing the ``Observations``, ``Observation`` and ``ObservationFilter`` classes
* the second step is the work on the ``DataStore`` classes

**Scenarios**:

1. Scenario: Run a 1D/3D analysis with the ``Observations`` class

   | **Given** a basic version of the ``Observations`` class
   | **When** passed on to the analysis classes
   | **Then** should behave the same as the current ``ObservationList`` class
   | **PRs**: ObservationList -> Observations, initialize with a list of ``DataStoreObservation``; implement ``__len__``, ``__getitem__``; adapt notebooks

2. Scenario: Add an empty filter to an ``Observation``

   | **Given** a basic version of the ``Observation`` and ``ObservationFilter`` class
   | **When** accessing ``.events``, ``.gti`` of the ``Observation``
   | **Then** automatically apply the empty filter on the fly
   | **PRs**: ``DataStoreObservation`` -> ``Observation``; create ``ObservationFilter`` class; add an ``ObservationFilter`` to each ``Observation``; develop basic API

3. Scenario: filter an ``Observation`` by time

   | **Given** a user specified time interval
   | **When** we give the time interval to an ``Observation``
   | **Then** return a new ``Observation`` with the according time filter
   | **PRs**: Introduce time filters for events and GTIs; add ``select_time`` method to the ``Observation`` class

4. Scenario: filter an ``Observation`` by pulsar phase

   | **Given** a user specified phase interval
   | **When** we give the phase interval to an ``Observation``
   | **Then** return a new ``Observation`` with the according phase filter
   | **PRs**: Add ``select_phase`` method to the ``Observation`` class

5. Scenario: filter ``Observations`` by time

   | **Given** a user specified time interval
   | **When** we give the time interval to the ``Observations``
   | **Then** return a new ``Observations`` holding the selected ``Observation`` objects with the respective time filter
   | **PRs**: Add ``select_time`` method to the ``Observations`` class

Proposed class diagram
----------------------

.. image:: pig-006-class-diagram.png
    :alt: Class diagram

Decision
--------

This PIG was written in fall 2018. It was intended as a design document and work plan, but didn't reach a state where it was complete and ready for review. Then the original author left astronomy, and the other developers continued throughout 2019 without using the PIG. We now feel that withdrawing the PIG is the best course of action, making room for new pull requests and PIGs in the future to improve CTA observation handling in Gammapy. It's not a "solved problem"; instead it will be an intense focus of work in Gammapy and CTA in the coming years.

.. _GH 1877: https://github.com/gammapy/gammapy/pull/1877
.. _absorbed into the IRFs: https://github.com/open-gamma-ray-astro/gamma-astro-data-formats/issues/62#issuecomment-428221596
.. _Fermi-LAT data preparation: https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/data_preparation.html

././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-007.rst0000644000175100001770000004436514721316200020456 0ustar00runnerdocker.. include:: ../../references.txt

.. _pig-007:

**************
PIG 7 - Models
**************

* Author: Axel Donath, Christoph Deil & Regis Terrier
* Created: Dec 17, 2018
* Withdrawn: Oct 7, 2019
* Status: withdrawn
* Discussion: `GH 1971`_

Introduction
============

One of the most important features of Gammapy is the modeling of gamma-ray data by parametric models. This includes the modeling of source spectra by spectral models, the modeling of source morphologies by spatial models, the modeling of light curves or phase curves with temporal models, and combinations thereof. Instrument specific characteristics, such as PSF, energy resolution, effective area and background, are also considered as part of the model framework. Models are used in interactive analysis, as well as in scripted or command line based analyses. For this reason they must implement an easy to use API as well as serialization to disk. Gammapy should provide a variety of built-in models, allow combining existing models using limited arithmetic, and offer the possibility for users to easily implement custom models. A lot of development work has already been put into the modeling framework of Gammapy. This PIG outlines the missing development work on the model framework for Gammapy v1.0.

Proposal
========

Introduce naming scheme for models
----------------------------------

Currently Gammapy implements four main categories of models: spectral models, morphology or spatial models, combined spectral and spatial models (``SkyModel``) and lightcurve or phase models.
We propose a uniform naming scheme for these model classes, based on the prefixes ``Spectral...``, ``Spatial...``, ``Temporal...``:

+-----------------+----------------------------------------------+----------------------+----------------------+-----------------------------+
| Base class      | Template class                               | IRF folded class     | Background class     | Model classes               |
+=================+==============================================+======================+======================+=============================+
| SpectralModel   | SpectralTemplate                             | SpectralIRFModel     | SpectralBackground   | Powerlaw, LogParabola, ...  |
| (SpectralModel) | (TableModel)                                 |                      |                      |                             |
+-----------------+----------------------------------------------+----------------------+----------------------+-----------------------------+
| SpatialModel    | SpatialTemplate                              | SpatialIRFModel?     | SpatialBackground?   | Shell, SpatialGaussian, ... |
|(SkySpatialModel)| (SkyDiffuseMap)                              |                      |                      |                             |
+-----------------+----------------------------------------------+----------------------+----------------------+-----------------------------+
| TemporalModel   | PhaseCurveTemplate, LightCurveTemplate       |                      | TemporalBackground?  | Sine, Exponential, ...      |
|                 | (PhaseCurveTableModel),                      |                      |                      |                             |
|                 | (LightCurveTableModel)                       |                      |                      |                             |
+-----------------+----------------------------------------------+----------------------+----------------------+-----------------------------+
| SourceModel     | SourceTemplate                               | SourceIRFModel       | SourceBackground     | Arithmetic spectral/spatial |
| (SkyModel)      | (SkyDiffuseCube)                             |                      |                      | models?                     |
+-----------------+----------------------------------------------+----------------------+----------------------+-----------------------------+

The current class names are given in parentheses. Optional model classes, where the need is unclear, are marked with a question mark. Parametric models are named after their parametrization, such as ``Powerlaw`` or ``Gaussian``. In case the name is not unique (e.g. for a Gaussian spectral and spatial model), the corresponding prefix should be used, such as ``SpectralGaussian`` and ``SpatialGaussian``, or ``SpatialConstant`` and ``SpectralConstant``. In addition there is the ``SourceModels`` class to represent a sum of ``SourceModel`` objects.

Unify calling interface for models
----------------------------------

For model evaluation we would like to have a nice user interface that supports ``Quantity`` and ``SkyCoord`` objects on the one hand, and a performance oriented and flexible interface for efficient evaluation during model fitting on the other hand. We propose a unified implementation scheme for ``__call__`` and ``.evaluate()`` for all parametric models. The actual arithmetic of the model is defined in ``.evaluate()`` and should be implemented using numpy expressions only. This ensures compatibility with all the possible higher level callers that use ``Quantity``, ``SkyCoord`` or autograd tools.

::

    # for "normal" models it can be a static method
    @staticmethod
    def evaluate(energy, amplitude, index, reference):
        value = np.power(energy / reference, -index)
        return amplitude * value

    # "template" models, that need to access ``self.interpolate()``
    def evaluate(self, energy, amplitude):
        value = self.interpolate(energy)
        return amplitude * value

    # "IRF" models, that need to access ``self.geom``, ``self.psf`` etc.
    def evaluate(self, **kwargs):
        value = self.model(self.energy, **kwargs)
        npred = self.apply_psf(value)
        return npred

A nice user interface as well as parameter dispatching is implemented in the model base class's ``__call__`` method:

::

    # spectral model
    def __call__(self, energy):
        pars = self.parameters.quantities
        value = self.evaluate(energy, **pars)
        return value

    # spatial model
    def __call__(self, skycoord):
        pars = self.parameters.quantities
        lon = skycoord.data.lon.wrap_at("180d")
        lat = skycoord.data.lat
        value = self.evaluate(lon, lat, **pars)
        return value

Model evaluation for fitting will first be implemented using the ``__call__`` interface. In case we find performance issues related to ``Quantity`` and ``SkyCoord`` handling, or need auto differentiation tools such as ``autograd``, we can later implement a more efficient interface relying directly on ``Model.evaluate()``.

Introduction of background models
---------------------------------

For any kind of IACT analysis hadronic background models are required, which model the residual hadronic (gamma-like) background emission from templates provided by DL3 or by generic parametric models. The background templates differ from source templates, because they are defined in reconstructed spatial and energy coordinates. We propose the introduction of the following two model classes:

BackgroundModel
~~~~~~~~~~~~~~~

Base class for all background models.

BackgroundIRFModel
~~~~~~~~~~~~~~~~~~

Class to represent a template background model. It is initialized with a background map template and introduces additional background parameters to adjust the spectral shape and amplitude. This model can only be evaluated on a fixed grid, defined by the input map. The model is interpolated in the preparation step. No IRFs are applied.

::

    from gammapy.cube import BackgroundIRFModel

    background_map = Map()
    background = BackgroundIRFModel(template=background_map, norm=1, tilt=0.1)
    npred = background.evaluate()

Introduction of "forward folded" models
---------------------------------------

To combine parametric model components with their corresponding instrument response functions, such as PSF, energy dispersion and exposure, we propose the introduction of "forward folded" models. These take a fixed data grid, typically defined by the exposure map, on which the model is evaluated. After application of the instrument response functions, the number of predicted counts is returned by the ``.evaluate()`` method. We propose to introduce the following classes:

SpectralIRFModel
~~~~~~~~~~~~~~~~

A "forward folded" spectral model, that applies energy dispersion and effective area to a ``SpectralModel``. It corresponds in functionality to the current ``CountsPredictor``.

::

    from gammapy.spectrum import Powerlaw, SpectralIRFModel

    pwl = Powerlaw()
    spectral_irf_model = SpectralIRFModel(model=pwl, exposure=exposure, edisp=edisp)
    npred = spectral_irf_model.evaluate()

SpatialIRFModel
~~~~~~~~~~~~~~~

A convolved model, that applies the PSF and exposure to a ``SpatialModel``. The need for this model is not clear, as we might solve the use case of morphology fitting by a combined spatial and spectral analysis with one energy bin.

SourceIRFModel
~~~~~~~~~~~~~~

A "forward folded" source model, that applies IRFs to a ``SourceModel`` instance and returns an integrated quantity corresponding to predicted counts. It can only be evaluated on a fixed grid, which is defined on initialization by the exposure map. In addition it takes reduced PSF and EDISP objects.
In functionality it corresponds to the current ``MapEvaluator``, but with the model parameters attached and the background handling removed. Additional "hidden" IRF parameters, e.g. to propagate systematics, could optionally be introduced later. This class will replace the current ``MapEvaluator``.

::

    from gammapy.cube import SourceIRFModel, SourceModel

    model = SourceModel()
    source_irf_model = SourceIRFModel(model, exposure=exposure, edisp=edisp, psf=psf)
    npred = source_irf_model.evaluate()

Improve SourceModels class
--------------------------

The existing ``CompoundSkyModel`` is a very generic abstraction to support any kind of arithmetic between ``SkyModel`` objects, but the number of use cases for operators other than "+" is very limited, and those can always be achieved by implementing a custom model. Serialization and component handling of this hierarchical model component structure is intrinsically difficult. For this reason we propose to first support and improve the existing ``SourceModels``, which implements an easier to handle flat hierarchy for model components. Support for arbitrary model arithmetic can be introduced, if needed, after Gammapy v1.0. We propose to remove the ``CompoundSourceModel`` and reimplement the ``+`` operator using the ``SourceModels`` class.

::

    from gammapy.cube import SourceModel, SourceModels

    component_1 = SourceModel()
    component_2 = SourceModel()

    total_model = component_1 + component_2
    assert isinstance(total_model, SourceModels)

    # becomes equivalent to
    total_model = SourceModels([component_1, component_2])

Possibly remove the ``SpectralCompoundModel`` for consistency.

Introduction of model name attributes
-------------------------------------

The ``SourceModel`` class should be improved to support model component names. This way the ``SourceModels`` object can be improved to access model components by name, and the XML serialization (which supports names) becomes complete. The following example should work:

::

    from gammapy.cube import SourceModel, SourceModels

    component_1 = SourceModel(name="source_1")
    component_2 = SourceModel(name="source_2")
    total_model = SourceModels([component_1, component_2])

    total_model["source_1"].parameters["index"].value = 2.3

    # or alternatively??
    total_model.parameters["source_1.index"].value = 2.3

    # delete a model component in place
    del total_model["source_2"]

This also simplifies the parameter access in the fitting back-end, because parameter names become unique. E.g. there is no need for cryptic ``par_00X_`` parameter names in the minuit backend, which simplifies debugging and interaction by the user with methods such as ``Fit.likelihood_profile()`` or ``Fit.confidence()``, where parameter identifiers must be given.

Improve and implement model serialization
-----------------------------------------

Serialization of models is an important feature for standard analyses. In ``gammapy.utils.serialization`` there is a prototype implementation of XML serialization for ``SourceModels``. We propose to fix this existing XML serialization prototype so that the following easily works:

::

    from gammapy.cube import SourceModels

    model = SourceModels.read("model.xml")
    print(model)
    model.write("model.xml")

In addition we propose to implement an additional YAML serialization format, which results in human-readable model configuration files.
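To illustrate, such a YAML model file could look roughly like the sketch below. This is only meant to convey the idea; the exact schema (field names and parameter encoding) is an assumption here and would be defined during implementation::

    components:
      - name: source_1
        spectral:
          type: Powerlaw
          parameters:
            index: 2.3
            amplitude: 1e-12 cm-2 s-1 TeV-1
            reference: 1 TeV
        spatial:
          type: SpatialGaussian
          parameters:
            lon_0: 0.0 deg
            lat_0: 0.0 deg
            sigma: 0.2 deg
      - name: source_2
        spectral:
          type: LogParabola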
Once available, the YAML format should be the preferred serialization:

::

    model = SourceModels.read("model.yaml")
    print(model)
    model.write("model.yaml")

The two file formats should be handled automatically in ``SourceModel.read()`` and ``SourceModel.write()``.

Improve spatial models
----------------------

Implement sky coordinate handling
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Implement support for coordinate systems. Add an interface for ``SkyCoord``, by introducing a ``SpatialModel.skycoord`` attribute.

::

    from astropy.coordinates import SkyCoord
    from gammapy.image.models import PointSource

    # instantiation using a SkyCoord object
    point_source = PointSource(skycoord=SkyCoord(0, 0, frame="galactic", unit="deg"))

    # or
    point_source = PointSource(lon="0 deg", lat="0 deg", frame="galactic")

    skycoord = point_source.skycoord
    assert isinstance(skycoord, SkyCoord)

    # evaluation of models using sky coordinates
    coords = SkyCoord([0, 1], [0.5, 0.5], frame="galactic", unit="deg")
    values = point_source(coords)

    # maybe override evaluation for efficient evaluation for fitting
    values = point_source(lon, lat)

Implement default parameters
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

For spectral models Gammapy has defined default parameters, which simplifies the interactive usage of the models. From this experience we propose to introduce default parameters for ``SpatialModel`` as well as ``SourceModel``.

::

    from gammapy.image.models import Shell
    from gammapy.cube.models import SourceModels

    shell = Shell()
    print(shell)

Implement evaluation region specifications
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

For efficient evaluation of model components, we propose that ``SpatialModel`` classes declare a circular region ``.evaluation_radius`` on which they are evaluated.

::

    from gammapy.image.models import Shell

    shell = Shell()
    print(shell.evaluation_radius)

The ``.evaluation_radius`` should be a property of the model that is computed depending on the model parameter that determines the spatial extension of the model. The handling of ``PointSource`` and of additional evaluation margins, which depend on the pixel size, is done in the step where the model component is evaluated.

Expose model parameters as attributes
-------------------------------------

To simplify the interactive handling of model instances, model parameters should be exposed as attributes on the model, so that the following works:

::

    from gammapy.spectrum.models import Powerlaw

    pwl = Powerlaw()
    pwl.amplitude.value = "1e-12 cm-2 s-1 TeV-1"

    # instead of
    pwl.parameters["amplitude"].value = "1e-12 cm-2 s-1 TeV-1"

The attributes should be auto-generated in the ``Model`` base class, based on the list of parameters defined by the model.

Add new parametric models
-------------------------

We propose to implement the following parametric models:

- SkyGaussianElongated: Non radial-symmetric Gaussian. In planar approximation...
- SkyDiskElongated: Non radial-symmetric disk. In planar approximation...
- SpectralGaussian: Spectral Gaussian model. ``LogGaussian``?

Task list
=========

This is a proposal for a list of pull requests implementing the proposed changes, ordered by priority:

1. Implement a ``BackgroundModel`` base class and a ``BackgroundIRFModel`` class. Use the ``BackgroundIRFModel`` object in the current ``MapEvaluator``. Join the background parameters with the source model parameters and expose the joined parameter list at the ``MapEvaluator`` object, so that it can be used in the current ``MapFit``. (**v0.10**)
Reimplement the "+" operator for ``SkyModel`` using the ``SkyModels`` class. Remove the ``CompoundSkyModel`` class. (**v0.XX**) 3. Implement support for model component names in ``SkyModel``. Implement ``SkyModels.__getitem__`` to allow for access of model components by name. (**v0.XX**) 4. Fix existing XML serilization of ``SkyModels`` and add serialization of the model component name. (**v0.XX**) 5. Implement a prototype YAML serialization for ``SkyModels``. Add ``SkyModels.from_yaml()`` and ``SkyModels.to_yaml()`` methods. Extend ``SkyModels.read()`` to recognise the file type. (**v0.XX**) 6. Implement support for coordinate systems in the ``SpatialModel`` base class. Add a ``SpatialModel.skycoord`` attribute to return a ``SkyCoord`` object and allow initialization of the spatial model positions with ``SkyCoord``. (**v0.XX**) 7. Implement the ``evaluation_radius`` property for all spatial models. (**v0.XX**) 8. Implement default parameter values for spatial models and ``SourceModel``. (**v0.XX**) 9. Expose model parameters as attributes on the models. (**v0.XX**) 10. Rename all models according to the proposed naming scheme. (**v0.XX**) 11. Rename existing ``MapEvaluator`` to ``SourceIRFModel`` and remove background handling. Temporarily move the background handling to the ``MapFit`` class. (**v0.XX**) 12. Rename existing ``CountsPredictor`` to ``SpectralIRFModel`` and unify API with ``SourceIRFModel`` as much as possible. (**v0.XX**) Alternatives ============ The proposed model names are open for discussion. Instead of ``Spectral...``, ``Spatial...`` and ``Source...`` one could also use the suffixes ``...1D``, ``...2D`` or ``...3D``. Instead of ``...Template`` one could use ``...TableModel``. The ``...IRFModel`` class are basically evaluating models on given coordinate grids and correspond in functionality to the current ``MapEvaluator``. A naming scheme based on ``ModelEvaluator`` seems a good alternative proposal. This would include a ``SpectralEvaluator`` and a ``SourceEvaluator`` and ``BackgroundEvaluator`` class. Decision ======== The authors decided to withdraw the PIG. Most of the proposed changes have been discussed and implemented independently. .. _gammapy: https://github.com/gammapy/gammapy .. _gammapy-web: https://github.com/gammapy/gammapy-webpage .. _GH 1971: https://github.com/gammapy/gammapy/pull/1971 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-008.rst0000644000175100001770000004703614721316200020455 0ustar00runnerdocker.. include:: ../../references.txt .. _pig-008: **************** PIG 8 - Datasets **************** * Author: Axel Donath, Christoph Deil, Regis Terrier & Atreyee Sinha * Created: Jan 4th, 2019 * Withdrawn: Nov 30th, 2019 * Status: withdrawn * Discussion: `GH 1986`_ Abstract ======== An essential analysis feature of Gammapy will be the possibility to fit models to gamma-ray data. This includes the simple use-case of fitting spectral, spatial and combined models to binned datasets, but also includes more advanced scenarios such as joint-likelihood fitting of a model across multiple IACT observations, joint-likelihood fitting of IACT and Fermi-LAT data, combined analysis of gamma-ray data with flux points or a combined spectral and cube analysis. The joint-likelihood approach also allows for combining different event types in a single analysis. For this reason we propose to introduce the abstraction layer of a ``Dataset`` in Gammapy. 
A dataset bundles the reduced data with a parametric source model, background model, instrument response functions and the fit statistics function. It evaluates the model and returns the log-likelihood of the data, given the current model parameters. This approach allows introducing a uniform fitting interface, represented by a single ``Fit`` class, independent of the type of the analysis (spectral, spatial, cube). In addition, datasets can be combined by adding their log-likelihood values and by linking and concatenating their model parameters. This enables all combinations of the joint-likelihood analyses described above.

Proposal
========

In general the proposal includes three basic types of classes: ``Dataset``, ``Datasets`` and ``Fit``. The ``Dataset`` bundles the model and data, the ``Datasets`` class combines multiple datasets for joint-likelihood fitting, and the ``Fit`` class represents the interface to the optimization methods, which are implemented in ``gammapy.utils.fitting``. Here is an illustration of the proposed API:

.. code::

    from gammapy.utils.fitting import Dataset, Datasets, Fit

    dataset = Dataset()
    dataset.stat()

    fit = Fit(dataset)
    fit.optimize()

    datasets = Datasets([Dataset(), Dataset()])

    fit = Fit(datasets)
    fit.optimize()

    # or for convenience
    datasets = [Dataset(), Dataset()]
    fit = Fit(datasets)
    fit.optimize()

To enable the different analysis use-cases we propose to introduce a ``MapDataset``, ``SpectrumDataset``, ``SpectrumDatasetOnOff`` and ``FluxPointsDataset``.

Dataset
-------

The ``Dataset`` class is the abstract base class for all datasets. It defines the minimal required interface for a dataset to work with the ``Fit`` class and the basic user API. It serves as a base class for all built-in datasets, as well as for user defined ones:

.. code::

    from gammapy.utils.fitting import Dataset

    class UserDataset(Dataset):
        def __init__(self, model, data, mask_fit=None, mask_safe=None):
            self.model = model
            self.data = data
            self.parameters = model.parameters
            self.mask_safe = mask_safe
            self.mask_fit = mask_fit

        def stat_per_bin(self):
            return (self.model.evaluate() - self.data) ** 2

Any dataset derived from the base class has to define a ``.stat_per_bin()`` method, which returns the log-likelihood per bin given the current model parameters, and a ``.parameters`` attribute, which defines the parameters passed on to the ``Fit`` class. The dataset also (optionally) defines two kinds of masks: ``mask_safe`` and ``mask_fit``. The main purpose is that the safe data range defined by ``mask_safe`` is set in advance, e.g. during data reduction, and not modified later, while ``mask_fit`` can be modified by users to manually define the fit range, or by algorithms, e.g. flux point computation. For likelihood evaluation both masks are combined by the ``Dataset`` base class.

Datasets
--------

To combine multiple datasets in a joint-likelihood analysis we propose to introduce a ``Datasets`` class. Its responsibility is to add up the log-likelihood values of the individual datasets and to join their parameter lists. The ``Datasets`` class also represents the main interface to the ``Fit`` class.
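A minimal sketch of how this combination could look internally (illustrative only, not a definitive implementation):

.. code::

    class Datasets:
        def __init__(self, datasets):
            self.datasets = datasets

        @property
        def parameters(self):
            # join the parameter lists of all datasets; parameters of model
            # instances shared between datasets appear only once, which is
            # what links them in a joint fit
            ...

        def stat(self):
            # the total fit statistic is the sum over the individual
            # datasets, i.e. log-likelihoods are added
            return sum(dataset.stat() for dataset in self.datasets)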
The following example shows how a joint-likelihood analysis of multiple observations is supposed to work:

.. code::

    from gammapy.utils.fitting import Datasets
    from gammapy.cube import SkyModel

    model = SkyModel.read()

    map_datasets = []

    for obs_id in [123, 124, ...]:
        bkg_model = BackgroundModel.read("obs-{}/background.fits".format(obs_id))
        dataset = MapDataset(model=model, background_model=bkg_model)
        map_datasets.append(dataset)

    datasets = Datasets(map_datasets)
    datasets.stat()

    fit = Fit(datasets)

The linking of the parameters of the spectral model is natively achieved, as the same instance of the spectral model is passed to the different datasets, while the background model is created for every dataset separately.

A joint spectral fit across multiple observations:

.. code::

    model = SpectralModel()

    spectrum_dataset_1 = SpectrumDataset(model, ...)
    spectrum_dataset_2 = SpectrumDataset(model, ...)

    datasets = Datasets([spectrum_dataset_1, spectrum_dataset_2])

    fit = Fit(datasets)
    fit.optimize()

Combined spectral / flux points analysis:

.. code::

    model = SpectralModel()

    spectrum_dataset = SpectrumDataset(model=model)

    flux_points = FluxPoints.read()
    flux_point_dataset = FluxPointsDataset(flux_points=flux_points, model=model, stat="chi2")

    datasets = Datasets([flux_point_dataset, spectrum_dataset])

    fit = Fit(datasets)
    fit.optimize()

MapDataset
----------

To enable the standard combined spectral and spatial analysis we propose to introduce a ``MapDataset`` class. A ``MapDataset`` bundles the counts data, source model, IRFs and background model corresponding to a given event selection. It is supposed to work as follows:

.. code::

    from gammapy.cube import MapDataset

    model = SkyModel()
    background_model = BackgroundModel()

    counts = Map.read("counts.fits")
    exposure = Map.read("exposure.fits")
    edisp = EdispMap.read("edisp.fits")
    psf = PSFMap.read("psf.fits")

    dataset = MapDataset(
        counts=counts,
        exposure=exposure,
        edisp=edisp,
        psf=psf,
        model=model,
        background_model=background_model,
        mask_fit=None,
        mask_safe=None,
        stat="cash",
    )

    fit = Fit(dataset)
    fit.optimize()

For the ``psf`` argument three values are possible: ``None``, ``PSFKernel()`` or ``PSFMap``. In the first case the PSF convolution is skipped. In the second case a single PSF for the whole map is assumed. In the last case a spatially varying PSF is handled. The same holds for the ``edisp`` argument. In the proposed implementation the counts map and background model must be defined on the same map geometry.

A separate setup step in ``MapDataset.__init__`` creates cutouts of the exposure map for the individual model components, assigns the corresponding IRFs to the model components, and bundles everything into the already existing ``MapEvaluator`` object. The list of model evaluators is cached on the ``MapDataset`` and used later to efficiently compute the predicted number of counts. The ``stat`` argument allows choosing between built-in fit statistics such as ``cash`` or ``cstat`` for convenience.
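A minimal sketch of such a setup step (illustrative; the ``cutout`` call and the ``MapEvaluator`` signature are assumptions of this example):

.. code::

    def setup(self):
        self.evaluators = []
        for component in self.model:
            # cut out the part of the exposure map relevant for this component
            exposure = self.exposure.cutout(component.position, component.evaluation_radius)
            evaluator = MapEvaluator(
                model=component, exposure=exposure, psf=self.psf, edisp=self.edisp
            )
            self.evaluators.append(evaluator)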
MapDatasetOnOff
---------------

For on-off based analyses a ``MapDatasetOnOff`` will be introduced. It inherits most of its functionality from the ``MapDataset``, but in addition implements the handling of the ``counts_off``, ``acceptance`` and ``acceptance_off`` information:

.. code::

    from gammapy.cube import MapDatasetOnOff

    model = SkyModel()
    background_model = BackgroundModel()

    counts = Map.read("counts.fits")
    counts_off = Map.read("counts_off.fits")
    acceptance = Map.read("acceptance.fits")
    acceptance_off = Map.read("acceptance_off.fits")
    exposure = Map.read("exposure.fits")
    edisp = EdispMap.read("edisp.fits")
    psf = PSFMap.read("psf.fits")

    dataset_onoff = MapDatasetOnOff(
        model=model,
        background_model=background_model,
        counts=counts,
        counts_off=counts_off,
        acceptance=acceptance,
        acceptance_off=acceptance_off,
        exposure=exposure,
        edisp=edisp,
        psf=psf,
        mask_fit=mask_fit,
        mask_safe=mask_safe,
        stat="wstat",
    )

    fit = Fit(dataset_onoff)
    fit.optimize()

The ``background_model`` can optionally be defined for simulation or for fitting the ``counts_off`` data. The ``MapDatasetOnOff`` additionally implements the following methods needed for the evaluation of the fit statistic:

.. code::

    dataset = MapDatasetOnOff()

    dataset.alpha        # defined by acceptance / acceptance_off
    dataset.background   # defined by alpha * counts_off
    dataset.npred_sig()  # defined by npred - background

SpectrumDataset
---------------

For spectral analysis we propose to introduce a ``SpectrumDataset``:

.. code::

    from gammapy.spectrum import SpectrumDataset

    model = SpectralModel()
    background_model = SpectralBackgroundModel.read()
    edisp = EnergyDispersion.read()
    counts = CountsSpectrum.read()
    exposure = CountsSpectrum.read()

    dataset = SpectrumDataset(
        counts=counts,
        exposure=exposure,
        model=model,
        background_model=background_model,
        mask_fit=mask_fit,
        mask_safe=mask_safe,
        edisp=edisp,
        stat="cash",
    )

    dataset.npred()
    dataset.stat_per_bin()
    dataset.stat()

SpectrumDatasetOnOff
--------------------

For on-off based spectral analysis we propose to introduce a ``SpectrumDatasetOnOff`` class, which again inherits from ``SpectrumDataset`` and handles the additional information accordingly:

.. code::

    from gammapy.spectrum import SpectrumDatasetOnOff

    model = SpectralModel()
    edisp = EnergyDispersion.read()
    counts = CountsSpectrum.read()
    counts_off = CountsSpectrum.read()
    aeff = EffectiveAreaTable.read()

    dataset = SpectrumDatasetOnOff(
        counts=counts,
        counts_off=counts_off,
        model=model,
        aeff=aeff,
        acceptance=acceptance,
        acceptance_off=acceptance_off,
        edisp=edisp,
        stat="wstat",
    )

    dataset.npred()
    dataset.stat_per_bin()
    dataset.stat()

We propose to refactor the existing ``SpectrumObservation`` object into the ``SpectrumDatasetOnOff`` class.

FluxPointsDataset
-----------------

For fitting of flux points we propose to introduce a ``FluxPointsDataset``:

.. code::

    from gammapy.spectrum import FluxPointsDataset

    flux_points = FluxPoints.read("flux_points.ecsv")
    model = PowerLaw()

    dataset = FluxPointsDataset(
        flux_points=flux_points,
        model=model,
        mask_safe=mask_safe,
        mask_fit=mask_fit,
        stat="chi2",
    )

    fit = Fit(dataset)
    fit.optimize()

The ``FluxPoints`` class also supports the ``likelihood`` format, which has to be implemented in a special way in the ``FluxPointsDataset``. The ``likelihood`` format stores the likelihood of the flux point, depending on energy and amplitude. Given a flux predicted by the spectral model, the likelihood can be directly interpolated. This could be supported by a specific option ``FluxPointsDataset(stat="likelihood")`` or similar.
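A minimal sketch of how this interpolation could work (illustrative; the profile column names ``norm_scan`` / ``stat_scan``, ``ref_flux`` and the ``flux_pred`` helper are assumptions of this example):

.. code::

    from scipy.interpolate import interp1d

    def stat_per_bin(self):
        """Evaluate the per-point likelihood profiles at the predicted flux."""
        stats = []
        for point, npred in zip(self.flux_points.table, self.flux_pred()):
            # interpolate the likelihood profile stored with this flux point
            profile = interp1d(point["norm_scan"], point["stat_scan"])
            stats.append(profile(npred / point["ref_flux"]))
        return stats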
Dataset helper / convenience methods
------------------------------------

We propose to implement a few convenience methods on all datasets, e.g. a ``.residuals()`` method, which computes the residuals for the given data and the current state of the model:

.. code::

    dataset.residuals(method="diff")

A selection of methods for the residual computation would be available, such as ``data - model``, ``(data - model) / model`` and ``(data - model) / sqrt(model)``. For counts based datasets an ``.excess()`` method can be implemented.

All dataset objects should implement a ``__str__`` method, so that the following works:

.. code::

    print(dataset)

Specific to each dataset, information on the model, model parameters, data and fit statistics is printed.

We propose to implement a ``.peek()`` method on all datasets:

.. code::

    dataset.peek()

In an interactive environment this will show a default visualisation of the dataset, including model, data, IRFs and residuals.

Simulation of MapDataset and SpectrumDataset
--------------------------------------------

The ``MapDataset`` and ``SpectrumDataset`` classes will be the central classes for map and spectrum based analyses. In many cases it's useful to simulate counts from a predicted number of counts model. The direct sampling of binned data for the datasets could be supported as follows (illustrated for the ``MapDataset``):

.. code::

    # allow counts=None on init
    dataset = MapDataset(counts=None)

    # to sample a counts map from npred
    dataset.counts = dataset.fake(random_seed=0)

    fit = Fit(dataset)
    fit.optimize()

This calls ``np.random.poisson()`` internally on the npred map and returns the resulting map. It works equivalently for the ``SpectrumDataset``, ``SpectrumDatasetOnOff`` and ``MapDatasetOnOff``. The on-off datasets require a background model for this to work. Unbinned simulation (sampling of event lists) is addressed in a separate PIG.

Dataset serialization
---------------------

For convenience all dataset classes should support serialization, implemented via ``.read()`` and ``.write()`` methods. For now we only consider the serialization of the data of the datasets, and not that of the model, which might always stay separate. As the dataset has to orchestrate the serialization of multiple objects, such as different kinds of maps, flux points etc., one option is to introduce the serialization with a YAML based index file:

.. code::

    dataset = MapDataset.read("dataset.yaml")
    dataset.write("dataset.yaml")

Here the index file points to the various files needed for initialization of the dataset. An example:

.. code::

    dataset:
        type: map-dataset
        counts: "obs-123/counts.fits"
        exposure: "obs-123/exposure.fits"
        edisp: "obs-123/edisp.fits"
        psf: "obs-123/psf.fits"
        background-model: "obs-123/background.fits"
        model: "model.yaml" # optionally

Additionally we propose to introduce a single FITS file serialization for writing / reading datasets to and from disk:

.. code::

    dataset = MapDataset.read("dataset.fits")
    dataset.write("dataset.fits")

The ``SpectrumDatasetOnOff`` in addition implements serialization to the standard OGIP format described on the `gamma-astro-data-formats PHA`_ page.

The ``Datasets`` object could be serialized equivalently as a list of datasets:
.. code::

    - dataset:
        type: spectrum-dataset
        maps:
            counts: "obs-123/counts.fits"
            exposure: "obs-123/exposure.fits"
            edisp: "obs-123/edisp.fits"
        model: "model.yaml"
        background-model: "obs-123/background.fits"
        likelihood: "wstat"

    - dataset:
        type: spectrum-dataset
        maps:
            counts: "obs-124/counts.fits"
            exposure: "obs-124/exposure.fits"
            edisp: "obs-124/edisp.fits"
        model: "model.yaml"
        background-model: "obs-124/background.fits"
        likelihood: "wstat"

    - dataset:
        type: flux-point-dataset
        flux-points: "fermipy-flux-points.fits"
        model: "spectral-model.yaml"
        likelihood: "template"

Task List
=========

This is a proposal for a list of pull requests implementing the proposed changes, ordered by priority:

1. Refactor the current ``FluxPointFit`` into a ``FluxPointsDataset`` class and make it work with the current ``Fit`` class. Ensure that the ``Fit`` class supports the old inheritance scheme as well as the new dataset on init. Update tests and tutorials (`#2023 <https://github.com/gammapy/gammapy/pull/2023>`__).
2. Refactor the current ``MapFit`` into the ``MapDataset`` class. Only support a fixed energy dispersion and PSF first. Use the current ``MapEvaluator`` for model evaluation, but split out the background evaluation. Update tests and tutorials (`#2026 <https://github.com/gammapy/gammapy/pull/2026>`__).
3. Implement the ``Datasets`` class in ``gammapy.utils.fitting.datasets``, add tests using multiple ``MapDataset``. Change the ``Fit`` interface to take a ``Datasets`` object or list of ``MapDataset`` on input (`#2030 <https://github.com/gammapy/gammapy/pull/2030>`__).
4. Enable joint-likelihood analyses by filtering the ``Datasets.parameters`` for unique parameters only. Add a tutorial for joint-likelihood fitting of multiple observations (`#2045 <https://github.com/gammapy/gammapy/pull/2045>`__).
5. Implement the ``MapDataset.setup()`` method, using a list of ``MapEvaluator`` objects. Add an ``.evaluation_radius()`` attribute to all the spatial models. Only support a fixed PSF and edisp per model component (`#2071 <https://github.com/gammapy/gammapy/pull/2071>`__).
6. Add support for PSF maps to ``MapDataset.setup()``. Extend the ``.setup()`` method to look up the correct PSF for the given model component. A ``PSFMap`` has to be passed on ``MapDataset.__init__()`` (**v0.14**).
7. Add support for energy dispersion maps to ``MapDataset``. Extend the ``.setup()`` method to look up the correct EDisp for a given model component. An ``EdispMap`` has to be passed on ``MapDataset.__init__()`` (**v0.14**).
8. Add a tutorial for joint-likelihood fitting of IACT and Fermi-LAT data, based on the joint-crab dataset (**v0.14**).
9. Add a ``name`` attribute to datasets and a ``Datasets.__getitem__`` method (**v0.14**).
10. Implement dataset serialization to YAML (**v0.14**).
11. Implement dataset serialization to a single FITS file (`#2264 <https://github.com/gammapy/gammapy/pull/2264>`__).
12. Add the direct likelihood evaluation to the ``FluxPointsDataset``, by interpolating flux point likelihood profiles (**v0.14**).
13. Implement ``MapDataset.fake()``, ``SpectrumDataset.fake()`` and ``SpectrumDatasetOnOff.fake()`` methods (**v0.14**).
14. Implement the convenience helper methods ``.residuals()``, ``.peek()`` and ``.__str__`` for all datasets (**v0.14**).

Outlook
=======

Parallel evaluation of datasets
-------------------------------

For efficient joint-likelihood fits of multiple observations, parallel processing should be used. The obvious entry point is to evaluate one dataset per process and join the likelihoods at the end.
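Conceptually the entry point could look like the following sketch (illustrative only; as discussed next, sharing the model objects between processes is the actual difficulty):

.. code::

    from multiprocessing import Pool

    def evaluate_stat(dataset):
        return dataset.stat()

    def total_stat(datasets):
        # one dataset per process; the total fit statistic is the sum
        with Pool() as pool:
            return sum(pool.map(evaluate_stat, datasets))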
The current handling of model references (the same model instance is passed to different datasets to achieve a generic parameter linking by updating the model parameters in place) sets a limitation on the parallel evaluation, because Python objects can't easily be shared between multiple processes. We are aware of this issue, but propose to solve it later, because it does not affect the proposed API for datasets.

Lazy loading of Datasets
------------------------

For the use-case of inspecting or stacking individual datasets (e.g. one ``MapDataset`` per observation) it could be advantageous to implement a lazy-loading or generator interface for ``Datasets.read()``, such that the individual datasets are only read from disk on request and are not loaded into memory when calling ``.read()``. We leave this question unaddressed in this PIG, but mention it for completeness.

Decision
========

The authors decided to withdraw the PIG. Most of the proposed changes are implemented.

.. _gammapy: https://github.com/gammapy/gammapy
.. _gammapy-web: https://github.com/gammapy/gammapy-webpage
.. _`gamma-astro-data-formats PHA`: https://gamma-astro-data-formats.readthedocs.io/en/latest/spectra/ogip/index.html
.. _`GH 1986`: https://github.com/gammapy/gammapy/pull/1986

././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-009.rst0000644000175100001770000003121614721316200020447 0ustar00runnerdocker.. include:: ../../references.txt

.. _pig-009:

**********************
PIG 9 - Event sampling
**********************

* Author: Fabio Pintore, Andrea Giuliani, Axel Donath
* Created: May 03, 2019
* Accepted: Aug 30, 2019
* Status: accepted
* Discussion: `GH 2136`_

Abstract
========

An event sampler for gamma-ray events is an important part of the science tools of the future Cherenkov Telescope Array (CTA) observatory. It will allow users to simulate observations of sources with different spectral, morphological and temporal properties, and to predict the performance of CTA on the simulated events, e.g. to support observation proposals or study the sensitivity of future observations. For this reason, we propose to implement a framework for event simulation in Gammapy. The proposed framework consists of individual building blocks, represented by classes and methods, that can be chained together to achieve a full simulation of an event list corresponding to a given observation. This includes the simulation of source events, background events, effects of instrument response functions (IRFs) and arrival times. As the underlying method for the actual event sampling we propose to use inverse cumulative distribution function (CDF) sampling (inverseCDF_) with finely binned discrete source and background models. Temporal models will also be taken into account, and time will be sampled separately in a 1D analysis, assuming that the temporal dependency of the input source models factorizes.

Sampling methods
================

Inverse CDF sampling (inverseCDF_) is an established method to sample from discrete probability mass functions. It is used by ``ASTRIsim`` (astrisim_), the event simulator of the AGILE collaboration. However, it is not the method of choice for other existing event samplers, such as the Fermi-LAT Science Tools (gtobsim) and ctools (ctobssim). The latter uses a combination of analytical sampling for models where a solution is known (e.g.
power-laws), and the rejection sampling method (rej_sampl_), where the sampling has to be done numerically (see an example here: gammalib_). As rejection sampling can directly sample from continuous probability density functions, it is expected to yield very precise results. However, an enveloping distribution is needed, which should be adapted to the target distribution to be efficient (see also `rejection sampling in Python`_ for an example implementation), otherwise a lot of computation time is spent on rejecting drawn samples.

For this reason we favour the inverse CDF sampling method, as a simple to implement and general sampling method. The precision of the inverse CDF sampling method can be controlled by the resolution of the input probability mass function (PMF) and is in practice only limited by the available memory. We will study the required bin size of the PMFs to reach sufficient precision. If we find the inverse CDF sampling method not to be precise enough, it is still possible to achieve better precision by adopting rejection sampling. This will not have a strong impact on the structure of the event sampler.

Proposal
========

We propose to include in ``gammapy.cube`` a high level interface (HLI) class, labelled ``MapDatasetEventSampler``, or a ``MapDataset.sample`` method. This class handles the complete event sampling process, including the corrections related to the IRFs and the source temporal variability, for a given GTI / observation. The dataset will be computed using the standard data reduction procedure of Gammapy, as illustrated in the following example:

::

    obs = Observation(pointing, gti, aeff, psf, edisp, expomap)

    maker = MapDatasetMaker(geom, geom_irf, ...)
    dataset = maker.run(obs)

    model = SkyModels.read("model.yaml")
    dataset.model = model

    sampler = MapDatasetEventSampler(dataset)
    events = sampler.sample()
    events.write()

After data reduction, the Dataset object should contain all the needed information, such as the pointing sky coordinates, the GTI, and the setup of all the models (spectra, spatial morphology, temporal model) for any given source, and it is passed as an input parameter to the ``MapDatasetEventSampler``. It is important to note that the ``MapDataset`` object can store information for more than one source. Then, a ``.sample`` method will draw the sampled events and will provide an output ``~astropy.table.Table`` object. The latter will contain the reconstructed sky positions, energies, times, an ``EVENT_ID`` and an ``MC_ID``. ``EVENT_ID`` is a unique number or a string to identify the sampled event, while ``MC_ID`` is a unique ID (number or string) to identify the model component the event was sampled from. The ``MapDatasetEventSampler`` should also fill the mandatory header information for event list files described on `gadf`_.
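As background for the sampling steps described in the next section, the inverse CDF method on a binned PMF can be illustrated in a few lines of numpy (a self-contained sketch, not the proposed implementation)::

    import numpy as np

    def sample_pmf(pmf, n_events, random_state):
        # build the normalized cumulative distribution and "invert" it
        # by looking up uniformly drawn numbers with searchsorted
        cdf = np.cumsum(pmf)
        cdf = cdf / cdf[-1]
        uniform = random_state.uniform(size=n_events)
        return np.searchsorted(cdf, uniform)

    random_state = np.random.RandomState(seed=0)
    pmf = np.array([0.1, 0.5, 0.3, 0.1])
    indices = sample_pmf(pmf, n_events=1000, random_state=random_state)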
MapDatasetEventSampler
----------------------

The general design of the ``sample`` method is as follows:

::

    def sample(dataset, random_state):
        """Sample events from a ``MapDataset``."""
        events_list = []

        for evaluator in dataset.evaluators:
            npred = evaluator.compute_npred()
            n_events = random_state.poisson(npred.data.sum())
            events = npred.sample(n_events, random_state)
            time = LightCurveTableModel.sample(n_events=n_events, random_state=random_state)
            events = hstack([events, time])
            events["MC_ID"] = evaluator.model.name
            events_list.append(events)

        events_src = vstack(events_list)
        events_src = dataset.psf.sample(events_src, random_state)
        events_src = dataset.edisp.sample(events_src, random_state)

        n_events_bkg = random_state.poisson(dataset.background_model.map.data.sum())
        events_bkg = dataset.background_model.sample(n_events_bkg, random_state)

        events_total = vstack([events_src, events_bkg])
        events_total.meta = get_events_meta_data(dataset)

        return EventList(events_total)

In more detail, ``sample`` starts a loop over the sources stored in the ``MapDataset`` model. Then, for each source, the evaluator's ``compute_npred`` method will calculate the predicted number of source counts ``npred``. In particular, it is important to note that ``npred = exposure * flux``, where ``exposure`` is defined as ``effective_area * exposure_time``. ``npred`` is therefore calculated irrespective of the energy dispersion and of the PSF. Then, ``npred`` will be the input of the ``npred.sample`` method. The latter uses a Poisson distribution, with mean equal to the predicted counts, to estimate the random number of sampled events.

We propose to add a ``Map.sample(n_events=, random_state=)`` method in ``~gammapy.maps.Map`` that will be the core of the sampling process. The ``sample`` method is based on the ``~gammapy.utils.random.InverseCDFSampler`` class described in `GH 2229`_. The output will be a ``~astropy.table.Table`` with the columns ``RA_TRUE``, ``DEC_TRUE`` and ``ENERGY_TRUE``.

Then, the time will be sampled independently, using the temporal information stored in the ``MapDataset`` model for each source of interest. This will be done through a ``.sample(n_events=, random_state=)`` method that we propose to add to ``~gammapy.time.models.LightCurveTableModel`` and ``~gammapy.time.models.PhaseCurveTableModel``. This method will take as input the GTIs (i.e. one Tstart and Tstop) in the ``MapDataset`` object. Also in this case the ``InverseCDFSampler`` class is the machinery used to sample the time of the events. In case the temporal model is not provided, the time is uniformly sampled in the time range between ``t_min`` and ``t_max``. To define a light curve per model component, the current ``SkyModel`` class will be extended by a ``SkyModel(..., temporal_model=)``.

The IRF corrections can now be applied to the sampled events. We propose to add a ``.sample(events=)`` method in both ``~gammapy.cube.PSFMap`` and ``~gammapy.cube.EdispMap``. The method interpolates the "correct" IRF at the position of a given event and applies it. In more detail, the method calculates the PSF and the energy dispersion at the events' true positions and true energies, which are given in input as a ``~astropy.table.Table`` object. The IRFs are assumed to be constant and not time-dependent. The output will be a ``~astropy.table.Table`` with the new columns ``RA``, ``DEC`` and ``ENERGY``, which are the reconstructed event positions and energies.
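To illustrate the idea of the per-event IRF application, a possible internal sketch of ``PSFMap.sample`` is shown below (illustrative only; ``sample_separation`` is a hypothetical helper that draws angular offsets from the PSF profile interpolated at the given positions and energies)::

    import astropy.units as u
    from astropy.coordinates import SkyCoord

    def sample(self, events, random_state):
        coords = SkyCoord(events["RA_TRUE"], events["DEC_TRUE"], unit="deg")
        energies = events["ENERGY_TRUE"].quantity

        # draw one angular separation per event from the local PSF profile
        separation = self.sample_separation(coords, energies, random_state)
        position_angle = random_state.uniform(0, 360, len(events)) * u.deg

        reco = coords.directional_offset_by(position_angle, separation)
        events["RA"] = reco.ra.to("deg")
        events["DEC"] = reco.dec.to("deg")
        return events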
Finally, the times and the energies/coordinates of the events will be merged into a single ``~astropy.table.Table`` with the columns ``RA``, ``DEC``, ``ENERGY`` and ``TIME``.

The ``MapDatasetEventSampler`` can be used to sample background events with ``Map.sample(n_events=, random_state=)`` as well. The time of the events is sampled assuming a constant event rate. Finally, the IRF corrections are not applied to the background sampled events.

Performance and precision evaluation
====================================

To evaluate the precision and performance of the described framework we propose to implement a prototype for a simulation / fitting pipeline. Starting from a selection of spatial, spectral and temporal models, data are simulated and fitted multiple times to derive distributions and pull-distributions of the reconstructed model parameters. This pipeline should also monitor the required CPU and memory usage. This first prototype can be used to evaluate the optimal bin size (with the best compromise between performance and precision) for the simulations and to verify the overall correctness of the simulated data. This will be valid for a set of input maps and IRFs. Later this prototype can be developed further into a full simulation / fitting continuous integration system.

Alternatives / Outlook
======================

So far Gammapy only supports binned likelihood analysis, and technically most use cases for the event sampling could be solved with binned simulations. A binned simulation can basically be achieved by a call to ``numpy.random.poisson()`` based on the predicted number of counts map. This is conceptually simpler as well as computationally more efficient than a sampling of event lists. In Gammapy a similar dataset simulation is already implemented in ``Dataset.fake()``, although this has a more limited number of use cases than an event sampler.

However, to support the full data access and data reduction process for simulations, event lists are required. In the future Gammapy will possibly also support event based analysis methods (unbinned likelihood, but also e.g. clustering algorithms) that also require event lists. For this reason binned simulations cannot present a fully equivalent solution to event sampling.

The question of the API to simulate multiple observations from e.g. an ``ObservationTable`` or a list of ``GTIs``, as needed for simulating data for the CTA data challenge, is not addressed in this PIG. For the scope of this PIG, the fundamental class ``MapDatasetEventSampler`` to simulate events corresponding to a given observation and/or single GTI is in place.

The proposed event sampler will not provide, for each event, the corresponding ``DETX`` and ``DETY`` position. These will be added in a future development of the simulator.

Task list
=========

This is a proposal for a list of tasks to implement the proposed changes:

1. Implement the ``sample`` method in ``gammapy.maps.Map`` and add tests.
2. Implement the ``sample`` method in ``gammapy.time.models.LightCurveTableModel`` and ``gammapy.time.models.PhaseCurveTableModel`` and add tests.
3. Implement the ``sample`` method in ``gammapy.cube.PSFMap`` and add tests.
4. Implement the ``sample`` method in ``gammapy.cube.EdispMap`` and add tests.
5. Introduce the ``MapDatasetEventSampler`` into ``gammapy.cube`` and add tests.
6. Add tutorials for event simulations of different kinds of sources.
Decision
========

The PIG was discussed extensively in `GH 2136`_, in the weekly Gammapy developer calls, and at a coding sprint in person. After the deadline for final review expired on August 20, all remaining comments were addressed and the PIG was accepted on August 30, 2019.

.. _GH 2136: https://github.com/gammapy/gammapy/pull/2136
.. _GH 2229: https://github.com/gammapy/gammapy/pull/2229
.. _rejection sampling in Python: https://wiseodd.github.io/techblog/2015/10/21/rejection-sampling/
.. _Prototype: https://github.com/fabiopintore/notebooks-public/blob/master/gammapy-event-sampling/prototype.ipynb
.. _inverseCDF: https://en.wikipedia.org/wiki/Inverse_transform_sampling
.. _here: https://stackoverflow.com/questions/21100716/fast-arbitrary-distribution-random-sampling/21101584#21101584
.. _spec: https://docs.gammapy.org/0.11/api/gammapy.spectrum.SpectrumSimulation.html
.. _three-D: https://docs.gammapy.org/0.11/notebooks/simulate_3d.html
.. _astrisim: https://github.com/cento14/Astrisim
.. _rej_sampl: https://en.wikipedia.org/wiki/Rejection_sampling

././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-010.rst0000644000175100001770000006332014721316200020440 0ustar00runnerdocker.. include:: ../../references.txt

.. _pig-010:

****************
PIG 10 - Regions
****************

* Author: Christoph Deil, Axel Donath, Régis Terrier
* Created: May 3, 2019
* Accepted: Jun 17, 2019
* Status: accepted
* Discussion: `GH 2129`_

Abstract
========

We propose to use `astropy-regions`_ to handle spatial sky and pixel regions throughout Gammapy.

We already use ``astropy-regions``, so why a PIG? There are a few decisions we need to make concerning where to use sky and where to use pixel regions, and where to support both. This affects the algorithms used (e.g. for rotated region background estimation) and the API (where a WCS or exclusion map is needed in functions and methods working with regions). Also, ``astropy-regions`` was started after Gammapy, so we have some code to work with sky cones and boxes that should partially be removed, partially refactored to use ``astropy-regions``.

The scope of this PIG is small and limited to spatial sky and pixel regions. The work to implement it is small; we propose to complete it for Gammapy v0.13 in July 2019. The question of more general data subspace selections that include energy, time or phase regions and selections, or general n-dimensional regions, or whether to introduce a field of view (FOV) coordinate frame and special FOV regions, is not addressed here. Also some things like the clean-up of ``gammapy.image``, which contains some region-related code, are left as future work.

Introduction
============

This long introduction describes the history and design of `astropy-regions`_ and regions in Gammapy. It is background information; you are welcome to skip ahead to the `Proposal`_ and `Task list`_ in this PIG below.

Spatial regions are used a lot in gamma-ray astronomy (and other domains of astronomy as well). Often a user chooses a region of interest, and then selects the observations or events or pixels for analysis. Besides "on regions" or "regions of interest", there are a few use cases to create regions within Gammapy, e.g. to restrict analysis to a maximum field of view offset, or to compute "off" regions for background estimation for a given "on" region.
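For example, a typical circular "on region" can be defined with `astropy-regions`_ like this (source position and radius are illustrative)::

    import astropy.units as u
    from astropy.coordinates import SkyCoord
    from regions import CircleSkyRegion

    center = SkyCoord(83.633, 22.014, unit="deg", frame="icrs")
    on_region = CircleSkyRegion(center=center, radius=0.2 * u.deg)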
In early versions of Gammapy we started to develop ``gammapy.regions``, but soon realised that the functionality really is very generic, and also needed by many other astronomers (radio, optical, X-ray, ...). Astropy and `photutils`_ were being developed, and there was `pyregion`_. At `Python in Astronomy 2015`_ we discussed and decided to create a new Astropy-affiliated regions package from scratch and have been developing it since: `astropy-regions`_. The goal is to merge it as ``astropy.regions`` into the Astropy core once it is in a certain sense complete; for now it is an extra package that has been a dependency of Gammapy since Gammapy v0.5 in 2016. This is OK: the situation for ``astropy-regions`` is the same as for `reproject`_ and soon `astropy-healpix`_ (see `GH 1167`_), which are extra packages and dependencies of Gammapy. We should continue to contribute to these three projects until they are complete and help get them into Astropy core some day.

The `astropy-regions`_ package contains sky and pixel regions with methods to filter positions and to make masks, ways to transform between sky and pixel regions for a given WCS, support for plotting with matplotlib, and I/O (string serialisation and parsing) for the `DS9 region format`_. As described below, there are a few things still missing in astropy-regions that we need in Gammapy, thus the work described in this PIG is partly for `astropy-regions`_ and partly in Gammapy.

The places where regions appear in the Gammapy code and API are very few, and overall region handling is pretty simple. Gammapy is mostly a binned analysis framework, so mainly regions are used to create pixel masks, and then the analysis is mask-based, not involving region objects any more (see section below: `Images and masks`_). The second place where regions appear is to select objects of interest (events, observations, sources) for a given region (see section below: `Sky and pixel regions`_). Finally, there is the use case of rotated regions for background estimation in 1D spectral analysis (see section below: `Rotated regions`_).

However, it turns out that working with regions is surprisingly tricky, because there are different ways to define the basic semantics and computations for "contains" of a region, how to transform between pixel and sky regions, and how to visualise them. The origin of the issue is that to make an image for analysis or visualisation, one has to choose a projection that maps sky to pixel coordinates, and there are many different projections, and they all have distortions. For example, the `Wikipedia page on Aitoff projection`_ contains an illustration of how sky circles map to pixel regions that are to a good approximation pixel circles in some parts of the image, but in others have complex shapes and different sizes, and at the edge even split into two disjoint pixel regions on the left and right part of the image.

In `astropy-regions`_, after a lot of discussions, we have agreed to keep it simple and to follow the lead of DS9, handling sky and pixel regions and their transforms in the same way that DS9 does. This has the advantage that we can use the DS9 string format to define and exchange regions (see section below: `Region arguments`_), and we can use DS9 to view them.

As can be seen in the ``CircleSkyRegion.to_pixel`` method implementation below, the definition used to map a sky circle to a pixel circle region is an approximation: the center is transformed according to the given ``wcs``, and the radius is transformed using the local pixel scale at the center point for the given ``wcs``.

::

    class CircleSkyRegion(SkyRegion):
        def to_pixel(self, wcs):
            center, scale, _ = skycoord_to_pixel_scale_angle(self.center, wcs)
            radius = self.radius.to('deg').value * scale
            return CirclePixelRegion(center, radius, self.meta, self.visual)

So a sky circle is converted to a pixel circle. This works well for most use cases where one has an image with an extent and WCS of roughly constant pixel scale, e.g. a local TAN projection at a source location and 10 deg image size. In general, the strategy in Gammapy will be to accept both pixel and sky regions as inputs, but to internally always call ``to_pixel`` first if a sky region is passed, and to internally always work with pixel regions.

This approximation fails for all-sky analysis and image parts where the projection has a shear or is varying, like the sky circles that are distorted in the image on the `Wikipedia page on Aitoff projection`_. One possible future improvement to ``astropy-regions`` would be to implement sky contains directly for ``CircleSkyRegion`` using sky distance, and for ``PolygonSkyRegion`` using the assumption that polygon edges follow great circles, and for other shapes to implement ``to_polygon`` methods that approximate any region as a ``PolygonSkyRegion`` (using any number of points and accuracy the user wants), and then transforming the ``PolygonSkyRegion`` to a ``PolygonPixelRegion``, as defined already by simply transforming the vertex points. This is feasible and useful for circles, ellipses and polygons (one would have to decide what to do with image projection edges like at longitude 180 deg in the Aitoff example, i.e. whether to split into disjoint pixel region parts), but is difficult for other shapes such as wedges, or already for rectangles, because their shape only becomes well-defined for a given projection. These things are not in scope for this PIG or needed for Gammapy for now; Gammapy simply doesn't have all-sky analysis support built in yet, although already now users could write Python scripts to do it using a combination of HEALPix and WCS maps and Gammapy.

Now that we know how `astropy-regions`_ works and how we plan to handle pixel and sky regions in Gammapy, let's move on to the detailed proposal and task list outlining the additions and changes to make.

Proposal
========

This section contains a detailed proposal concerning region-related code in Gammapy. The `Task list`_ at the end of this PIG references these descriptions.

Region arguments
----------------

We propose to add support for DS9 region strings (in addition to region objects) in all functions in Gammapy that take a ``region`` argument. This can be achieved via a ``make_region`` helper function::

    >>> from regions import DS9Parser
    >>> def make_region(region):
    ...     if isinstance(region, str):
    ...         # This is basic and works for simple regions
    ...         # It could be extended to cover more things,
    ...         # like e.g. compound regions, exclusion regions, ....
    ...         return DS9Parser(region).shapes[0].to_region()
    ...     else:
    ...         return region
    ...
    >>> make_region("image;circle(42,43,3)")
    <CirclePixelRegion(center=PixCoord(x=41.0, y=42.0), radius=3.0)>
    >>> make_region("galactic;circle(42,43,3)")
    <CircleSkyRegion(center=<SkyCoord (Galactic): (l, b) in deg
        (42., 43.)>, radius=3.0 deg)>

This is convenient: a simple ``region="galactic;circle(42,43,3)"`` is much shorter than having to remember and type this equivalent code::

    >>> from astropy.coordinates import SkyCoord, Angle
    >>> from regions import CircleSkyRegion
    >>> center = SkyCoord(42, 43, unit="deg", frame="galactic")
    >>> radius = Angle(3, "deg")
    >>> region = CircleSkyRegion(center, radius)
    >>> region
    <CircleSkyRegion(center=<SkyCoord (Galactic): (l, b) in deg
        (42., 43.)>, radius=3.0 deg)>

There is precedent for this pattern in Gammapy: we usually pass ``Quantity`` and ``Angle`` arguments through via e.g. ``angle = Angle(angle)`` in Gammapy functions, and often call them with strings like ``angle="3 deg"``.

We propose to add a second function ``make_pixel_region`` which, in addition to the string case handling, also does the sky region case handling, and gives good error messages on invalid inputs.

::

    from regions import DS9Parser, SkyRegion, PixelRegion

    def make_pixel_region(region, wcs=None):
        if isinstance(region, str):
            region = make_region(region)
        if isinstance(region, SkyRegion):
            if wcs is None:
                raise ValueError("Sky regions not supported here.")
            return region.to_pixel(wcs)
        elif isinstance(region, PixelRegion):
            return region
        else:
            raise TypeError("What is this? Giving up.")

Since in Gammapy we implement almost all algorithms using pixel regions, we could then call this helper function on the first line of each Gammapy method or function. E.g. `gammapy.maps.WcsGeom.region_mask`_ would start like this::

    def region_mask(self, region):
        region = make_pixel_region(region)
        # do computation, knowing region is a PixelRegion object

We suggest to start by implementing these two helper functions in ``gammapy/utils/regions.py`` and to use them internally, but not to expose them as part of the public API via the Gammapy docs. Once this is done, we would open an issue in ``astropy-regions`` asking whether inclusion there is welcome. This might or might not be the case, because ``make_region`` is basically a one-liner and is DS9 format specific, whereas ``astropy-regions`` supports other region formats as well. Based on the outcome of this discussion, we would then either move ``gammapy/utils/regions.py`` to ``astropy-regions``, or expose it as part of the public Gammapy API in the docs, since users might want to call these functions directly in their scripts or analysis codes that build on Gammapy.

We note that this means that we commit to supporting DS9 region strings as part of the Gammapy API. Many users will use this feature, so changing to something else would break user scripts. If we add this now, there are still a few months to gather user feedback, to possibly add a similar ``make_sky_coord`` helper function throughout Gammapy, and to consider how the format there compares with the regions format.

There is also the `HEALPix region string`_ format in the HEALPix map spec, with support in ``gammapy.maps``; ideally this format wouldn't exist, but avoiding it probably isn't possible, since the ``DISK_INC`` and ``HPX_PIXEL`` regions it includes cannot be represented in the DS9 or any other existing region format. So at this time, there are no plans to improve or unify that region format.

We propose to change the current `gammapy.maps.WcsGeom.region_mask`_ input (and everywhere else) from a list of regions to a single region. The same effect as a list of regions (meaning "union") can be achieved by creating a `Compound region`_. That should be as efficient computationally, and more flexible (e.g. one could also do "intersection" in the region definition). It might be useful to add a factory function in ``astropy-regions`` that makes the creation of such compound regions a bit simpler.

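For illustration, a minimal sketch of the union and intersection mentioned above, written with the operator syntax described in the `Compound region`_ documentation (the coordinates below are made up)::

    >>> from astropy.coordinates import SkyCoord, Angle
    >>> from regions import CircleSkyRegion
    >>> circle1 = CircleSkyRegion(SkyCoord(1, 2, unit="deg"), Angle(0.5, "deg"))
    >>> circle2 = CircleSkyRegion(SkyCoord(1.3, 2, unit="deg"), Angle(0.5, "deg"))
    >>> union = circle1 | circle2         # CompoundSkyRegion, "or" = union
    >>> intersection = circle1 & circle2  # CompoundSkyRegion, "and" = intersection
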
Sky and pixel regions
---------------------

As already explained in the `Introduction`_ above, there are projection-related issues when it comes to defining the basic operations for sky regions like ``contains``, ``to_pixel`` or ``plot``. Currently, ``contains`` and ``plot`` methods simply aren't defined for ``SkyRegion``, and for ``to_pixel`` only the local pixel scale approximation method is implemented. This could be improved in ``astropy-regions``:

- Implement contains for ``PolygonSkyRegion`` (see `astropy-regions GH 46`_)
- Is it possible to define ``RectangleSkyRegion.contains`` without a WCS (i.e. to compute corners from data members)?
- Add ``SkyRegion.plot`` (see `astropy-regions GH 221`_)?
- Add a generic ``SkyRegion.contains`` using a local TAN or CAR projection with reference point at the region center, to allow use without a WCS?
- Add ``SkyRegion.to_polygon``, trying to be consistent with plot and contains.

Having improved support for sky regions would make it convenient to do some things without requiring a WCS. Especially for circles and polygons, it can be useful to filter positions from an observation or event list, and it is possible and common to do this. In Gammapy we currently have `select_sky_box`_, `select_sky_circle`_ and `skycoord_from_table`_, which do this in an ad-hoc way. They are called from `gammapy.data.EventList`_, `gammapy.data.ObservationTable`_ and `gammapy.catalog.SourceCatalog`_, which each are a wrapper or subclass of an `astropy.table.Table`. We propose to replace those functions and methods with a ``select_region`` method, which for now requires a WCS to be passed on call (see the sketch after this section). In the future, if the ``astropy-regions`` sky region methods improve, passing a ``WCS`` might no longer be needed. Depending on the details of how ``SkyRegion.contains`` is defined, this change might or might not be backward compatible for Gammapy users. So ideally this work should be done soon, but it is better discussed in the ``astropy-regions`` issue tracker, not this Gammapy PIG.

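A minimal sketch of such a region-based selection, using only existing ``astropy-regions`` functionality (``select_region`` itself is the proposed, not yet existing, method; the WCS and positions below are made up)::

    from astropy.coordinates import SkyCoord, Angle
    from astropy.wcs import WCS
    from regions import CircleSkyRegion, PixCoord

    # Toy local TAN WCS centered on the region of interest
    wcs = WCS(naxis=2)
    wcs.wcs.ctype = ["GLON-TAN", "GLAT-TAN"]
    wcs.wcs.crval = [0, 0]
    wcs.wcs.crpix = [50, 50]
    wcs.wcs.cdelt = [-0.02, 0.02]

    region = CircleSkyRegion(SkyCoord(0, 0, unit="deg", frame="galactic"), Angle(0.5, "deg"))
    positions = SkyCoord([0.1, 2.0], [0.0, 0.0], unit="deg", frame="galactic")

    # Convert to pixel region and pixel coordinates, then select
    pix_region = region.to_pixel(wcs)
    pix_coords = PixCoord.from_sky(positions, wcs)
    selected = positions[pix_region.contains(pix_coords)]
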
Rotated regions
---------------

Rotated regions are used for background estimation in 1D spectral analysis in gamma-ray astronomy (see [Berge2007]_). The rotation is done around the pointing position of the IACT telescope array. This method is also called "reflected regions", but really that's a bad name: reflecting would only give one region on the other side, whereas what is done is rotating the regions.

In Gammapy, rotated regions are implemented in ``gammapy/background/reflected.py``. This is one of the most complex pieces of code in Gammapy, because it was started before ``astropy.coordinates`` or ``astropy-regions`` existed and was then adapted many times, never achieving a clean design and implementation until now. The problem was that ``astropy-regions`` objects didn't support a rotate operation, and also their representation is such that it's not clear how to compute and represent e.g. a rotated ``RectangleSkyRegion`` (see the discussion of ambiguity in ``SkyRegion.contains`` above). Also, the current code tried to handle the cases with and without an exclusion mask and a WCS. A recent attempt to improve our rotated regions code is `GH 2089`_, which shows that it is difficult to do in a clean way without having a ``Region.rotate`` method, because if-else and specific case handling is needed for the operation.

We propose to add a ``PixelRegion.rotate(center, angle)`` method for all regions in ``astropy-regions`` (or all that can be represented again as a pixel region object after rotating), and to only handle the case of pixel regions with a given exclusion mask and WCS in ``reflected.py``. For convenience, a dummy WCS as a local TAN projection at the pointing position is used if the user doesn't have an exclusion mask, as is already done in the current code. This is a common case, e.g. for isolated point sources like AGN, where the rotated regions method is most commonly used.

One complication is that in the current code, to have fewer off test regions, the algorithm takes a large step ``arctan(2*radius/offset)`` at the start and after placing an off region. This requires a "radius", which is not available for regions of a complex shape. We propose that this optimisation is only kept for the circle case (which already has a specialised fast algorithm anyway, using a distance mask), and that for other regions, for now, the normal thing is done: test all positions in small angle increments.

Images and masks
----------------

Often regions are used to analyse images, and usually this happens via binary masks (with ``False`` for "outside" and ``True`` for "inside"), or in some cases integer label images (with 0 for "outside", 1 for "in region 1", 2 for "in region 2" and N for "in region N"). This is used in Gammapy, but also in ``scipy.ndimage``, ``scikit-image``, ``photutils`` and other packages.

We leave the description of the details of how we work with images and masks in Gammapy as future work. This could be done in another PIG, or just directly by improving the Gammapy code and documentation. There are open issues already discussing when to use ``bool`` or ``int`` arrays, and a lot of code in ``gammapy.image`` (especially ``gammapy/image/measure.py`` and ``gammapy/image/profile.py``) should either be removed from Gammapy, replaced by documentation examples, or improved to use regions in some cases. This is not high priority and a separate task, since this functionality isn't used anywhere else in Gammapy.

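As a small illustration of the mask-based approach described above, using existing ``astropy-regions`` functionality (the shapes and sizes below are made up)::

    from regions import CirclePixelRegion, PixCoord

    region = CirclePixelRegion(center=PixCoord(x=4, y=5), radius=2)

    # Cut-out mask for the region, then a full image with 1 inside, 0 outside
    mask = region.to_mask(mode="center")
    image = mask.to_image(shape=(10, 10))
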
Outlook
=======

This PIG only describes some short-term improvements to spatial regions. Long-term, we might want to develop a more general solution for regions that also includes non-spatial regions, e.g. spectral regions or time regions.

Just to mention what we have at the moment: spectral "regions" are handled via ``energy_bounds`` or ``(energy_min, energy_max)`` in a not very consistent way. We could look towards the Astropy `specutils`_ package and the `specutils.SpectralRegion`_ object as an example of how to implement spectral regions. There is currently a plan to integrate ``astropy-regions`` and ``specutils``; `astropy-regions GH 228`_ has been opened for discussion. Supporting an energy-dependent maximum field of view radius selection, or a field-of-view dependent safe energy offset, might be a motivation to couple spatial and spectral regions, possibly supported via ``gammapy.maps``.

For time regions we currently often have ``(time_min, time_max)`` in the API, and we have the `gammapy.data.GTI`_ class to represent multiple intervals. It's not clear if this needs to be changed or wrapped somehow into a "time region", or if there are any use cases to couple time with the spatial and spectral regions, e.g. for a time-dependent safe energy threshold or user analysis option. There is an issue `GH 1805`_ "Introduce RegionGeom and RegionNDMap?" with first thoughts in this direction. Discussing or working out a solution for that is beyond the scope of this PIG.

In the HESS software, there is a region class that wraps a mask, and that is sometimes useful, specifically because in the HESS software exclusion regions are region objects, sometimes compound with a mask region and other regions like circles. It's clearly an option, easy to implement as a ``SkyRegion`` subclass. But it also causes problems, especially when it comes to serialisation. Thoughts? I'm OK with not adding this for now, and just using ``Map`` objects for exclusion masks, requiring that users create them before using Gammapy, in case they e.g. have a list of circles or whatever they want to use as exclusion region.

Task list
=========

We propose to implement this PIG through a series of small pull requests. Some are dependent on others, but for many the order doesn't matter, and overall the amount of work is limited.

- Add a region copy method, needed for region rotate (see `astropy-regions GH 266`_)
- Add a region rotate method (see `astropy-regions GH 265`_ and `Rotated regions`_ above)
- Release ``astropy-regions`` v0.4, then bump the Gammapy dependency to that version.
- Improve the rotated region code, tests and docs in Gammapy. This could mean a rebase and continuation of `GH 2089`_, or closing that and re-starting from master, copying over the useful parts or cherry-picking useful commits from that branch.

Other tasks, mostly independent from the previous ones, can happen in any order:

- Add ``gammapy/utils/regions.py`` with the two helper functions. See `Region arguments`_ above.
- Use the helper functions throughout Gammapy. Should be 5-10 cases, could be several PRs. See `Region arguments`_ above.
- Replace sky circle and box select with a generic region select (see `Sky and pixel regions`_, resolves `GH 1172`_)

Beyond those core tasks that really should be done for Gammapy v0.13, further improvements in ``astropy-regions`` and Gammapy should be done, but are lower priority:

- Implement ``SkyRegion.contains(point)`` (see `Sky and pixel regions`_ above)
- Implement ``SkyRegion.plot`` (see `Sky and pixel regions`_ above)
- Review and improve region-related tests, especially for complicated edge cases like lon = 0 / 360 and the poles.
- Review and improve region-related end-user docs. Possibly add a dedicated tutorial notebook.
- "Support region_mask for multi-resolution maps" (`GH 1715`_)
- "HPX up/down sample issue with partial skymaps" (`GH 1445`_)
- "Introduce RegionGeom and RegionNDMap?" (`GH 1805`_)

Alternatives
============

As mentioned in `Sky and pixel regions`_ and `Rotated regions`_, we could use sky regions more instead of pixel regions, restrict analysis to the sky regions where contains or rotate is well defined, namely circles and polygons, and polygonise other sky regions using some approximation before passing them to Gammapy. In the future we might want a more general solution, and there's the question whether a mask-backed region should be added (see `Outlook`_).

Decision
========

The PIG was discussed in `GH 2129`_ and the weekly Gammapy developer calls. A final review announced on the Gammapy and CC mailing list did not yield any additional comments. Therefore the PIG was accepted on June 17, 2019.

.. These explicit URLs to Gammapy classes are used to avoid possible future breakage
.. of the links in the PIG if those classes are removed or changed:

.. _gammapy.data.GTI: https://docs.gammapy.org/0.11/api/gammapy.data.GTI.html
.. _gammapy.maps.WcsGeom.region_mask: https://docs.gammapy.org/0.11/api/gammapy.maps.WcsGeom.html#gammapy.maps.WcsGeom.region_mask
.. _gammapy.data.EventList: https://docs.gammapy.org/0.11/api/gammapy.data.EventList.html
.. _gammapy.data.ObservationTable: https://docs.gammapy.org/0.11/api/gammapy.data.ObservationTable.html
.. _gammapy.catalog.SourceCatalog: https://docs.gammapy.org/0.11/api/gammapy.catalog.SourceCatalog.html
.. _select_sky_box: https://docs.gammapy.org/0.11/api/gammapy.catalog.select_sky_box.html
.. _select_sky_circle: https://docs.gammapy.org/0.11/api/gammapy.catalog.select_sky_circle.html
.. _skycoord_from_table: https://docs.gammapy.org/0.11/api/gammapy.catalog.skycoord_from_table.html
.. _gammapy.image: https://docs.gammapy.org/0.11/image/index.html#reference-api
.. _Wikipedia page on Aitoff projection: https://en.wikipedia.org/wiki/Aitoff_projection
.. _HEALPix region string: https://gamma-astro-data-formats.readthedocs.io/en/latest/skymaps/healpix/index.html#healpix-region-string
.. _photutils: https://photutils.readthedocs.io
.. _Python in Astronomy 2015: http://openastronomy.org/pyastro/2015/
.. _astropy-regions: https://astropy-regions.readthedocs.io
.. _pyregion: https://pyregion.readthedocs.io
.. _DS9 region format: http://ds9.si.edu/doc/ref/region.html
.. _reproject: https://reproject.readthedocs.io
.. _astropy-healpix: https://astropy-healpix.readthedocs.io
.. _spherical_geometry: https://github.com/spacetelescope/spherical_geometry
.. _regions.PolygonSkyRegion: https://astropy-regions.readthedocs.io/en/latest/api/regions.PolygonSkyRegion.html
.. _specutils: https://specutils.readthedocs.io
.. _specutils.SpectralRegion: https://specutils.readthedocs.io/en/latest/spectral_regions.html
.. _Compound region: https://astropy-regions.readthedocs.io/en/latest/compound.html
.. _GH 2129: https://github.com/gammapy/gammapy/pull/2129
.. _GH 2089: https://github.com/gammapy/gammapy/pull/2089
.. _GH 1172: https://github.com/gammapy/gammapy/issues/1172
.. _GH 1167: https://github.com/gammapy/gammapy/pull/1167
.. _GH 1715: https://github.com/gammapy/gammapy/issues/1715
.. _GH 2068: https://github.com/gammapy/gammapy/issues/2068
.. _GH 1445: https://github.com/gammapy/gammapy/issues/1445
.. _GH 1805: https://github.com/gammapy/gammapy/issues/1805
.. _astropy-regions GH 46: https://github.com/astropy/regions/issues/46
.. _astropy-regions GH 91: https://github.com/astropy/regions/pull/91
.. _astropy-regions GH 217: https://github.com/astropy/regions/issues/217
.. _astropy-regions GH 221: https://github.com/astropy/regions/pull/221
.. _astropy-regions GH 228: https://github.com/astropy/regions/issues/228
.. _astropy-regions GH 265: https://github.com/astropy/regions/issues/265
.. _astropy-regions GH 266: https://github.com/astropy/regions/issues/266

gammapy-1.3/docs/development/pigs/pig-011.rst

.. include:: ../../references.txt

.. _pig-011:

*********************
PIG 11 - Light curves
*********************

* Author: Régis Terrier, Axel Donath, David Fidalgo, Atreyee Sinha
* Created: Jul 2, 2018
* Withdrawn: Oct 29, 2019
* Status: withdrawn
* Discussion: `GH 1451`_

Abstract
========

In this PIG we want to discuss a restructuring of the way light curves are computed and stored in Gammapy. Lightcurves in gamma-ray astronomy are the result of a series of fits of the source flux in each time bin.
Lightcurve extraction therefore covers both the data reduction and the data modeling steps, and the lightcurve estimation will have these two steps. Here, we propose to perform the data reduction step in each of the time bins and store the result in the form of ``Datasets`` (``MapDataset`` or ``SpectrumDatasetOnOff`` depending on the selected approach). Each individual ``Dataset`` is then modeled and fitted to extract the source flux and its errors in each time bin. The result is then stored in a ``LightCurve`` object which contains a ``Table`` of the fit results.

For this purpose, we propose to introduce two new classes: one to perform the data reduction first and one for the fitting afterwards. The time dependent data reduction could be a specific case of the high level pipeline, where a list of time bins is passed with the configuration ``dict``. The data fitting should be performed by the new ``LightCurveEstimator`` class, which should essentially be a wrapper around the ``FluxPointEstimator`` class and do the same thing for spectrum and map datasets.

Introduction
============

Lightcurves in gamma-ray astronomy
----------------------------------

In photon counting experiments, lightcurves are often simply obtained by counting events in a given energy range in a set of time bins. In ground based gamma-ray astronomy, things are usually more complex. The response and the instrumental background of the instruments can strongly vary over time on a night scale, e.g. because the source elevation changes, or on longer time scales, given the possible changes of the atmosphere transparency or the instrument efficiency. Another complexity comes from the astrophysical background, which can often pollute a source and needs to be properly removed to extract the intrinsic source flux at any given time.

For these reasons, gamma-ray lightcurves are usually the result of a fit of a model to the data, performed in a number of time bins, to extract the source flux in those bins. This is more limited than e.g. time resolved spectral analysis. Although the latter shares many similarities with lightcurve extraction, it is a more complex task which we do not cover here.

Background / What we have now
-----------------------------

The current ``gammapy.time.LightCurveEstimator`` class assumes that part of the data reduction process has already been applied, as it takes as input a ``gammapy.spectrum.SpectrumExtraction`` instance, for which a list of ``gammapy.background.BackgroundEstimate`` is required. Apart from the time intervals, the user also has to provide a ``gammapy.spectrum.SpectralModel`` that is used to compute the expected counts in a time bin and to scale the integral flux with respect to the excess events found in that time bin. The parameters of the spectral model are generally obtained via a spectral fit to the whole data sample. See the `current lightcurve tutorial`_.

Drawbacks of this approach: there is no clear separation of the data reduction and modeling steps, only 1D On-Off spectral analysis is supported, and it lacks support for ``MapDataset`` based analysis.

Proposal
========

General organization of the new approach
----------------------------------------

The approach will be split into 3 main phases:

* Time bin preparation
* Data reduction in time bins to produce a list of ``Dataset``
* Light curve estimation to fit a model on the resulting ``Datasets``

The end product should contain enough information to perform some rebinning and apply high level timing techniques without rerunning the whole data reduction and fitting steps.

Time bin preparation
--------------------

Independently of the actual data reduction technique chosen, the user should first provide a list/table of time intervals for which she/he wants to compute the source flux. The computation of this list/table will be done outside of the light curve estimation class. While we could provide helper functions to prepare the time bins, ``astropy.time.Time`` is relatively easy to use, so a user would execute code similar to the following example:

::

    import numpy as np
    import astropy.units as u
    from astropy.time import Time

    time_start = Time("2006-07-29 20:00:00.000")
    time_step = 5.0 * u.min
    nstep = 120
    times = time_start + time_step * np.arange(nstep)

    time_bins = []
    for tmin, tmax in zip(times, times[1:]):
        time_bins.append((tmin, tmax))

Data reduction
--------------

Once the time bins are determined, the user will have to select the relevant ``Observations`` from the ``Datastore``. The ``Observations`` are then filtered and grouped according to the time bins using the ``ObservationFilter`` and passed to the light curve extraction function or class. The latter could take a ``geom`` or a ``region`` argument that will define the data reduction geometry (and in particular, whether the data reduction is 3D or 1D). In the absence of a ``RegionGeom``, we could simply expect a reco energy and a true energy ``MapAxis``. We do not detail possible approaches in the current PIG; these should be re-evaluated in the more general context of data reduction. Some examples using the current data reduction approach will be exposed to users in a dedicated notebook.

Both approaches will result in a list of ``Datasets`` consisting of ``MapDataset`` or ``SpectrumDatasetOnOff`` with identical geometries for each time bin.

In order to properly assign times to any ``Dataset``, the latter must carry the time information. Minimally, this can be done with meta information such as ``time_start`` and ``time_stop``. This does not give a full idea of the coverage of the time bin, though. Ideally, the ``Dataset`` should track the ``GTI`` table of the filtered ``EventList`` that was used for its production. It can be stored on the ``Dataset`` itself and be serialized independently, or be added as an extension to the serialized ``counts`` data container (a ``CountsSpectrum`` or ``Map``), as is done for OGIP spectra.

In order to stack several ``Dataset`` objects, it is necessary to be able to combine ``GTI`` tables. While a simple stacking of tables is enough in many cases, the situation of overlapping time intervals should be considered and a proper ``GTI.union()`` should be introduced (a functionality recurrent in HEA software, see e.g. the ftools ``mgtime``; a sketch follows below).

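As an illustration, such an interval union can be sketched with plain ``astropy.table.Table`` objects holding ``START`` / ``STOP`` columns in seconds; the helper name ``gti_union`` and the toy values below are made up for illustration and are not the proposed Gammapy API:

::

    from astropy.table import Table, vstack

    def gti_union(table1, table2):
        # Stack both GTI tables and sort the intervals by start time
        table = vstack([table1, table2])
        table.sort("START")
        starts, stops = [], []
        for row in table:
            if stops and row["START"] <= stops[-1]:
                # Interval overlaps the previous one: extend it
                stops[-1] = max(stops[-1], row["STOP"])
            else:
                starts.append(row["START"])
                stops.append(row["STOP"])
        return Table({"START": starts, "STOP": stops})

    t1 = Table({"START": [0.0, 10.0], "STOP": [5.0, 20.0]})
    t2 = Table({"START": [4.0], "STOP": [12.0]})
    print(gti_union(t1, t2))  # single merged interval [0, 20]
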
Data Fitting
------------

The data fitting is the step where the ``Datasets`` are converted into a lightcurve, based on a model of the emission. The amplitude of the source model is fitted while other parameters are usually, but not necessarily, fixed. The fitting is very similar to the extraction of ``FluxPoints`` in the spectral analysis and does not strongly depend on the type of ``Datasets`` considered. The new ``LightCurveEstimator`` class should therefore be able to take as input both ``SpectrumDatasetOnOff`` and ``MapDataset``, in a similar fashion to the current ``FluxPointEstimator``.

After creating the ``Datasets``, the user would first define the model to be used and freeze its parameters. Then we would apply it to the various ``Dataset`` objects before calling the ``LightCurveEstimator``:

::

    sky_model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model)
    sky_model.parameters["index"].frozen = True
    sky_model.parameters["lon_0"].frozen = True
    sky_model.parameters["lat_0"].frozen = True

    for dataset in datasets:
        dataset.model = sky_model

    lc_estimator = LightCurveEstimator(datasets)
    lightcurve = lc_estimator.run()

To help the user deal with model parameters, some helper function could be introduced to freeze all parameters of a model. Another possibility is to assume that all parameters except the ``amplitude`` are fixed in the ``LightCurveEstimator.run()`` method, or to pass the names of the parameters to leave free as arguments.

Storing the results and further studies
---------------------------------------

The results are returned as a ``gammapy.time.LightCurve`` instance (the current container class for light curves) that, so far, essentially holds the integrated flux + errors and the time_min/time_max of each time bin. There are many other quantities which could be stored, such as the energy range of the flux, the number of excess events in the time bin considered, etc. The current container class already provides some methods to study variability, such as a chi-square test and fractional variance estimation. Example usage:

::

    lightcurve.plot('amplitude')

    # Get fractional variance
    print(lightcurve.fvar('amplitude'))

The ``LightCurve`` object should be able to provide some rebinning functionality. In particular, if it stores the fit statistic scan, it should be possible to perform minimal significance rebinning:

::

    simple_rebin_lc = lightcurve.rebin(factor=2)
    # or
    min_significance_lc = lightcurve.rebin(min_significance=3)

Discussion / Alternatives
=========================

Time bins
---------

To work with time bins, we could also rely on ``astropy.timeseries`` if we force the dependency to astropy v>3.2. This would look like:

::

    import astropy.units as u
    from astropy.timeseries import BinnedTimeSeries

    times = BinnedTimeSeries(time_bin_start="2006-07-29 20:00:00.000",
                             time_bin_size=5 * u.min, n_bins=120)
    print(times.time_bin_start)
    print(times.time_bin_end)

Light Curve Fitting
-------------------

We could provide distinct objects specialized for ``Map`` and ``Spectrum`` datasets instead of a single object.

Lightcurve
----------

The ``Lightcurve`` class contains an ``astropy.table.Table`` object. It could be improved by using the ``astropy.timeseries.BinnedTimeSeries`` object, which itself inherits from ``QTable`` and provides support for row selection with time, rebinning, and more complex methods for detailed timing studies such as Lomb-Scargle periodograms.

Task list
=========

- Add ``TSTART`` and ``TSTOP`` meta info on the ``Dataset`` as well as the ``GTI`` table.
- Introduce a ``GTI.union()`` method to merge ``Datasets``.
- Refactor the ``Lightcurve`` class to add a number of new contents in the ``Table``, e.g.:

  - a ``fit_stat_scan`` column
  - rebinning methods

- Implement the new ``LightCurveEstimator`` class.
- Provide a notebook showing data reduction and fitting examples for both the 3D and 1D cases.

Decision
========

The authors have decided to withdraw the PIG. A significant part of the implementation is already there.
Besides, the recent additions of the high level interface and of the new Maker scheme change the scope of the discussion.

.. _GH 1451: https://github.com/gammapy/gammapy/pull/1451
.. _good time interval: http://heasarc.gsfc.nasa.gov/docs/heasarc/ofwg/docs/rates/ogip_93_003/ogip_93_003.html#tth_sEc6.3
.. _current lightcurve tutorial: https://docs.gammapy.org/0.14/notebooks/light_curve.html

gammapy-1.3/docs/development/pigs/pig-012.rst

.. include:: ../../references.txt

.. _pig-012:

*****************************
PIG 12 - High level interface
*****************************

* Author: José Enrique Ruiz, Christoph Deil, Axel Donath, Regis Terrier, Lars Mohrmann
* Created: Jun 6, 2019
* Accepted: Aug 19, 2019
* Status: accepted
* Discussion: `GH 2219`_

Abstract
========

The high level interface is one of the projects considered in the Gammapy roadmap for Gammapy v1.0 (see :ref:`pig-003`). It should be easy to use and allow users to do the most common analysis tasks and workflows quickly. It would be built on top of the existing Gammapy code-base, first on its own, but developing it would likely inform improvements in code organisation throughout Gammapy. Achieving a stable high level interface should allow us to continue improving the Gammapy code-base without breaking user-defined workflows or recipes made with this high level interface.

We propose to develop a high level interface Python API, similar to Fermipy or HAP in HESS, based on a single ``Analysis`` class communicating with a set of tool classes, and supporting config-file driven analysis of the main IACT source analysis use cases.

What we have
============

We have been using `Click`_ to develop a very small set of tools for an embryonic `Gammapy command line interface`_. Among the existing tools (``gammapy image``, ``gammapy info``, ``gammapy download``, ``gammapy jupyter``), only `gammapy image`_ can be considered as potentially needed in a data analysis process. It actually creates a counts image from an event-list file and an image that serves as a reference geometry. Hence, we have a code set-up in ``gammapy.scripts`` that we will not use for the moment to expose the high level interface API, but to develop the very small set of specific command line tools identified (i.e. to perform long-time processing tasks, see the *Command line tools* section below).

We have a set of `Jupyter notebooks`_ as examples of tutorials and recipes demonstrating the use of Gammapy. These notebooks are continuously tested and are one of the pillars of the user documentation. We could check that most of the use cases are covered by the high level interface with the help of these notebooks. We will have to translate most of them to use the high level interface, but we could also use them as a basis for experimental automated workflows driven by parametrized notebooks executed with `papermill`_ (see the *Outlook* section below). Moreover, some Python scripts have been added recently to perform `benchmarks`_ of Gammapy; surely we could rewrite some or all of these benchmarks to use the high level interface.

We also have some *high level analysis* classes in the API that concatenate several atomic actions and provide rough estimated results for more complex processes
(e.g. `SpectrumAnalysisIACT`_, `LightCurveEstimator`_). These classes could serve as a basis to design and prototype some of the tools of the high level interface.

Proposal
========

We will develop a high level interface Python API which uses a parameter-value configuration file defined by the user. This API would be used in Python scripts, notebooks or IPython sessions to perform simple and common IACT analyses. We see two main options on how to use the high level interface API:

- Within an IPython session or notebook, mostly dealing with a manager object to perform specific tasks in an **interactive** analysis process
- In a Python script or notebook, declaring the orchestration of the tasks with a manager object for an **automated** process

This high level interface API is similar to what is done in `Fermipy`_ or HAP in HESS, including the options to save and recover session states, as well as serialization of intermediate data products and logging. It is flexible enough to allow the user to work with the API at any stage of the analysis process, not only from the start to the very end or in an automated process.

**Use cases**

The use cases covered are in the scope of a single analysis and model, not parametrized variations in a multidimensional grid space of variables, and within a single region (e.g. a 10 deg region with 5 sources). The main use cases for analysis to be covered are:

- 3D map analysis
- 2D map analysis
- 1D spectrum analysis
- Light curve estimation

Including the main methods for data reduction, modeling and fitting:

- On vs on/off data reduction
- Different background models
- Joint vs stacked likelihood fitting
- Diagnostics (residuals, significance, TS)
- Spectral flux points

Making a SED may be the final part of the analysis, as many SED methods require the full model and all energy data.

**Configuration file**

The configuration file will be in YAML format, exposing the parameters and values needed for each one of the tasks involved in the analysis process. To generate the config file, we could add a ``gammapy analysis config`` command line tool which dumps the config file with all lines commented out, so that users can then uncomment and fill in the parameters and values they care about. As an alternative, users could copy & paste from a config file example in the docs. We will develop a schema and validate / give good error messages on read.

We roughly sketch below an example of a prototype configuration file in YAML format, just to illustrate how a structured schema could expose most of the parameters/values needed in a data analysis process. The configuration file should be explicit enough for the users to understand which parameters to edit in order to define a specific configuration for an analysis session or workflow, and should use units for quantities where it makes sense, e.g. "angle: 3 deg" instead of "angle: 3". The final schema for this configuration file will be achieved iteratively during the development of the high level interface and later eventual improvements, also taking user feedback into account.

Prototype configuration file:

.. code-block:: yaml

    analysis:
        process:
            # add options to allow either in-memory or disk-based processing
            out_folder: "."  # default is current working directory
            store_per_obs: {true, false}
        reduce:
            type: {"1D", "3D"}
            stacked: {true, false}
            background: {"irf", "reflected", "ring"}
            roi: max_offset
            exclusion: exclusion.fits
        fit: energy_min, energy_max
        logging:
            level: debug
        grid:
            spatial: center, width, binsz
            energy: min, max, nbins
            time: min, max
            # PSF RAD and EDISP MIGRA not exposed for now
            # Per-obs energy_safe and roi_max ?
    observations:
        data_store: $GAMMAPY_DATA/cta-1dc/index/gps/
        ids: 110380, 111140, 111159
        conesearch:
            ra:
            dec:
            radius:
        energy_min:
        energy_max:
        time_min:
        time_max:
    model:
        # Model configuration will mostly be designed in a different PIG
        sources:
            source_1:
                spectrum: powerlaw
                spatial: shell
        diffuse: gal_diffuse.fits
        background: IRF

**API design**

The design of the high level interface API is driven by the use cases considered, the different tools (tasks) identified and their responsibilities, as well as the need for a main ``Analysis`` session object that drives and orchestrates internally the different tools involved, their inputs and products. The ``Analysis`` session object will be initialized with a configuration file (see the *prototype configuration file* above) and will be responsible for instantiating and running the different tool classes (see the *session workflow* below). The tools are middle management agents (e.g. MapMaker, ReflectedBgEstimator, ...) responsible for performing the different tasks identified in the use cases covered.

The ``Analysis`` session object will provide access to every object involved and every data structure produced during the session. ``Analysis`` method calls will produce and modify datasets (i.e. models, maps, ...), but in between method calls advanced users can do a lot of custom processing on their own with scripting, using the Gammapy Python toolbox. The code of the ``Analysis`` class, as well as any other eventual class needed, will be placed in ``gammapy.scripts``. This module will also contain the set of different command line tools provided, where some small cleaning and refactoring may be needed (i.e. removing the ``gammapy image`` command line tool).

**Serialisation**

There will be the possibility to save and recover session states with their associated data products. The user could also choose in the configuration file, with the help of a boolean parameter, to work with serialised intermediate products delivered by the tools instead of in memory. The state serialisation will be a mix of YAML (i.e. models, state) and FITS files (i.e. maps), where the delegated tools should know how to serialise and read themselves. The solution to address serialization of the different datasets by the different tools is not in the scope of this PIG.

**Session workflow**

.. code-block:: text

    $ mkdir analysis
    $ cd analysis
    $ edit gammapy_analysis_config.yaml

Then the user would type ``ipython``, ``jupyter notebook`` or write a script with the code below.

.. code-block:: python

    from gammapy.analysis import Analysis

    analysis = Analysis(config)

    # Select observations using the parameters defined in the configuration file.
    analysis.select_observations()

    analysis.reduce_data()  # often slow, can be hours

    # If the user wants they can save all results from data reduction and re-start later.
    # This stores config, datasets, ... all the analysis class state.
    # analysis.write()
    # analysis = Analysis.read()

    analysis.optimise()  # often slow, can be hours

    # Again, we could write and read, do the slow things only once.
    # e.g. supervisor comes in and asks about significance of some model component or whatever.
    # analysis.write()
    # analysis = Analysis.read()

    # Since anything is accessible from the Analysis object,
    # many advanced use cases can be done with the Analysis API.
    analysis.model("source_42").spectrum.plot()

    # Should we need energy_binning for the SED points in config or only here?
    sed = analysis.spectral_points("source_42", energy_binning)

**Command line tools**

In addition to ``gammapy analysis config``, we will have a ``gammapy analysis data_reduction`` and a ``gammapy analysis optimise`` command, which perform the long processing tasks from the terminal, outside an IPython session or Jupyter notebook, using all the information from the config file and/or saved state.

- ``gammapy analysis config``: dumps a template configuration file
- ``gammapy analysis data_reduction``: performs a data reduction process
- ``gammapy analysis optimise``: performs a model fitting process

Outlook
=======

Some of the use cases not covered by this high level interface API are the following:

- Generation of simulated events and/or counts
- Iterative source detection methods
- Complex or memory-intensive processing of lightcurves

These use cases can actually be addressed using Gammapy as a Python toolbox, though in the near future some of them could be incorporated progressively into the high level interface. For example, making a lightcurve could be part of one Analysis (like it is in Fermi), or could be done at a higher level, creating Analysis instances and running them for each time bin. This exercise of weighing pros and cons is left to :ref:`pig-006`.

We could expect that restructuring all of Gammapy to be tools based, with good tool chains and config handling, will eventually be achieved after a certain time if we define a solid high level interface API.

We could explore the use of `papermill`_ to run workflows defined in a notebook using the values of the parameter-value configuration file defined for the high level interface API (see the sketch below). The notebooks could be provided as skeleton templates for specific use cases or built by the user. This option would provide a rich-text formatted report of the analysis process execution.

One extra dimension that arises from the development of this high level interface API is the possibility to capture data provenance as structured logs of task executions in a session or in an automated workflow. Capturing provenance in this way is more useful if we also provide the means for it to be easily queried and inspected in multiple ways, allowing also forensic studies of the research analysis process as well as improving reproducibility and reuse by the community. This work will be the scope of another PIG.

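As an illustration of the papermill idea mentioned above, executing a parametrized notebook could look like the following sketch (the notebook file names and the ``config_path`` parameter are made up):

.. code-block:: python

    import papermill as pm

    # Execute a template notebook, injecting the config file path as a parameter
    pm.execute_notebook(
        "analysis_template.ipynb",
        "analysis_run.ipynb",
        parameters={"config_path": "gammapy_analysis_config.yaml"},
    )
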
Alternatives
============

A different approach to this high level interface API is that of command line tools executed from the terminal, which is what `Fermitools`_ and `ctools`_ do, where each tool is simple/atomic enough to allow users to inspect the output results before taking a decision on how to run and set the parameter values for the next tool. A similar approach could be taken with the Gammapy high level interface API, but inside a notebook or IPython session. Concerning the code implementation, `ctapipe tools`_ provides a solution based on Python traitlets, acting as an extensible framework to easily transform Python classes into command line tools. We will explore the adoption of this approach after Gammapy v1.0, since it requires a considerable refactoring effort on the Gammapy code-base.

Another config-file based solution is what is implemented in `Enrico`_. It performs basic orchestrated analysis workflows using a set of input parameters that the user provides via a configuration file. The user may be guided in the declaration of values for the config file by an assistant command line tool for config-file building, which asks for parameter values and provides defaults. This is done in `Enrico`_ with ``enrico_config`` and ``enrico_xml``, where each workflow is set up and then run with its own command line tool. In our case, we define the workflow steps in a simple Python script and declare parameter-value pairs in a configuration file. The Python script is then executed to run the workflow. Python scripts and/or notebook files could also be generated with an assistant command line tool. Then, the user could edit and tweak the config files, scripts or notebooks. There isn't much precedent for this workflow in science, but a lot of dev-ops and programming tools work like that; it is a standard technique. Random examples of such tools are the `Angular CLI`_ and `cookiecutter`_.

Task list
=========

Required for Gammapy v1.0:

- Prototype for a manager class, agents, tools, etc.
- Define a syntax for the declaration of parameter-value pairs needed for all tools in the analysis process.
- Develop the session manager class responsible for driving the orchestration of tools in the analysis process.
- Develop the tool classes responsible for performing each one of the tasks in the analysis process.
- Design use cases and/or choose among the existing tutorials or benchmarks those that may be translated into high level interface notebooks.
- Provide notebooks using the high level interface API for each of the chosen tutorials, benchmarks and/or use cases identified.
- Add documentation for the high level interface API and clean the list of documentation tutorials, making a distinct separation between those using Gammapy as a high level interface API and those using Gammapy as a Python toolbox.

Extra features (command line tools):

- Develop the small set of helper command line tools described above.
- Develop an assistant command line tool that produces Python scripts and/or notebooks using the high level interface API.
- Clean and refactor the ``gammapy.scripts`` module to remove old and unused command line tools.
- Clean the present documentation on ``gammapy.scripts`` to transform it into documentation of the helper command line tools.

Decision
========

The PIG has been discussed at the Gammapy coding sprint in July 2019. A final review announced on the Gammapy and CC mailing lists provided additional comments that were addressed in `GH 2219`_. The PIG was accepted on August 19, 2019.

.. _GH 2219: https://github.com/gammapy/gammapy/pull/2219
.. _Gammapy command line interface: https://docs.gammapy.org/0.12/scripts/index.html
.. _gammapy image: https://docs.gammapy.org/0.12/scripts/index.html#example
.. _Enrico: https://enrico.readthedocs.io/en/latest/configfile.html
.. _Fermitools: https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/overview.html
.. _Jupyter notebooks: https://docs.gammapy.org/0.12/tutorials.html
.. _Angular CLI: https://cli.angular.io/
.. _papermill: https://github.com/nteract/papermill/
.. _cookiecutter: https://cookiecutter.readthedocs.io
.. _SpectrumAnalysisIACT: https://docs.gammapy.org/0.12/api/gammapy.scripts.SpectrumAnalysisIACT.html
.. _LightCurveEstimator: https://docs.gammapy.org/0.12/api/gammapy.time.LightCurveEstimator.html
.. _benchmarks: https://github.com/gammapy/gammapy-benchmarks
.. _ctapipe tools: https://ctapipe.readthedocs.io/en/latest/api-reference/tools/index.html

gammapy-1.3/docs/development/pigs/pig-013.rst

.. include:: ../../references.txt

.. _pig-013:

**********************************************
PIG 13 - Gammapy dependencies and distribution
**********************************************

* Author: Christoph Deil, Axel Donath, Régis Terrier, Brigitta Sipocz
* Created: Jun 6, 2019
* Accepted: Sep 9, 2019
* Status: accepted
* Discussion: `GH 2218`_

Abstract
========

Now that we have a good way to distribute Gammapy and to ship a science tool environment to users, we propose to drop support and testing for old versions of dependencies and for alternative distribution channels. Concretely, we propose to require Python 3.6, Numpy 1.16 and Astropy 3.2 starting with Gammapy v0.14, and to remove the Macports installation instructions from the Gammapy documentation. We think the impact on users is small (none for most), but the benefit for Gammapy developers and maintainers is big, allowing us to progress more quickly. If you use Gammapy and need to run on old machines or exotic platforms, and this change doesn't work for you, let us know!

Introduction
============

Since Gammapy v0.7, released in Feb 2018, the recommended way to install Gammapy and its dependencies has been via conda. This has worked well, allowing us to ship a reproducible science tool environment with up-to-date versions to all Gammapy users for each Gammapy release. Each stable release of Gammapy is first published as a source distribution on https://pypi.org/project/gammapy/. Then conda binaries for Linux, macOS and Windows are built via conda-forge and uploaded to https://anaconda.org/conda-forge/gammapy. Finally, we write a conda environment specification file that we publish on gammapy.org, and the end-user installation instructions look like this::

    curl -O https://gammapy.org/download/install/gammapy-0.13-environment.yml
    conda env create -f gammapy-0.13-environment.yml
    conda activate gammapy-0.13

Dependencies
------------

For the rest of this document, we would like to define and describe what we mean when talking about "required" and "optional" dependencies for Gammapy:

- the required dependencies are the Python packages that get automatically installed when running ``pip install gammapy`` or ``conda install gammapy -c conda-forge``.
- the optional dependencies are the ones listed in the conda environment specification that are not already required dependencies.

For example, Numpy and Astropy are required dependencies of Gammapy, and matplotlib, Jupyter or Naima are optional dependencies. Complete tables of all required and optional dependencies that we have at the moment are given below. The choice of which dependencies are required or optional (or neither) is something we make in the metadata of the Gammapy source distribution. Our reasoning is to declare a dependency as required if it is needed by the vast majority of Gammapy users, and for optional dependencies, we include all packages that are used somewhere within the Gammapy package, or in examples and tutorials in the Gammapy documentation.

We note that these two sets of dependencies are just the default, recommended sets; it is possible to install Gammapy even without installing the required dependencies, and of course users can install and use Gammapy together with other Python packages, e.g. scikit-learn or whatever they like.

Distributions
-------------

There are many distribution channels for Gammapy, as is the case for most open-source software. E.g. at this time there is a Debian and a Macports package for Gammapy. conda is not the only distribution channel, but it is the only fully supported one for Gammapy. The reason for this is that currently manpower and expertise in the Gammapy team are limited, and conda provides, as far as we know, a solution that works for all users. At this time, no-one from the Gammapy team is using Macports any more, for pip there are no binary wheels published yet for Gammapy and astropy-regions, Debian only works for Debian users and has a longer update cycle compared to the current Gammapy 2 month release cycle, and e.g. for Homebrew no-one has packaged Gammapy so far.

Required dependencies
=====================

We propose to update the Gammapy required dependencies as shown in the following table (the release dates for the packages, shown in parentheses, were obtained from https://pypi.org/).

=============== ================ ================
Dependency      Gammapy 0.13     Gammapy 0.14
=============== ================ ================
Python          3.5 (Sep 2015)   3.6 (Dec 2016)
Numpy           1.10 (May 2016)  1.16 (Jan 2019)
Scipy           0.15 (Jan 2015)  1.2 (Dec 2018)
Astropy         2.0 (Jul 2017)   3.2 (Jun 2019)
regions         0.4 (Jun 2019)   0.5 (Sep 2019)
pyyaml          unclear          5.1 (Mar 2019)
click           unclear          7.0 (Sep 2018)
jsonschema      --               3.0 (Feb 2019)
=============== ================ ================

We already mentioned the possibility to drop Python 3.5 support in :ref:`pig-003`. One reason is that Anaconda and conda-forge (our main distribution channel, used in our continuous integration testing setup) only contain Python 2.7, 3.6 and 3.7 at this point (3.7 was added in fall 2018), i.e. testing on Python 3.5 is already extra effort. Also, Python 3.6 contains some nice new features that developers can use. E.g. Sunpy and ctapipe already require Python 3.6 or later.

A major motivation to update to very recent versions is that the ``regions`` package is still under development (see :ref:`pig-010`). In Gammapy 0.13 we require regions 0.4, and we plan to make a regions 0.5 release with further features and fixes in September, and to require that for Gammapy 0.14.

For ``pyyaml``, we would like to use the recent version, since it allows writing with preserved dictionary key order, giving a nice output, whereas previously it always sorted keys alphabetically on YAML write.

The current Gammapy command line interface is using ``click``. Whether to keep this or to use something else will be discussed in `PIG 12`_. For now, we propose to keep things as-is, and only specify a minimum version that we'll test, although in practice we didn't have any version-dependent issues with ``click`` in the past years.

We plan to use ``jsonschema`` to validate YAML config files. It's a small, pure-Python dependency.

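As an illustration of the kind of config validation we have in mind, a minimal sketch (the schema fragment below is made up and is not the actual Gammapy config schema)::

    import jsonschema
    import yaml

    # Made-up schema fragment: require a "logging" section with a string "level"
    schema = {
        "type": "object",
        "properties": {
            "logging": {
                "type": "object",
                "properties": {"level": {"type": "string"}},
            },
        },
        "required": ["logging"],
    }

    config = yaml.safe_load("logging:\n  level: debug\n")
    jsonschema.validate(config, schema)  # raises ValidationError if invalid
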
Optional dependencies
=====================

We propose to update the Gammapy optional dependencies as shown in the
following table (the release dates shown in parentheses were obtained from
https://pypi.org/).

=============== ================ ================
Dependency      Gammapy 0.13     Gammapy 0.14
=============== ================ ================
ipython         7.3 (Feb 2019)   7.6 (Jun 2019)
jupyter         1.0 (n/a)        1.0 (n/a)
jupyterlab      0.35 (Oct 2018)  1.0 (Jun 2019)
matplotlib      2.1 (Oct 2017)   3.0 (Sep 2018)
pandas          0.24 (Jan 2019)  0.25 (Jul 2019)
healpy          1.11 (Aug 2017)  1.12 (Jun 2018)
reproject       0.4 (Jan 2018)   0.5 (Jun 2019)
uncertainties   3.0 (Aug 2016)   3.1 (May 2019)
iminuit         1.3 (Jul 2018)   1.3.7 (Jun 2019)
sherpa          4.11 (Feb 2019)  4.11 (Feb 2019)
naima           0.8 (Dec 2016)   0.8.3 (Nov 2018)
emcee           --               2.2 (Jul 2016)
corner          --               2.0 (May 2016)
parfive         --               1.0 (May 2019)
=============== ================ ================

We plan to use ``parfive`` for tutorial notebook and example dataset file
download (it features parallel downloads and a progress bar).

Distributions
=============

The situation concerning distribution was described in the introduction
above; some alternatives and future work are mentioned in the outlook and
alternatives sections below.

We propose to remove the Macports installation page from the Gammapy
documentation, and to just leave a mention that it's outdated and unsupported
on the page on "other ways to install Gammapy". Short sections or pages about
pip and Debian will remain at the end of the installation instructions, for
advanced users.

For conda, we will improve the installation instructions, explaining the
difference between using the recommended environment and a plain
``conda install -c conda-forge gammapy``, and add lists of required and
optional dependencies as above to the Gammapy installation documentation.

Outlook
=======

This PIG describes only the status and the very short-term plan for Gammapy
v0.14 and v1.0! We expect that dependencies and distribution will evolve in
the coming years. E.g. we might add Numba as a dependency, or we might start
to fully support pip, if binary wheels for Gammapy and all dependencies
become available (there is a recent effort to implement this).

Somewhat related to this PIG: we plan to modernise Gammapy packaging
following the recommendations in `APE 17`_. Work on this has already started
in `GH 2279`_. We could also support other ways to ship Gammapy, e.g.
Homebrew (which works on macOS and now also on Linux), or Docker images
(which work anywhere and are self-contained).

Alternatives
============

Since we have a working setup already, we could do nothing, which seems nice
at first, but really means a continued time sink and suffering for the
Gammapy maintainers of the continuous integration testing system and for the
release manager.

For conda, we should strongly consider introducing a ``gammapy-all``
metapackage instead of the current ``gammapy-environment.yml`` shipped via
``gammapy.org``. That's what e.g. the Fermi tools, glueviz and others do, and
they like that solution. It has the advantage of using a conda-native
solution to the question of how to ship an environment. This could be
prototyped any time, in parallel with the existing solution, to gain
familiarity with metapackages.

Task list
=========

- Update the continuous integration test matrix to test against the minimum
  required versions as specified here (and newer versions as well)
  (`GH 2270`_).
- Drop Python 3.5 support and modernise the codebase (e.g. use f-strings and
  use dict instead of OrderedDict).
- Review the Gammapy codebase and remove workarounds for old Python, Numpy,
  Astropy, ... versions.
- Remove ``extras_require`` in ``setup.py`` (don't attempt to maintain the
  list of optional dependencies there).
- Review, restructure and update all Gammapy installation instructions.
- Clearly describe all required and optional dependencies in the docs.

Decision
========

This proposal was extensively discussed at the July 2019 Gammapy coding
sprint. In the feedback phase we didn't get a single comment asking for
continued support of older versions. Thus the proposal was accepted on
Sep 9, 2019.

.. _GH 1167: https://github.com/gammapy/gammapy/pull/1167
.. _GH 1245: https://github.com/gammapy/gammapy/issues/1245
.. _GH 1586: https://github.com/gammapy/gammapy/pull/1586
.. _GH 1658: https://github.com/gammapy/gammapy/pull/1658
.. _GH 1863: https://github.com/gammapy/gammapy/pull/1863
.. _GH 2218: https://github.com/gammapy/gammapy/pull/2218
.. _GH 2270: https://github.com/gammapy/gammapy/issues/2270
.. _GH 2275: https://github.com/gammapy/gammapy/issues/2275
.. _GH 2279: https://github.com/gammapy/gammapy/issues/2279
.. _PIG 12: https://github.com/gammapy/gammapy/pull/2219
.. _PIG 14: https://github.com/gammapy/gammapy/pull/2255
.. _APE 17: https://github.com/astropy/astropy-APEs/pull/52
.. _PEP 427: https://www.python.org/dev/peps/pep-0427/
.. _astropy-healpix GH 128: https://github.com/astropy/astropy-healpix/issues/128
.. _Astropy whatsnew: https://docs.astropy.org/en/stable/whatsnew/
.. _astropy wheel-forge: https://github.com/astropy/wheel-forge
.. _astropy-healpix: http://astropy-healpix.readthedocs.io/
.. _gammapy-0.12-environment.yml: https://github.com/gammapy/gammapy-webpage/blob/gh-pages/download/install/gammapy-0.12-environment.yml
.. _Installation with Macports: https://docs.gammapy.org/0.12/install/macports.html
.. _Gammapy in Macports: https://github.com/macports/macports-ports/commits/master/python/py-gammapy/Portfile
.. _pip requirements file: https://pip.pypa.io/en/stable/user_guide/#requirements-files
.. _extras_require in Gammapy setup.py: https://github.com/gammapy/gammapy/blob/fe8ca7d6caac77b8a31efc8bec3b21d09aacf6c1/setup.py#L115-L127
.. _conda metapackage: https://docs.conda.io/projects/conda-build/en/latest/resources/commands/conda-metapackage.html
.. _fermitools conda meta.yaml: https://github.com/fermi-lat/Fermitools-conda/blob/master/meta.yaml
.. _glueviz conda meta.yaml: https://github.com/conda-forge/glueviz-feedstock/blob/master/recipe/meta.yaml
.. _pysal: https://pysal.org/docs/install/
.. _sunpy pip installation instructions: https://docs.sunpy.org/en/stable/guide/installation.html

gammapy-1.3/docs/development/pigs/pig-014.rst

.. include:: ../../references.txt

.. _pig-014:

*******************************
PIG 14 - Uncertainty estimation
*******************************

* Author: Christoph Deil, Axel Donath, Quentin Rémy, Fabio Acero
* Created: June 20, 2019
* Accepted: Nov 19, 2019
* Status: accepted
* Discussion: `GH 2255`_

Abstract
========

Currently Gammapy uses the `uncertainties`_ package to do error propagation
for differential and integral flux of spectral models, e.g. to compute
spectral model error bands. This is cumbersome, since ``uncertainties``
doesn't support ``astropy.units.Quantity`` (which we currently use in
spectral model evaluation), and doesn't work at all for some spectral models
(e.g. EBL absorption and Naima), since ``uncertainties`` uses autodiff and
needs explicit support for any Numpy, Scipy, Astropy, ... function involved.
We propose to replace this with a custom uncertainty propagation
implementation using finite differences. This will be less precise and slower
than ``uncertainties``, but it will work for any derived quantity and code
that we call, and for any spectral model.

We also propose to add support for a second method to propagate uncertainty
from fit parameters to derived quantities, based on parameter samples. This
is the standard technique in MCMC Bayesian analyses, but it can be used for
normal likelihood analyses as well.

Introduction
============

In Gammapy, ``gammapy.modeling`` and specifically the
``gammapy.modeling.Fit`` class support likelihood fitting to determine
best-fit parameters, and to obtain a covariance matrix that represents
parameter uncertainties and correlations. Parameter standard errors are
obtained as the square root of the elements on the diagonal of the covariance
matrix (see e.g. `The interpretation of errors`_). Asymmetric errors and
confidence intervals can be obtained via likelihood profile shape analysis,
using methods on the ``gammapy.modeling.Fit`` class, partially
re-implementing and partially calling the methods available in Minuit or
Sherpa (see e.g.
`Fitting and Estimating Parameter Confidence Limits with Sherpa`_).

However, often one is interested not directly in a model parameter, but
instead in a derived quantity that depends on multiple model parameters.
Typical examples are that users want to compute errors or confidence
intervals on quantities like the differential or integral flux at or above
certain energies, or the location or extension of sources, from spectral and
spatial model fits. Here we will discuss two standard techniques to compute
such uncertainties:

1. Differentials (see `Wikipedia - Propagation of uncertainty`_ or
   `uncertainties`_)
2. Monte Carlo (MC) samples (see `mcerp`_ or `astropy.uncertainty`_)

If you're not familiar with the techniques, here are a few references:

- `Error estimation in astronomy - A guide`_
- `Frequentism and Bayesianism - A Python-driven Primer`_

In Gammapy we currently use the `uncertainties`_ package to propagate errors
to differential fluxes (``spectral_model.evaluate_error``) and integral
fluxes (``spectral_model.integral_error``). The ``evaluate_error`` method is
called for an array of energies in ``plot_error``, which is used to compute
and plot spectral model error bands (see e.g. the `SED fitting tutorial`_).
It uses autodiff to compute differentials of derived quantities with respect
to model parameters, which is accurate and fast where it works. However,
``uncertainties`` doesn't support ``astropy.units.Quantity``, making the
current spectral model flux evaluation very convoluted, and it doesn't work
at all for some spectral models, namely non-analytical models like the EBL
absorption or Naima cosmic ray spectral models, which has been a frequent
complaint by users (see `GH 1046`_, `GH 2007`_, `GH 2190`_).

The MC sample method is most commonly used for Bayesian analysis, where the
model fit directly results in a sample set that represents the parameter
posterior probability distribution. A prototype example for that kind of
analysis has been implemented in the `MCMC Gammapy tutorial`_ notebook.
However, the MC sample error propagation technique can be applied in
classical frequentist statistical analysis as well, if one considers it a
numerical technique to avoid having to compute differentials: one samples a
multivariate normal according to the best-fit parameters and covariance
matrix, and then propagates the samples. One can scale the covariance matrix
to only sample near the best-fit parameters and thus should be able to
reproduce the differential method (within sampling noise).
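To make the two techniques concrete, here is a minimal, self-contained Numpy
sketch that propagates a parameter covariance to a derived quantity both
ways. The toy power-law model, best-fit values and covariance are made up
for illustration; this is not Gammapy code or a proposed API::

    import numpy as np

    def flux(pars, energy=2.0):
        """Toy power law: amplitude * energy ** (-index)."""
        amplitude, index = pars
        return amplitude * energy ** (-index)

    best_fit = np.array([1e-11, 2.3])      # (amplitude, index)
    covariance = np.array([[1e-26, 0.0],   # made-up covariance matrix
                           [0.0, 1e-2]])

    # Parameter standard errors: sqrt of the covariance diagonal.
    errors = np.sqrt(np.diag(covariance))

    # Technique 1: differentials via finite differences.
    # Build the Jacobian d(flux)/d(par), then var = J @ cov @ J.T
    eps = 1e-6 * best_fit
    jac = np.empty(2)
    for i in range(2):
        up, down = best_fit.copy(), best_fit.copy()
        up[i] += eps[i]
        down[i] -= eps[i]
        jac[i] = (flux(up) - flux(down)) / (2 * eps[i])
    flux_err_diff = np.sqrt(jac @ covariance @ jac)

    # Technique 2: Monte Carlo samples. Draw parameter vectors from a
    # multivariate normal and evaluate the derived quantity per sample.
    rng = np.random.default_rng(42)
    samples = rng.multivariate_normal(best_fit, covariance, size=10_000)
    flux_samples = np.array([flux(p) for p in samples])
    flux_err_mc = flux_samples.std()

    print(flux_err_diff, flux_err_mc)  # the two estimates should agree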
Proposal
========

We propose to change the ``SpectralModel.evaluate_error`` implementation to a
custom differential-based error propagation, as shown in the
`Gammapy uncertainty propagation prototype notebook`_. This allows us to
completely remove the use of `uncertainties`_ within Gammapy, and to simplify
the spectral model evaluation code. This will be a bit slower and less
accurate than the current method, but it is a "black box" technique that will
work for any spectral model or code that we call.

In the future we think that an autodiff solution could be useful, not just
for error propagation, but also for likelihood optimisation. It's unlikely
that we'd use `uncertainties`_ though; rather we'd probably use `autograd`_
or `jax`_, or even Tensorflow or PyTorch or Chainer or some other modern
array computing package that supports autodiff. So we think removing
``uncertainties`` now and cleaning up our model evaluation code is a good
step, even if we change to some other framework later.

We also propose to add a method to generate parameter samples from the
best-fit parameters and covariance, and to support MC error propagation (see
the `Gammapy uncertainty propagation prototype notebook`_). This second part
could be done before or after the v1.0 release; it is independent of the
first proposed change. The first step would be to add a test for and improve
the code of the MC sampling interface (see `GH 2304`_). Then we should
probably add a ``Distribution`` object from `astropy.uncertainty`_ to
``Parameter``, or to a new ``FitMC`` class, to store samples from an MC
analysis or multinormal samples from the covariance matrix, and then possibly
add support for such objects in model evaluation and derived quantities.
Another option could be to add support for "model sets" as in
``astropy.modeling``, i.e. arrays of parameter values within one model
object, or to directly change ``gammapy.modeling`` to be based on
``astropy.modeling``.

As you can see, this second part of the proposal is a wish that Gammapy
support MC sample based error propagation; the implementation is something to
be prototyped and worked out in the future (could be now and for v1.0, or any
time later).
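For the error-band use case mentioned in the introduction, the sample-based
method could look like the following sketch (again toy code reusing the
made-up model and covariance from above, not a proposed Gammapy API)::

    import numpy as np

    def flux(pars, energy):
        amplitude, index = pars
        return amplitude * energy ** (-index)

    best_fit = np.array([1e-11, 2.3])
    covariance = np.array([[1e-26, 0.0], [0.0, 1e-2]])

    # Multinormal samples from the best-fit parameters and covariance.
    rng = np.random.default_rng(0)
    samples = rng.multivariate_normal(best_fit, covariance, size=1000)

    # Evaluate the model for every sample on an energy grid.
    energies = np.geomspace(0.1, 100, 50)
    fluxes = np.array([flux(p, energies) for p in samples])  # (1000, 50)

    # 68% central interval per energy: the lower/upper band edges that
    # a plot_error-style method could draw.
    band_lo, band_hi = np.percentile(fluxes, [16, 84], axis=0)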
Alternatives
============

- Support only sample-based uncertainty propagation (like e.g. PyMC or 3ML)
- Support only differential uncertainty propagation (like e.g. Minuit)
- Keep everything as-is, i.e. use `uncertainties`_
- Change to another autodiff package like ``autograd`` or ``jax``

Decision
========

This proposal was discussed extensively on GitHub (`GH 2255`_), and also in
person at the Gammapy coding sprint in Nov 2019. The exact mechanism and
implementation for uncertainty propagation still needs to be worked out (see
the prototype notebook); this will happen at the coding sprint later this
week. No objections to this proposal were received, so it is accepted.

.. _The interpretation of errors: http://lmu.web.psi.ch/docu/manuals/software_manuals/minuit2/mnerror.pdf
.. _Fitting and Estimating Parameter Confidence Limits with Sherpa: http://conference.scipy.org/proceedings/scipy2011/pdfs/brefsdal.pdf
.. _Wikipedia - Propagation of uncertainty: https://en.wikipedia.org/wiki/Propagation_of_uncertainty
.. _Frequentism and Bayesianism - A Python-driven Primer: https://arxiv.org/pdf/1411.5018.pdf
.. _Error estimation in astronomy - A guide: https://arxiv.org/pdf/1009.2755.pdf
.. _Gammapy uncertainty propagation prototype notebook: https://github.com/gammapy/gammapy-extra/blob/master/experiments/uncertainty_estimation_prototype.ipynb
.. _joint_crab fit_errorbands.py: https://github.com/open-gamma-ray-astro/joint-crab/blob/master/joint_crab/fit_errorbands.py
.. _joint_crab results: https://nbviewer.jupyter.org/github/open-gamma-ray-astro/joint-crab/blob/master/2_results.ipynb
.. _gammapy.utils.fitting.Parameters: https://docs.gammapy.org/0.12/api/gammapy.utils.fitting.Parameters.html
.. _MultiNorm: https://multinorm.readthedocs.io
.. _scipy.stats.multivariate_normal: https://docs.scipy.org/doc/scipy-1.3.0/reference/generated/scipy.stats.multivariate_normal.html
.. _MCMC Gammapy tutorial: https://docs.gammapy.org/0.12/notebooks/mcmc_sampling.html
.. _mcerp: https://pypi.org/project/mcerp/
.. _astropy.uncertainty: https://docs.astropy.org/en/stable/uncertainty/index.html
.. _autograd: https://github.com/HIPS/autograd
.. _jax: https://github.com/google/jax
.. _SED fitting tutorial: https://docs.gammapy.org/0.14/notebooks/sed_fitting_gammacat_fermi.html
.. _GH 1046: https://github.com/gammapy/gammapy/issues/1046
.. _GH 1971: https://github.com/gammapy/gammapy/pull/1971
.. _GH 2007: https://github.com/gammapy/gammapy/issues/2007
.. _GH 2190: https://github.com/gammapy/gammapy/issues/2190
.. _GH 2218: https://github.com/gammapy/gammapy/issues/2218
.. _GH 2255: https://github.com/gammapy/gammapy/pull/2255
.. _GH 2304: https://github.com/gammapy/gammapy/pull/2304

gammapy-1.3/docs/development/pigs/pig-016-gammapy-package-organisation-proposal.png
[binary PNG image data omitted]
gammapy-1.3/docs/development/pigs/pig-016-gammapy-package-organisation-status.png
[binary PNG image data omitted]
ΕaÆ-a9snۣsZ,:˕EdtYi6i"5,#^XÕ Ed~a+TIPX\tEmIt\vVO$šLI!yb=ISe1iQb1×&5# 9&QDX̰#.\sMLJ;)]D"|`-{P,XJ+ - D1{챇93;cI,nbd@8蠃l $yi!I,gu(]'~$9$-wd% b3+%3|騪 hr]32c95M:  ;bXV^ye:y-HRӐԅ^XE9M :i3wVjnbH;t\ԴPI, xXp!AŲ6C VLVT2RiQb1×&5# 9&QDX̰#*dXN8ҞM O7.SÆIJ/ob 3$ 1X bjBfӍ6ȸ@~7['Efԛnٳtn-uJr)mr?f,[oɸp:7TIz "K~#=yWxM駟6lIIs事v|'mNڱ- ;bG@w<%Tm8J/0F2SO=u&7ްqs9й"|Uk,݇ՑjabXiG?l^|E3t봳{lFT8\k,t&ϝ7u/"OKgUbq"Ƚ45&D@M@b1C,HU(5S!ɵBg"ܙzS"$ aH,f Ud%eY j5S.MXr浠լ\w,& !q a*hM*/R3` ;ZHkrogKj.J@b1ÎX& Peo\#G{ْD͕v&JE+UC@b1CTU" "\7j~XmM`5Xy7Ú5s)jt2vhWĨfq1c1F4[[/Dk[4&"kL̲,7TsƳj=3tvO?tH9 ?rYkΓ8sq}cv6%O;4^{'|,Ӂ"Е hrݼgUKpmĢbcH狀# b:NbO>1O?dMUW]թĢ_7s9gnAz{?n"PEoy #F0;}8f6zJd믿_|Ѽ_6ua/o+w믿J?-bv^, 򷭶|g楗^*_m/_^r%DMdt3xxo.j/l eqfW.[Xy3X\gI'=쳛uY1sks`" " awH,ײ(რD 4&n"V\sM6>#j.rsEEv'Z p 7 6ؠ* '㏥cz-[w]s1fG.oZsa" e$3EuQ{1Ę:f׷nZNP1'|8p/+FfO{X}駟lfn]|! z)nܿ\΅ 5#Q"B\ꪫtZԿ,׮/mi1Md*HM(?* x3‚nV{S13ϴbFbڶ!PV*˟}_wE(EE`6 H,f82*6+V)2BrE0D+_wX {9kcB"$v]=vbaIC, "UsC\ q0]vm4*ic=fvZ +$jX8$63ZmR͊〥sϵ>#fKbh\q/m p N?R"VPVYeԴף}쳏u+pꉥb~L7>zJR(woUD$"bx7QXcq/<#f5k!"*ݛiV!L6tMV;M2]s5%5xK,ͮ-S" L~Bɷ/,qJbX&RX N9X/*ӈŘ`>uF1X)lA2ۈHhbӋ(YfeFVsp`ղuF(*9*6AzĢ#Wh3 \qJbщ0f1:$3*:{Wk  1,o7a aì;- k%T*XD %ݪy!IG~SU"@Q&כnꫫB}c!|‚o_p1+ ZL,j#8∲̪^ិ&JI#Y %X,.o?w8œx""1U[E@ H,f8"(}'Hkgt.뮻 dB~Y$rFTW0)HE,"9 1KN&N8A:P,":qArsbNrM"Xą2V𡧸Od:SV^'JR\ibu(^o-KO"܋XLۏi&GOU@jE\#XHJ:I@ł˼&ZjBdm=,燙]rKK+9aBoeu@H,f'E! x=&!v@gYt1siٓ"CxE\puSE!j*Y+Y\bgV) +n+]+DsD HH0/"eC@t7Eq7IH$q%ƹḃb#uA€&E~ǝ6lewRaFTb#AbN8:F\!*d# >N̑,iu׍2I;NN,4}ᣧD 5"NN$bQ t>_-/17?bf hL,O x=b1E=}rKkIgM=t`sP" uX\촢E KqlRZK6S> !_P~i]I,cw:T+L  bI:$pY?"bY`IcmrEqxu)kɁ&r],ddr&DDn2,dk-(O{="6oXFrꚶMT0ff`5|Bc#r(]e, PeUnX>=F,5O,cĘ:TJj"ХD@@@b1CE` FX3bɲGc" SO]3=H$ɉ]qB`eq;::2jz]6:߽{w/MXd_I٫-0z0D4#\\?l-\e믿lTܛؤkaAmX-P|YKۏi{""LcB/fó#\Ts^$~8]rǺ u Wx -d!ybD<3,ߜ'7,3O#C%a<.""4-XE -Pm,[V B(  _ n)"LcB^ sI7ُ+uyenѳ$3Y.bI,z17" b#G4}5OE: Ll^ 3J,* "~{d&kvX "?9XoP2RmXpdH,fSU@ire줓N2{lU$maK"c'{1$ ]1xQG٤WB! Bg!ܣGIFKiN#ٙ^>l~"ZJr0iPxג& *4oF䒀b"!LU%" M P55$C !⦟~zD%z$(12_mGibݺu gn _~Z?2s'%?KӖVSjF@c$Wvb0U IDAT4&Mjb1e:, !!"lX& PSV)T i<KE@b.l$3D@D 4nԔUJ,Ua5ҡ" uX [oO_|n\"&@, /lzm[qH*"(M%XtsLU@9 GD,o̅^h3fa@EI$`o5)N;vl hrݾNX̞}e]fFejE@ A@b1n*X2dݳN;4{dHAU@}#o;x`3g{e4{re@qdd:t3<Ө.{J3"9 d-v"]v0EI$3[~S?ٳA4UDsOs駛M6\uUEhژC͚\Vդ.@ D"2h +UD@. bq9X.*"P$ᄏ93-bZk"5]m '9-5j<"xpȪEn[<8wq梋.2[ouwD5~wQ3N1id54@t#&xDDVB 5^2# Jc]5?، /UWU|TVZi%s=uN<xg͢.xYb1~XE&il&{2,Ml ~zǖm#}U:'ٯE+ {Ib1CJ$5H-VYe{N8M f-0lK{K/Ԇ\X|}Z;X̰$3D@bQc{9&s=Yc5L>}2,c&h&_]Ջ@2^z 6<裆,G}tœ%"Zhb(!LU%-"@h2F@ .}MS?ׯ_K&pq~Mj H,&"JbzV:R!7ӹ1l?[ow}rjȐ!v>$MI Q%NL4LcfyfWKzJ`Tiו;XpTH,fSU@] KzerKäEmg/"e]loJ,^N@b1!!LU%"`j+v3T|wQ(b{PbuuŦŦ'umYg$e1(>gswR\q$e]l__J,ܑ, a*^_~el~kgޢ ̯+X쌽{ :[ޭjsq%(bRb}ueŦŦmYO?OJ{{)Qb.X,BW\U1(b{Tb=u8Y3lcUm)AYn]T ;FYӧ᮫J,6} H,6qK.  ]Db!|:9*Yl]Ts>hW.S$S3' bH%ɞg/yW/l~}_~8^0;X&L;f4.s1K f#F,"槟~*f>oF;~m{E{W^yeZPiדO>iÿ^zɌ?f03ݻk;z8D:s1Yn+E]4V j^+ Tйv-vQmnin5*# :ֺR2dFXL>hDKx>ڊ 8VL8N0;=>B2f9紇#4GuTHlִbo5 0~{ h A̺sG9,38/nEjފbzDɊ@]'OQz_H,j$䉀b!8;N*š6tPkaso4믿~9O6$U,?1K/Aئ)346lzKXGWYeW_Yr%͐!CO^vw}giNXL]@fXb]T3~U$# aH,6뮻3<ӞW^ΟiϩŲx9]v٥SN9^_3sWئ}~9 ZbtO ]!E֙"P,y" aoH,6s77W^yeY%7ymmV IGiF}txF㏷񊸥>橧*;>.,.;)!1f2?xY=w_: 6\wu_lfkp.Vp cwoL=:\guDMdZ#DjcZg@ZX\.i{Wlj@zYX̐b0?#33vF-\qeǑQyiY8O?-ͺ[v>nJW$>쳗YZ*hCS;e׎7/ǝtIf}-;wUġKփsm5]vYq,?uBV Eֱ"bzV:$3d,X?̘U>~9mZo ]{mOnkvjȁ>2tI XnsA$+k(of-mlfS ImH7| fbcZg@-йe]Tka+ H,g#O@b1CjO?J=bՇ #(>䓊E,>co߾eu]r%YK_~w38D]f6hb _LK,:tN^ R;js3D -Ŵt\+H,fHYb~ml_5c1CdJEn!Lv"cLw=b1Igm &яdgy&XL;tO jjs3E b!JҖX{Q5\%]7[uLze#IsVlΪj-ūvjVYgUv<+ā1jx.H,?u!Ш]Eܭ?M?b-tl H,fHXb~$-1qDL_~Yv)Һ[n6'q IqRJL! 
bbd4Ӕ&`/8㗝7Ia\aE ~/ﱗkpsu%f ?Ihse3q/B wOzvuvV'ps +`@H)X֙"D"qI&&}SbElIcGwnvwX҇=K\{ Zb, [nqoa"FR?.W_}uۿˬXDĝzꩶ~\_4GiҞ:U,>V̅"xuיCFpeXx?b^.&ݽV/-\7N@bq!;ٱL\R5WUXj)&ar[RoNъZSLQodYz "E`M]kjm'|}Bsb onҊEwڌ:͹#[ςbUGV$"e(K\UQ&)80e饗.K^V˚zϸ_b# NJc76^{mvEg!cݻ=/Xm|Eق#0c5Vh/TN;ep$9ҊjVJSw=NS\Y#$5"D@b1ި#a*VU <6i IB,~U\%K_~/;Q ۿ/O\xXpxîuN@]@)-9! $üfqqk4袋l!bs5 H,f82a!4h!WEN̘x.zO"ص.Aw_s *"PT7CtQX̐j?l馛#Fg9ûVU"zm!nuEh<S$[4rzX"dt$ö!L2PnXc 'i6-l$3< 7o4뮻nwD;k[q]QZD c:(wɨdQHQrA@@@d$U*Y ^gffvgw{vq]ULSUZSP!ʔ)߃=gΜ9BSNq,j!pQʝ;7[*VrhdFC`G!C8k,>=FI{<#@TA:pqx|@#Fg]|TV-7efϞM[v 0ٲe LH|P$oD{}_lL&VիG+VLddaI7o۷isڴipp޽3oD~ Dig,7n,^ϟ??2A2Em=[+E0YčM̛78XR%c`Y jڻwo; #8İbysm&7gfpBZv-UR%҂:{+ :uj7RHE3K,qnݺl2P Ay=/E3(lć_MCxvy4A%d-Wxx!GLaYs#A[ys/8m4QLիWJ2ehƍnֻswm͛G͛7wka| TX?hQp^f d J&8!3?,YRd͹Is3! _^&ޮOocqnj8:P&a'&NΛkE=C+ ks4qDLZcȤhCk}˖-E8m8H8g3eh%m!cRD; # 8cQjU-YU,z '>SX@shhi ͵YG IDAT"(c}A@V"7h"sEתVZG8H8g3eh%mD6mڈbC 19Knϋ,nI$1ÁӌGbӞ)@u2.v;Hc8oU'0T_"R׭[ge9m8g3Fch%mܱcTZ5&#p}.\~pNzH@YgM/17N"AZkhLaUxffIc8oW1̙S{>}04ny=0Y46?Q}jO"͛7*Tb἞͘ɢLQЉ7oF]79=n(ڵkJ*~ʛ7:?PdVOȢ ٳ,QOҵk?d 7.eʔIy_ s%,f19@0Yh&6.Dc8t])!'_b&L^SSΛ-(z,z>j.#Yڬ;T`Aj\ u9n޹CiS> h|dz:x$+\o܈ċGE;ر^ߡ6uj)e>|":pŊΛZ֬AErv}%6c&9rT^ﵧ4ٺo?MY?NYӥVj˗zȳ3Zz ^:M_yzOUKČ=ޭ`]dFU}`᰹^t)իWϴEtg2zhMҹw^ʖ-dAȤV$Yڵ+M4ɔnh4w\j֬NjI‹5z^M6#G'NuA.!s΂8֨QϟO齃Hɢw8Y?Ӝ(Au2dEr܇cש>1TvQTlYuP]#-~Tʟ+نI>Q*,Y=S|qoGdѻ,ڈڴ 6i"mN$ތS*ذ˃ k+GWa[ΡGSj]$Nr8{*7 heNzx]J] E$Rx`tǍ`!맫8 5j`,ڸ,Ů6n(jDnԨQ"^=hΝnm9BLnv 6oLŊa(QjUqNP+?^x@̰HExZ $.q3f0T"_a|Yfu+ǁ֭+# /blxCM,eQxgp6f̘AqUKAֆ0@`:8~L<"|W>TZ`VH–"I::#;4s1-s,Bd6U b}޿QSyèA2{BR ҺiS\clЈ:%~{=2b}b>Ԯ^]W[_dO)mղ%5Rōa 3Y`LmbW^ r(%v.7Ȏ {nA kՊxؾ}{>=MVB "(D5,vЁ>2}@j^kOΣ'ߧ@;9xT12۸kAɈ,"Nh͑,1xf i0YqU0YL/ğ@Q^$1*ђEOo#Ad6ޒEOm{GR)OYF#A9 }!CJwx r1YT$XRكx8 dF&L"B'D&$p݋٭EBlB?j\ :ew:u ;vߑkf@q.M'="㘼%D@IOAJ/\*N4Ȇu6_PY6({+@Kd,ڸ*,]|_ɒ%-%ђE:hHGOW_ Z,"|4y䑆K2֭ۜ[ ՂU_)gΜ^dQ L8Qdp*O~"9 bԖD 'Dj';|Ey=*"̜/]"T߶ubѷi3F [v͔TP!'O)ud>;,AK*yfjY['"[<>ʗ3'3GdIheNz8w7A"iii4l 9<{Tf"Q40Y|&6i+[N$1D7׊,Gј1cDFmOMVz]M%g$%;v9Mi۶axEjdQ s/^\$WK FϽ3Yښ$ׯZ)Ȣ6D߷]&(lsfCANvarqtӬ/"鵫>flt0~W"Ө'ԯ/o=UȢ9k(_nV8HQ=j+[^G}$#>1Lm\LmbW < qM a'71dQ^/#2^Z3*!~gﻆ,pvժU:HA)w"3owF$-«\s&jkajhZ0gf5~.\ȑ#wskEڱ>(ɍgQQTKCKZkւ=*~5 ܻ'ߗQEoƀCEBu֐Ըj=|D]Gr(ش`攳z QCED'[.GJzH9y*?a5"Dʰ#dƕdF0-v'h+Vk׊tR.])ы$""T+CS$'iٲ%͞=udbx:?2?g( E5N:EJv#`@ȿ&cHHD" RPÇbb„~Us3z5S&IvS:s]zҧNEӦ5̩7nzfMR&Mz g/_K}Q)(+aU ~#d1Fɢ+ɢ`Z ᛨE=à iaXvmF?$JkhiҤqL&PO YWQO ,HׯS)7|`y2&@T~6wh_.2Y4gdnx| %*:DF@#F_|A[pGdF0YL]̠!vʕK"q"ЖГ<|?`jذH<tCQ"jjsԜɢs=E{p^@ɢu;[*W,.DL^wbLm,""(!J {+YFɢ+ɢ`ZJ*"Ԫ |^HxCBL\CW^i$Hx+dܹ)biz}饗,]LհE5Z؃EpD٧f͚6᯽%JpK]֯bh@_hxAEՐaz 'dF+1YL]%wQLHQ3q"" 4*xcVl%lѢEg4g0YTdLհkaL}ñgϞj, \)$gc1Es8jdL~kL} x:wL JΉP7|S,YEJp$<$ 7BKhgP\+?رcGǢh>Ɏ-ZPP>doF:fh .\ 3Nڧ~`j@'NЙ3g?PE{_~YX k˒%8W,,`h ([3YL/ƒ_~SNZAE&LZƉo7G̐!ȑ#n>}J׮]D EFGH$Aj<" n^d?fsZ&jXbʹsxc ,b]={V<2?,#`&汊%h!-0YF;a3;`h*2Y:/DnD A~wʘ1:*0YLmb@ɢ˃bdX3Y?Az] xL'#dF1YL``81Y?Lq0Fz] %xL"dF1YL`7ȼ|a#Yd 3/_bŊfh:Rp~m/_LHﮕҥKOdq~y˖-tJ*Ȃ\`A_t۷O p עlwt=Enz셽W'Lm;E&ʯp!T<` H(W3x`jҤݺwN~g=8q}fTr$P&2"o {Fb Yw'AYfy\g >wʯE;dhjaUwQ-[6"dF0YL`lUTHF^'Y4@>ի[SR_fSxfst&6_$=yeC.y6Cɢ2YL`ءC:u%[ !`Er:RUFhVҦM+2Hٚ ,:Juʫ({ayбiӦnO2EK3f'/*LoY^E9c.:<!dFl,]ٓƎ+zzb -,aP&(Zi֬͝;f"\ǎ͛7)I$YjUH|dѳ,ZZJ6ƫ('E%MJ)EdF0}ɢ9R&j*"1bp^5x3=r}7n\BRH=hܸqnܾ};HǏo,8p R"ț[_;wǏ9řK3ld*P^E9#.k[,0Y&ރ˗/ÇM¥~mSk37o͜9uʕ+G%Kwy0`G}[Ʋeˊ,C*S !UD57@j#Gx6FyҥKqWE,aP'M4 GѰa\-ҥ M0A4iu٭_ @]*#co-pR+>}Z'WN~D9zTP ӂɢ}X'_R_.r<`h,Zǔ=Y"eۇ|TbE6b{ѣG=QKXtf8&NTd==y_z%{/I.:wAɢ2Y&6${TVMF]paW-0Fx DBo!kq6*ЍqJ{V˺ɢ}XS8EXCO $D8" w!N $խ[mx$H-^8\ϟ?ߣvl='ɢ,F$Q&:(iI#{Xc!dFl,ZHxV^YF-?EAp& oɑ3:?w߉lx(qUJJhWGB`fF7o2ܸʕK#fĉFdQ?7k$ S,23BR " g'~?x`$Ou &lB%CRǏӹs˛7/ꫦKJ~(d^tBc]-O/}EvWzSԖ̝;S„ .h:YDV]}8eH$f ߨ+dQ^+I#!EMd: 0CUZU2 |x.I%kldokժFDn e(dG  X/vyJaYIlp~e77*82L 
^IYM4C;AǙ%n"iƌSxd6H%nu߯`;vnd̸uP,rj䵃82ʲJ/[̚1!d>,MI+ڸxܕk2YO:y\"2sæ Bؖ [q| DGڵYQ=EABx$= As!qHYݬe,:sTFbf:Ź,OY᭖m1YTDdQ_B7~k2I %!O &^f|E`=WI""21Bts!I}UXݤjb`;mۅƏy5(_WiL_ dIU4֐ɢߠ&6ɢ`޺u&M*:ҥ M0NC RƍŬ[Vz—ɢiȻ71'O)Qذr%3lcoF;Ɲ"Y+W׮ٲ '8,FE%U4֐ɢߠ&6ɢ`2YC3g%KRW$-سޒF&Aի,eǣ̐D\^E5V:E5Z8&6.\ŋit!:w,X~mY&e͚ՅoFC ^ERҥwp|֬Y_krEe˖ݻEF9ȰahŴj*5jӧӲehΝthذ!l2J+\@ +VPرcS W^/^'OP|(eҤdz:x$+\o܈ċGE;ر^ߡ6ujy%6c&9rx- xVOS/}ǏStU冞E3s2,"}54{J:|4e|*S-*^ ?U-QBtqg,I8WѻW1YQ/`hÑ,^p^{5gP ROYd!xg.Eo>Z~=i&EN:E֭Ν;@"An %+V _mhMٳg{ȑ#o߾{謟KiǎYFc$Z6mJSSĉ fDzүdь*U_xRd\d|#9ѣM꟡jdUsThhVIA>~bĈ!NM\G( M_m޻*m/ڕ*XM`Ft)э۷Pe#^=} ojIOO?OkFɦM>~#͛ݼEg.\,Seʤ$YDD<ߡJ:EuS=,`0YzHBA,X \V$B*UDB^~o!3ClP36QUBn߾MI$جMK4-QyG _ A3Be]&(͎e__ɢY;X!V%Wvek?;ȢN_ `8Lyz2-r5O1gqH& bѷig]]%}Lg A,ʼn#~<64cC$YԆ.\Z /d.ZISe˩Ӱ]Pep Bf"]FOۇիjטqdO)mղ%5RY!U5 5ic3YtX_FLep$%a's #J:ѿEc! d_ Gx3C7nHeD7=bƌ)~Nvd8"<ݻӧO3+V,WW\4~AB=E-QY4kd!VpՒE}/ B,g2d$UVʐ&<{3oC5b%2:3 Gwa2?m?x}qY4 E$xTLiq\ B@\רӹDS7n)S|"{Rʴn򖂄"9E8Voiʲ#E,:\,#`&!|AE<  *=mH:,Zo$vaC֭[-[@eMk#2B~Wʑ#^"EU#5mR޼yݮ2~c! ӧ+YZi;%jղ+ŏi u$QfV) %?j. 1$"UʕLB$_qν{d3"|ԠG/Z㏢)i݋ٳYZ5PWBvNδa;\~ wiҘ>H2ȲEmltsғE WR&M"<8o9\g#XI0]MeW3Lyu0End" CKH̦#FPʔ)GA␜>y0{t 7bzDx+6g3z(Ax/ *(,N7AW۶ms o\rX()2vX?nݺ.2,"Y;hɢyEJpceL#EU7P8ߋVDO*U# v8G嗨~[D C˗RIAG7R85E N Ж؈~+?m_vDꔁ(fhmlJU:y2J_PڲRՒ%h\^E`rfjY[ Q>|9sҘ9sDT) m3='=YjqiYS4a4d46=7V,ʿ_jm4/։ /,hp#!"ɓ'EQdݻ7}!\Նġ&"&K̭|RfCՓE*m ep^D4~( H|Zj$w3;~^*2F%Q'|gC2W&!E'дNu+QxqG=|D~F|fӮ_h$S2KӟOYiHNelEn˼~ 4oJoѨȢ6,.;(PY sYCϳCE5d05_1̿Q.O*.@ 8/ĦkָkvN91YDt]t3?U6+1NiI &v!I$7তUB>s =x@OZ$ٳg7w1SL8qHMmddu qn:4GҤIč,%<9rD_d|ĆHKK hTXdVnтlsg:e3W,IF^w9ʝ5Cp(I3t%ʞ!|Uo IDATC3zI}mS:s]zҧNEӦu%}mg(e4JoDag/_KHlQ)(+&m4w):&IzF?,ڸ™,cғ:$Cկ9&-W*IԒEmmRZsJg y8,J#]d輿1֘&"ɢ`+:@hɢuy%rt-\0Y & Ys iɢ1L-/7p<Lm4aE|(2.fϞ]ZrhE6;\oOliҤ[l>Kٲew}W^I;3J_#"ڵkӇZjEH*U*BF$+F'a„4l0ھ};Ŏ[o׮\'Nh yfAڿ.V.%:Ek(p/ZM [oEE… 5+wp0@`۶msNqܱcgΜϟ/֦n dQ5>`h#*?p5j!ݺuqƉzY 4H;ٳg 2 ~? b#O=:k% (_5կ_?ʕ ɢF,bB I"ԀLv~n ̐'OMzΝgft5k<PH.]*^@?;gor۹nϞ=ԫW/BF@%ҦM+^l#EgJA'뜨jvg}&6_?6׏?af 2 q~! 
A ?EO$Ig2|'@ݭ[hݺu矻L6M$)C hf̘Ayqy#zDhm@p=v>3ft#\pܹs-r ke"9lY]s1ݥ}açO҄ \xZ ?;g_t͈g9{,!a _pkAQT)G] 4x,3x.]:0nnuΉ:fw'`hU~cs}EW*!»wD5"6|pA|@HN>-,Pvm?HoFK-ZD 4aSMKB0.")H'(29rH.d|A 9DfxёE>o^ %Bߤh0Q I&=eر:huaf !΍<g-ĉ_F(-Dٺu} YDhŘ6|4d0Z=k={oFE#Y/ upΉ:˾<.#`EׅlA):C(˗//2s,}ϟ?H,,"t֭>p&h"ɋ1cƸ#JexVCtx8);AW=ۈ,s=5" A =Y|wdHZ;Qa럝?ֳ/:fZOH EpvqΉ:˾<.#`EׅlquAqExϟ?/!\鐞H2dҌ,jl}J*w6mҥKG62ʚpڄ<_tY4;_m^_=Y@).\g&}%VtBaYg=ٵx3 z|'^EFuA&9^D}؇ET:?6(0tP H[ժU)C )8$b3Hzr%"'Zn&Mիŋ]^E77(I#?NrcKK7rHBE8Q%}6T$=z YYdъ07\{hEݻ*BPneQ,c݅ 7^E9սLC{:x9Q^E<;!dFYiذ A$їr@bd!ӞgDy $ n6n FE<@@l<ԩS]eŃ ,ڇҞElQCÔ;wng]1Gkax#fD 38^ ^EL uN9Wi0Yb*?$o.L6NbjrKR*oFTuUTջs=8^D}؇ETڳxʚ5# ,@NdE&^7#*X:X*]duN9Q3DɢhL,(8~zg]1E`˖-TD |Hn*U*%fΛ%X%*]T9dѷ?'뜨oV{`h#?$;zw9J*8sos HQTjE՟L}qΉ:f%&6鄇d\r78s #G޽{fPfH_SU.:97u֟uN9Xq#,ڸ.O>!*V#(r9J849oFL"!ɫly楟횒%KFL6@93e'<,zDxs[dGɢ:!y]5C {ng]1E`TZ5%3fĿލ |7~x ΜO"H]tsɢQN9Qg-W2#dFL9s&mVx{e##nܸAy ЩS3fRWQK_~eJt#lbt:hn',xs/!du ͆ FF@ez-:rHjo(*oF3 IIʛ}hIcfuV@N9X9^De_`5Çp޽{)K,~FgCYfW_ѷ~+PUތhuB)<#m>S$i3gN.ZY+9Q`{u}y\Fɢ׀8W^%!;(#G?#3 PWo߾.`kތΝ;,F%>S/_׉:?8^Da;C99~8UPݻGK.Tw=5i҄V\)H"Ȣ›\ݜLah{u6c n &6"ć$M^:m߾&MD:u J*"OAUތn!gg ̵fVk'뜨Y{p;F 0Ye'>$oڴ)͟?P 7W [nѸq>qL  >Û= g ى:'ګg4,h1'>$_nH׾k.B֭[S5(iҤ6"]1!uVZjM2E|ؽ{w/\*aț:8:3uN9Wi0YbN|Hz>:øe$o޼lc)s"^LlD un߾M'ر6nHxڟ$IAǙ ;:;biyuNk񅌀`h#N|HF7}dL]|9-[6m]s0@4iVZTN*UTlțb]wҒ%Kɓte FGRNMҥҥKSݺuErĉ:'ꬂYF@"dƵdQϩSDmӥKٳg6"PA9263d@ӧ'OoF#_vFIǏŏ^uAO,CpiǎK={,YмyH"A҄eFdz9sBRJp؜xs7,DEEthWLj8͈rxgP.f̘!ʵ0#z˗I:F'뜨 /,ho&6Ю,:p>͛ 8;wnʚ5+:tŋ` x8F{r]vԢE 3gyquN %`h#Lmӡ]1Yt|T7#>ˑ]C <#;e˖x+W.;4كuN٤9#,3EthWLj8͈rd>Y}82؇ʺ"3M4ɾɉ:'0@ ,l&f vLC׶Q͌7#αĉk׮tʕ$qR s5kZ8^DeO0E3(ld$P!܌b7f9v/^8;vn޼YSF֭[D`{u1yF$LMeE3(v&m_O͈s(Q"qNX'#qF*WM0t8^DbL0E@id J݆bhۗɢ:us^Gg"ׯ_)S҈#O>Qщˉ:Ę<#`&&2ӌɢB ж/Ed rdxa@׳u(EdF0EGy3#<ЛN Cxsay #dF0YLvdѡQmތ`/:S@g'뜨s.e0Y8Lmӡ]1Yt|T7#>x0D ى:'K0Lm4EthWLj8͈@o85* zvΉ:R)+EdF0jʕtAQ_vi,"M9s?C7#αi7Aڵkʗ/Oj߀as׳uN90O0YHLm]e̘ܹ#"KҨ' xPٳgS˖-03V*Xzs3rtŋ.wNcǎUGAkbE|#0YLm]KتU+/H=|FE?${lΟ?Y dԂs d3Y[&W'Lm;EtHWӧo'N8c}Xӧg*:Ġ^dKpd3Y[&W'Lm;EtHWZ1cƤW^yj2Y_H_~N<)^*%N)GTvm `\/yѢEE¡ChxQ護ޢ 2D $"Çb R)gΜ /DǍ7h#GPرؙ3gURD\}`3q?.ƹw뻦MRYd/~7:qÅM$IPBŋŸoߦK/D۷o?>Ydb@ .>?>,YٳgRo)]d ?~\ [xq*[-^ ]SN-ř@@pX&3T,?EtPW]|={f5{dL/Ue%pA,dg׮] {MÇ1bǵd;!CϞ=OKpe]`ҩMSHz*UCQ+ v #Cɢ9Lbh kɻ^Ő3 1YtMADx ~)P@ƍjkD3"%yZjG J* 5kl y>,ʕK %dcPFeQy56^z]^g:ܖ ,hG&6鰮)-FYz tϢA-M6;wg&8&6ډɢ`:+w32Y^GYB 8 sR9JRYEKN&6mZq@Lq^Z+EHjbݻZӧ1B|d˖ͭ>Q & &І a9]DXkfH.0T%߯*~+ Uܹs0UB IDAT^+W-Zs}Q$fKFQ̤ ` )RDdD$Fd5kD^}}.I% j֭[+$Alܸʔ)CF4dTg:u*-_ܭͤIϼ%:tX#'YMׯ[ cNx"2v9ԉZATd[E$`Rpaq4#S[WIoT|xV2,I|jڴ[E>Axpe˖-R O'(Yʺg._,daÆTJyhΝnMΝ;] k˒-!,!;Q[E$%B(5k5Dٲe_#;ſ9~皙;ޙ3s>9=s̻ Xɔ)_R%VR<|(N ΁";+ ,fɢy -q Y4CZ*:Ycǎ4zh;o#ྎp6WɢdF0+FC0Yb'YDCUA#ȴmGP}!>EYJYT6l2\3@بº ")E/Q~'r%H*RxqjYBP(b,ӧO( 0D`ʔ)"~Pɨ0Y4vL M(Dɢw5>"P@mC84*<⚤IK"2>|8hOD tԉ LjՇEFRX\Q)HӢE ]EtS݃N nz$B}WFFD˃X(k92.|=# YgfJ+1Yqe,&wx&Y&@Y:W@W=zzdqƌ5O>~݁HΚ5˯h!11f{…D/~Qyxk-I;oN+,2z57GN8jWYe˖QΝ}i"ޮk]b=}U)K,U>Ӗ+iԨ5>C6Zn'"^v6 #ez͋ɢ+6(]oyl2{좀M!q%QO)epxmСCƌ`ՊYhu  RA@~iN, ^~qqE,/J ^SSWGھpAb$8q␛Ν;tY硇|8B޹sYe3fH?^k}`6yY,QثpfaC{9E#(l6Dp n׮]U39rt-ʪxY~A#=,zg"I,Z鏯n~n޼H4mڴO8"bpaՖ8< #&7n@7l@J2EC0kfr(%dr܊6L~LVihCVI#}7jYt}#N}Ce>VdVHo^J,mG6o,:jAEe˖Zu#P7nXV]@L҂*bլY3oYc…b^m2Z腿:(.4Clh^kD?+ll%sݻ 2=HLu!Q1`}ޱcH.(k֬" 0}Slmɢ-0F&Y҈2ZY?,!{v+YAr~ԩSEaCiI1|d,EM kUI$A\ 'k!Y!,zbeHArdsĉ*‡D=HuBb? 
KbHJiZ#)XϞ=}VSXa%:6YPZ19X]rI(@dk]pa_I'`)4ix\0nHFT0YthT pLز xk|(i e]dd[V& 7E(!I[l)ֲ`UnvTy@o@KMUZΐ$EI \A^V" k1J "dG\+W5jCiM)Didsd &LnyqF*Y H4SdY\C;OELO(vx8Ct(;i d]dd1ԞݏE dhBXݐ,"1 6Ld"EDqii(I ь0I7TX R$!MNEID6$tʕ?bc?5EdiVxYX >+W.d1~}4=4Ű!X<\ح;bXS8Y94.2YdhyAɢK ,hA$YD qL2 k?'ɢ2z nXA:PCEOIApԩ#Jk4Y42f$PBb*pE) $MT|>ydjժ/{26zdĉ"GH@931ȞEߩRŐ:s654ESp98k&.](UF="E&QC1YtpBь3DFR%YĿ_z%Zx ȄX@<½pׯիWOdEI$Jݐ9zn۶MHk" nH `d󄕲hѢ"iŕX֨~I2 Q;Ϥ_&MI{$"][vQ".3 7T% =)D, yI,zg,zgbES7E%iD<(81Y~#ʃYly2YtpuFkԨ!@M!~J@)P!("cR6#WDfL$YA& NEwZXe F9s&M>]X%U!믿.eƍ}S` nDF8Qq(!>PB):uۧ(Al}T)xAн{wzG|dш~FeM,&.2M+7ʿBt4.Qq˩Y>ټѣj2.}FHy&O#_g- /$eLvcW?x!>v~,Fc, aMDYsp]hU^NUt2EPdɔ&Lbv#'NSh]tU*U/V+KYWfDHmhմUbɒvG}]yOFnǎSg .MR |\yhu(.yr*ijߠ2L[οJ'ŨwV}nE۷o^K:E]۷Mkʝ=h_h7ߊ-x|%Kg6W%QqTzWo-?"O:P n;91e\OX%y^RxY5[/ҏ6ӕשlBRT{Jˢ dY4WزG8 N![W-؂E[` emԲeD=A@Ԟ={ MB E&Q@v щ% R#Q" aE0B8},>~J4jLW_1]Tq ʔ. "~"$Kį}+VfR$K*M-^VwkMQ}{ె$VCT=";Ƒ?\hP'm 7R',:BLfDɢ:>#F3ŨsV8{7jנ>u6/sgKS2eh?m: 5S|cQu͛}`ۻ'޺GYڵN_G%l Ca,1XY=[rEPyif ^t/R+JhO%6 mե̼1`hRiG3_Mv7JhtmȢѽd؁7P7СC^OP맟~2U_&oȐ!"s$S2y0L%LT&^2_AOn߾-2#N AIʲq$dKdb <4# L<,۟f-Y"FYkP*=-ArdL%ɯ"[Y]GtWbqlyA A\/["zҗ^x&ڻp|#-8f|I#Ͻ֒W՛۷t FIaEldI\"UӉg)W֬ۼ93%^ꪷ[dJԡ+7 ^H3=q')vA pS(o>A`X|[C@ЍesŋDb+]ZbV<5j 2 }#Va]ߡ$f+ou;DI=JTIj-[jv-b ^e?G&spdt.yP~}pMճ)"KAkDGrKh2S;&:4TaT{Ĵ4l4pɢԛU(Tbkw^mS& k$'~\IiR ^@@j蹡+L!vW^*UPT~٢V5D<4( DveJ[ Jٽ{^1R:_(-Ѻ@q4h@_} u?3xVD_lP4}dn"dQ`rDŽEAW8YmrI&Q֭EVT)My&m~/Aѵn}?&Yp@%рj/ʵk=r%UJ>{?{7SA0!ZF@7eQC2uw_fU"ӫvvjеqj4eY6O醪%JZhH.b^ %䱋,X_~m[3E%~lK#:E4oMDV$6|o2fE{ɢ70f͚ї_~Re߾}SO>;v%YDK)(aVR%gH~H^fe#dq9 ׯ\NO,xBU?>,?Q Jx!R E&,eYw"(Y4W؋0Y΢gUO*Q8^R/B\Aiℵ\d8Pf_K+j>"L T:}_Kp%g.\l_ ő=ћ뫈P,E(Wt]1)+;G$9xu?^SoҒ,]=ht0Yt#f2cp ,}zd0J2*LOn^Pd?It\ҎQ#y*Q/ͬ E&fKжn%F:dq@:uKTƌi̘1_Μ9iС믋x V5k׮~x…ȑ#iժU"2 YǍ70Jl޼Yw))3[E›"P`AuCUZCqaɄ|UBaՓhFHь <0oeo`ݻӂ *\Q˕+'ffcǎ5vz7(eʔf#F !F z .b„ rJAj۵k'0F͚5e ~xyY n_̛/(־,Ҳ7_̟#2nЏ`ܾWnD]~*?AE *,Jv Rʨ/gPϑ|G6\M2-TP?26G2a5mru3zh'SaG/fYbkt(l` 5VUXc=^gICI+JHENUYXe!]:SMDs#kԬW^+LmSW7I%SjI/L +<͘1,_mjln E(osRz1YdhtlF(Y~#@p{˄e   W|qjԩmŊi gI 4oWTM H].FtEZnr y26P̢$Fpܱc/_^ߋ iCd;[N|Ž# 5ĉ2f X¥U ¶FL{HMV Ax!x9s/ EV8} R߳}m3m\*ʖ)#﷍6JoɬYiC,Ibʚ1n&S$D):~ =1ʖ1n|;w"L)g̺ yЃQTq0S㏥܋`%OzDYԚFq#tR9]_Bdpn`}Յ@O!)R"pK]^iH,YPUSsk]&LJQTZep2A oqeD4o:H "bg`@@@x rD*"pAzҥEFΝ;'|"2B>/,5+<!CV/i[n>~G&+7lhF޽{U?.Qfp\pAX lXpBLִ[deZTG`0,~wUtj,7@nbʕGTH3nl p=Uh"pY̬$25JX!/]0YWxmٜ}޼yeh2V⯿*Sd{lժod^ @KI|`DN,*F|;d]%&BE^i@VAxDY2RzmNZۨ%L,w#YlҤ_޽{X=(3B ƛ!сk%#b/ k#b';ȁ)Ç UUIaAӧQʘI93j,7@MX-b0 py)fjōt4_$"^3fb!C +h=lE YD“)ϙ%|A"`,9 K%"!LIK_~-WTL !2<(E &)!B5$N۷W' EdWx` 7R֬Y#r3H:0 ϋș0k,Ux4h3z,?zh_-Y@*\gs#Y nxCMzL)߮%rHAxbΈ$|Jԩ EӨJ" r#ɱpJg(qপ# xƣD$/dQ[̚WEdX CJ˾ѩ9}hSnF B5 ^@SDHEoEsV$_L[d!<ȿ-G# YJFBxscTW%YSUNY #Ȣ~/0D›/IEe@̐Pk(ֲ=et͘,͆ dCܴ2CXŌ@@ 9X!8,-&%x^Ѻ**U j"S ʈb#5e=,ZQ5X檵d kd(-hXOz}`+wlzd.6>JCe gG2'LӦMoّo)À(ɢyGR+Hd | g=X/QA@j }q5 FuՖ΀:tnPp%p;R!p[GY^F3BQhf͘,Mm A0@"wXQ")LKOs8(m:2v!BϞ=be;PdWx!fJZjzR>"PhQF-ix !S=r Hށ`,7E.A+ d%'*%&qF:G>dBYBCYAiq`kٲ|(FuV>D毿˗/%$3Aj zuo $(q:n"Z@`G]ɓ'E)ȃ \$ʪc7,p *u5kx+jFp[_72`_|5 _PݾB Of͚yoqL!T$C^.0Y} &ɢЌ@#dћK &X*u]pӧO^6pvB,2Ytb"d^,Md4d|#0Y2OnDyFM wb0YdGbZɢeBF0Y)Kb-Lϟ/PL҉-%`͚5y"ٗSxw#AH\efܾpj?чEהɢЌ0Yt(PM4r3wE&w t+L\>pfDɢGв,yʖ-g56,2Y4[[`ʸAԍZf,!dl|QN6uы:wܙP[)(E&>L\C7EYLQF"{r%Bm 2}fX>P)Q'ьDJ?3:ihuZGɢu(-tۭ^EwܩZ"EkLtOn"/_$o&M0Cy}f8Z lf"9b zF,"W MPF1YG $^%nu q sYdnz؋c,>TD /l3#C+W.ڵkϟ2L d[ /@'OH"ԡCSjRǏ>gP>Ӣj<-ZΝ;Ge˖5j޽{>ڵh?y7oGAw-[ݺukJ, >\|߸qcQnݺ4x`Q^yxk0ҩ7N 6۷Ӆ ?ҥKiɒ%UTʕ+:jƘ!ChRJрgafc[Aɢm(Yun.ΡV5:Uꫯg~-:g||yJ&!,X#7E9H_t|?SA!;vhCY֭[9w a+Wm@Nxh?Cm*#a\ԑyI: u։ƌCof9?De)qy dfDlq…/Q;_Fx+VJ*bl7s+30Y4 @0 mE9ԪFJRO>xѱcy{QF`&F2έd?6S> 'N.]6mD&M"Ƞ $ 7@A .,^|E_-zdiӦ"2qĴj*Ed d ٨~H1cFA`+^8ݹsڶmKSL#zDy }+G ĉuuO&~ k!-r@\A8v&&`h,nP:yslX%x9%KӣGYpiT!rÊQah)Hg׉|7/?SDYCfYbE駟?F"FdIk@`I 
Lp=C*"٨~f͚ԥK"+ŋ:uj]ǂ  R$S&U9_SV-U0Ƀ1PLMM"`B';puuMd+;e˖d\,@R!&(* NpРATX1Cd.FY$]#3gg,o5vܹTvm$yc,{3dQtԩp%1.6n{H±-5"`B'vں:]eoul2ZXġCRnݼ}#o>?H?cڵd4d/p#YTJͥ{)VYZ5A@Fr)A& 2dIJFs,i)!&ђEXeK,)| QE3duqذa5.2ɢ?h]1Y47ENvuud+#KC8sdGr <"4 Q+d jq#Y2A *U*AbΜ9#Ȣ! @qpK'KLآ I(u6,(M @ 1 "7ʓh݀#J\TI3_$b8KwTdXſ6q֬YbCEXA pkƌ3X|7^hQ] nR$b&w ":ڵ2Ym+&&~XɎ.α,g1"#9r`;bs2٢( , pT\&k 4h@!>*3޺u+x/ہhuޝym r>3ڟ6/RXY G?$Y?~P nҤI}%EȜ&k33gE!YCOG|aF@22*IoE&bh0nB  (E9671\w>/"&LmDMdQ;-,bSâ' pD \9tYEόgʬ~(T_|DE|/"wZXG @̧ID@TL ڨ8\ (+Ԩߗ&d1f'Y_ErA,ڈɢӌV;w# IDATnCVE|5JXX dUY2PBzjU˗I $R+t.X~n/Ab|}{ KH4NLmDɢ`:,.GQ7+VxXSZVw&/G׳Ё$*=Z$ $-iJ(" ~f \Kʝ٭[ɢ[W4z/".seh#Lm]!s,b QW ui"R{)$ # `{ ʇ %IB+=+tuQP H\ Hc0N#?/윘;EQuH8w0.B CTeU\|#f_B(L|>9&H-Q<֍` ][ɢ\cqH`\E,T),o=+IfNJٲe#:=ݠY:۾p6vh+W *W|s &6"oqh`DC.AyB岺m#Z=_:[%@񥳕yFXEJ?o 6"EwCF+FpCz/+/L(+XqC UQ8ڬLm^<$m>w,^|Ľx22`c-#VE9h.2Yq/{q#x>fC92.ZѺdƽCsW@"Ň],M܋ʋ  ? 8V20cU6"EI]11cv<4q/+/L(+XqAU1ڬLm^<$駟hcڿpW5&MJE… S%z$IkE*/>G"Lx޽umJWˇѕCa_@(y'(y:3,[;lGU1LmRn<$MҥKԽ{wڻw/:t2e$/h\ŇC۷)QT8.5%JJ 0?}*]ݹuXjւN8rjUGCfT&v&"r!hjϧ6mйshԨQԩS'Q{@Fz 74iHA͋aN/^Wn9ܸAOGg7n2P:u)arw7"x!pb&:JJ"k۞-!WpgFehv!lJv Plfa@ΝiԨQ#9sTCdetr9tnVT8z20F\=}~;n_LeF$? Up[,ڸrHRԻ8V"(^WN/]u#&ZӃ?̻ppbQlsGzVEDx9IO' ϵE&6hPSA*UвeB5"uV*^mPf|xq_9}p>rm:%Mw#Y~59CʳsJ$")$$Y4zպdƝ!l*Gq-eX/#PHQVn  7:x/+Ǎ+"_ނƌ s&NR36D@ZHTYIj]dhs 6KRjՄJd0^FYQO8A7nt4P ^WNzu ^:Νm'Q>)MB R),oޟ%i:u0xI,ڸZN2|pڵ+::ar+bpҬr\U&\ # C F^`E[`blNPL dӇMdZr+w# onV w9z){LYgF`( (Ga؋V`h9>$,ڸܕ`alB#NLm|܍+`eU‹gh2YEs0Y4&^|aͽZ&\hU4,ZEɢqW9,Ë[{/>0Y^c݉Ew E5s ]p~Cxq_9}*x &]M/ޟɖEȱeF+FLŭF,zsDɢ;ū/d*rLmDb!d^ڛ0Y4nlY4_^,wmx&E1Y90Esxqko"Ň,zsDɢ;ׅ-]k! =\}1}j>(Q"mZ˗/G?띺mًNk/+A,_0YtylM,ZE-6"羮&NHgΜQ~OQݽ{lB6lk׮~衇(UTTbEzꩧLL#7 ^|q3Y|4e;2=ctpb-"ݹs/Ӿ}+#2>>i˖ׯ:jZe޻LÇ6"=xl&Vch#r*#իQر+S,^Fna0d12bdpQ,'߷VQah9&6"羮,ܹϟ<@ $q< ծ7LaG#Lz=2Y 9 D7**L"dFƍ>yJ25ј1cҥKO<5nܘӦM'No%r͘,Fk90d12bdpQ,'߷VQah9&6"wcǎ:3gLW\Ξ=+H Ldɒ0a z>}Zd5ϟɓD. i^Qp;/_> HA\r͛7N"HS@_[dY=r8p@}g. 
%$ c_EcnO"9̒E[i}aJ0!=#U/_Ɲ;")ϑ97|(U!Zj5:wG øhs]?!t(3⥦VO[breJOd~ܯ_~7o>ϔ1ʗ+_G(T@iΊS\r`+G}uP0K,ԴiS i:t)}7Yҽ{wו q~C[&w/+`3|3|UiVhD.K3Ѽѣ(w+ reJg ${ӊrEРo#d$q~j=uju,HV%IB_ =ӻ!z,;~ٻV[]{AM'H[l{&NsxJ*mC}ZGJŋWFS?`ޛf͠* ŋaeǘ?ptmwblxz͛çVfh9>$Mmzj"Ƞh~oeʔ5,wHrK65΁8) YC7o^A^ Dk׮*UO|926+'O2#1Rύ3YCE| "%SG/jcHdCź_~XGIAtFw0aThT $kȥ6˕9FW^dD998t8lďYU٣J2MZBM.YԻ^I; &~maTc;[&F;fM|ߌ\jd4d^U,ZEN:`SqõH +',wk׮UX?V%*U1H\k.?2%EVb8AK;QJ|btMdWnf",!X6 ԩ7.0shם~5kk#AۢV-J& aߺCKRZ/D _/]ke譻Vfh9>$MmzŜ9sROuի(zq.k֬".Q)&JCXT)\Yf$6JyGEvQ>޽{oQلk- mfΜ^ҥK>|25Jds*,5 >l.m9 G|02*@X 6l¯*?[r~CÖN !}9h,n* s ^e5Θӧk5K\d/?O;wQQ?΍(ճeT{,$QרQvePF uɢަ*'7Da5Պ6 t;l*AYj5:Yܳ`nfoޅ GIbZ"i|/?L %/XHWF$XD!F1԰zm,-Vch9>$Mmzj"H^ro٣BIpES7mOevŞ={JyW_Z%, H>ӡk֭[(e]!۶mŋ mݺu"6\m9/6Yxy?B,tW$JHAٌy+ͻvіw?dqߑ#TN=^2?P.֋SD;=s iʔSu,jc4('D)U_z$I6f)+&Fe wD ޲^l$.ɢ7*Ƌ#R,hU,ZEN:`SqõZچ `ZAIݚtJ5uTaE4*vE-Hg̘Rjժ~>BEdu;v_f!s̡]Ս8F񵻝s,v%/+A3dk`ғv S})o{f.^L{dq?Sͷ:ՖձQ~d?yE7حk4-N[0Y DE 2ɢ5knwT_}>p5^-5(ednN[udQ/s;]"FY Veh9>$Mmzj"zh\pp]( [ I2%]zUeitYT΢gΜQ}m,N6Ou9xJ/s,^ѢxŭrƐجjV"~6#$tUg%U&n1͐EiD\ )0( Jd={tu@Vg%EҤ"YҀ1QȢ^ e hx7u';DINLu*L"dFueYK*S^=zUG6TPEd*U&}w("YŋRE02Y P?[ZE=F#xzdDqل 4{2Y8&N3O~%6\VKæN,RЎTiA%BJ@?F/rTu4>"Z0KYL,:v4ϖ&JDL"dF#zoڴiCW #Z( {Rΰ,*k>jkF*گ_?n߾]Yl٨yA~ Xb Fn]W/ 2YDxakdK&|Ji~@& %$VQ j݋Lz]Q@ f JccSfZR*om#i_J*TXga*j};hJ=ۼ?-X'8Qf'SA~+PYQch~Mc9U;𣅋u!W_ -ߠ4e,"Jy}:n>L]KV)U,ZEN:`SqõvXQEE)T/hV8~xB-D-[ԫ-m4|pv~@gm֕s,-"w/+A3nrP S`u[m_<۽`ȜY<7!%K&/^}ȘG{ԗ3Hu jRyi4bP{zEeSNA>-ȉF-* IDAT}(c@$wMX5}#劣;i%~C/?*?^ʢ4ʍI`,?345V+LC]g2YuNⶇk=] N_.ld;X.\H;v02UXRaQ &LMX:4C{ jT}gҋ5 u^Ner\C4o|Ԣ=r,͋mޤ[liZ0v3D>FYD#MޣadLF/H_E1ߢ7QCՆɢig1YuN+d ҽRdZEI 4nܘr/-[͛7iĈ -,DYGHҤI^dѾ,=Bs,YDLG ]}8e\]Lz)Tk^ uhlՖ-;,lzbdnf_u"b5YcK1 /A'x.,/B?zW 8Ed*$GOZ;Ç;wc, iU,ZEN:`SYԺ\&NuOW%Prua]zo>{Rau-AC] TK+hiH.kGʖqZ6G'ϝ}ŹFp1d0UyYuzdwS5T("(?az2ڸkmq6/U+tplٯP>KgBYzr-0h0-\Əx# ZPjjjRy%IH;~ 1ʕU}v?orhΨbpWb|wExE\!l*^ 6. dѣtʘ1#Ν[E⻿̨ O<ʕKEN{|7KA}9,/Ȭ޽(= ʛ#_K۾g7>qRLI*Dҥ-O@(Xk+W(_ΜT')K :7y\d&xDBj#Nߏ^hr_sgϊx=[)w>琐kǟG)m8c*E jr2"+5g,T4_^?h,ڲ ҉V`h9&6"]1`h/nM0UHs1U'42>qE(0Y_|݋geh9&6"]1`h/nM0d;{M/ Jb+f@NEk&ѫ蝙{GS&]+/ޟd*rLmDb!d^ڛxaɢwښۨJ֋3 _`&`rVbh9&6"]1`h/nM0d[{SOVY͙,oMF^?[EɢU,ڈwCɢ97Eo5ڝ0Yt@+/ޟd*rLmDb!d^ڛxaɢ7kN,s],w]\Ӈd0pd]]@spyڔ\yZ,"F0YtzlM,ZEcлK3g^ƞv~CÙQ@>[.67 :`*L"18k,jܸ1m&:j] #,FZFt&n>$vFD,nzE.jqθY|2YsBYٺh W0Y9I=\;H,_dVE.ǖ6L~iCm׎-m?3Ymœ7L/"[Wի "-S.ZC΢͇mAܶ,&)"sɢMk֪Esrk{гӳ^,,p!鶇(=CY˗8z]3[]-X  sɢ UQҼys:uAM`:m+e*VrPܜpOs˗ȱǝQLMI=\k:uM4I_ٲeB ݽ{lB6lk׮tM*UXz)dѱ 7Lm@V)RPB2O/_/+ȑ#-[6cluspT\U܌p [>@3fFCZ0Yq|H ֭.vV5'Oɓ'*S Zn۷o P,Y2jժH3HϭdzYA;w,=JAHRv[o_D:GMD[]YnGbQl0Yq|H ^ `R$H@zUD"Y1b]z0> ;v{3E[F9n>,|J$LR\iӺiT4}Ku z U+jx_-&6.I=\[=ȢPwI[x@WCgmɢ_4_d|IIW"4i]}Xr*5O|ݸAw ф =='V|GML S6wq[3Yp7n{"x! F3f ]tIOLǏ3gΈ절?L6--Z%JZ ܹs̜p :98{,ݼy'k׊tP#E| E@4^# H7iJWPA߹nyy3IfϜe3;7w3IJ,Cٲe}CH?{ [lKagoli/TU^r #FXdaT;ԳgO"T#)J{6QF:PyϯsKXt^C9׭EYʖL*2*@,Fre׀խ\,w6!)du$k6' OĞ4kX{{)_|l@w9q%rP4sLuW|6mZjӦ%ɓ'-:UTkN< pTLѶg~eSȢ <B,n]3|VF޶FSJQĢʣ,#yT=+XiT{j9G,:m1"j{/Q㏉/m׮ӦKm{X2($ <B,np3~W˴b*IĔ< P J/>'h\=^tGy֯_sdCԩ7ޠ_K2P=|\ٳȑ#+e+V^3gΈef/!{͞1brx `0\3R b:C.a]t4g?/{vO{wU'R ŋpjyM5CڴTBʕ%ޫׯÇi˕+#SF*YΝ;߼u=J'ϜcOӹɒQ¹rQ۵k!}x=:5ϑj/O` Iڶ8A/]Q|yE[2O8 rFݷpU\'h\裏R.]<{XH:"(YĽ曞^a-yc8flN6_{6-ZPƌ|.]Dnߨ3Rg4ٮyԣ:A:*Ehbf-h)@^)oݡ~MjQ%Zc'Un6[oN=7+02(kҧ?}o9SGjr=W˵koP1chE~R,_>d`32-b1MI2/v޾F|r:tZ˖-'†Ӱp%_mܸ'LN2 I[y߶ kryUԨUF|]+-e|j9 ?Xlz IoxYܺj:I;f~ErZ%ܧJEUF"PyG}D,̩{8qFljX4 H|ӧ7-P4@b]N|OО޽{ě]Y|Fzl;;OP֭Ν;iɒ%9KeiuԫA-b1t\BbQND,NXN[%KOsWPzcVϒT>/-kuLn(^{t€cBu*O|6l矞!#t윊@sXdqԯ_?fN>]_4VZQt%+UDx;g隓? 
:)СͣZ> ;OVhڳ]5(Gw*σunuYŕ3>b[FiRk+lxatq:~ 7'S3ॣUZU/_؁R&KNϜcYr̢;?XaQڠөs|>72MfNrrdD{O>I'Μf9R #ZT$r4b1PU@z^BjNqE.͘1?n/HŊ󩊗rPs8|c(bQ{'/彛NѴgh>W b:C.!>bqٔ'kVP"9uhe IDATԈFvo5u9=KOF/Vޛ>C0s:[JgZ9Ԡ{OK9cFJ蚴DI(ޟ@/*j7h,с۲Œ{]9*@"g`"\ bPʓd4_Xq$DY͛yϢ݁,x9--[F۶mhk?Bw<,̩xTjU%{'G\ ḖrXj$zZV?nѴg3u@!<_U\9쩻aȮ#F҇3gY}ĵE3 EhG%J ;o*J>U{%GM{cM, A} +gbN8K=vW,S{<~o`Áp֯gІݻkh8;j۶Q߰ӳqVrVF{eC,ɔnvVΛC#| KXAQ2T$rw͛-^E>b*v og Fh5Ԝq7@beO'^d#k|_r|by+E4 y99qTՎ;ZG|.b ~H\S#FӞHUCzӈ-A,6bګ$͗k#&7I7o^4t>3dx͉9}~ͧ#?rTe)"lla+AL:6EM8w/jShޅ˗l~dzʟ=;-^k]kKҫXTR%o5oae׀ʱ49w]!\W$k>~apkسgO!$X٥%KRŊwuZ%Y4 6{9o(b1P]'1C l.\I:*pq%@}Mg+ <׹]B,:gWhEWz+44"+VR$MJvAe e>5}[e˩Y_kdyC,޺}R.CظSpjtK,6bګ$m{3x!g⍽g b //J|#G5ERj%;╗K 6I_vOttfGA6lbNEgT!C7@r U^x/Y89@Q^b=Y+W|]lsKKi'09M5s=e Hx;1A,8j!YI2b@kь#sF톃֭[}bYhٳvb1s#Xdm/>eΜ8`k4/ĉ-]&OU_XstK.f/UżI3f ]~ݓP3Ě T2!+g4{}P됝7I4ȫ׭2ԙFK+obueuFuNOIMر4i<$fXsWZ~%w'R"E<~qԭo\ZQx9'Ey,)IIRȃ^F]=,2/*DG=FO8A?س'OO:ĢH6s2fH2e, nvb_Zʖ-Kɓ'űGx{Zիb#3rdT^bɑ#yUg?^6aSE*σ[ b huꈎ)Sn%ɚ6C_~ vOF,9-/5nx,e8Ţ]TT{+JO|#LL)T-*O\+5p^K,lw\bQf=Xj V>Ň b|Fzb U5;t囬,8MT9ip˵֭Ix*\1F~jZ/ߴxgNڵ_mM/n#6}9͛S8z|Zy 8xгlт'J9uIT5o/⥭ƌw>*g@T bGZ'I^9NX D(fѷ @ *σb U,8tN9+>S ɚ57nޤG%tR7[\ʙ9-N9CGN=KiSٳQ̙-i |*O\<*νf Z*σb c>Ƣ ;n%Ljh.:gŵYnM.y'I^U6vj>P̙3&RCYɒ%^W=C,ƚ?%<Q>Z O\:eVצM}㆔$t4}RKR OȬ$ul8z-3b,F@yb1tkX aI4Og U}_,K/*O\;Dl `K@5{X=*σ[)b cI {o7lt5QYcgǝ =MUyTZ!F!=C,*bhF <B,n3X0oSw'ӀmU?Q'ŠpFf'A-b:zFllM( *PyX Bb:/a/Q(S2oBΈ7p\tnܺE)$ɓc6ePĢ$*O3f̠ƍӉ'([l{@ ^yھ};;v, bQa@# <@,%Abх$yaʗ/͝;ԩAcrh̙Jt bQa@# <؍o:/E笐S}ꏑ>IF7|,("E֭[(Q"6l+RYĢÀF(@@y_ϢsV @,">Ik׎kq_)bp%7xwN:EiӦUxTb>B,f$k׮%V*yh,F >NbQ:R T}z*eɒʔ)C-R(Z^mF<}UĢ2CD b $׭"CܮEu"b-a4inQ#fH$VTI>yb(# :̃ޘcx7ݸySK(1=ЃCK#ܐALYܸqL6r~8B 0|pݻ(a /P[))Jԓ.._}l-bq%ZجQ#f Өr[օ:6n/} Ə*eJ;7!#M\t$y СCuOK&@ ]Fׯ͛7%͗#`-ԏM A(Kу2ER'R%oXTkX{]okt$ӧ'NЅ z+:$qbլPYcXʔtt"J:OTn}͘NY3|Z]w*Jy-%y|aJ>4ﻕXֽ}*O7bi"{K4jl˘bˌ/)c97DI9!%ԩ'I *l:2hW 8Ŀ>"{>hbEx|4zk3F U"8kRMU[j];Ooj]e8g.u:L\cJjݖ-t=Ffy+?[.MW;wkoюzj@Hl^@:mLEv{TgӰ>VR,d,WY,Ƨ?F;߷mzHTH(_Q?N']BF^@n8GٛWWsܧDC5s7^JrCѭTI2P;l8qG-MYʕTzYnD*S'OBɚ̝#FuF5!쩇~yfx Y,uJTi,>sٳA[լ)]:tEX ?EY?3)ܦWz}Z};UiNeY:ӲM|KΣ!&Y0\H'HW>'I =lj?$:΃x~% ~uz4uj[[L8gz~=zlyOP,Y V΢g@:}KҤM_ro%&ի{DY矣st+?,oG\ʙ9zTy zVN$x;j%N\|>裏bCFd # ?C߭{:*=S|SDRݺmC*ر =Q&B5N`hȫ$)( \K/zt܋ _}xt *ުuT-[,6QR'ʓLRD۵'KipsOb aQ\gdW<Fg%lص^U@Z73q{ <ӿmkt&Thz\ۊ}#ԁ5kfٛQc9cǨ1mY O?n,U-[#J٣~gBslF¸.=Lϧޣ韌Db1|oq}%M=: SSoF)bb}rUC5c/> I+۬_FK?@hD^ohTVm ޯ<ę3q!γoK$Z^"k^w^SkVӹ];"Zo|*X5?@,J@@# V:΃x~jR /VBEPԲ6cRzuΘT|9cOUS&ӳOogohQ|nDCݲoFS~ݓ^,B+%|&ciQLŲMNFR[R#-GxpYSٟgng U4ǓhcA&=ҦN~.ӹEs?ڼ}o.sW;yJ³9j4`@6s%ûvqbM:cJWM="VL]Ģđq}%Mƍ7Ў;h׮]⏯!@ ɓ .LEJ*GJH6-ҝfͦt;(q %DQ E˫K,6m){;wn8qb};>D̙3tQ:r]vZlIFdɒDeu?[?D7/YEǻuڵk-x/©믿Sr?iTj>J"%z<-%z z$uŠpŝbQ"LXhT~nժ͙3+Fͣ35ƌCݺu&MDիWwMuoM^hN,YPfhԩ7tLLIM͛Ӵi~2gά&JWJCjĢтXEF 5,"5k5hЀ:vHǏ׫h >}'MݻAzA;w&@@WUV 6uBL[7UŻgbQA,J@@# V-G@ ,^jԨAΝܕln"ΫwQ7:;ĢQXEF .V IDAT%JH3gԧh)8tϟ-[F+W#$byu.Y1uEǨgX 9@  @,3 &Ν;СCi4Z 6n޼)=z4u#˫wQ'Ί@,:F8#b`FHbQQ僒Zñ!F9ԝթ UTݻ ggVf.E(&@,3Xxg`ρE+."ƉWQe.e2XAe@,J@@# ^+40s`FʡqUTٻh٠z!e$Khݻk@Q *XTqTۄk} - LQr b*U;wQѲAYB," ($@,3^xg`ρE+"&XE9Gd (&<a(ЈĢ>k} - LQr.bUTѻ:hٟz!%҄XEF ,\3Vhi`Ce_E9Gd ((ĢD( 4"`ZBK=f*PyU-ۓ]/ĢDa(ЈĢ>k} - LQr*bB*]Ts.B,J (&@,3Xx?V:u3gz2dʔo߮ະgu]Uϫ,Y2*\0[\r>T.Y] e3{ĢD( 4"`X5nܘf̘c?,upwI~L#GRD`ꎴ"Ϋ"soܸqG]V-$5kFSN 9@P9ĢDa(ЈĢ>ӧ+bi/BRLjj)Yj"U4ɓz>wQEZeZ?nwz$V@"`h ^!-=fëh' FNc͢1E8GY/ĢDN2U(@,*0kEE6سäeA4kG.XC4R̙3Ge0TabQ"`2U((X" vi{O?MK&ftݻwiϞ=t1w J0!OrIUVIZF_%^n:|۷M6e)R 'iN7o=~8}tYzꩧbŊiH&b^*b\)NX9\f! Q*r}mj߾=M2ŖɓipVZŋvډ`=סCz㼧QF{ĂABKI$#FP۶mizV{at1:Y5bQI>Xgr}\2ر#ʞŋSV-={+ΝKuuTg=z4uqXF{et1:Y6 (qt|$vEk @,3n|߿?;AXW͙3'rX+V,(KOdB.] 
uKRhAE3(bt|Ցs2^UC, M:>d@ z\_|g V2e~ğs Y,K8%O(ObѨ [Nm6K:'\ ( %Ԡ(~gSƌ5h9Vz=4i$^PM,yf1ch"qoҤ }~!NS|"Q\@,:a _E (EKpBƼ'G.&}vq${>,)M,r_KR֭ŏF2{K<ٓcڵkӴi,G,^pJ(at Ϛ5+(P@|Ģ{d̿Xꩮ/ĢD+XS88ݱcG?~E@J"j$/e%"Yfқ6? #yv^V,rT_r[ő|ۓQZ|Fs5IHO X>\oҟU Ƞ]N>b:8IGѼGh q 4hR^J3gWeFE;{y92d\3{VfĞijgZw9b)獏Xv९%;wR "0 趗K.FީH"LBv`}f73fXaװaC:pn,LS\Gw{)+gׯ"yx骷Q)CcfϘ`D(2*,X ~Rq(D %T(Ϝ9SVi /G̟??-[LDT-}rCn{f^#F^z=fhxt-^dѣRF${yi+{!ݔfϘdݱWcߵ>ĢD[XS&L( :TVi ͛7yG]!9켋&G4yNmL,ys=GkbJ.\xɨO SN4mڴGbb.xw+VիW[.שS8<aK^ung̿ѳ5,7wU!厷40UXMΊGs$Pݞy+U"mȑF6ݻ{aAlQT۶mKk׮ifz>~m|/55֭[EP%o!9Gy'_+u/_G^Ns.,mFӶPw b,JE0.J[ahBT@`E i7KٳgJ~iqnĢstrvZ'ȹ|2 $a# ?. (h %T(]ni MC$UД kb1dAe>Xh0a*\.Jh)yRG1 /= /A=~*TH4n֡Au]Mmڸ]y]bQ1B,JpQ<*!:uXgN|vZCjŋsK.yʚ;w.թS'rB x1'!bϯq[eCYŞ!%ĢD í>^}UK;F9rw3gq}ʀXtUu{vU $!rC@W6IW6?]bQ@,JpQ<*! X}:5j-QJ{@ bMbQ"LV [_~4'P",Yn)3o޼t!E[Tg;b|E1#Ñ#GZ e=ΝTrR bbQ"LV h߾}&xGĵk׮͛=3^x~Zbn{[nM)S;wPժUԩS{ʔ)C&M~'(iҤ*!P-ڳW1Th !C_~t]e 'Kƍ#햤U(hݺub^3RB(UTUVӧ)a„+#ׯOS=H!~b)bbQ"LV a|"/ Ӝ 只^͛dGڵkW̽|f$/,xu $Beo'{1[erCA@]앬Xbb/P\^c$oh_\b.9z sCfŞ!%ĢD íEC,U_yR8S֬YŞ˗|g8ۉA/?:txIT X L*UҥK=߿_x$IիWENk BJQS͚5믿vZtgE(EpPX_Z9e|DjԨAwt|_H%JeV{^l#FP=#4 +kڴ)رCxzSW>+~ ~w9^ɢ?rJb.䷹O,aس(?|܉q>y@b*b13<#>ںu%K\bEiazMÇ|dQF A .:t(fb~sXStyUBzj%E9#.{f'E,E3V:bxdXp}_E=2dQFͩK." EȑWx b3޹s$Ǹk;#2[kY{G۷I[{X6! U(UB,4ֶȶg XDr*N>O]t^'GJ,OBs8}\СAE,x.tQWFE$ՈzE⨧etgEϢD íBEF#sv_]pҧOoikF'OA{Xhna*\.J!U b[$ëev^E.xE]nxAq,Κ5$IBK.2#i\b1PWǁmL`өSDXH3ĢDkXStyUBh&F*;jY\^ENxE=nC+dĉԦM8 L2aKV֩S'qn*G)^j.xgIّtfϞ-tgE(Ep(k2L6o,[2'ƓđR׮]C#ԞD^Bno<@k.HGNR-ugq;CY >Y~AnjCݻwi \/ I|nl,W\I9;3cǎh N>fz%*V(N:9ugE6(Ep+MS@ {v*.F)ƹ+W֭['{ni.FhPb@W1m_mSN-Dw ډE_bw^@bd„ 9rDS?^H{ s 3b bQ"LV!H.F)f"C˗//\:w,do8w1B`o~-Ş!%ĢD í0B4M!htEeΜg1/5ⲍ (o!%T(]nm`{{Jk^xc['a '/Ģxot>7> (o!%T(]nm`{}ePH$暽x.B. e|CVbhw> (o!%T(]nm03%^FR;CEpeA̙3$N~7L> (ѸgQ"LV!;{h8 uM C,`E@@{Xh-a*\.4e؆E# @,{{̿"rA@{Xha*\.48`D# nE1\>ctgE(E=`cRn2~ =lӁSb 9bq4z7Qg3 NʅXtBaE4ϖ3gN*Q}w;'NP9hԨQ#8Fxٌ1;QߨG wEA,:F8#b`F^zt:p@,t}p19saϹsD\Šq 15t2CH 7ꃺb1(\qgXSᢆ Bw*J4 ۷/7n޼8s9D#!!fEP`UrXШx5ƣoX  %TgRƌiذaԫW/[ ɓ6lH&L[4»(@,j9l7C: Aot7B,JxQAvEEQhxi޽t)J$TDh<}4`DrEyz7RQOs G!%RXSfJB|rʞ=-FA^2tP2e a¢1y] 5ޑ-HG}Dj^.bQ"ME05(jٲeTJ#FP=4h5fGMҎ;L2~z7@Hb1pQ4aF@bQA,JIQ;w&M ˗jԨAJeRҤI5G7nuG_Ŀlm- <. Y. (!%Ԭݻѣ5k56\z̙ThQu0wqu]<u]bQA,JaQ|O'OnҰhr,tQlľZ  ʨ X!%ZĢD( @@@[  bQ"L  -Em pbȢ\@,4Zh+'ϢDXEhKbQۡCA@pbȢ\@,4Zh+XfT  8E !(' PBX'#ĢđXEhKbQۡCA@BbQA@,J@@%С  .X Y   EF mYha(m @,j;th8.X Y   EF mx#bJ@@'y  :$ĢJ   c}?X8a(m @,j;th8X@,J4E0Q :4@ e"rA@t"h  <bQ"L  -Em xe"rA@t"h  bDlb1"Q   4@2TdXtB y@@bb0 GbQ"L  -Em  E(&ЖĢCblb1\dQ.N u-@gQu@,J@@%С  blb1\dQ.N u-@Y @,F3*PĢCXlN(!@XF@BbQHC,J@@%С  `!( %DQ  XvpX @,,ЉĢN,JE0Q :4@Y @,,ЉĢN<ň`F%  XT|<pHPr b %u>[@,JiE0Q :4@, %ĢD( @@@[ pE  :XiVOE(&ЖĢC<pE  :XiVg1"6̨@@@q  `CPNA,:<  Nb1Gp E# (&ЖĢCĢDXEhKbQۡCA@b1\6.(@@@':   (: %DQ  Xvpg1\6.(@@@':  ,F #(NbQB@@!,CuI6E'@@ @, n!(q!%DQ  XvpXha(m @,j;th8@, EDbQB[A@?x%ZĢD( @@@[ , EDbQB[A@ňbD0 @,*>@h8$eA9:Xa@- %4ĢD( @@@[  bQ"L  -Em pԩSiڴi׬YP.7nܠovAv| M O;OӔ/_>lz޽oP[7QT:>RMLŋbOb4i#\ l L2E_߿? 45fc'N{uy~!c93ңGڹs0Z*mذ_kn7ߠE ptc^ Y   hBx zG}|V/^L5jԠsQ]s<fWtrTvm-_EW$:   J(AYd3gH 8qra>͛{gnZ C(l2\4υWD=O X@„ s4tPYvGYh_S,9һwoK5,si駟{!C񫺑X,~'ݼyS:zhڵk1<fWtƍ$I-_EW$:   >z뭷hO,?#,XO6{v[C ܫ>]bQ}[B A@$erݰ_ Їmn!(Ae3ĢVH%,Ӓ s"޽Kϟ'RNMHm BPtgEmJ@Yj￧SNy}r… >KJ+T Σ<{,>^z… =e͚.]*Q99۷i͚59Ct%=J+Wcǎ̔)ժUK-rcr=ni7Y{XtU  rLβ;R,  XM:?d+W.b-ćMjJD9xiܸqq[fM:)SNfnƇƅL@{X yQ.teQ,޹sG#dG8}YZb%JQX6{v[cFtgEX1k29"Ѷm[==Öc7L&L:ƧlyngWYCäŞ! 7 Ё.l2"guу,Xi"Ydiӊd,CVū"qb_$͛7{ͩJ*XmU&ytgEyc@@', hI4Tb1iҤtu'|9"[n4{lϿCƧlyngWYCäŞ! 7 Ё.l:E5 &5Sɒ%)O<đ]ݚfn[-Ş!b'8*⭷ޢbq4cƌ8)I$ԤIjذ!.]Ucʝu=Ν;E`#1#G%G,=u<| RtgEX2k29ӧ@qqyWޣɓƧlyng7Û>}:3aN J*bm^7G^O=C,ƚ?  
~ 29Bb>vY7Y4.Fj"wE۷/NRݺu-{ܣsٳmB,B/2b  ]&g٨8̘1c,^r ץK7n2bhȵkhٲeߊK.ESz),mbtЋ̽3bd(@@Y6Aܵk)RRU hڵQ=^N޽K7|#`y iӦQӦMe#T<ٳFرcl)RPƌ)W\Tvm|~W^ysڴiXeQ4tMq4 G2ٳ"/U?TX|A-8qܹCŋ,oț-[6}|W+\RxGfʔIu̒%ϛFbG(K@Y6zJ1`OU_~-C#-ҜB=:cѢETfMK&L̙3Eps/|jJyng7{ٓWի 2~C{\# H{ /q0#FP۶mE>w84>,sz*<)STQ K;tgŰ ? P.lvV{ΧX~idGN\xm"4i$bXpaJ6N̙Rhs;(O&s̖euTlYq"ز8p/͞_,Z`8[ E?pЧׯ|f F,S^`;{=x +i.>bw[Mj[tgPGhC@9@ׯ_OʕhޗW[,9r],|"ٳe͞۷oS„ -8"pҤIŞ?ڷoowwN~y/Q5[}ɒ%[qK@.+c}tAʛ7/=,`y9q-x|~a7o&&S*UhҥJ>{2=C,m  ]&p䈢̉RR~+Z3œ؃/óxI[DrNGv|zwԨQ4t8kԧOv*͞}ĞƯʳ/Z%Ho$*St9GsK]bQB;A@B&rGQ+͞_ޫףG,g+w" h b% ZE.WqHF'G{XH^3Զz೤x~ Pf,CexX* )]&gM&6{v[ٜ8şP9/C~aNb}GCmڴ)UPRHAB IF!׮]#ߊƑ ~"R.h7»5up,1be6a@@mLjSDT!6{vC׬Y#9p_N;w.= ͑Eʕ+ ƜuϜ,H.`}7x iӦ I{XA,+C*@@@erpng73fek:}ǖ|kvbɒ%֭[(Q"͛7>sG=^g8֮]ۓ=ɓ=0auR̙3Eps//bŞ!#`}03T . 7ʆ=b.]Gs͛>sɟ_ثg|رcE<租~wTlYqchvZ8y]b13C 29; b ) 1mۖ>#}Ŋ%g(Z:bH5jâ?ժU%K5b,T%Q̬|XM| y-Zʿ|!/-JLؽH>ǜVX!6뽙wӦMćpNR0b7'oٲE06$SHn>tK38zU\|v ڰa:'̗/vyf7ot .L|0]žE'ր< $MF[ng99g;"/!M64Ģ]oX~:uЬY9i-lC,ד7,3{K7n\0,ͩx}v%޿j/_xT,hHz FhիNv~m7?x֭~mRXtb  MLd!6{vSYQ0f䥜zGf$Op]nРA>CC, І{eB~Ow^c?8@w2"_g)0t82اOʖ->cs!O9矞5J*>ųWGнo0IDATk׮D,ߟyGPoo#bnQ(} G28Ie#{XD.s4n}͞_Dvjp~_3}, ^|YDSM FwFvÀ=r6GE T+~g^G~:>J >VW }o'ụ=+-x~lfn>Ƈ.XdSv:ivcGP2'a|ɒ%-s6mڈkNI7 sb~¿|!NS\b<7$^&;oY\RD2'^KV◽}˜z-"m/ge:,69ﯿJ)S"@ Ztio߾"y#oĉ>y#վPq=W.X,≽|g\Bs񈱀' g@B@;a(bB!)H)I217aDJB(ȟ9PJHI_]>{Y>9ֳ)ck2I#3{n?r͎9Rf?"QzCYbo۷o/k dF{|#495l4j8M%˗ ە+WB;&H@! z@]sɞgt e,۷uiM3l]oe <-Y=UDbw7ӧUHv)l qѣbƍZ;Jq}YvڦY~}g^u޾|Rvŵ^&q17XfŊ&H@! z@]sɞVd1Jyخ҇G<}KW^0vRFjIGF*'F]0(%53fZZ&ABPoӧg?dSRPcu ϓT+ǤGB !#r37o +$w !ڔ.Mq"-ӂsG 5dtш[6it0*(Rn8A1W2.0.)`l2i )0,!ygSœHMɱ\骊,J3?oDi-Xv`! j@sdK%9p |]}F-y 5j}2) Nl]t)ygx$u ,(_*z;"dm:)!={s)cY;אWKTD GO?"v y@D>}ڤk`d1 .!9s_>l;HN/uy,nڴxqYwHYJ3o7"IJ #zLd ǐFJkkGIɱ^ħ\Di>xyS #HtYF,栥c%9H7o>mв3#5O`uoȚF@ NzxɆ"527$l/^H"E5e9K ޽;]ZFvM;Y9D;ݏk:I]% Cћ\HòpBn(W1;v:I&s&rY1cFo߾HTc{R6y6sN4~ܷd>bn͚5ir5 N!\h!싽֎,"C*7SrwG1h08f8-5yZ}`N gVd쓡ㅀ69ٓEvVZi^mړ#ۡX>uVؑSdDR_F@ZyH5z6lZQ*0Qa5EKdlm5~\&_i+>A13'^G20"AZRhz;.?yU9߰]oXj#PkdݿC@`ܹM#hQ9>HM.-_@>Iȹsr.vd(]χ6Hf;ϨVOr 6&8n"٥ƙݹs'L%X B@tsJ/ÓE_gEbr h IRX,?E#RG11߿!?OTX ;|ȢT"'d̙aG|Ǐ meF#9l'~"ׯaXμ2__""?2k%\G]Мѻ:u}҄BCv,H:ꥤuw T7V?PE/ǥZZ\W;$j_'>;w fb ̛JM5z?'LZ7޻wo $ܸq#,< :F^"|Oh[:%ы~S蕫#xluOegׯ_իp^|&F|m,zBYhQ8τ0kӁ+j,YPM|5!ɔpWmwD~Zwn"בN|KT@߾&ܠTM_>"YZ2_ةvq2xdɍ  Z2^ /0bџS9, z#PJTɃH{DGݪ.K)Ef^a>xaxNdQ" O Afx"DDA:ˉ]7S7/hL ÏF&aXfEQv0KDԆ.Y%-eSWOdH j0_kտ8o޼09B)5:N7qD?؅ĿV 5s9n9.%$% o.vBɤN%`j;>0y5tvs I `R.+0`WM {Ǐb!ݪeD;0^@;VBzD{})W5w7Ea$%?v(sZjLZq?ҌRM'bN!PI)ZQD}w{62ZE>{ll,w96d]ATQq 3o^)`>vxP`͋6Aw+k˽?C;evZc=[+XGBB@~C-H ue{Z ADLH8koΘ,#VWc'ޚ;FCyСΪªF̪q+ZW+D3 ! @9u=^nͲ>?J4L륍; Ev.]I`F̤"Sk*Řu{kY)u~E+}44B@+d3aˋ+W6ձQ%鋤q2J:;` ]F*4ԳB xe:KZJ+{5?(>n! ?~<Ȅ@?#@4 z@_: gammapy/modeling/fitting * gammapy/utils/serialisation -> gammapy/modeling/serialisation * gammapy/spectrum/models -- gammapy/modeling/models/spectrum * gammapy/image/models -- gammapy/modeling/models/image * gammapy/cube/models -- gammapy/modeling/models/cube * gammapy/time/models -- gammapy/modeling/models/time Since it is very inconvenient for users to have to type sub-sub-sub-package names, we propose to expose the end-user API direction in gammapy.modeling for general classes (Parameter, Parameters, Model, Fit, Dataset). For the models, we propose to expose them in gammapy.modeling.models (same as Astropy), and to introduce a factory function to make it easy to make models that is easy to import and use. E.g. it could be gammapy.modeling.make_model and take a model specification string or dictionary. Developing the syntax and schema for this "model specification language" isn't in scope for this PIG. 
That development will be done as part of the effort on model serialisation in Gammapy in the coming months.

We propose to add "Source" to all (currently 3D) source models, and to add "Spectral", "Spatial" and "Time" for the spectral, spatial and time models. This avoids user confusion concerning the type of model, which currently exists in v0.13, because for some models (Gaussian, diffuse, table, constant) it isn't obvious what kind they are. The drawback is that this generates very long class names, such as e.g. SpectralExponentialCutoffPowerLaw3FGL.

The motivation for this change is weak, but in discussions at the Erlangen coding sprint most Gammapy developers preferred it. The reasoning was that it's a bit easier to find and maintain models if they are all in one place. There was a concern that this might introduce circular dependencies between gammapy.modeling.models and e.g. gammapy.spectrum, gammapy.image, gammapy.cube or gammapy.time. But in `GH 2290`_ it was shown that this is not the case; the proposed change works well.

There is still the open question whether models will keep this simple interface, or if we will add an evaluation caching layer, or whether eventually reduced IRFs will become models. So it is very much possible that there will be other changes concerning the Gammapy package structure in the future.

Dissolve gammapy.background
---------------------------

We propose to dissolve gammapy.background, and move the code like this:

* background_estimate.py, reflected.py, phase.py -> gammapy/spectrum
* ring.py -> gammapy/cube/background_ring.py

The motivation for this change is that currently map-related background estimation code is split into gammapy/background and gammapy/cube without a clear separation of what goes where. So the choice was either to move everything and dissolve gammapy.background, or to move all background-related code to gammapy.background. Looking at the package structure illustration above, it's clear that gammapy.background is problematic, because it is lower level than gammapy.spectrum and gammapy.maps, so moving background estimation classes from gammapy.cube down to gammapy.background would likely lead to circular dependencies, because they rely on the map dataset class in gammapy.cube.

The reflected region background code is not a great fit in gammapy/spectrum or anywhere in Gammapy except for gammapy.background, but moving it there does make sense if we say that all 1D spectrum code goes in gammapy.spectrum and all map-related analysis code goes in gammapy.cube. Scalar background statistics, i.e. for a given region and energy band, can be created via methods on the 1D and 3D dataset classes.

Dissolve gammapy.image
----------------------

The gammapy.image package was introduced early and contained a SkyImage class for 2D sky images, 2D image models, and some image analysis helper functions such as code to make a radial profile from an image. Much later gammapy.maps was introduced and we managed to support 2D and 3D maps with one container instead of separate gammapy.image.SkyImage and gammapy.cube.SkyCube. Now, if we move gammapy.image.models to gammapy.modeling as proposed above, and given that SkyImage was removed in 2018, there is very little functionality left in gammapy.image, and it's not clear what its scope is.
So we propose to clean this up, roughly like this:

* utils.py -> one helper function, move to gammapy/maps/utils.py
* asmooth.py -> move to gammapy.detect
* measure.py and profile.py -> move to gammapy.maps (and clean up)
* plotting.py -> move to gammapy/maps/plotting.py (remove illustrate_colormap and grayify_colormap)

Clean up gammapy.utils
----------------------

Above we propose to move gammapy.utils.fitting, the biggest piece of gammapy.utils, to gammapy.modeling. We still need and want to keep gammapy.utils as a lowest level layer in Gammapy where we write adapters and helpers for functionality e.g. from Numpy and Astropy. Mostly this is for usage within Gammapy; users very rarely need to import from gammapy.utils. Currently only energy_logspace, sqrt_space, sample_powerlaw and SphericalCircleSkyRegion are used from end-user documentation, i.e. very little.

We propose to clean up gammapy.utils, e.g. gammapy/utils/coordinates.py still contains coordinate helper functions that were added before astropy.coordinates existed. We do not detail the cleanup here; this will be discussed on a case-by-case basis. After that we will review what is left and decide which parts to mention in the public API docs and which to keep as purely internal code.

Outlook
=======

During discussion of this PIG, several other ideas surfaced and were considered. The following ideas could be considered and evaluated for feasibility (as done in `GH 2290`_ for gammapy.modeling) in the future.

* Generalising maps to allow maps without a WCS and spatial axes, and / or using it everywhere in Gammapy (for IRFs, spectra, lightcurves) could be nice. Possibly ALMA does this: all their data is in 4D containers with lon, lat, wavelength, time, and e.g. a 1D spectrum would have only one bin in lon, lat, time. It's not clear if and what changes this would imply for the Gammapy package structure.
* CTA IRFs and other CTA data level 3 classes might be maintained in ctapipe or somehow shared between Gammapy and ctapipe.
* gammapy.spectrum will likely become higher level than gammapy.cube, i.e. 1D spectral analysis will be based more on 3D maps analysis.
* gammapy.detect will likely grow map analysis and modeling based algorithms, i.e. depend on gammapy.cube and the rest of the Gammapy package.
* gammapy.astro could grow (merge in Naima) or be split out from Gammapy.
* gammapy.cube could be renamed to gammapy.map_analysis. gammapy.scripts could be renamed to gammapy.analysis. Or similar changes to better names could be made, depending on how the purpose and scope of sub-packages change.
* Datasets could be grouped in gammapy.datasets, if they are simple container objects (like models) that aren't tightly coupled with analysis code in gammapy.spectrum and gammapy.maps.

Decision
========

The PIG was extensively discussed at the Gammapy coding sprint in July 2019 and in `GH 2274`_. A final review announced on the Gammapy and CC mailing list did not yield any additional comments. Therefore the PIG was accepted on August 18, 2019.

.. _GH 2219: https://github.com/gammapy/gammapy/pull/2219
.. _GH 2274: https://github.com/gammapy/gammapy/pull/2274
.. _GH 2290: https://github.com/gammapy/gammapy/pull/2290
.. _gammapy-benchmarks: https://github.com/gammapy/gammapy-benchmarks

././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-018.rst0000644000175100001770000003730514721316200020454 0ustar00runnerdocker.. include:: ../../references.txt

.. _pig-018:
**********************
PIG 18 - Documentation
**********************

* Author: Christoph Deil, Axel Donath, José Enrique Ruiz
* Created: Oct 16, 2019
* Accepted: Nov 6, 2019
* Status: accepted
* Discussion: `GH 2463`_

Abstract
========

Over the past years the Gammapy package and documentation has grown organically; at the moment there's a lot of duplicated and missing content, especially for recently added functionality like datasets, the high level interface, and the recent restructuring of the Gammapy package. We propose to spend significant effort to reorganise and improve the Gammapy documentation in Nov and Dec 2019, roughly following the plan outlined here. Further discussion and planning will occur in GitHub issues and pull requests, and will be summarised on the `Documentation GitHub project`_ board.

Introduction
============

Gammapy started in 2013 and since then the package and documentation have continuously evolved (see `Gammapy changelog`_). The oldest version of the documentation that is still readily available online is for Gammapy v0.6 from April 2017 (https://docs.gammapy.org/0.6). The current version of the documentation is for Gammapy v0.14 from September 2019 (https://docs.gammapy.org/0.14).

In 2018, following other projects such as Astropy or Sunpy, we created a "project webpage" at https://gammapy.org which is not versioned and hand-written, in addition to https://docs.gammapy.org which is versioned and auto-generated by Sphinx. And we introduced a new setup for tutorials (written as Jupyter notebooks, integrated into the Sphinx documentation) and ``gammapy download`` as the way that users download versioned tutorial notebooks, example Python scripts and example datasets in a reproducible conda environment (see :ref:`pig-004`, `ADASS XVIII proceedings`_). Currently there are 19 tutorial notebooks, plus 7 listed as "extra topics". Among the notebooks there is a lot of duplicated content, but on the other hand there is also still a lot of missing documentation (e.g. recently implemented large changes in Gammapy such as :ref:`pig-012` and :ref:`pig-016` are not completely documented yet).

In addition to the Jupyter notebook tutorials, we have RST documentation pages for each Gammapy sub-package. In some cases there is a lot of content and examples (e.g. `maps`_ or `modeling`_), in other cases there is only a sentence or two and the API docs (e.g. `cube`_). The more technical documentation related to the API classes, methods and objects is auto-generated from the Python docstrings written in the code.

The tutorials usually have the following structure: introduction, setup, main content, and sometimes at the end a summary, exercises or links to other documentation. The sub-package RST pages usually have the following structure, following the Astropy docs: Introduction (overview description), Getting Started (first examples), Using (links to tutorials and sometimes sub-pages), API reference.
We will not discuss how other projects structure their documentation, but we did look at a small list of projects and think it's useful to compare and contrast to figure out a good documentation structure for Gammapy:

- https://www.tensorflow.org/tutorials/
- https://scikit-learn.org/stable/documentation.html
- https://jupyter.org/ and https://jupyterlab.readthedocs.io
- https://www.astropy.org/ and https://docs.astropy.org
- https://sunpy.org/ and https://docs.sunpy.org
- http://cxc.harvard.edu/sherpa/ and https://sherpa.readthedocs.io
- https://fermi.gsfc.nasa.gov/ssc/data/analysis/
- https://fermipy.readthedocs.io
- http://cta.irap.omp.eu/ctools
- https://ctapipe.readthedocs.io/en/latest/
- https://www.djangoproject.com/ and https://docs.djangoproject.com

Generally one has to be aware that Gammapy is both a flexible and extensible library with building blocks that experts can use to implement advanced analyses, and a science tool package for IACTs (CTA, H.E.S.S.) with most analysis use cases pre-implemented, that users just need to configure and run. For some of the examples considered that's also the case (e.g. JupyterLab); some others (e.g. scikit-learn or Astropy) are just a library, and thus their documentation is partly different.

Proposal
========

Guidelines and specific actions
-------------------------------

We propose to undertake a minor general restructure of the **Getting started** section described below, mostly keeping the existing Gammapy documentation setup (e.g. to maintain part of the documentation in RST pages and another part in Jupyter notebooks), though we admit that there is no clear separation between the content of both. We will take the following items as guidelines and actions to improve the documentation:

- More content should be moved to Jupyter notebooks (e.g. currently the RST pages for maps, modeling, catalog, detect, etc. have a few code examples). Those should be moved to corresponding notebooks ``maps.ipynb``, ``modeling.ipynb``, ``catalog.ipynb`` and ``detect.ipynb``, since in many cases there would be a hands-on tutorial introduction for each sub-package. More cross-links between IPYNB, RST and API docs should be created.
- Sub-package RST pages will be kept short, with links to relevant hands-on tutorials or Python scripts at the top, and the API docs at the bottom. Some pages have significant content in between which is not related to code examples (e.g. for maps, modeling or IRFs there is a description of the design).
- When possible the notebooks should use the high level interface everywhere it makes sense (e.g. automatic data reduction), and the lower level API at the end for the very specific use case proposed, trying to have shorter notebooks going to the point.
- Add a Gammapy overview page to the RST docs, where the general data analysis concepts are explained (DL3, Datasets, Model, Fitting). This page would be similar to the description of Gammapy in the paper that we also plan to write now, and the same figures would be used for both.
- Add a HowTo RST page with a list of short specific needs linking to subsections of notebooks exposing the solution.
- Add a few examples of how to use Gammapy with Python scripts, and provide these scripts with ``gammapy download``.
- Extend the `Glossary`_ present in the References section with some non-obvious but common terms used throughout the documentation and tutorials.
- Some effort will be put into revising the completeness and consistency of the API docstrings.
Getting started section restructuring
-------------------------------------

Gammapy overview
~~~~~~~~~~~~~~~~

We suggest to add an overview page at the beginning of the section. That's a ten minute read and non-hands-on introduction to Gammapy, explaining the details of data analysis and giving an overview of concepts such as Datasets, Fit, Models etc. and how those play together in a Gammapy analysis. People could skip this section and go directly to the installation or hands-on tutorial notebooks, and come back to that page later if they prefer.

Installation
~~~~~~~~~~~~

Keep as it is.

Getting started
~~~~~~~~~~~~~~~

Keep as it is.

Tutorials
~~~~~~~~~

The tutorials will be reorganised in the following groups (items) and individual notebooks (sub-items).

- First analysis

  + Config-driven 1D and 3D analysis of Crab (evolution of the current ``hess.ipynb``, which could be renamed)
  + Extended source analysis, also showing the lower level API with customisation options for background modeling (to be implemented)

The group below will be a "starting page" for people from CTA, HESS and Fermi, and possibly other instruments in the future. We could remove https://gammapy.org/cta.html (very little, and outdated, info), since it is better to have one starting page for CTA users instead of two.

- What data can I analyse?

  + Observations and Datasets (to be implemented)
  + CTA, mention prod3 and DC1, show what the IRFs look like (``cta_data_analysis.ipynb``)
  + HESS, mention DR1, show what the IRFs look like (to be implemented)
  + Fermi-LAT, show how to create map and spectrum dataset using 3FHL example (``fermi_lat.ipynb``)

- What analyses can I do?

  + IACT data selection and grouping (to be implemented)
  + IACT 3D cube analysis (data reduction and fitting)
  + IACT 2D image analysis (``image_analysis.ipynb``)
  + IACT 1D spectrum analysis (data reduction and fitting)
  + IACT light curve computation (``light_curve.ipynb``)
  + Flux point fitting (``sed_fitting_gammacat_fermi.ipynb``)
  + Binned simulation 1D / 3D (``spectrum_simulation.ipynb`` and ``simulate_3d.ipynb`` combined)
  + Binned sensitivity computation (``cta_sensistivity.ipynb``)
  + Pulsar analysis (``pulsar_analysis.ipynb``)
  + Naima model fitting (to be implemented)
  + Joint Crab analysis using Fermi + HESS + some flux points (to be implemented)

For many Gammapy sub-packages, we plan to have a corresponding notebook that is a hands-on, executable tutorial introduction that complements the description and API docs on the sub-package RST page. These notebooks are listed in the group below.

- Gammapy package

  + Overview (short section with one example for each sub-package, hands-on, an evolution of the current ``getting_started.ipynb``)
  + Maps (``gammapy.maps``) (``maps.ipynb``)
  + Models (``gammapy.modeling.models``) (``models.ipynb``)
  + Modeling (``gammapy.modeling``) (to be implemented)
  + Statistics (``gammapy.stats``) (to be implemented, explains about likelihood, TS values, significance, ...)
  + Source detection (``gammapy.detect``) (``detect_ts.ipynb``, ``cwt.ipynb``)
  + Source catalogs (``gammapy.catalog``) (to be implemented)

- **Scripts**

This group will contain a few examples on how to use Gammapy from Python scripts (i.e. make a CTA 1DC survey counts map or some other long-running analysis or simulations). The Python scripts could be provided as links and also in ``gammapy download``, as is the case with the notebooks.
- Extra topics

  + MCMC sampling (``mcmc_sampling.ipynb``)
  + Dark matter models (``gammapy.astro.darkmatter``) (``astro_dark_matter.ipynb``)
  + Dark matter analysis (to be implemented)
  + Light curve simulation (to be implemented)
  + Source population modelling (``gammapy.astro.population``) (``source_population_model.ipynb``)
  + Background model making (``background_model.ipynb``)
  + Sherpa for Gammapy users (``image_fitting_with_sherpa.ipynb``, ``spectrum_fitting_with_sherpa.ipynb``)?
  + HESS Galactic Plane Survey data (``hgps.ipynb``)

More specialised notebooks, and in some cases of lower quality.

- **Basics**

Leave the basics section at the end of the tutorials page, pretty much as-is.

How To
~~~~~~

We suggest to add a HowTo RST file with short entries explaining how to do something specific in Gammapy. Probably each HOWTO should be a short section with just 1-2 sentences and links to tutorials, specific sections of the tutorials, or the API docs; if some should be small mini-tutorials with code snippets, they could possibly go on sub-pages. The `HowTo documentation page`_ shows a preliminary version of the content of this page.

Reference
~~~~~~~~~

Keep as it is and extend the Glossary.

Changelog
~~~~~~~~~

Keep as it is.

Alternatives
============

We could try to change to purely Jupyter-notebook-maintained documentation (e.g. the "Python Data Science Handbook" is written just as Jupyter notebooks). Or we could change the documentation system and write all documentation as RST or MD, and then have a documentation processor that auto-generates notebooks. E.g. `Jupytext`_ does this, and partly e.g. the `scikit-learn`_ docs do that for their tutorials: they maintain them in Python scripts and RST files.

There are a lot of ways to structure the documentation, or to put a different focus.

Outlook
=======

This is a short-term proposal, to quickly improve the Gammapy documentation within the next 1-2 months, with the limited contributors we already have.

In early 2020, we should run a Gammapy user survey and gather feedback on the Gammapy package and documentation. Examples of previous user surveys exist, e.g. from CTA 1DC or the `Scipy documentation user survey`_, that we can use as a reference for how to get useful feedback.

We should also try to attract or hire contributors to Gammapy that have a strong interest in documentation. One concrete idea could be to participate in `Google season of docs`_, to get a junior technical writer for a few months, if someone from the Gammapy team has time to work on and mentor the project.

Another thing to keep in mind is that we should work towards a setup and structure for the Gammapy package that supports CTA as well as possible, and that makes it easy for CTAO to choose and adapt Gammapy as prototype of the CTA science tools and evolve and maintain it. This PIG doesn't propose a solution for this; that's for later.

Implementation
==============

Implementing this PIG is a lot of work, roughly 2 months of full-time work. We suggest that, after the PIG is accepted, one coordinator spends a few days on quick additions / removals / renames / rearrangements, so that the structure of the RST and IPYNB files we want is in place. For this we propose that the coordinator fills the `Documentation GitHub project`_ with a list of 20-30 tasks that should be done (each 1-2 days of work, not longer) and asks for help. Each task is usually to edit one notebook or RST page, and needs one author and one reviewer.
It is then up to those two people to coordinate their work: they can open a GitHub issue to discuss, or they can just do a phone call or meet. Eventually there is a pull request, and when it's in a state where both are happy, it is merged in. Whether to use "notes" or "issues" for each task will be discussed during the development of the PIG and will be basically up to the documentation coordinator.

Decision
========

The PIG was extensively discussed and received a lot of feedback on `GH 2463`_. The main suggestions received were incorporated. There was some controversy e.g. whether we should have more shorter pages and notebooks, or fewer longer ones. This PIG was accepted on Nov 6, 2019, although we'd like to note that the outline and changes described above aren't set in stone; we expect the documentation to evolve and improve in an interactive fashion over the coming weeks, but also in 2020 and after.

.. _GH 172: https://github.com/gammapy/gammapy/issues/172
.. _GH 241: https://github.com/gammapy/gammapy/issues/241
.. _GH 242: https://github.com/gammapy/gammapy/issues/242
.. _GH 1540: https://github.com/gammapy/gammapy/issues/1540
.. _GH 1577: https://github.com/gammapy/gammapy/issues/1577
.. _GH 1597: https://github.com/gammapy/gammapy/issues/1597
.. _GH 690: https://github.com/gammapy/gammapy/issues/690
.. _GH 2164: https://github.com/gammapy/gammapy/issues/2164
.. _GH 2175: https://github.com/gammapy/gammapy/issues/2175
.. _GH 2221: https://github.com/gammapy/gammapy/issues/2221
.. _GH 2463: https://github.com/gammapy/gammapy/pull/2463
.. _80-20 rule: https://en.wikipedia.org/wiki/Pareto_principle
.. _Scipy documentation user survey: https://forms.gle/eK3x2ohJs1sLPJEk8
.. _Google season of docs: https://developers.google.com/season-of-docs/
.. _documentation principles: https://www.writethedocs.org/guide/writing/docs-principles/
.. _Gammapy changelog: https://docs.gammapy.org/0.19/changelog.html
.. _maps: https://docs.gammapy.org/0.14/maps/index.html
.. _modeling: https://docs.gammapy.org/0.14/modeling/index.html
.. _cube: https://docs.gammapy.org/0.14/cube/index.html
.. _ADASS XVIII proceedings: http://www.aspbooks.org/publications/523/357.pdf
.. _Glossary: https://docs.gammapy.org/0.14/references.html#glossary
.. _Jupytext: https://jupytext.readthedocs.io
.. _Documentation GitHub project: https://github.com/gammapy/gammapy/projects/1
.. _HowTo documentation page: https://docs.gammapy.org/dev/development/dev_howto.html

././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/development/pigs/pig-019.rst0000644000175100001770000001730214721316200020450 0ustar00runnerdocker.. include:: ../../references.txt

.. _pig-019:

********************************************
PIG 19 - Gammapy package structure follow up
********************************************

* Author: Axel Donath, Régis Terrier
* Created: Jan 16, 2020
* Accepted: Feb 24, 2020
* Status: accepted
* Discussion: `GH 2720`_

Abstract
========

Following the proposal in :ref:`pig-016` the subpackages ``gammapy.image`` and ``gammapy.background`` were removed and a new ``gammapy.modeling`` sub-package was introduced. These changes followed the general idea of giving up the separation between 1D, 2D and 3D analysis use-cases and instead structuring the package by concept. In this sense ``gammapy.modeling`` now contains all models, including a base class (``Model``), a container class (``SkyModels``), a registry object ``MODELS`` and many sub-classes.
A previous step in this direction was made by introducing the ``gammapy.maps`` sub-package, which resolved the distinction between the 2D and 3D data structures and replaced it with a general ND map structure. The outlook in :ref:`pig-016` already proposed the possibility to make further changes to the package structure leading in the same direction. This PIG now proposes these changes in more detail, especially introducing new ``gammapy.datasets``, ``gammapy.makers`` and ``gammapy.estimators`` sub-packages. This restructuring of the package will also reflect the general API and data analysis workflow in a better way.

Proposal
========

Introduce gammapy.datasets
--------------------------

The ``Dataset`` abstraction is now the central data container for any likelihood based analysis in Gammapy. It includes a base class (``Dataset``), a container class (``Datasets``) and many specific implementations such as the ``MapDataset``, ``SpectrumDataset`` or ``FluxPointsDataset``. Currently the different sub-classes are implemented in different sub-packages, ``gammapy.cube`` and ``gammapy.spectrum``, while the base and container class are defined in ``gammapy.modeling``, where a dataset registry is defined as well. This scattered implementation of the sub-classes seems non-intuitive and leads to practical problems such as circular imports. Currently the dataset registry (defined in ``gammapy.modeling.serialize``) imports from ``gammapy.spectrum`` and ``gammapy.cube``, while the latter sub-modules import the ``Dataset`` base class from ``gammapy.modeling``. Another minor problem occurs in documentation, where there is no obvious place to document the concept of a dataset.

For this reason we propose to introduce a ``gammapy.datasets`` sub-package and move the base class, the container class, the registry, existing implementations of sub-classes and I/O related functionality there.

Introduce gammapy.makers
------------------------

In the past year we introduced a configurable and flexible ``Maker`` system for data reduction in Gammapy, currently implemented in ``gammapy.spectrum`` and ``gammapy.cube``. A common ``SafeMaskMaker`` is implemented in ``gammapy.cube`` but used for spectral data reduction as well. The documentation is currently split between ``gammapy.cube`` and ``gammapy.spectrum`` and partly duplicated in multiple tutorials. While already being in use, the system is not yet fully developed. A common ``Maker`` base class, a container class such as ``Makers`` and a ``MAKERS`` registry are still missing.

In analogy to the datasets we propose to introduce a ``gammapy.makers`` sub-package as a common namespace for all data-reduction related functionality. This again gives an obvious place to define a common API using base classes, and to define a registry and a container class as well. All existing ``Maker`` classes should be moved there. Later a common base class, a registry and a container class, as well as common configuration and serialisation handling, can be added there.

Introduce gammapy.estimators
----------------------------

Gammapy already defines a number of ``Estimator`` classes. In general an estimator can be seen as a higher level maker class, taking a dataset on input and computing derived outputs from it, mostly based on likelihood fits. This includes spectral points, lightcurves, significance maps etc. Again those classes are split among multiple sub-packages, such as ``gammapy.spectrum``, ``gammapy.time``, ``gammapy.detect`` and ``gammapy.maps``.
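To make the proposed change concrete, here is a sketch of how the estimator imports would change; the "current" locations follow the sub-packages listed above (as of the time of this PIG), and the exact selection of classes is illustrative:

.. code::

    # current, scattered locations
    from gammapy.spectrum import FluxPointsEstimator
    from gammapy.time import LightCurveEstimator
    from gammapy.detect import TSMapEstimator

    # proposed common namespace
    from gammapy.estimators import (
        FluxPointsEstimator,
        LightCurveEstimator,
        TSMapEstimator,
    )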
The existing estimator classes can be mainly divided into two groups:

- Flux estimators such as the ``LightCurveEstimator`` and ``FluxPointsEstimator``, where the output is a table like object.
- Map estimators such as the ``TSMapEstimator`` or the ``ASmoothEstimator``, where the output is a ``Map`` or dict of ``Map`` objects.

No clear API definition via a common base class, nor a registry for such estimators exists. Already now there is code duplication between the ``LightCurveEstimator`` and ``FluxPointsEstimator``, which could be resolved by a common ``FluxEstimator`` base class (see `GH 2555`_). For this reason we propose to introduce a ``gammapy.estimators`` sub-package to group all existing estimator classes into the same namespace. This provides the possibility in future to define a common API via base classes, introducing general higher level estimators and a common configuration system.

Introduce gammapy.visualization
-------------------------------

To group plotting and data visualisation related functionality in a single namespace we propose to introduce ``gammapy.visualization``. The model of a separate sub-package for plotting was adopted by many other large astronomical python packages, such as `astropy`_, `sunpy`_, `ctapipe`_ and `sherpa`_. So far the plotting functionality in Gammapy is limited and mostly attached to data objects such as datasets, maps and irfs. Introducing ``gammapy.visualization`` would be a possibility to maintain plotting related code independently and move existing functionality such as ``MapPanelPlotter`` or ``colormap_hess`` etc. to a common namespace.

Resolve gammapy.detect
----------------------

If we introduce ``gammapy.estimators``, and the ``TSMapEstimator``, ``LiMaMapEstimator``, ``ASmoothMapEstimator`` and ``KernelBackgroundMapEstimator`` have been moved, the ``gammapy.detect`` package can be resolved.

Minor changes
-------------

In addition we propose to move the ``PSFKernel``, ``EDispMap`` and ``PSFMap`` classes to ``gammapy.irf``. Those classes represent a "reduced" IRF and are therefore similar to the existing ``EnergyDependentTablePSF`` or ``TablePSF`` classes.

Alternatives
------------

An alternative approach could be to introduce a ``gammapy.core`` sub-package that contains all base classes, container classes and registries. For each sub-package a ``subpackage/plotting.py`` file could be introduced where plotting related functionality lives.

Outlook
=======

Once the functionality on data reduction, datasets and estimators has been moved to the corresponding sub-packages, the ``gammapy.cube`` and ``gammapy.spectrum`` packages can possibly be resolved.

Decision
========

This PIG received only little feedback, but there was general agreement in `GH 2720`_ on the introduction of the ``gammapy.datasets`` and ``gammapy.makers`` sub-packages as well as the location of IRF maps. Those changes will be introduced first. There was some concern about the ``gammapy.estimators`` sub-package. For this reason its implementation will be delayed and can be re-discussed at the next Gammapy coding sprint in person. A final review announced on the Gammapy and CC mailing list did not yield any additional comments. Therefore the PIG was accepted on Feb 24, 2020.

.. _GH 2720: https://github.com/gammapy/gammapy/pull/2720
.. _GH 2555: https://github.com/gammapy/gammapy/issues/2555
.. _ctapipe: https://github.com/cta-observatory/ctapipe
.. _sunpy: https://github.com/sunpy/sunpy
gammapy-1.3/docs/development/pigs/pig-020.rst

.. include:: ../../references.txt

.. _pig-020:

*************************
PIG 20 - Global Model API
*************************

* Author: Axel Donath, Régis Terrier and Quentin Remy
* Created: Jun 10, 2020
* Withdrawn: Apr 26th, 2021
* Status: withdrawn
* Discussion: `GH 2942`_

Abstract
========

Gammapy already supports joint-likelihood analyses, where individual, statistically independent datasets are represented by a `Dataset` object. In a typical analysis scenario there are components in the model that are only fitted to one of the datasets, while other model components are shared between all datasets. This PIG proposes the introduction of a global model API for Gammapy, which handles all model components involved in an analysis in a single global models object, to resolve the scattered model definition of the current implementation. We consider the global model API a key solution for future support of distributed computing in Gammapy.

Proposal
========

Global Model Handling
---------------------

Currently different model components are handled in Gammapy by having a different selection of models in the ``Dataset.models`` attributes, pointing to the same instance of a model if the component is shared between multiple datasets. This works as long as all objects reside in the same memory space. If datasets are distributed to different processes in future, it is technically complex and probably inefficient to share model states between all sub-processes. It is conceptually simpler if processes communicate with a single server process that contains the single global model object. The fundamental difference to the current design is that the model objects defined in ``Dataset.models`` can represent copies of the global model object components. To enforce the use of the ``.set_models()`` API, we propose to hide the ``dataset.models`` attribute.

.. code::

    from gammapy.modeling.models import Models
    from gammapy.datasets import Datasets

    models = Models.read("my-models.yaml")
    datasets = Datasets.read("my-datasets.yaml")

    # the .set_models call distributes the model components to the datasets
    datasets.set_models(models)

    # this initialises the model evaluators
    dataset.set_models(models)

    # and to update parameters during fitting, a manual parameter modification by the user
    # requires an update as well; maybe we can "detect" parameter changes automatically by
    # caching the last parameter state?
    datasets.set_models_parameters(models.parameters)

It also requires adapting our fitting API to handle the model separately:

.. code::

    from gammapy.modeling import Fit

    fit = Fit(datasets)
    result = fit.optimize(models)

    # or for estimators
    fpe = FluxPointsEstimator(source="my_source")
    fpe.run(datasets, models)

The public model attribute allows creating a global model on data reduction like so:

.. code::

    models = Models()

    for obs in observations:
        # dataset is assumed to be produced by the data reduction for this observation
        dataset, bkg_model = bkg_maker.run(dataset)
        dataset.write(f"dataset-{obs.obs_id}.fits.gz")
        models.append(bkg_model)

    models.write("my-model.yaml")

Interaction Between Models and Dataset Objects
----------------------------------------------

The ``MapDataset`` object features methods such as ``.to_spectrum_dataset()``, ``.to_image()``, ``.stack()`` and ``.copy()``.
It is convenient for the user if those methods modify the models contained in the dataset as well. In particular this is useful for the background model. We propose a uniform scheme for how the dataset methods interact with the model. We propose that in general datasets can modify their own models, i.e. copies contained in ``DatasetModels``, but never interact "bottom to top" with the global ``Models`` object. So the global model object needs to be re-defined or updated explicitly. The proposed behaviour is as follows:

- ``Dataset.copy()`` copies the dataset and model; if a new name is specified for the dataset, the ``Model.dataset_names`` are adapted.
- ``Dataset.stack()`` stacks the model components by concatenating the model lists. The background model is stacked in place.
- ``.to_image()`` sums up the background model component and ``TemplateSpatialModel`` if it defines an energy axis.
- ``.to_spectrum_dataset()`` creates a fixed ``BackgroundModel`` by summing up the data in the same region. Further suggestions? Check which model contributes npred to the region? In this case we drop the model from the dataset.

Background Model Handling
-------------------------

We also propose to extend the ``BackgroundModel`` to include a spectral model component like so:

.. code::

    from gammapy.modeling.models import BackgroundIRFModel, PowerLawNormSpectralModel

    norm = PowerLawNormSpectralModel(norm=1, tilt=0.1)

    bkg_model = BackgroundIRFModel(
        spectral_model=norm, dataset_name="my-dataset"
    )
    bkg_model.evaluate(map=map)

After introduction of the global model we propose to remove ``MapDataset.background_model`` and use ``MapDataset.models["dataset-name-bkg"]`` instead. Introduce a naming convention? The background data can be stored either in the ``BackgroundModel`` class or in the ``MapDataset`` object as an IRF. This has implications for the serialisation and memory management once we introduce distributed computing. In one case the data is stored in the server process, in the other case it is stored on the sub-process. To support spectral background models we propose to support ``RegionGeom`` in the ``BackgroundModel`` class.

Decision
========

The authors decided to withdraw the PIG. Most of the proposed changes have been discussed and implemented independently in small contributions and discussions with the Gammapy developer team.

.. _GH 2942: https://github.com/gammapy/gammapy/pull/2942
.. _gammapy: https://github.com/gammapy/gammapy
.. _gammapy-web: https://github.com/gammapy/gammapy-webpage

gammapy-1.3/docs/development/pigs/pig-021.rst

.. include:: ../../references.txt

.. _pig-021:

****************************
PIG 21 - Models improvements
****************************

* Author: Axel Donath, Régis Terrier and Quentin Remy
* Created: Jun 10, 2020
* Accepted: July 31, 2020
* Status: accepted
* Discussion: `GH 2944`_

Abstract
========

This PIG outlines further minor improvements to the modeling framework of Gammapy. This includes the introduction of spectral models with norm parameters, minimal support for energy dependent spatial models, as well as simplifications of the YAML serialisation.

Proposal
========

Spectral Norm Models
--------------------

For the purpose of adjusting template based models we propose to introduce a new class of spectral models.
Those spectral models feature a norm parameter instead of an amplitude and are named using a ``NormSpectralModel`` suffix:

.. code::

    import astropy.units as u
    from gammapy.modeling.models import (
        PowerLawNormSpectralModel,
        LogParabolaNormSpectralModel,
        PiecewiseBrokenPowerlawNormSpectralModel,
        ConstantNormSpectralModel,
    )

    # illustrative parameter values
    pwl_norm = PowerLawNormSpectralModel(norm=1, tilt=0, reference="1 TeV")
    log_parabola_norm = LogParabolaNormSpectralModel(norm=1, alpha=2, beta=1)
    bpwl_norm = PiecewiseBrokenPowerlawNormSpectralModel(norms=[1, 1], energies=[1, 10] * u.TeV)
    const_norm = ConstantNormSpectralModel(norm=1)

    # typically used like so, with energy, values and map defined beforehand
    template = TemplateSpectralModel(energy=energy, values=values)
    model = template * pwl_norm

    model = BackgroundModel(map=map, spectral_model=pwl_norm)

    model = SkyDiffuseCube(map=map, spectral_model=pwl_norm)

This requires removing the hard-coded parameters of the `TemplateSpectralModel` and `BackgroundModel`.

Energy Dependent Spatial Models
-------------------------------

A very common use-case in scientific analyses is to look for energy dependent morphology of extended sources. In the past this kind of analysis has typically been done by splitting the data into energy bands and doing individual fits of the morphology in these energy bands. In a combined spectral and spatial ("cube") analysis this can be naturally achieved by allowing for an energy dependent spatial model, where the energy dependence is e.g. described by a parametric model expression with energy dependent parameters. In the current model scheme we use a "factorised" representation of the source model, where the spatial, energy and time dependence are independent model components and not correlated:

.. math::

    f(l, b, E, t) = A \cdot F(l, b) \cdot G(E) \cdot H(t)

To represent energy dependent morphology we propose to introduce energy dependent spatial models:

.. math::

    f(l, b, E, t) = A \cdot F(l, b, E) \cdot G(E) \cdot H(t)

In general the energy dependence is optional. If the spatial model does not declare an energy dependence, it assumes the same morphology for all energies. This also ensures backwards compatibility with the current behaviour. To limit the implementation effort in this PIG we propose to only add an example energy dependent custom model to our documentation. We do not propose to introduce general dependence of arbitrary model parameters for any spatial model, such as ``GaussianSpatialModel`` or ``DiskSpatialModel``. An example of how this can be achieved with a custom implemented model is given below:

.. code::

    import numpy as np
    import astropy.units as u
    from astropy.coordinates import angular_separation
    from gammapy.modeling import Parameter
    from gammapy.modeling.models import SpatialModel


    class MyCustomGaussianModel(SpatialModel):
        """My custom gaussian model.
        Parameters
        ----------
        lon_0, lat_0 : `~astropy.coordinates.Angle`
            Center position
        sigma_1TeV : `~astropy.coordinates.Angle`
            Width of the Gaussian at 1 TeV
        sigma_10TeV : `~astropy.coordinates.Angle`
            Width of the Gaussian at 10 TeV
        """

        tag = "MyCustomGaussianModel"
        lon_0 = Parameter("lon_0", "0 deg")
        lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90)
        sigma_1TeV = Parameter("sigma_1TeV", "1 deg", min=0)
        sigma_10TeV = Parameter("sigma_10TeV", "0.5 deg", min=0)

        @staticmethod
        def evaluate(lon, lat, energy, lon_0, lat_0, sigma_1TeV, sigma_10TeV):
            """Evaluate custom Gaussian model"""
            # interpolate the Gaussian width between the 1 TeV and 10 TeV nodes
            sigmas = u.Quantity([sigma_1TeV, sigma_10TeV])
            energy_nodes = [1, 10] * u.TeV
            sigma = np.interp(energy, energy_nodes, sigmas)

            sep = angular_separation(lon, lat, lon_0, lat_0)
            exponent = -0.5 * (sep / sigma) ** 2
            norm = 1 / (2 * np.pi * sigma ** 2)
            return norm * np.exp(exponent)

        @property
        def evaluation_radius(self):
            """Evaluation radius (`~astropy.coordinates.Angle`)."""
            return 5 * self.sigma_1TeV.quantity

Spectral Absorption Model
-------------------------

In the current handling of absorbed spectral models we have a very special ``Absorption`` model, which is not a spectral model. To resolve this special case, we propose to refactor the existing code and handle the absorbed case using a ``CompoundSpectralModel``. The new implementation is used as follows:

.. code::

    from gammapy.modeling.models import EBLAbsorptionNormSpectralModel, PowerLawSpectralModel

    absorption = EBLAbsorptionNormSpectralModel.from_reference(
        redshift=0.1, alpha_norm=1, reference="dominguez"
    )
    pwl = PowerLawSpectralModel()

    spectral_model = absorption * pwl
    assert isinstance(spectral_model, CompoundSpectralModel)

In addition we propose to rename ``.table_model`` to ``.to_template_spectral_model(redshift, alpha_norm)``.

Additional Models
-----------------

In addition we propose to implement the following models in ``gammapy.modeling.models``:

- ``SersicSpatialModel`` following the parametrisation of the Astropy ``Sersic2D`` model.
- ``BrokenPowerLaw`` with the following parametrisation:

  .. math::

      \phi(E) = \phi_0 \times
      \begin{cases}
          \left( \frac{E}{E_b} \right)^{\gamma_1} & \mathrm{if}\ E < E_b \\
          \left( \frac{E}{E_b} \right)^{\gamma_2} & \mathrm{otherwise}
      \end{cases}

- We propose to re-introduce the ``PhaseCurveModel`` to compute mean fluxes over time.

Simplify YAML Representation
----------------------------

Introduce Shorter YAML Alias Tags
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To simplify the definition of models in YAML files as well as for interactive model creation, we propose to introduce shorter YAML tags for all models in Gammapy. For backwards compatibility we propose to support the class name as well. We require that the short YAML tag is only unique within the model class, so that the same tag such as "gauss" can be re-used by spectral, spatial as well as temporal components. The uniqueness is guaranteed in the YAML file because of the different sections for the model types. For writing the models the verbose class name is used by default.
We propose to introduce the following YAML tags:

========================================  ======================
Class Name                                YAML Tag
========================================  ======================
ConstantSpectralModel                     const
ConstantNormSpectralModel                 const-norm
CompoundSpectralModel                     compound
PowerLawSpectralModel                     pl
PowerLawNormSpectralModel                 pl-norm
PowerLaw2SpectralModel                    pl-2
SmoothBrokenPowerLawSpectralModel         sbpl
BrokenPowerLawSpectralModel               bpl
PiecewiseBrokenPowerlawNormSpectralModel  pwbpl-norm
ExpCutoffPowerLawSpectralModel            ecpl
ExpCutoffPowerLaw3FGLSpectralModel        ecpl-3fgl
SuperExpCutoffPowerLaw3FGLSpectralModel   secpl-3fgl
SuperExpCutoffPowerLaw4FGLSpectralModel   secpl-4fgl
LogParabolaSpectralModel                  lp
LogParabolaNormSpectralModel              lp-norm
TemplateSpectralModel                     template
GaussianSpectralModel                     gauss
EBLAbsorptionNormSpectralModel            ebl-absorption-norm
NaimaSpectralModel                        naima
ScaleSpectralModel                        scale
========================================  ======================

========================================  ======================
Class Name                                YAML Tag
========================================  ======================
ConstantSpatialModel                      const
TemplateSpatialModel                      template
GaussianSpatialModel                      gauss
DiskSpatialModel                          disk
PointSpatialModel                         point
ShellSpatialModel                         shell
SersicSpatialModel                        sersic
========================================  ======================

========================================  ======================
Class Name                                YAML Tag
========================================  ======================
ConstantTemporalModel                     const
LightCurveTemplateTemporalModel           template
GaussianTemporalModel                     gauss
ExpDecayTemporalModel                     exp-decay
========================================  ======================

To simplify the interactive model creation we propose to introduce:

.. code::

    from gammapy.modeling.models import SkyModel

    model = SkyModel.create(
        spectral_type="pl",
        spectral_pars={"index": 2},
        spatial_type="gauss",
        spatial_pars={"sigma": "0.1 deg"},
        temporal_type="const",
    )

In addition the ``Model.create()`` factory function should be adapted to:

.. code::

    from gammapy.modeling.models import Model

    gauss = Model.create(tag="gauss", model_type="spectral", **pars)

Simplify YAML Parameter Representation
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To further simplify the structure of the YAML file we propose to remove parameter properties if they are equivalent to the default values. The current representation looks like:

.. code::

    spectral:
        type: PowerLawSpectralModel
        parameters:
        - name: index
          value: 2.0
          unit: ''
          min: .nan
          max: .nan
          frozen: false
          error: 0
        - name: amplitude
          value: 1.0e-12
          unit: cm-2 s-1 TeV-1
          min: .nan
          max: .nan
          frozen: false
          error: 0
        - name: reference
          value: 1.0
          unit: TeV
          min: .nan
          max: .nan
          frozen: true
          error: 0

After the simplification:

.. code::

    spectral:
        type: PowerLawSpectralModel
        parameters:
        - name: index
          value: 2.0
        - name: amplitude
          value: 1.0e-12
          unit: cm-2 s-1 TeV-1
        - name: reference
          value: 1.0
          unit: TeV
          frozen: true

Decision
========

This PIG received feedback in the form of comments on the PR `GH 2944`_ and via private Slack messages to the authors. Most concerns were raised against the introduction of `NormSpectralModels`. In a following discussion among the lead developers it was decided to stick with the proposal, but keep the abstraction of `SkyDiffuseCube` to avoid the case where a spatial model carries the flux information. A final review announced on the Gammapy and CC mailing list did not yield any additional comments. Therefore the PIG was accepted on July 31, 2020.

.. _GH 2944: https://github.com/gammapy/gammapy/pull/2944
.. _gammapy: https://github.com/gammapy/gammapy
.. _gammapy-web: https://github.com/gammapy/gammapy-webpage
gammapy-1.3/docs/development/pigs/pig-022.rst

.. include:: ../../references.txt

.. _pig-022:

************************************
PIG 22 - Unified flux estimators API
************************************

* Author: Régis Terrier and Axel Donath
* Created: Nov 11, 2020
* Withdrawn: Nov 14th, 2021
* Status: withdrawn
* Discussion: `GH 3075`_

Abstract
========

This PIG proposes a uniform API for flux ``Estimator`` results. We discuss the introduction of a general ``FluxEstimate`` object that would allow flux type conversions and that would serve as a base class for the ``FluxPoints`` and ``FluxMap`` classes. The latter would allow easier handling of ``TSMapEstimator`` results, in particular regarding serialization and flux conversion.

Introduction
============

Flux estimation is performed by ``Estimators`` in Gammapy. Some perform forward-folding methods to compute flux, TS, significance and errors at various positions or energies. This is the case of the ``FluxPointsEstimator``, the ``LightCurveEstimator`` and the ``TSMapEstimator``. Others perform backward-folding methods to compute similar quantities (they compute excesses and associated TS and errors, and divide by the exposure in reco energy to deduce flux quantities). This is the case of the ``ExcessMapEstimator`` and ``ExcessProfileEstimator``. So far, the output of all these estimators is diverse and relies on different conventions for the definition of flux. There are four types of SED flux estimates described in the gamma astro data format. They are:

- ``dnde``, differential flux, which is defined at a given ``e_ref``
- ``e2dnde``, differential energy flux, which is defined at a given ``e_ref``
- ``flux``, integral flux, defined between ``e_min`` and ``e_max``
- ``eflux``, integral energy flux, defined between ``e_min`` and ``e_max``

To convert between these flux types, an assumption must be made on the spectral model. Besides these, a useful SED type is the so-called ``likelihood`` type introduced by fermipy to represent SEDs and described in the gamma astro data format (ref). It uses reference fluxes expressed in the above flux types and a ``norm`` value that is used to derive the actual fluxes. Associated quantities are denoted ``norm_err``, ``norm_ul`` etc.

What we have
============

So far, the only API in Gammapy handling these different flux types is the ``FluxPoints`` object. It contains a ``Table`` representing one given format above, and utility functions allow converting the table into another format, e.g.:

.. code::

    fp_dnde = fp.to_sed_type("dnde")
    fp_energy_flux = fp.to_sed_type("eflux", model=PowerLawSpectralModel(index=3))

The conversion is not possible in all directions. The various estimators implemented so far return different objects.

- Map estimators return a dictionary of ``Map`` objects which are defined as ``flux`` types. Beyond the fixed flux type, there is no easy API to allow the user to serialize all the ``Maps`` at once.
- ``FluxPointsEstimator`` returns a ``FluxPoints`` object using the ``likelihood`` normalization scheme.
- ``LightCurveEstimator`` relies on the ``FluxPointsEstimator`` in each time interval, but converts the output into a ``Table`` with one row per time interval and flux points stored as an array in each row.
- ``ExcessProfileEstimator`` computes an integral flux (``flux`` type) in a list of regions and a list of energies. It returns a ``Table`` with one region per row, and energy dependent fluxes stored as an array in each row.

This diversity of output formats and flux types could be simplified with a better design for flux quantities in Gammapy. We propose below a generalized flux points API.

Proposal of API for flux estimate results
=========================================

Introduce a FluxEstimate base class
-----------------------------------

First we propose that all ``Estimators`` compute quantities following an SED type inspired by the ``likelihood`` SED type. It would basically rely on the ``norm`` value and a reference model to obtain fluxes in various formats. This is the basic ingredient of the current likelihood format described in the gadf, but without the likelihood scan. Using such a format, beyond the uniform behavior, has the advantage of making flux type conversion easier. To limit code duplication (e.g. for flux conversions), we propose a common base class to describe the format and contain the required quantities.

.. code::

    class FluxEstimate:
        """General likelihood SED conversion class.

        Converts norm values into dnde, flux, etc.

        Parameters
        ----------
        data : dict of `Map` or `Table`
            Mappable containing the SED likelihood data
        spectral_model : `SpectralModel`
            Reference spectral model
        energy_axis : `MapAxis`
            Reference energy axis
        """
        def __init__(self, data, spectral_model, energy_axis=None):
            self._data = data
            self.spectral_model = spectral_model
            self.energy_axis = energy_axis

        @property
        def dnde_ref(self):
            """Reference differential flux"""
            energy = self.energy_axis.center
            return self.spectral_model(energy)

        @property
        def dnde(self):
            """Differential flux"""
            return self._data["norm"] * self.dnde_ref

        @property
        def flux(self):
            """Integral flux"""
            energy = self.energy_axis.edges
            flux_ref = self.spectral_model.integral(energy[:-1], energy[1:])
            return self._data["norm"] * flux_ref

Practically, this allows easy manipulation of flux quantities between formats:

.. code::

    fpe = FluxPointsEstimator()
    fp = fpe.run(datasets)

    print(fp.dnde)
    print(fp.eflux_err)

    # but also get excess number estimate
    fp.excess

TODO: what to do with counts based quantities? Introduce an `ExcessEstimate`?

Introduce a FluxMap API
-----------------------

Handling a simple dictionary of ``Maps`` is not very convenient. It makes flux transformations complex, and it is even harder to provide standard plotting functions, for instance. More importantly, there is no way to serialize the maps with their associated quantities and other information. It could be useful to export the ``GTI`` table of the ``Dataset`` that was used to extract the map, or its ``meta`` table. Some information on the way the flux was obtained could be kept as well (e.g. the spatial model assumed or the correlation radius used). The ``FluxMap`` would inherit from the general likelihood SED class discussed above and could include the possibility to export the resulting maps (in particular the ``norm_scan`` maps) after sparsification, according to the format definition from fermipy presented in the gamma astro data format. In addition, utilities to extract flux points at various positions could be provided. Typical usage would be:
.. code::

    model = SkyModel(PointSpatialModel(), PowerLawSpectralModel(index=2.5))
    estimator = TSMapEstimator(model, energy_edges=[0.2, 1.0, 10.0] * u.TeV)
    flux_maps = estimator.run(dataset)

    # plot differential flux map in each energy band
    flux_maps.dnde.plot_grid()

    # plot energy flux map in each energy band
    flux_maps.eflux.plot_grid()

    # one can access other quantities
    flux_maps.sqrt_ts.plot_grid()
    flux_maps.excess.plot_grid()

    # Extract flux points at selected positions
    positions = SkyCoord([225.31, 200.4], [35.65, 25.3], unit="deg", frame="icrs")
    fp = flux_maps.get_flux_points(positions)
    fp.plot()

    # Save to disk as a fits file
    flux_maps.write("my_flux_maps.fits", overwrite=True)

    # Read from disk
    new_flux_maps = FluxMap.read("my_flux_maps.fits")

Introduce a FluxPointsCollection API
------------------------------------

Several estimators return a set of ``FluxPoints``. It is the case of the ``LightCurveEstimator``, a light curve being a time ordered list of fluxes. It is also the case of the ``ExcessProfileEstimator``, which returns a list of flux points ordered along a direction or radially. In the current implementation, they produce an `~astropy.Table` where each row contains an array of fluxes representing flux points at a given time or position. This solution is not ideal. It introduces a container logic different from the ``FluxPoints``, for which one row represents one energy. A dedicated API could be introduced to support these objects. In order to keep the logic used in ``FluxPoints``, a flat ``Table`` could be used to store the various fluxes, where each row would represent the flux at a given energy, time, position etc. Astropy provides a mechanism to group table rows according to column entries. It is then possible to extract the relevant ``FluxPoints`` object, representing the simple flux points or the lightcurve. A possible implementation could follow these lines:

.. code::

    energy_columns = ["e_min", "e_max", "e_ref"]
    time_columns = ["t_min", "t_max"]

    class FluxPointsCollection:
        def __init__(self, table):
            self.table = table

            if all(_ in self.table.keys() for _ in energy_columns):
                self._energy_table = self.table.group_by(energy_columns)
            else:
                raise TypeError("Table does not describe a flux point. Missing energy columns.")

            self._time_table = None
            if all(_ in self.table.keys() for _ in time_columns):
                self._time_table = self.table.group_by(time_columns)

        # t_min and e_min are assumed to expose the corresponding table columns
        def flux_points_at_time(self, time):
            if self._time_table is None:
                raise KeyError("No time information")
            index = np.where((time - self.t_min) <= 0)[0][0]
            return self.__class__(self._time_table.groups[index])

        def lightcurve_at_energy(self, energy):
            if self._time_table is None:
                raise KeyError("No time information")
            index = np.where((energy - self.e_min) <= 0)[0][0]
            return self.__class__(self._energy_table.groups[index])

Keeping dedicated classes for specific types is an open question. While it might not be needed, it also provides a more explicit API for the user as well as specific functionalities. In particular, plotting is specific to each type, and so will I/O be once specific data formats have been introduced. We also note that ``FluxPointsDataset`` relies on the ``FluxPoints`` object for now. An important missing feature is the ability to use these for temporal model evaluation; ``TemporalModel`` might require a ``GTI``.
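A usage sketch of this hypothetical container (the input ``table`` is assumed to be a flat flux point table with the energy and time columns listed above):

.. code::

    import astropy.units as u
    from astropy.time import Time

    fp_collection = FluxPointsCollection(table)

    # flux points measured in the time bin containing the given time
    fp = fp_collection.flux_points_at_time(Time("2020-01-01T00:00:00"))

    # lightcurve in the energy bin containing 1 TeV
    lc = fp_collection.lightcurve_at_energy(1 * u.TeV)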
Unification of flux estimators?
===============================

Most estimators actually do the same thing: the ``LightCurveEstimator`` simply groups datasets in time intervals and applies the ``FluxPointsEstimator`` in each interval. Similarly, the ``ExcessProfileEstimator`` performs a similar task in different regions. The necessity of a specific estimator for all these tasks is then questionable. Currently, the responsibility of an estimator is to group and reproject datasets in the relevant intervals or groups of data, and then to apply a general flux or excess estimate. In practice there are two different types of estimators. We can distinguish these:

- Flux estimators perform a fit of all individual counts inside the requested range (energy bin and/or spatial box) to obtain a flux given a certain physical model. Because it relies on a fit of the complete model, this approach makes it possible to take into account fluctuations of other parameters by performing re-optimization.
- Excess estimators sum all counts inside the requested range and estimate the corresponding excess counts, taking into account background and signal contributions from other sources. No explicit fit is actually performed, since we rely on the ``CountsStatistics`` classes. The flux is then deduced by computing the ratio of measured excess to total ``npred`` under a given model assumption.

Currently, the ``FluxPointsEstimator`` and the ``LightCurveEstimator`` belong to the first category. The ``TSMapEstimator`` a priori belongs there as well, although our current implementation does not allow re-optimization of other parameters. A more general implementation could also rely on the ``FluxPointsEstimator``. The ``ExcessMapEstimator`` and ``ExcessProfileEstimator`` follow the second approach (as their names suggest), but they still provide flux estimates through the excess/npred ratio. We note that flux points and light curves can also be estimated through this approach to provide rapid estimates. So we propose to introduce the following simplified estimator API:

- ``FluxPointsEstimator``, which handles multiple sources, time intervals and energy bins, using "forward folding"
- ``FluxMapEstimator``, which handles multiple energy bins, using "forward folding"
- ``ExcessPointsEstimator``, which handles multiple regions, time intervals and energy bins, using "backward folding"
- ``ExcessMapEstimator``, which handles multiple energy bins, using "backward folding"

Excess estimators
-----------------

We can unify and clarify the general approach by introducing estimators following each of the methods. A general flux estimator already exists: the ``FluxEstimator``, which is the base of the ``FluxPointsEstimator``. An equivalent ``ExcessEstimator`` could be added. Technically, it would mostly encapsulate functionalities provided by the ``CountsStatistics``.

Generalist estimators
---------------------

It is possible to create a generalist ``FluxPointsEstimator`` that could return a general ``FluxPointsCollection`` without knowing a priori what type of grouping is applied to the datasets. This would allow performing lightcurve or region-based flux estimates with the same API. It would also allow a direct generalization of the ``FluxPointsEstimator`` to cover other problems, e.g. flux estimation in phase to build phase curves, or flux estimation in different observation conditions or instrument states to study systematic effects. The main change required for this is to move dataset grouping to the ``Datasets`` level (see the sketch below).
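A sketch of what such a generalist API could look like (hypothetical; ``group_by_axis`` is the grouping method discussed in the outlook below, and ``time_axis`` is assumed to be defined beforehand):

.. code::

    import astropy.units as u

    fpe = FluxPointsEstimator(energy_edges=[0.1, 1, 10] * u.TeV)

    # regular flux points: the datasets form a single group
    flux_points = fpe.run(datasets)

    # lightcurve: the same estimator run on time-grouped datasets
    # would return a FluxPointsCollection
    datasets.group_by_axis(time_axis)
    lightcurve = fpe.run(datasets)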
Outlook
-------

Selection by time is already possible on ``Datasets`` thanks to the ``select_time(t_min, t_max)`` method. This functionality could be extended to other quantities characterizing the datasets. So in future we could add more general grouping functionality on the ``Datasets``. It could internally rely on the grouping of the meta table: ``Datasets.meta_table.group_by``. It would require the table to be present in memory and to be recomputed each time the ``Datasets`` object is updated. Retrieving a set of datasets could then be done with ``Datasets.get_group_by_idx()``.

.. code::

    # group datasets according to phase
    phase_axis = MapAxis.from_bounds(0., 1., 10, name="phase", unit="")
    datasets.group_by_axis(phase_axis)

    datasets_in_phase_bin_3 = datasets.get_group_by_idx(3)

Decision
========

The authors have decided to withdraw the PIG. Most of the proposed changes have been implemented independently, with review and discussions on individual PRs.

.. _GH 3075: https://github.com/gammapy/gammapy/pull/3075

gammapy-1.3/docs/development/pigs/pig-023.rst

.. include:: ../../references.txt

.. _pig-023:

****************************************************
PIG 23 - Gammapy release cycle and version numbering
****************************************************

* Author: Régis Terrier, Axel Donath
* Created: May 12th, 2022
* Accepted: September 2nd, 2022
* Status: accepted
* Discussion: `GH 3950`_

Abstract
========

This PIG proposes the general plan for the timing of releases and their version numbers, starting with the release v1.0. Following the approach of astropy (described in APE 2), versions numbered vX.0 are designated "long-term support" (LTS) and are released on typical timescales of two years. New feature releases have a shorter cycle of about 6 months and are numbered X.Y (with Y>0). Bugfix releases are applied to both the current feature release and the LTS releases when needed. The development procedure to implement this scheme is detailed.

Current status
==============

Until v0.20, Gammapy releases have mostly been new feature releases, in order to build a complete set of functionalities with a stable API for the v1.0 release. This approach and the release cycles were described in the roadmap for v1.0, PIG :ref:`pig-005`. Release cycles have varied between every two months and once a year. The six-month timescale has been found to be most practical both for users and developers. Version 1.0 is the first one that requires "long-term support" (LTS), allowing users and facilities relying on Gammapy not to always upgrade to the latest version, while still having the guarantee that major bugs will be corrected. Implementing this requires precisely defining the numbering scheme and specifying the associated development plan in terms of calendar, workflow (development branches), deprecation and backward compatibility. A first discussion on this took place during the January 2022 `co-working week <https://github.com/gammapy/gammapy-meetings/tree/master/coding-sprints/2022-01-Co-Working-Week>`_.

Release scheduling and LTS
==========================

Following on the current scheme, a typical timescale between new feature releases is 6 months. More rapid releases can occur when important new features are to be distributed to users. Feature releases can introduce backwards incompatible changes w.r.t. previous versions.
Those changes have to be mentioned in the release notes. In addition, feature versions can be corrected by bugfix releases. A bugfix must neither introduce new major functionality nor break backwards compatibility in the public API, unless the API has an actual mistake that needs to be fixed. Documentation improvements can be part of a bugfix release. Bugfixes are applied to the current feature release. Finally, long-term support (LTS) releases will have a timescale of about 2 years (or about 3-4 feature releases) and will continue to receive bugfixes until the next LTS release, to guarantee long term stability. Bugs affecting both the LTS and the latest feature release will be corrected on both branches. This will require a careful cherry-picking of commits.

Version numbering
=================

Gammapy will follow the astropy version scheme, i.e. a scheme following `semantic versioning <https://semver.org>`_ of the form ``x.y.z``, typically::

    * 1.0.0 (LTS release)
    * 1.0.1 (bug fix release)
    * 1.0.2
    * 1.1.0 (six months after 1.0.0)
    * 1.1.1 (bug fix release on the feature branch)
    * 1.0.3 (bug fix release on the LTS)
    * 1.1.2
    * 1.2.0 (six months after 1.1.0)
    * 1.2.1
    * 1.3.0 (six months after 1.2.0)
    * 1.0.4 (bug fix release on the LTS)
    * 1.3.1 (bug fix release on the feature branch)
    * 2.0.0 (LTS release, six months after 1.3.0)

Release preparation, feature freeze
===================================

To prepare for a new feature release, a feature freeze has to occur typically one month before the planned release date. From that point on, no new major feature is accepted for the coming release. Open pull requests planned for the release should be merged within two weeks of the feature freeze. Minor improvements, bug fixes, or documentation additions are still acceptable if they are carried out on the same timescale. No alpha or beta releases are expected for Gammapy. For testing purposes, a release candidate is published on the Python Package Index two weeks before the expected release. Conda distribution of the release candidate is not foreseen at the moment. Release candidates will be produced for all LTS and feature releases. Bugfix releases won't go through the release candidate step. If issues are found, they should be corrected before the actual release is made. This can lead to a delay of the release. Once the version is tagged, specific branches must be created for the feature and backport developments. This process will require a detailed description in the developer documentation.

Deprecation
===========

API breaking changes can be introduced in feature and LTS releases. These changes must be clearly identified in the release notes. To avoid abruptly changing the API between consecutive versions, forthcoming API changes and deprecations should be announced in the previous release. In particular, a deprecation warning system should be applied to warn users of future changes affecting their code (see the sketch below).
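A minimal sketch of such a warning mechanism; the warning class and helper function below are illustrative assumptions, not an existing Gammapy API at the time of this PIG:

.. code::

    import warnings

    class GammapyDeprecationWarning(Warning):
        """Category for warnings about deprecated Gammapy features."""

    def warn_deprecated(name, replacement, version):
        """Warn that `name` is deprecated and will be replaced in `version`."""
        warnings.warn(
            f"{name} is deprecated and will be replaced by {replacement} in v{version}",
            GammapyDeprecationWarning,
            stacklevel=2,
        )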
Support of Python, Cython, numpy and astropy versions
=====================================================

See `APE 18 <https://github.com/astropy/astropy-APEs/blob/main/APE18.rst>`_ and `NEP 29 <https://numpy.org/neps/nep-0029-deprecation_policy.html>`_.

Decision
========

The PIG was discussed in `GH 3950`_ and during a coordination committee meeting. It received few but positive comments. Some training was organized for core developers and maintainers during the July 2022 `co-working week <https://github.com/gammapy/gammapy-meetings/tree/master/coding-sprints/2022-06-Barcelona>`_. The PIG was accepted on September 2nd, 2022.

.. _GH 3950: https://github.com/gammapy/gammapy/pull/3950

gammapy-1.3/docs/development/pigs/pig-024.rst

.. include:: ../../references.txt

.. _pig-024:

**************************
PIG 24 - Authorship policy
**************************

* Authors: Bruno Khélifi, Thomas Vuillaume
* Created: May 25th, 2022
* Accepted: Oct. 20th, 2022
* Status: accepted
* Discussion: `GH 3970`_

Abstract
========

Given that the Gammapy library is now widely used by the community, proper citation of the project, including a policy about authorship, is necessary. This PIG addresses this issue by setting an authorship policy for the Gammapy project for each type of product (releases, papers and conferences).

Introduction
============

Gammapy started in 2013 and is now widely used in scientific publications. A proper citation scheme with correct authorship allows:

- a proper citation of the used Gammapy release,
- a proper recognition of the achieved work of any contributor,
- compliance with the FAIR principles for Research Software (`FAIR4RS `_).

This PIG aims to set up the project policy about authorship for our publications and releases. Given that Gammapy is licensed under a 3-clause BSD style license (see gammapy/LICENSE.rst), Gammapy can be used and even modified for a science project. For such a modified version, the authorship policy proposed in this PIG is not applicable, but the general citation scheme should still be applied. This PIG is structured as follows: first a reminder of our general citation scheme, which was up to now only given orally and on our web pages; then the authorship policy is given for each of the products associated with Gammapy, namely the intermediate releases, the Long Term Support (LTS) releases, the general Gammapy papers, and the contributions to conferences.

Citation scheme
===============

When Gammapy is used for any publication, contribution to conferences or software, authors should properly cite Gammapy. They are asked to cite the DOI of the Gammapy version used, as well as the associated paper (e.g. the latest LTS release).

.. note::

    As the Gammapy community fully supports Open Science, we strongly encourage authors using Gammapy to follow the FAIR4RS principles and to allow the reproducibility of their results. As a consequence, we suggest always mentioning the `Zenodo `_ DOI or the `HAL `_ SWHID identifier (associated with the `Software Heritage `_ archive) of the used release of Gammapy.

This citation scheme with these **two references** will be given on our web pages.

Authorship policy
=================

The Gammapy references contain a list of authors that needs to be updated over time according to a general policy. This section defines who is a *contributor* to Gammapy and the policy for each type of product.

.. _contributor:

Definition of a *Contributor*
-----------------------------

Contributions to Gammapy can be made in different ways:

* contributing to the gammapy source code,
* contributing to its documentation (rst pages, docstrings, gammapy-webpage),
* contributing to code reviews, maintenance, deployment and DevOps,
* contributing to our associated projects (gammapy-benchmarks, etc.),
* organizing communication, e.g. hands-on sessions, schools, social media,
* coordinating the project.
Contributions to any of the aforementioned aspects make a user an official Gammapy contributor. Many of these contributions are tracked using the git history. However, the use of personal data is regulated in Europe, and one should have the authorisation of users to be cited with their personal information for publications (their full name is mandatory, and, if any, their affiliation and their ORCID number). For this reason, we propose to use the *Developer Certificate of Origin* (DCO, see the text `here `_) for each commit. Its acceptance by contributors will:

* certify that a user wrote, or otherwise has the right to submit, code under our open source licence,
* allow our project to use their personal data for the contributions' record (i.e. for releases).

The file ``README.md`` will contain a *Contributing* section explaining that the project follows this DCO, which will be given in a new ``CONTRIBUTING.md`` file, as well as a link to this PIG.

.. important::

    For practical reasons, it is strongly advised that contributors always use the same GitHub account with a valid full name, synchronise their git email addresses with their GitHub addresses, have a unique `ORCID `_ identifier and store it in their GitHub user profile for convenience, and inform the developers of any change of affiliation or email address.

Releases
--------

Each Gammapy release should be associated with an updated list of authors that will be public on the repositories (Zenodo and HAL/SWH). This section is about feature releases. Bugfix releases will have as author list the one of their associated feature release, whose rules are described here, plus any potential additional contributors. The list of authors is composed of people who contributed to the current release code and accepted the DCO. The history of merged pull requests will be used as a starting point. Other types of contributions (see :ref:`contributor`) can be added during a dedicated call (see below). The order of the authors is first 'The Gammapy team', followed by the list of contributors in alphabetical order::

    The Gammapy team: aa, bb, cc, ...

As mentioned in `PIG 23 `_, the publication of a release is preceded by a code freeze period. With its announcement, a call will be made to contributors to update their personal information (e.g. their affiliation) under their sole responsibility. This call will also permit adding any additional author into the list for their special contribution (e.g. on communication). The Project Managers and Lead Developers will have the duty to examine such requests and to update the author list data accordingly. In case of conflict, the Gammapy Coordination Committee will be the final decision maker.

.. _LTS:

Long Term Support releases
--------------------------

The list of authors is composed of the union of all the contributors of the releases made since the last LTS release. As the Coordination Committee provides long term support of the project, its members will be de facto co-authors. The order of the authors is first 'The Gammapy team', followed by the list of contributors in alphabetical order. As for the common releases, a period of code freeze is used to make a call to update the personal information of contributors, as well as to submit requests for co-authorship for special contributions. The same rules and methods as for the standard releases are applied here. For the first LTS release, v1.0, all contributors from the beginning of the project will be co-authors by default.
As the DCO has not been applied to past contributions, best efforts will be made to contact all contributors so they can exercise their right to sign it ("OptIn" scheme), update their personal information, and add authors with special contributions. The order of the authors is 'The Gammapy team', followed by the list of contributors in alphabetical order.

General Gammapy publications
----------------------------

This product aims to describe the project and/or the library as a whole and will most of the time be associated with the publication of an LTS release. As it targets a wide community, the following scheme is used.

1. List of authors

a/ By default, 'The Gammapy team' is mentioned first, then the Lead Developers, then the past Lead Developers since the last general Gammapy publication, then the list of contributors of each release since the last general Gammapy publication.

b/ If the editorial rules of the targeted journal permit it, the scheme used by the Astropy project (e.g. `Astropy V2.0 `_) should be used in priority:

* 'The Gammapy team', and the list of primary paper contributors in order of contribution agreed per consensus with the development team, as '(Primary Paper Contributors)',
* the members of the Coordination Committee, as '(Gammapy Coordination Committee)',
* the list of contributors of the associated LTS in alphabetical order, as '(Gammapy Contributors)'.

In this case, a comment on the author list composition should be added. Adapted from the Astropy project, one can state: "The author list has three parts: the authors that made significant contributions to the writing of the paper, the members of the Gammapy Project Coordination Committee, and contributors to the Gammapy Project in alphabetical order. The position in the author list does not correspond to contributions to the Gammapy Project as a whole. A more complete list of contributors to the core package can be found in the package repository and at the Gammapy team webpage."

2. Corresponding author

It is the 'Gammapy Coordination Committee', associated with its usual mailing list (GAMMAPY-COORDINATION-L@IN2P3.FR).

3. Acknowledgement

One has the freedom to mention specific acknowledgements associated with the publication. In practice, it is recommended to mention the grants or fellowships given to some authors. One should in any case always acknowledge the Astropy project, to which we are affiliated, and our mandatory external libraries (e.g. numpy, scipy, matplotlib, iminuit). Our web pages will contain a section with some standard sentence(s) to which one can refer.

Contribution in conferences
---------------------------

This section is about any contribution to conferences (a talk, a poster and their associated proceedings) related to the Gammapy project itself. It does not concern technical or scientific work that uses Gammapy as an open library; in that case, the citation scheme of Gammapy should be used by the authors. As the length of the author list is generally a constraint, the author list is reduced to the short list of contributors for the conference, followed by 'The Gammapy team' associated with a link to the Gammapy team webpage:

.. code-block:: text

    oo, ff, tt, for `the Gammapy team `_

If there is a corresponding author, the 'Gammapy Coordination Committee', associated with its usual mailing list, is used. Concerning the acknowledgement, the Astropy project should always be mentioned, and if possible our mandatory external libraries.
Metadata files
==============

Depending on the software repository, different metadata files are used in the ecosystem of Open Source research software, and **they are mandatory**.

CITATION.cff
------------

The file ``CITATION.cff`` is used by Zenodo and GitHub. Its format should follow the rules set up by the `Citation File Format project `_. A GitHub Action can be used to automatically check the compliance with the latest format. At the date of this PIG, there is no official scheme to handle current and past affiliations, which might be needed for LTS releases for example (see the CFF project `Issue #268 `_). In the affiliation string, one could add a past affiliation as shown below, but with a risk that Zenodo, or tools using ``CITATION.cff`` to produce the ``codemeta.json``, do not process this double affiliation correctly.

::

    authors:
    - family-names: XXX
      given-names: "YYYY Z."
      affiliation: "Lab AA, XX; Past: Lab BB, YY"
      email: yyyy.xxx@gammapy.org
      orcid: "https://orcid.org/0000-0000-0000-0000"

codemeta.json
-------------

The file ``codemeta.json`` is used by HAL and Software Heritage, and its use is recommended by `ESCAPE `_ and `EOSC `_. Its format is elaborated by the `CodeMeta Project `_. It offers a more detailed description of a software package than ``CITATION.cff``, which is focused on authors.

Definition of the *Maintainer*
------------------------------

In the codemeta format, a specific field, the *Maintainer*, can be filled. Usually, a software or package maintainer is one or more people who build source code into a binary package for distribution, commit patches, or organize code in a source repository. In case of an issue, this person can be contacted to fix a broken deposit. For Gammapy, by default, the maintainers are the Lead Developers. If in the future a task is dedicated to the creation of a release, the maintainer will be the person in charge of this task. The name or names of the maintainers will be filled in by the Lead Developers.

Possible implementations
========================

DCO implementation
------------------

Today, the tool offered by GitHub to sign the DCO is to add the extra parameter ``-s`` to each git commit. Users have to be aware of that and get used to it. For the reviewers, a GitHub CI check can be used to quickly verify that a contributor has signed the DCO. This method will be used for the moment. If there is any other method that avoids this extra parameter, it will naturally be considered. In order to respect these rules, some automation is required to create the list of contributors. One could use the Python library `Tributors `_. This library uses the GitHub API to determine the list of contributors, which is written in the formats associated with codemeta.json, citation.cff and zenodo.json. This kind of library thus fulfils two requirements: automatically retrieving the list of contributors from GitHub and writing the author list in all the needed formats.

Collection of the personal information of authors
-------------------------------------------------

The ``CITATION.cff`` file will be used as the main file to be maintained, while the ``codemeta.json`` will be automatically created from it. For any release, the former should be carefully updated, and a review should systematically be organised by the dev team. In order to maintain updated data about authors for the LTS releases and LTS papers, a new file could be used to hold these data, ``LTS_AUTHORS.cff``. The decision will be made in a dedicated PR.
The data collected from the git history are not sufficient for any publication. Most contributors must give their affiliation, as stipulated by their working contract. Also, the ORCID number is a relatively new type of information, recommended in the context of the Open Science movement, but not mandatory. However, one needs to collect this information to build the author lists in a safe and fair way. For this purpose, one could use the public user profiles of the GitHub platform. They are public data, but one should be aware that they might not be complete and/or up-to-date. As mentioned earlier, it is recommended to fill in this GitHub profile correctly to help the dev team pre-fill the metadata files. One should note that the GitHub profile does not contain a field associated with the ORCID identifier. However, one could also retrieve the ORCID identifier of a contributor with the ORCID API (e.g. `How-To-1 `_ or `How-To-2 `_). In any case, a specific Python script located in the main Gammapy repository has to be written such that the authors' order follows these rules. In this case, integration within a GitHub action could be possible. In this context, one could technically use another cff file as a maintenance file to store the personal data of users that have signed the DCO. A second script should be set up to create the list of authors associated with the latest LTS in HTML format, which could be inserted in the *team* section of our general web page. This script would read one of the metadata files (e.g. ``citation.cff``) to create such a list. It is important to note that the list of personal information might change from one editor to another. For the code deposits in Zenodo, SWH, etc., the data follow Open Science standards and scripts can be set up. For publications, the editors frequently have their own scheme, and adaptations will be made case by case. As mentioned here, the author list will be reviewed for any publication, for safety and to allow OptIn/OptOut requests.

Handling of conference material
-------------------------------

Conference contributions (proceedings, posters, ...) could be developed in dedicated repositories in the Gammapy GitHub organization, as well as the author list.

Suggestions
===========

One could ask the CTAO software coordinators (i.e. ACADA, DPPS, SUSS) if such rules can be used by them also, even if the Gammapy project is independent. In the same spirit, advice could be sought from people of the ESCAPE project via our corresponding person. And finally, one could mention the existence of these rules to the Astropy project and ask for advice. These requests to the Astropy community and CTAO for recommendations and preferences either remained unanswered or did not lead to objections at the current time.

Decision
========

After the addition of comments and proposals, the PIG was accepted by the dev team and the CC. The choice of practical implementation of such a scheme will be made in dedicated pull requests.

.. _GH 3970: https://github.com/gammapy/gammapy/pull/3970

gammapy-1.3/docs/development/pigs/pig-025.rst

.. include:: ../../references.txt

.. _pig-025:
_pig-025: *************************************** PIG 25 - Metadata container for Gammapy *************************************** * Author: Régis Terrier * Created: April 14th, 2023 * Accepted: July 10th, 2023 * Status: Accepted * Discussion: `GH 4491`_ Abstract ======== Metadata handling is crucial to correctly store information that is not directly data but is still required for processing, post-processing and serialization. Such metadata are fundamental for reproducibility. Introduction ============ As of version 1.0, Gammapy has very little support for metadata. Existing features are heterogeneous, hardly configurable and appear sporadically in the code, mostly in containers. At the DL3 level, ``EventList`` metadata is carried by its `Table.meta` dictionary. It is extracted from the file FITS header, which follows the GADF specifications. Similarly, ``Observation`` contains an `obs_info` dictionary that is built from the header as well. After data reduction, ``Dataset`` contains a `meta_table` which consists of a selection of `Observation.obs_info` entries (one table row per observation). During `Dataset` stacking the `meta_table` objects are stacked. The ``Datasets`` collection also aggregates the `meta_table` of its members. After estimation, the ``FluxPoints`` don't contain any specific meta information. The algorithm classes (`Makers` and `Estimators`) don't contain any meta information so far. This might be an issue, since some information could be transferred to their various products as well. A minimal piece of information that needs to be present on every Gammapy product and serialized in various formats is the `CREATOR`, i.e. the software and software version used, as well as the `DATE` when the object was created, and possibly the `ORIGIN` (the user or consortium that has produced the object). The Gammapy version number is important to ensure reproducibility and compatibility. An `Observation` also needs some information to ensure correct handling of the data: for instance, the telescope name, the sub-array used, the observation mode, the telescope location etc. A practical and systematic solution must be implemented in Gammapy. This PIG discusses the approach and proposes a solution for this. It does not discuss the metadata model, i.e. what information has to be stored on which data product. It proposes a basic concept and a possible implementation of a metadata container object that fulfills the requirements. Requirements ============ The Gammapy metadata solution should: - offer flexibility regarding the content of the data model, e.g.: - it should allow optional entries - it should be configurable to allow for specific data models - have a systematic validation of its content - allow for serialization to various formats, e.g.: - export specific keywords to FITS headers depending on the data format - export content to yaml - allow hierarchical content to allow easy propagation of metadata content along the analysis flow - be easily sub-classable to allow for specialized metadata containers for the various objects in Gammapy. In the following, we propose a plausible solution fulfilling these requirements based on the `pydantic` package ``BaseModel`` class. Metadata API ============ All Gammapy classes containing metadata should store them in a `meta` attribute. Type validation --------------- The API should support simple validation on input, even for non-standard types such as astropy or Gammapy objects. ..
code :: # direct assignment >>> meta.zenith_angle = Angle(30, "deg") # or via a string representation >>> meta.zenith_angle = "30 deg" # input validation >>> meta.zenith_angle = 2*u.TeV ValidationError: 1 validation error for MetaData zenith_angle # attribute type is astropy Angle >>> print(meta.zenith_angle) 30d00m00s Hierarchy --------- The API should allow a hierarchical structure, with metadata classes having other metadata objects as attributes. See the following example: .. code :: class CreatorMetadata: creator : str date : datetime origin : str class ObservationMetadata: obs_id : str creator : CreatorMetadata Serialization ------------- The metadata classes should have a `to_dict()` or `dict()` method to convert their content to a dictionary. Conversion to various output formats should be supported with specific methods such as `to_yaml()` or `to_header()`, to export the content in the form of a FITS header when possible. Because we expect the number of data formats to increase over the years, specific `Reader` and `Writer` functions or classes could be defined to support e.g. reading and writing the GADF DL3 format. Proposed solution ================= pydantic -------- The pydantic package has been built to perform data validation and settings management using Python type annotations. It enforces type hints at runtime, and provides user-friendly errors when data is invalid. It offers nice features such as: - it can be extended to custom data types (e.g. a `Quantity` or a `Map`) with a simple decorator-based scheme to define validators. - it supports recursive models The package is now extremely widely used in the Python ecosystem, with more than 50 million monthly PyPI downloads. Its long-term viability does not appear problematic. Gammapy already uses pydantic for its high level analysis configuration class. There are several other options available, such as `traitlets`. The latter also allows the addition of user-defined `TraitType`. the base class -------------- A typical base class for all Gammapy metadata could be structured as shown below: .. code :: class MetaDataBaseModel(BaseModel): class Config: arbitrary_types_allowed = True validate_all = True validate_assignment = True extra = "allow" def to_header(self): hdr_dict = {} for key, item in self.dict().items(): hdr_dict[key.upper()] = item.__str__() return hdr_dict @classmethod def from_header(cls, hdr): kwargs = {} for key in cls.__fields__.keys(): kwargs[key] = hdr.get(key.upper(), None) return cls(**kwargs) The model `Config` defined here: - allows using any type of input, and not only simple `Annotation` types (`arbitrary_types_allowed = True`) - sets `validate_assignment` to `True`, which ensures that validation is performed when a value is assigned to an attribute - sets `extra = "allow"`, which accepts additional attributes not defined in the metadata class. arbitrary type input and validation ----------------------------------- By providing a validation method, it is possible to validate non-standard objects. The `validator` decorator provided by pydantic makes it easy, as shown below: ..
code :: class ArbitraryTypeMetaData(MetaDataBaseModel): # allow string defining angle or Angle object zenith_angle : Optional[Union[str, Angle]] # allow observatory name or astropy EarthLocation object location : Optional[Union[str, EarthLocation]] @validator('location') def validate_location(cls, v): if isinstance(v, str) and v in observatory_locations.keys(): return observatory_locations[v] elif isinstance(v, EarthLocation): return v else: raise ValueError("Incorrect location value") @validator('zenith_angle') def validate_zenith_angle(cls, v): return Angle(v) Alternatives ============ Another option could be to use `traitlets`, but this would require creating dedicated types for non-supported types (e.g. `SkyCoord`). Additional functionalities such as `observer` would not be very useful here. Proposed metadata classes ========================= Here we list the metadata classes that we expect. All classes will inherit from a parent ``MetaDataBase`` that will provide most base properties to the daughter classes. We provide the list of classes by subpackage. data ---- - ``EventListMetaData`` - ``ObservationMetaData`` - ``DataStoreMetaData`` - ``PointingMetaData`` - ``GTIMetaData`` IRF --- Here we should distinguish between actual IRFs and reduced, modeling-ready IRFs such as kernels and IRF maps. - ``IRFMetaData`` : A single generic class could be used for all actual IRFs. Makers ------ It is unclear whether stateless algorithm classes such as ``Maker`` actually need meta information beyond their actual attributes. They will have to create or update the `meta` information of the ``Dataset`` they create or modify. For now, we don't propose any metadata for ``Maker`` objects. Datasets -------- The ``Dataset`` already contains some meta information with the `meta_table`, which contains a small subset of information from the observations that were used to build the object. The new metadata might replace the current `meta_table`. The metadata should support stacking; in particular, some of the fields might be lists of entries which require validation. - ``MapDatasetMetaData`` - ``FluxPointsDatasetMetaData`` : the metadata class for the ``FluxPointsDataset``. - ``DatasetsMetaData`` Modeling -------- Similarly to ``Makers``, it is unclear whether the ``Fit`` class needs specific metadata, as it is not serialized. Because they are serialized, ``Model`` and ``Models`` objects should have a minimal `meta`. - ``ModelsMetadata`` - ``ModelMetaData`` Estimators ---------- Again, the stateless ``Estimator`` algorithms do not need a `meta` attribute. They need to build the `meta` information of the products they create, transferring some metadata from the parent ``Datasets``. - ``FluxMapsMetaData`` - ``FluxPointsMetaData`` Metadata generation and propagation along the dataflow ------------------------------------------------------ DL3 products should come with their pre-defined metadata (unless generated by Gammapy, for instance during event simulations). But all other data levels will have metadata generated by Gammapy. Algorithm classes (Makers, Estimators) produce new data containers (DL4 and DL5); they will generate new metadata to be stored on the container and will propagate some of the metadata from the lower level products they manipulate. What metadata will be passed or discarded, and how metadata will be restructured in this process (i.e. how propagation and reduction will be performed), is beyond the scope of this PIG.
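As a purely illustrative sketch, such generation and propagation by an algorithm class could look as follows; the class names and copied fields (``MapDatasetMetaData``, ``CreatorMetaData``, ``obs_info``) are assumptions made for this example only, since the actual scheme is left open by this PIG:

.. code ::

    class MapDatasetMaker:
        # sketch only, not an actual implementation
        def run(self, dataset, observation):
            ...  # data reduction steps
            meta = MapDatasetMetaData()
            # newly generated metadata (software, version, date)
            meta.creator = CreatorMetaData()
            # metadata propagated from the lower-level (DL3) product
            meta.obs_info = observation.meta.obs_info
            dataset.meta = meta
            return dataset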
For now, the important point is that metadata handling becomes a task of the algorithm classes. The actual definition of the metadata classes will have to support the propagation and reduction process. An obvious case is `Dataset` stacking: the associated `meta` class will have to support the stacking mechanism. Decision ======== The PIG is accepted. Some of the proposed API will have to evolve a bit after the release of pydantic v2. .. _GH 4491: https://github.com/gammapy/gammapy/pull/4491 gammapy-1.3/docs/development/pigs/pig-026.rst .. include:: ../../references.txt .. _pig-026: ******************************** PIG 26 - Model Priors API ******************************** * Author: Noah Biederbeck, Katrin Streil * Created: ``2023-03-13`` * Accepted: ``2023-11-03`` * Status: Accepted * Discussion: `#4381`_ Abstract ======== This PIG is intended to introduce priors on parameters that are evaluated during fitting. Motivation ========== Using priors on models or on parameters is the next step in full-fledged statistical analyses. A prior can incorporate the analysers' knowledge of the topic at hand, or information about the estimated IRF systematics provided by the corresponding experiment, and yield more realistic and trustworthy results. The proposed formalism also includes the application of nuisance parameters and regularization. Use cases ========= In the past, priors were a regularly requested feature or the solution to a problem. See the following issues and PRs of Gammapy: Case 1: Background systematics as a nuisance parameter `#3955`_ --------------------------------------------------------------- The goal is to fit energy-dependent systematics in the FoVBackground with nuisance parameters and set priors on the two model parameters ``norm`` and ``tilt``: .. code:: python from gammapy.modeling.models import FoVBackgroundModel, GaussianPrior, PowerLawNormSpectralModel bkg_model = FoVBackgroundModel(spectral_model = PowerLawNormSpectralModel(), dataset_name = dataset.name) tilt_prior = GaussianPrior(mu = "0", sigma = "0.05") norm_prior = GaussianPrior(mu = "1", sigma = "0.1") bkg_model.set_prior([bkg_model.parameters['tilt'], bkg_model.parameters['norm']], [tilt_prior, norm_prior]) A preliminary version was set up like this, and tests on simulated datasets were successful. Note that this setup is quite simple; more advanced setups can be developed based on the same principle. Case 2: Favoring positive values for flux amplitudes ---------------------------------------------------- A step-like prior function can be used to favour positive values for physical properties like the flux amplitude. By setting a prior one avoids defining hard boundary conditions with the ``min`` attribute of the to-be-fitted parameter. The prior is set to ``value`` if the parameter value is between ``xmin`` and ``xmax``, and to 0 if not. .. code:: python from gammapy.modeling.models import PowerLawSpectralModel, StepPrior pwl = PowerLawSpectralModel() prior = StepPrior(xmin = "-inf", xmax = "0", value = "1") pwl.set_prior([pwl.parameters['amplitude']], [prior]) Case 3: Support unfolding methods for spectral flux points `#4122`_ ------------------------------------------------------------------- The proposed prior class will make it possible to unfold spectral flux points with Tikhonov regularisation.
The Tikhonov matrix can be defined and set as the covariance matrix in the class ``CovarianceGaussianPrior``. The weight of the ``PriorFitStatistic`` can be interpreted as the regularisation strength tau. However, this requires a more advanced setup, since this multi-dimensional prior is set on multiple parameters simultaneously. The goal is to use the proposed prior class as a starting point and develop it into multidimensional use cases in the near future. A suggested implementation example is shown below. Here the prior is set on the model instead of the single parameters. .. code:: python import numpy as np from astropy import units as u from gammapy.modeling.models import PiecewiseNormSpectralModel, MultivariateGaussianPrior, PowerLawSpectralModel n_points = 10 energy = np.geomspace(1, 10, n_points) * u.TeV norm = PiecewiseNormSpectralModel(energy=energy) prior = MultivariateGaussianPrior.from_covariance_matrix(means=np.ones(n_points), covariance_type="diagonal") prior.weight = 1.1 norm.prior = prior spectral_model = PowerLawSpectralModel() * norm An additional use case: adding sigma-v estimator functionality for the dark matter use case (`#2075`_). Implementation ============== The following implementation draft is compatible with the current API, where the models are set via ``Dataset.models`` and the Fit is run via ``Fit.run(datasets)``. .. code:: python class PriorModel(ModelBase): _weight = 1 _type = "prior" @property def parameters(self): """PriorParameters (`~gammapy.modeling.PriorParameters`)""" return PriorParameters( [getattr(self, name) for name in self.default_parameters.names] ) @property def weight(self): return self._weight @weight.setter def weight(self, value): self._weight = value def __call__(self, value): """Call evaluate method""" kwargs = {par.name: par.quantity for par in self.parameters} if isinstance(value, Parameter): return self.evaluate(value.quantity, **kwargs) else: raise TypeError(f"Invalid type: {value}, {type(value)}") The base ``PriorModel`` class inherits from the ``ModelBase`` class and allows setting a unit (for now, a single one) to which the parameter the prior is set on is converted before evaluation. In addition, a weight can be set. The parameters of the ``PriorModel`` are ``PriorParameters`` and ``PriorParameter``. They inherit from the ``Parameters`` and ``Parameter`` classes, respectively, but have a limited set of attributes. It is assumed that the ``PriorParameters`` are set in the same unit as the ``Parameter`` they get evaluated on. Future additional properties of the ``PriorModel`` classes: - Write/read from/to a yaml file, ideally also when the corresponding model is written/read (see serialisation example below) - Prior registry system - Different prior subclasses depending on the use cases Exemplary additional prior subclasses: -------------------------------------- .. code:: python class GaussianPrior(PriorModel): """Gaussian Prior with mu and sigma.""" tag = ["GaussianPrior"] _type = "prior" mu = PriorParameter(name="mu", value = 0, unit = '') sigma = PriorParameter(name="sigma", value = 1, unit = '') @staticmethod def evaluate(value, mu, sigma): return ((value - mu) / sigma) ** 2 .. code:: python class UniformPrior(PriorModel): """Uniform Prior""" tag = ["UniformPrior"] uni = PriorParameter(name="uni", value = 0, min = 0, max = 10, unit = '' ) @staticmethod def evaluate(value, uni): return uni The priors can be set on a ``Parameter`` as ``.prior`` and get evaluated within: ..
code:: python class Parameter: _prior = None @property def prior(self): return self._prior @prior.setter def prior(self, value): self._prior = value def prior_stat_sum(self): if self.prior is not None: return self.prior.weight * self.prior(self) The ``Parameters`` inherit the priors and evaluate the ``prior_stat_sum`` of all the ``Parameters.parameters``: .. code:: python class Parameters: @property def prior(self): return [par.prior for par in self] def prior_stat_sum(self): parameters_stat_sum = 0 for par in self: if par.prior is not None: parameters_stat_sum += par.prior_stat_sum() return parameters_stat_sum During the Fit, the ``Datasets.stat_sum()`` function is evaluated. The function now has to compute the statistics due to the priors set on the parameters and add them to the total. .. code:: python class Datasets: def stat_sum(self): """Total statistic given the current model parameters.""" stat = self.stat_array() if self.mask is not None: stat = stat[self.mask.data] prior_stat_sum = self.models.parameters.prior_stat_sum() return np.sum(stat, dtype=np.float64) + prior_stat_sum A simple method in the ``Models`` class will allow setting priors on multiple parameters simultaneously. .. code:: python class Models: def set_prior(self, parameters, priors): for parameter, prior in zip(parameters, priors): parameter.prior = prior Serialisation ------------- The priors are serialised and are readable from and writable to a yaml file. They are also saved if the associated ``Parameter`` or ``Parameters`` is transformed to a dictionary. For this, the prior subclasses are tagged. The following example shows the resulting dictionary for a ``Parameter``. .. code-block:: python {'name': 'testpar', 'value': 0.1, 'unit': '', 'error': 0, 'min': nan, 'max': nan, 'frozen': False, 'interp': 'lin', 'scale_method': 'scale10', 'is_norm': False, 'prior': {'tag': 'GaussianPrior', 'parameters': [{'name': 'mu', 'value': 0}, {'name': 'sigma', 'value': 0.1}]}} Implementation Outline ----------------------- The PIG implementation will proceed piece-wise within individual issues, in the following order: 1. ``Prior`` base class and subclasses (See `#4620`_) 2. ``PriorParameter`` and ``PriorParameters`` (See `#4620`_) 3. ``.prior`` attribute in the ``Parameter``, ``Parameters`` and ``Model`` classes 4. Evaluation of the prior in the ``Datasets`` class Decision ======== The PIG was discussed and reviewed among Gammapy developers. It has been reviewed by the CC and is now considered accepted. .. _#2075: https://github.com/gammapy/gammapy/issues/2075 .. _#3955: https://github.com/gammapy/gammapy/issues/3955 .. _#4122: https://github.com/gammapy/gammapy/issues/4122 .. _#4208: https://github.com/gammapy/gammapy/issues/4208 .. _#4620: https://github.com/gammapy/gammapy/pull/4620 .. _#4381: https://github.com/gammapy/gammapy/issues/4381 gammapy-1.3/docs/development/release.rst .. include:: ../references.txt .. _dev-release: ***************************** How to make a Gammapy release ***************************** This page contains step-by-step instructions on how to make a Gammapy release. We mostly follow the `Astropy release instructions `__ and just list the additional required steps. Feature Freeze and Branching ---------------------------- #. Follow the `Astropy feature freeze and branching instructions `__.
Instead of updating ``whatsnew/.rst``, update ``docs/release-notes/.rst``. #. Update the entry for the feature freeze in the `Gammapy release calendar `__. Releasing the first major release candidate ------------------------------------------- A few days before the planned release candidate: #. Fill the changelog ``docs/release-notes/.rst`` for the version you are about to release. #. Update the author list manually in the ``CITATION.cff``. #. Open a PR including both changes and mark it with the ``backport-v.x`` label. Gather feedback from the Gammapy user and dev community, and finally merge and backport to the ``v.x`` branch. On the day of the release candidate: #. In the `gammapy-webpage repo `__: * Add an entry for the release candidate like ``v1.0rc1`` or ``v1.1rc1`` in the ``download/index.json`` file, by copying the entry for the ``dev`` tag. As we do not handle release candidates nor bug fix releases for data, this still allows fixing bugs in the data during the release candidate testing. * In the ``download/install`` folder, copy a previous environment file as ``gammapy-1.0rc1-environment.yml``. * Adapt the dependency conda env name and versions as required in this file. #. Update the ``CITATION.cff`` date and version by running the ``dev/prepare-release.py`` script. #. Locally create a new release candidate tag on the ``v1.0.x`` branch, like ``v1.0rc1``, for Gammapy and push it. For details see the `Astropy release candidate instructions `__. #. Once the tag is pushed, the docs build and the upload to `PyPI `__ should be triggered automatically. #. Once the docs build has succeeded, find the ``tutorials_jupyter.zip`` file for the release candidate in the `gammapy-docs repo `__ and adapt the ``download/index.json`` to point to it. #. Update the entry for the release candidate in the `Gammapy release calendar `__. #. Create a testing page like `Gammapy v1.0rc testing `__. #. Advertise the release candidate, motivate developers and users to report test failures and bugs, and list them on the page created before. Releasing the final version of the major release ------------------------------------------------ #. Create a new release tag in the `gammapy-data repo `__, like ``v1.0`` or ``v1.1``. #. Update the datasets entry in the ``download/index.json`` file in the `gammapy-webpage repo `__ to point to this new release tag. #. Locally create a new release tag like ``v1.0`` for Gammapy and push it (see the command sketch after this list). For details see the `Astropy release candidate instructions `__, but leave out the ``rc1`` suffix. #. In the `gammapy-docs repo `__: * Wait for the triggered docs build to finish. * Edit ``stable/switcher.json`` to add the new version. #. In the `gammapy-webpage repo `__: * Mention the release on the front page and on the news page. * In the ``download/install`` folder, copy a previous environment file as ``gammapy-1.0-environment.yml``. * Adapt the dependency conda env name and versions as required in this file. * Adapt the entry in the ``download/index.json`` file to point to the correct environment file. * Find the ``tutorials_jupyter.zip`` file for the new release in the `gammapy-docs repo `__ and adapt the ``download/index.json`` to point to it. #. Update the entry for the actual release in the `Gammapy release calendar `__. #. Finally: * Update the Gammapy conda-forge package at https://github.com/conda-forge/gammapy-feedstock * Encourage the Gammapy developers to try out the new stable version (update and run tests) via the GitHub issue for the release and wait a day or two for feedback.
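For reference, the local tagging step mentioned in the list above could look as follows; the remote name ``upstream`` and the version ``v1.0`` are placeholders to adapt:

.. code-block:: bash

    # check out the release branch and create a signed tag on it
    git fetch upstream
    git checkout upstream/v1.0.x
    git tag -s v1.0 -m "Tagging v1.0"
    # pushing the tag triggers the docs build and the PyPI upload
    git push upstream v1.0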
Post release ------------ Steps for the day you announce the release: #. Send the release announcement to the Gammapy mailing list and on Gammapy Slack. #. If it's a big release with important new features or fixes, also send the release announcement to the following mailing lists (decide on a case-by-case basis if it's relevant to the group of people): * https://groups.google.com/forum/#!forum/astropy-dev * CTAO AS WG list (cta-wg-as@cta-observatory.org) * hess-forum list (hess-forum@lsw.uni-heidelberg.de) #. Make sure the release milestone and issue are closed on GitHub. #. Update these release notes with any useful information / steps that you learned while making the release (ideally try to script / automate the task or check, e.g. as a ``make release-check-xyz`` target). #. Update the version number in the Binder ``Dockerfile`` in the master branch of the `gammapy-webpage repo `__ and tag the release for Binder. #. Open a milestone and issue for the next release (and possibly also a milestone for the release after, so that low-priority issues can already be moved there). Find a release manager for the next release, assign the release issue to her / him, and ideally put a tentative date (to help developers plan their time for the coming weeks and months). #. Start working on the next release. Make a Bugfix release --------------------- #. Add an entry for the bug-fix release like ``v1.0.1`` or ``v1.1.2`` in the ``download/index.json`` file in the `gammapy-webpage repo `__. The ``datasets`` entry should point to the last stable version, like ``v1.0`` or ``v1.1``. We do not provide bug-fix releases for data. #. Follow the `Astropy bug fix release instructions `__. #. Follow the instructions for a major release for the updates of ``CITATION.cff``, the modifications in the `gammapy-docs` and `gammapy-webpage` repos, as well as the conda builds. gammapy-1.3/docs/development/setup.rst .. include:: ../references.txt .. _dev_things: ============= Project setup ============= This page gives an overview of the technical infrastructure we have set up to develop and maintain Gammapy. If you just want to make a contribution to the Gammapy code or documentation, you don't need to know about most of the things mentioned on this page. But for Gammapy maintainers it's helpful to have a reference that explains what we have and how things work. Gammapy repository ================== This section explains the content of the main repository for Gammapy: https://github.com/gammapy/gammapy Package and docs ---------------- The two main folders of interest for developers are the ``gammapy`` folder and the ``docs`` folder. In ``gammapy`` you find the Gammapy package, i.e. all the code, and also the tests, which are included in sub-folders called ``tests``. The ``docs`` folder contains the documentation pages, mostly in restructured text (RST) format. The Sphinx documentation generator is used to convert those RST files to the HTML documentation. Download -------- The ``gammapy download`` command allows downloading the notebooks published in the documentation, as well as the related datasets needed to execute them. The set of notebooks is versioned for each stable release as tar bundles published within the versioned documentation in the `gammapy-docs `__ repository. The same happens for the conda working environments of stable releases, whose yaml files are published in the `gammapy-web `__ repository.
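Both bundles can be fetched from the command line (these are the same commands shown in the getting started guide); the datasets are placed in a local ``gammapy-datasets`` folder:

.. code-block:: bash

    gammapy download notebooks
    gammapy download datasets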
The datasets are not versioned, and they are placed in the `gammapy-data `__ repository. .. _dev_build: Build ----- The ``setup.py`` and ``Makefile`` contain code to build and install Gammapy, as well as to run the tests and build the documentation, see :ref:`dev_intro`. The ``environment-dev.yml`` file contains the conda environment specification that allows one to quickly set up a conda environment for Gammapy development, see :ref:`dev_setup`. .. _setup_cython: Cython ------ We also have some Cython code in Gammapy, at the time of this writing less than 10% in this file: * ``gammapy/stats/fit_statistics_cython.pyx`` Others ------ There are two more folders in the ``gammapy`` repository: ``examples`` and ``dev``. The ``examples`` folder contains Python scripts needed by the sphinx-gallery extension to produce collections of examples use cases. The Python scripts needed by sphinx-gallery extension are placed in folders declared in the ``sphinx_gallery_conf`` variable in ``docs/conf.py`` script. The ``dev`` folder is a place for Gammapy developers to put stuff that is useful for maintenance, such as i.e. a helper script to produce a list of contributors. The file in ``github/workflows/ci.yml`` is the configuration file for the continuous integration (CI) we use with GitHub actions. Finally, there are some folders that are generated and filled by various build steps: * ``build`` contains the Gammapy package if you run ``python setup.py build``. If you run ``python setup.py install``, first the build is run and files placed there, and after that files are copied from the ``build`` folder to your ``site-packages``. * ``docs/_build`` contains the generated documentation, especially ``docs/_build/html`` the HTML version. * ``htmlcov`` and ``.coverage`` is where the test coverage report is stored. * ``v`` is a folder Pytest uses for caching information about failing tests across test runs. This is what makes it possible to execute tests e.g. with the ``--lf`` option and just run the tests that "last failed". * ``dist`` contains the Gammapy distribution if you run ``python setup.py sdist`` The gammapy-data repository =========================== https://github.com/gammapy/gammapy-data You may find here the datasets needed to execute the notebooks, perform the CI tests, build the documentation and check tutorials. .. _dev_gammapy-extra: The gammapy-extra repository ============================ For Gammapy we have a second repository for most of the example data files and a few other things: https://github.com/gammapy/gammapy-extra Old example data ---------------- The ``datasets`` and ``datasets/tests`` folders contain example datasets that were used by the Gammapy documentation and tests. Note that here is a lot of old cruft, because Gammapy was developed since 2013 in parallel with the development of data formats for gamma-ray astronomy (see below). Many old files in those folders can just be deleted; in some cases where documentation or tests access the old files, they should be changed to access newer files or generate test datasets from scratch. Doing this "cleanup" and improvement of curated example datasets will be an ongoing task in Gammapy for the coming years, that has to proceed in parallel with code, test and documentation improvements. Other folders ------------- * The ``figures`` folder contains images that we show in the documentation (or in presentations or publications), for cases where the analysis and image takes a while to compute (i.e. 
something we don't want to do all the time during the Gammapy documentation build). In each case, there should be a Python script to generate the image. * The ``experiments`` and ``checks`` folders contain Python scripts and notebooks with, well, experiments and checks by Gammapy developers. Some are still work in progress and of interest, most could probably be deleted. * The ``logo`` folder contains the Gammapy logo and banner in a few different variants. * The ``posters`` and ``presentations`` folders contain a few Gammapy posters and presentations, for cases where the poster or presentation isn't available somewhere else on the web. It's hugely incomplete and probably not very useful as-is, and we should discuss if this is useful at all, and if yes, how we want to maintain it. Other repositories ================== Performance benchmarks for Gammapy: * https://github.com/gammapy/gammapy-benchmarks Data from tutorials sometimes accesses files here: * https://github.com/gammapy/gamma-cat * https://github.com/gammapy/gammapy-fermi-lat-data Information from meetings is here: * https://github.com/gammapy/gammapy-meetings Gammapy webpages ================ There are two webpages for Gammapy: http://gammapy.org and http://docs.gammapy.org. In addition, we have Binder set up to allow users to try Gammapy in the browser. gammapy.org ----------- https://gammapy.org/ is a small landing page for the Gammapy project. The page shown there is a static webpage served via GitHub pages. To update it, edit the HTML and CSS files in the `gammapy-webpage GitHub repository `__ and then make a pull request against the default branch for that repo, called ``gh-pages``. Once it's merged, the webpage at https://gammapy.org/ usually updates within less than a minute. docs.gammapy.org ---------------- https://docs.gammapy.org/ contains most of the documentation for Gammapy, including information about Gammapy, release notes, tutorials, ... The dev version of the docs is built and updated with an automated GitHub action for every pull request merged in the `gammapy` GitHub code repository. All the docs are versioned, and each version of the docs is placed in its dedicated version-labelled folder. It is recommended to build the docs locally before each release to identify and fix possible Sphinx warnings from badly formatted RST files or failing Python scripts used to display figures. Gammapy Binder ============== We have set up https://mybinder.org/ for each released version of Gammapy, which allows users to execute the notebooks present in the versioned docs within the web browser, without having to install software or download data to their local machine. This can be useful for people to get started, and for tutorials. Every HTML-fixed version of the notebooks that you can find in the :ref:`tutorials` section has a link to Binder that allows you to execute the tutorial in the myBinder cloud infrastructure. myBinder provides versioned virtual environments coupled with every release. The myBinder docker image is created using the ``Dockerfile`` and ``binder.py`` files placed in the master branch of the `gammapy-webpage GitHub repository `__. The Dockerfile makes the Docker image used by Binder running some linux commands to install base-packages and copy the notebooks and datasets needed. It executes ``binder.py`` to conda install Gammapy dependencies listed in the environment YAML published within the versioned documentation. 
Continuous integration ====================== We are running various builds as `GitHub actions workflows for CI `__. Code quality ============ * Code coverage: https://coveralls.io/r/gammapy/gammapy * Code quality: https://lgtm.com/projects/g/gammapy/gammapy/context:python * Codacy: https://app.codacy.com/gh/gammapy/gammapy/dashboard To run all tests and measure coverage, type the command ``make test-cov``: .. code-block:: text $ make test-cov Releases ======== At this time, making a Gammapy release is a sequence of steps to execute in the command line and on some webpages, which is fully documented in this checklist: :ref:`dev-release`. It is difficult to automate this procedure more, but it is already pretty straightforward and quick to do. If all goes well, making a release takes about 1 hour of human time and one or two days of real time, with the building of the conda binary packages being the slowest step, something we wait for before announcing a new release to users (because many use conda and will try to update as soon as they get the announcement email). * Source distribution releases: https://pypi.org/project/gammapy/ * Binary conda packages for Linux, Mac and Windows: https://github.com/conda-forge/gammapy-feedstock gammapy-1.3/docs/getting-started/environments.rst .. _virtual-envs: Virtual Environments ==================== We recommend creating an isolated virtual environment for each version of Gammapy, so that you have full control over the additional packages that you may use in your analysis. This will also help you improve the reproducibility of your work within the user community. Conda Environments ------------------ For convenience we provide, for each stable release of Gammapy, a pre-defined conda environment file, so you can get additional useful packages together with Gammapy in an isolated virtual environment. First install `Miniconda `__ and then just execute the following commands in the terminal: .. code-block:: bash curl -O https://gammapy.org/download/install/gammapy-|release|-environment.yml conda env create -f gammapy-|release|-environment.yml .. note:: On Windows, you have to open up the conda environment file and delete the lines with ``sherpa`` and ``healpy``. Those are optional dependencies that currently aren't available on Windows. Once the environment has been created you can activate it using: .. code-block:: bash conda activate gammapy-|release| To create a new custom environment for your analysis with conda you can use: .. code-block:: bash conda create -n my-gammapy-analysis And activate it: .. code-block:: bash conda activate my-gammapy-analysis After that you can install Gammapy using `conda` / `mamba`, as well as other packages you may need. .. code-block:: bash conda install gammapy ipython jupyter To leave the environment, you may activate another one or just type: .. code-block:: bash conda deactivate If you want to remove a virtual environment again, you can use the command below: .. code-block:: bash conda env remove -n my-gammapy-analysis It is also recommended to create a custom `environment.yaml` file, which explicitly lists all the dependencies and additional packages you would like to use.
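A minimal example of such a file is shown below; the environment name and the pinned versions are placeholders to adapt to your own analysis:

.. code-block:: yaml

    name: my-gammapy-analysis
    channels:
      - conda-forge
    dependencies:
      - python=3.11
      - gammapy=1.3
      - ipython
      - jupyter

It can then be used with ``conda env create -f environment.yaml``, in the same way as the pre-defined environment files above.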
More detailed instructions on how to work with conda environments you can find in the `conda documentation `__. Venv Environments ----------------- You may prefer to create your virtual environments with Python `venv` command instead of using Anaconda. To create a virtual environment with `venv` (Python 3.5+ required) run the command: .. code-block:: bash python -m venv my-gammapy-analysis which will create one in a `my-gammapy-analysis` folder. To activate it: .. code-block:: bash ./my-gammapy-analysis/bin/activate After that you can install Gammapy using `pip` as well as other packages you may need. .. code-block:: bash pip install gammapy ipython jupyter To leave the environment, you may activate another one or just type: .. code-block:: bash deactivate More detailed instructions on how to work with virtual environments you can find in the `Python documentation `__. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/getting-started/index.rst0000644000175100001770000001416014721316200020311 0ustar00runnerdocker.. include:: ../references.txt .. _getting-started: =============== Getting started =============== .. toctree:: :hidden: install environments usage troubleshooting Installation ------------ There are various ways for users to install Gammapy. **We recommend setting up a virtual environment using either conda or mamba.** Here are two methods to quickly install Gammapy. .. grid:: 1 2 2 2 :gutter: 2 .. grid-item-card:: Working with conda? Gammapy can be installed with `Anaconda `__: .. code-block:: bash conda install -c conda-forge gammapy .. grid-item-card:: Prefer pip? Gammapy can be installed via pip from `PyPI `__. .. code-block:: bash pip install gammapy .. grid-item-card:: In-depth instructions? :columns: 12 Update existing version? Working with virtual environments? Installing a specific version? Check the advanced installation page. +++ .. button-ref:: installation :ref-type: ref :click-parent: :color: secondary :expand: Learn more .. include:: quickstart.rst Tutorials Overview ------------------ .. accordion-header:: :id: collapseOne :title: How to access gamma-ray data :link: ../tutorials/data/cta.html Gammapy can read and access data from multiple gamma-ray instruments. Data from Imaging Atmospheric Cherenkov Telescopes, such as `CTAO`_, `H.E.S.S.`_, `MAGIC`_ and `VERITAS`_, is typically accessed from the **event list data level**, called "DL3". This is most easily done using the `~gammapy.data.DataStore` class. In addition data can also be accessed from the **level of binned events and pre-reduced instrument response functions**, so called "DL4". This is typically the case for `Fermi-LAT`_ data or data from Water Cherenkov Observatories. This data can be read directly using the `~gammapy.maps.Map` and `~gammapy.irf.core.IRFMap` classes. :bdg-link-primary:`CTAO data tutorial <../tutorials/data/cta.html>` :bdg-link-primary:`HESS data tutorial <../tutorials/data/hess.html>` :bdg-link-primary:`Fermi-LAT data tutorial <../tutorials/data/fermi_lat.html>` .. accordion-footer:: .. accordion-header:: :id: collapseTwo :title: How to compute a 1D spectrum :link: ../tutorials/analysis-1d/spectral_analysis.html Gammapy lets you create a 1D spectrum by defining an analysis region in the sky and energy binning using `~gammapy.maps.RegionGeom` object. The **events and instrument response are binned** into `~gammapy.maps.RegionNDMap` and `~gammapy.irf.IRFMap` objects. 
In addition, you can choose to estimate the background from data using e.g. a **reflected regions method**. Flux points can be computed using the `~gammapy.estimators.FluxPointsEstimator`. .. image:: ../_static/1d-analysis-image.png :width: 100% | :bdg-link-primary:`1D analysis tutorial <../tutorials/analysis-1d/spectral_analysis.html>` :bdg-link-primary:`1D analysis tutorial with point-like IRFs <../tutorials/analysis-1d/spectral_analysis_rad_max.html>` :bdg-link-primary:`1D analysis tutorial of extended sources <../tutorials/analysis-1d/extended_source_spectral_analysis.html>` .. accordion-footer:: .. accordion-header:: :id: collapseThree :title: How to compute a 2D image :link: ../tutorials/index.html#d-image Gammapy treats 2D maps as 3D cubes with one bin in energy. The computation of 2D images can be done following a 3D analysis with one energy bin and a fixed spectral index, or following the classical ring background estimation. .. image:: ../_static/2d-analysis-image.png :width: 100% | :bdg-link-primary:`2D analysis tutorial <../tutorials/analysis-2d/modeling_2D.html>` :bdg-link-primary:`2D analysis tutorial with ring background <../tutorials/analysis-2d/ring_background.html>` .. accordion-footer:: .. accordion-header:: :id: collapseFour :title: How to compute a 3D cube :link: ../tutorials/analysis-3d/analysis_3d.html Gammapy lets you perform a combined spectral and spatial analysis as well. In jargon, this is sometimes called a "cube analysis". Based on the 3D data reduction, Gammapy can also simulate events. Flux points can be computed using the `~gammapy.estimators.FluxPointsEstimator`. .. image:: ../_static/3d-analysis-image.png :width: 100% | :bdg-link-primary:`3D analysis tutorial <../tutorials/analysis-3d/analysis_3d.html>` :bdg-link-primary:`3D analysis tutorial with event sampling <../tutorials/analysis-3d/event_sampling.html>` .. accordion-footer:: .. accordion-header:: :id: collapseFive :title: How to compute a lightcurve :link: ../tutorials/analysis-time/light_curve.html Gammapy allows you to compute light curves in various ways. Light curves can be computed for a **1D or 3D analysis scenario** (see above) by either grouping or splitting the DL3 data into multiple time intervals. Grouping multiple observations allows for computing e.g. **monthly or nightly light curves**, while splitting a single observation allows computing **light curves for flares**. You can also compute light curves in multiple energy bands. In all cases the light curve is computed using the `~gammapy.estimators.LightCurveEstimator`. :bdg-link-primary:`Light curve tutorial <../tutorials/analysis-time/light_curve.html>` :bdg-link-primary:`Light curve tutorial for flares <../tutorials/analysis-time/light_curve_flare.html>` .. accordion-footer:: .. accordion-header:: :id: collapseSix :title: How to combine data from multiple instruments :link: ../tutorials/analysis-3d/analysis_mwl.html Gammapy offers the possibility to **combine data from multiple instruments** in a "joint-likelihood" fit. This can be done at **multiple data levels** and with independent dimensionality of the data. Gammapy can handle 1D and 3D datasets at the same time, and can also include e.g. flux points in a combined likelihood fit. :bdg-link-primary:`Combined 1D / 3D analysis tutorial <../tutorials/analysis-3d/analysis_mwl.html>` :bdg-link-primary:`SED fitting tutorial <../tutorials/analysis-1d/sed_fitting.html>` ..
accordion-footer:: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/getting-started/install.rst0000644000175100001770000000761714721316200020661 0ustar00runnerdocker.. include:: ../references.txt .. _installation: Installation ============ There are numerous ways to install Python and Gammapy as a user. On this page, we list the most common ones. In general, **we recommend using** :ref:`virtual environments ` when using Gammapy. This way you have complete control over the additional packages that you may use in your analysis and you work with well defined computing environments. This enables you to easily share your work and ensure **reproducibility of your scientific analysis results**. You can also :ref:`install Gammapy for development `. .. _anaconda: Using Anaconda / Miniconda -------------------------- The easiest way to install Gammapy is to install the `Anaconda `__ or `Miniconda `__ Python distribution. To install the latest stable version of Gammapy and its dependencies using conda, execute this command in a terminal: .. code-block:: bash conda install -c conda-forge gammapy To update an existing installation you can use: .. code-block:: bash conda update gammapy To install a specific version of Gammapy just execute: .. code-block:: bash conda install -c conda-forge gammapy=1.0 If you encounter any issues you can check the :ref:`troubleshoot` guide. Using Mamba ----------- Alternatively, you can use `Mamba `__ for the installation. Mamba is an alternative package manager that supports most of conda’s commands but offers higher installation speed and more reliable environment solutions. To install ``mamba`` in the Conda base environment: .. code-block:: bash conda install mamba -n base -c conda-forge Then install Gammapy through: .. code-block:: bash mamba install gammapy Mamba supports the same commands available in conda. Therefore, updating and installing specific versions follows the same process as above, just simply replace the ``conda`` command with the ``mamba`` command. .. _install-pip: Using pip --------- To install the latest Gammapy **stable** version (see `Gammapy page on PyPI`_) using `pip`_: .. code-block:: bash python -m pip install gammapy This will install Gammapy with the required dependencies only. To install Gammapy with all optional dependencies, you can specify: .. code-block:: bash python -m pip install gammapy[all] To update an existing installation use: .. code-block:: bash python -m pip install gammapy --upgrade To install a specific version of Gammapy use: .. code-block:: bash python -m pip install gammapy==1.0 To install the current Gammapy **development** version with `pip`_ use: .. code-block:: bash python -m pip install git+https://github.com/gammapy/gammapy.git#egg=gammapy If you want to study or edit the code locally, use the following: .. code-block:: bash git clone https://github.com/gammapy/gammapy.git cd gammapy python -m pip install . If you encounter any issues you can check the :ref:`troubleshoot` guide. .. _install-other: Using other package managers ---------------------------- Gammapy has been packaged for some Linux package managers. E.g. on Debian, you can install Gammapy via: .. code-block:: bash sudo apt-get install python3-gammapy To get a more fully featured scientific Python environment, you can install other Python packages using the system package manager (``apt-get`` in this example), and then on top of this install more Python packages using ``pip``. Example: .. 
code-block:: bash sudo apt-get install \ python3-pip python3-matplotlib \ ipython3-notebook python3-gammapy python3 -m pip install gammapy Note that the Linux package managers typically have release cycles of 6 months, or yearly or longer, meaning that you'll typically get an older version of Gammapy. However, you can always get the recent version via `pip` or `conda` (see above). ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/getting-started/quickstart.rst0000644000175100001770000000556614721316200021406 0ustar00runnerdocker.. _quickstart-setup: Recommended Setup ----------------- We recommend using :ref:`virtual environments `, to do so execute the following commands in the terminal: .. substitution-code-block:: console curl -O https://gammapy.org/download/install/gammapy-|release|-environment.yml conda env create -f gammapy-|release|-environment.yml .. note:: On Windows, you have to open up the conda environment file and delete the lines with ``sherpa`` and ``healpy``. Those are optional dependencies that currently aren't available on Windows. .. note:: To avoid some installation issues, sherpa is not part of the environment file provided. You can nevertheless install `sherpa` in your environment using `python -m pip install sherpa`. The best way to get started and learn Gammapy are the :ref:`tutorials`. You can download the Gammapy tutorial notebooks and the example datasets. The total size to download is ~180 MB. Select the location where you want to install the datasets and proceed with the following commands: .. substitution-code-block:: console conda activate gammapy-|release| gammapy download notebooks gammapy download datasets conda env config vars set GAMMAPY_DATA=$PWD/gammapy-datasets/|release| conda activate gammapy-|release| The last conda commands will define the environment variable within the conda environment. Conversely, you might want to define the ``$GAMMAPY_DATA`` environment variable directly in your shell with: .. substitution-code-block:: console export GAMMAPY_DATA=$PWD/gammapy-datasets/|release| .. note:: If you are not using the ``bash`` shell, handling of shell environment variables might be different, e.g. in some shells the command to use is ``set`` or something else instead of ``export``, and also the profile setup file will be different. On Windows, you should set the ``GAMMAPY_DATA`` environment variable in the "Environment Variables" settings dialog, as explained e.g. `here `__. Jupyter ------- Once you have activated your gammapy environment you can start a notebook server by executing:: cd notebooks jupyter notebook Another option is to utilise the ipykernel functionality of Jupyter Notebook, which allows you to choose a kernel from a predefined list. To add kernels to the list, use the following command lines: .. substitution-code-block:: console conda activate gammapy-|release| python -m ipykernel install --user --name gammapy-|release| --display-name "gammapy-|release|" To make use of it, simply choose it as your kernel when launching `jupyter lab` or `jupyter notebook`. If you are new to conda, Python and Jupyter, it is recommended to also read the :ref:`using-gammapy` guide. If you encounter any issues you can check the :ref:`troubleshoot` guide. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/getting-started/troubleshooting.rst0000644000175100001770000000327214721316200022433 0ustar00runnerdocker.. 
include:: ../references.txt .. _troubleshoot: Troubleshooting =============== Check your setup ---------------- To access detailed information about your Gammapy installation, you can execute the following command. It will provide insight into the installation directly to the terminal. .. code-block:: bash gammapy info If you encounter some issues, the following commands can help you in troubleshooting your setup: .. code-block:: bash conda info which python which ipython which jupyter which gammapy env | grep PATH python -c 'import gammapy; print(gammapy); print(gammapy.__version__)' You can also use the following commands to check which conda environment is active and to list all available environments: .. code-block:: bash conda info conda env list For those that are new to conda, you can consult the `conda cheat sheet`_, which lists the common commands for installing packages and working with environments. Install issues -------------- If you're experiencing issues and believe you are using the incorrect Python or Gammapy version, or if you encounter problems importing Gammapy, you should check your Python executable and import path to help you resolve the issue: .. code-block:: python import sys print(sys.executable) print(sys.path) To check your Gammapy version use the following commands: .. code-block:: python import gammapy print(gammapy) print(gammapy.__version__) You should now be ready to start using Gammapy. Let's move on to the :ref:`tutorials`. Help!? ------ If you have any questions or issues, please ask for help on the Gammapy Slack, mailing list or on GitHub (see `Gammapy contact`_). ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/getting-started/usage.rst0000644000175100001770000001072014721316200020304 0ustar00runnerdocker.. include:: ../references.txt .. _using-gammapy: Using Gammapy ============= To use Gammapy you need a basic knowledge of Python, Numpy, Astropy, as well as matplotlib for plotting. Many standard gamma-ray analyses can be done with a few lines of configuration and code, so you can get pretty far by copy and pasting and adapting the working examples from the Gammapy documentation. But eventually, if you want to script more complex analyses, or inspect analysis results or intermediate analysis products, you need to acquire a basic to intermediate Python skill level. Jupyter notebooks ----------------- To learn more about Gammapy, and also for interactive data analysis in general, we recommend you use Jupyter notebooks. Assuming you have followed the steps above to install Gammapy and activate the conda environment, you can start `JupyterLab`_ like this: .. code-block:: bash jupyter lab This should open up JupyterLab app in your web browser, where you can create new Jupyter notebooks or open up existing ones. If you have downloaded the tutorials with ``gammapy download tutorials``, you can browse your ``gammapy-tutorials`` folder with Jupyterlab and execute them there. If you haven't used Jupyter before, try typing ``print("Hello Jupyter")`` in the first input cell, and use the keyboard shortcut ``SHIFT + ENTER`` to execute it. Python ------ Gammapy is a Python package, so you can of course import and use it from Python: .. code-block:: bash python Python 3.6.0 | packaged by conda-forge | (default, Feb 10 2017, 07:08:35) [GCC 4.2.1 Compatible Apple LLVM 7.3.0 (clang-703.0.31)] on darwin Type "help", "copyright", "credits" or "license" for more information. 
>>> from gammapy.stats import CashCountsStatistic >>> CashCountsStatistic(n_on=10, mu_bkg=4.2).sqrt_ts np.float64(2.397918129147546) IPython ------- IPython is nicer to use for interactive analysis: .. code-block:: bash ipython Python 3.6.0 | packaged by conda-forge | (default, Feb 10 2017, 07:08:35) Type 'copyright', 'credits' or 'license' for more information IPython 6.5.0 -- An enhanced Interactive Python. Type '?' for help. In [1]: from gammapy.stats import CashCountsStatistic In [2]: CashCountsStatistic(n_on=10, mu_bkg=4.2).sqrt_ts Out[2]: array([2.39791813]) For example you can use ``?`` to look up **help for any Gammapy function, class or method** from IPython: .. code-block:: bash In [3]: CashCountsStatistic? Of course, you can also use the Gammapy online docs if you prefer, clicking in links (i.e. `gammapy.stats.CashCountsStatistic`) or using *Search the docs* field in the upper left. As an example, here's how you can create `gammapy.data.DataStore` and `gammapy.data.EventList` objects and start exploring H.E.S.S. data: .. testcode:: from gammapy.data import DataStore data_store = DataStore.from_dir('$GAMMAPY_DATA/hess-dl3-dr1/') events = data_store.obs(obs_id=23523).events print(events) .. testoutput:: EventList --------- Instrument : H.E.S.S. Phase I Telescope : HESS Obs. ID : 23523 Number of events : 7613 Event rate : 4.513 1 / s Time start : 53343.92234009259 Time stop : 53343.94186555556 Min. energy : 2.44e-01 TeV Max. energy : 1.01e+02 TeV Median energy : 9.53e-01 TeV Max. offset : 58.0 deg Try to make your first plot using the `gammapy.data.EventList.peek` helper method: .. code-block:: import matplotlib.pyplot as plt events.peek() plt.savefig("events.png") Python scripts -------------- Another common way to use Gammapy is to write a Python script. Try it by putting the following code into a file called ``example.py``: .. testcode:: """Example Python script using Gammapy""" from gammapy.data import DataStore data_store = DataStore.from_dir('$GAMMAPY_DATA/hess-dl3-dr1/') events = data_store.obs(obs_id=23523).events print(events.energy.mean()) .. testoutput:: 4.418007850646973 TeV You can run it with Python: .. code-block:: bash python example.py 4.418007850646973 TeV If you want to continue with interactive data or results analysis after running some Python code, use IPython like this: .. code-block:: bash ipython -i example.py For examples how to run Gammapy analyses from Python scripts, see :doc:`/tutorials/scripts/survey_map`. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/index.rst0000644000175100001770000000640414721316200015206 0ustar00runnerdocker.. include:: references.txt .. image:: _static/gammapy_banner.png :width: 400px | Gammapy ------- **Date**: |today| **Version**: |version| **Useful links**: `Web page `__ | `Recipes `__ | `Discussions `__ | `Acknowledging `__ | `Contact `__ Gammapy is a community-developed, open-source Python package for gamma-ray astronomy built on Numpy, Scipy and Astropy. **It is the core library for the** `CTAO`_ **Science Tools** but can also be used to analyse data from existing imaging atmospheric Cherenkov telescopes (IACTs), such as `H.E.S.S.`_, `MAGIC`_ and `VERITAS`_. It also provides some support for `Fermi-LAT`_ and `HAWC`_ data analysis. .. grid:: 1 2 2 2 :gutter: 2 :class-container: sd-text-center .. grid-item-card:: Getting started :img-top: _static/index_getting_started.svg :class-card: intro-card :columns: 12 6 6 6 :shadow: md New to *Gammapy*? 
.. include:: references.txt

.. image:: _static/gammapy_banner.png
    :width: 400px

|

Gammapy
-------

**Date**: |today| **Version**: |version|

**Useful links**:
`Web page `__ |
`Recipes `__ |
`Discussions `__ |
`Acknowledging `__ |
`Contact `__

Gammapy is a community-developed, open-source Python package for gamma-ray
astronomy built on Numpy, Scipy and Astropy. **It is the core library for
the** `CTAO`_ **Science Tools** but can also be used to analyse data from
existing imaging atmospheric Cherenkov telescopes (IACTs), such as
`H.E.S.S.`_, `MAGIC`_ and `VERITAS`_. It also provides some support for
`Fermi-LAT`_ and `HAWC`_ data analysis.

.. grid:: 1 2 2 2
    :gutter: 2
    :class-container: sd-text-center

    .. grid-item-card:: Getting started
        :img-top: _static/index_getting_started.svg
        :class-card: intro-card
        :columns: 12 6 6 6
        :shadow: md

        New to *Gammapy*? Check out the getting started documents. They
        contain information on how to install and start using *Gammapy* on
        your local desktop computer.

        +++

        .. button-ref:: getting-started
            :ref-type: ref
            :click-parent:
            :color: secondary
            :expand:

            To the quickstart docs

    .. grid-item-card:: User guide
        :img-top: _static/index_user_guide.svg
        :class-card: intro-card
        :columns: 12 6 6 6
        :shadow: md

        The user guide provides in-depth information on the key concepts of
        Gammapy with useful background information and explanation, as well
        as tutorials in the form of Jupyter notebooks.

        +++

        .. button-ref:: user_guide
            :ref-type: ref
            :click-parent:
            :color: secondary
            :expand:

            To the user guide

    .. grid-item-card:: API reference
        :img-top: _static/index_api.svg
        :class-card: intro-card
        :columns: 12 6 6 6
        :shadow: md

        The reference guide contains a detailed description of the Gammapy
        API. The reference describes how the methods work and which
        parameters can be used. It assumes that you have an understanding of
        the key concepts.

        +++

        .. button-ref:: api-ref
            :ref-type: ref
            :click-parent:
            :color: secondary
            :expand:

            To the reference guide

    .. grid-item-card:: Developer guide
        :img-top: _static/index_contribute.svg
        :class-card: intro-card
        :columns: 12 6 6 6
        :shadow: md

        Saw a typo in the documentation? Want to improve existing
        functionalities? The contributing guidelines will guide you through
        the process of improving Gammapy.

        +++

        .. button-ref:: dev_intro
            :ref-type: ref
            :click-parent:
            :color: secondary
            :expand:

            To the developer guide

.. toctree::
    :titlesonly:
    :hidden:

    getting-started/index
    user-guide/index
    tutorials/index
    api-reference/index
    development/index
    release-notes/index

@ECHO OFF

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
    set SPHINXBUILD=sphinx-build
)
set BUILDDIR=_build
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
if NOT "%PAPER%" == "" (
    set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
)

if "%1" == "" goto help

if "%1" == "help" (
    :help
    echo.Please use `make ^<target^>` where ^<target^> is one of
    echo.  html       to make standalone HTML files
    echo.  dirhtml    to make HTML files named index.html in directories
    echo.  singlehtml to make a single large HTML file
    echo.  pickle     to make pickle files
    echo.  json       to make JSON files
    echo.  htmlhelp   to make HTML files and a HTML help project
    echo.  qthelp     to make HTML files and a qthelp project
    echo.  devhelp    to make HTML files and a Devhelp project
    echo.  epub       to make an epub
    echo.  latex      to make LaTeX files, you can set PAPER=a4 or PAPER=letter
    echo.  text       to make text files
    echo.  man        to make manual pages
    echo.  changes    to make an overview over all changed/added/deprecated items
    echo.  linkcheck  to check all external links for integrity
    echo.  doctest    to run all doctests embedded in the documentation if enabled
    goto end
)

if "%1" == "clean" (
    for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
    del /q /s %BUILDDIR%\*
    del /q /s api
    del /q /s generated
    goto end
)

if "%1" == "html" (
    %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The HTML pages are in %BUILDDIR%/html.
    goto end
)

if "%1" == "dirhtml" (
    %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
    goto end
)

if "%1" == "singlehtml" (
    %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
    goto end
)

if "%1" == "pickle" (
    %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; now you can process the pickle files.
    goto end
)

if "%1" == "json" (
    %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; now you can process the JSON files.
    goto end
)

if "%1" == "htmlhelp" (
    %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
    goto end
)

if "%1" == "qthelp" (
    %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
    echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Astropy.qhcp
    echo.To view the help file:
    echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Astropy.ghc
    goto end
)

if "%1" == "devhelp" (
    %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished.
    goto end
)

if "%1" == "epub" (
    %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The epub file is in %BUILDDIR%/epub.
    goto end
)

if "%1" == "latex" (
    %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
    goto end
)

if "%1" == "text" (
    %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The text files are in %BUILDDIR%/text.
    goto end
)

if "%1" == "man" (
    %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
    if errorlevel 1 exit /b 1
    echo.
    echo.Build finished. The manual pages are in %BUILDDIR%/man.
    goto end
)

if "%1" == "changes" (
    %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
    if errorlevel 1 exit /b 1
    echo.
    echo.The overview file is in %BUILDDIR%/changes.
    goto end
)

if "%1" == "linkcheck" (
    %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
    if errorlevel 1 exit /b 1
    echo.
    echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
    goto end
)

if "%1" == "doctest" (
    %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
    if errorlevel 1 exit /b 1
    echo.
    echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
    goto end
)

:end

# This is needed for `astropy.io.fits.HDUList` sub-classes
# TODO: resolve correctly
py:class astropy.io.fits.hdu.hdulist.HDUList

# For some reason links for astropy.io.fits don't work
# TODO: resolve correctly
#py:obj astropy.io.fits.BinTableHDU
#py:obj astropy.io.fits.ImageHDU

# This is needed for modern-style classes (sub-classes of the Python `object`)
py:class object

# This is needed for Python `list` sub-classes
py:class list

..
_uncertainties: http://pythonhosted.org/uncertainties/ .. _scikit-image: http://scikit-image.org .. _scikit-learn: http://scikit-learn.org/stable/ .. _scipy: https://docs.scipy.org/doc/scipy/reference/ .. _Scipy Lecture Notes: http://www.scipy-lectures.org/ .. _Practical Python for Astronomers Tutorial: http://python4astronomers.github.io/ .. _GammaLib: http://cta.irap.omp.eu/gammalib-devel/ .. _ctools: http://cta.irap.omp.eu/ctools .. _3ML: https://github.com/threeML/threeML .. _numpy: http://www.numpy.org/ .. _affiliated package: http://www.astropy.org/affiliated/index.html .. _regions: https://astropy-regions.readthedocs.io .. _reproject: https://reproject.readthedocs.io .. _photutils: https://photutils.readthedocs.io .. _astroplan: https://astroplan.readthedocs.io .. _iminuit: https://github.com/iminuit/iminuit .. _probfit: https://github.com/iminuit/probfit .. _h5py: http://www.h5py.org/ .. _naima: https://github.com/zblz/naima .. _ROOT: https://root.cern.ch/drupal/ .. _PyROOT: https://root.cern.ch/drupal/content/pyroot .. _rootpy: http://www.rootpy.org/ .. _Sherpa: http://cxc.cfa.harvard.edu/sherpa/ .. _CIAO: http://cxc.cfa.harvard.edu/ciao/ .. _Chandra X-ray observatory (CXC): http://cxc.harvard.edu/ .. _aplpy: http://aplpy.github.io .. _matplotlib: http://matplotlib.org .. _pandas: http://pandas.pydata.org .. _pip: https://pip.pypa.io/en/stable/ .. _pip install instructions: https://pip.pypa.io/en/latest/installing.html#install-pip .. _conda: https://conda.io/docs/ .. _conda cheat sheet: https://docs.conda.io/projects/conda/en/latest/user-guide/cheatsheet.html .. _PyFACT: https://ui.adsabs.harvard.edu/abs/2012AIPC.1505..789R .. _healpy: https://healpy.readthedocs.io/en/latest/ .. _HEALPix: https://en.wikipedia.org/wiki/HEALPix .. _PyYAML: https://pypi.org/project/PyYAML/ .. _YAML: https://en.wikipedia.org/wiki/YAML .. _jsonschema: https://python-jsonschema.readthedocs.io .. _json-schema.org: https://json-schema.org/ .. _emcee: http://dan.iel.fm/emcee/current/ .. _ctapipe: https://github.com/cta-observatory/ctapipe .. _click: http://click.pocoo.org/ .. _sphinx-click: https://github.com/click-contrib/sphinx-click .. _pycrflux: https://github.com/akira-okumura/pycrflux .. _VHEObserverTools: https://github.com/kialio/VHEObserverTools .. _PINT: https://github.com/nanograv/PINT .. _Gamera: https://github.com/JoachimHahn/GAMERA .. _gammatools: https://github.com/woodmd/gammatools .. _FLaapLUC: https://github.com/jlenain/flaapluc .. _pointlike: https://github.com/tburnett/Fermi-LAT .. _NaN: https://en.wikipedia.org/wiki/NaN .. _Python: https://www.python.org .. _corner: https://corner.readthedocs.io .. _gammapy-0.13-environment.yml: https://gammapy.org/download/install/gammapy-0.13-environment.yml .. _MPIK Heidelberg: https://www.mpi-hd.mpg.de/mpi/en/start/ .. _APC Paris: http://www.apc.univ-paris7.fr/ .. _Paris Observatory: https://www.obspm.fr .. _Python for gamma-ray astronomy 2015: http://gammapy.github.io/PyGamma15/ .. _CTAO: https://www.ctao.org/ .. _H.E.S.S.: https://www.mpi-hd.mpg.de/hfm/HESS/ .. _HAWC: https://www.hawc-observatory.org/ .. _VERITAS: https://veritas.sao.arizona.edu/ .. _MAGIC: https://magic.mpp.mpg.de/ .. _Fermi-LAT: https://fermi.gsfc.nasa.gov/ .. _Fermi ScienceTools: https://fermi.gsfc.nasa.gov/ssc/data/analysis/software/ .. _FermiPy: https://github.com/fermiPy/fermipy .. _gadf: https://gamma-astro-data-formats.readthedocs.io/ .. _Gamma Astro Data Formats: https://gamma-astro-data-formats.readthedocs.io/ .. 
_Very-high-energy Open Data Format: https://vodf.readthedocs.io/en/latest/index.html .. _April 2016 IACT data meeting: https://github.com/open-gamma-ray-astro/2016-04_IACT_DL3_Meeting/ .. _Werner Hofmann: https://en.wikipedia.org/wiki/Werner_Hofmann_(physicist) .. _Jim Hinton: https://www.mpi-hd.mpg.de/mpi/en/hinton/home/ .. _Martin Raue: https://twitter.com/_rmrtn .. _Christoph Deil: https://github.com/cdeil .. _Google Summer of Code: https://summerofcode.withgoogle.com/ .. _Manuel Paz Arribas GSoC 2015 on observation handling and cube background modeling: http://mapaz-gsoc2015-blog.blogspot.de/ .. _Olga Vorokh GSoC 2016 on image analysis and source detection: http://alcyonegammapy.blogspot.de/ .. _Gammapy poster and proceeding at ICRC 2015: https://indico.cern.ch/event/344485/session/142/contribution/695 .. _First Gammapy coding sprint: https://github.com/gammapy/gammapy/wiki/PyGamma-coding-sprint-1 .. _Gammapy poster at Gamma 2016: https://github.com/gammapy/gamma2016-gammapy-poster/blob/master/gamma2016-gammapy-poster.pdf .. _First Gammapy presentation at a CTA meeting: https://github.com/gammapy/gammapy-extra/blob/master/presentations/2016-05-16_CTA_Meeting_Gammapy.pdf .. _Open TeV source catalog: https://github.com/gammapy/gamma-cat .. _gamma-sky.net: http://gamma-sky.net/ .. _Gammapy binder: http://mybinder.org/repo/gammapy/gammapy-extra .. _Gammapy mailing list: https://groups.google.com/forum/#!forum/gammapy .. _Gammapy Github page: https://github.com/gammapy/gammapy .. _gammapy-webpage: https://github.com/gammapy/gammapy-webpage .. _Gammapy documentation: https://docs.gammapy.org/ .. _Gammapy contact: https://gammapy.org/contact.html .. _Gammapy project summary on Open HUB: https://www.openhub.net/p/gammapy .. _Gammapy page on PyPI: https://pypi.org/project/gammapy .. _Gammapy contributors page on Github: https://github.com/gammapy/gammapy/graphs/contributors .. _Gammapy on Slack: https://gammapy.slack.com .. _Slack: https://slack.com .. _PyPI: https://pypi.org .. _scientific Python stack: https://www.scipy.org/about.html .. _ctobssim: http://cta.irap.omp.eu/ctools-devel/reference_manual/ctobssim.html .. _gtpsf: https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/help/gtpsf.txt .. _rsync: https://en.wikipedia.org/wiki/Rsync .. _PHA: https://heasarc.gsfc.nasa.gov/docs/heasarc/ofwg/docs/spectra/ogip_92_007/node5.html .. _ARF: https://heasarc.gsfc.nasa.gov/docs/heasarc/caldb/docs/memos/cal_gen_92_002/cal_gen_92_002.html#tth_sEc4 .. _RMF: https://heasarc.gsfc.nasa.gov/docs/heasarc/caldb/docs/memos/cal_gen_92_002/cal_gen_92_002.html#tth_sEc3.1 .. _Github clone URL help article: https://help.github.com/articles/which-remote-url-should-i-use/ .. _ipython: https://ipython.org .. _Jupyter: https://jupyter.org .. _JupyterLab: https://jupyterlab.readthedocs.io/ .. _nbsphinx: https://nbsphinx.readthedocs.io .. _Installing Python Packages from a Jupyter Notebook: http://jakevdp.github.io/blog/2017/12/05/installing-python-packages-from-jupyter/ .. _apt-get: https://en.wikipedia.org/wiki/Advanced_Packaging_Tool .. _yum: https://en.wikipedia.org/wiki/Yellowdog_Updater,_Modified .. _Macports: https://www.macports.org/ .. _Homebrew: https://brew.sh/ .. _Fermi-LAT time systems in a nutshell: https://fermi.gsfc.nasa.gov/ssc/data/analysis/documentation/Cicerone/Cicerone_Data/Time_in_ScienceTools.html .. _FTOOLS: https://heasarc.gsfc.nasa.gov/ftools/ftools_menu.html .. _XSpec: https://heasarc.gsfc.nasa.gov/xanadu/xspec/ .. 
_XSpec manual statistics page: https://heasarc.gsfc.nasa.gov/xanadu/xspec/manual/XSappendixStatistics.html .. _Sherpa statistics page: https://cxc.cfa.harvard.edu/sherpa/statistics .. _XML schema: https://en.wikipedia.org/wiki/XML_schema .. _setuptools console_scripts entry point: https://python-packaging.readthedocs.io/en/latest/command-line-scripts.html .. _pydantic: https://pydantic-docs.helpmanual.io .. _A Whirlwind tour of Python: https://nbviewer.org/github/jakevdp/WhirlwindTourOfPython/blob/master/Index.ipynb .. _Python data science handbook: https://jakevdp.github.io/PythonDataScienceHandbook/ .. _Astropy Hands-On Tutorial: https://github.com/Asterics2020-Obelics/School2019/tree/master/astropy .. _FAIR principles: https://www.go-fair.org/fair-principles/ .. _FAIR4RS principles: https://www.rd-alliance.org/group_output/fair-principles-for-research-software-fair4rs-principles/ .. _IVOA: https://ivoa.net/././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1486418 gammapy-1.3/docs/release-notes/0000755000175100001770000000000014721316215016115 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/release-notes/index.rst0000644000175100001770000000406214721316200017752 0ustar00runnerdocker.. _release_notes: ============= Release notes ============= This is the list of changes to Gammapy between each release. For full details, see the `commit logs `_. A complete list of Gammapy contributors is at https://gammapy.org/team.html Version 1.3 ----------- .. toctree:: :maxdepth: 2 v1.3 Version 1.2 ----------- .. toctree:: :maxdepth: 2 v1.2 Version 1.1 ----------- .. toctree:: :maxdepth: 2 v1.1 Version 1.0.2 ------------- .. toctree:: :maxdepth: 2 v1.0.2 Version 1.0.1 ------------- .. toctree:: :maxdepth: 2 v1.0.1 Version 1.0 ----------- .. toctree:: :maxdepth: 2 v1.0 Version 0.20.1 -------------- .. toctree:: :maxdepth: 2 v0.20.1 Version 0.20 ------------ .. toctree:: :maxdepth: 2 v0.20 Version 0.19 ------------ .. toctree:: :maxdepth: 2 v0.19 Version 0.18 ------------ .. toctree:: :maxdepth: 2 v0.18.2 v0.18.1 v0.18 Version 0.17 ------------ .. toctree:: :maxdepth: 2 v0.17 Version 0.16 ------------ .. toctree:: :maxdepth: 2 v0.16 Version 0.15 ------------ .. toctree:: :maxdepth: 2 v0.15 Version 0.14 ------------ .. toctree:: :maxdepth: 2 v0.14 Version 0.13 ------------ .. toctree:: :maxdepth: 2 v0.13 Version 0.12 ------------ .. toctree:: :maxdepth: 2 v0.12 Version 0.11 ------------ .. toctree:: :maxdepth: 2 v0.11 Version 0.10 ------------ .. toctree:: :maxdepth: 2 v0.10 Version 0.9 ----------- .. toctree:: :maxdepth: 2 v0.9 Version 0.8 ----------- .. toctree:: :maxdepth: 2 v0.8 Version 0.7 ----------- .. toctree:: :maxdepth: 2 v0.7 Version 0.6 ----------- .. toctree:: :maxdepth: 2 v0.6 Version 0.5 ----------- .. toctree:: :maxdepth: 2 v0.5 Version 0.4 ----------- .. toctree:: :maxdepth: 2 v0.4 Version 0.3 ----------- .. toctree:: :maxdepth: 2 v0.3 Version 0.2 ----------- .. toctree:: :maxdepth: 2 v0.2 Version 0.1 ----------- .. toctree:: :maxdepth: 2 v0.1 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/release-notes/v0.1.rst0000644000175100001770000000562714721316200017337 0ustar00runnerdocker.. _gammapy_0p1_release: 0.1 (Aug 25, 2014) ------------------ Summary ~~~~~~~ - Released Aug 25, 2014 - 5 contributors - 15 months of work - 82 pull requests - Requires Astropy version 0.4 or later. 
Contributors
~~~~~~~~~~~~

- Rolf Bühler
- Christoph Deil
- Axel Donath
- Ellis Owen
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

Note that Gammapy development started out directly in the master branch,
i.e. for some things there is no pull request we can list here.

- [#180] Clean up datasets code and docs (Christoph Deil)
- [#177] Misc code and docs cleanup (Christoph Deil)
- [#176] Add new gammapy.data sub-package (Christoph Deil)
- [#167] Add image profile function (Ellis Owen)
- [#166] Add SED from Cube function (Ellis Owen)
- [#160] Add code to make model images from a source catalog (Ellis Owen)
- [#157] Re-write Galaxy modeling code (Axel Donath)
- [#156] Add Fermi Vela dataset (Ellis Owen)
- [#155] Add PSF convolve function (Ellis Owen)
- [#154] Add Fermi PSF convolution method (Ellis Owen)
- [#151] Improve npred cube functionality (Ellis Owen)
- [#150] Add npred cube computation (Christoph Deil and Ellis Owen)
- [#142] Add EffectiveAreaTable and EnergyDependentMultiGaussPSF classes (Axel Donath)
- [#138] Add Crab flux point dataset (Rolf Bühler)
- [#128] Add flux point computation using Lafferty & Wyatt (1995) (Ellis Owen)
- [#122] Add morphology models as Astropy models (Axel Donath)
- [#117] Improve synthetic Milky Way modeling (Christoph Deil)
- [#116] Add Galactic source catalog simulation methods (Christoph Deil)
- [#109] Python 2 / 3 compatibility with a single codebase (Christoph Deil)
- [#103] Add datasets functions to fetch Fermi catalogs (Ellis Owen)
- [#100] Add image plotting routines (Christoph Deil)
- [#96] Add wstat likelihood function for spectra and images (Christoph Deil)
- [#88] Add block reduce function for HDUs (Ellis Owen)
- [#84] Add TablePSF and Fermi PSF (Christoph Deil)
- [#68] Integrate PyFACT functionality in Gammapy (Christoph Deil)
- [#67] Add image measure methods (Christoph Deil)
- [#66] Add plotting module and HESS colormap (Axel Donath)
- [#65] Add model image and image measurement functionality (Axel Donath)
- [#64] Add coordinate string IAU designation format (Christoph Deil)
- [#58] Add per-pixel solid angle function in image utils (Ellis Owen)
- [#48] Add sphere and power-law sampling functions (Christoph Deil)
- [#34] Rename tevpy to gammapy (Christoph Deil)
- [#25] Add continuous wavelet transform class (Régis Terrier)
- [#12] Add coverage reports to continuous integration on coveralls (Christoph Deil)
- [#11] Add blob detection (Axel Donath)
- Rename tevpy to gammapy in `commit 7e955f `__ on Aug 19, 2013 (Christoph Deil)
- Start tevpy repo with `commit 11af4c `__ on May 15, 2013 (Christoph Deil)

.. _gammapy_0p10_release:

0.10 (Jan 28, 2019)
-------------------

Summary
~~~~~~~

- Released Jan 28, 2019
- 7 contributors
- 2 months of work
- 30 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

Gammapy v0.10 is a small release. An option to have a background model with
parameters such as normalization and spectral tilt was added. The curated
example datasets were improved, and the ``gammapy download`` script as well
as the access of example data from the tutorials via the ``GAMMAPY_DATA``
environment variable were improved.

A notebook ``image_analysis`` was added, showing how to use Gammapy to make
and model 2D images for a given energy band, as a special case of the
existing 3D map-based analysis.
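To give a flavour of what "2D images as a special case of 3D maps" means in
practice, here is a minimal sketch using the maps API of current Gammapy
versions; the energy band and the sky position (the Crab) are arbitrary
example values:

.. code-block:: python

    import astropy.units as u
    from gammapy.maps import MapAxis, WcsGeom

    # a single broad energy bin reduces the 3D map analysis to a 2D image analysis
    energy_axis = MapAxis.from_energy_bounds(1 * u.TeV, 10 * u.TeV, nbin=1)
    geom = WcsGeom.create(
        skydir=(83.63, 22.01),  # example pointing, in deg
        width=5 * u.deg,
        binsz=0.02,
        frame="icrs",
        axes=[energy_axis],
    )
    print(geom)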
A lot of the recent work went into planning the development ahead for 2019.
See the `Gammapy 1.0 roadmap`_ and the `PIG 7 - models`_ as well as
`PIG 8 - datasets`_ and get in touch if you want to contribute. We plan to
ship a first version of the new datasets API in Gammapy v0.11 in March 2019.

Gammapy v0.10 is the last Gammapy release that supports Python 2 (see
`PIG 3`_). If you have any questions or need help to install Python 3, or to
update your scripts and notebooks to work in Python 3, please contact us any
time on the Gammapy mailing list or Slack. We apologise for the disruption
and are happy to help with this transition.

pyyaml is now a core dependency of Gammapy, i.e. it will always be
automatically installed as a dependency. Instructions for installing Gammapy
on Windows, and continuous testing on Windows, were improved.

.. _PIG 3: https://github.com/gammapy/gammapy/pull/1278
.. _PIG 7 - models: https://github.com/gammapy/gammapy/pull/1971
.. _PIG 8 - datasets: https://github.com/gammapy/gammapy/pull/1986
.. _Gammapy 1.0 roadmap: https://github.com/gammapy/gammapy/pull/1841

Contributors
~~~~~~~~~~~~

- Atreyee Sinha
- Axel Donath
- Christoph Deil
- David Fidalgo
- José Enrique Ruiz
- Lars Mohrmann
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed
here. See the complete `Gammapy 0.10 merged pull requests list on GitHub `__.

- [#2001] Use GAMMAPY_DATA everywhere / remove GAMMAPY_EXTRA (José Enrique Ruiz)
- [#2000] Fix cta_simulation notebook, use CTA prod 3 IRFs (Régis Terrier)
- [#1998] Fix SensitivityEstimator after IRF API change (Régis Terrier)
- [#1995] Add pyyaml as core dependency (Christoph Deil)
- [#1994] Unify Fermi-LAT datasets used in Gammapy (Axel Donath)
- [#1991] Improve SourceCatalogObjectHGPS spatial model (Axel Donath)
- [#1990] Add background model for map fit (Atreyee Sinha)
- [#1989] Add tutorial notebook for 2D image analysis (Atreyee Sinha)
- [#1988] Improve gammapy download (José Enrique Ruiz)
- [#1979] Improve output units of spectral models (Axel Donath)
- [#1975] Improve EnergyDependentTablePSF evaluate methods (Axel Donath)
- [#1969] Improve ObservationStats (Lars Mohrmann)
- [#1966] Add ObservationFilter select methods (David Fidalgo)
- [#1962] Change data access in notebooks to GAMMAPY_DATA (José Enrique Ruiz)
- [#1951] Add keepdim option for maps (Atreyee Sinha)

.. _gammapy_0p11_release:

0.11 (Mar 29, 2019)
-------------------

Summary
~~~~~~~

- Released Mar 29, 2019
- 11 contributors
- 2 months of work
- 65 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

Gammapy v0.11 implements a large part of the new joint-likelihood fitting
framework proposed in `PIG 8 - datasets`_. This includes the introduction of
the ``FluxPointsDataset``, ``MapDataset`` and ``Datasets`` classes, which now
represent the main interface to the ``Fit`` class and fitting backends in
Gammapy. As a first use-case of the new dataset classes we added a tutorial
demonstrating a joint-likelihood fit of CTA 1DC Galactic center observations.
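Schematically, a joint fit with the dataset classes looks like the following
minimal sketch. It uses the import locations of current Gammapy versions, and
``map_dataset`` and ``flux_points_dataset`` stand for datasets assumed to
have been prepared elsewhere:

.. code-block:: python

    from gammapy.datasets import Datasets
    from gammapy.modeling import Fit

    # joint fit: the total fit statistic is summed over all datasets
    datasets = Datasets([map_dataset, flux_points_dataset])
    fit = Fit()
    result = fit.run(datasets=datasets)
    print(result)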
We also considerably improved the performance of the 3D likelihood
evaluation by evaluating the source model components on smaller cutouts of
the map. We also added a tutorial demonstrating the use of the ``MapDataset``
class for MCMC sampling and showing how to interface Gammapy to the widely
used emcee package.

Gammapy v0.11 also includes a new pulsar analysis tutorial. It demonstrates
how to compute phase curves and phase-resolved sky maps with Gammapy.

To better support classical analysis methods in our main API we implemented
a ``MapMakerRing`` class that provides ring and adaptive ring background
estimation for maps and images.

Gammapy v0.11 improves the support for the scipy and sherpa fitting
backends. It now implements full support of parameter freezing and parameter
limits for both backends. We also added a ``reoptimize`` option to the
``Fit.likelihood_profile`` method to compute likelihood profiles while
reoptimizing the remaining free parameters.

For Gammapy v0.11 we added a ``SkyEllipse`` model to support fitting of
elongated sources and changed the parametrization of the ``SkyGaussian`` to
integrate correctly on the sphere. The spatial model classes now feature
simple support for coordinate frames, such that the position of the source
can be defined and fitted independently of the coordinate system of the
data. Gammapy v0.11 now supports the evaluation of non-radially symmetric 3D
background models and defining multiple background models for a single
``MapDataset``.

Gammapy v0.11 drops support for Python 2.7; only Python 3.5 or newer is
supported (see `PIG 3`_). If you have any questions or need help to install
Python 3, or to update your scripts and notebooks to work in Python 3,
please contact us any time on the Gammapy mailing list or Slack. We
apologise for the disruption and are happy to help with this transition.
Note that Gammapy v0.10 will remain available and is Python 2 compatible
forever, so sticking with that version might be an option in some cases. pip
and conda should handle this correctly, i.e. automatically pick the last
compatible version (Gammapy v0.10) on Python 2, or, if you try to force
installation of a later version by explicitly giving a version number, emit
an error and exit without installing or updating.

For Gammapy v0.11 we removed the unmaintained ``gammapy.datasets``
sub-module. Please use the ``gammapy download`` command to download datasets
instead and the ``$GAMMAPY_DATA`` environment variable to access the data
directly from your local gammapy-datasets folder.

.. _PIG 3: https://github.com/gammapy/gammapy/pull/1278
.. _PIG 8 - datasets: https://github.com/gammapy/gammapy/pull/1986

Contributors
~~~~~~~~~~~~

In alphabetical order by first name:

- Atreyee Sinha
- Axel Donath
- Brigitta Sipocz
- Christoph Deil
- Fabio Acero
- hugovk
- Jason Watson (new)
- José Enrique Ruiz
- Lars Mohrmann
- Luca Giunti (new)
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed
here. See the complete `Gammapy 0.11 merged pull requests list on GitHub `__.
- [#2098] Remove gammapy.datasets submodule (Axel Donath)
- [#2097] Clean up tutorial notebooks (Christoph Deil)
- [#2093] Clean up PSF3D / TablePSF interpolation unit handling (Axel Donath)
- [#2085] Improve EDispMap and PSFMap stacking (Régis Terrier)
- [#2077] Add MCMC tutorial using emcee (Fabio Acero)
- [#2076] Clean up maps/wcs.py (Axel Donath)
- [#2071] Implement MapDataset npred evaluation using cutouts (Axel Donath)
- [#2069] Improve support for scipy fitting backend (Axel Donath)
- [#2066] Add SkyModel.position and frame attribute (Axel Donath)
- [#2065] Add evaluation radius to SkyEllipse model (Luca Giunti)
- [#2064] Add simulate_dataset() convenience function (Fabio Acero)
- [#2054] Add likelihood profile reoptimize option (Axel Donath)
- [#2051] Add WcsGeom.cutout() method (Léa Jouvin)
- [#2050] Add notebook for 3D joint analysis (Léa Jouvin)
- [#2049] Add EventList.select_map_mask() method (Régis Terrier)
- [#2046] Add SkyEllipse model (Luca Giunti)
- [#2039] Simplify and move energy threshold computation (Axel Donath)
- [#2038] Add tutorial for pulsar analysis (Marion Spir-Jacob)
- [#2037] Add parameter freezing for sherpa backend (Axel Donath)
- [#2035] Fix symmetry issue in solid angle calculation for WcsGeom (Jason Watson)
- [#2034] Change SkyGaussian to spherical representation (Luca Giunti)
- [#2033] Add evaluation of asymmetric background models (Jason Watson)
- [#2031] Add EDispMap class (Régis Terrier)
- [#2030] Add Datasets class (Axel Donath)
- [#2028] Add hess notebook to gammapy download list (José Enrique Ruiz)
- [#2026] Refactor MapFit into MapDataset (Atreyee Sinha)
- [#2023] Add FluxPointsDataset class (Axel Donath)
- [#2022] Refactor TablePSF class (Axel Donath)
- [#2019] Simplify PSF stacking and containment radius computation (Axel Donath)
- [#2017] Updating astropy_helpers to 3.1 (Brigitta Sipocz)
- [#2016] Drop support for Python 2 (hugovk)
- [#2012] Drop Python 2 support (Christoph Deil)
- [#2009] Improve field-of-view coordinate transformations (Lars Mohrmann)

.. _gammapy_0p12_release:

0.12 (May 30, 2019)
-------------------

Summary
~~~~~~~

- Released May 30, 2019
- 9 contributors
- 2 months of work
- 66 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

For Gammapy v0.12 we did our homework, cleaned up the basement and emptied
the trash bin. It is a maintenance release that does not introduce many new
features, but where we have put a lot of effort into integrating the
``gammapy.spectrum`` submodule into the datasets framework we introduced in
the previous Gammapy version. For this we replaced the former
``SpectrumObservation`` class by a new ``SpectrumDatasetOnOff`` class, which
now works with the general ``Fit`` and ``Datasets`` objects in
``gammapy.utils.fitting``. This also enabled us to remove the
``SpectrumObservationList`` and ``SpectrumFit`` classes. We adapted the
``SpectrumExtraction`` class accordingly. We also refactored the ``NDData``
class to use ``MapAxis`` to handle the data axes. This affects the
``CountsSpectrum`` and the IRF classes in ``gammapy.irf``.

In addition we changed the ``FluxPointsEstimator`` to work with the new
``SpectrumDatasetOnOff`` as well as the ``MapDataset``. Now it is possible
to compute flux points for 1D as well as 3D data with a uniform API.
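In the form the API has taken in current Gammapy versions, flux point
estimation looks like the following minimal sketch; ``dataset`` stands for a
``SpectrumDatasetOnOff`` or ``MapDataset`` with a fitted model, and the
energy edges and source name are arbitrary example values:

.. code-block:: python

    import astropy.units as u
    from gammapy.estimators import FluxPointsEstimator

    # hypothetical: `dataset` was prepared and fitted elsewhere
    estimator = FluxPointsEstimator(
        energy_edges=[1, 3, 10, 30] * u.TeV,
        source="my-source",  # name of the model component to estimate
    )
    flux_points = estimator.run(datasets=[dataset])
    print(flux_points)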
We added a new ``NaimaModel`` wrapper class (https://naima.readthedocs.io/),
which allows you to fit true physical spectral models directly to
counts-based gamma-ray data. To improve the fit convergence of the
``SkyDisk`` and ``SkyEllipse`` models we introduced a new parameter defining
the slope of the edge of these models.

If you would like to know how to adapt your old spectral analysis scripts to
Gammapy v0.12, please check out the updated `tutorial notebooks `__ and
`get in contact with us `__ anytime if you need help.

Contributors
~~~~~~~~~~~~

In alphabetical order by first name:

- Atreyee Sinha
- Axel Donath
- Christoph Deil
- Dirk Lennarz
- Debanjan Bose (new)
- José Enrique Ruiz
- Lars Mohrmann
- Luca Giunti
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed
here. See the complete `Gammapy v0.12 merged pull requests list on GitHub `__.

- [#2171] Remove Poisson chi2 approximations (Christoph Deil)
- [#2169] Remove warning astropy_helpers.sphinx.conf is deprecated (José Enrique Ruiz)
- [#2166] Remove PHACountsSpectrumList class (Régis Terrier)
- [#2163] Fix integrate_spectrum for small integration ranges (Axel Donath)
- [#2160] Add default of "all" for DataStore.get_observations (Christoph Deil)
- [#2157] Rename SpectrumDataset.counts_on to SpectrumDataset.counts (Régis Terrier)
- [#2154] Implement DataStoreMaker for IACT DL3 indexing (Christoph Deil)
- [#2153] Remove SpectrumObservation and SpectrumObservationList classes (Régis Terrier)
- [#2152] Improve FluxPointEstimator for joint likelihood datasets (Axel Donath)
- [#2151] Add todo for improving wcs solid angle computation (Debanjan Bose)
- [#2146] Implement scipy confidence method (Axel Donath)
- [#2145] Make tests run without GAMMAPY_DATA (Christoph Deil)
- [#2142] Implement oversampling option for background model evaluation (Axel Donath)
- [#2141] Implement SkyDisk and SkyEllipse edge parameter (Axel Donath)
- [#2140] Clean up spectral tutorials (Atreyee Sinha)
- [#2139] Refactor SpectrumExtraction to use SpectrumDatasetOnOff (Régis Terrier)
- [#2133] Replace DataAxis and BinnedDataAxis classes by MapAxis (Axel Donath)
- [#2132] Change MapAxis.edges and MapAxis.center attributes to quantities (Atreyee Sinha)
- [#2131] Implement flux point estimation for MapDataset (Axel Donath)
- [#2130] Implement MapAxis.upsample() and MapAxis.downsample() methods (Axel Donath)
- [#2128] Fix Feldman-Cousins examples (Dirk Lennarz)
- [#2126] Fix sorting of node values in MapAxis (Atreyee Sinha)
- [#2124] Implement NaimaModel wrapper class (Luca Giunti)
- [#2123] Remove SpectrumFit class (Axel Donath)
- [#2121] Move plotting helper functions to SpectrumDatasetOnOff (Axel Donath)
- [#2119] Clean up Jupyter notebooks with PyCharm static code analysis (Christoph Deil)
- [#2118] Remove tutorials/astropy_introduction.ipynb (Christoph Deil)
- [#2115] Remove SpectrumResult object (Axel Donath)
- [#2114] Refactor energy grouping (Axel Donath)
- [#2112] Refactor FluxPointEstimator to use Datasets (Axel Donath)
- [#2111] Implement SpectrumDatasetOnOff class (Régis Terrier)
- [#2108] Fix frame attribute of SkyDiffuseCube and SkyDiffuseMap (Lars Mohrmann)
- [#2106] Add frame attribute for SkyDiffuseMap (Lars Mohrmann)
- [#2104] Implement sparse summed fit statistics in Cython (Axel Donath)
.. _gammapy_0p13_release:

0.13 (Jul 26, 2019)
-------------------

Summary
~~~~~~~

- Released Jul 26, 2019
- 15 contributors
- 2 months of work
- 72 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

The Gammapy v0.13 release includes many bug fixes, a lot of clean-up work
and some new features.

Gammapy v0.13 implements new ``SpectralGaussian`` and
``PLSuperExpCutoff4FGL`` models. To support binned simulation of counts data
in a uniform way, ``MapDataset.fake()``, ``SpectrumDataset.fake()`` and
``SpectrumDatasetOnOff.fake()`` methods were implemented, which simulate
binned counts maps and spectra from models (see the sketch below). In
addition, nice string representations for all of the dataset classes were
implemented, together with convenience functions to compute residuals using
different methods on all of them.

The algorithm and API of the current ``LightCurveEstimator`` were changed to
use datasets. Now it is possible to compute lightcurves using spectral as
well as cube based analyses. The definition of the position angle of the
``SkyEllipse`` model was changed to follow IAU conventions.

The handling of sky regions in Gammapy was unified as described in
`PIG 10`_. For convenience, regions can now also be created from DS9 region
strings.

The clean-up process of ``gammapy.spectrum`` was continued by removing the
``PHACountsSpectrum`` class, which is now fully replaced by the
``SpectrumDatasetOnOff`` class. The ``Energy`` and ``EnergyBounds`` classes
were also removed. Grids of energies can now be created and handled directly
using the ``MapAxis`` object.

The algorithm to compute solid angles for maps was fixed, so that it gives
correct results for WCS projections even with high spatial distortions.
Standard analyses using TAN or CAR projections are only affected on a <1%
level. Different units for the energy axis of the counts and exposure map in
a ``MapDataset`` are now handled correctly.

The recommended conda environment for Gammapy v0.13 was updated. It now
relies on Python 3.7, IPython 7.5, Scipy 1.3, Matplotlib 3.1, Astropy 3.1,
and Healpy 1.12. These updates should be backwards compatible. Scripts and
notebooks should run and give the same results.
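As a minimal sketch of the ``fake()`` methods mentioned above, under the
assumption that ``dataset`` is a ``SpectrumDataset`` with exposure, energy
dispersion and a source model already assigned:

.. code-block:: python

    # hypothetical: `dataset` was prepared elsewhere with IRFs and a model
    dataset.fake(random_state=42)  # fill dataset.counts with a Poisson realisation
    print(dataset.counts.data.sum())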
Contributors
~~~~~~~~~~~~

In alphabetical order by first name:

- Atreyee Sinha
- Axel Donath
- Brigitta Sipocz
- Bruno Khelifi
- Christoph Deil
- Fabio Pintore
- Fabio Acero
- Kaori Nakashima
- José Enrique Ruiz
- Léa Jouvin
- Luca Giunti
- Quentin Remy
- Régis Terrier
- Silvia Manconi
- Yu Wun Wong

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed
here. See the complete `Gammapy v0.13 merged pull requests list on GitHub `__.

- [#2296] Implement model YAML serialisation (Quentin Remy)
- [#2310] Remove old ``LightCurveEstimator`` class (Axel Donath)
- [#2305] Remove ``SpectrumSimulation`` class (Axel Donath)
- [#2300] Change to IAU convention for position angle in SkyEllipse model (Luca Giunti)
- [#2298] Implement ``.fake()`` methods on datasets (Léa Jouvin)
- [#2297] Implement Fermi 4FGL catalog spectral models and catalog (Kaori Nakashima & Yu Wun Wong)
- [#2294] Fix pulsar spin-down model bug (Silvia Manconi)
- [#2289] Add ``gammapy/utils/fitting/sampling.py`` (Fabio Acero)
- [#2287] Implement ``__str__`` method for datasets (Léa Jouvin)
- [#2278] Refactor class ``CrabSpectrum`` in a function (Léa Jouvin)
- [#2277] Implement GTI union (Régis Terrier)
- [#2276] Fix map pixel solid angle computation (Axel Donath)
- [#2272] Remove ``SpectrumStats`` class (Axel Donath)
- [#2264] Implement ``MapDataset`` FITS I/O (Axel Donath)
- [#2262] Clean up sky region select code (Christoph Deil)
- [#2259] Fix ``Fit.minos_contour`` method for frozen parameters (Axel Donath)
- [#2257] Update astropy-helpers to v3.2.1 (Brigitta Sipocz)
- [#2254] Add select_region method for event lists (Régis Terrier)
- [#2250] Remove ``PHACountsSpectrum`` class (Axel Donath)
- [#2244] Implement ``SpectralGaussian`` model class (Léa Jouvin)
- [#2243] Speed up mcmc_sampling tutorial (Fabio Acero)
- [#2240] Remove use of NDDataArray from CountsSpectrum (Axel Donath)
- [#2239] Remove GeneralRandom class (Axel Donath)
- [#2238] Implement ``MapEventSampler`` class (Fabio Pintore)
- [#2237] Remove ``Energy`` and ``EnergyBounds`` classes (Axel Donath)
- [#2235] Remove unused functions in stats/data.py (Régis Terrier)
- [#2230] Improve spectrum/models.py coverage (Régis Terrier)
- [#2229] Implement ``InverseCDFSampler`` class (Fabio Pintore)
- [#2217] Refactor gammapy download (José Enrique Ruiz)
- [#2206] Remove unused map iter_by_pix and iter_by_coord methods (Christoph Deil)
- [#2204] Clean up ``gammapy.utils.random`` (Fabio Pintore)
- [#2200] Update astropy_helpers to v3.2 (Brigitta Sipocz)
- [#2192] Improve ``gammapy.astro`` code and tests (Christoph Deil)
- [#2129] PIG 10 - Regions (Christoph Deil)
- [#2089] Improve ``ReflectedRegionsFinder`` class (Bruno Khelifi)

.. _PIG 10: https://docs.gammapy.org/dev/development/pigs/pig-010.html

.. _gammapy_0p14_release:

0.14 (Sep 30, 2019)
-------------------

Summary
~~~~~~~

- Released Sep 30, 2019
- 8 contributors
- 101 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

Gammapy v0.14 features a new high level analysis interface. Starting from a
YAML configuration file, it supports the standard use-cases of joint or
stacked 3D as well as 1D reflected region analyses. It also supports
computation of flux points for all cases. The usage of this new ``Analysis``
class is demonstrated in the `hess tutorial `__.

Following the proposal in :ref:`pig-016` the subpackages
``gammapy.background`` and ``gammapy.image`` were removed. Existing
functionality was moved to the ``gammapy.cube`` and ``gammapy.spectrum``
subpackages.

A new ``gammapy.modeling`` subpackage was introduced. All spectral, spatial,
temporal and combined models were moved to the new namespace and renamed
following a consistent naming scheme. This provides a much clearer structure
of the model types and hierarchy for users.
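The consistent naming scheme can be illustrated with a minimal sketch using
model classes of current Gammapy versions; the parameter values and the
model name are arbitrary examples:

.. code-block:: python

    from gammapy.modeling.models import (
        PointSpatialModel,
        PowerLawSpectralModel,
        SkyModel,
    )

    # naming scheme: <shape><type>Model, e.g. PowerLawSpectralModel
    spectral_model = PowerLawSpectralModel(index=2.3)
    spatial_model = PointSpatialModel(lon_0="83.63 deg", lat_0="22.01 deg", frame="icrs")
    model = SkyModel(
        spectral_model=spectral_model,
        spatial_model=spatial_model,
        name="my-source",
    )
    print(model)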
The ``SkyEllipse`` model was removed. Instead the ``GaussianSpatialModel``
as well as the ``DiskSpatialModel`` now support parameters for elongation. A
bug that led to an incorrect flux normalization of the ``PointSpatialModel``
at high latitudes was fixed. The default coordinate frame for all spatial
models was changed to ``icrs``. A new ``ConstantTemporalModel`` was
introduced.

A new ``MapDataset.to_spectrum_dataset()`` method makes it possible to
reduce a map dataset to a spectrum dataset in a specified analysis region.
The ``SpectrumDatasetOnOffStacker`` was removed and replaced by a
``SpectrumDatasetOnOff.stack()`` and a ``Datasets.stack_reduce()`` method. A
``SpectrumDataset.stack()`` method was also added (see the sketch below).
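A minimal sketch of the new stacking interface, assuming ``datasets`` is a
``Datasets`` container of ``SpectrumDatasetOnOff`` objects produced by the
data reduction:

.. code-block:: python

    # hypothetical: `datasets` was filled during data reduction
    stacked = datasets.stack_reduce()
    print(stacked)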
Following :ref:`pig-013` the support for Python 3.5 was dropped with Gammapy
v0.14. At the same time the versions of the required dependencies were
updated to Numpy 1.16, Scipy 1.2, Astropy 3.2, Regions 0.5, Pyyaml 5.1,
Click 7.0 and Jsonschema 3.0.

Contributors
~~~~~~~~~~~~

In alphabetical order by first name:

- Atreyee Sinha
- Axel Donath
- Christoph Deil
- Régis Terrier
- Fabio Pintore
- Quentin Remy
- José Enrique Ruiz
- Johannes King
- Luca Giunti
- Léa Jouvin

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed
here. See the complete `Gammapy v0.14 merged pull requests list on GitHub `__.

- [#2412] Remove model XML serialization (Quentin Remy)
- [#2404] Clean up spectral model names (Christoph Deil)
- [#2401] Clean up spatial model names (Christoph Deil)
- [#2400] Clean up temporal model names (Christoph Deil)
- [#2385] Change spatial model default frame to icrs (Christoph Deil)
- [#2381] Add ``MapDataset.stack()`` (Atreyee Sinha)
- [#2379] Cleanup ``WcsNDMap`` FITS convention handling (Axel Donath)
- [#2378] Add support for 3D analysis in the high level interface (José Enrique Ruiz)
- [#2377] Implement ``WcsGeom`` coord caching (Axel Donath)
- [#2375] Adapt ``MapMakerObs`` to return a ``MapDataset`` (Atreyee Sinha)
- [#2368] Add ``MapDataset.create()`` method (Atreyee Sinha)
- [#2367] Fix SkyPointSource evaluation (Christoph Deil)
- [#2366] Remove lon wrapping in spatial models (Christoph Deil)
- [#2365] Remove gammapy/maps/measure.py (Christoph Deil)
- [#2360] Add ``SpectrumDatasetOnOff.stack()`` (Régis Terrier)
- [#2359] Remove ``BackgroundModels`` class (Axel Donath)
- [#2358] Adapt MapMakerObs to also compute an EDispMap and PSFMap (Atreyee Sinha)
- [#2356] Add ``SpectrumDataset.stack()`` (Régis Terrier)
- [#2354] Move gammapy.utils.fitting to gammapy.modeling (Christoph Deil)
- [#2351] Change OrderedDict to dict (Christoph Deil)
- [#2347] Simplify ``EdispMap.stack()`` and ``PsfMap.stack()`` (Luca Giunti)
- [#2346] Add ``SpectrumDatasetOnOff.create()`` (Régis Terrier)
- [#2345] Add ``SpectrumDataset.create()`` (Régis Terrier)
- [#2344] Change return type of ``WcsGeom.get_coord()`` to quantities (Axel Donath)
- [#2343] Implement ``WcsNDMap.sample()`` and remove ``MapEventSampler`` (Fabio Pintore)
- [#2342] Add zero clipping in ``MapEvaluator.apply_psf`` (Luca Giunti)
- [#2338] Add model registries and ``Model.from_dict()`` method (Quentin Remy)
- [#2335] Remove ``SpectrumAnalysisIACT`` class (José Enrique Ruiz)
- [#2334] Simplify and extend background model handling (Axel Donath)
- [#2330] Migrate SpectrumAnalysisIACT to the high level interface (José Enrique Ruiz)
- [#2326] Fix bug in the spectral gaussian model evaluate method (Léa Jouvin)
- [#2323] Add high level Config and Analysis classes (José Enrique Ruiz)
- [#2321] Dissolve ``gammapy.image`` (Christoph Deil)
- [#2320] Dissolve ``gammapy.background`` (Christoph Deil)
- [#2314] Add datasets serialization (Quentin Remy)
- [#2313] Add elongated gaussian model (Luca Giunti)
- [#2308] Use parfive in gammapy download (José Enrique Ruiz)
- [#2292] Implement ``MapDataset.to_spectrum_dataset()`` method (Régis Terrier)
- [#2279] Update Gammapy packaging, removing astropy-helpers (Christoph Deil)
- [#2274] PIG 16 - Gammapy package structure (Christoph Deil)
- [#2219] PIG 12 - High level interface (José Enrique Ruiz)
- [#2218] PIG 13 - Gammapy dependencies and distribution (Christoph Deil)
- [#2136] PIG 9 - Event sampling (Fabio Pintore)

.. _gammapy_0p15_release:

0.15 (Dec 3, 2019)
------------------

Summary
~~~~~~~

- Released Dec 3, 2019
- 12 contributors
- 187 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

For Gammapy v0.15 the high level ``Analysis`` class was moved to the newly
introduced ``gammapy.analysis`` sub-package. The syntax of the YAML config
file was simplified and validation of config parameters is now available for
interactive use of the ``Analysis`` class as well. Both are demonstrated in
the `first analysis with Gammapy notebook `__. In addition a new
``gammapy analysis`` command line tool was introduced, which executes the
data reduction part of an analysis, based on a given config file.

Following the proposal in `PIG 18`_ the structure of the documentation was
improved. The new `overview page `__ gives an introduction and overview of
the Gammapy analysis workflow and package structure. The structure and
content of the `tutorials `__ page were simplified and cleaned up and a
`how to page `__ was introduced. A tutorial notebook showing how to do a
joint `multi-instrument analysis `__ of the Crab Nebula using H.E.S.S. and
Fermi-LAT data and HAWC flux points was added.

Another focus of the work for Gammapy v0.15 was the clean-up and unification
of the spectrum and map data reduction. Gammapy now features a
``MapDatasetMaker`` and a ``SpectrumDatasetMaker``, which directly produce a
``MapDataset`` or ``SpectrumDataset`` from DL3 data. The existing background
estimation classes were adapted by introducing a
``ReflectedRegionsBackgroundMaker``, ``RingBackgroundMaker`` and
``AdaptiveRingBackgroundMaker``. Those makers can also be chained to create
custom data reduction workflows. The new data reduction API is shown in the
`second analysis with Gammapy notebook `__ and the corresponding
`docs page `__.

A ``MapDatasetOnOff`` class was introduced to handle on-off observation
based analyses and as a container for image based ring-background
estimation. All datasets now have a ``.create()`` method to allow an easy
creation of the dataset from a map geometry or energy specification. Gammapy
now supports spatially varying PSF and energy dispersion in the data
reduction as well as during fitting. By introducing an in-memory
``Observation`` class Gammapy now features unified support for binned
simulations of spectrum and map datasets. This is shown in the
`1d simulation `__ and `3d simulation `__ tutorial notebooks.

The ``LightCurveEstimator`` was improved to use the GTIs defined on datasets
and to allow for grouping of datasets according to provided time intervals.
Details are explained on the `time docs page `__ and the newly added
`flare light curve notebook `__.
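In the form the estimator API has taken in current Gammapy versions, light
curve estimation with time intervals looks like this minimal sketch; the
time interval, energy band and source name are arbitrary example values, and
``datasets`` stands for per-observation datasets with a fitted model
prepared elsewhere:

.. code-block:: python

    import astropy.units as u
    from astropy.time import Time
    from gammapy.estimators import LightCurveEstimator

    # one example time bin; in practice you would pass a list of intervals
    time_intervals = [Time([53343.92, 53343.94], format="mjd", scale="utc")]
    lc_estimator = LightCurveEstimator(
        energy_edges=[1, 10] * u.TeV,
        source="my-source",
        time_intervals=time_intervals,
    )
    light_curve = lc_estimator.run(datasets)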
The support for 2FHL and 4FGL was improved by adding attributes returning spatial and spectral models as well as lightcurves to the corresponding objects. The support for the Fermi-LAT 1FHL catalog was dropped. An overview can be found on the `catalog docs page `__ and the `catalog tutorial notebook `__. Error propagation is now fully supported for the ``AbsorbedSpectralModel`` and ``NaimaModel``. For this release the dependency on ``reproject`` and ``jsonschema`` was dropped. The latter was replaced by a new dependency on ``pydantic``. This release contains several API breaking changes and removal of non-essential parts of Gammapy (see PR list below). These changes are required to finally arrive at a more consistent and stable API for Gammapy v1.0. Thanks for your understanding! Contributors ~~~~~~~~~~~~ In alphabetical order by first name: - Atreyee Sinha - Axel Donath - Brigitta Sipocz - Bruno Khelifi - Christoph Deil - Fabio Pintore - Fabio Acero - José Enrique Ruiz - Luca Giunti - Léa Jouvin - Quentin Remy - Régis Terrier Pull requests ~~~~~~~~~~~~~ This list is incomplete. Small improvements and bug fixes are not listed here. See the complete `Gammapy v0.15 merged pull requests list on GitHub `__. - [#2660] Remove tutorials/source_population_model.ipynb (Christoph Deil) - [#2654] Remove HGPS map and catalog tutorial (Christoph Deil) - [#2651] Add SkyModels read and write methods (Christoph Deil) - [#2645] Remove FluxPointsDataset chi2assym option (Christoph Deil) - [#2637] Remove keepdims option from MapDataset.to_images() (Axel Donath) - [#2635] Change Datasets model to models (Christoph Deil) - [#2627] Activate PSFMap for fitting (Atreyee Sinha) - [#2619] Unify model setters of all datasets (Axel Donath) - [#2620] Make SkyModel.spatial_model optional (Axel Donath) - [#2616] Add MapDatasetEventSampler.sample_background() method (Fabio Pintore) - [#2604] Implement additional methods for SafeMaskMaker (Luca Giunti) - [#2595] Change SpectrumDataset and FluxPointDataset model to SkyModels (Quentin Remy) - [#2594] Add light curve flare tutorial notebook (Léa Jouvin) - [#2587] Activate EDispMap in MapEvaluator (Axel Donath) - [#2585] Improve spectral model error propagation (Christoph Deil) - [#2580] Speed up Observations.select_time (Régis Terrier) - [#2574] Generalise exponential cutoff power law spectral model (Bruno Khelifi) - [#2567] Add time intervals to LightCurveEstimator (Léa Jouvin) - [#2564] Remove HpxSparseMap class (Axel Donath) - [#2563] Add in memory Observation class (Atreyee Sinha) - [#2562] Remove map reprojection functionality (Axel Donath) - [#2561] Use dataset GTI table in LightCurveEstimator (Régis Terrier) - [#2559] Replace energy grid helper functions with MapAxis (Christoph Deil) - [#2558] Remove gammapy.utils.nddata.sqrt_space (Christoph Deil) - [#2557] Add more options to Minuit wrapper (Quentin Remy) - [#2553] Remove MapDataset cstat likelihood option (Christoph Deil) - [#2552] Remove unused functions from gammapy.irf (Axel Donath) - [#2551] Cleanup mask safe handling (Axel Donath) - [#2546] Rename likelihood to stat (Christoph Deil) - [#2540] Restructure tutorial notebooks (Christoph Deil) - [#2538] Move SafeMaskMaker and adapt mask_safe handling in MapDatasetMaker (Axel Donath) - [#2536] Add WcsGeom.cutout_info information to WCS header (Axel Donath) - [#2535] Remove gammapy.detect.CWT (Christoph Deil) - [#2528] Move Analysis to new gammapy.analysis (José Enrique Ruiz) - [#2525] Remove MapMakerRing (Luca Giunti) - [#2523] Add EDispMap and PSFMap to MapDataset 
io (Atreyee Sinha)
- [#2521] Remove .to_sherpa() methods (Axel Donath)
- [#2520] Refactor ring background maker (Luca Giunti)
- [#2510] Add EdispMap.sample_coord method (Fabio Pintore)
- [#2505] Add a tutorial for joint 1d/3d analysis (Quentin Remy)
- [#2502] Remove ObservationStats, ObservationsSummary and BackgroundEstimate (Axel Donath)
- [#2501] Add .to_region() test for each spatial model (Quentin Remy)
- [#2499] Remove SpectrumExtraction class (Axel Donath)
- [#2498] Add mask_safe handling in MapDataset.to_image (Luca Giunti)
- [#2497] Refactor PhaseBackgroundEstimator to PhaseBackgroundMaker (Axel Donath)
- [#2496] Add PSFMap.sample_coord method (Fabio Pintore)
- [#2493] Add region info to CountsSpectrum and adapt tutorials (Axel Donath)
- [#2492] Change MapDataset.mask_fit and MapDataset.mask_safe to maps (Atreyee Sinha)
- [#2491] Add SpatialModel.position_error and SpatialModel.to_region (Quentin Remy)
- [#2490] Improve Parameters class (Christoph Deil)
- [#2486] Update default offset value in simulate_dataset (Fabio Acero)
- [#2483] Fix elongated source frame in Fermi-LAT catalogs (Quentin Remy)
- [#2481] Add MapDatasetOnOff (Luca Giunti)
- [#2479] Change parametrisation from geom_true to energy_axis_true (Atreyee Sinha)
- [#2478] Improve 2FHL catalog support (Quentin Remy)
- [#2477] Add SafeMaskMaker (Axel Donath)
- [#2476] Remove Fermi-LAT 1FHL catalog (Quentin Remy)
- [#2475] Implement ReflectedRegionsBackgroundMaker (Axel Donath)
- [#2472] Remove multiprocessing code (Christoph Deil)
- [#2470] Add MapDataset.from_geoms (Atreyee Sinha)
- [#2468] Improve map and spectrum events fill methods (Christoph Deil)
- [#2464] Implement SpectrumDatasetMaker (Axel Donath)
- [#2463] PIG 18: Documentation (Christoph Deil)
- [#2461] Remove error raising, when model component moves out of the image (Axel Donath)
- [#2459] Add FluxPointsDataset serialisation (Quentin Remy)
- [#2455] Improve datasets serialisation (Quentin Remy)
- [#2454] Add a norm parameter to the EBL model (Léa Jouvin)
- [#2450] Rename and refactor MapMakerObs #2450 (Axel Donath)
- [#2449] Fix and improve 2HWC catalog source models (Quentin Remy)
- [#2448] Improve 4FGL catalog support (Quentin Remy)
- [#2446] Implement WcsNDMap.stack() method (Axel Donath)
- [#2444] Remove MapMaker class (Axel Donath)
- [#2441] Add GTI export in datasets (Régis Terrier)
- [#2435] Add modeling notebook with model plot examples (Christoph Deil)
- [#2433] Update astropy and numpy versions in Travis-CI (Brigitta Sipocz)
- [#2405] Change value clipping in LogScale class (Quentin Remy)
- [#2350] Modernise Gammapy code base (Christoph Deil)

.. _PIG 18: https://docs.gammapy.org/dev/development/pigs/pig-018.html

.. _gammapy_0p16_release:

0.16 (Feb 1, 2020)
------------------

Summary
~~~~~~~

- Released Feb 1, 2020
- 8 contributors
- 61 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

For Gammapy v0.16 a ``FoVBackgroundMaker`` was implemented, which supports
different methods of adapting the norm and tilt of a field of view
background model to the data.

To provide a visual overview of the available models in Gammapy a
`model gallery `__ was added. A general introduction on how to work with the
different models is now available in a dedicated `models tutorial `__.
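A minimal sketch of the ``FoVBackgroundMaker``, using the import location of
current Gammapy versions and assuming ``dataset`` is a ``MapDataset``
produced by the data reduction:

.. code-block:: python

    from gammapy.makers import FoVBackgroundMaker

    # adapt the norm of the FoV background model to the data
    fov_bkg_maker = FoVBackgroundMaker(method="scale")
    dataset = fov_bkg_maker.run(dataset)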
The spectral analysis of an extended source is demonstrated in the newly added `extended source spectral analysis tutorial `__. To further improve API consistency the ``EnergyDispersion`` class was renamed to ``EDispKernel`` and the ``SkyModels`` class was renamed to a more general ``Models`` class. The ``coordsys`` attribute of ``WcsGeom`` and ``HpxGeom`` was renamed to ``frame`` and now supports arbitrary Astropy coordinate frames. The ``Datasets`` and ``Models`` container objects now require unique names of the objects contained. By default unique identifiers are generated in the model and dataset objects. The ``Datasets``, ``Models`` as well as ``Observations`` container classes, were extended to now support in place ``.append()``, ``.extend()`` and ``.insert()`` operations. For Gammapy v0.16 the API of the ``SensitivityEstimator`` and ``TSMapEstimator`` was adapted to take a ``MapDataset`` or ``MapDatasetOnOff`` as input. The ``ASmooth`` class was renamed to ``ASmoothMapEstimator`` and also adapted to work with ``MapDataset`` and ``MapDatasetOnOff``. Again this release contains several API breaking changes and removal of non-essential parts of Gammapy (see PR list below). These changes are required to finally arrive at a more consistent and stable API for Gammapy v1.0. Thanks for your understanding! Contributors ~~~~~~~~~~~~ In alphabetical order by first name: - Atreyee Sinha - Axel Donath - Christoph Deil - Fabio Pintore - José Enrique Ruiz - Luca Giunti - Quentin Remy - Régis Terrier Pull requests ~~~~~~~~~~~~~ This list is incomplete. Small improvements and bug fixes are not listed here. See the complete `Gammapy v0.16 merged pull requests list on GitHub `__. - [#2756] Add config params for get_flux_points method in High level interface (José Enrique Ruiz) - [#2747] Modify Config and Analysis to support SafeMaskMaker (Régis Terrier) - [#2752] Add temporal model support to SkyModel (Quentin Remy) - [#2755] Fix WcsNDMap and MapDataset cutout to support mode='partial' (Régis Terrier) - [#2753] Make DataStoreObservation inherit from Observation (Axel Donath) - [#2751] Add checks for edisp, psf and bkg in MapDatasetEventSampler.run() (Fabio Pintore) - [#2750] Clean up MapDataset / BackgroundModel code (Axel Donath) - [#2746] Rework models notebook (Axel Donath) - [#2743] Add a MapDatasetOnOff.to_image() method (Régis Terrier) - [#2742] Add spectral models to gallery (José Enrique Ruiz) - [#2741] Adapt ASmooth to work with datasets and rename it to ASmoothMapEstimator (Axel Donath) - [#2739] Simplify and fix EDispMap.get_edisp_kernel() (Axel Donath) - [#2738] Unify analysis notebooks introductions (Régis Terrier) - [#2737] Add spatial models in models gallery (José Enrique Ruiz) - [#2735] Change configuration for sphinx gallery (José Enrique Ruiz) - [#2733] Handle MapDataset.to_image() without counts or background (Axel Donath) - [#2731] Add SmoothBrokenPowerLawSpectralModel (Axel Donath) - [#2730] Add an extended source spectral analysis tutorial (Régis Terrier) - [#2729] Unify SpectrumDataset and SpectrumDatasetOnOff overview methods (Axel Donath) - [#2728] Add auto-generated unique names (Quentin Remy) - [#2727] Rename SkyModels to Models (Axel Donath) - [#2726] Rename likelihood_type to stat_type (Axel Donath) - [#2725] Simplify trapz_loglog integrate method (Axel Donath) - [#2723] Add time scale info in GTI.__repr__ (Régis Terrier) - [#2719] Remove use of simulate_dataset from mcmc tutorial (Axel Donath) - [#2718] Adapt TSMapEstimator to take a MapDataset as input (Régis Terrier) - 
Contributors
~~~~~~~~~~~~

In alphabetical order by first name:

- Atreyee Sinha
- Axel Donath
- Christoph Deil
- Fabio Pintore
- José Enrique Ruiz
- Luca Giunti
- Quentin Remy
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.
See the complete `Gammapy v0.16 merged pull requests list on GitHub `__.

- [#2756] Add config params for get_flux_points method in High level interface (José Enrique Ruiz)
- [#2747] Modify Config and Analysis to support SafeMaskMaker (Régis Terrier)
- [#2752] Add temporal model support to SkyModel (Quentin Remy)
- [#2755] Fix WcsNDMap and MapDataset cutout to support mode='partial' (Régis Terrier)
- [#2753] Make DataStoreObservation inherit from Observation (Axel Donath)
- [#2751] Add checks for edisp, psf and bkg in MapDatasetEventSampler.run() (Fabio Pintore)
- [#2750] Clean up MapDataset / BackgroundModel code (Axel Donath)
- [#2746] Rework models notebook (Axel Donath)
- [#2743] Add a MapDatasetOnOff.to_image() method (Régis Terrier)
- [#2742] Add spectral models to gallery (José Enrique Ruiz)
- [#2741] Adapt ASmooth to work with datasets and rename it to ASmoothMapEstimator (Axel Donath)
- [#2739] Simplify and fix EDispMap.get_edisp_kernel() (Axel Donath)
- [#2738] Unify analysis notebooks introductions (Régis Terrier)
- [#2737] Add spatial models in models gallery (José Enrique Ruiz)
- [#2735] Change configuration for sphinx gallery (José Enrique Ruiz)
- [#2733] Handle MapDataset.to_image() without counts or background (Axel Donath)
- [#2731] Add SmoothBrokenPowerLawSpectralModel (Axel Donath)
- [#2730] Add an extended source spectral analysis tutorial (Régis Terrier)
- [#2729] Unify SpectrumDataset and SpectrumDatasetOnOff overview methods (Axel Donath)
- [#2728] Add auto-generated unique names (Quentin Remy)
- [#2727] Rename SkyModels to Models (Axel Donath)
- [#2726] Rename likelihood_type to stat_type (Axel Donath)
- [#2725] Simplify trapz_loglog integrate method (Axel Donath)
- [#2723] Add time scale info in GTI.__repr__ (Régis Terrier)
- [#2719] Remove use of simulate_dataset from mcmc tutorial (Axel Donath)
- [#2718] Adapt TSMapEstimator to take a MapDataset as input (Régis Terrier)
- [#2715] Refactor sensitivity estimator (Axel Donath)
- [#2713] Fix 3d array convolution with 2d kernel (Quentin Remy)
- [#2712] Fix containment correction in MapDataset.to_spectrum_dataset (Régis Terrier)
- [#2711] Remove Stats class (Axel Donath)
- [#2709] Rename coordsys to frame in gammapy.maps (Axel Donath)
- [#2707] Implement MapDatasetOnOff.to_spectrum_dataset() and .cutout() (Régis Terrier)
- [#2705] Rename EnergyDispersion to EDispKernel (Axel Donath)
- [#2703] Use sphinx gallery for a model gallery (Axel Donath)
- [#2697] Add FoVBackgroundMaker class (Régis Terrier)
- [#2692] Add PSF handling to MapDataset.to_image() (Atreyee)
- [#2687] Allow interpolation of single bin axes in ScaledRegularGridInterpolator (Axel Donath)
- [#2685] Move custom model tutorial to models notebook (Quentin Remy)
- [#2684] Clean up image analysis tutorials (Atreyee Sinha)
- [#2681] Update source detection notebook (Quentin Remy)
- [#2674] Rewrite fit statistic rst page (Régis Terrier)
- [#2673] Remove hard coded true energy axis in 1D HLI (Régis Terrier)
- [#2672] Change lightcurve flare notebook to PKS 2155 flare (Régis Terrier)
- [#2667] Add MapDatasetEventSampler.event_list_meta() and .run() method (Fabio Pintore)

.. _gammapy_0p17_release:

0.17 (Apr 1, 2020)
------------------

Summary
~~~~~~~

- Released April 1, 2020
- 8 contributors
- 81 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

Gammapy v0.17 comes with important new features, an improved sub-package
structure and a more uniform API. Again this release contains several API
breaking changes and removal of non-essential parts of Gammapy. These changes
are required to finally arrive at a more consistent and stable API for
Gammapy v1.0.

The main feature introduced in Gammapy v0.17 is event sampling. Based on the
newly introduced ``MapDatasetEventSampler`` class, event lists can be sampled
from a ``MapDataset`` object. The use of this class is shown in a dedicated
`event sampling tutorial `__. Gammapy v0.17 now supports simulation and
fitting of temporal models. Both are demonstrated in the `lightcurve
simulation tutorial `__. A more general introduction to modeling and fitting
in Gammapy is now available as a `modeling and fitting tutorial `__.

Following the proposal in `PIG 19`_ the sub-package structure of Gammapy was
unified. Instead of grouping the main functionality by use-case it is now
grouped by abstractions and data levels. For this, all ``Dataset`` classes
were moved to the newly introduced `gammapy.datasets`. All ``Maker`` classes
for data reduction from the DL3 to the DL4 data level were moved to the new
`gammapy.makers` sub-package and all high level ``Estimator`` classes were
moved to the new `gammapy.estimators`. In addition, a `gammapy.visualization`
module was introduced to group plotting related functionality into a single
namespace. The `gammapy.cube`, `gammapy.spectrum` and `gammapy.detect`
modules were removed.

With the introduction of the `gammapy.estimators` sub-package the API of all
``Estimator`` classes was unified. The ``Dataset`` objects are now always
passed to the ``.run()`` methods. A new ``ExcessMapEstimator`` was introduced,
which replaces the former ``compute_lima_map`` functions and also computes
maps of upper limits as well as asymmetric flux errors.
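The unified pattern, sketched here with the ``ExcessMapEstimator`` (the
dataset file and correlation radius are illustrative, and the parameter
spelling follows the present-day API rather than the exact v0.17 signature):

.. code-block:: python

    from gammapy.datasets import MapDataset
    from gammapy.estimators import ExcessMapEstimator

    dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz")

    # the dataset is passed to .run(), not to the estimator constructor
    estimator = ExcessMapEstimator("0.1 deg")
    result = estimator.run(dataset)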
The ``TSMapEstimator`` now takes into account PSF information automatically
and uses `SkyModel` as kernel configuration.

For Gammapy v0.17 the model handling was further improved and unified. The
separate ``background_model`` argument was removed from the ``MapDataset``.
Background models can now be specified as part of the general model
definition. For this, a ``BackgroundModel.datasets_names`` attribute was
introduced, which allows declaring to which dataset the model belongs. The
application of PSF and energy dispersion can now be configured per model
component using the newly introduced ``SkyModel.apply_irf`` and
``SkyDiffuseCube.apply_irf`` keywords. A new ``GaussianTemporalModel`` and
``ExpDecayTemporalModel`` were introduced.

A new ``Covariance`` class was introduced and the covariance information was
moved from the ``Parameters`` object to a ``.covariance`` attribute on all
``Model`` and ``Models`` objects. The covariance is now automatically set
after ``Fit.covariance()`` is called.

To unify and clean up statistical calculations, ``CountsStatistics`` classes
were introduced in ``gammapy.stats``, which allow the calculation of excess,
background, significance, errors, asymmetric errors and upper limits. The
``gammapy.stats.poisson`` module has been removed as well as the
``significance_lima`` methods.

To further unify the data structures for 1D and 3D analyses a ``RegionGeom``
and ``RegionNDMap`` were introduced in ``gammapy.maps``. These region based
map classes are now used for the ``SpectrumDataset`` and
``SpectrumDatasetOnOff``. The previously used ``CountsSpectrum`` class was
removed.

Contributors
~~~~~~~~~~~~

In alphabetical order by first name:

- Atreyee Sinha
- Axel Donath
- Brigitta Sipocz
- Fabio Pintore
- José Enrique Ruiz
- Luca Giunti
- Quentin Remy
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.
See the complete `Gammapy v0.17 merged pull requests list on GitHub `__.
- [#2846] Add more meta data keywords for sampled event lists (Fabio Pintore)
- [#2841] Clean up model parameter handling (Axel Donath)
- [#2845] Add Background3D plotting methods (Atreyee Sinha)
- [#2842] Clean up gammapy.stats (Régis Terrier)
- [#2839] Improve stats documentation (Régis Terrier)
- [#2837] Improve Background2D visualisation (Atreyee Sinha)
- [#2832] Implement EDispKernelMap (Axel Donath)
- [#2829] Add event sampling tutorial (Fabio Pintore)
- [#2828] Add notebook for light curve simulation (Atreyee Sinha)
- [#2827] Improve covariance handling / implement Covariance class (Axel Donath)
- [#2823] Add temporal evaluation for spectral datasets (Atreyee Sinha)
- [#2822] Refactor model serialisation code (Quentin Remy)
- [#2820] Rename LiMaMapEstimator to ExcessMapEstimator (Régis Terrier)
- [#2818] Fix background serialization (Quentin Remy)
- [#2817] Remove SpectrumEvaluator class (Axel Donath)
- [#2816] Remove RegionGeom.energy_mask() (Axel Donath)
- [#2815] Remove CountsSpectrum class (Axel Donath)
- [#2812] Add ring background estimation in high level interface (José Enrique Ruiz)
- [#2811] Fix confidence interval range in CountsStatistic class (Régis Terrier)
- [#2810] Implement FluxEstimator (Régis Terrier)
- [#2809] Add sampling of temporal models to event sampler (Fabio Pintore)
- [#2808] Add temporal model evaluation (Atreyee Sinha)
- [#2805] Move LightcurveEstimator to gammapy.estimators (Axel Donath)
- [#2804] Remove gammapy.cube sub package (Axel Donath)
- [#2803] Remove gammapy.spectrum sub package (Axel Donath)
- [#2802] Remove gammapy.detect sub package (Axel Donath)
- [#2801] Support SpectrumDataset in FluxPointsEstimator (Régis Terrier)
- [#2799] Implement option to sample background only (Fabio Pintore)
- [#2798] Support aeff-max safe energy threshold for MapDataset (Luca Giunti)
- [#2797] Remove KernelBackgroundEstimator class (Axel Donath)
- [#2796] Change beta sign in SmoothBrokenPowerLawSpectralModel (Quentin Remy)
- [#2794] Refactor catalog registry (Axel Donath)
- [#2793] Add notebooks test to azure pipelines (Axel Donath)
- [#2792] Introduce gammapy.visualization sub-package (Axel Donath)
- [#2791] Introduce gammapy.estimators and ParameterEstimator class (Axel Donath)
- [#2790] Introduce gammapy.makers sub package (Axel Donath)
- [#2789] Move irf maps to gammapy/irf (Axel Donath)
- [#2788] Introduce gammapy.datasets submodule (Axel Donath)
- [#2787] Add TemporalModel integral method (Atreyee Sinha)
- [#2785] Datasets names follow up (Axel Donath)
- [#2784] Implement naming convention for true energy axis (Axel Donath)
- [#2783] Add __call__ method to TemporalModel (Atreyee Sinha)
- [#2782] Add datasets_names attribute to cube models (Quentin Remy)
- [#2781] Fix Jacobian factor in PSFMap.sample_coord() (Fabio Pintore)
- [#2779] Add exclusion mask tutorial (Régis Terrier)
- [#2778] Implement RegionGeom and RegionNDMap (Axel Donath)
- [#2777] Add SkyModel.apply_irf and SkyDiffuseCube.apply_irf (Quentin Remy)
- [#2776] Add support for FoVBackground on the HLI (Régis Terrier)
- [#2775] Implement CountsStatistics classes (Régis Terrier)
- [#2772] Add region serialization on CountsSpectrum (Régis Terrier)
- [#2771] Set DM primary flux to zero beyond particle mass energy (José Enrique Ruiz)
- [#2768] Refactor map dataset background model (Axel Donath)
- [#2767] Implement self synchrotron compton for NaimaModel (Quentin Remy)
- [#2765] Clean up container classes (Axel Donath)
- [#2764] Add modeling and fitting tutorial notebook (Quentin Remy)
- [#2762] Implement SignificanceMapEstimator (Régis Terrier)
- [#2761] Implement LazyFitsData descriptor (Axel Donath)
- [#2759] Fix osx travis build (Brigitta Sipocz)
- [#2720] PIG 19 - Package structure follow up (Axel Donath)

.. _PIG 19: https://docs.gammapy.org/dev/development/pigs/pig-019.html

.. _gammapy_0p18_1_release:

0.18.1 (Nov 6th, 2020)
----------------------

Summary
~~~~~~~

- Released November 6, 2020
- 3 contributors
- 6 pull requests

What's new?
~~~~~~~~~~~

This release fixes multiple bugs found after v0.18. See the list of pull
requests below for details.

Pull requests
~~~~~~~~~~~~~

- [#3116] Fix model handling in FluxEstimator (Axel Donath)
- [#3114] Corrected exclusion mask notebook (Régis Terrier)
- [#3113] Modified TSMapEstimator to keep model term (Régis Terrier)
- [#3112] Improve error messages for wrong shapes (Max Noethe)
- [#3111] Adapt ExcessMapEstimator for missing models (Régis Terrier)
- [#3110] Correct plot_residual methods (Régis Terrier)

.. _gammapy_0p18_2_release:

0.18.2 (Nov 19th, 2020)
-----------------------

Summary
~~~~~~~

- Released November 19, 2020
- 4 contributors
- 7 pull requests

What's new?
~~~~~~~~~~~

This release again fixes a few minor bugs found after v0.18.1. See the list of
pull requests below for details.

Pull requests
~~~~~~~~~~~~~

- [#3130] Fix too small energy bin handling for FluxEstimator (Axel Donath)
- [#3129] Fix spectral._propagate_error (Fabio Pintore)
- [#3127] Fix containment radius computation (Axel Donath)
- [#3126] Fix #3123 (Axel Donath)
- [#3125] Update Astropy version to 4.0 (Axel Donath)
- [#3124] Small cleanup in tutorial (Atreyee Sinha)
- [#3122] Correct excess_matching_significance behavior (Régis Terrier)

.. _gammapy_0p18_release:

0.18 (Nov 4th, 2020)
--------------------

Summary
~~~~~~~

- Released November 4, 2020
- 15 contributors
- 160 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

Gammapy v0.18 comes with many new features, a further unified API, bug fixes
and improved performance.

For Gammapy v0.18 the handling of the instrumental background was refactored
by introducing a ``FoVBackgroundModel`` which is specific to each
``MapDataset``. Instead of relying on the previous ``BackgroundModel``, which
coupled the map template and spectral parameters, the information is now split
such that ``MapDataset.background`` contains the map template and the
``FoVBackgroundModel`` the corresponding parameters, respectively. The
model-corrected background can now be accessed via
``MapDataset.npred_background()``. By default the un-corrected background map
is now added to the predicted counts. Consistent behaviour has been
implemented for the ``MapDatasetOnOff``. The ``FoVBackgroundModel`` has
support for a ``spectral_model`` argument, which allows different background
corrections using different "norm" spectral models (see below).
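A minimal sketch of the new background handling (the dataset name is
hypothetical; if ``spectral_model`` is omitted a norm model is used by
default):

.. code-block:: python

    from gammapy.modeling.models import FoVBackgroundModel, PowerLawNormSpectralModel

    # one background model per dataset, attached via the dataset name
    bkg_model = FoVBackgroundModel(
        dataset_name="obs-1",
        spectral_model=PowerLawNormSpectralModel(),
    )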
The definition of the quantities on ``MapDataset`` and ``MapDatasetOnOff``, as
well as the corresponding attributes on the ``CountsStatistics``, has been
clarified in the stats definition table :ref:`stats_notation`. Many additional
methods to modify the data have been added to the ``MapDataset`` and
``SpectrumDataset``. This includes ``.downsample()``, ``.pad()``,
``.resample_energy_axis()``, ``.slice_by_energy()`` and ``.slice_by_idx()``.
The models definition in the dataset is now reset consistently in all of the
listed methods. The information returned by ``.info_dict()`` has been unified.
The information contained in ``.meta_table`` is now handled correctly during
stacking of datasets.

Following the proposal in `PIG 21`_ Gammapy v0.18 comes with many improvements
to the ``gammapy.modeling.models`` sub-package. This includes the introduction
of "norm" spectral models, which are multiplicative models to be used with
spatial and spectral template or parametric models. Among those are a
``PowerLawNormSpectralModel``, ``LogParabolaNormSpectralModel`` and
``PiecewiseNormSpectralModel``. The EBL absorption was refactored accordingly
to an ``EBLAbsorptionNormSpectralModel``. A new
``GeneralizedGaussianSpatialModel`` and ``BrokenPowerLawSpectralModel`` have
been introduced. Gammapy v0.18 now comes with support for custom energy
dependent spatial models. An example for this can be found in the `models
tutorial `__. The ``SkyDiffuseCube`` has been removed; the same functionality
can now be achieved with the ``TemplateSpatialModel``. Following the proposal
in `PIG 21`_, short YAML tags were introduced for all models. An overview of
the tags can be found in a table in the linked PIG document.

For Gammapy v0.18 the stacking behaviour of the ``EDispKernelMap`` was fixed.
It now handles safe energy thresholds for stacked analyses correctly. The
``MapDataset.edisp`` attribute has been changed to this class by default. The
``IRFStacker`` class has been removed.

The ``Estimator`` API has been homogenized; in particular, estimators now
support the definition of energy edges on init. This means light-curves,
excess maps, excess profiles and ts maps can be computed in energy bins with
the same API. The ``LightCurveEstimator`` was refactored to rely on the same
algorithm as the ``FluxPointsEstimator``. It now also fits the data in the
provided energy range.

The ``Map`` class now supports boolean and comparator operations, such as
``>``, ``<``, ``&`` and ``|``. A convenience method ``Map.interp_to_geom()``
has been added.

A ``Fit.stat_surface()`` method was introduced which allows computing a fit
statistic surface. In addition an option to store the trace of the fit was
added. Both are demonstrated in dedicated sections in the `modeling and
fitting tutorial `__.

Following the proposal in `PIG 19`_, the ``gammapy.time`` sub-package was
removed. All functionality related to light curves can be found in
``gammapy.estimators``. The Feldman-Cousins method has been removed from
``gammapy.stats``.

Gammapy v0.18 comes with improved performance thanks to caching of the
predicted-counts computation and of the PSF convolution. In Gammapy v0.18 the
support for Numpy<1.17 has been dropped.
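The comparator support makes mask construction concise. A minimal sketch (map
size, fill values and thresholds are arbitrary):

.. code-block:: python

    import numpy as np
    from gammapy.maps import Map

    m = Map.create(npix=50, binsz=0.05)
    m.data += np.random.default_rng(0).poisson(1.0, size=m.data.shape)

    # comparators return boolean maps, which combine with & and |
    mask = (m > 2) & (m < 10)
    print(mask.data.sum())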
.. _PIG 19: https://docs.gammapy.org/dev/development/pigs/pig-019.html

Contributors
~~~~~~~~~~~~

In alphabetical order by first name:

- Alexis de Almeida Coutinho
- Atreyee Sinha
- Axel Donath
- Bruno Khelifi
- Cosimo Nigro
- Fabio Acero
- Fabio Pintore
- Jalel Hajlaoui
- José Enrique Ruiz
- Laura Olivera Nieto
- Léa Jouvin
- Luca Giunti
- Max Noethe
- Quentin Remy
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.
See the complete `Gammapy v0.18 merged pull requests list on GitHub `__.

- [#3106] Remove default FoVBackgroundModel (Axel Donath)
- [#3100] Simplify EBL absorption spectral model (Quentin Remy)
- [#3092] Update energy naming convention (Fabio Pintore)
- [#3091] Implement Dataset.slice_by_energy (Axel Donath)
- [#3089] Introduce DatasetModels class and global model (Axel Donath)
- [#3088] Allowing Estimators norm parameter to be negative (Régis Terrier)
- [#3086] Stats background convention (Axel Donath)
- [#3085] Remove feldman cousins method (Axel Donath)
- [#3083] Example of energy dependent spatial model (Atreyee Sinha)
- [#3081] Unify axis ordering in gammapy.irf (Atreyee Sinha)
- [#3080] Remove significance and replace with sqrt_ts (Régis Terrier)
- [#3076] Introduce MapDataset.geoms property (Axel Donath)
- [#3074] Implement option to store fit trace to Fit (Axel Donath)
- [#3072] Allow to apply PSF in reconstructed energy (Axel Donath)
- [#3070] Remove intervals option from integrate_spectrum() (Axel Donath)
- [#3069] Remove pre-processing from Fermi tutorial (Axel Donath)
- [#3063] Add PiecewiseNormSpectralModel (Quentin Remy)
- [#3060] Remove code duplication between MapDataset and SpectrumDataset (Axel Donath)
- [#3058] Clean up MapDataset mask handling (Axel Donath)
- [#3054] Unify dataset info dicts (Axel Donath)
- [#3053] Add bkg systematics condition for the sensitivity computation (Bruno Khelifi)
- [#3052] Adapt LightCurveEstimator to take energy edges (Régis Terrier)
- [#3051] Introduce dataset specific FoVBackgroundModel (Axel Donath)
- [#3045] Add temporal models to model gallery (Jalel Hajlaoui)
- [#3042] Refactor SpectrumDataset to use exposure (Axel Donath)
- [#3041] Add SpectralModel.integral_error (Fabio Pintore)
- [#3039] Use MapAxis in gammapy.irf consistently (Axel Donath)
- [#3038] Implement Fit.stat_surface() method (Luca Giunti)
- [#3037] Add generalized gaussian model (Quentin Remy)
- [#3035] Update Numpy to 1.17 (Axel Donath)
- [#3032] Introduce MapAxes object (Axel Donath)
- [#3030] Fix inconsistency between EventList.stack() and GTI.stack() (Laura Olivera Nieto)
- [#3012] Replace SkyDiffuseCube by TemplateSpatialModel (Quentin Remy)
- [#3007] Support Map based IRFs in MapDatasetMaker (Laura Olivera Nieto)
- [#3005] Allow custom spectral models corrections for BackgroundModel (Quentin Remy)
- [#3002] Implement PSFMap.from_gaussian (Laura Olivera Nieto)
- [#3001] Improve the datasets plot/peek interface (Alexis de Almeida Coutinho)
- [#2999] Add e_edges to AsmoothMapEstimator (Axel Donath)
- [#2998] Add e_edges to ExcessMapEstimator (Régis Terrier)
- [#2993] Reuse FluxPointsEstimator in LightCurveEstimator (Axel Donath)
- [#2992] Implement WcsNDMap.to_region_nd_map() (Axel Donath)
- [#2991] Implement energy slicing for FluxPointsEstimator (Axel Donath)
- [#2990] Optional exposure map for the EdispMap and PSF in the MapDataset (Laura Olivera Nieto)
- [#2984] Change SpectrumDataset.aeff to RegionNDMap (Axel Donath)
- [#2981] Add basic NormSpectralModels (Quentin Remy)
- [#2976] Fix filename handling in read/write methods (Alexis de Almeida Coutinho)
- [#2974] Implement meta table stacking (Axel Donath)
- [#2967] Allow for varying energy range between datasets in FluxPointEstimator (Axel Donath)
- [#2966] Implement MapDataset.slice_by_idx (Axel Donath)
- [#2965] Add Map.to_cube() (Atreyee Sinha)
- [#2956] Implement MapDataset.downsample() and MapDataset.pad() (Axel Donath)
- [#2951] Implement Map.resample_axis() method (Axel Donath)
- [#2950] Remove IRFStacker class (Axel Donath)
- [#2948] Add ExcessProfileEstimator class (Bruno Khelifi)
- [#2947] Improve spectral residuals plot (Luca Giunti)
- [#2945] PSF-convolved spatial model caching in MapEvaluator (Quentin Remy)
- [#2944] PIG 21 - Model framework improvements (Axel Donath)
- [#2943] Add BrokenPowerLawSpectralModel (Quentin Remy)
- [#2939] Add theta squared plot example (Léa Jouvin)
- [#2938] Add shorter tags for models (Quentin Remy)
- [#2932] Fix plot_spectrum_datasets_off_regions and add more options (Alexis de Almeida Coutinho)
- [#2931] Remove gammapy.time sub-package (Axel Donath)
- [#2929] Add meta_table to SpectrumDataset (Fabio Pintore)
- [#2927] Introduce Maker and Estimator base classes and registries (Axel Donath)
- [#2924] Add meta_table to MapDataset (Fabio Pintore)
- [#2912] Cache npred in MapEvaluator (Quentin Remy)
- [#2907] Add info_dict to MapDataset (Atreyee Sinha)
- [#2903] Add multi-dimension support for RegionGeom (Régis Terrier)
- [#2897] Change to EDispKernelMap in MapDataset (Régis Terrier)
- [#2896] Add pyproject.toml (Max Noethe)
- [#2891] Modify SpectrumDataset.create() to take MapAxis arguments (Régis Terrier)
- [#2885] Add comparators on Map (Régis Terrier)
- [#2874] Fix IRFMap stacking (Régis Terrier)
- [#2872] Fix MCMC position spread (Fabio Acero)

.. _PIG 21: https://docs.gammapy.org/dev/development/pigs/pig-021.html

.. _gammapy_0p19_release:

0.19 (Nov 22nd, 2021)
---------------------

Summary
~~~~~~~

- Released November 22, 2021
- 19 contributors
- 390 pull requests so far (not all listed below)

What's new?
~~~~~~~~~~~

Gammapy v0.19 comes with many new features, a further unified API, bug fixes
and improved performance. It provides a nearly complete API for the coming
1.0 release. API changes are relatively limited and mostly concern the
`gammapy.estimators` subpackage. Its API has been unified and now heavily
relies on the ``Map`` structure. A wiki page containing specific information
on the main API changes from v0.18.2 to v0.19 is available `here `__.

Documentation has been significantly reorganized. In particular, the tutorials
have been restructured into subfolders based on the type of analysis they
describe. Tutorials are now presented in a `nice gallery view `__ format. Some
tutorials have been removed because their objectives are too specific for the
general documentation. To present these specific, possibly contributed
examples, a new web page and repository have been created: `gammapy.recipes
`__. Please follow `these instructions `__ if you would like to contribute to
the Gammapy recipes with your own.

Several additions were made to the `gammapy.maps` subpackage. A new ``Maps``
container has been introduced. It is a `dict` like structure that stores
``Map`` objects sharing the same geometry. It allows easy serialization and
I/O to FITS for such objects.
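A minimal sketch of the dict-like usage (names and geometry are arbitrary; the
I/O call assumes the FITS ``write`` convenience advertised above):

.. code-block:: python

    from gammapy.maps import Map, Maps

    counts = Map.create(npix=10, binsz=0.1)

    maps = Maps()  # dict-like container; entries must share one geometry
    maps["counts"] = counts
    maps["exposure"] = counts.copy()

    maps.write("maps.fits", overwrite=True)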
A ``MapAxes`` class, a container for ``MapAxis`` objects, has been introduced
as well. It provides convenience functionality to handle multiple non-spatial
axes. This is especially useful after the addition of new axis types. So far,
``MapAxis`` only supports contiguous axes defined either through their edges
or their centers. Non-contiguous axis types have been added. ``TimeMapAxis``
provides an axis for non-adjacent time intervals. Similarly, ``LabelMapAxis``
provides an axis for non-numeric entries.

The `gammapy.estimators` package has been largely restructured. All
``Estimator`` objects now share the same default options; in particular, the
full error and upper limit calculations are no longer performed by default,
but can be enabled by passing the `selection_optional` parameter. The class
``FluxMaps`` has been introduced to provide a uniform scheme for the products
of the ``Estimators``. It uses the ``Map`` API as a common container for
estimator results. Region based maps, ``RegionNDMap``, are used to contain
``FluxPointsEstimator`` and ``LightCurveEstimator`` results and WCS based
maps, ``WcsNDMap``, for ``TSMapEstimator`` and ``ExcessMapEstimator``. This
provides many advantages. Internally, this greatly reduces code duplication
and removes unnecessary structures. For instance, the ``LightCurve`` class has
been removed, since all its functionality is now supported by ``FluxPoints``,
which can carry a ``TimeMapAxis``. The internal data model of high level
products is therefore much more independent of the data formats defined
outside Gammapy. Some convenience methods have been introduced to support flux
conversion from estimates based on a reference model and `norm` values into
various fluxes as defined in gadf (namely `dnde`, `e2dnde`, `flux` or
`eflux`). ``FluxMaps`` also allows simple serialization of the maps. Following
this new scheme, a ``FluxProfileEstimator`` has been added.

The subpackage `gammapy.datasets` has been re-organized. Its API does not
change significantly but it has been simplified internally.
``SpectrumDataset`` now inherits from ``MapDataset`` which removes a lot of
code duplication. The APIs for the creation of both map and spectrum datasets
are now similar and rely on ``Geom``. IRFs can now be natively integrated or
averaged over the region using a simple `use_region_center` parameter in the
``SpectrumDatasetMaker``. Model evaluation is now possible on a ``RegionGeom``
thanks to the ``RegionGeom.wcs`` attribute. Spatial models can now be
evaluated using ``SpectrumDataset``. These changes allow direct support of
extended sources in 1D analysis. In addition, healpy support has been improved
with the addition of a `convolve` method on the ``HpxNDMap`` (which relies on
a local WCS projection) and improved I/O methods. Computation of `npred` on a
``MapDataset`` with a ``HpxGeom`` geometry is now possible.

`gammapy.irf` has been significantly updated. A general ``IRF`` base class is
now used. It relies on ``MapAxes`` and ``Quantity`` and provides most of the
methods such as interpolation and I/O. A registry of IRFs is now added. All
these new features considerably simplify the addition of a new type of IRF.

Model handling has been strongly simplified thanks to a number of helper
methods that allow one, e.g., to freeze all spectral or spatial parameters of
a given ``Models`` object, or to select all components of a ``Models`` object
lying inside a given region.
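A minimal sketch of these helpers, based on the corresponding PRs listed below
([#3169], [#3148]); the model file is hypothetical:

.. code-block:: python

    import astropy.units as u
    from astropy.coordinates import SkyCoord
    from regions import CircleSkyRegion
    from gammapy.modeling.models import Models

    models = Models.read("my-models.yaml")  # hypothetical model file

    # freeze all spatial parameters at once
    models.freeze(model_type="spatial")

    # keep only the components lying inside a region of interest
    center = SkyCoord(0, 0, unit="deg", frame="galactic")
    region = CircleSkyRegion(center, radius=1 * u.deg)
    selected = models.select_region([region])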
New catalog information has been added to `gammapy.catalog`, namely the data
release 2 of the 4FGL and the third HAWC catalog.

For Gammapy v0.19 we dropped the support for Python 3.6; please use Python 3.7
or later. We also updated to use regions v0.5 and iminuit 2.8.4. After changes
in the Travis CI platform, the Continuous Integration (CI) has been simplified
and moved from Travis to GitHub actions.

Contributors
~~~~~~~~~~~~

In alphabetical order by first name:

- Alexis Coutinho
- Atreyee Sinha
- Axel Donath
- Bruno Khelifi
- Cosimo Nigro
- Dimitri Papadopoulos
- Fabio Acero
- Fabio Pintore
- Jalel Hajlaoui
- Johannes Buchner
- José Enrique Ruiz
- Laura Olivera Nieto
- Luca Giunti
- Mathieu de Bony
- Maximilian Nöthe
- Quentin Remy
- Régis Terrier
- Sebastian Panny
- Vikas Joshi

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.

- [#3592] Avoid copy in background model evaluation (Quentin Rémy)
- [#3584] Return Maps object in ASmoothMapEstimator (Axel Donath)
- [#3568] Add test for event sampler using CTA IRF alpha configuration (Fabio Pintore)
- [#3561] Add FluxPoints.is_ul setter and fix its serialisation (Quentin Rémy)
- [#3559] Codespell infrastructure (Dimitri Papadopoulos)
- [#3544] Add FluxMaps.success quantity (Bruno Khelifi)
- [#3529] Correct map evaluator to avoid memory overload (Régis Terrier)
- [#3523] Add .sample_time to GaussianTemporalModel and ExpDecayTemporalModel (Fabio Pintore)
- [#3515] Add RadMax2d IRF (Maximilian Nöthe)
- [#3491] Use FluxMaps as return type for ExcessMapEstimator (Axel Donath)
- [#3488] Add a Datasets API notebook (Atreyee Sinha)
- [#3480] Implement FluxProfileEstimator (Axel Donath)
- [#3474] Return min/max energy maps from energy_range methods (Alexis Coutinho)
- [#3471] Rename MapDataset.to_spectrum and improve docstring (Axel Donath)
- [#3468] Prevents spatial model integration if no psf is present and add test (Régis Terrier)
- [#3466] Restructure gammapy maps with subfolders (Axel Donath)
- [#3458] Support binned time series format in FluxPoints.to_table() (Axel Donath)
- [#3454] Add npred quantities to flux point computation (Axel Donath)
- [#3453] Adapt Gammapy to regions v0.5 (Axel Donath)
- [#3449] Remove LightCurve class (Axel Donath)
- [#3447] Implement plotting of stat profiles for lightcurves (Axel Donath)
- [#3446] corrected axis label units for PSF containment radius (Sebastian Panny)
- [#3445] Use FluxPoints object in LightcurveEstimator (Axel Donath)
- [#3439] Allow to choose scaling method per parameter (Quentin Rémy)
- [#3438] Unify plot unit handling (Axel Donath)
- [#3434] Time axis plotting (Axel Donath)
- [#3428] Support serialisation of TimeMapAxis (Axel Donath)
- [#3426] Cleanup reflected regions finder (Axel Donath)
- [#3423] Implement LabelMapAxis (Axel Donath)
- [#3420] Disable IRF extrapolation (Quentin Rémy)
- [#3418] Refactor FluxPoints to rely on maps internally (Axel Donath)
- [#3416] Mask invalid background values in SafeMaskMaker (Quentin Rémy)
- [#3413] Introduce inheritance for Estimator classes (Axel Donath)
- [#3409] Support iminuit v2.0 (Axel Donath)
- [#3406] Add scan specification to the Parameter object (Axel Donath)
- [#3404] Add is_pointlike property on irfs (Régis Terrier)
- [#3403] Add sparse option to get_coord() methods (Axel Donath)
- [#3402] Rename energy_range to energy_bounds (Fabio Pintore)
- [#3399] Implement WcsGeom.from_aligned (Axel Donath)
- [#3397] Use string for model name in npred_signal() (Atreyee Sinha)
- [#3395] Remove counts data caching from MapDataset (Axel Donath)
- [#3393] Implement TimeMapAxis class (Régis Terrier)
- [#3392] Implement padding for TSMapEstimator (Axel Donath)
- [#3390] Fix parameter type for PiecewiseNormSpectralModel and NaimaSpectralModel (Quentin Rémy)
- [#3381] Fix FoVBackgroundMaker error (Atreyee Sinha)
- [#3379] Use find_roots to call root finding methods (Quentin Rémy)
- [#3377] Expand $GAMMAPY_DATA defined in HLI config (Jose Enrique Ruiz)
- [#3374] Fix off position only created for first observation in make_theta_squared_table (Maximilian Nöthe)
- [#3369] Enable reading psf and edisp for MapDatasetOnOff (Atreyee Sinha)
- [#3363] Read healpy maps with col name (Vikas Joshi)
- [#3358] Complete test coverage in docstrings of python files (Jose Enrique Ruiz)
- [#3357] Improve test coverage in RST files (Jose Enrique Ruiz)
- [#3353] Tweak in MapEvaluator.need_update for performance (Quentin Rémy)
- [#3349] Cleanup MapEvaluator and add more caching (Quentin Rémy)
- [#3347] Simplify and refactor installation instructions (Jose Enrique Ruiz)
- [#3346] Allow Npred computation for Healpix MapDataset (Laura Olivera Nieto)
- [#3343] Display progress bar in gammapy (Luca Giunti)
- [#3342] Add truncation value to npred in cash statistics calculation (Régis Terrier)
- [#3338] Add fixed_offset attribute to SafeMaskMaker (Fabio Pintore)
- [#3337] Rename BackgroundModel to TemplateNPredModel (Axel Donath)
- [#3335] Changed p-value for CountsStatistic (Régis Terrier)
- [#3333] Add sed_type keyword to SpectralModel.plot (Fabio Pintore)
- [#3328] Make FluxPoints inherit from FluxEstimate (Axel Donath)
- [#3323] Implement HpxNDMap.convolve() (Laura Olivera Nieto)
- [#3320] Add functionality to read healpy maps (Vikas Joshi)
- [#3319] Modify FoVBackgroundMaker to take spectral model as argument (Régis Terrier)
- [#3314] Implement HpxNDMap.smooth() (Laura Olivera Nieto)
- [#3310] Hpx separation method (Vikas Joshi)
- [#3309] Improve doc and cleanup for selection_optional argument in estimators (Quentin Rémy)
- [#3308] Implement HpxNDMap.to_region_nd_map (Axel Donath)
- [#3307] Add support for ExcessMapEstimator on high level interface (Régis Terrier)
- [#3306] Allow to specify spectral model in ExcessMapEstimator (Quentin Rémy)
- [#3305] Implement HpxGeom.to_binsz() (Axel Donath)
- [#3304] Implement stacking of healpix maps (Axel Donath)
- [#3303] Fix npred after energy resampling in ExcessMapEstimator (Quentin Rémy)
- [#3302] Implement healpix cutout (Axel Donath)
- [#3301] Change HLI default for Fit range (Régis Terrier)
- [#3293] Add link to tutorials as thumbnails in API docs (Jose Enrique Ruiz)
- [#3285] Use FluxMaps in TSMapEstimator (Axel Donath)
- [#3284] Fixing Matplotlib deprecation of nonposx (Fabio Acero)
- [#3283] Add Models.plot_spatial() (Atreyee Sinha)
- [#3281] Refactor and shorten overview docs page (Axel Donath)
- [#3279] Improve visibility of models gallery (Jose Enrique Ruiz)
- [#3278] Extend documentation of quantities returned by estimators (Axel Donath)
- [#3277] Change Estimator optional default selection (Axel Donath)
- [#3276] Group tutorials files in category folders (Jose Enrique Ruiz)
- [#3272] Add SourceCatalog.to_models() and improve models selection from catalogues (Quentin Rémy)
- [#3262] Cleanup maps tutorial and rst page (Quentin Rémy)
- [#3258] Add support for RegionGeom to MapDataset (Axel Donath)
- [#3257] Remove code duplication in EDispMap.to_edisp_kernel_map() (Axel Donath)
- [#3254] Add .mailmap (Maximilian Nöthe)
- [#3246] Clean up and fix WcsNDMap binary operations (Axel Donath)
- [#3243] Support I/O for RegionGeom in MapDataset (Axel Donath)
- [#3241] Update 4FGL catalogue to DR2 (Quentin Rémy)
- [#3238] Unify integration in EDispMap.get_edisp_kernel() and EnergyDispersion2D.get_edisp_kernel() (Axel Donath)
- [#3230] Expand extended source analysis tutorial (Laura Olivera Nieto)
- [#3228] Remove EffectiveAreaTable class (Axel Donath)
- [#3222] Add alternative parametrization for ShellSpatialModel (Quentin Rémy)
- [#3219] Add missing values interpolation for Background3D (Quentin Rémy)
- [#3217] Make is_norm_spectral_model as classproperty (Atreyee Sinha)
- [#3216] Add a notebook to explain model management (Atreyee Sinha)
- [#3211] Refactor notebooks processing (Jose Enrique Ruiz)
- [#3208] Cleanup Dataset.plot_residuals() (Atreyee Sinha)
- [#3207] Remove EnergyDependentTablePSF (Axel Donath)
- [#3202] Add an option to the SpectrumDatasetMaker to average IRFs over a Region (Laura Olivera-Nieto)
- [#3199] Enable tests of code in RST files and in docstrings of Python scripts (Jose Enrique Ruiz)
- [#3197] Introduce PSF and ParametricPSF base classes (Axel Donath)
- [#3191] Remove code duplication between SpectrumDatasetMaker and MapDatasetMaker (Axel Donath)
- [#3185] Introduce IRF base class and remove code duplication (Axel Donath)
- [#3182] Introduce FluxEstimate class (Régis Terrier)
- [#3180] Remove code duplication from models.spectral (Fabio Pintore)
- [#3178] Clean up PSF classes in gammapy.irf (Axel Donath)
- [#3173] Restructure hawc catalog in 2hwc and 3hwc (Jalel Hajlaoui)
- [#3169] Add .select_region() and .select_mask() methods to Models (Quentin Rémy)
- [#3168] Remove evaluator update from models setter (Quentin Rémy)
- [#3165] Improve documentation of RegionGeom and RegionNDMap (Laura Olivera Nieto)
- [#3162] Correct Models.from_dict to check input parameters names (Régis Terrier)
- [#3158] Add binary_erosion/dilation to WcsNDMap (Quentin Rémy)
- [#3155] Add a tutorials notebooks gallery layout (Jose Enrique Ruiz)
- [#3153] Refactor gammapy download (Jose Enrique Ruiz)
- [#3152] Unify dataset I/O interface (Axel Donath)
- [#3148] Add various models and parameters management options (Quentin Rémy)
- [#3145] Change the _compute_flux_spatial method on MapEvaluator (Fabio Pintore)
- [#3141] Allow SkyModel.integrate_geom to integrate over a RegionGeom (Fabio Pintore)
- [#3140] Add an option to ExcessMapEstimator to choose to correlate off events (Mathieu de Bony)
- [#3138] Migrate Travis CI to GitHub actions (Jose Enrique Ruiz)
- [#3136] Evaluate spatial model in RegionGeom (Laura Olivera Nieto)
- [#3131] Further remove code duplication between SpectrumDataset and MapDataset (Axel Donath)

.. _gammapy_0p2_release:

0.2 (Apr 13, 2015)
------------------

Summary
~~~~~~~

- Released Apr 13, 2015
- 4 contributors (1 new)
- 8 months of work
- 40 pull requests
- Requires Astropy version 1.0 or later.
- Gammapy now uses `Cython `__, i.e. requires a C compiler for end-users and in addition Cython for developers.
Contributors
~~~~~~~~~~~~

- Manuel Paz Arribas (new)
- Christoph Deil
- Axel Donath
- Ellis Owen

Pull requests
~~~~~~~~~~~~~

- [#254] Add changelog for Gammapy (Christoph Deil)
- [#252] Implement TS map computation in Cython (Axel Donath)
- [#249] Add data store and observation table classes, improve event list classes (Christoph Deil)
- [#248] Add function to fill acceptance image from curve (Manuel Paz Arribas)
- [#247] Various fixes to image utils docstrings (Manuel Paz Arribas)
- [#246] Add catalog and plotting utils (Axel Donath)
- [#245] Add colormap and PSF inset plotting functions (Axel Donath)
- [#244] Add 3FGL to dataset fetch functions (Manuel Paz Arribas)
- [#236] Add likelihood converter function (Christoph Deil)
- [#235] Add some catalog utilities (Christoph Deil)
- [#234] Add multi-scale TS image computation (Axel Donath)
- [#231] Add observatory and data classes (Christoph Deil)
- [#230] Use setuptools entry_points for scripts (Christoph Deil)
- [#225] Misc cleanup (Christoph Deil)
- [#221] TS map calculation update and docs (Axel Donath)
- [#215] Restructure TS map computation (Axel Donath)
- [#212] Bundle xmltodict.py in gammapy/extern (Christoph Deil)
- [#210] Restructure image measurement functions (Axel Donath)
- [#205] Remove healpix_to_image function (moved to reproject repo) (Christoph Deil)
- [#200] Fix quantity errors from astro source models (Christoph Deil)
- [#194] Bundle TeVCat in gammapy.datasets (Christoph Deil)
- [#191] Add Fermi PSF dataset and example (Ellis Owen)
- [#188] Add tests for spectral_cube.integral_flux_image (Ellis Owen)
- [#187] Fix bugs in spectral cube class (Ellis Owen)
- [#186] Add iterative kernel background estimator (Ellis Owen)

.. _gammapy_0p20p1_release:

0.20.1 (June 16th, 2022)
------------------------

Summary
~~~~~~~

- Released June 17th, 2022
- 10 contributors
- 21 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

This release is a bug fix for v0.20. It corrects several issues that were
recently reported and improves the documentation.

Data with `Background2D` IRFs were not handled correctly by the
`MapDatasetMaker` in v0.20. This is now corrected.

Error bars in spectral excess plots were not correctly computed, which
resulted in inconsistent displays between excess and residual panels. The
correct excess error is now computed.

Spectral extraction with point-like IRFs could fail in some cases where no
counts are found within the radmax extension. This is due to an inconsistent
behavior of the `PointSkyRegion.contains` method. A patch has been introduced
to correct for that before this is fixed in the `regions` package.

The new `Observation.peek()` method no longer returns empty panels when some
IRFs are not available.

Some issues affecting the `SafeMaskMaker` have been corrected, as well as a
bug of the `ExcessMapEstimator` appearing when the input `Dataset` had no mask
defined.

The folder structure of the documentation sources has been updated to closely
follow the structure of the rendered documentation. This corrects some
inconsistencies appearing in the lateral table of contents both in the main
documentation and the tutorial pages.
Contributors
~~~~~~~~~~~~

- Mathieu de Bony
- Axel Donath
- Luca Giunti
- Bruno Khelifi
- Maximilian Nöthe
- Laura Olivera-Nieto
- Fabio Pintore
- Quentin Rémy
- Atreyee Sinha
- Régis Terrier

Pull Requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.

- [#3990] Restructure docs folder (Axel Donath)
- [#3988] Fix issue [#3983] (Laura Olivera-Nieto)
- [#3987] Fix the SafeMaskMaker (Bruno Khelifi)
- [#3986] Fix bug in RegionNDMap.fill_events with PointSkyRegion (Régis Terrier)
- [#3979] Fix inconsistency between NMCID and number of MID* keywords in MapDatasetEventSampler (Fabio Pintore)
- [#3975] Fix error bars on plot_excess method of SpectrumDataset(OnOff) (Mathieu de Bony)
- [#3966] Adapt Observation.peek according to available hdus (Atreyee Sinha)
- [#3959] Correct make_map_background_irf bug with Background2D (Régis Terrier)
- [#3948] Load all available HDUs in DataStore.obs, not only the required_hdus (Maximilian Nöthe)

.. _gammapy_0p20_release:

0.20 (May 12th, 2022)
---------------------

Summary
~~~~~~~

- Released May 12th, 2022
- 15 contributors
- 184 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

*Support for energy dependent ON region spectral extraction has been added*

- The RADMAX IRF is now fully supported.
- A specific `WobbleRegionFinder` has been introduced to find OFF regions at symmetric positions from the center of the field-of-view.
- A tutorial demonstrating how to perform a point-like analysis is provided. It uses a set of two test observations provided by the MAGIC collaboration to perform testing and validation of the method. We thank the MAGIC collaboration for their contribution.

*New theme and restructuring of the documentation*

- The documentation has been completely restructured. The PyData Sphinx theme is now applied.
- Automated testing of docstring example code has been added. All docstrings have been fixed.

Package structure and dependencies
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This new release provides a release candidate for v1.0. It serves as the basis
for the first long term stable (LTS) release of Gammapy. To ensure long term
support, the minimal astropy version is 5.0, and the minimal python version is
3.8. Because they are necessary to the vast majority of Gammapy use cases,
iminuit and matplotlib are now required dependencies. Matplotlib >=3.5 is now
supported as well as regions 0.6.

Bug fixes and improvements
~~~~~~~~~~~~~~~~~~~~~~~~~~

*gammapy.data*

- To correctly track observation information for simulations, `Observation.create()` now takes an `observatory_location` argument (default is `None`).
- An `EventList.write()` convenience method has been added.
- The observation table is now optional on the `DataStore`, as per g.a.d.f.
- A list of required IRFs can be passed to `DataStore.get_observations()`. Predefined shortcuts are provided for `full-containment` (default) and `point-like` (see the sketch below).
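A minimal sketch of the IRF requirement option (the data path is illustrative,
and the keyword name ``required_irf`` and the shortcut string
``full-enclosure`` are assumptions to verify against the
`DataStore.get_observations()` documentation):

.. code-block:: python

    from gammapy.data import DataStore

    data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")

    # require the full set of full-enclosure IRFs (aeff, edisp, psf, bkg);
    # "point-like" would instead select observations carrying RAD_MAX info
    observations = data_store.get_observations(required_irf="full-enclosure")
    print(len(observations))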
*gammapy.estimators*

- A method to resample the energy axis of a `Dataset` based on counts statistics criteria (e.g. a minimum number of `counts` or `counts_off`): `gammapy.estimators.utils.resample_energy_axis()`

*gammapy.makers*

- The behaviour of the `bkg-peak` method of the `SafeMaskMaker` has been changed to avoid removing the first energy bin if the maximum number of background counts is in the first energy bin.

*gammapy.modeling*

- A new `is_norm` property has been added to the `Parameter` object. It is used to determine the parameter giving the flux normalization and solves a bug with the flux point estimation using a `CompoundModel`.
- `Fit.covariance()` now correctly handles shared (linked) parameters.
- `FitResult` now contains a copy of the `Models` after the fit. Its `parameters` attribute is not modified by further manipulation of the initial model.

*gammapy.analysis*

- `DatasetsMaker` is now internally used and can be configured in the yaml file.

*gammapy.catalog*

- The 4FGL data release 3 has been added.

Contributors
~~~~~~~~~~~~

- Fabio Acero
- Arnau Aguasca
- Tyler Cahill
- Alisha Chromey
- Axel Donath
- Luca Giunti
- Cosimo Nigro
- Maximilian Nöthe
- Laura Olivera-Nieto
- Dimitri Papadopoulos
- Fabio Pintore
- Quentin Rémy
- Jose Enrique Ruiz
- Atreyee Sinha
- Régis Terrier

Pull Requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.

- [#3941] Introduce matplotlib as required dependency (Régis Terrier)
- [#3936] Introduce iminuit as required dependency (Axel Donath)
- [#3933] Add support for regions v0.6 (Axel Donath)
- [#3918] Improve fit results (Luca Giunti)
- [#3915] Fix evaluate_containment calculation in class PSFKing(ParametricPSF) (Alisha Chromey)
- [#3906] Require astropy v>=5.0 (Axel Donath)
- [#3905] Use .is_norm property in FluxEstimator (Axel Donath)
- [#3904] Fix bug on MC_ID when sampling IRF background events (Fabio Pintore)
- [#3898] Add Map.resample method (Quentin Rémy)
- [#3895] Add MapAxes .__eq__, .__neq__, and .copy() (Quentin Rémy)
- [#3892] Allow for regular reflected region analysis with fixed rad_max IRFs (Régis Terrier)
- [#3887] Implement __eq__ for DL3 IRFs (Atreyee Sinha)
- [#3876] Add temporal model on flux point dataset (Atreyee Sinha)
- [#3874] Allow MapDatasetOff in Datasets serialization and avoid read_lazy (Quentin Rémy)
- [#3873] Modify SafeMaskMaker to ignore data only below the background peak (Atreyee Sinha)
- [#3860] Introduce is_norm parameter property (Axel Donath)
- [#3856] Make obs_table optional on DataStore (Régis Terrier)
- [#3846] Allows DataStoreMaker to be used with IRFs not following CALDB structure (Quentin Rémy)
- [#3842] Update SourceCatalog4FGL class to support changes in data release 3 (Quentin Rémy)
- [#3837] Allow nearest neighbor interpolation with scalar data (Axel Donath)
- [#3833] Automate generation of codemeta.json and .zenodo.json files (Jose Enrique Ruiz)
- [#3817] Improve documentation theme (Jose Enrique Ruiz)
- [#3810] Fix doc strings in estimators (Atreyee Sinha)
- [#3806] Improve documentation for the DatasetsMaker (Quentin Rémy)
- [#3804] Event wise rad max (Maximilian Nöthe)
- [#3802] Use DatasetsMaker in Analysis class (Quentin Rémy)
- [#3797] Refactor pointing information handling and Observation.create (Maximilian Nöthe)
- [#3796] Helper function to rebin map axis (Luca Giunti)
- [#3783] Speed docs building process (Jose Enrique Ruiz)
- [#3777] Validate EnergyDispersion2D units (Fabio Pintore)
- [#3761] Execute notebooks in parallel (Maximilian Nöthe)
- [#3760] Fix issues reported by pyflakes and add pyflakes step to ci (Maximilian Nöthe)
- [#3752] Human readable energy units string formatting for plot_interactive & plot_grid (Fabio Acero)
- [#3748] Fix doc strings for makers and datasets (Atreyee Sinha)
- [#3740] Common format axis labels (Fabio Pintore)
- [#3733] Add new RegionsFinder that uses a fixed number of regions symmetrically distributed on the circle (Cosimo Nigro)
- [#3728] Add missing required GADF headers in IRF classes (Maximilian Nöthe)
- [#3722] Switch documentation to PyData Sphinx Theme (Jose Enrique Ruiz)
- [#3720] Add convenience method to write EventLists to file (Laura Olivera Nieto)
- [#3713] Fix matplotlib 3.5+ incompatibility with WcsNDMap.plot() (Tyler Cahill)
- [#3712] Added a notebook tutorial showing an energy-dependent spectrum extraction (Cosimo Nigro)
- [#3699] Use mamba in CI jobs (Maximilian Nöthe)
- [#3684] Started to implement the energy-dependent 1D spectrum extraction (Cosimo Nigro)
- [#3669] Add GeneralizedGaussianTemporalModel (Arnau Aguasca)
- [#3535] Add TemplateNDSpectralModel (Quentin Rémy)

.. _gammapy_0p3_release:

0.3 (Aug 13, 2015)
------------------

Summary
~~~~~~~

- Released Aug 13, 2015
- 9 contributors (5 new)
- 4 months of work
- 24 pull requests
- Requires Astropy version 1.0 or later.
- On-off likelihood spectral analysis was added in gammapy.hspec, contributed by Régis Terrier and Ignasi Reichardt. It will be refactored and is thus not part of the public API.
- The Gammapy 0.3 release is the basis for an `ICRC 2015 poster contribution `__

Contributors
~~~~~~~~~~~~

- Manuel Paz Arribas
- Christoph Deil
- Axel Donath
- Jonathan Harris (new)
- Johannes King (new)
- Stefan Klepser (new)
- Ignasi Reichardt (new)
- Régis Terrier
- Victor Zabalza (new)

Pull requests
~~~~~~~~~~~~~

- [#326] Fix Debian install instructions (Victor Zabalza)
- [#318] Set up and document logging for Gammapy (Christoph Deil)
- [#317] Using consistent plotting style in docs (Axel Donath)
- [#312] Add an "About Gammapy" page to the docs (Christoph Deil)
- [#306] Use assert_quantity_allclose from Astropy (Manuel Paz Arribas)
- [#301] Simplified attribute docstrings (Manuel Paz Arribas)
- [#299] Add cube background model class (Manuel Paz Arribas)
- [#296] Add interface to HESS FitSpectrum JSON output (Christoph Deil)
- [#295] Observation table subset selection (Manuel Paz Arribas)
- [#291] Remove gammapy.shower package (Christoph Deil)
- [#289] Add a simple Makefile for Gammapy (Manuel Paz Arribas)
- [#286] Function to plot Fermi 3FGL light curves (Jonathan Harris)
- [#285] Add infos how to handle times in Gammapy (Christoph Deil)
- [#283] Consistent random number handling and improve sample_sphere (Manuel Paz Arribas)
- [#280] Add new subpackage: gammapy.time (Christoph Deil)
- [#279] Improve SNRcat dataset (Christoph Deil)
- [#278] Document observation tables and improve gammapy.obs (Manuel Paz Arribas)
- [#276] Add EffectiveAreaTable exporter to EffectiveAreaTable2D (Johannes King)
- [#273] Fix TS map header writing and temp file handling (Axel Donath)
- [#264] Add hspec - spectral analysis using Sherpa (Régis Terrier, Ignasi Reichardt, Christoph Deil)
- [#262] Add SNRCat dataset access function (Christoph Deil)
- [#261] Fix spiral arm model bar radius (Stefan Klepser)
- [#260] Add offset-dependent effective area IRF class (Johannes King)
- [#256] EventList class fixes and new features (Christoph Deil)

.. _gammapy_0p4_release:

0.4 (Apr 20, 2016)
------------------

Summary
~~~~~~~

- Released Apr 20, 2016
- 10 contributors (5 new)
- 8 months of work
- 108 pull requests (not all listed below)
- Requires Python 2.7 or 3.4+, Numpy 1.8+, Scipy 0.15+, Astropy 1.0+, Sherpa 4.8+

What's new?
~~~~~~~~~~~

- Women are hacking on Gammapy!
- IACT data access via DataStore and HDU index tables
- Radially-symmetric background modeling
- Improved 2-dim image analysis
- 1-dim spectral analysis
- Add sub-package ``gammapy.cube`` and start working on 3-dim cube analysis
- Continuous integration testing for Windows on Appveyor added (Windows support for Gammapy is preliminary and incomplete)

Contributors
~~~~~~~~~~~~

- Axel Donath
- Brigitta Sipocz (new)
- Christoph Deil
- Dirk Lennarz (new)
- Johannes King
- Jonathan Harris
- Léa Jouvin (new)
- Luigi Tibaldo (new)
- Manuel Paz Arribas
- Olga Vorokh (new)

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.
See the complete `Gammapy 0.4 merged pull requests list on GitHub `__.
- [#518] Fixes and cleanup for SkyMap (Axel Donath)
- [#511] Add exposure image computation (Léa Jouvin)
- [#510] Add acceptance curve smoothing method (Léa Jouvin)
- [#507] Add Fermi catalog spectrum evaluation and plotting (Johannes King)
- [#506] Improve TS map computation performance (Axel Donath)
- [#503] Add FOV background image modeling (Léa Jouvin)
- [#502] Add DataStore subset method (Johannes King)
- [#487] Add SkyMap class (Axel Donath)
- [#485] Add OffDataBackgroundMaker (Léa Jouvin)
- [#484] Add Sherpa cube analysis prototype (Axel Donath)
- [#481] Add new gammapy.cube sub-package (Axel Donath)
- [#478] Add observation stacking method for spectra (Léa Jouvin and Johannes King)
- [#475] Add tests for TS map image computation (Olga Vorokh)
- [#474] Improve significance image analysis (Axel Donath)
- [#473] Improve tests for HESS data (Johannes King)
- [#462] Misc cleanup (Christoph Deil)
- [#461] Pacman (Léa Jouvin)
- [#459] Add radially symmetric FOV background model (Léa Jouvin)
- [#457] Improve data and observation handling (Christoph Deil)
- [#456] Fix and improvements to TS map tool (Olga Vorokh)
- [#455] Improve IRF interpolation and extrapolation (Christoph Deil)
- [#447] Add King profile PSF class (Christoph Deil)
- [#436] Restructure spectrum package and command line tool (Johannes King)
- [#435] Add info about Gammapy contact points and gammapy-extra (Christoph Deil)
- [#421] Add spectrum fit serialisation code (Johannes King)
- [#403] Improve spectrum analysis (Johannes King)
- [#415] Add EventList plots (Jonathan Harris)
- [#414] Add Windows tests on Appveyor (Christoph Deil)
- [#398] Add function to compute exposure cubes (Luigi Tibaldo)
- [#396] Rewrite spectrum analysis (Johannes King)
- [#395] Fix misc issues with IRF classes (Johannes King)
- [#394] Move some data specs to gamma-astro-data-formats (Christoph Deil)
- [#392] Use external ci-helpers (Brigitta Sipocz)
- [#387] Improve Gammapy catalog query and browser (Christoph Deil)
- [#383] Add EnergyOffsetArray (Léa Jouvin)
- [#379] Add gammapy.region and reflected region computation (Johannes King)
- [#375] Misc cleanup of scripts and docs (Christoph Deil)
- [#371] Improve catalog utils (Christoph Deil)
- [#369] Improve the data management toolbox (Christoph Deil)
- [#367] Add Feldman Cousins algorithm (Dirk Lennarz)
- [#364] Improve catalog classes and gammapy-extra data handling (Jonathan Harris, Christoph Deil)
- [#361] Add gammapy-spectrum-pipe (Johannes King)
- [#359] Add 1D spectrum analysis tool based on gammapy.hspec (Johannes King)
- [#353] Add some scripts and examples (Christoph Deil)
- [#352] Add data management tools (Christoph Deil)
- [#351] Rewrite EnergyDispersion class (Johannes King)
- [#348] Misc code cleanup (Christoph Deil)
- [#347] Add background cube model comparison plot script (Manuel Paz Arribas)
- [#342] Add gammapy-bin-image test (Christoph Deil)
- [#339] Remove PoissonLikelihoodFitter (Christoph Deil)
- [#338] Add example script for cube background models (Manuel Paz Arribas)
- [#337] Fix sherpa morphology fitting script (Axel Donath)
- [#335] Improve background model simulation (Manuel Paz Arribas)
- [#332] Fix TS map boundary handling (Axel Donath)
- [#330] Add EnergyDispersion and CountsSpectrum (Johannes King)
- [#319] Make background cube models (Manuel Paz Arribas)
- [#290] Improve energy handling (Johannes King)
.. _gammapy_0p5_release:

0.5 (Nov 22, 2016)
------------------

Summary
~~~~~~~

- Released Nov 22, 2016
- 12 contributors (5 new)
- 7 months of work
- 184 pull requests (not all listed below)
- Requires Python 2.7 or 3.4+, Numpy 1.8+, Scipy 0.15+, Astropy 1.2+, Sherpa 4.8.2+

What's new?
~~~~~~~~~~~

- Tutorial-style getting started documentation as Jupyter notebooks
- Removed ``gammapy.regions`` and have switched to the more complete and powerful `regions `__ package (planned to be added to the Astropy core within the next year).
- ``gammapy.spectrum`` - Many 1-dimensional spectrum analysis improvements (e.g. spectral point computation)
- ``gammapy.image`` - Many ``SkyImage`` improvements, adaptive ring background estimation, asmooth algorithm
- ``gammapy.detect`` - CWT and TS map improvements
- ``gammapy.time`` - A lightcurve class and variability test
- ``gammapy.irf`` - Many improvements to IRF classes, especially the PSF classes.
- Many improved tests and test coverage

Contributors
~~~~~~~~~~~~

- Axel Donath
- Brigitta Sipocz
- Christoph Deil
- Domenico Tiziani (new)
- Helen Poon (new)
- Johannes King
- Julien Lefaucheur (new)
- Léa Jouvin
- Matthew Wood (new)
- Nachiketa Chakraborty (new)
- Olga Vorokh
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.
See the complete `Gammapy 0.5 merged pull requests list on GitHub `__.

- [#790] Add powerlaw energy flux integral for ``gamma=2`` (Axel Donath)
- [#789] Fix Wstat (Johannes King)
- [#783] Add PHA type II file I/O to SpectrumObservationList (Johannes King)
- [#778] Fix Gauss PSF energy bin issue (Léa Jouvin)
- [#777] Rewrite crab spectrum as class (Axel Donath)
- [#774] Add skyimage smooth method (Axel Donath)
- [#772] Stack EDISP for a set of observations (Léa Jouvin)
- [#767] Improve PSF checker and add a test (Christoph Deil)
- [#766] Improve SkyCube convolution and npred computation (Axel Donath)
- [#763] Add TablePSFChecker (Domenico Tiziani)
- [#762] Add IRFStacker class (Léa Jouvin)
- [#759] Improve SkyCube energy axes (Axel Donath)
- [#754] Change EventList from Table subclass to attribute (Christoph Deil)
- [#753] Improve SkyCube class (Axel Donath)
- [#746] Add image asmooth algorithm (Axel Donath)
- [#740] Add SpectrumObservationStacker (Johannes King)
- [#739] Improve kernel background estimator (Axel Donath)
- [#738] Fix reflected region pixel origin issue (Léa Jouvin)
- [#733] Add spectral table model (Julien Lefaucheur)
- [#731] Add energy dispersion RMF integration (Léa Jouvin)
- [#719] Add adaptive ring background estimation (Axel Donath)
- [#713] Improve ring background estimation (Axel Donath)
- [#710] Misc image and cube cleanup (Christoph Deil)
- [#709] Spectrum energy grouping (Christoph Deil)
- [#679] Add flux point computation method (Johannes King)
- [#677] Fermi 3FGL and 2FHL spectrum plotting (Axel Donath)
- [#661] Improve continuous wavelet transform (Olga Vorokh)
- [#660] Add Fermipy sky image code to Gammapy (Matthew Wood)
- [#653] Add up- and downsampling to SkyImage (Axel Donath)
- [#649] Change to astropy regions package (Christoph Deil)
- [#648] Add class to load CTA IRFs (Julien Lefaucheur)
- [#647] Add SpectrumSimulation class (Johannes King)
- [#641] Add ECPL model, energy flux and integration methods (Axel Donath)
- [#640] Remove pyfact (Christoph Deil)
- [#635] Fix TS maps low stats handling (Axel Donath)
- [#628] Add flux points computation methods (Johannes King)
- [#622] Make gammapy.time great again (Christoph Deil)
- [#599] Move powerlaw utility functions to separate namespace (Christoph Deil)
- [#594] Fix setup.py and docs/conf.py configparser import (Christoph Deil)
- [#593] Remove gammapy/hspec (Christoph Deil)
- [#591] Add spectrum energy flux computation (Axel Donath)
- [#582] Add SkyImageList (Olga Vorokh)
- [#558] Finish change to use gammapy.extern.regions (Johannes King and Christoph Deil)
- [#569] Add detection utilities à la BgStats (Julien Lefaucheur)
- [#565] Add exptest time variability test (Helen Poon)
- [#564] Add LightCurve class (Nachiketa Chakraborty)
- [#559] Add paste, cutout and look_up methods to SkyMap class (Axel Donath)
- [#557] Add spectrum point source containment correction option (Régis Terrier)
- [#556] Add offset-dependent table PSF class (Domenico Tiziani)
- [#549] Add mean PSF computation (Léa Jouvin)
- [#547] Add astropy.regions to gammapy.extern (Johannes King)
- [#546] Add Target class (Johannes King)
- [#545] Add PointingInfo class (Christoph Deil)
- [#544] Improve SkyMap.coordinates (Olga Vorokh)
- [#541] Refactor effective area IRFs to use NDDataArray (Johannes King)
- [#535] Add spectrum and flux points to HGPS catalog (Axel Donath)
- [#531] Add ObservationTableSummary class (Julien Lefaucheur)
- [#530] Update readthedocs links from .org to .io (Brigitta Sipocz)
- [#529] Add data_summary method to DataStore (Johannes King)
- [#527] Add n-dim data base class for gammapy.irf (Johannes King)
- [#526] Add King PSF evaluate and to_table_psf methods (Léa Jouvin)
- [#524] Improve image pipe class (Léa Jouvin)
- [#523] Add Gauss PSF to_table_psf method (Axel Donath)
- [#521] Fix image pipe class (Léa Jouvin)

.. _gammapy_0p6_release:

0.6 (Apr 28, 2017)
------------------

Summary
~~~~~~~

- Released Apr 28, 2017
- 14 contributors (5 new)
- 5 months of work
- 147 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

- Release and installation

  - Until now, we had a roughly bi-yearly release cycle for Gammapy. Starting now, we will make stable releases more often, to ship features and fixes to Gammapy users more quickly.
  - Gammapy 0.6 requires Python 2.7 or 3.4+, Numpy 1.8+, Scipy 0.15+, Astropy 1.3+, Sherpa 4.9.0+. Most things will still work with older Astropy and Sherpa, but we dropped testing for older versions from our continuous integration.
  - Gammapy is now available via Macports, a package manager for Mac OS (``port install py35-gammapy``)

- Documentation

  - Added many tutorials as Jupyter notebooks (linked to from the docs front-page)
  - Misc docs improvements and new getting started notebooks

- For CTA

  - Better support for CTA IRFs
  - A notebook showing how to analyse some simulated CTA data (preliminary files from first data challenge)
  - Better support and documentation for CTA will be the focus of the next release (0.7).
- For Fermi-LAT

  - Introduced a reference dataset: https://github.com/gammapy/gammapy-fermi-lat-data
  - Added convenience class to work with Fermi-LAT datasets

- gammapy.catalog

  - Add support for gamma-cat, an open data collection and source catalog for gamma-ray astronomy (https://github.com/gammapy/gamma-cat)
  - Access to more Fermi-LAT catalogs (1FHL, 2FHL, 3FHL); see the sketch after this list

- gammapy.spectrum

  - Better flux point class
  - Add flux point SED fitter
  - EBL-absorbed spectral models
  - Improved spectrum simulation class

- gammapy.image

  - Add image radial and box profiles
  - Add adaptive ring background estimation
  - Add adaptive image smooth algorithm

- gammapy.cube

  - Add prototype for 3D analysis of IACT data (work in progress)

- gammapy.time

  - Add prototype lightcurve estimator for IACT data (work in progress)

- gammapy.irf

  - Many IRF classes now rewritten to use the generic ``NDDataArray`` and axis classes
  - Better handling of energy dispersion

- gammapy.utils

  - Add gammapy.utils.modeling (work in progress)
  - Add gammapy.utils.sherpa (generic interface to sherpa for fitting, with models and likelihood function defined in Gammapy) (work in progress)

- Many small bugfixes and improvements throughout the codebase and documentation
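
For illustration, a minimal sketch of the catalog access described above, using the present-day ``gammapy.catalog`` API rather than the v0.6 one; the catalog file is assumed to be available via ``$GAMMAPY_DATA``, and the source name is illustrative:

.. code-block:: python

    from gammapy.catalog import SourceCatalog3FHL

    # Load the Fermi-LAT 3FHL catalog (reads the FITS file shipped with GAMMAPY_DATA)
    catalog = SourceCatalog3FHL()

    # Look up a source by name and print its best-fit spectral model
    source = catalog["3FHL J0534.5+2201"]  # the Crab nebula
    print(source.spectral_model())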

Contributors
~~~~~~~~~~~~

- Arjun Voruganti (new)
- Arpit Gogia (new)
- Axel Donath
- Brigitta Sipocz
- Bruno Khelifi (new)
- Christoph Deil
- Dirk Lennarz
- Fabio Acero (new)
- Johannes King
- Julien Lefaucheur
- Lars Mohrmann (new)
- Léa Jouvin
- Nachiketa Chakraborty
- Régis Terrier
- Zé Vinícius (new)

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here. See the complete `Gammapy 0.6 merged pull requests list on GitHub `__.

- [#1006] Add possibility to skip runs based on alpha in SpectrumExtraction (Johannes King)
- [#1002] Containment correction in SpectrumObservation via AREASCAL (Johannes King)
- [#1001] Add SpectrumAnalysisIACT (Johannes King)
- [#997] Add compute_chisq method to lightcurve class (Nachiketa Chakraborty)
- [#994] Improve Gammapy installation docs (Christoph Deil)
- [#988] Add spectral model absorbed by EBL that can be fit (Julien Lefaucheur)
- [#985] Improve error methods on spectral models (Axel Donath)
- [#979] Add flux point fitter class (Axel Donath)
- [#976] Fixes to Galactic population simulation (Christoph Deil)
- [#975] Add PLSuperExpCutoff3FGL spectral model (Axel Donath)
- [#966] Remove SkyMask (merge with SkyImage) (Christoph Deil)
- [#950] Add light curve computation (Julien Lefaucheur)
- [#933] Change IRF plotting from imshow to pcolormesh (Axel Donath)
- [#932] Change NDDataArray default_interp_kwargs to extrapolate (Johannes King)
- [#919] Fix Double plot issue in notebooks and improve events.peek() (Fabio Acero)
- [#911] Improve EnergyDispersion2D get_response and tests (Régis Terrier)
- [#906] Fix catalog getitem to work with numpy int index (Zé Vinícius)
- [#898] Add printout for 3FGL catalog objects (Arjun Voruganti)
- [#893] Add Fermi-LAT 3FGL catalog object lightcurve property (Arpit Gogia)
- [#888] Improve CTA IRF and simulation classes (point-like analysis) (Julien Lefaucheur)
- [#885] Improve spectral model uncertainty handling (Axel Donath)
- [#884] Improve BinnedDataAxis handling of lo / hi binning (Johannes King)
- [#883] Improve spectrum docs page (Johannes King)
- [#881] Add support for observations with different energy binning in SpectrumFit (Lars Mohrmann)
- [#875] Add CTA spectrum simulation example (Julien Lefaucheur)
- [#872] Add SED type e2dnde to FluxPoints (Johannes King)
- [#871] Add Parameter class to SpectralModel (Johannes King)
- [#870] Clean up docstrings in background sub-package (Arpit Gogia)
- [#868] Add Fermi-LAT 3FHL catalogue (Julien Lefaucheur)
- [#865] Add Fermi basic image estimator (Axel Donath)
- [#864] Improve edisp.apply to support different true energy axes (Johannes King)
- [#859] Remove old image_profile function (Axel Donath)
- [#858] Fix Fermi catalog flux point upper limits (Axel Donath)
- [#855] Add Fermi-LAT 1FHL catalogue (Julien Lefaucheur)
- [#854] Add Fermi-LAT dataset class (Axel Donath)
- [#851] Write Macports install docs (Christoph Deil)
- [#847] Fix Sherpa spectrum OGIP file issue (Régis Terrier and Johannes King)
- [#842] Add AbsorbedSpectralModel and improve CTA IRF class (Julien Lefaucheur)
- [#840] Fix energy binning issue in cube pipe (Léa Jouvin)
- [#837] Fix containment fraction issue for table PSF (Léa Jouvin)
- [#836] Fix spectrum observation write issue (Léa Jouvin)
- [#835] Add image profile estimator class (Axel Donath)
- [#834] Bump to require Astropy v1.3 (Christoph Deil)
- [#833] Add image profile class (Axel Donath)
- [#832] Improve NDDataArray (use composition, not inheritance) (Johannes King)
- [#831] Add CTA Sensitivity class and plot improvements (Julien Lefaucheur)
- [#830] Add gammapy.utils.modeling and GammaCat to XML (Christoph Deil)
- [#827] Add energy dispersion for 3D spectral analysis (Léa Jouvin)
- [#826] Add sky cube computation for IACT data (Léa Jouvin)
- [#825] Update astropy-helpers to v1.3 (Brigitta Sipocz)
- [#824] Add XSPEC table absorption model to spectral table model (Julien Lefaucheur)
- [#820] Add morphology models for gamma-cat sources (Axel Donath)
- [#816] Add class to access CTA point-like responses (Julien Lefaucheur)
- [#814] Remove old flux point classes (Axel Donath)
- [#813] Improve Feldman Cousins code (Dirk Lennarz)
- [#812] Improve differential flux point computation code (Axel Donath)
- [#811] Adapt catalogs to new flux point class (Axel Donath)
- [#810] Add new flux point class (Axel Donath)
- [#798] Add Fvar variability measure for light curves (Nachiketa Chakraborty)
- [#796] Improve LogEnergyAxis object (Axel Donath)
- [#797] Improve WStat implementation (Johannes King)
- [#793] Add GammaCat source catalog (Axel Donath)
- [#791] Misc fixes to spectrum fitting code (Johannes King)
- [#784] Improve SkyCube exposure computation (Léa Jouvin)

.. _gammapy_0p7_release:

0.7 (Feb 28, 2018)
------------------

Summary
~~~~~~~

- Released Feb 28, 2018
- 25 contributors (16 new)
- 10 months of work
- 178 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

Installation:

- Gammapy 0.7 supports legacy Python 2.7, as well as Python 3.5 and 3.6. If you are still using Python 2.7 with Gammapy, please update to Python 3. Let us know if you need any help with the update, or are blocked from updating for some reason, by filling out the 1-minute `Gammapy installation questionnaire`_ form. This will help us make a plan for how to finish the Python 2 -> 3 transition and to set a timeline (`PIG 3`_).
- The Gammapy conda packages are now distributed via the ``conda-forge`` channel, i.e. to install or update Gammapy use the command ``conda install gammapy -c conda-forge``.
  Most other packages have also moved to ``conda-forge`` in the past years; the previously used ``astropy`` and ``openastronomy`` channels are no longer needed.
- We now have a conda ``environment.yml`` file that contains all packages used in the tutorials. See instructions here: :ref:`tutorials`.

Documentation:

- We have created a separate project webpage at https://gammapy.org. The https://docs.gammapy.org page is now just for the Gammapy documentation.
- A lot of new tutorials were added in the form of Jupyter notebooks. To make the content of the notebooks easier to navigate and search, a rendered static version of the notebooks was integrated in the Sphinx-based documentation (the one you are looking at) at :ref:`tutorials`.
- Most of the Gammapy tutorials can be executed directly in the browser via the https://mybinder.org/ service. There is a "launch in binder" link at the top of each tutorial in the docs.
- A page was created to collect the information for CTA members on how to get started with Gammapy and with contact / support channels: https://gammapy.org/cta.html

Gammapy Python package:

- This release contains many bug fixes and improvements to the existing code, ranging from IRF interpolation to spectrum and lightcurve computation. Most of the improvements (see the list of pull requests below) were driven by user reports and feedback from CTA, HESS, MAGIC and Fermi-LAT analysis. Please update to the new version and keep filing bug reports and feature requests!
- A new sub-package `gammapy.maps` was added that features WCS and HEALPix based maps, arbitrary extra axes in addition to the two spatial dimensions (e.g. energy, time or event type). Support for multi-resolution and sparse maps is work in progress. These new maps classes were implemented based on the experience gained from the existing ``SkyImage`` and ``SkyCube`` classes as well as the Fermi science tools, Fermipy and pointlike. Work on new analysis code based on ``gammapy.maps`` within Gammapy is starting now (see `PIG 2`_). Users are encouraged to start using ``gammapy.maps`` in their scripts. The plan is to keep the existing ``SkyImage`` and ``SkyCube`` and image / cube analysis code that we have now mostly unchanged (only apply bugfixes), and to remove them at some future date after the transition to the use of ``gammapy.maps`` within Gammapy (including all tests and documentation and tutorials) is complete and users had some time to update their code. If you have any questions or need help with ``gammapy.maps`` or find an issue or missing feature, let us know!
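
As a first taste of the new sub-package, here is a minimal sketch using the present-day ``gammapy.maps`` API; the binning and the (Crab-like) sky coordinates are illustrative, not taken from the v0.7 code:

.. code-block:: python

    import astropy.units as u
    from gammapy.maps import Map, MapAxis

    # A WCS-based sky map with one extra (energy) axis
    energy_axis = MapAxis.from_bounds(0.1, 100, nbin=10, unit="TeV", interp="log", name="energy")
    m = Map.create(binsz=0.1, width=(10.0, 8.0), skydir=(83.63, 22.01), axes=[energy_axis])

    # Fill one event-like coordinate and read the value back
    m.fill_by_coord({"lon": [83.63], "lat": [22.01], "energy": [1.0] * u.TeV})
    print(m.get_by_coord({"lon": 83.63, "lat": 22.01, "energy": 1.0 * u.TeV}))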

Command line interface:

- The Gammapy command-line interface was changed to use a single ``gammapy`` command with multiple sub-commands (like ``gammapy info`` or ``gammapy image bin``). Discussions on developing the high-level interface for Gammapy (e.g. as a set of command line tools, or a config-file-driven analysis) are starting now.

Organisation:

- A webpage at https://gammapy.org/ was set up, separate from the Gammapy documentation page https://docs.gammapy.org/.
- The Gammapy project and team organisation was set up with clear roles and responsibilities, in a way to help the Gammapy project grow, and to support astronomers and projects like CTA using Gammapy better. This is described at https://gammapy.org/team.html.
- To improve the quality of Gammapy, we have set up a proposal-driven process for major improvements for Gammapy, described in :ref:`pig-001`. We are now starting to use this to design a better low level analysis code (`PIG 2`_) and to define a plan to finish the Python 2 -> 3 transition (`PIG 3`_).

.. _PIG 2: https://github.com/gammapy/gammapy/pull/1277
.. _PIG 3: https://github.com/gammapy/gammapy/pull/1278
.. _Gammapy installation questionnaire: https://goo.gl/forms/0QuYYyyPCbKnFJJI3

Contributors
~~~~~~~~~~~~

- Anne Lemière (new)
- Arjun Voruganti
- Atreyee Sinha (new)
- Axel Donath
- Brigitta Sipocz
- Bruno Khelifi (new)
- Christoph Deil
- Cosimo Nigro (new)
- Jean-Philippe Lenain (new)
- Johannes King
- José Enrique Ruiz (new)
- Julien Lefaucheur
- Kai Brügge (new)
- Lab Saha (new)
- Lars Mohrmann
- Léa Jouvin
- Matthew Wood
- Matthias Wegen (new)
- Oscar Blanch (new)
- Peter Deiml (new)
- Régis Terrier
- Roberta Zanin (new)
- Rubén López-Coto (new)
- Thomas Armstrong (new)
- Thomas Vuillaume (new)
- Yves Gallant (new)

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here. See the complete `Gammapy 0.7 merged pull requests list on GitHub `__.

- [#1319] Fix a bug in SpectrumStacker (Anne Lemière)
- [#1318] Improve MapCoord interface (Matthew Wood)
- [#1316] Add flux point estimation for multiple observations (Lars Mohrmann)
- [#1312] Add Background 2D class (Léa Jouvin)
- [#1305] Fix exposure and flux units in IACTBasicImageEstimator (Yves Gallant)
- [#1300] Add PhaseCurve class for periodic systems (Lab Saha)
- [#1294] Fix IACTBasicImageEstimator psf method (Yves Gallant)
- [#1291] Add meta attribute to maps (Léa Jouvin)
- [#1290] Change image_pipe and fov to include a minimum offset cut (Atreyee Sinha)
- [#1289] Fix excess for given significance computation (Oscar Blanch)
- [#1287] Fix time in LightCurveEstimator result table (Jean-Philippe Lenain)
- [#1281] Add methods for WCS maps (Matthew Wood)
- [#1266] No pytest import from non-test code (Christoph Deil)
- [#1268] Fix PSF3D.to_energy_dependent_table_psf (Christoph Deil)
- [#1246] Improve map read method (Matthew Wood)
- [#1240] Finish change to Click in gammapy.scripts (Christoph Deil)
- [#1238] Clean up catalog image code (Axel Donath)
- [#1235] Introduce main ``gammapy`` command line tool (Axel Donath and Christoph Deil)
- [#1227] Remove gammapy-data-show and gammapy-cube-bin (Christoph Deil)
- [#1226] Make DataStoreObservation properties less lazy (Christoph Deil)
- [#1220] Fix flux point computation for non-power-law models (Axel Donath)
- [#1215] Finish integration of Jupyter notebooks with Sphinx docs (Jose Enrique Ruiz)
- [#1211] Add IRF write methods (Thomas Armstrong)
- [#1210] Fix min energy handling in SpectrumEnergyGrouper (Julien Lefaucheur and Christoph Deil)
- [#1207] Add theta2 distribution plot to EventList class (Thomas Vuillaume)
- [#1204] Consistently use mode='constant' in convolutions of RingBackgroundEstimator (Lars Mohrmann)
- [#1195] Change IRF extrapolation behaviour (Christoph Deil)
- [#1190] Refactor gammapy.maps methods for calculating index and coordinate arrays (Matthew Wood)
- [#1183] Add function to compute background cube (Roberta Zanin and Christoph Deil)
- [#1179] Fix two bugs in LightCurveEstimator, and improve speed considerably (Lars Mohrmann)
- [#1176] Integrate tutorial notebooks in Sphinx documentation (Jose Enrique Ruiz)
- [#1170] Add sparse map prototype (Matthew Wood)
- [#1169] Remove old HEALPix image and cube classes (Christoph Deil)
- [#1166] Fix ring background estimation (Axel Donath)
- [#1162] Add ``gammapy.irf.Background3D`` (Roberta Zanin and Christoph Deil)
- [#1150] Fix PSF evaluate error at low energy and high offset (Bruno Khelifi)
- [#1134] Add MAGIC Crab reference spectrum (Cosimo Nigro)
- [#1133] Fix energy_resolution method in EnergyDispersion class (Lars Mohrmann)
- [#1127] Fix 3FHL spectral indexes for PowerLaw model (Julien Lefaucheur)
- [#1115] Fix energy bias computation (Cosimo Nigro)
- [#1110] Remove ATNF catalog class and Green catalog load function (Christoph Deil)
- [#1108] Add HAWC 2HWC catalog (Peter Deiml)
- [#1107] Rewrite GaussianBand2D model (Axel Donath)
- [#1105] Emit warning when HDU loading from index is ambiguous (Lars Mohrmann)
- [#1104] Change conda install instructions to conda-forge channel (Christoph Deil)
- [#1103] Remove catalog and data browser Flask web apps (Christoph Deil)
- [#1102] Add 3FGL spatial models (Axel Donath)
- [#1100] Add energy reference for exposure map (Léa Jouvin)
- [#1098] Improve flux point fitter (Axel Donath)
- [#1093] Implement I/O methods for ``gammapy.maps`` (Matthew Wood)
- [#1092] Add random seed argument for CTA simulations (Julien Lefaucheur)
- [#1090] Add default parameters for spectral models (Axel Donath)
- [#1089] Fix Fermi-LAT catalog flux points property (Axel Donath)
- [#1088] Update Gammapy to match Astropy region changes (Johannes King)
- [#1087] Add peak energy property to some spectral models (Axel Donath)
- [#1085] Update astropy-helpers to v2.0 (Brigitta Sipocz)
- [#1084] Add flux points upper limit estimation (Axel Donath)
- [#1083] Add JSON-serialisable source catalog object dict (Arjun Voruganti)
- [#1082] Add observation sanity check method to DataStore (Lars Mohrmann)
- [#1078] Add printout for 3FHL and gamma-cat sources (Arjun Voruganti)
- [#1076] Development in ``gammapy.maps`` (Matthew Wood)
- [#1073] Fix spectrum fit for case of no EDISP (Johannes King)
- [#1070] Add Lomb-Scargle detection function (Matthias Wegen)
- [#1069] Add easy access to parameter errors (Johannes King)
- [#1067] Add flux upper limit computation to TSImageEstimator (Axel Donath)
- [#1065] Add skip_missing option to ``DataStore.obs_list`` (Johannes King)
- [#1057] Use system pytest rather than astropy (Brigitta Sipocz)
- [#1054] Development in ``gammapy.maps`` (Matthew Wood)
- [#1053] Add sensitivity computation (Bruno Khelifi)
- [#1051] Improve 3D simulation / analysis example (Roberta Zanin)
- [#1045] Fix energy dispersion apply and to_sherpa (Johannes King)
- [#1043] Make ``gammapy.spectrum.powerlaw`` private (Christoph Deil)
- [#1040] Add combined 3D model and simple npred function (Christoph Deil)
- [#1038] Remove ``gammapy.utils.mpl_style`` (Christoph Deil)
- [#1136] Improve CTA sensitivity estimator (Axel Donath and Kai Brügge)
- [#1035] Some cleanup of FluxPoints code and tests (Christoph Deil)
- [#1032] Improve table unit standardisation and flux points (Christoph Deil)
- [#1031] Add HGPS catalog spatial models (Axel Donath)
- [#1029] Add 3D model simulation example (Roberta Zanin)
- [#1027] Add gamma-cat resource and resource index classes (Christoph Deil)
- [#1026] Fix Fermi catalog flux points upper limits (Axel Donath)
- [#1025] Remove spectrum butterfly class (Christoph Deil)
- [#1021] Fix spiralarm=False case in make_base_catalog_galactic (Ruben Lopez-Coto)
- [#1014] Introduce TSImageEstimator class (Axel Donath)
- [#1013] Add Fermi-LAT 3FHL spatial models (Axel Donath)
- [#845] Add background model component to SpectrumFit (Johannes King)
- [#111] Include module-level variables in API docs (Christoph Deil)

.. _gammapy_0p8_release:

0.8 (Sep 23, 2018)
------------------

Summary
~~~~~~~

- Released Sep 23, 2018
- 24 contributors (6 new)
- 7 months of work
- 314 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

Gammapy v0.8 features major updates to maps and modeling, as well as installation and how to get started with tutorial notebooks. It also contains many smaller additions, as well as many fixes and improvements.

The new ``gammapy.maps`` is now used for all map-based analysis (2D images and 3D cubes with an energy axis). The old SkyImage and SkyCube classes have been removed. All code and documentation has been updated to use ``gammapy.maps``. To learn about the new maps classes, see the ``intro_maps`` tutorial at :ref:`tutorials` and the :ref:`gammapy.maps ` documentation page.

The new ``gammapy.utils.fitting`` contains a simple modeling and fitting framework that allows the use of ``iminuit`` and ``sherpa`` optimisers as "backends" for any fit in Gammapy. The classes in ``gammapy.spectrum.models`` (1D spectrum models) are updated, and ``gammapy.image.models`` (2D spatial models) and ``gammapy.cube.models`` (3D cube models) were added. The ``SpectrumFit`` class was updated and a ``MapFit`` to fit models to maps was added. This part of Gammapy remains work in progress; some changes and major improvements are planned for the coming months.

With Gammapy v0.8, we introduce the ``gammapy download`` command to download tutorial notebooks and example datasets. A step by step guide is here: :ref:`getting-started`. Previously tutorial notebooks were maintained in a separate ``gammapy-extra`` repository, which was inconvenient for users to clone and use, and more importantly wasn't version-coupled with the Gammapy code repository, causing major issues in this phase where Gammapy is still under heavy development.

The recommended way to install Gammapy (described at :ref:`getting-started`) is now to use conda and to create an environment with dependencies pinned to fixed versions to get a consistent and reproducible environment. E.g. the Gammapy v0.8 environment uses Python 3.6, Numpy 1.15 and Astropy 3.0. As before, Gammapy is compatible with a wide range of versions of Numpy and Astropy from the past years, and many installation options are available for Gammapy (e.g. pip or Macports) in addition to conda. But we wanted to offer this new "stable recommended environment" option for Gammapy as a default.

The new ``analysis_3d`` notebook shows how to run a 3D analysis for IACT data using the ``MapMaker`` and ``MapFit`` classes. The ``simulate_3d`` notebook shows how to simulate and fit a source using CTA instrument response functions. The simulation is done on a binned 3D cube, not via unbinned event sampling. The ``fermi_lat`` tutorial shows how to analyse high-energy Fermi-LAT data with events, exposure and PSF pre-computed using the Fermi science tools. The ``hess`` and ``light_curve`` tutorials show how to analyse data from the recent first H.E.S.S. test data release. You can find these tutorials and more at :ref:`tutorials`.

Another addition in Gammapy v0.8 is :ref:`gammapy.astro.darkmatter `, which contains spatial and spectral models commonly used in dark matter searches using gamma-ray data.

The number of optional dependencies used in Gammapy has been reduced. Sherpa is now an optional fitting backend; modeling is built into Gammapy.
The following packages are no longer used in Gammapy: scikit-image, photutils, pandas, aplpy.

The code quality and test coverage in Gammapy has been improved a lot. This release also contains a large number of small improvements and bug fixes to the existing code, listed below in the changelog.

We are continuing to develop Gammapy at high speed; significant improvements on maps and modeling, but also on the data and IRF classes, are planned for the coming months and the v0.9 release in fall 2018.

We apologise if you are already using Gammapy for science studies and papers and have to update your scripts and notebooks to work with the new Gammapy version. If possible, stick with a given stable version of Gammapy. If you update to a newer version, let us know if you have any issues or questions. We're happy to help!

Gammapy v0.8 works on Linux, MacOS and Windows, with Python 3.5, 3.6 as well as legacy Python 2.7.

Contributors
~~~~~~~~~~~~

- Andrew Chen (new)
- Atreyee Sinha
- Axel Donath
- Brigitta Sipocz
- Bruno Khelifi
- Christoph Deil
- Cosimo Nigro
- David Fidalgo (new)
- Fabio Acero
- Gabriel Emery (new)
- Hubert Siejkowski (new)
- Jean-Philippe Lenain
- Johannes King
- José Enrique Ruiz
- Kai Brügge
- Lars Mohrmann
- Laura Vega Garcia (new)
- Léa Jouvin
- Marion Spir-Jacob (new)
- Matthew Wood
- Matthias Wegen
- Oscar Blanch
- Régis Terrier
- Roberta Zanin

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here. See the complete `Gammapy 0.8 merged pull requests list on GitHub `__.

- [#1822] Use GAMMAPY_DATA in Gammapy codebase (José Enrique Ruiz)
- [#1821] Improve analysis 3D tutorial (Axel Donath)
- [#1818] Add HESS and background modeling tutorial (Christoph Deil)
- [#1812] Add Fit likelihood profile method (Axel Donath)
- [#1808] Rewrite getting started, improve tutorials and install pages (Christoph Deil)
- [#1800] Add ObservationTableChecker and improve EVENTS checker (Christoph Deil)
- [#1799] Fix EnergyDispersion write and to_sherpa (Régis Terrier)
- [#1791] Move tutorial notebooks to the Gammapy repository (José Enrique Ruiz)
- [#1785] Unify API of Gammapy Fit classes (Axel Donath)
- [#1764] Format all code in Gammapy black (Christoph Deil)
- [#1761] Add black notebooks functionality (José Enrique Ruiz)
- [#1760] Add conda env file for release v0.8 (José Enrique Ruiz)
- [#1759] Add find_peaks for images (Christoph Deil)
- [#1755] Change map FITS unit header key to standard "BUNIT" (Christoph Deil)
- [#1751] Improve EventList and data checkers (Christoph Deil)
- [#1750] Remove EventListDataset class (Christoph Deil)
- [#1748] Add DataStoreChecker and ObservationChecker (Christoph Deil)
- [#1746] Unify and fix testing of plot methods (Axel Donath)
- [#1731] Fix and unify Map.iter_by_image (Axel Donath)
- [#1711] Clean up map reprojection code (Axel Donath)
- [#1702] Add mask filter option to MapFit (Axel Donath)
- [#1697] Improve convolution code and tests (Axel Donath)
- [#1696] Add parameter auto scale (Johannes Kind and Christoph Deil)
- [#1695] Add WcsNDMap convolve method (Axel Donath)
- [#1685] Add quantity support to map coordinates (Axel Donath)
- [#1681] Add make_images method in MapMaker (Axel Donath)
- [#1675] Add gammapy.stats.excess_matching_significance (Christoph Deil)
- [#1660] Fix spectrum energy grouping, use nearest neighbor method (Johannes King)
- [#1658] Bundle skimage block_reduce in gammapy.extern (Christoph Deil)
- [#1634] Add SkyDiffuseCube model for 3D maps (Roberta Zanin and Christoph Deil)
- [#1630] Add new observation container class (David Fidalgo)
- [#1616] Improve reflected background region finder (Régis Terrier)
- [#1606] Change FluxPointFitter to use minuit (Axel Donath)
- [#1605] Remove old sherpa backend from SpectrumFit (Johannes King)
- [#1594] Remove SkyImage and SkyCube (Christoph Deil)
- [#1582] Migrate ring background to use gammapy.maps (Régis Terrier)
- [#1576] Migrate detect.cwt to use gammapy.maps (Hubert Siejkowski)
- [#1573] Migrate image measure and profile to use gammapy.maps (Axel Donath)
- [#1568] Remove IACT and Fermi-LAT basic image estimators (Christoph Deil)
- [#1564] Migrate gammapy.detect to use gammapy.maps (Axel Donath)
- [#1562] Add MapMaker run method (Atreyee Sinha)
- [#1558] Integrate background spectrum in MapMaker (Léa Jouvin)
- [#1556] Sync sky model parameters with components (Christoph Deil)
- [#1554] Introduce map copy method (Axel Donath)
- [#1543] Add plot_interactive method for 3D maps (Fabio Acero)
- [#1527] Migrate ASmooth to use gammapy.maps (Christoph Deil)
- [#1517] Remove cta_utils and CTASpectrumObservation (Christoph Deil)
- [#1515] Remove old background model code (Christoph Deil)
- [#1505] Remove old Sherpa 3D map analysis code (Christoph Deil)
- [#1495] Change MapMaker to allow partially contained observations (Atreyee Sinha)
- [#1492] Add robust periodogram to gammapy.time (Matthias Wegen)
- [#1489] Add + operator for SkyModel (Johannes King)
- [#1476] Add evaluate method Background3D IRF (Léa Jouvin)
- [#1475] Add field-of-view coordinate transformations (Lars Mohrmann)
- [#1474] Add more models to the xml model registry (Fabio Acero)
- [#1470] Add background to map model evaluator (Atreyee Sinha)
- [#1456] Add light curve upper limits (Bruno Khelifi)
- [#1447] Add a PSFKernel to perform PSF convolution on Maps (Régis Terrier)
- [#1446] Add WCS map cutout method (Atreyee Sinha)
- [#1444] Add map smooth method (Atreyee Sinha)
- [#1443] Add slice_by_idx methods to gammapy.maps (Axel Donath)
- [#1435] Add __repr__ methods to Maps and related classes (Axel Donath)
- [#1433] Fix map write for custom axis name (Christoph Deil)
- [#1432] Add PSFMap class (Régis Terrier)
- [#1426] Add background estimation for phase-resolved spectra (Marion Spir-Jacob)
- [#1421] Add map region mask (Régis Terrier)
- [#1412] Change to default overwrite=False in gammapy.maps (Christoph Deil)
- [#1408] Fix 1D spectrum joint fit (Johannes King)
- [#1406] Add adaptive lightcurve time binning method (Gabriel Emery)
- [#1401] Remove old spatial models and CatalogImageEstimator (Christoph Deil)
- [#1397] Add XML SkyModel serialization (Johannes King)
- [#1395] Change Map.get_coord to return a MapCoord object (Régis Terrier)
- [#1387] Update catalog to new model classes (Christoph Deil)
- [#1381] Add 3D fit example using gammapy.maps (Johannes King)
- [#1386] Improve spatial models and add diffuse models (Johannes King)
- [#1378] Change 3D model evaluation from SkyCube to Map (Christoph Deil)
- [#1377] Add more SkySpatialModel subclasses (Johannes King)
- [#1376] Add new SpatialModel base class (Johannes King)
- [#1374] Add units to gammapy.maps (Régis Terrier)
- [#1373] Improve 3D analysis code using gammapy.maps (Christoph Deil)
- [#1372] Add 3D analysis functions using gammapy.maps (Régis Terrier)
- [#1369] Add gammapy download command (José Enrique Ruiz)
- [#1367] Add first draft of LightCurve model class (Christoph Deil)
- [#1362] Fix map sum_over_axes (Christoph Deil)
- [#1360] Sphinx RTD responsive theme for documentation (José Enrique Ruiz)
- [#1357] Add map geom pixel solid angle computation (Régis Terrier)
- [#1354] Apply FOV mask to all maps in ring background estimator (Lars Mohrmann)
- [#1347] Fix bug in LightCurveEstimator (Lars Mohrmann)
- [#1346] Fix bug in map .fits.gz write (change map data transpose) (Christoph Deil)
- [#1345] Improve docs for SpectrumFit (Johannes King)
- [#1343] Apply containment correction in true energy (Johannes King)
- [#1341] Remove u.ct from gammapy.spectrum (Johannes King)
- [#1339] Add create fixed time interval method for light curves (Gabriel Emery)
- [#1337] Enable rate models in SpectrumSimulation (Johannes King)
- [#1334] Fix AREASCAL read for PHA count spectrum (Régis Terrier)
- [#1331] Fix background image estimate (Régis Terrier)
- [#1317] Add function to compute counts maps (Régis Terrier)
- [#1231] Improve HESS HGPS catalog source class (Christoph Deil)

.. _gammapy_0p9_release:

0.9 (Nov 29, 2018)
------------------

Summary
~~~~~~~

- Released Nov 29, 2018
- 9 contributors (3 new)
- 2 months of work
- 88 pull requests (not all listed below)

What's new?
~~~~~~~~~~~

Gammapy v0.9 comes just two months after v0.8. Following the `Gammapy 1.0 roadmap`_, Gammapy will from now on have bi-monthly releases, as we work towards the Gammapy 1.0 release in fall 2019.

Gammapy v0.9 contains many fixes, and a few new features. Big new features like observation event and time filters, background model classes, as well as support for fitting joint datasets will come in spring 2019.

The ``FluxPointEstimator`` has been rewritten, and the option to compute spectral likelihood profiles has been added. The background and diffuse model interpolation in energy has been improved to be more accurate.

The ``gammapy.utils.fitting`` backend is under heavy development; most of the functionality of MINUIT (covariance, confidence intervals, profiles, contours) can now be obtained from any ``Fit`` class (spectral or map analysis).

Maps now support arithmetic operators, so that you can e.g. write ``residual = counts - model`` if ``counts`` and ``model`` are maps containing observed and model counts.
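
For illustration, a minimal sketch of the new map arithmetic; the ``Map.create`` arguments and the filled values are illustrative:

.. code-block:: python

    from gammapy.maps import Map

    # Two maps on the same geometry
    counts = Map.create(binsz=0.05, width=5.0)
    model = Map.create(binsz=0.05, width=5.0)

    counts.data += 4.0  # stand-in for observed counts
    model.data += 3.0   # stand-in for predicted counts

    residual = counts - model    # new in v0.9: arithmetic between maps
    print(residual.data.mean())  # 1.0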

Gammapy v0.9 now requires Astropy 2.0 or later, and Scipy was changed from an optional to a required dependency, since it is currently required for most analysis tasks (e.g. using interpolation when evaluating instrument responses).

Please also note that we have a `plan to drop Python 2.7 support`_ in Gammapy v0.11 in March 2019. If you have any questions or concerns about moving your scripts and notebooks to Python 3, or need Python 2 support with later Gammapy releases in 2019, please let us know!

.. _Gammapy 1.0 roadmap: https://github.com/gammapy/gammapy/pull/1841
.. _plan to drop Python 2.7 support: https://github.com/gammapy/gammapy/pull/1278

Contributors
~~~~~~~~~~~~

- Atreyee Sinha
- Axel Donath
- Brigitta Sipocz
- Christoph Deil
- Daniel Morcuende (new)
- David Fidalgo
- Ignacio Minaya (new)
- José Enrique Ruiz
- José Luis Contreras (new)
- Régis Terrier

Pull requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here. See the complete `Gammapy 0.9 merged pull requests list on GitHub `__.

- [#1949] Add fit minos_contour method (Christoph Deil)
- [#1937] No copy of input and result model in fit (Christoph Deil)
- [#1934] Improve FluxPointEstimator test and docs (Axel Donath)
- [#1933] Add likelihood profiles to FluxPointEstimator (Axel Donath)
- [#1930] Add sections in documentation navigation bar (José Enrique Ruiz)
- [#1929] Rewrite FluxPointEstimator (Axel Donath)
- [#1927] Improve Fit class, add confidence method (Christoph Deil)
- [#1926] Fix MapAxis interpolation FITS serialisation (Atreyee Sinha)
- [#1922] Add Fit.covar method (Christoph Deil)
- [#1921] Use and improve ScaledRegularGridInterpolator (Axel Donath)
- [#1919] Add Scipy as core dependency (Axel Donath)
- [#1918] Add parameters correlation matrix property (Christoph Deil)
- [#1912] Add ObservationFilter class (David Fidalgo)
- [#1909] Clean up irf/io.py and add load_cta_irf function (Régis Terrier)
- [#1908] Take observation time from GTI table (David Fidalgo)
- [#1904] Fix parameter limit handling in fitting (Christoph Deil)
- [#1903] Improve flux points class (Axel Donath)
- [#1898] Review and unify quantity handling (Axel Donath)
- [#1895] Rename obs_list to observations (David Fidalgo)
- [#1894] Improve Background3D energy axis integration (Axel Donath)
- [#1893] Add MapGeom equality operator (Régis Terrier)
- [#1891] Add arithmetic operators for maps (Régis Terrier)
- [#1890] Change map quantity to view instead of copy (Régis Terrier)
- [#1888] Change ObservationList class to Observations (David Fidalgo)
- [#1884] Improve analysis3d tutorial notebook (Ignacio Minaya)
- [#1883] Fix fit parameter bug for very large numbers (Christoph Deil)
- [#1871] Fix TableModel and ConstantModel output dimension (Régis Terrier)
- [#1862] Move make_psf, make_mean_psf and make_mean_edisp (David Fidalgo)
- [#1861] Change from live to on time in background computation (Christoph Deil)
- [#1859] Fix in MapFit energy dispersion apply (Régis Terrier)
- [#1857] Modify image_fitting_with_sherpa to use DC1 runs (Atreyee Sinha)
- [#1855] Add ScaledRegularGridInterpolator (Axel Donath)
- [#1854] Add FluxPointProfiles class (Christoph Deil)
- [#1846] Allow different true and reco energy in map analysis (Atreyee Sinha)
- [#1845] Improve first steps with Gammapy tutorial (Daniel Morcuende)
- [#1837] Add method to compute energy-weighted 2D PSF kernel (Atreyee Sinha)
- [#1836] Fix gammapy download for Python 2 (José Enrique Ruiz)
- [#1807] Change map smooth widths to match Astropy (Atreyee Sinha)
- [#1849] Improve gammapy.stats documentation page (José Luis Contreras)
- [#1766] Add gammapy jupyter CLI for developers (José Enrique Ruiz)
- [#1763] Improve gammapy download (José Enrique Ruiz)
- [#1710] Clean up TableModel implementation (Axel Donath)
- [#1419] PIG 4 - Setup for tutorial notebooks and data (José Enrique Ruiz and Christoph Deil)

.. include:: ../references.txt

.. _gammapy_1p0p1_release:

1.0.1 (March 14th, 2023)
------------------------

Summary
~~~~~~~

- Released Mar 14th, 2023
- 9 contributors
- 31 pull requests since v1.0 (not all listed below)

This is the first bug-fix release after v1.0. Several minor bugs and typos in the documentation are corrected.

- A few corrections have been made to improve the performance of model evaluation, in particular for 1D spectral analyses.
- The "TIMESYS" keyword was not properly exported to the lightcurve table format. This is now corrected. - An issue with interpolation in scipy 1.10 is causing problems in Gammapy. This specific version is excluded from dependencies. The minimal numpy version is now 1.21. - Models created from files with a parameter scale different than one were incorrect and this will be fixed by this patch. Note that by default gammapy always serializes the parameters with a scale of unity, so this bug affected only files written by hand with a scale different from one. - Datasets serialization now correctly supports PSF in reconstructed energies used in HAWC analyses. Contributors ~~~~~~~~~~~~ - Arnau Aguasca - Axel Donath - Bruno Khelifi - Maximilian Linhoff - Lars Mohrmann - Maxime Regeard - Quentin Remy - Atreyee Sinha - Régis Terrier Pull Requests ~~~~~~~~~~~~~ This list is incomplete. - [#4359] Fix interpolation values_scale in TemplateSpatialModel (Quentin Remy) - [#4344] Fix norm_only_changed in MapEvaluator (Quentin Remy) - [#4336] Change label units within parentheses to brackets (Arnau Aguasca) - [#4324] Fix Parameter init if scale is not one (Quentin Remy) - [#4301] Add TIMESYS to lightcurve table meta (Régis Terrier) - [#4275] Remove safe mask in background stacking (Atreyee Sinha) - [#4268] Add an HowTo for the fit non-convergence (Bruno Khelifi) - [#4231] Fix bug in safe mask computation for SpectrumDatasetOnOff (Lars Mohrmann) - [#4221] Fix wrong name in required hdus (Maximilian Linhoff) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/release-notes/v1.0.2.rst0000644000175100001770000000707514721316200017476 0ustar00runnerdocker.. include:: ../references.txt .. _gammapy_1p0p2_release: 1.0.2 (December 6th, 2023) -------------------------- Summary ~~~~~~~ - Released Dec 6th, 2023 - 16 contributors - 60 pull requests since v1.0.1 (not all listed below) This is the second bug-fix of the v1.0 maintenance. Several bugs and typos in the documentation are corrected. - The ``FluxProfileEstimator`` has been fixed so that the models attached to the dataset are properly taken into account. - A ``scale`` attribute has been added to ``TemporalModel`` classes to treat the ``t_ref`` parameter in a consistent manner via a ``TemporalModel.reference_time`` that converts the parameter (defined in mjd) in a proper ``~astropy.time.Time``. It avoids comparing inconsistent scales when evaluating temporal models. - The ``ExcessMapEstimator`` has been fixed to properly take into account the mask safe. - The ``wcs.array_shape`` definition in ``WcsGeom.create()`` was inconsistent with the ``WcsGeom`` shape. This is now fixed. - The offset calculation in ``Background2D.to_3d()`` has been modified to compute angular separation rather than cartesian distance. - The serialization of ``PiecewiseNormSpectralModel`` has been fixed to include interpolation attribute. Contributors ~~~~~~~~~~~~ - Arnau Aguasca - Axel Donath - Kirsty Feijen - Claudio Galelli - Bruno Khélifi - Maximilian Linhoff - Simone Mender - Daniel Morcuende - Laura Olivera-Nieto - Fabio Pintore - Michael Punch - Maxime Regeard - Quentin Remy - Atreyee Sinha - Katrin Streil - Régis Terrier Pull Requests ~~~~~~~~~~~~~ This list is incomplete. 

Contributors
~~~~~~~~~~~~

- Arnau Aguasca
- Axel Donath
- Kirsty Feijen
- Claudio Galelli
- Bruno Khélifi
- Maximilian Linhoff
- Simone Mender
- Daniel Morcuende
- Laura Olivera-Nieto
- Fabio Pintore
- Michael Punch
- Maxime Regeard
- Quentin Remy
- Atreyee Sinha
- Katrin Streil
- Régis Terrier

Pull Requests
~~~~~~~~~~~~~

This list is incomplete.

- [#4937] Fix import of angular_separation for astropy 6 (Maximilian Linhoff)
- [#4936] PiecewiseNormSpectralModel serialising interp (Katrin Streil)
- [#4913] Fix NoOverlapError in SpatialModel.integrate_geom (Quentin Remy)
- [#4876] Add missing @property on PointSpatialModel.is_energy_dependent (Quentin Remy)
- [#4772] Fix plot spectrum function (Simone Mender)
- [#4755] Separate units in pcolormesh (Régis Terrier)
- [#4753] Removes size 1 array to scalar conversion deprecation warnings from numpy (Régis Terrier)
- [#4728] Fixed offset calculation in Background2D.to_3d (Claudio Galelli)
- [#4721] Exposing NormSpectralModels (Kirsty Feijen)
- [#4681] Fix MapEvaluator for regions (Quentin Remy)
- [#4677] Fix wcs.array_shape definition in WcsGeom.create (Quentin Remy)
- [#4657] Fix the FluxProfileEstimator to take into account models (Quentin Remy)
- [#4653] Fix points scaling in TemplateNDSpectralModel (Quentin Remy)
- [#4631] Impose pydantic <2.0 (Régis Terrier)
- [#4619] Correct TemporalModel.plot() unit issue. (Régis Terrier)
- [#4593] Make geom_exposure optional in MapDatasetOnOff.from_geoms (Atreyee Sinha)
- [#4578] Fix ExcessMapEstimator to account for mask safe (Quentin Remy)
- [#4574] Fixing if statements in OGIPDatasetWriter (Maxime Regeard)
- [#4524] Corrected kwargs docstring of plot_grid() (Maxime Regeard)
- [#4520] Support Astropy 5.3 (Axel Donath)
- [#4500] Fix SpectrumDatasetOnOff.stat_sum to support when counts_off is None (Kirsty Feijen)
- [#4486] Scale handling in temporal models (Atreyee Sinha)
- [#4453] Add scale in temporal model (Atreyee Sinha)
- [#4435] Fix wrong ticks in `rad_max` plot (Simone Mender)
- [#4412] LightCurveTemplateModel serialisation (Atreyee Sinha)
- [#4397] Fix plot_spectrum_datasets_off_regions with too many regions (Bruno Khélifi)
- [#4394] Obs filter live time (Maxime Regeard)
- [#4393] Iminuit output (Bruno Khélifi)
- [#4382] Fix error message in interp_to_geom() (Axel Donath)
- [#4380] Adapt default offset for plotting point like IRFs (Atreyee Sinha)

.. include:: ../references.txt

.. _gammapy_1p0_release:

1.0 (November 10th, 2022)
-------------------------

Summary
~~~~~~~

- Released Nov 10th, 2022
- 12 contributors
- 103 pull requests since v0.20.1 (not all listed below)

New features
~~~~~~~~~~~~

This new release is the Long Term Stable (LTS) version 1.0. Most of the changes are in the package infrastructure. A number of improvements and bug corrections have been implemented since v0.20.1. Gammapy v1.0 adds support for the latest version 0.3 of `gadf`_.

*gammapy.data and gammapy.irf*

- Support for HAWC data has been improved. In particular, a `~gammapy.irf.RecoPSFMap` has been added in IRF to support PSF in reco energy.

*gammapy.maps*

- A reprojection method has been implemented on `~gammapy.maps.Map` to allow for reprojection onto a new `~gammapy.maps.MapGeom` object. It supports different spatial geometries but requires identical non-spatial axes. See `~gammapy.maps.Map.reproject_to_geom()` and the sketch below.
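
For illustration, a minimal sketch of the new reprojection method, assuming two overlapping WCS geometries; the binning values are illustrative:

.. code-block:: python

    from gammapy.maps import Map, WcsGeom

    m = Map.create(binsz=0.02, width=4.0, skydir=(0.0, 0.0), frame="galactic")
    target_geom = WcsGeom.create(binsz=0.05, width=3.0, skydir=(0.5, 0.0), frame="galactic")

    # Reproject onto the new spatial geometry (non-spatial axes must be identical)
    reprojected = m.reproject_to_geom(target_geom)
    print(reprojected.geom)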

*infrastructure*

- The tutorial gallery now relies on Sphinx gallery. The tutorials are python scripts with specific syntax for the description cells. This simplifies the documentation build. This will make the history cleaner and tutorial code reviews easier. The download button is now moved to the end of the tutorial. Binder is now working again for tutorials.
- The compliance with the astropy affiliated packages has been improved. In particular, tox is now used for testing and CI.
- Numpy<=1.19 is no longer supported.

API changes
~~~~~~~~~~~

*gammapy.modeling*

- `~gammapy.modeling.models.LightCurveTemplateTemporalModel` internally relies on a `~gammapy.maps.RegionNDMap` and no longer takes a table on ``__init__``. To create one from a `Table` object, users have to go through `~gammapy.modeling.models.LightCurveTemplateTemporalModel.read()` or `~gammapy.modeling.models.LightCurveTemplateTemporalModel.from_table`.

Bug fixes and improvements
~~~~~~~~~~~~~~~~~~~~~~~~~~

*gammapy.astro*

- The DM annihilation spectral model `~gammapy.astro.darkmatter.DarkMatterAnnihilationSpectralModel` can now be serialized to yaml.

*gammapy.data*

- An `~gammapy.data.Observation.copy()` method has been implemented to allow modification of existing `~gammapy.data.Observation` objects.

*gammapy.datasets*

- The dataset name is now serialized with the Dataset as the `NAME` keyword in the primary HDU.
- A `peek` method is now available for the `~gammapy.datasets.MapEvaluator` to help debugging issues with model evaluation on a `~gammapy.datasets.Dataset`.

*gammapy.irf*

- The interpolation scheme of the energy axis was incorrectly set to "linear" by default for the `~gammapy.irf.Background2D`. It is now set to "log".

*gammapy.makers*

- The `~gammapy.makers.MapDatasetMaker` has been adapted to handle the DL3 format introduced in `gadf`_ v0.3 for drifting instruments.

*gammapy.maps*

- `~gammapy.maps.RegionNDMap.sample_coord()` has been implemented to generate events from a region map.

*gammapy.modeling*

- A bug has been corrected in the `~gammapy.modeling.models.TemporalModel.sample_time()` method: an incorrect unit handling made times incorrectly sampled for the `~gammapy.modeling.models.TemplateLightCurveTemporalModel`.
- `~gammapy.modeling.models.TemporalModel.integrate()` now provides a generic integration method.
- A new `~gammapy.modeling.models.TemplatePhaseCurveTemporalModel` has been added to support pulsar-like lightcurves.
- To allow for identical parameter names, the serialization of the covariance matrix no longer exports the parameter names as column headers but simply as the first entry in each row.

*gammapy.stats*

- For consistency with the convention of `errn` and `errp` in the `Estimator` classes, which are always positive quantities, the sign of the value returned by `~gammapy.stats.CountsStatistic.compute_errn()` has been changed (see the sketch below).
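
For illustration, a small sketch of the convention described above, using the Cash-statistic counts class from ``gammapy.stats``; the counts values are illustrative:

.. code-block:: python

    from gammapy.stats import CashCountsStatistic

    stat = CashCountsStatistic(n_on=10, mu_bkg=4.0)

    # compute_errn() now follows the errn/errp convention of the Estimator
    # classes described above (positive quantities)
    print(stat.compute_errn())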

Contributors
~~~~~~~~~~~~

- Arnau Aguasca
- Axel Donath
- Luca Giunti
- Bruno Khelifi
- Mireia Nievas-Rosillo
- Cosimo Nigro
- Laura Olivera-Nieto
- Fabio Pintore
- Quentin Rémy
- Brigitta Sipőcz
- Atreyee Sinha
- Régis Terrier

Pull Requests
~~~~~~~~~~~~~

This list is incomplete. Small improvements and bug fixes are not listed here.

- [#4119] simplification of np.array(set(labels)) (Mireia Nievas-Rosillo)
- [#4115] Add code of conduct file (Axel Donath)
- [#4113] Move binder configuration to gammapy-webpage (Axel Donath)
- [#4112] Add pre commit hooks and black CI (Axel Donath)
- [#4108] Add tests with HAWC data (Laura Olivera-Nieto)
- [#4107] Implement peek methods for map evaluator and psf kernel (luca GIUNTI)
- [#4106] Reactivate gammapy download command (Axel Donath)
- [#4105] Fix WcsNDMap upsampling along axis (Quentin Remy)
- [#4103] Activate binder for tutorials (Axel Donath)
- [#4098] Fixed test failure after introducing new MAGIC RAD_MAX files (Cosimo Nigro)
- [#4095] Filling of the glossary (Bruno Khélifi)
- [#4093] Update Astropy package template (Axel Donath)
- [#4089] Change sign of the value returned by CountsStatistic.compute_errn (Axel Donath)
- [#4088] Add sample_coord for RegionNDMap (Régis Terrier)
- [#4084] Adapt TemplateTemporalModel to use a RegionNDMap internally (Atreyee Sinha)
- [#4083] Implement Observation.copy() and tests (Axel Donath)
- [#4080] Use sphinx gallery for tutorials (Axel Donath)
- [#4079] Update of the mailmap for the git push management (Bruno Khélifi)
- [#4076] Allow for DRIFT mode observations in the MapDatasetMaker (Laura Olivera-Nieto)
- [#4075] Validate nside parameter for HpxGeom (luca GIUNTI)
- [#4073] Make spatial coordinates optional in RegionNDMap.interp_by_coord() (Axel Donath)
- [#4071] Add tag on DM Annihilation spectral model (Régis Terrier)
- [#4067] Fix bug on TemporalModel.sample_time() (Fabio PINTORE)
- [#4058] Serialisation in the primary HDU of the Dataset name (Bruno Khélifi)
- [#4054] Update temporal model docs (aaguasca)
- [#4051] Using astropy Table indices on ObservationTable (Régis Terrier)
- [#4044] Addition of a tutorial about the 1D analysis with the HLI (Bruno Khélifi)
- [#4043] Colour blind friendly visualisations (Bruno Khélifi)
- [#4037] Implement IRF.slice_by_idx() (Axel Donath)
- [#4026] Fix TemplateSpatialModel overwrite (Quentin Remy)
- [#4025] Add support for PSF in reco energy (Quentin Remy)
- [#4024] Add HowTo for adding phase column (Atreyee Sinha)
- [#4022] Introduce consistent .rename_axes and .rename API for maps (Quentin Remy)
- [#4018] Computation of the WcsMap kernel at the nearest valid exposure (Bruno Khélifi)
- [#4017] Introduce a phase curve model (Régis Terrier)
- [#4015] Allow to stack mask_fit in Dataset.stack (Quentin Remy)
- [#4014] Avoid unnecessary copy in Map.stack (Quentin Remy)
- [#4013] Fix zeros errors in models created from 3HWC catalog (Quentin Remy)
- [#4000] MNT: Raise error rather than silently proceed (Brigitta Sipőcz)
- [#3956] Safe mask range on the 1D spectrum tutorial (Bruno Khélifi)
- [#3950] PIG 23 - Gammapy Release Cycle and Version Numbering (Régis Terrier)
- [#3925] Temporal model integration (Axel Donath)
- [#3862] Add Map.reproject method (Quentin Remy)

.. _gammapy_1p1_release:

1.1 (June 13th 2023)
--------------------

- Released June 13th, 2023
- 17 contributors
- 129 pull requests since v1.0 (not all listed below)
- 85 closed issues

Summary
~~~~~~~

This release introduces a number of new features as well as some performance improvements.

Support for energy dependent temporal models is introduced. Only template models are supported. They use a `RegionNDMap` object with an energy and a time axis.
Note that these models are meant for event simulation with the `MapDatasetEventSampler` and cannot be used for modeling and fitting. A tutorial demonstrating how to simulate events using this type of model has been added.

Support for multiprocessing is improved. `FluxPointsEstimator` and `LightCurveEstimator` can now run on several cores. This is defined with the `n_jobs` parameter that can be set on init. The default backend is the Python multiprocessing module. This interface is also used by the `DatasetsMaker` that performs the data reduction loop.

A first step in the separation of the internal data model and the GADF format is introduced in this release in the handling of the pointing and the GTI. This is part of a larger project that will be implemented in the coming feature releases to allow I/O with multiple data formats.

New features
~~~~~~~~~~~~

*gammapy.data*

- A new function `~gammapy.data.get_irfs_features()` can extract the features of IRFs (such as energy resolution bias or PSF radius) of a list of `Observation`. The output list can then be passed to the function `~gammapy.utils.cluster.hierarchical_clustering()`, which will find clusters of IRF features, making it possible to combine `Observation` objects with similar IRF characteristics during analysis.

*gammapy.maps*

- A `Map.reorder_axes` helper method has been introduced.
- Dot product is now supported with the `Map.dot()` function. It applies the dot product on the axes sharing the same name. It can also be called via the `@` operator.
- A `WcsNDMap.to_region_nd_map_histogram()` helper method is introduced to compute the histogram of the map values along the spatial dimensions for each non-spatial axis bin.

*gammapy.modeling*

- The `~gammapy.modeling.models.LightCurveTemplateTemporalModel` now supports energy dependent light curve templates. These models are now created from a `Map` object with a `time` axis and optionally with an `energy_true` axis. They can be read from an `~astropy.table.Table` (only regular light curve models) or from a serialized `Map`. For now, the energy dependent models cannot be used for analysis.
- A new function `~gammapy.modeling.select_nested_models` has been introduced to perform nested model fits and compute the resulting test statistic (TS) between two nested hypotheses. It can be used to determine whether the addition of a source to the model is significant, or to test for specific features in a model (e.g. test the existence of a spectral cutoff).
- A new method has been added, `SpectralModel.spectral_index_error()`, to compute the spectral index at a given energy as well as its error by error propagation of the covariance matrix elements.
- The Franceschini (2018) and Saldana-Lopez (2021) EBL models are now part of the built-in EBL models.
- A spatial correction model can now be added to the `~gammapy.modeling.models.FoVBackgroundModel`. It can be used with a new spatial model, the `~gammapy.modeling.models.PiecewiseNormSpatialModel`.

*gammapy.estimators*

- Multiprocessing is now supported for `FluxPointsEstimator` and `LightCurveEstimator`. Setting the number of cores used is done with the `n_jobs` property that is available on these classes, as in the sketch below.
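
For illustration, a minimal sketch of parallel flux-point estimation; the energy edges, source name and number of cores are illustrative, and ``datasets`` stands for any existing set of reduced datasets:

.. code-block:: python

    import astropy.units as u
    from gammapy.estimators import FluxPointsEstimator

    estimator = FluxPointsEstimator(
        energy_edges=[1, 3, 10] * u.TeV,
        source="crab",  # name of the model component to estimate
        n_jobs=4,       # new in v1.1: distribute the energy bins over 4 cores
    )
    # flux_points = estimator.run(datasets)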

*gammapy.visualization*

- A function to plot `Map` as RGB is now proposed: `~gammapy.visualization.plot_map_rgb()`.
- A function to plot the spectral distributions of predicted counts of models defined on a `Dataset` is now available: `~gammapy.visualization.plot_npred_signal()`.

*documentation*

- A new tutorial demonstrates how to use Gammapy with HAWC data.
- The pulsar analysis tutorial now shows how to create a `MapDatasetOnOff` using phase information.
- A new tutorial demonstrating event sampling with energy dependent temporal models has been added.

*infrastructure*

- Deprecation warnings can now be raised by Gammapy deprecated code. The warnings will appear in the documentation and docstrings as well. Deprecated features will usually be removed in the following feature release.
- Minimal python version is now 3.9. Python 3.10 is supported as well.

API changes
~~~~~~~~~~~

*gammapy.data*

- `Observation.pointing_radec`, `Observation.pointing_altaz` and `Observation.pointing_zen` are now deprecated. One should use `Observation.get_pointing_icrs(time)` instead. This approach will support different pointing strategies. In most cases, taking the pointing at the mid time of an observation is sufficient. This is provided by the property `Observation.tmid` (see the sketch after this section).
- To separate data format and data model and to support observations with fixed pointing, the `~gammapy.data.FixedPointingInfo` has been restructured. Most of its properties have been deprecated. The main functions are `FixedPointingInfo.get_icrs(time, location)` and `FixedPointingInfo.get_altaz(time, location)`.
- The `GTI` now consists of a simple `Table` with `START` and `STOP` as `~astropy.time.Time` objects. `GTI.table` no longer contains a GADF-formatted table with columns representing start and stop time as METs (Mission Elapsed Time). All methods should behave equivalently with the same interface.

*gammapy.irf*

- `~gammapy.irf.load_cta_irfs` is now deprecated. Use instead the more general `~gammapy.irf.load_irf_dict_from_file`.

*gammapy.modeling*

- The `FitResult.iminuit` attribute is now deprecated. It should be accessed from the `optimize_result` property instead, via `FitResult.optimize_result.iminuit`.

*gammapy.utils*

- The `~gammapy.utils.table.table_from_row_data()` function is now deprecated. It can simply be replaced by the regular constructor: `Table(rows)`.
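
For illustration, a short sketch of the migrations described above; the file path is illustrative, and ``obs`` stands for any existing `~gammapy.data.Observation`:

.. code-block:: python

    from gammapy.irf import load_irf_dict_from_file

    # load_cta_irfs is deprecated in favour of the more general loader
    irfs = load_irf_dict_from_file("path/to/irf_file.fits")

    # Observation.pointing_radec is deprecated; query the pointing at a given
    # time instead, e.g. at the mid time of the observation:
    # pointing = obs.get_pointing_icrs(obs.tmid)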
- [#4545] Tutorial on event sampling for energy dependent temporal models (Fabio Pintore) - [#4521] Add covariance copy to support ray (Axel Donath) - [#4510] Introduce WcsNDMap.cutout_and_mask_region (Axel Donath) - [#4508] Implement WcsNDMap.to_region_nd_map_histogram (Axel Donath) - [#4506] Rename append method of MapAxis and LabelMapAxis to concatenate (REGEARD Maxime) - [#4504] Deprecate Fit.minuit member (Axel Donath) - [#4500] Fix SpectrumDatasetOnOff.stat_sum to support when counts_off is None (Kirsty Feijen) - [#4495] Introduce move_axis method on Map (Régis Terrier) - [#4486] Scale handling in temporal models (Atreyee Sinha) - [#4466] Add tutorial for the use of HAWC data (Laura Olivera-Nieto) - [#4459] Evaluation of energy dep temporal model (Atreyee Sinha) - [#4458] adding weights option to fill_events (REGEARD Maxime) - [#4453] Add scale in temporal model (Atreyee Sinha) - [#4444] Integral sensitivity in FluxPointsEstimator (Atreyee Sinha) - [#4435] Fix wrong ticks in rad_max plot (Simone Mender) - [#4430] Add squash method to LabelMapAxis (REGEARD Maxime) - [#4428] Add .to_string() to axis y/xlabel (Arnau Aguasca) - [#4418] Update the _evaluate_timevar_source function in MapDatasetEventSampler (Fabio PINTORE) - [#4417] adding from_stack and append to LabelMapAxis (REGEARD Maxime) - [#4412] LightCurveTemplateModel serialisation (Atreyee Sinha) - [#4409] Add a function that plot the npred_signal of models of a dataset (REGEARD Maxime) - [#4406] Add configuration and helper function to run multiprocessing or ray (Quentin Remy) - [#4402] Support for parallel evaluation in FluxPointsEstimator (Quentin Remy) - [#4397] Fix plot_spectrum_datasets_off_regions with too many regions (Bruno Khélifi) - [#4395] Add the possibility to plot in MJD the light curves (Bruno Khélifi) - [#4393] Iminuit output (Bruno Khélifi) - [#4380] Adapt default offset for plotting point like IRFs (Atreyee Sinha) - [#4370] Implement the _sample_coord_time_energy function in MapDatasetEventSampler (Fabio PINTORE) - [#4369] Pulsar analysis tutorial (REGEARD Maxime) - [#4359] Fix interpolation values_scale in TemplateSpatialModel (Quentin Remy) - [#4352] Adding rad max cut in PhaseBackgroundMaker (REGEARD Maxime) - [#4350] Always use FixedPointingInfo from events header in DataStore (Maximilian Linhoff) - [#4346] Add helper functions for delta TS to significance conversion (Quentin Remy) - [#4336] Change label units within parentheses to brackets (Arnau Aguasca) - [#4326] Introduce internal data model for GTI (Régis Terrier) - [#4324] Fix Parameter init if scale is not one (Quentin Remy) - [#4305] Add SpectralModel.spectral_index_error (Atreyee Sinha) - [#4301] Add TIMESYS to lightcurve table meta (Régis Terrier) - [#4294] Addition of a Map.dot operator (Régis Terrier) - [#4288] Add MapDatasetOnOff type test and associated error for TSMapEstimator (REGEARD Maxime) - [#4282] Add from_region() to DiskSpatialModel (Atreyee Sinha) - [#4280] Allow to load observations with only IRFs defined (Quentin Remy) - [#4277] Fix datasets io with RecoPSFMap (Quentin Remy) - [#4275] Remove safe mask in background stacking (Atreyee Sinha) - [#4264] Deprecate load_cta_irfs, replace usage with load_irf_dict_from_file (Maximilian Linhoff) - [#4252] Map dataset on off in phase maker (REGEARD Maxime) - [#4245] Added an evaluate method for CompoundSpectralModel (Lucas Gréaux) - [#4243] Change _check_intervals from PhaseBackgroundMaker (REGEARD Maxime) - [#4242] Add Observations clustering by IRFs quality (Quentin Remy) - [#4231] Fix bug 
in safe mask computation for SpectrumDatasetOnOff (Lars Mohrmann) - [#4219] Allow reading of IRF files with single-value axes (Lars Mohrmann) - [#4216] Add TestStatisticNested class (Quentin Remy) - [#4215] Adds built-in Franceschini (2018) and Saldana-Lopez (2021) EBL models (Cosimo Nigro) - [#4213] Add deprecation warning system (Régis Terrier) - [#4212] Remove unneeded table util function (Maximilian Linhoff) - [#4210] Add plot_rgb() function in gammapy.visualization (luca GIUNTI) - [#4209] Add support for spatial model correction on background models (Quentin Remy) - [#4208] Add PiecewiseNormSpatialModel (Quentin Remy) - [#4191] Modified Dark Matter Jfactor Computation and Dark Matter Tutorial (Katrin Streil) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/release-notes/v1.2.rst0000644000175100001770000002515214721316200017334 0ustar00runnerdocker.. _gammapy_1p2_release: 1.2 (February 29th 2024) ------------------------ - Released February 29, 2024 - 25 contributors - 215 pull requests since v1.1 (not all listed below) - 69 closed issues Summary ~~~~~~~ New features ~~~~~~~~~~~~ - Metadata containers have been introduced following PIG 25. Preliminary versions have been designed for most data products from DL3 to DL5. They will progressively be accessible on their ``.meta`` attributes. - Parameter prior support has been introduced following PIG 26. A few ``Prior`` classes can be defined on ``Parameter.prior`` and the associated log-prior is added to the total statistics during fitting. - Helper functions have been added to perform computation of lifetime and total observation time maps. - Preliminary support for asymmetric IRFs has been introduced. A tutorial shows how to implement new IRF classes to support non-axisymmetric IRFs. - Improved support for temporal analysis with the addition of helper functions to quantify lightcurve variability. - New helper classes have been added to perform event sampling for a single observation or a set of observations: ``gammapy.datasets.ObservationEventSampler`` and ``gammapy.data.ObservationsEventSampler``. - A checksum option has been added on read and write methods in Gammapy. It follows the FITS standard and reuses the astropy methods and behaviour. A checksum for YAML files has been introduced as well. - Improved support for parallel processing. *gammapy.catalogs* - Update the 4FGL catalog to include DR4. - Added 1LHAASO catalog. *gammapy.data* - A general scheme for metadata support has been introduced. The ``Metadata`` base class has been designed according to PIG 25. - Added a function to remove a time interval from a ``GTI``. - Added function to export part of a ``DataStore`` to an IVOA-compliant ObsCore table. *gammapy.makers* - Prototype support for asymmetric IRF in Gammapy's ``Maker`` classes. A tutorial exposing how to create such IRFs has been added. *gammapy.maps* - Implement ``TimeMapAxis.pix_to_coord()`` - Implement ``TimeMapAxis.to_gti()`` *gammapy.modeling* - Added a function to determine a pivot energy for all spectral models. - Added position as a parameter for the ``TemplateSpatialModel``. CAVEAT: results are correct only when the fitted position is close to the map center. - Added spatial parameters in ``FoVBackgroundModel``. - SkyModel evaluation now supports a TimeMapAxis. - Adapt FluxPointsDataset to directly fit light curves. *gammapy.estimators* - Add a ``slice_by_coord()`` function on ``FluxMaps``.
- Introduce timing utility functions: point-to-point flux variance, fractional excess variance, doubling/halving times for light curves - Add optional sensitivity estimation in ``ExcessMapEstimator``. - Added support for NormSpectralModels in FluxPointsDataset / FluxPoints computations. - Fit status and degrees of freedom have been added to ``FluxMaps``. - Add a dedicated ``EnergyDependentMorphologyEstimator`` as well as a tutorial demonstrating its usage. - Add functionality to rebin flux points using the likelihood profiles. - GTI tables are now serialised on FluxPoints objects. *gammapy.visualization* - Add a plot function for the distribution of ``Map`` data. API changes ~~~~~~~~~~~ - Source parameters are now frozen on init in ``FluxEstimator`` classes. - The ``norm`` parameter is now passed as an argument to the various flux estimators. ``Parameter.is_norm`` is now deprecated. - The default index of ``ExpCutoffPowerlawNormSpectralModel`` has been changed to 0 for consistency with the ``PowerlawNormSpectralModel``. Bug fixes and improvements ~~~~~~~~~~~~~~~~~~~~~~~~~~ - Correct ``MapDataset.info_dict()`` to use background model rather than IRF background when giving excess counts and significance. - Import ray only when needed. - Added information on number of degrees of freedom on ``FluxMaps`` and ``FluxPoints`` objects. - Reduced memory usage of ``MapEvaluator`` and ``PSFMap.get_psf_kernel()``. - Added support for multiprocessing in ``FluxProfileEstimator``. - Added multiprocessing for ``WcsNDMap`` convolution. - Add a context manager for multiprocessing. - Add a faster reprojection method: ``reproject_by_image``. - Use interpolation for dark matter mass. Add Zhao profile and ``DarkMatterDecaySpectralModel``. - The asymmetric errors and upper limit calculations in ``CashCountsStatistic`` have been replaced by an equivalent analytical expression. Documentation ~~~~~~~~~~~~~ Contributors ~~~~~~~~~~~~ - Fabio Acero - Juan Bernete - Noah Biederbeck - Julia Djuvsland - Axel Donath - Kirsty Feijen - Stefan Fröse - Claudio Galelli - Bruno Khélifi - Jana Konrad - Paula Kornecki - Maximilian Linhoff - Kurt McKee - Simone Mender - Daniel Morcuende - Laura Olivera-Nieto - Fabio Pintore - Michael Punch - Maxime Regeard - Quentin Remy - Atreyee Sinha - Hanna Stapel - Katrin Streil - Régis Terrier - Tim Unbehaun Pull Requests ~~~~~~~~~~~~~ This list is incomplete. Small improvements and bug fixes are not listed here.
- [#5044] Add stat_null computation on ParameterEstimator (Atreyee Sinha) - [#5040] Add degrees of freedom on FluxMaps (Atreyee Sinha) - [#5015] Examples of radially asymmetric IRFs (Atreyee Sinha) - [#4994] Spatial parameters in FovBackgroundModel (Katrin Streil) - [#4992] Adding a function to guess the format of a FluxPoints object for serialization (Claudio Galelli) - [#4989] Reduce memory usage of MapEvaluator (Quentin Remy) - [#4978] Support negative offset for Background2d.to_3d (Atreyee Sinha) - [#4975] Reduce memory usage of get_psf_kernel (Quentin Remy) - [#4973] Add position as a parameter for TemplateSpatialModel (Atreyee Sinha) - [#4971] Use `FixedPointingInfo` in notebook (Atreyee Sinha) - [#4970] Adapt FluxPointsDataset to fit light curves (Atreyee Sinha) - [#4942] Parallel support for FluxProfileEstimation (Quentin Remy) - [#4940] Fix MapEvaluator for the apply_edisp=False case (Quentin Remy) - [#4937] Fix import of angular_separation for astropy 6 (Maximilian Linhoff) - [#4936] PiecewiseNormSpectralModel serialising interp (Katrin Streil) - [#4917] Add new class to directly simulate observations (Maximilian Linhoff) - [#4904] Deprecate is_norm on parameter (Quentin Remy) - [#4902] Add norm attribute to estimators and deprecate previous norm related attributes (Quentin Remy) - [#4886] Introduce hierarchical metadata structures (Régis Terrier) - [#4879] Fix energy dependent temporal model simulation (Quentin Remy) - [#4854] Notebook to sphinx-gallery script (REGEARD Maxime) - [#4851] Parallel support for WcsNDMap map convolution (Quentin Remy) - [#4850] Add utility function to split dataset into multiple datasets (Quentin Remy) - [#4849] Add TimeMapAxis.to_gti() (Atreyee Sinha) - [#4847] Variability tutorial (Claudio Galelli) - [#4845] Add context manager for multiprocessing configuration (Quentin Remy) - [#4837] Add checksum argument to gammapy products write functions (Régis Terrier) - [#4835] Management of metadata for `Models` (Bruno Khélifi) - [#4834] Adding prior stat sum to datasets (Katrin Streil) - [#4829] Caching gti and radmax (REGEARD Maxime) - [#4828] Adapt SkyModel to evaluate on TimeMapAxis (Atreyee Sinha) - [#4822] Add a function to delete a time interval from GTI (Claudio Galelli) - [#4817] Computation of total observation time map (Atreyee Sinha) - [#4814] Introduce a function to compute the doubling/halving time for a lightcurve (Claudio Galelli) - [#4810] Adding a tutorial for observational clustering (Astro-Kirsty) - [#4808] adding `Observations` in memory generator (REGEARD Maxime) - [#4805] Description of the arguments of the class `Observation` (Bruno Khélifi) - [#4802] Adapt detect tutorial to include flux parameters in find peaks (Astro-Kirsty) - [#4785] Use interpolation for dark matter mass (Stefan Fröse) - [#4783] Add EnergyDependentMorphologyEstimator (Astro-Kirsty) - [#4770] Raise error if the predicted event number is too large in event sampling (Fabio PINTORE) - [#4759] Display the default model parameters in docstrings (Astro-Kirsty) - [#4753] Removes size 1 array to scalar conversion deprecation warnings from numpy (Régis Terrier) - [#4750] Support pydantic v2.0 (Axel Donath) - [#4741] Add Zhao profile (Stefan Fröse) - [#4740] Add DarkMatterDecaySpectralModel (Stefan Fröse) - [#4738] Introduce Observation metadata container (Régis Terrier) - [#4729] Change default index for NormSpectralModel (Quentin Remy) - [#4726] Introduce a function to compute the point-to-point fractional variance (Claudio Galelli) - [#4714] Replace CashCountsStatistic 
error calculation by analytical expression (Régis Terrier) - [#4703] Update 4FGL catalog default to DR4 (Quentin Remy) - [#4697] Deduce pointing mode from arguments in FixedPointingInfo (Maximilian Linhoff) - [#4677] Fix wcs.array_shape definition in WcsGeom.create (Quentin Remy) - [#4671] Introduce metadata base class (Régis Terrier) - [#4669] Add the progress bar for the DataStore (Bruno Khélifi) - [#4668] Multidimensional geom support in SkyModel.integrate_geom and evaluate_geom (Régis Terrier) - [#4664] Add a faster reprojection method : reproject_by_image (Quentin Remy) - [#4660] Add function to convert hermes maps to gammapy compatible format (Quentin Remy) - [#4657] Fix the FluxProfileEstimator to take into account models (Quentin Remy) - [#4638] Add a `from_stack` method on `Observations` (REGEARD Maxime) - [#4635] Add function to determine pivot energy for any spectral model (Astro-Kirsty) - [#4628] Match energy binning per decade to pyirf's (JBernete) - [#4620] Adding prior class (Katrin Streil) - [#4615] Improve sensitivity example (Maximilian Linhoff) - [#4608] Add a slice_by_coord function for FluxMaps (Claudio Galelli) - [#4599] Add a SafeMaskMaker at DL3 level (Atreyee Sinha) - [#4595] Add 1LHAASO to gammapy.catalog (Quentin Remy) - [#4584] Add optional sensitivity computation in ExcessMapEstimator (Quentin Remy) - [#4574] Fixing if statements in OGIPDatasetWriter (REGEARD Maxime) - [#4567] Freeze source parameters in FluxEstimator (Régis Terrier) - [#4561] Export Datastore to Obscore table (PaulaKx) - [#4546] Remove is_ul column in FluxPointsEstimator if no upper limit is defined (Astro-Kirsty) - [#4540] Add function to extract values from FluxMaps (Astro-Kirsty) - [#4501] Exposing computation of the fractional excess variance (Claudio Galelli) - [#4491] PIG 27 - Metadata structure (Régis Terrier) - [#4485] Implement TimeMapAxis.pix_to_coord (Atreyee Sinha) - [#4432] Serialise gti table to flux points object (Atreyee Sinha) - [#4408] Add plot function for 1D distribution of map data (REGEARD Maxime) - [#4381] PIG 16 - Model Priors API (Noah Biederbeck) - [#4217] FluxPointsDataset support model with spatial template and NormSpectralModel (Quentin Remy) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/release-notes/v1.3.rst0000644000175100001770000002770014721316200017336 0ustar00runnerdocker.. include:: ../references.txt .. _gammapy_1p3_release: 1.3 (November 26th, 2024) ------------------------- - Released November 26th, 2024 - 25 contributors - 212 pull requests since v1.2 (not all listed below) - 75 closed issues Summary ~~~~~~~ This release introduces a number of performance improvements, new features and bug fixes. Improved support is provided for joint analysis with different event types. There has been significant code cleanup, especially with respect to the documentation, and three new tutorials have been added. This release is compatible with numpy >=2.0. The Sherpa dependency has been pinned to >4.16; it is not included in the gammapy-1.3-environment.yml file to avoid possible installation issues. API changes ~~~~~~~~~~~ A few API-breaking changes have been introduced in this version: - Note that the `~gammapy.irf.Background3D` FOV-lon alignment has been reverted to follow the correct GADF convention. A check is made to ensure compatibility with previous H.E.S.S. background models. Please check the alignment if you are building 3D background models.
- The order of arguments in the `~gammapy.modeling.models.FoVBackgroundModel` has been modified; `dataset_name` is now a required first positional argument. - All `~gammapy.estimators.Estimator` init functions that require a spectral model now use `spectral_model` as the argument name. Features deprecated since Gammapy v1.2 have been removed. Infrastructure ~~~~~~~~~~~~~~ - Matomo is now used for website analytics, replacing Google Analytics. Documentation improvements ~~~~~~~~~~~~~~~~~~~~~~~~~~ - The pydata-sphinx-theme version used to build the documentation has been updated. The overall appearance has therefore changed a bit. - Tutorials are now ordered in a logical workflow in each category. - Three new tutorials have been added: - one on modeling the EBL absorption - one exposing the general API of `gammapy.estimators` and `~gammapy.estimators.FluxMaps` - one performing time-dependent spectroscopy - Code examples have been added in docstrings. New features ~~~~~~~~~~~~ *gammapy.catalogs* - Support for the 2PC and 3PC pulsar catalogs from Fermi-LAT has been added. *gammapy.datasets* - Performance of Fermi-LAT analysis has been improved through adaptations of the PSF kernel computation. - Stacking of acceptance maps has been improved. *gammapy.maps* - Multiple non-spatial axes are now supported in MapDataset creation. Full likelihood computation with additional axes is not yet supported. - Map axes with periodic boundary conditions are now supported. - An error is raised for maps with invalid input shapes instead of silently broadcasting. *gammapy.estimators* - Added support for joint TS maps: `TSMapEstimator.run()` now accepts `Datasets` as input. - Maps of likelihood profiles can now be computed with the `~gammapy.estimators.TSMapEstimator` via the argument `selection_optional=["stat_scan"]`. - Bin-wise likelihood profiles are now stored on `~gammapy.estimators.FluxMaps`. - `~gammapy.estimators.FluxMaps` obtained from different datasets (e.g. different event types) can be combined. - Added a function to combine flux maps in `~gammapy.estimators.utils.combine_flux_maps`. The combination uses the likelihood profile maps if available, otherwise a Gaussian approximation is used. - Added alpha maps to the output of `~gammapy.estimators.ExcessMapEstimator`. *gammapy.stats* - Timmer-Koenig algorithm with leakage protection for simulating a time series from a power spectrum has been implemented. - Discrete structure function for a variable source using the Emmanoulopoulos et al. (2010) algorithm has been added. - Discrete cross-correlation function computation has been added. *gammapy.modeling* - The `~gammapy.modeling.FitResult` object is now exposed as an important API element, combining the results from the optimisation and covariance of the fit. It can be saved to disk in the form of a YAML file thanks to a new `~gammapy.modeling.FitResult.write()` function. - Unique naming of model parameters has been adapted, leading to resolution of issues with covariance matrix computation for complex model lists. - Time scale issues in the simulation of energy-dependent temporal models have been resolved. Bug fixes and improvements ~~~~~~~~~~~~~~~~~~~~~~~~~~ - ON and OFF phase intervals are copied in `~gammapy.makers.PhaseBackgroundMaker`. - Added min_pix option in `~gammapy.maps.WcsGeom.cutout`. - Added copy method on `~gammapy.estimators.FluxMaps`. - Fixed `~gammapy.modeling.select_nested_models` if there are no free parameters for the null hypothesis.
- Allowed extrapolation only along the spatial dimension in `mask_safe_edisp`. - Fixed `cutout_template_models` to work if `models` is None. - Fixed angle wrapping of spatial parameters in `~gammapy.modeling.models.PiecewiseNormSpatialModel`. - Fixed map evaluation if no ``edisp`` is set. - Added missing parallel processing options. - Added `~gammapy.maps.TimeMapAxis.to_table()`. - Added a warning in `~gammapy.modeling.models.TemplateSpatialModel` for the presence of NaN values. - Fixed ignored overwrite in `~gammapy.datasets.Datasets.write`. - Improved plotting of spectral residuals: the mask is no longer applied in the computation; a visual mask is drawn instead. - Added a `strict` option to `~gammapy.maps.MapAxis.downsample`. - Fixed `~gammapy.data.ObservationsEventsSampler` for observations based on full-enclosure IRFs. - Added a method to normalize `~gammapy.modeling.models.TemplatePhaseCurveTemporalModel`. - Definition of `UniformPrior` has been fixed to return 0 within min and max values instead of 1. Known issues ~~~~~~~~~~~~ - Reading a `~gammapy.modeling.models.Models` file created with Gammapy<1.2 that contains a `~gammapy.modeling.models.TemplateSpatialModel` can fail because of the covariance format. A workaround is to remove the covariance entry in the `~gammapy.modeling.models.Models` file. Contributors ~~~~~~~~~~~~ - Fabio Acero - Arnau Aguasca-Cabot - Axel Donath - Kirsty Feijen - Stefan Fröse - Claudio Galelli - Bruno Khélifi - Maximilian Linhoff - Lars Mohrmann - Daniel Morcuende - Laura Olivera Nieto - Mireia Nievas - Michele Peresano - Fabio Pintore - Maxime Regeard - Quentin Remy - Gerrit Roellinghoff - Vasco Schiavo - Atreyee Sinha - Brigitta Sipőcz - Hanna Stapel - Régis Terrier - Santiago Vila - Samantha Wong - Pei Yu Pull Requests ~~~~~~~~~~~~~ This list is incomplete. Small improvements and bug fixes are not listed here.
- [#5545] Fix ObservationsEventsSampler for observations based on full-enclosure IRFs (Michele Peresano) - [#5525] Norm TemplatePhaseCurveTemporalModel (Maxime Regeard) - [#5472] Add the Matomo tracker into our documentation web pages (Bruno Khélifi) - [#5466] Improve the time resolved spectroscopy tutorial (Claudio Galelli) - [#5462] Update all the documentation with the new name of CTA : CTAO (Bruno Khélifi) - [#5453] Fix time shift from sampled events with `MapDatasetEventSampler` (Fabio Pintore) - [#5449] combine_flux_maps supports for different energy axes (Quentin Remy) - [#5448] Fix typo in Uniform prior returned value (Fabio Acero) - [#5445] Add a to_table() method to TimeMapAxis (Claudio Galelli) - [#5438] Speed up event sampler (Fabio Pintore) - [#5437] Remove mask in residual plotting (Atreyee Sinha) - [#5433] Sort sampled events by time in `MapDatasetEventSampler` (Fabio Pintore) - [#5427] A time resolved spectroscopy estimator (Atreyee Sinha) - [#5423] Function for the discrete cross correlation function (Claudio Galelli) - [#5409] Add strict option to `MapAxis.downsample` (Kirsty Feijen) - [#5408] Add alpha maps to `ExcessMapEstimator` (Kirsty Feijen) - [#5407] Add an API tutorial for Estimators (Kirsty Feijen) - [#5405] Add missing parallel options (Quentin Remy) - [#5390] Fix region evaluation without psf convolution (Quentin Remy) - [#5389] Fix `LabelMapAxis` so it doesn't reorder the labels (Kirsty Feijen) - [#5385] Fix reco_exposure computation with mask safe in ExcessMapEstimator (Quentin Remy) - [#5382] Fix and tweaks for the stat_scan in TSMapEstimator (Quentin Remy) - [#5381] Apply safe mask in TSMapEstimator (Quentin Remy) - [#5380] Add max_niter as option in TSMapEstimator (Quentin Remy) - [#5378] Observations `__getitem__` method addition (Maxime Regeard) - [#5370] Add offset mask to make_effective_livetime_map (Quentin Remy) - [#5366] Remove use of np.rint in pix_tuple_to_idx (Atreyee Sinha) - [#5356] Brought gammapy.visualization.plot_distribution in line with its documentation (Gerrit Roellinghoff) - [#5353] Add support for joint TSmap estimation (Quentin Remy) - [#5350] Adapt `RadMax2D.plot_rad_max_vs_energy` (Kirsty Feijen) - [#5346] Add examples to fit function docstrings (Kirsty Feijen) - [#5342] Fix evaluation if no edisp is set (Quentin Remy) - [#5320] Fix parameters unique names (Quentin Remy) - [#5316] Add CovarianceMixin for multicomponent models classes (Quentin Remy) - [#5314] Improve get_psf_kernel performance (Quentin Remy) - [#5312] More complex background spectral model exposed (Kirsty Feijen) - [#5304] introducing a `filename` argument to TemplateSpatialModel.write (Fabio Pintore) - [#5303] QOL changes in Timmer&Konig (Claudio Galelli) - [#5300] Fix wrap in PiecewiseNormSpatialModel (Quentin Remy) - [#5298] Coherent units in `TemporalModel.sample_time` (Fabio Pintore) - [#5297] Precompute the PSF Kernel if the PSFMap has only one bin (Quentin Remy) - [#5289] Add new methods for combine_flux_maps to use likelihood profile or its approximation (Quentin Remy) - [#5285] Add support for stat_scan in TSMapEstimator (Quentin Remy) - [#5280] Add min_pix option in WcsGeom.cutout and set it to 3 for IRFs (Quentin Remy) - [#5279] Add copy method on FluxMaps (Quentin Remy) - [#5275] Add a notebook for EBL correction example (Atreyee Sinha) - [#5271] Allow extrapolation only along spatial dimension in `mask_safe_edisp` (Quentin Remy) - [#5270] Modify acceptance stacking behavior (Régis Terrier) - [#5269] Add leakage protection factor in the Timmer Konig 
algorithm (Claudio Galelli) - [#5258] Add write function to `FitResult` (Kirsty Feijen) - [#5255] Expose `FitResult` (Kirsty Feijen) - [#5254] Make non-FITS table file from FluxPoints overwritable (Michele Peresano) - [#5222] Allow `MapAxis` to be passed in `SpectralModel.plot()` (Kirsty Feijen) - [#5207] Specify the models with duplicated name in `Models` (Fabio Pintore) - [#5206] Convert negative `npred` values to zero in `MapDatasetEventSampler` (Fabio Pintore) - [#5205] Specify the sampled model in `MapDatasetEventSampler` (Fabio Pintore) - [#5200] Use two argument form of Time to parse reference time (Maximilian Linhoff) - [#5188] Raise an error if the geometry of the exclusion mask passed to `ReflectedRegionsBackgroundMaker` is not an image (Maxime Regeard) - [#5186] Adjust `to_edisp_kernel` input to `MapAxis` (Kirsty Feijen) - [#5184] Support additional axes in Mapdataset.create (Régis Terrier) - [#5180] Raise error on invalid input shape in Map (Maximilian Linhoff) - [#5176] Fix `observatory_earth_location` (Stefan Fröse) - [#5169] Add boundary condition to create a PeriodicMapAxis (Atreyee Sinha) - [#5161] Flux maps combination (Quentin Remy) - [#5160] Fix select_nested_models if there is no free parameters for the null hypothesis (Quentin Remy) - [#5156] Introduction of the structure function for variability studies (Claudio Galelli) - [#5145] LightCurveTemplateTemporalModel method to generate a model from a periodogram by using the Timmer algorithm (Claudio Galelli) - [#5135] Add different options to compute stat_array on FluxPointsDatasets (Quentin Remy) - [#5129] Copy on and off phase intervals in `PhaseBackgroundMaker` (Maxime Regeard) - [#5125] Fix implementation of the testing code for larger docstring examples (Kirsty Feijen) - [#5118] css for sphinx v15 (Hanna Stapel) - [#5115] Add 3 TS definitions to compute with TestStatisticNested class (Quentin Remy) - [#5058] 3PC Fermi catalog (Maxime Regeard) - [#5057] 2PC Fermi catalog (Maxime Regeard) - [#4996] MAINT: use new location for dev wheels (Brigitta Sipőcz) - [#4983] Adjust getting-started page (Kirsty Feijen) - [#4433] Add a function to combine excess maps (Quentin Remy) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/release-notes/v2.0.rst0000644000175100001770000000011714721316200017325 0ustar00runnerdocker.. _gammapy_2p0_release: 2.0 (unreleased) ---------------- - No changes yet ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/serve.py0000644000175100001770000000152514721316200015042 0ustar00runnerdocker""" Serve the docs page and open in default browser. 
Combination of these two SO (license CC-BY-SA 4.0) answers: https://stackoverflow.com/a/51295415/3838691 https://stackoverflow.com/a/52531444/3838691 """ import time import webbrowser from functools import partial from http.server import HTTPServer, SimpleHTTPRequestHandler from threading import Thread ip = "localhost" port = 8000 directory = "docs/_build/html" server_address = (ip, port) url = f"http://{ip}:{port}" def open_docs(): # give the http server a little time to start up time.sleep(0.1) webbrowser.open(url) if __name__ == "__main__": Thread(target=open_docs).start() Handler = partial(SimpleHTTPRequestHandler, directory=directory) httpd = HTTPServer(server_address, Handler) try: httpd.serve_forever() except KeyboardInterrupt: pass ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/sphinxext.py0000644000175100001770000000477014721316200015755 0ustar00runnerdocker# -*- coding: utf-8 -*- # Licensed under a 3-clause BSD style license - see LICENSE.rst # # Gammapy documentation ordering configuration file. # # GalleryExplicitOrder is a re-implementation of the private sphinx_gallery class `_SortKey`. # TUTORIAL_SORT_DICT = { # **Tutorials** # introductory "overview.py": 0, "analysis_1.py": 1, "analysis_2.py": 2, # data exploration "hess.py": 0, "cta.py": 1, "fermi_lat.py": 2, "hawc.py": 3, # data analysis # 1d "cta_sensitivity.py": 0, "spectral_analysis.py": 1, "spectral_analysis_hli.py": 2, "spectral_analysis_rad_max.py": 3, "extended_source_spectral_analysis.py": 4, "spectrum_simulation.py": 5, "sed_fitting.py": 6, "ebl.py": 7, # 2d "detect.py": 0, "ring_background.py": 1, "modeling_2D.py": 2, # 3d "analysis_3d.py": 0, "cta_data_analysis.py": 1, "energy_dependent_estimation.py": 2, "analysis_mwl.py": 3, "simulate_3d.py": 4, "event_sampling.py": 5, "event_sampling_nrg_depend_models.py": 6, "flux_profiles.py": 7, # time "light_curve.py": 0, "light_curve_flare.py": 1, "variability_estimation.py": 2, "time_resolved_spectroscopy.py": 3, "light_curve_simulation.py": 4, "pulsar_analysis.py": 5, # api "observation_clustering.py": 9, "irfs.py": 0, "maps.py": 4, "mask_maps.py": 5, "makers.py": 7, "datasets.py": 6, "models.py": 1, "priors.py": 2, "model_management.py": 10, "fitting.py": 11, "estimators.py": 12, "catalog.py": 8, "astro_dark_matter.py": 3, # scripts "survey_map.py": 0, } class BaseExplicitOrder: """ Base class inspired by sphinx_gallery _SortKey to customize sorting based on a dictionary. The dictionary should contain the filename and its order in the subsection. """ def __repr__(self): return f"<{self.__class__.__name__}>" def __init__(self, src_dir): self.src_dir = src_dir class TutorialExplicitOrder(BaseExplicitOrder): """ Class that handles the ordering of the tutorials in each gallery subsection.
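A sketch of the assumed wiring in ``conf.py`` (an assumption, not verified here: sphinx-gallery instantiates the sort-key class with the gallery source directory and then calls the instance once per file name, matching the ``__init__(src_dir)`` and ``__call__(filename)`` signatures above): ``sphinx_gallery_conf = {..., "within_subsection_order": TutorialExplicitOrder}``.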
""" sort_dict = TUTORIAL_SORT_DICT def __call__(self, filename): if filename in self.sort_dict.keys(): return self.sort_dict.get(filename) else: return 0 ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.152642 gammapy-1.3/docs/user-guide/0000755000175100001770000000000014721316215015420 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.152642 gammapy-1.3/docs/user-guide/astro/0000755000175100001770000000000014721316215016550 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.152642 gammapy-1.3/docs/user-guide/astro/darkmatter/0000755000175100001770000000000014721316215020706 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/darkmatter/index.rst0000644000175100001770000001177314721316200022552 0ustar00runnerdocker.. _astro-darkmatter: *********** Dark matter *********** Introduction ============ The `gammapy.astro.darkmatter` module provides spatial and spectral models for indirect dark matter searches, using PPPC4DM models. This introduction is aimed at people who already have some experience with dark matter analysis. For a thorough introduction see e.g. `Cirelli 2014`_. The spatial distribution of dark matter halos is typically modeled with radially symmetric profiles. Common profiles are the ones by Navarro, Frenk and White (NFW) or Einasto for cuspy and an Isothermal or Burkert profile for cored dark matter distributions (see `gammapy.astro.darkmatter.DMProfile`). The spectral models in `gammapy.astro.darkmatter.PrimaryFlux` are based on `Cirelli et al. 2011`_, who provide tabulated spectra for different annihilation channels). These models are most commonly used in VHE dark matter analyses. Other packages ============== There are many other packages out there that implement functionality for dark matter analysis, their capabilities are summarized in the following FermiST ------- The Fermi Science tools have a `DMFitFunction`_ with the following XML serialization format .. code-block:: xml The `DMFitFunction`_ is only a spectral model and the spatial component is set using a point source. A spatial template can obviously be used. Utilities to create such sky maps are for example `fermipy/dmsky`_ but it seems like this package is basically a collection of spatial models from the literature. There is also `fermiPy/dmpipe`_ but it also does not seem to implement any spatial profiles. The DMFitFunction is also implemented in `fermipy.spectrum.DMFitFunction`_. It is a spectral model based on `Jeltema & Profuma 2008`_. From a quick look I didn't see where they get the spectral template from (obviously not `Cirelli et al. 2011`_) but `DarkSUSY`_ is mentioned in the paper. DMFitFunction is also implemented in `astromodels`_. None of the mentioned packages implement the spectral models by `Cirelli et al. 2011`_ CLUMPY ------ `CLUMPY`_ is a package for γ-ray signals from dark matter structures. The core of the code is the calculation of the line of sight integral of the dark matter density squared (for annihilations) or density (for decaying dark matter). CLUMPY is written in C/C++ and relies on the CERN ROOT library. There is no Python wrapper, as far as I can see. The available dark matter profiles go beyond what is used in usual VHE analyses. It might be worth looking into this package for cross checking the functionality in gammapy. 
gamLike ------- `GamLike`_ contains likelihood functions for most leading gamma-ray indirect searches for dark matter, including Fermi-LAT observations of dwarfs and the Galactic Centre (GC), HESS observations of the GC, and projected sensitivities for CTAO observations of the GC. It is released in tandem with the `GAMBIT`_ module `DarkBit`_. DarkBit can be used for directly computing observables and likelihoods, for any combination of parameter values in some underlying particle model. Using gammapy.astro.darkmatter ------------------------------ .. minigallery:: gammapy.astro.darkmatter .. _Cirelli et al. 2011: http://iopscience.iop.org/article/10.1088/1475-7516/2011/03/051/pdf .. _Cirelli 2014: http://www.marcocirelli.net/otherworks/HDR.pdf .. _DMFitFunction: https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/source_models.html#DMFitFunction .. _fermipy/dmsky: https://github.com/fermiPy/dmsky .. _fermipy/dmpipe: https://github.com/fermiPy/dmpipe .. _fermipy.spectrum.DMFitFunction: https://github.com/fermiPy/fermipy/blob/1c2291a4cbdf30f3940a472bcce2a45984c339a6/fermipy/spectrum.py#L504 .. _Jeltema & Profumo 2008: http://iopscience.iop.org/article/10.1088/1475-7516/2008/11/003/meta .. _astromodels: https://github.com/giacomov/astromodels/blob/master/astromodels/functions/dark_matter/dm_models.py .. _CLUMPY: http://lpsc.in2p3.fr/clumpy/ .. _DarkSUSY: http://www.darksusy.org/ .. _GamLike: https://bitbucket.org/weniger/gamlike .. _GAMBIT: https://gambitbsm.org/ .. _DarkBit: https://link.springer.com/article/10.1140%2Fepjc%2Fs10052-017-5155-4 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/index.rst0000644000175100001770000000172414721316200020407 0ustar00runnerdocker.. _astro: Astrophysics ============ This module contains utility functions for some astrophysical scenarios: * `~gammapy.astro.source` for astrophysical source models * `~gammapy.astro.population` for astrophysical population models * `~gammapy.astro.darkmatter` for dark matter spatial and spectral models The `gammapy.astro` sub-package is in a prototyping phase and its scope and future are currently being discussed. It is likely that some functionality will be removed or split out into a separate package at some point. Getting started --------------- The `gammapy.astro` namespace is empty. Use these import statements: .. testcode:: from gammapy.astro import source from gammapy.astro import population from gammapy.astro import darkmatter Please refer to the Getting Started section of each sub-package for a further introduction. Sub-packages ------------ .. toctree:: :maxdepth: 1 source/index population/index darkmatter/index ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.152642 gammapy-1.3/docs/user-guide/astro/population/0000755000175100001770000000000014721316215020742 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/population/index.rst0000644000175100001770000000376514721316200022610 0ustar00runnerdocker.. _astro-population: ************************************** Astrophysical source population models ************************************** Introduction ============ The `gammapy.astro.population` module provides a simple framework for population synthesis of gamma-ray sources, which is useful in the context of surveys and population studies.
Getting started =============== The following example illustrates how to simulate a basic catalog including a spiral arm model. .. testcode:: import astropy.units as u from gammapy.astro.population import make_base_catalog_galactic max_age = 1E6 * u.yr SN_rate = 3. / (100. * u.yr) n_sources = int(max_age * SN_rate) table = make_base_catalog_galactic( n_sources=n_sources, rad_dis='L06', vel_dis='F06B', max_age=max_age, spiralarms=True, ) The total number of sources is determined assuming a maximum age and a supernova rate. The table returned is an instance of `~astropy.table.Table` which can be used for further processing. The example population with spiral arms is illustrated in the following plot. .. plot:: user-guide/astro/population/plot_spiral_arms.py Galactocentric spatial distributions ------------------------------------ Here is a comparison plot of all available radial distribution functions of the surface density of pulsars and related objects used in the literature: .. plot:: user-guide/astro/population/plot_radial_distributions.py TODO: add illustration of Galactocentric z-distribution model and combined (r, z) distribution for the Besancon model. Spiral arm models ----------------- Two spiral arm models of the Milky Way are available: `~gammapy.astro.population.ValleeSpiral` and `~gammapy.astro.population.FaucherSpiral`. .. plot:: user-guide/astro/population/plot_spiral_arm_models.py Velocity distributions ---------------------- Here is a comparison plot of all available velocity distribution functions: .. plot:: user-guide/astro/population/plot_velocity_distributions.py ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/population/plot_radial_distributions.py0000644000175100001770000000144014721316200026561 0ustar00runnerdocker"""Plot radial surface density distributions of Galactic sources.""" import numpy as np import astropy.units as u import matplotlib.pyplot as plt from gammapy.astro.population import radial_distributions from gammapy.utils.random import normalize radius = np.linspace(0, 20, 100) * u.kpc for key in radial_distributions: model = radial_distributions[key]() if model.evolved: linestyle = "-" else: linestyle = "--" label = model.__class__.__name__ x = radius.value y = normalize(model, 0, radius[-1].value)(radius.value) plt.plot(x, y, linestyle=linestyle, label=label) plt.xlim(0, radius[-1].value) plt.ylim(0, 0.26) plt.xlabel("Galactocentric Distance [kpc]") plt.ylabel("Normalized Surface Density [kpc^-2]") plt.legend(prop={"size": 10}) plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/population/plot_spiral_arm_models.py0000644000175100001770000000307314721316200026043 0ustar00runnerdocker"""Plot Milky Way spiral arm models.""" import numpy as np from astropy.units import Quantity import matplotlib.pyplot as plt from gammapy.astro.population.spatial import FaucherSpiral, ValleeSpiral from gammapy.maps.axes import UNIT_STRING_FORMAT fig = plt.figure(figsize=(7, 8)) rect = [0.12, 0.12, 0.85, 0.85] ax_cartesian = fig.add_axes(rect) ax_cartesian.set_aspect("equal") vallee_spiral = ValleeSpiral() faucher_spiral = FaucherSpiral() radius = Quantity(np.arange(2.1, 10, 0.1), "kpc") for spiralarm_index in range(4): # TODO: spiral arm index is different for the two spirals # -> spirals by the same name have different colors. # Should we change this in the implementation or here in plotting?
color = "C{}".format(spiralarm_index) # Plot Vallee spiral x, y = vallee_spiral.xy_position(radius=radius, spiralarm_index=spiralarm_index) name = vallee_spiral.spiralarms[spiralarm_index] ax_cartesian.plot(x, y, label="Vallee " + name, ls="-", color=color) # Plot Faucher spiral x, y = faucher_spiral.xy_position(radius=radius, spiralarm_index=spiralarm_index) name = faucher_spiral.spiralarms[spiralarm_index] ax_cartesian.plot(x, y, label="Faucher " + name, ls="-.", color=color) ax_cartesian.plot(vallee_spiral.bar["x"], vallee_spiral.bar["y"]) ax_cartesian.set_xlabel(f"x [{radius.unit.to_string(UNIT_STRING_FORMAT)}]") ax_cartesian.set_ylabel(f"y [{radius.unit.to_string(UNIT_STRING_FORMAT)}]") ax_cartesian.set_xlim(-12, 12) ax_cartesian.set_ylim(-15, 12) ax_cartesian.legend(ncol=2, loc="lower right") plt.grid() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/population/plot_spiral_arms.py0000644000175100001770000000435014721316200024662 0ustar00runnerdocker"""Plot Milky Way spiral arms.""" import numpy as np from astropy.units import Quantity, kpc import matplotlib.pyplot as plt from gammapy.astro.population import FaucherSpiral, simulate from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.utils.coordinates import cartesian, polar catalog = simulate.make_base_catalog_galactic( n_sources=int(1e4), rad_dis="YK04", vel_dis="H05", max_age=Quantity(1e6, "yr") ) spiral = FaucherSpiral() fig = plt.figure(figsize=(6, 6)) rect = [0.12, 0.12, 0.85, 0.85] ax_cartesian = fig.add_axes(rect) ax_cartesian.set_aspect("equal") ax_polar = fig.add_axes(rect, polar=True, frameon=False) ax_polar.axes.get_xaxis().set_ticklabels([]) ax_polar.axes.get_yaxis().set_ticklabels([]) ax_cartesian.plot( catalog["x"], catalog["y"], marker=".", linestyle="none", markersize=5, alpha=0.3, fillstyle="full", ) ax_cartesian.set_xlim(-20, 20) ax_cartesian.set_ylim(-20, 20) ax_cartesian.set_xlabel(f"x [{kpc.to_string(UNIT_STRING_FORMAT)}]", labelpad=2) ax_cartesian.set_ylabel(f"y [{kpc.to_string(UNIT_STRING_FORMAT)}]", labelpad=-4) ax_cartesian.plot( 0, 8, color="k", markersize=10, fillstyle="none", marker="*", linewidth=2 ) ax_cartesian.annotate( "Sun", xy=(0, 8), xycoords="data", xytext=(-15, 15), arrowprops=dict(arrowstyle="->", color="k"), weight=400, ) plt.grid(True) # TODO: document what these magic numbers are or simplify the code # `other_idx = [95, 90, 80, 80]` and below `theta_idx = int(other_idx * 0.97)` for spiral_idx, other_idx in zip(range(4), [95, 90, 80, 80]): spiralarm_name = spiral.spiralarms[spiral_idx] theta_0 = spiral.theta_0[spiral_idx].value theta = Quantity(np.linspace(theta_0, theta_0 + 2 * np.pi, 100), "rad") x, y = spiral.xy_position(theta=theta, spiralarm_index=spiral_idx) ax_cartesian.plot(x.value, y.value, color="k") rad, phi = polar(x[other_idx], y[other_idx]) x_pos, y_pos = cartesian(rad + Quantity(1, "kpc"), phi) theta_idx = int(other_idx * 0.97) rotation = theta[theta_idx].to("deg").value ax_cartesian.text( x_pos.value, y_pos.value, spiralarm_name, ha="center", va="center", rotation=rotation - 90, weight=400, ) plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/population/plot_velocity_distributions.py0000644000175100001770000000163714721316200027173 0ustar00runnerdocker"""Plot velocity distributions of Galactic sources.""" import numpy as np import astropy.units as u import matplotlib.pyplot as plt from 
gammapy.astro.population import velocity_distributions from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.utils.random import normalize velocity = np.linspace(10, 3000, 200) * u.km / u.s ax = plt.subplot() for key in velocity_distributions: model = velocity_distributions[key]() label = model.__class__.__name__ x = velocity.value y = normalize(model, velocity[0].value, velocity[-1].value)(velocity.value) ax.plot(x, y, linestyle="-", label=label) ax.set_xlim(velocity[0].value, velocity[-1].value) ax.set_ylim(0, 0.005) ax.set_xlabel(f"Velocity [{velocity.unit.to_string(UNIT_STRING_FORMAT)}]") ax.set_ylabel( f"Probability Density [{((velocity.unit)**(-1)).to_string(UNIT_STRING_FORMAT)}]" ) ax.semilogx() plt.legend(prop={"size": 10}) plt.show() ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.152642 gammapy-1.3/docs/user-guide/astro/source/0000755000175100001770000000000014721316215020050 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/source/index.rst0000644000175100001770000000075414721316200021711 0ustar00runnerdocker.. _astro-source: *************************** Astrophysical source models *************************** Introduction ============ The `gammapy.astro.source` module contains classes of source models, which can be used for population synthesis of galactic gamma-ray sources. Getting started =============== TODO: add basic example. Using gammapy.astro.source ========================== The following source models are available: .. toctree:: :maxdepth: 1 snr pwn pulsar ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/source/plot_pulsar_spindown.py0000644000175100001770000000065314721316200024705 0ustar00runnerdocker"""Plot spin frequency of the pulsar with time.""" import numpy as np from astropy.units import Quantity import matplotlib.pyplot as plt from gammapy.astro.source import Pulsar t = Quantity(np.logspace(0, 6, 100), "yr") pulsar = Pulsar(P_0=Quantity(0.01, "s"), B="1e12 G") plt.plot(t.value, 1 / pulsar.period(t).cgs.value) plt.xlabel("time [years]") plt.ylabel("frequency [1/s]") plt.ylim(1e0, 1e2) plt.loglog() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/source/plot_pwn_evolution.py0000644000175100001770000000132514721316200024363 0ustar00runnerdocker"""Plot PWN evolution with time.""" import numpy as np from astropy.constants import M_sun from astropy.units import Quantity import matplotlib.pyplot as plt from gammapy.astro.source import PWN, SNRTrueloveMcKee t = Quantity(np.logspace(1, 5, 100), "yr") n_ISM = Quantity(1, "cm^-3") snr = SNRTrueloveMcKee(m_ejecta=8 * M_sun, n_ISM=n_ISM) pwn = PWN(snr=snr) pwn.pulsar.L_0 = Quantity(1e40, "erg/s") plt.plot(t.value, pwn.radius(t).to("pc").value, label="Radius PWN") plt.plot(t.value, snr.radius_reverse_shock(t).to("pc").value, label="Reverse Shock SNR") plt.plot(t.value, snr.radius(t).to("pc").value, label="Radius SNR") plt.xlabel("time [years]") plt.ylabel("radius [pc]") plt.legend(loc=4) plt.loglog() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/source/plot_snr_brightness_evolution.py0000644000175100001770000000140714721316200026612 0ustar00runnerdocker"""Plot SNR brightness evolution.""" import numpy as np from 
astropy.units import Quantity import matplotlib.pyplot as plt from gammapy.astro.source import SNR densities = Quantity([1, 0.1], "cm-3") t = Quantity(np.logspace(0, 5, 100), "yr") for density in densities: snr = SNR(n_ISM=density) F = snr.luminosity_tev(t) / (4 * np.pi * Quantity(1, "kpc") ** 2) plt.plot(t.value, F.to("cm-2 s-1").value, label="n_ISM = {}".format(density.value)) plt.vlines(snr.sedov_taylor_begin.to("yr").value, 1e-13, 1e-10, linestyle="--") plt.vlines(snr.sedov_taylor_end.to("yr").value, 1e-13, 1e-10, linestyle="--") plt.xlim(1e2, 1e5) plt.ylim(1e-13, 1e-10) plt.xlabel("time [years]") plt.ylabel("flux @ 1kpc [s^-1 cm^-2]") plt.legend(loc=4) plt.loglog() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/source/plot_snr_radius_evolution.py0000644000175100001770000000136114721316200025730 0ustar00runnerdocker"""Plot SNR radius evolution versus time.""" import numpy as np from astropy.units import Quantity import matplotlib.pyplot as plt from gammapy.astro.source import SNR, SNRTrueloveMcKee snr_models = [SNR, SNRTrueloveMcKee] densities = Quantity([1, 0.1], "cm^-3") linestyles = ["-", "--"] t = Quantity(np.logspace(0, 5, 100), "yr") for density in densities: for linestyle, snr_model in zip(linestyles, snr_models): snr = snr_model(n_ISM=density) label = snr.__class__.__name__ + " (n_ISM = {})".format(density.value) x = t.value y = snr.radius(t).to("pc").value plt.plot(x, y, label=label, linestyle=linestyle) plt.xlabel("time [years]") plt.ylabel("radius [pc]") plt.legend(loc=4) plt.loglog() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/source/pulsar.rst0000644000175100001770000000030214721316200022075 0ustar00runnerdocker.. _astro-source-pulsar: Pulsar Source Models ==================== Plot spin frequency of the pulsar with time: .. plot:: user-guide/astro/source/plot_pulsar_spindown.py :include-source: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/source/pwn.rst0000644000175100001770000000032514721316200021400 0ustar00runnerdocker.. _astro-source-pwn: Pulsar Wind Nebula Source Models ================================ Plot the evolution of the radius of the PWN: .. plot:: user-guide/astro/source/plot_pwn_evolution.py :include-source: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/astro/source/snr.rst0000644000175100001770000000055714721316200021405 0ustar00runnerdocker.. _astro-source-snr: Supernova Remnant Models ======================== Plot the evolution of radius of the SNR: .. plot:: user-guide/astro/source/plot_snr_radius_evolution.py :include-source: Plot the evolution of the flux of the SNR above 1 TeV and at 1 kpc distance: .. plot:: user-guide/astro/source/plot_snr_brightness_evolution.py :include-source: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/catalog.rst0000644000175100001770000000340114721316200017554 0ustar00runnerdocker.. _catalog: Source catalogs =============== `gammapy.catalog` provides convenient access to common gamma-ray source catalogs. * ``hgps`` / `~gammapy.catalog.SourceCatalogHGPS` - H.E.S.S. 
Galactic plane survey (HGPS) * ``gamma-cat`` / `~gammapy.catalog.SourceCatalogGammaCat` - An open catalog of gamma-ray sources * ``3fgl`` / `~gammapy.catalog.SourceCatalog3FGL` - LAT 4-year point source catalog * ``4fgl`` / `~gammapy.catalog.SourceCatalog4FGL` - LAT 8-year point source catalog * ``2fhl`` / `~gammapy.catalog.SourceCatalog2FHL` - LAT second high-energy source catalog * ``3fhl`` / `~gammapy.catalog.SourceCatalog3FHL` - LAT third high-energy source catalog * ``2hwc`` / `~gammapy.catalog.SourceCatalog2HWC` - 2HWC catalog from the HAWC observatory * ``3hwc`` / `~gammapy.catalog.SourceCatalog3HWC` - 3HWC catalog from the HAWC observatory For each catalog, a `~gammapy.catalog.SourceCatalog` class is provided to represent the catalog table, and a matching `~gammapy.catalog.SourceCatalogObject` class to represent one catalog source and table row. The main functionality consists of methods that map catalog information to `~gammapy.modeling.models.SkyModel`, `~gammapy.modeling.models.SpectralModel`, `~gammapy.modeling.models.SpatialModel`, `~gammapy.estimators.FluxPoints` and `~gammapy.estimators.LightCurve` objects. `gammapy.catalog` is independent of the rest of Gammapy. The typical use cases are to compare your results against previous results in the catalogs (e.g. overplot a spectral model), or to create initial source models for certain energy bands and sky regions. Using gammapy.catalog --------------------- .. minigallery:: `~gammapy.catalog.SourceCatalog3FHL` :add-heading: .. minigallery:: `~gammapy.catalog.SourceCatalogGammaCat` :add-heading: ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.156642 gammapy-1.3/docs/user-guide/datasets/0000755000175100001770000000000014721316215017230 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/datasets/index.rst0000644000175100001770000002372514721316200021074 0ustar00runnerdocker.. include:: ../../references.txt .. _datasets: Datasets (DL4) ============== The `gammapy.datasets` sub-package contains classes to handle reduced gamma-ray data for modeling and fitting. The `~gammapy.datasets.Dataset` class bundles reduced data, IRFs and models to perform likelihood fitting and joint-likelihood fitting. All datasets contain a `~gammapy.modeling.models.Models` container with one or more `~gammapy.modeling.models.SkyModel` objects that represent additive emission components. To model and fit data in Gammapy, you have to create a `~gammapy.datasets.Datasets` container object with one or multiple `~gammapy.datasets.Dataset` objects. .. _datasets-types: Types of supported datasets --------------------------- Gammapy has built-in support to create and analyse the following datasets: ..
list-table:: :widths: 10 20 50 20 20 10 :header-rows: 1 * - **Dataset Type** - **Data Type** - **Reduced IRFs** - **Geometry** - **Additional Quantities** - **Fit Statistic** * - `~gammapy.datasets.MapDataset` - `counts` - `background`, `psf`, `edisp`, `exposure` - `WcsGeom` or `RegionGeom` - - `cash` * - `~gammapy.datasets.MapDatasetOnOff` - `counts` - `psf`, `edisp`, `exposure` - `WcsGeom` - `acceptance`, `acceptance_off`, `counts_off` - `wstat` * - `~gammapy.datasets.SpectrumDataset` - `counts` - `background`, `edisp`, `exposure` - `RegionGeom` - - `cash` * - `~gammapy.datasets.SpectrumDatasetOnOff` - `counts` - `edisp`, `exposure` - `RegionGeom` - `acceptance`, `acceptance_off`, `counts_off` - `wstat` * - `~gammapy.datasets.FluxPointsDataset` - `flux` - None - None - - `chi2` In addition to the above quantities, a dataset can optionally have a `meta_table` serialised, which can contain relevant information about the observations used to create the dataset. In general, `OnOff` datasets should be used when the background is estimated from real off counts, rather than from a background model. The `~gammapy.datasets.FluxPointsDataset` is used to fit pre-computed flux points when no convolution with IRFs is needed. The map datasets represent 3D cubes (`~gammapy.maps.WcsNDMap` objects) with two spatial and one energy axis. For 2D images the same map objects and map datasets are used; an energy axis is present but only has one energy bin. The spectrum datasets represent 1D spectra (`~gammapy.maps.RegionNDMap` objects) with an energy axis. There are no spatial axes or information; the 1D spectra are obtained for a given on region. Note that in Gammapy, 2D image analyses are done with 3D cubes with a single energy bin, e.g. for modeling and fitting. To analyse multiple runs, you can either stack the datasets together, or perform a joint fit across multiple datasets. Predicted counts ---------------- The total number of predicted counts from a `~gammapy.datasets.MapDataset` is computed per bin as: .. math:: N_{Pred} = N_{Bkg} + \sum_{Src} N_{Src} Where :math:`N_{Bkg}` is the expected counts from the residual hadronic background model and :math:`N_{Src}` the predicted counts from a given source model component. The predicted counts from the hadronic background are computed directly from the model in reconstructed energy and spatial coordinates, while the predicted counts from a source are obtained by forward folding with the instrument response: .. math:: N_{Src} = \mathrm{PSF_{Src}} \circledast \mathrm{EDISP_{Src}}(\mathcal{E} \cdot F_{Src}(l, b, E_{True})) Where :math:`F_{Src}` is the integrated flux of the source model, :math:`\mathcal{E}` the exposure, :math:`\mathrm{EDISP}` the energy dispersion matrix and :math:`\mathrm{PSF}` the PSF convolution kernel. The corresponding IRFs are extracted at the current position of the model component defined by :math:`(l, b)` and assumed to be constant across the size of the source. The detailed expressions to compute the predicted number of counts from a source and corresponding IRFs are given in :ref:`irf`. .. _stack: Stacking Multiple Datasets -------------------------- Stacking datasets implies that the counts, background and reduced IRFs from all the runs are binned together to get one final dataset for which a likelihood is computed during the fit. Stacking is often useful to reduce the computation effort while analysing multiple runs. The following table lists how the individual quantities are handled during stacking.
Here, :math:`k` denotes a bin in reconstructed energy, :math:`l` a bin in true energy and :math:`j` the dataset number.

.. list-table::
   :widths: 25 25 50
   :header-rows: 1

   * - Dataset attribute
     - Behaviour
     - Implementation
   * - ``livetime``
     - Sum of individual livetimes
     - :math:`\overline{t} = \sum_j t_j`
   * - ``mask_safe``
     - True if the pixel is included in the safe data range.
     - :math:`\overline{\epsilon_k} = \sum_{j} \epsilon_{jk}`
   * - ``mask_fit``
     - Dropped
     -
   * - ``counts``
     - Summed in the data range defined by `mask_safe`
     - :math:`\overline{\mathrm{counts}_k} = \sum_j \mathrm{counts}_{jk} \cdot \epsilon_{jk}`
   * - ``background``
     - Summed in the data range defined by `mask_safe`
     - :math:`\overline{\mathrm{bkg}_k} = \sum_j \mathrm{bkg}_{jk} \cdot \epsilon_{jk}`
   * - ``exposure``
     - Summed in the data range defined by `mask_safe`
     - :math:`\overline{\mathrm{exposure}_l} = \sum_{j} \mathrm{exposure}_{jl} \cdot \sum_k \epsilon_{jk}`
   * - ``psf``
     - Exposure weighted average
     - :math:`\overline{\mathrm{psf}_l} = \frac{\sum_{j} \mathrm{psf}_{jl} \cdot \mathrm{exposure}_{jl}} {\sum_{j} \mathrm{exposure}_{jl}}`
   * - ``edisp``
     - Exposure weighted average, with mask on reconstructed energy
     - :math:`\overline{\mathrm{edisp}_{kl}} = \frac{\sum_{j}\mathrm{edisp}_{jkl} \cdot \epsilon_{jk} \cdot \mathrm{exposure}_{jl}} {\sum_{j} \mathrm{exposure}_{jl}}`
   * - ``gti``
     - Union of individual `gti`
     -

For the model evaluation, an important factor that needs to be accounted for is that the energy threshold changes between observations. With the above implementation using a `~gammapy.irf.EDispMap`, the `npred` is conserved, i.e. the predicted number of counts on the stacked dataset is the sum expected by stacking the `npred` of the individual runs. The following plot illustrates the stacked energy dispersion kernel and summed predicted counts for individual as well as stacked spectral datasets: .. plot:: user-guide/datasets/plot_stack.py .. note:: - A stacked analysis is reasonable only when adding runs taken by the same instrument. - Stacking happens in-place, i.e. ``dataset1.stack(dataset2)`` will overwrite ``dataset1``. - To properly handle masks, it is necessary to stack onto an empty dataset. - Stacking only works for maps with equivalent geometry. Two geometries are called equivalent if one is exactly the same as or can be obtained from a cutout of the other. .. _joint: Joint Analysis -------------- An alternative to stacking datasets is a joint fit across all the datasets. For a definition, see :ref:`glossary`. The total fit statistic of datasets is the sum of the fit statistic of each dataset. Note that this is **not** equal to the stacked fit statistic. A joint fit usually allows a better modeling of the background because the background model parameters can be fit for each dataset simultaneously with the source models. However, a joint fit is computationally very intensive; the fit convergence time increases non-linearly with the number of datasets to be fit. Moreover, depending upon the number of parameters in the background model, even fit convergence might be an issue for a large number of datasets. A practical compromise for the analysis of many runs is therefore to stack runs taken under similar conditions and then do a joint fit on the stacked datasets.
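To make the difference concrete, here is a minimal sketch of the two approaches, assuming two 1D datasets from the ``$GAMMAPY_DATA/joint-crab`` sample (the same files used in :ref:`estimators`); the model and dataset names are illustrative only:

.. code-block:: python

    from gammapy.datasets import Datasets, SpectrumDatasetOnOff
    from gammapy.modeling import Fit
    from gammapy.modeling.models import PowerLawSpectralModel, SkyModel

    path = "$GAMMAPY_DATA/joint-crab/spectra/hess/"
    dataset_1 = SpectrumDatasetOnOff.read(path + "pha_obs23523.fits")
    dataset_2 = SpectrumDatasetOnOff.read(path + "pha_obs23592.fits")

    model = SkyModel(spectral_model=PowerLawSpectralModel(), name="crab")

    # Joint fit: each dataset keeps its own reduced IRFs and safe data range,
    # and the total fit statistic is summed over the individual datasets
    datasets = Datasets([dataset_1, dataset_2])
    datasets.models = model
    result_joint = Fit().run(datasets=datasets)

    # Stacked fit: reduce to a single dataset first, then fit
    stacked = Datasets([dataset_1, dataset_2]).stack_reduce(name="stacked")
    stacked.models = model
    result_stacked = Fit().run(datasets=[stacked])

The two fits generally give similar but not identical results, since the stacked likelihood is not the sum of the individual likelihoods.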
Serialisation of datasets ------------------------- The various `~gammapy.maps.Map` objects contained in `~gammapy.datasets.MapDataset` and `~gammapy.datasets.MapDatasetOnOff` are serialised according to `GADF Sky Maps `__. An ``HDUList`` is created with the different attributes, and each of these is written with the data contained in a ``BinTableHDU`` with a `~gammapy.maps.WcsGeom` and a ``BANDS HDU`` specifying the non-spatial dimensions. Optionally, a ``meta_table`` is also written as an `astropy.table.Table` containing various information about the observations which created the dataset. While the ``meta_table`` can contain useful information for later stage analysis, it is not used anywhere internally within Gammapy. `~gammapy.datasets.SpectrumDataset` follows a similar convention to `~gammapy.datasets.MapDataset`, but uses a `~gammapy.maps.RegionGeom`. The region definition follows the standard FITS format, as described `here `__. `~gammapy.datasets.SpectrumDatasetOnOff` can be serialised either according to the above specification, or (by default) according to the `OGIP standards `__. `~gammapy.datasets.FluxPointsDataset` objects are serialised as `gammapy.estimators.FluxPoints` objects, which contain a set of `gammapy.maps.Map` objects storing the estimated flux as a function of energy, and some optional quantities, typically errors, upper limits, etc. It also contains a reference model, serialised as a `~gammapy.modeling.models.TemplateSpectralModel`. Using gammapy.datasets ---------------------- .. minigallery:: :add-heading: Examples using `~gammapy.datasets.MapDataset` ../examples/tutorials/starting/analysis_1.py ../examples/tutorials/analysis-3d/analysis_3d.py .. minigallery:: :add-heading: Examples using `~gammapy.datasets.SpectrumDataset` ../examples/tutorials/analysis-1d/spectral_analysis.py ../examples/tutorials/analysis-1d/spectral_analysis_hli.py ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/datasets/plot_stack.py0000644000175100001770000000466214721316200021747 0ustar00runnerdocker"""Example plot showing stacking of two datasets.""" from astropy import units as u from astropy.coordinates import SkyCoord import matplotlib.pyplot as plt from gammapy.data import Observation, observatory_locations from gammapy.datasets import SpectrumDataset from gammapy.datasets.map import MIGRA_AXIS_DEFAULT from gammapy.irf import EffectiveAreaTable2D, EnergyDispersion2D from gammapy.makers import SpectrumDatasetMaker from gammapy.maps import MapAxis, RegionGeom from gammapy.modeling.models import PowerLawSpectralModel, SkyModel energy_true = MapAxis.from_energy_bounds( "0.1 TeV", "20 TeV", nbin=20, per_decade=True, name="energy_true" ) energy_reco = MapAxis.from_energy_bounds("0.2 TeV", "10 TeV", nbin=10, per_decade=True) aeff = EffectiveAreaTable2D.from_parametrization( energy_axis_true=energy_true, instrument="HESS" ) offset_axis = MapAxis.from_bounds(0 * u.deg, 5 * u.deg, nbin=2, name="offset") edisp = EnergyDispersion2D.from_gauss( energy_axis_true=energy_true, offset_axis=offset_axis, migra_axis=MIGRA_AXIS_DEFAULT, bias=0, sigma=0.2, ) observation = Observation.create( obs_id=0, pointing=SkyCoord("0d", "0d", frame="icrs"), irfs={"aeff": aeff, "edisp": edisp}, tstart=0 * u.h, tstop=0.5 * u.h, location=observatory_locations["hess"], ) geom = RegionGeom.create("icrs;circle(0, 0, 0.1)", axes=[energy_reco]) stacked = SpectrumDataset.create(geom=geom, energy_axis_true=energy_true) maker = SpectrumDatasetMaker(selection=["edisp", "exposure"])
dataset_1 = maker.run(stacked.copy(), observation=observation) dataset_2 = maker.run(stacked.copy(), observation=observation) pwl = PowerLawSpectralModel() model = SkyModel(spectral_model=pwl, name="test-source") dataset_1.mask_safe = geom.energy_mask(energy_min=2 * u.TeV) dataset_2.mask_safe = geom.energy_mask(energy_min=0.6 * u.TeV) dataset_1.models = model dataset_2.models = model dataset_1.counts = dataset_1.npred() dataset_2.counts = dataset_2.npred() stacked = dataset_1.copy(name="stacked") stacked.stack(dataset_2) stacked.models = model npred_stacked = stacked.npred() fig, axes = plt.subplots(nrows=1, ncols=2, figsize=(10, 4)) axes[0].set_title("Stacked Energy Dispersion Matrix") axes[1].set_title("Predicted Counts") stacked.edisp.get_edisp_kernel().plot_matrix(ax=axes[0]) npred_stacked.plot_hist(ax=axes[1], label="npred stacked") stacked.counts.plot_hist(ax=axes[1], ls="--", label="stacked npred") plt.legend() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/dl3.rst0000644000175100001770000001427514721316200016637 0ustar00runnerdocker.. _data: Data access and selection (DL3) =============================== IACT data is typically structured in "observations", which define a given time interval during which the instrument response is considered stable. `gammapy.data` currently contains the `~gammapy.data.EventList` class, as well as classes for IACT data and observation handling. The main classes in Gammapy to access the DL3 data library are the `~gammapy.data.DataStore` and `~gammapy.data.Observation`. They are used to store and dynamically retrieve the datasets relevant to any observation (the event list in the form of an `~gammapy.data.EventList`, the IRFs (see :ref:`irf`) and other relevant information). Once an observation selection has been made, the user can build a list of observations: a `~gammapy.data.Observations` object, which will be used for the data reduction process. Getting started with data ------------------------- You can use the `~gammapy.data.EventList` class to load IACT gamma-ray event lists: .. testcode:: from gammapy.data import EventList filename = '$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz' events = EventList.read(filename) To load Fermi-LAT event lists, you can also use the `~gammapy.data.EventList` class: .. testcode:: from gammapy.data import EventList filename = "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-events.fits.gz" events = EventList.read(filename) The other main class in `gammapy.data` is the `~gammapy.data.DataStore`, which makes it easy to load IACT data. E.g. an alternative way to load the events for observation ID 23523 is this: .. testcode:: from gammapy.data import DataStore data_store = DataStore.from_dir('$GAMMAPY_DATA/hess-dl3-dr1') events = data_store.obs(23523).events The index tables ---------------- A typical way to organize the files relevant to the data we are interested in is via index tables. There are two tables: * **Observation index table:** this table collects the information on each observation or run, with metadata about each of them, such as the pointing direction, the duration, the run ID... * **HDU index table:** this table provides, for each observation listed in the index table, the location of the corresponding data and instrument response files. Both tables can be inspected directly from a `~gammapy.data.DataStore`, as shown below.
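As a quick illustration, a minimal sketch of accessing the two index tables as table attributes of the data store (using the H.E.S.S. DL3-DR1 sample under ``$GAMMAPY_DATA``; the printed column names are standard GADF observation-index columns):

.. code-block:: python

    from gammapy.data import DataStore

    data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")

    # Observation index table: one row per observation
    print(data_store.obs_table[["OBS_ID", "RA_PNT", "DEC_PNT", "LIVETIME"]])

    # HDU index table: file locations of the events, GTIs and IRFs
    print(data_store.hdu_table)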
A `~gammapy.data.DataStore` can then be created by providing both tables in the same file with `~gammapy.data.DataStore.from_file()`, or instead via the directory where they can be found with `~gammapy.data.DataStore.from_dir()` as shown above. More details on these tables and their content can be found in https://gamma-astro-data-formats.readthedocs.io/en/latest/data_storage/index.html. Working with event lists ------------------------ To take a quick look at the events inside the list, one can use `~gammapy.data.EventList.peek()`: .. plot:: :include-source: from gammapy.data import EventList filename = '$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz' events = EventList.read(filename) events.peek() Events can be selected based on any of their properties, with dedicated functions existing for energy, time, offset from pointing position and the selection of events in a particular region of the sky. .. testcode:: import astropy.units as u from astropy.time import Time from gammapy.data import EventList filename = '$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz' events = EventList.read(filename) # Select events based on energy selected_energy = events.select_energy([1*u.TeV, 1.2*u.TeV]) # Select events based on time t_start = Time(57185, format='mjd') t_stop = Time(57185.5, format='mjd') selected_time = events.select_time([t_start, t_stop]) # Select events based on offset selected_offset = events.select_offset([1*u.deg, 2*u.deg]) # Select events from a region in the sky selected_region = events.select_region("icrs;circle(86.3,22.01,3)") # Finally one can select events based on any other of the columns of the `EventList.table` selected_id = events.select_parameter('EVENT_ID', (5407363826067,5407363826070)) Combining event lists and GTIs ------------------------------ Both event lists and GTIs can be stacked into a larger one. An `~gammapy.data.EventList` can also be created with `~gammapy.data.EventList.from_stack`, that is, from a list of `~gammapy.data.EventList` objects. .. testcode:: from gammapy.data import EventList, GTI filename_1 = '$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz' filename_2 = '$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023526.fits.gz' events_1 = EventList.read(filename_1) events_2 = EventList.read(filename_2) gti_1 = GTI.read(filename_1) gti_2 = GTI.read(filename_2) # stack in place, now the _1 object contains the information of both gti_1.stack(gti_2) events_1.stack(events_2) # or instead create a new event list from the other two combined_events = EventList.from_stack([events_1, events_2]) Writing event lists and GTIs to file ------------------------------------ To write the events or GTIs separately, one can just save the underlying `astropy.table.Table`. To have an event file written in a correct DL3 format, it is necessary to utilise the ``write`` method available for `~gammapy.data.Observation`. It will write the `~gammapy.data.EventList` and their associated GTIs together in the same FITS file according to the format specifications. To avoid writing IRFs along with the ``EventList`` one has to set ``include_irfs`` to ``False``. See the example below: .. testcode:: from gammapy.data import EventList, GTI, Observation filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" events = EventList.read(filename) gti = GTI.read(filename) # Save separately gti.write("test_gti.fits.gz") # Save together.
First instantiate an Observation object obs = Observation(gti=gti, events=events) obs.write("test_events_with_GTI.fits.gz", include_irfs=False) Using gammapy.data ------------------ .. minigallery:: ../examples/tutorials/starting/overview.py ../examples/tutorials/starting/analysis_1.py ../examples/tutorials/api/observation_clustering.py ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/estimators.rst0000644000175100001770000002345714721316200020351 0ustar00runnerdocker.. _estimators: Estimators (DL4 to DL5, and DL6) ================================ The `gammapy.estimators` submodule contains algorithms and classes for high level flux and significance estimation. This includes the estimation of flux points, flux maps, flux profiles and light curves. All estimators feature a common API and allow estimating fluxes in bands of reconstructed energy. General method -------------- The core of any estimator algorithm is hypothesis testing: a reference model or counts excess is tested against a null hypothesis. From the best fit reference model a flux is derived, and a corresponding :math:`\Delta TS` value is computed from the difference in fit statistics with respect to the null hypothesis. Assuming one degree of freedom, :math:`\sqrt{\Delta TS}` represents an approximation (`Wilks' theorem `_) of the "classical significance". In case of a negative best fit flux, e.g. when the background is overestimated, the significance is defined as :math:`-\sqrt{\Delta TS}` by convention. In general the flux can be estimated using two methods: #. **Based on model fitting:** given a (global) best fit model with multiple model components, the flux of the component of interest is re-fitted in the chosen energy, time or spatial region. The new flux is given as a ``norm`` with respect to the global reference model. Optionally the free parameters of the other models can be re-optimised (but the other parameters of the source of interest are always kept frozen). This method is also named **forward folding**. #. **Based on excess:** in the case of having one energy bin, neglecting the PSF and not re-optimising other parameters, one can estimate the significance based on the analytical solution by [LiMa1983]. In this case the "best fit" flux and significance are given by the excess over the null hypothesis. This method is also named **backward folding**. It is currently only exposed in the `~gammapy.estimators.ExcessMapEstimator`. Energy edges ------------ The estimators run on bins of reconstructed energy. The estimator cannot modify the binning of the parent dataset, only group the energy bins. The energy edges supplied by the user are converted to the nearest parent dataset energy bin values: the estimators select the energy bins from the parent dataset which are closest to the requested energy edges, and the requested edges are used to group the parent dataset energy edges into larger bins. Therefore, the input energy edges are not always the same as the output energy bins provided in the final product. If a specific energy binning is required at the estimator level, it should be implemented in the parent dataset geometry (i.e. the dataset energy axis edges should contain the required edges). Flux quantities --------------- In case the data is fitted to a single data bin only, e.g.
one energy bin, both methods can be applied. Uniformly for both methods, most estimators compute the same basic quantities:

================= =================================================
Quantity          Definition
================= =================================================
norm              Best fit norm with respect to the reference
                  spectral model
norm_err          Symmetric error on the norm derived from the
                  Hessian matrix. Given as absolute difference to
                  the best fit norm.
stat              Fit statistics value of the best fit hypothesis
stat_null         Fit statistics value of the null hypothesis
ts                Difference in fit statistics (`stat - stat_null`)
sqrt_ts           Square root of ts times sign(norm); in case of
                  one degree of freedom (n_dof), corresponds to
                  significance (Wilks' theorem)
npred             Predicted counts of the best fit hypothesis.
                  Equivalent to correlated counts for backward
                  folding
npred_excess      Predicted excess counts of the best fit
                  hypothesis. Equivalent to correlated excess for
                  backward folding
npred_background  Predicted background counts of the best fit
                  hypothesis. Equivalent to correlated background
                  for backward folding
n_dof             Number of degrees of freedom. If not explicitly
                  present, assumed to be one
================= =================================================

In addition, the following optional quantities can be computed:

================= =================================================
Quantity          Definition
================= =================================================
norm_errp         Positive error of the norm, given as absolute
                  difference to the best fit norm
norm_errn         Negative error of the norm, given as absolute
                  difference to the best fit norm
norm_ul           Upper limit of the norm
norm_scan         Norm parameter values used for the fit statistic
                  scan
stat_scan         Fit statistics values associated with norm_scan
================= =================================================

To compute the errors, asymmetric errors and upper limits, one can specify the arguments ``n_sigma`` and ``n_sigma_ul``. The ``n_sigma`` arguments are translated into a TS difference assuming ``ts = n_sigma ** 2``. .. _sedtypes: SED types ^^^^^^^^^ In addition to the norm values, a reference spectral model and energy ranges are given. Using this reference spectral model the norm values can be converted to the following different SED types:

================= =================================================
Quantity          Definition
================= =================================================
e_ref             Reference energy
e_min             Minimum energy
e_max             Maximum energy
dnde              Differential flux at ``e_ref``
flux              Integrated flux between ``e_min`` and ``e_max``
eflux             Integrated energy flux between ``e_min`` and
                  ``e_max``
e2dnde            Differential energy flux at ``e_ref``
================= =================================================

The same can be applied for the error and upper limit information. More information can be found on the `likelihood SED type page`_. The `~gammapy.estimators.FluxPoints` and `~gammapy.estimators.FluxMaps` objects can optionally define metadata with the following valid keywords:

================= =================================================
Name              Definition
================= =================================================
n_sigma           Number of sigma used for error estimation
n_sigma_ul        Number of sigma used for upper limit estimation
ts_threshold_ul   TS threshold to define the use of an upper limit
================= =================================================

A note on negative flux and upper limit values:
.. note:: Gammapy allows for negative flux values and upper limits by default. While those values are physically not valid solutions, they are still valid statistically. Negative flux values either hint at overestimated background levels or underestimated systematic errors in general; or, in case of many measurements, such as pixels in a flux map, they are even statistically expected. For flux points and light curves the amplitude limits (if defined) are taken into account. In future versions of Gammapy it will be possible to account for systematic errors in the likelihood as well. For now the correct interpretation of the results is left to the user. Flux maps --------- This is how to compute flux maps with the `~gammapy.estimators.ExcessMapEstimator`: .. testcode:: import numpy as np from gammapy.datasets import MapDataset from gammapy.estimators import ExcessMapEstimator from astropy import units as u dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") estimator = ExcessMapEstimator( correlation_radius="0.1 deg", energy_edges=[0.1, 1, 10] * u.TeV ) maps = estimator.run(dataset) print(maps["flux"]) .. testoutput:: WcsNDMap geom : WcsGeom axes : ['lon', 'lat', 'energy'] shape : (np.int64(320), np.int64(240), 2) ndim : 3 unit : 1 / (s cm2) dtype : float64 Flux points ----------- This is how to compute flux points: .. testcode:: from astropy import units as u from gammapy.datasets import SpectrumDatasetOnOff, Datasets from gammapy.estimators import FluxPointsEstimator from gammapy.modeling.models import PowerLawSpectralModel, SkyModel path = "$GAMMAPY_DATA/joint-crab/spectra/hess/" dataset_1 = SpectrumDatasetOnOff.read(path + "pha_obs23523.fits") dataset_2 = SpectrumDatasetOnOff.read(path + "pha_obs23592.fits") datasets = Datasets([dataset_1, dataset_2]) pwl = PowerLawSpectralModel(index=2, amplitude='1e-12 cm-2 s-1 TeV-1') datasets.models = SkyModel(spectral_model=pwl, name="crab") estimator = FluxPointsEstimator( source="crab", energy_edges=[0.1, 0.3, 1, 3, 10, 30, 100] * u.TeV ) # this will run a joint fit of the datasets fp = estimator.run(datasets) table = fp.to_table(sed_type="dnde", formatted=True) # print(table[["e_ref", "dnde", "dnde_err"]]) # or stack the datasets # fp = estimator.run(datasets.stack_reduce()) table = fp.to_table(sed_type="dnde", formatted=True) # print(table[["e_ref", "dnde", "dnde_err"]]) Using gammapy.estimators ------------------------ .. minigallery:: ../examples/tutorials/api/estimators.py .. minigallery:: :add-heading: Examples using `~gammapy.estimators.FluxPointsEstimator` ../examples/tutorials/analysis-1d/spectral_analysis.py ../examples/tutorials/analysis-3d/analysis_mwl.py .. minigallery:: :add-heading: Examples using `~gammapy.estimators.LightCurveEstimator` ../examples/tutorials/analysis-time/light_curve.py ../examples/tutorials/analysis-time/light_curve_flare.py .. _`likelihood SED type page`: https://gamma-astro-data-formats.readthedocs.io/en/latest/spectra/binned_likelihoods/index.html ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/hli.rst0000644000175100001770000000141314721316200016727 0ustar00runnerdocker.. _analysis: High Level Analysis Interface ============================= The high level interface for Gammapy provides a high level Python API for the most common use cases identified in the analysis process. The classes and methods included may be used in Python scripts, notebooks or as commands within IPython sessions.
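For instance, a minimal sketch of driving an analysis through the high level interface (assuming the public H.E.S.S. DL3-DR1 sample under ``$GAMMAPY_DATA``; the file name ``config.yaml`` is an arbitrary choice):

.. code-block:: python

    from gammapy.analysis import Analysis, AnalysisConfig

    # Build a default configuration and adjust a few parameters
    config = AnalysisConfig()
    config.observations.datastore = "$GAMMAPY_DATA/hess-dl3-dr1"
    config.observations.obs_ids = [23523, 23526]

    analysis = Analysis(config)
    analysis.get_observations()

    # The same settings can be written to and read back from a YAML file
    config.write("config.yaml", overwrite=True)
    config = AnalysisConfig.read("config.yaml")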
The high level user interface can also be used to automate processes driven by parameters declared in a YAML configuration file, addressing the most common analysis use cases identified. It exposes a subset of the functionality that is available in the sub-packages to support standard analysis use cases in a convenient way. Using gammapy.analysis ---------------------- .. minigallery:: gammapy.analysis.Analysis :add-heading: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/howto.rst0000644000175100001770000003624114721316200017312 0ustar00runnerdocker.. include:: ../references.txt .. _how_to: How To ====== This page contains short "how to" or "frequently asked question" entries for Gammapy. Each entry is for a very specific task, with a short answer, and links to examples and documentation. If you're new to Gammapy, please check the :ref:`getting-started` section and the :ref:`user_guide` and have a look at the list of :ref:`tutorials`. The information below is in addition to those pages; it's not a complete list of how to do everything in Gammapy. Please give feedback and suggest additions to this page! .. accordion-header:: :id: HowToGammapy :title: Spell and pronounce Gammapy The recommended spelling is "Gammapy" as a proper name. The recommended pronunciation is [ɡæməpaɪ] where the syllable "py" is pronounced like the English word "pie". You can listen to it `here `__. .. accordion-footer:: .. accordion-header:: :id: HowToObs :title: Select observations :link: ../tutorials/starting/analysis_2.html#defining-the-datastore-and-selecting-observations The `~gammapy.data.DataStore` provides access to a summary table of all observations available. It can be used to select observations with various criteria. You can for instance apply a cone search or also select observations based on other information available using the `~gammapy.data.ObservationTable.select_observations` method. .. accordion-footer:: .. accordion-header:: :id: HowToObsMap :title: Make observation duration maps :link: ../tutorials/api/makers.html#observation-duration-and-effective-livetime Gammapy offers a number of methods to explore the content of the various IRFs contained in an observation. This is usually done thanks to their ``peek()`` methods. .. accordion-footer:: .. accordion-header:: :id: HowToGroupObs :title: Group observations :link: ../tutorials/api/observation_clustering.html `~gammapy.data.Observations` can be grouped based on various quantities. The two methods to do so are manual grouping and hierarchical clustering. The quantity you group by can be adjusted according to each science case. .. accordion-footer:: .. accordion-header:: :id: HowToLivetimeMap :title: Make an on-axis equivalent livetime map :link: ../tutorials/data/hess.html#on-axis-equivalent-livetime The `~gammapy.data.DataStore` provides access to a summary table of all observations available. It can be used to obtain various quantities from your `~gammapy.data.Observations` list, such as livetime. The on-axis equivalent number of observation hours on the source can be calculated. .. accordion-footer:: .. accordion-header:: :id: HowToEnergyEdges :title: Compute minimum number of counts or significance per bin :link: ../tutorials/analysis-1d/spectral_analysis.html#compute-flux-points The `~gammapy.estimators.utils.resample_energy_edges` provides a way to resample the energy bins to satisfy a minimum number of counts or significance per bin.
.. accordion-footer:: .. accordion-header:: :id: HowToPlottingUnits :title: Choose units for plotting Units for plotting are handled with a combination of `matplotlib` and `astropy.units`. The methods `ax.xaxis.set_units()` and `ax.yaxis.set_units()` allow you to define the x and y axis units using `astropy.units`. Here is a minimal example: .. code:: import matplotlib.pyplot as plt from gammapy.estimators import FluxPoints from astropy import units as u filename = "$GAMMAPY_DATA/hawc_crab/HAWC19_flux_points.fits" fp = FluxPoints.read(filename) ax = plt.subplot() ax.xaxis.set_units(u.eV) ax.yaxis.set_units(u.Unit("erg cm-2 s-1")) fp.plot(ax=ax, sed_type="e2dnde") .. accordion-footer:: .. accordion-header:: :id: HowToSrcSig :title: Compute source significance Estimating the significance of a source, or more generally of an additional model component (such as a spectral line on top of a power-law spectrum), is done via a hypothesis test. You fit two models, with and without the extra source or component, then use the test statistic values from both fits to compute the significance or p-value. To obtain the test statistic, call `~gammapy.datasets.Dataset.stat_sum` for the model corresponding to your two hypotheses (or take this value from the print output when running the fit), and take the difference. Note that in Gammapy, the fit statistic is defined as ``S = - 2 * log(L)`` for likelihood ``L``, such that ``TS = S_0 - S_1``. See :ref:`datasets` for an overview of fit statistics used. .. accordion-footer:: .. accordion-header:: :id: HowToReduceData :title: Perform data reduction loop with multi-processing :link: ../tutorials/api/makers.html#data-reduction-loop There are two ways for the data reduction steps to be implemented. Either a loop is used to run the full reduction chain, or the reduction is performed with multi-processing tools by utilising the `~gammapy.makers.DatasetsMaker` to perform the loop internally. .. accordion-footer:: .. accordion-header:: :id: HowToSrcCumSig :title: Compute cumulative significance :link: ../tutorials/analysis-1d/spectral_analysis.html#source-statistic A classical plot in gamma-ray astronomy is the cumulative significance of a source as a function of observing time. In Gammapy, you can produce it with a 1D (spectral) analysis. Once datasets are produced for a given ON region, you can access the total statistics with the ``info_table(cumulative=True)`` method of `~gammapy.datasets.Datasets`. .. accordion-footer:: .. accordion-header:: :id: HowToCustomModel :title: Implement a custom model :link: ../tutorials/api/models.html#implementing-a-custom-model Gammapy offers the flexibility of using user-defined models for analysis. .. accordion-footer:: .. accordion-header:: :id: HowToEDepSpatialModel :title: Implement energy dependent spatial models :link: ../tutorials/api/models.html#models-with-energy-dependent-morphology While Gammapy does not ship energy dependent spatial models, it is possible to define such models within the modeling framework. .. accordion-footer:: .. accordion-header:: :id: HowToElevenAstroSrcSpectra :title: Model astrophysical source spectra It is possible to combine Gammapy with astrophysical modeling codes, if they provide a Python interface. Usually this requires some glue code to be written, e.g. `~gammapy.modeling.models.NaimaSpectralModel` is an example of a Gammapy wrapper class around the Naima spectral model and radiation classes, which then allows modeling and fitting of Naima models within Gammapy (e.g. using CTAO, H.E.S.S.
or Fermi-LAT data). .. accordion-footer:: .. accordion-header:: :id: HowToTempProfile :title: Model temporal profiles :link: ../tutorials/analysis-time/light_curve_simulation.html#fitting-temporal-models Temporal models can be directly fitted to available light curves, or to the reduced datasets. This is done through a joint fitting of the datasets, one for each time bin. .. accordion-footer:: .. accordion-header:: :id: HowToFitConvergence :title: Improve fit convergence with constraints on the source position It can happen that a 3D fit does not converge, with warning messages indicating that the scanned positions of the model are outside the valid IRF map range. The type of warning message is: :: Position is outside valid IRF map range, using nearest IRF defined within This issue might happen when the position of a model has no defined range. The minimizer might scan positions outside the spatial range in which the IRFs are computed and then gets lost. The simple solution is to add a physically-motivated range on the model's position, e.g. within the field of view or around an excess position. Most of the time, this tip solves the issue. The documentation of the `models sub-package `_ explains how to add a validity range of a model parameter. .. accordion-footer:: .. accordion-header:: :id: HowToReduceMemory :title: Reduce memory budget for large datasets When dealing with surveys and large sky regions, the amount of memory required might become problematic, in particular because of the default settings of the IRF maps stored in the `~gammapy.datasets.MapDataset` used for the data reduction. Several options can be used to reduce the required memory: - Reduce the spatial sampling of the `~gammapy.irf.PSFMap` and the `~gammapy.irf.EDispKernelMap` using the `binsz_irf` argument of the `~gammapy.datasets.MapDataset.create` method. This will reduce the accuracy of the IRF kernels used for model counts predictions. - Change the default IRFMap axes, in particular the `rad_axis` argument of `~gammapy.datasets.MapDataset.create`. This axis is used to define the geometry of the `~gammapy.irf.PSFMap` and controls the distribution of error angles used to sample the PSF. This will reduce the quality of the PSF description. - If one or several IRFs are not required for the study at hand, it is possible not to build them by removing them from the list of options passed to the `~gammapy.makers.MapDatasetMaker`. .. accordion-footer:: .. accordion-header:: :id: HowToCopyDataStore :title: Copy part of a data store To share specific data from a database, it might be necessary to create a new data storage with a limited set of observations and summary files following the scheme described in gadf_. This is possible with the method `~gammapy.data.DataStore.copy_obs` provided by the `~gammapy.data.DataStore`. It allows copying individual observation files to a given directory and building the associated observation and HDU tables. .. accordion-footer:: .. accordion-header:: :id: HowToInterpolateGeom :title: Interpolate onto a different geometry :link: ../tutorials/api/maps.html#filling-maps-from-interpolation To interpolate maps onto a different geometry use `~gammapy.maps.Map.interp_to_geom`. .. accordion-footer:: .. accordion-header:: :id: HowToSuppressWarn :title: Suppress warnings In general it is not recommended to suppress warnings from code because they might point to potential issues or help with debugging a non-working script.
However in some cases the cause of the warning is known and the warnings clutter the logging output. In this case it can be useful to locally suppress a specific warning like so: .. testcode:: from astropy.io.fits.verify import VerifyWarning import warnings with warnings.catch_warnings(): warnings.simplefilter('ignore', VerifyWarning) # do stuff here .. accordion-footer:: .. accordion-header:: :id: HowToAvoidNaninFp :title: Avoid NaN results in Flux Point estimation :link: ../tutorials/api/estimators.html#a-fully-configured-flux-points-estimation Sometimes, upper limit values may show as ``nan`` while running a `~gammapy.estimators.FluxPointsEstimator` or a `~gammapy.estimators.LightCurveEstimator`. This often arises because the range of the norm parameter being scanned over is not sufficient. Increasing this range usually solves the problem. In some cases, you can also consider configuring the estimator with a different `~gammapy.modeling.Fit` backend. .. accordion-footer:: .. accordion-header:: :id: HowToProgressBar :title: Display a progress bar Gammapy provides the possibility of displaying a progress bar to monitor the advancement of time-consuming processes. To activate this functionality, make sure that `tqdm` is installed and add the following code snippet to your code: .. testcode:: from gammapy.utils import pbar pbar.SHOW_PROGRESS_BAR = True The progress bar is available within the following: * `~gammapy.analysis.Analysis.get_datasets` method * `~gammapy.data.DataStore.get_observations` method * The ``run()`` method from the ``estimator`` classes: `~gammapy.estimators.ASmoothMapEstimator`, `~gammapy.estimators.TSMapEstimator`, `~gammapy.estimators.LightCurveEstimator` * `~gammapy.modeling.Fit.stat_profile` and `~gammapy.modeling.Fit.stat_surface` methods * `~gammapy.scripts.download.progress_download` method * `~gammapy.utils.parallel.run_multiprocessing` method .. accordion-footer:: .. accordion-header:: :id: HowToChangePlotStyle :title: Change plotting style and color-blind friendly visualizations Since the Gammapy visualisations use the `matplotlib` library, which provides color styles, it is possible to change the default color map of the Gammapy plots. Using the `style sheet of matplotlib `_, you should add into your notebooks or scripts the following lines after the Gammapy imports: .. code:: import matplotlib.style as style style.use('XXXX') # with XXXX from `print(plt.style.available)` Note that you can create your own style with matplotlib (see `here `__ and `here `__) .. TODO: update the link The CTAO observatory released a document describing best practices for **data visualisation in a way friendly to color-blind people**: `CTAO document `_. To use them, you should add into your notebooks or scripts the following lines after the Gammapy imports: .. code:: import matplotlib.style as style style.use('tableau-colorblind10') or .. code:: import matplotlib.style as style style.use('seaborn-colorblind') .. accordion-footer:: .. accordion-header:: :id: HowToAddPhase :title: Add PHASE information to your data To do a pulsar analysis, one must compute the pulsar phase of each event and put this new information in a new `~gammapy.data.Observation`. Computing pulsar phases can be done using an external library such as `PINT <https://nanograv-pint.readthedocs.io/en/latest/>`__ or `Tempo2 <https://www.pulsarastronomy.net/pulsar/software/tempo2>`__.
A `Gammapy Recipe <https://gammapy.github.io/gammapy-recipes/_build/html/index.html>`__ showing how to use PINT within the Gammapy framework is available `here <https://gammapy.github.io/gammapy-recipes/_build/html/notebooks/pulsar_phase/pulsar_phase_computation.html>`__. For brevity, the code below shows how to add a dummy phase column to a new `~gammapy.data.EventList` and `~gammapy.data.Observation`. .. testcode:: import numpy as np from gammapy.data import DataStore, Observation, EventList # read the observation datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs = datastore.obs(23523) # use the phase information - dummy in this example phase = np.random.random(len(obs.events.table)) # create a new `EventList` table = obs.events.table table["PHASE"] = phase new_events = EventList(table) # copy the observation in memory, changing the events obs2 = obs.copy(events=new_events, in_memory=True) # The new observation and the new events table can be serialised independently obs2.write("new_obs.fits.gz") obs2.write("events.fits.gz", include_irfs=False) .. accordion-footer:: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/index.rst0000644000175100001770000000403614721316200017256 0ustar00runnerdocker.. include:: ../references.txt .. _user_guide: ========== User guide ========== .. toctree:: :hidden: :titlesonly: package howto model-gallery/index Gammapy recipes references .. grid:: 1 2 2 2 :gutter: 2 :class-container: sd-text-center .. grid-item-card:: Analysis workflow and package structure :img-top: ../_static/box.png :columns: 12 6 6 6 :class-card: intro-card :shadow: md An overview of the main concepts in the Gammapy package. +++ .. button-ref:: package_structure :ref-type: ref :click-parent: :color: secondary :expand: To the package overview .. grid-item-card:: How To :img-top: ../_static/index_contribute.svg :columns: 12 6 6 6 :class-card: intro-card :shadow: md Short “frequently asked question” entries for Gammapy. +++ .. button-ref:: how_to :ref-type: ref :click-parent: :color: secondary :expand: To the How To .. grid-item-card:: Model gallery :img-top: ../_static/galleryicon.png :columns: 12 6 6 6 :class-card: intro-card :shadow: md Gammapy provides a large choice of spatial, spectral and temporal models. +++ .. button-ref:: model-gallery :ref-type: ref :click-parent: :color: secondary :expand: To the model gallery .. grid-item-card:: Gammapy recipes :img-top: ../_static/glossaryicon.png :columns: 12 6 6 6 :class-card: intro-card :shadow: md A collection of **user contributed** notebooks covering aspects not present in the official tutorials. +++ .. button-link:: https://gammapy.github.io/gammapy-recipes :click-parent: :color: secondary :expand: To the recipes ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.156642 gammapy-1.3/docs/user-guide/irf/0000755000175100001770000000000014721316215016200 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/aeff.rst0000644000175100001770000000136614721316200017633 0ustar00runnerdocker
.. _irf-aeff: Effective area ============== as a function of true energy and offset angle (:ref:`gadf:aeff_2d`) ------------------------------------------------------------------- The `~gammapy.irf.EffectiveAreaTable2D` class represents an effective area as a function of true energy and offset angle from the field of view center (:math:`A_{\rm eff}(E_{\rm true}, p_{\rm true})`, following the notation in :ref:`irf`). Its format specifications are available in :ref:`gadf:aeff_2d`. This is the format in which IACT DL3 effective areas are usually provided, as an example: .. plot:: user-guide/irf/plot_aeff.py :include-source: - using a pre-defined effective area parameterisation .. plot:: user-guide/irf/plot_aeff_param.py :include-source: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/bkg.rst0000644000175100001770000000235114721316200017470 0ustar00runnerdocker.. _irf-bkg: Background ========== as a function of reconstructed energy and detector coordinates (:ref:`gadf:bkg_3d`) -------------------------------------------------------------------------------------- The `~gammapy.irf.Background3D` class represents a background rate per solid angle as a function of reconstructed energy and detector coordinates. Its format specifications are available in :ref:`gadf:bkg_3d`. This is the format in which IACT DL3 background rates are usually provided, as an example: .. plot:: user-guide/irf/plot_bkg_3d.py :include-source: as a function of reconstructed energy and offset angle, radially symmetric (:ref:`gadf:bkg_2d`) ----------------------------------------------------------------------------------------------- The `~gammapy.irf.Background2D` class represents a background rate per solid angle as a function of reconstructed energy and offset angle from the field of view center. Its format specifications are available in :ref:`gadf:bkg_2d`. You can produce a radially-symmetric background from a list of DL3 observations devoid of gamma-ray signal using the tutorial in `background_model.html `__ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/edisp.rst0000644000175100001770000000672414721316200020041 0ustar00runnerdocker.. _irf-edisp: Energy Dispersion ================= as a function of true energy and offset angle (:ref:`gadf:edisp_2d`) ----------------------------------------------------------------------- The `~gammapy.irf.EnergyDispersion2D` class represents the probability density of the energy migration :math:`\mu=\frac{E}{E_{\rm true}}` as a function of true energy and offset angle from the field of view center (:math:`E_{\rm disp}(E_{\rm true}, \mu|p_{\rm true})` in :ref:`irf`). Its format specifications are available in :ref:`gadf:edisp_2d`. This is the format in which IACT DL3 energy dispersions are usually provided, as an example: .. plot:: user-guide/irf/plot_edisp.py :include-source: as a function of true energy (:ref:`gadf:ogip-rmf`) --------------------------------------------------- `~gammapy.irf.EDispKernel` instead represents an energy dispersion as a function of true energy only (:math:`E_{\rm disp}(E| E_{\rm true})` following the notation in :ref:`irf`). `~gammapy.irf.EDispKernel` contains the energy redistribution matrix (or redistribution matrix function, RMF, in the OGIP standard). The energy redistribution provides the integral of the energy dispersion probability function over bins of reconstructed energy.
It is used to convert vectors of predicted counts in true energy into vectors of predicted counts in reconstructed energy. Its format specifications are available in :ref:`gadf:ogip-rmf`. Such an energy dispersion can be obtained for example: - selecting the value of an `~gammapy.irf.EnergyDispersion2D` at a given offset (using `~astropy.coordinates.Angle`) .. plot:: user-guide/irf/plot_edisp_kernel.py :include-source: - or starting from a parameterisation: .. plot:: user-guide/irf/plot_edisp_kernel_param.py :include-source: Storing the energy dispersion information as a function of sky position ----------------------------------------------------------------------- The `gammapy.irf.EDispKernelMap` is a four-dimensional `~gammapy.maps.Map` that stores, for each position in the sky, an `~gammapy.irf.EDispKernel`, which, as described above, depends on true energy. The `~gammapy.irf.EDispKernel` at a given position can be extracted with `~gammapy.irf.EDispKernelMap.get_edisp_kernel()` by providing a `~astropy.coordinates.SkyCoord` or `~regions.SkyRegion`. .. plot:: :include-source: from gammapy.irf import EDispKernelMap from gammapy.maps import MapAxis from astropy.coordinates import SkyCoord # Create a test EDispKernelMap from a Gaussian distribution energy_axis_true = MapAxis.from_energy_bounds(1,10, 8, unit="TeV", name="energy_true") energy_axis = MapAxis.from_energy_bounds(1,10, 5, unit="TeV", name="energy") edisp_map = EDispKernelMap.from_gauss(energy_axis, energy_axis_true, 0.3, 0) position = SkyCoord(ra=83, dec=22, unit='deg', frame='icrs') edisp_kernel = edisp_map.get_edisp_kernel(position) # We can quickly check the edisp kernel via the peek() method edisp_kernel.peek() The `gammapy.irf.EDispMap` serves a similar purpose but instead of a true energy axis, it contains the information of the energy dispersion as a function of the energy migration (:math:`E/ E_{\rm true}`). It can be converted into a `gammapy.irf.EDispKernelMap` with `gammapy.irf.EDispMap.to_edisp_kernel_map()` and the `gammapy.irf.EDispKernelMap` at a given position can be extracted in the same way as described above, using `~gammapy.irf.EDispMap.get_edisp_kernel()` and providing a `~astropy.coordinates.SkyCoord`. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/index.rst0000644000175100001770000001177514721316200020046 0ustar00runnerdocker.. _irf: Instrument Response Functions (DL3) =================================== Typically the IRFs are stored in the form of multidimensional tables giving the response functions such as the distribution of gamma-like events or the probability density functions of the reconstructed energy and position. Expected number of detected events ---------------------------------- To model the expected number of events a gamma-ray source should produce on a detector, one has to model its effect using an instrument response function (IRF). In general, such a function gives the probability to detect a photon emitted from true position :math:`p_{\rm true}` on the sky and true energy :math:`E_{\rm true}` at reconstructed position :math:`p` and energy :math:`E`, together with the effective collection area of the detector at position :math:`p_{\rm true}` on the sky and true energy :math:`E_{\rm true}`. We can write the expected number of detected events :math:`N(p, E)`:
.. math:: N(p, E) {\rm d}p {\rm d}E = t_{\rm obs} \int_{E_{\rm true}} {\rm d}E_{\rm true} \, \int_{p_{\rm true}} {\rm d}p_{\rm true} \, R(p, E|p_{\rm true}, E_{\rm true}) \times \Phi(p_{\rm true}, E_{\rm true}) where: * :math:`R(p, E| p_{\rm true}, E_{\rm true})` is the instrument response (unit: :math:`{\rm m}^2\,{\rm TeV}^{-1}`) * :math:`\Phi(p_{\rm true}, E_{\rm true})` is the sky flux model (unit: :math:`{\rm m}^{-2}\,{\rm s}^{-1}\,{\rm TeV}^{-1}\,{\rm sr}^{-1}`) * :math:`t_{\rm obs}` is the observation time (unit: :math:`{\rm s}`) Factorisation of the IRFs ------------------------- Most of the time, in high level gamma-ray data (DL3), we assume that the instrument response can be simplified as the product of three independent functions: .. math:: R(p, E|p_{\rm true}, E_{\rm true}) = A_{\rm eff}(p_{\rm true}, E_{\rm true}) \times PSF(p|p_{\rm true}, E_{\rm true}) \times E_{\rm disp}(E|p_{\rm true}, E_{\rm true}), where: * :math:`A_{\rm eff}(p_{\rm true}, E_{\rm true})` is the effective collection area of the detector (unit: :math:`{\rm m}^2`). It is the product of the detector collection area times its detection efficiency at true energy :math:`E_{\rm true}` and position :math:`p_{\rm true}`. * :math:`PSF(p|p_{\rm true}, E_{\rm true})` is the point spread function (unit: :math:`{\rm sr}^{-1}`). It gives the probability of measuring a direction :math:`p` when the true direction is :math:`p_{\rm true}` and the true energy is :math:`E_{\rm true}`. Gamma-ray instruments consider the probability density of the angular separation between true and reconstructed directions :math:`\delta p = p_{\rm true} - p`, i.e. :math:`PSF(\delta p|p_{\rm true}, E_{\rm true})`. * :math:`E_{\rm disp}(E|p_{\rm true}, E_{\rm true})` is the energy dispersion (unit: :math:`{\rm TeV}^{-1}`). It gives the probability to reconstruct the photon at energy :math:`E` when the true energy is :math:`E_{\rm true}` and the true position :math:`p_{\rm true}`. Gamma-ray instruments consider the probability density of the migration :math:`\mu=\frac{E}{E_{\rm true}}`, i.e. :math:`E_{\rm disp}(\mu|p_{\rm true}, E_{\rm true})`. The implicit assumption here is that energy dispersion and PSF are completely independent. This assumption is not strictly valid in all situations. These functions are obtained through Monte-Carlo simulations of gamma-ray showers for different observing conditions, e.g. detector configuration, zenith angle of the pointing position, detector state and different event reconstruction and selection schemes. In the DL3 format, the IRFs are distributed for each observing run. Further details on individual responses and how they are implemented in Gammapy are given in :ref:`irf-aeff`, :ref:`irf-edisp` and :ref:`irf-psf`. Most of the formats defined at :ref:`gadf:iact-irf` are supported. At the moment, there is little support for Fermi-LAT or other instruments. Most users will not use `gammapy.irf` directly, but will instead use IRFs as part of their spectrum, image or cube analysis to compute exposure and effective EDISP and PSF for a given dataset.
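Nevertheless, individual IRF classes can be read and evaluated directly; a minimal sketch (using the H.E.S.S. DL3-DR1 file also used in the plots below; the chosen energy and offset values are arbitrary):

.. code-block:: python

    import astropy.units as u
    from gammapy.irf import EffectiveAreaTable2D

    filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz"
    aeff = EffectiveAreaTable2D.read(filename, hdu="AEFF")

    # Interpolate the effective area at a given true energy and offset
    area = aeff.evaluate(energy_true=1 * u.TeV, offset=0.5 * u.deg)
    print(area.to("m2"))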
IRF axis naming --------------- In the IRF classes we use the following axis naming convention:

================= ============================================================================
Variable          Definition
================= ============================================================================
``energy``        Reconstructed energy axis (:math:`E`)
``energy_true``   True energy axis (:math:`E_{\rm true}`)
``offset``        Field of view offset from center (:math:`p_{\rm true}`)
``fov_lon``       Field of view longitude
``fov_lat``       Field of view latitude
``migra``         Energy migration (:math:`\mu`)
``rad``           Offset angle from source position (:math:`\delta p`)
================= ============================================================================

Using gammapy.irf ----------------- .. minigallery:: ../examples/tutorials/api/irfs.py ../examples/tutorials/analysis-1d/cta_sensitivity.py .. toctree:: :maxdepth: 1 :hidden: aeff bkg edisp psf ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/plot_aeff.py0000644000175100001770000000045414721316200020506 0ustar00runnerdocker"""Plot an effective area from the HESS DL3 data release 1.""" import matplotlib.pyplot as plt from gammapy.irf import EffectiveAreaTable2D filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" aeff = EffectiveAreaTable2D.read(filename, hdu="AEFF") aeff.peek() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/plot_aeff_param.py0000644000175100001770000000066014721316200021665 0ustar00runnerdockerfrom astropy import units as u import matplotlib.pyplot as plt from gammapy.irf import EffectiveAreaTable2D ax = plt.subplot() for instrument in ["HESS", "HESS2", "CTA"]: aeff = EffectiveAreaTable2D.from_parametrization(instrument=instrument) aeff.plot_energy_dependence(ax=ax, label=instrument, offset=[0] * u.deg) ax.set_yscale("log") ax.set_xlim([1e-3, 1e3]) ax.set_ylim([1e3, 1e12]) plt.legend(loc="best") plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/plot_bkg_3d.py0000644000175100001770000000043314721316200020733 0ustar00runnerdocker"""Plot the background rate from the HESS DL3 data release 1.""" import matplotlib.pyplot as plt from gammapy.irf import Background3D filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" bkg = Background3D.read(filename, hdu="BKG") bkg.peek() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/plot_edisp.py0000644000175100001770000000045614721316200020713 0ustar00runnerdocker"""Plot an energy dispersion from the HESS DL3 data release 1.""" import matplotlib.pyplot as plt from gammapy.irf import EnergyDispersion2D filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" edisp = EnergyDispersion2D.read(filename, hdu="EDISP") edisp.peek() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/plot_edisp_kernel.py0000644000175100001770000000072614721316200022253 0ustar00runnerdocker"""Plot the energy dispersion at a given offset.""" from astropy.coordinates import Angle import matplotlib.pyplot as plt from gammapy.irf import EnergyDispersion2D filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" edisp = EnergyDispersion2D.read(filename, hdu="EDISP")
# offset at which we want to examine the energy dispersion offset = Angle("0.5 deg") edisp_kernel = edisp.to_edisp_kernel(offset=offset) edisp_kernel.peek() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/plot_edisp_kernel_param.py0000644000175100001770000000076214721316200023433 0ustar00runnerdocker"""Plot an energy dispersion using a gaussian parametrisation""" import matplotlib.pyplot as plt from gammapy.irf import EDispKernel from gammapy.maps import MapAxis energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10) energy_axis_true = MapAxis.from_energy_bounds( "0.5 TeV", "30 TeV", nbin=10, per_decade=True, name="energy_true" ) edisp = EDispKernel.from_gauss( energy_axis=energy_axis, energy_axis_true=energy_axis_true, sigma=0.1, bias=0 ) edisp.peek() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/plot_fermi_psf.py0000644000175100001770000000073714721316200021563 0ustar00runnerdocker"""Plot Fermi PSF.""" from gammapy.irf import PSFMap from gammapy.maps import MapAxis, WcsGeom filename = "$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_psf_gc.fits.gz" psf = PSFMap.read(filename, format="gtpsf") axis = MapAxis.from_energy_bounds("10 GeV", "2 TeV", nbin=20, name="energy_true") geom = WcsGeom.create(npix=50, binsz=0.01, axes=[axis]) # .to_image() computes the exposure weighted mean PSF kernel = psf.get_psf_kernel(geom=geom).to_image() kernel.psf_kernel_map.plot() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/plot_psf.py0000644000175100001770000000037714721316200020401 0ustar00runnerdocker"""Plot a PSF from the HESS DL3 data release 1.""" import matplotlib.pyplot as plt from gammapy.irf import PSF3D filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" psf = PSF3D.read(filename, hdu="PSF") psf.peek() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/irf/psf.rst0000644000175100001770000000252714721316200017522 0ustar00runnerdocker.. _irf-psf: Point Spread Function ===================== As a function of true energy and offset angle (:ref:`gadf:psf_table`) ------------------------------------------------------------------------ The `~gammapy.irf.PSF3D` class represents the radially symmetric probability density of the angular separation between true and reconstructed directions :math:`\delta p = p_{\rm true} - p` (or `rad`), as a function of true energy and offset angle from the field of view center (:math:`PSF(E_{\rm true}, \delta p|p_{\rm true})` in :ref:`irf`). Its format specifications are available in :ref:`gadf:psf_table`. This is the format in which IACT DL3 PSFs are usually provided, as an example:
plot:: user-guide/irf/plot_psf.py :include-source: Additional PSF classes ---------------------- The remaining IRF classes implement: - `~gammapy.irf.EnergyDependentMultiGaussPSF` a PSF whose probability density is parametrised by a sum of one to three 2-dimensional Gaussians (definition at :ref:`gadf:psf_3gauss`); - `~gammapy.irf.PSFKing` a PSF whose probability density is parametrised by the King function (definition at :ref:`gadf:psf_king`); - `~gammapy.irf.PSFKernel` a PSF that can be used to convolve `~gammapy.maps.WcsNDMap` objects; - `~gammapy.irf.PSFMap` a four-dimensional `~gammapy.maps.Map` storing a `~gammapy.irf.PSFKernel` for each sky position. ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.156642 gammapy-1.3/docs/user-guide/makers/0000755000175100001770000000000014721316215016702 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/makers/create_region.py0000644000175100001770000000231214721316200022052 0ustar00runnerdocker"""Example of how to create and plot sky regions.""" import astropy.units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion, EllipseAnnulusSkyRegion, RectangleSkyRegion import matplotlib.pyplot as plt from gammapy.maps import WcsNDMap position = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") on_circle = CircleSkyRegion(position, 0.3 * u.deg) on_ellipse_annulus = EllipseAnnulusSkyRegion( center=position, inner_width=1.5 * u.deg, outer_width=2.5 * u.deg, inner_height=3 * u.deg, outer_height=4 * u.deg, angle=130 * u.deg, ) another_position = SkyCoord(80.3, 22.0, unit="deg") on_rectangle = RectangleSkyRegion( center=another_position, width=2.0 * u.deg, height=4.0 * u.deg, angle=50 * u.deg ) # Now we plot those regions. We first create an empty map empty_map = WcsNDMap.create( skydir=position, width=10 * u.deg, binsz=0.1 * u.deg, proj="TAN" ) empty_map.data += 1.0 empty_map.plot(cmap="gray", vmin=0, vmax=1) # To plot the regions, we convert them to PixelRegion with the map wcs on_circle.to_pixel(empty_map.geom.wcs).plot() on_rectangle.to_pixel(empty_map.geom.wcs).plot() on_ellipse_annulus.to_pixel(empty_map.geom.wcs).plot() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/makers/fov.rst0000644000175100001770000000676214721316200020231 0ustar00runnerdocker.. include:: ../../references.txt .. _fov_background: ************** FoV background ************** Overview -------- Background models stored in the IRFs might not accurately predict the actual number of background counts. To correct the predicted counts, one can use the data themselves in regions deprived of gamma-ray signal. The field-of-view background technique is used to adjust the predicted counts to the measured ones outside an exclusion mask. This technique is recommended for 3D analysis, in particular when stacking `~gammapy.datasets.Datasets`. Gammapy provides the `~gammapy.makers.FoVBackgroundMaker`. The latter creates a `~gammapy.modeling.models.FoVBackgroundModel` which combines the `background` predicted number of counts with a `~gammapy.modeling.models.NormSpectralModel`, which allows one to renormalize the background cube, and possibly change its spectral distribution. By default, only the `norm` parameter of a `~gammapy.modeling.models.PowerLawNormSpectralModel` is left free.
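In its simplest form, the maker is constructed from a fit method and an exclusion mask, and then applied to each reduced dataset. A minimal sketch (the ``exclusion_mask`` and ``dataset`` variables are assumed to have been prepared beforehand, as in the complete example below):

.. code-block:: python

    from gammapy.makers import FoVBackgroundMaker

    # Fit the norm of the default PowerLawNormSpectralModel outside the exclusion mask
    fov_bkg_maker = FoVBackgroundMaker(method="fit", exclusion_mask=exclusion_mask)

    # Attach the adjusted FoVBackgroundModel to the dataset
    dataset = fov_bkg_maker.run(dataset)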
As an example, here we show the addition of a `~gammapy.modeling.models.PowerLawNormSpectralModel` in which the `norm` and `tilt` parameters are unfrozen. .. testcode:: from gammapy.makers import MapDatasetMaker, FoVBackgroundMaker, SafeMaskMaker from gammapy.datasets import MapDataset from gammapy.data import DataStore from gammapy.modeling.models import PowerLawNormSpectralModel from gammapy.maps import MapAxis, WcsGeom, Map from regions import CircleSkyRegion from astropy import units as u data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") observations = data_store.get_observations([23592, 23559]) energy_axis = MapAxis.from_energy_bounds("0.5 TeV", "10 TeV", nbin=5) energy_axis_true = MapAxis.from_energy_bounds("0.3 TeV", "20 TeV", nbin=20, name="energy_true") geom = WcsGeom.create(skydir=(83.63, 22.01), axes=[energy_axis], width=5, binsz=0.02) stacked = MapDataset.create(geom, energy_axis_true=energy_axis_true) maker = MapDatasetMaker() safe_mask_maker = SafeMaskMaker( methods=["aeff-default", "offset-max"], offset_max="2.5 deg" ) circle = CircleSkyRegion(center=geom.center_skydir, radius=0.2 * u.deg) exclusion_mask = geom.region_mask([circle], inside=False) spectral_model = PowerLawNormSpectralModel() spectral_model.norm.frozen = False spectral_model.tilt.frozen = False fov_bkg_maker = FoVBackgroundMaker( method="fit", exclusion_mask=exclusion_mask, spectral_model=spectral_model ) for obs in observations: dataset = maker.run(stacked, obs) dataset = safe_mask_maker.run(dataset, obs) dataset = fov_bkg_maker.run(dataset) stacked.stack(dataset) It is also possible to implement other norm models, such as the `~gammapy.modeling.models.PiecewiseNormSpectralModel`. To do so, you can reuse most of the above code with an adaptation of the `spectral_model` applied in the `fov_bkg_maker`. .. code-block:: python import numpy as np from gammapy.modeling.models import PiecewiseNormSpectralModel spectral_model = PiecewiseNormSpectralModel( energy=energy_axis.edges, norms=np.ones_like(energy_axis.edges.value) ) fov_bkg_maker = FoVBackgroundMaker( method="fit", exclusion_mask=exclusion_mask, spectral_model=spectral_model ) Note: to prevent poorly constrained `norm` parameters or a large variance in the last bins, the binning should be adjusted to have wider bins at higher energies. This ensures there are enough statistics per bin, enabling the fit to converge. .. minigallery:: gammapy.makers.FoVBackgroundMaker :add-heading: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/makers/index.rst0000644000175100001770000000237214721316200020541 0ustar00runnerdocker.. include:: ../../references.txt .. _makers: Data reduction (DL3 to DL4) =========================== The `gammapy.makers` sub-package contains classes to perform data reduction tasks from DL3 data to binned datasets. In the data reduction step the DL3 data is prepared for modeling and fitting, by binning events into a counts map and interpolating the exposure, background, PSF and energy dispersion on the chosen analysis geometry. Background estimation --------------------- .. toctree:: :maxdepth: 1 fov reflected ring Safe data range definition -------------------------- The definition of a safe data range is done using the `~gammapy.makers.SafeMaskMaker` or manually. Using gammapy.makers -------------------- .. minigallery:: ../examples/tutorials/api/makers.py ..
minigallery:: :add-heading: Examples using `~gammapy.makers.SpectrumDatasetMaker` ../examples/tutorials/analysis-1d/spectral_analysis.py ../examples/tutorials/analysis-1d/spectral_analysis_rad_max.py ../examples/tutorials/analysis-1d/extended_source_spectral_analysis.py .. minigallery:: :add-heading: Examples using `~gammapy.makers.MapDatasetMaker` ../examples/tutorials/starting/analysis_1.py ../examples/tutorials/data/hawc.py ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/makers/make_rectangular_reflected_background.py0000644000175100001770000000331014721316200026763 0ustar00runnerdocker"""Example of how to compute and plot reflected regions.""" import astropy.units as u from astropy.coordinates import SkyCoord from regions import RectangleSkyRegion import matplotlib.pyplot as plt from gammapy.data import DataStore from gammapy.datasets import SpectrumDataset from gammapy.makers import ReflectedRegionsBackgroundMaker, SpectrumDatasetMaker from gammapy.maps import Map, MapAxis, RegionGeom from gammapy.visualization import plot_spectrum_datasets_off_regions data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") mask = data_store.obs_table["TARGET_NAME"] == "Crab" obs_ids = data_store.obs_table["OBS_ID"][mask].data observations = data_store.get_observations(obs_ids) crab_position = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") # The ON region center is defined in the icrs frame. The angle is defined w.r.t. its axis. rectangle = RectangleSkyRegion( center=crab_position, width=0.5 * u.deg, height=0.4 * u.deg, angle=0 * u.deg ) bkg_maker = ReflectedRegionsBackgroundMaker(min_distance=0.1 * u.rad) dataset_maker = SpectrumDatasetMaker(selection=["counts"]) energy_axis = MapAxis.from_energy_bounds(0.1, 100, 30, unit="TeV") geom = RegionGeom.create(region=rectangle, axes=[energy_axis]) dataset_empty = SpectrumDataset.create(geom=geom) datasets = [] for obs in observations: dataset = dataset_maker.run(dataset_empty.copy(name=f"obs-{obs.obs_id}"), obs) dataset_on_off = bkg_maker.run(observation=obs, dataset=dataset) datasets.append(dataset_on_off) m = Map.create(skydir=crab_position, width=(8, 8), proj="TAN") ax = m.plot(vmin=-1, vmax=0) rectangle.to_pixel(ax.wcs).plot(ax=ax, color="black") plot_spectrum_datasets_off_regions(datasets=datasets, ax=ax) plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/makers/make_reflected_regions.py0000644000175100001770000000432214721316200023727 0ustar00runnerdockerfrom astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion import matplotlib.pyplot as plt from gammapy.makers import ReflectedRegionsFinder from gammapy.maps import RegionGeom, WcsNDMap # Exclude a rectangular region exclusion_mask = WcsNDMap.create(npix=(801, 701), binsz=0.01, skydir=(83.6, 23.0)) coords = exclusion_mask.geom.get_coord().skycoord data = (Angle("23 deg") < coords.dec) & (coords.dec < Angle("24 deg")) exclusion_mask.data = ~data pos = SkyCoord(83.633, 22.014, unit="deg") radius = Angle(0.3, "deg") on_region = CircleSkyRegion(pos, radius) center = SkyCoord(83.633, 24, unit="deg") # One can impose a minimal distance between ON region and first reflected regions finder = ReflectedRegionsFinder( min_distance_input="0.2 rad", ) regions, _ = finder.run( region=on_region, center=center, exclusion_mask=exclusion_mask, ) fig, axes = plt.subplots( ncols=3, subplot_kw={"projection": exclusion_mask.geom.wcs},
figsize=(12, 3), ) def plot_regions(ax, regions, on_region, exclusion_mask): """Little helper function to plot the off regions.""" exclusion_mask.plot_mask(ax=ax, colors="gray") on_region.to_pixel(ax.wcs).plot(ax=ax, edgecolor="tab:orange") geom = RegionGeom.from_regions(regions) geom.plot_region(ax=ax, color="tab:blue") ax = axes[0] ax.set_title("Min. distance first region") plot_regions(ax=ax, regions=regions, on_region=on_region, exclusion_mask=exclusion_mask) # One can impose a minimal distance between two adjacent regions finder = ReflectedRegionsFinder( min_distance="0.1 rad", ) regions, _ = finder.run( region=on_region, center=center, exclusion_mask=exclusion_mask, ) ax = axes[1] ax.set_title("Min. distance all regions") plot_regions(ax=ax, regions=regions, on_region=on_region, exclusion_mask=exclusion_mask) # One can impose a maximal number of regions to be extracted finder = ReflectedRegionsFinder( max_region_number=5, min_distance="0.1 rad", ) regions, _ = finder.run( region=on_region, center=center, exclusion_mask=exclusion_mask, ) ax = axes[2] ax.set_title("Max. number of regions") plot_regions(ax=ax, regions=regions, on_region=on_region, exclusion_mask=exclusion_mask) plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/makers/reflected.rst0000644000175100001770000001100314721316200021366 0ustar00runnerdocker.. include:: ../../references.txt .. _reflected_background: **************************** Reflected regions background **************************** .. currentmodule:: gammapy.makers Overview -------- This method is used in classical Cherenkov astronomy to estimate the background for spectral analysis. As illustrated in Fig. 1, a region on the sky, the ON region, is chosen to select events around the studied source position. In the absence of a solid template of the residual hadronic background, a classical method to estimate it is the so-called Reflected Region Background. The underlying assumption is that the background is approximately purely radial in the field-of-view. A set of OFF counts is found in the observation by rotating the selected ON region around the pointing position. To avoid the reflected regions containing actual gamma-ray signal from other objects, one has to remove the gamma-ray bright parts of the field-of-view with an exclusion mask. More details on the reflected regions method can be found in [Berge2007]_. .. _figure_reflected_background: .. figure:: ../../_static/hgps_spectrum_background_estimation.png :width: 50% Fig.1, Illustration of the reflected regions background estimation method, taken from [Abdalla2018]_. The extraction of the OFF events from the `~gammapy.data.EventList` of a set of observations is performed by the `ReflectedRegionsBackgroundMaker`. The latter uses the `~gammapy.makers.background.ReflectedRegionsFinder` to create reflected regions for a given circular on region and exclusion mask. ..
code-block:: python from gammapy.makers import SpectrumDatasetMaker, ReflectedRegionsBackgroundMaker, SafeMaskMaker from gammapy.datasets import SpectrumDataset, Datasets from gammapy.data import DataStore from gammapy.maps import MapAxis, Map, WcsGeom, RegionGeom data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") observations = data_store.get_observations([23592, 23559]) energy_axis = MapAxis.from_energy_bounds("0.5 TeV", "10 TeV", nbin=15) energy_axis_true = MapAxis.from_energy_bounds("0.3 TeV", "20 TeV", nbin=40, name="energy_true") geom = RegionGeom.create(region="icrs;circle(83.63,22.01,0.11)", axes=[energy_axis]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true) wcsgeom = WcsGeom.create(skydir=geom.center_skydir, width=5, binsz=0.02) exclusion_mask = wcsgeom.region_mask(geom.region, inside=False) maker = SpectrumDatasetMaker() safe_mask_maker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) bkg_maker = ReflectedRegionsBackgroundMaker(exclusion_mask=exclusion_mask) datasets = Datasets() for obs in observations: dataset = maker.run(dataset_empty.copy(), obs) dataset = safe_mask_maker.run(dataset, obs) dataset_on_off = bkg_maker.run(dataset, obs) datasets.append(dataset_on_off) Using regions ------------- The on region is a `~regions.SkyRegion`. It is typically a circle (`~regions.CircleSkyRegion`) for pointlike source analysis, but can be a more complex region such as a `~regions.CircleAnnulusSkyRegion`, an `~regions.EllipseSkyRegion`, a `~regions.RectangleSkyRegion`, etc. The following example shows how to create such regions: .. plot:: user-guide/makers/create_region.py :include-source: The reflected region finder --------------------------- The following example illustrates how to create reflected regions for a given circular on region and exclusion mask using the `~gammapy.makers.background.ReflectedRegionsFinder`. In particular, it shows how to change the minimal distance between the ON region and the reflected regions. This is useful to limit contamination by events leaking out of the ON region. It also shows how to change the minimum distance between adjacent regions as well as the maximum number of reflected regions. .. plot:: user-guide/makers/make_reflected_regions.py :include-source: Using the reflected background estimator ---------------------------------------- In practice, the user does not usually need to directly interact with the `~gammapy.makers.background.ReflectedRegionsFinder`. This is actually done via the `~gammapy.makers.ReflectedRegionsBackgroundMaker`, which extracts the ON and OFF events for an `~gammapy.data.Observations` object. The last example shows how to run it on a few observations with a rectangular region. .. plot:: user-guide/makers/make_rectangular_reflected_background.py :include-source: .. minigallery:: gammapy.makers.ReflectedRegionsBackgroundMaker :add-heading:././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/makers/ring.rst0000644000175100001770000000551314721316200020371 0ustar00runnerdocker.. include:: ../../references.txt .. _ring_background: *************** Ring background *************** Overview -------- This background estimation technique is used in classical Cherenkov astronomy for 2D image analysis. The working principle is illustrated in Fig. 1. For any given pixel in a counts map the off counts are estimated from a ring centered on the test position with a given fixed radius and width.
To improve the estimate, regions with known gamma-ray emission are excluded from measuring the off events. A variation of this method is given by the "adaptive" ring. In some regions (e.g. in the Galactic plane) a lot of gamma-ray emission has to be excluded and there are not many regions left to estimate the background from. To improve the off statistics in this case, the ring is adaptively enlarged to achieve a defined minimal background efficiency ratio (see :ref:`stats_notation`). More details on the ring background method can be found in [Berge2007]_. .. _figure_ring_background: .. figure:: ../../_static/hgps_map_background_estimation.png :width: 50% Fig.1, Illustration of the ring background estimation method, taken from [Abdalla2018]_. To include the classical ring background estimation into a data reduction chain, Gammapy provides the `RingBackgroundMaker` and `AdaptiveRingBackgroundMaker` classes. These classes can only be used for image-based data. A given `MapDataset` has to be reduced to a single image by calling `MapDataset.to_image()`. .. testcode:: from gammapy.makers import MapDatasetMaker, RingBackgroundMaker, SafeMaskMaker from gammapy.datasets import MapDataset, MapDatasetOnOff from gammapy.data import DataStore from gammapy.maps import MapAxis, WcsGeom, Map from regions import CircleSkyRegion from astropy import units as u data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") observations = data_store.get_observations([23592, 23559]) energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) energy_axis_true = MapAxis.from_energy_bounds("0.3 TeV", "30 TeV", nbin=20, per_decade=True, name="energy_true") geom = WcsGeom.create(skydir=(83.63, 22.01), axes=[energy_axis], width=5, binsz=0.02) empty = MapDataset.create(geom, energy_axis_true=energy_axis_true) stacked = MapDatasetOnOff.create(geom, energy_axis_true=energy_axis_true) maker = MapDatasetMaker() safe_mask_maker = SafeMaskMaker( methods=["aeff-default", "offset-max"], offset_max="2.5 deg" ) circle = CircleSkyRegion(center=geom.center_skydir, radius=0.2 * u.deg) exclusion_mask = geom.region_mask([circle], inside=False) ring_bkg_maker = RingBackgroundMaker(r_in="0.3 deg", width="0.3 deg", exclusion_mask=exclusion_mask) for obs in observations: dataset = maker.run(empty, obs) dataset = safe_mask_maker.run(dataset, obs) dataset_on_off = ring_bkg_maker.run(dataset) stacked.stack(dataset_on_off) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.156642 gammapy-1.3/docs/user-guide/maps/0000755000175100001770000000000014721316215016360 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/maps/hpxmap.rst0000644000175100001770000000363314721316200020406 0ustar00runnerdocker.. _hpxmap: HEALPix-based maps ================== This page provides examples and documentation specific to the HEALPix map classes. All HEALPix classes inherit from `~gammapy.maps.Map` which provides generic interface methods that can be used to access or update the contents of a map without reference to its pixelization scheme. .. warning:: Gammapy uses `NEST` as default pixel order scheme, while `~healpy` functions have `RING` as the default (see https://healpy.readthedocs.io/en/1.11.0/index.html). If you are interfacing Gammapy HEALPix maps with `~healpy` functions, you need to specify the pixelization scheme either while creating the Gammapy object or when using the `~healpy` functions.
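For instance, a minimal sketch of such an interface, assuming ``healpy`` is installed (``hp.get_interp_val`` is a standard ``healpy`` function and defaults to ``RING`` ordering):

.. code-block:: python

    import numpy as np
    import healpy as hp
    from gammapy.maps import HpxMap

    # Gammapy HEALPix maps use NESTED pixel ordering by default
    m = HpxMap.create(nside=16, nest=True, frame="galactic")
    m.data += 1.0

    # healpy defaults to RING ordering, so pass nest=True explicitly
    theta, phi = np.radians(30.0), np.radians(45.0)
    value = hp.get_interp_val(m.data, theta, phi, nest=True)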
HEALPix geometry ---------------- The `~gammapy.maps.HpxGeom` class encapsulates the geometry of a HEALPix map: the pixel size (NSIDE), coordinate system, and definition of non-spatial axes (e.g. energy). By default a HEALPix geometry will encompass the full sky. The following example shows how to create an all-sky 2D HEALPix image: .. testcode:: from gammapy.maps import HpxGeom, HpxNDMap, HpxMap # Create a HEALPix geometry of NSIDE=16 geom = HpxGeom(16, frame="galactic") m = HpxNDMap(geom) # Equivalent factory method call m = HpxMap.create(nside=16, frame="galactic") Partial-sky maps can be created by passing a ``region`` argument to the map geometry constructor or by setting the ``width`` argument to the `~gammapy.maps.HpxMap.create` factory method: .. testcode:: from gammapy.maps import HpxGeom, HpxMap, HpxNDMap from astropy.coordinates import SkyCoord # Create a partial-sky HEALPix geometry of NSIDE=16 geom = HpxGeom(16, region='DISK(0.0,5.0,10.0)', frame="galactic") m = HpxNDMap(geom) # Equivalent factory method call position = SkyCoord(0.0, 5.0, frame='galactic', unit='deg') m = HpxMap.create(nside=16, skydir=position, width=20.0) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/maps/index.rst0000644000175100001770000002747614721316200020233 0ustar00runnerdocker.. _maps: Sky maps (DL4) ============== `gammapy.maps` contains classes for representing pixelized data structures with at least two spatial dimensions representing coordinates on a sphere (e.g. an image in celestial coordinates). These classes support an arbitrary number of non-spatial dimensions and can represent images (2D), cubes (3D), or hypercubes (4+D). Multiple pixelization schemes are supported: * WCS : Projection onto a 2D cartesian grid following the conventions of the World Coordinate System (WCS). Pixels are square in projected coordinates and as such are not equal area in spherical coordinates. * HEALPix : Hierarchical Equal Area Iso Latitude pixelation of the sphere. Pixels are equal area but have irregular shapes. Documentation specific to HEALPix-based maps is provided in :doc:`../maps/hpxmap`. * Region : A single spatial pixel with arbitrary shape defined by a region on the sky. Documentation specific to region-based maps is provided in :doc:`../maps/regionmap`. `gammapy.maps` is organized around two data structures: *geometry* classes inheriting from `~gammapy.maps.Geom` and *map* classes inheriting from `~gammapy.maps.Map`. A geometry defines the map boundaries, pixelisation scheme, and provides methods for converting to/from map and pixel coordinates. A map owns a `~gammapy.maps.Geom` instance as well as a data array containing map values. Where possible it is recommended to use the abstract `~gammapy.maps.Map` interface for accessing or updating the contents of a map as this allows algorithms to be used interchangeably with different map representations. The following reviews methods of the abstract map interface. Getting started with maps ------------------------- All map objects have an abstract interface provided through the methods of the `~gammapy.maps.Map`. These methods can be used for accessing and manipulating the contents of a map without reference to the underlying data representation (e.g. whether a map uses WCS or HEALPix pixelisation). For applications which do depend on the specific representation one can also work directly with the classes derived from `~gammapy.maps.Map`. 
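For instance, a short sketch of how the same factory call dispatches to different pixelisation schemes through the ``map_type`` argument, while the resulting objects expose the same abstract accessor methods:

.. code-block:: python

    from gammapy.maps import Map

    # Same abstract factory, two different pixelisation schemes
    m_wcs = Map.create(binsz=0.1, map_type="wcs", width=10.0)
    m_hpx = Map.create(nside=16, map_type="hpx", frame="galactic")

    # Both maps expose the same abstract interface
    vals_wcs = m_wcs.get_by_coord((0.0, 0.0))
    vals_hpx = m_hpx.get_by_coord((0.0, 0.0))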
In the following we review some of the basic methods for working with map objects; more details are given in the :doc:`/tutorials/api/maps` tutorial. Accessor methods ---------------- All map objects have a set of accessor methods provided through the abstract `~gammapy.maps.Map` class. These methods can be used to access or update the contents of the map irrespective of its underlying representation. Four types of accessor methods are provided: * ``get`` : Return the value of the map at the pixel containing the given coordinate (`~gammapy.maps.Map.get_by_idx`, `~gammapy.maps.Map.get_by_pix`, `~gammapy.maps.Map.get_by_coord`). * ``interp`` : Interpolate or extrapolate the value of the map at an arbitrary coordinate. * ``set`` : Set the value of the map at the pixel containing the given coordinate (`~gammapy.maps.Map.set_by_idx`, `~gammapy.maps.Map.set_by_pix`, `~gammapy.maps.Map.set_by_coord`). * ``fill`` : Increment the value of the map at the pixel containing the given coordinate with a unit weight or the value in the optional ``weights`` argument (`~gammapy.maps.Map.fill_by_idx`, `~gammapy.maps.Map.fill_by_pix`, `~gammapy.maps.Map.fill_by_coord`). Accessor methods accept as their first argument a coordinate tuple containing scalars, lists, or numpy arrays with one tuple element for each dimension of the map. ``coord`` methods optionally support a `dict` or `~gammapy.maps.MapCoord` argument. When using tuple input the first two elements in the tuple should be longitude and latitude followed by one element for each non-spatial dimension. Map coordinates can be expressed in one of three coordinate systems: * ``idx`` : Pixel indices. These are explicit (integer) pixel indices into the map. * ``pix`` : Coordinates in pixel space. Pixel coordinates are continuous, defined on the interval [0, N-1] where N is the number of pixels along a given map dimension, with pixel centers at integer values. For methods that reference a discrete pixel, pixel coordinates will be rounded to the nearest pixel index and passed to the corresponding ``idx`` method. * ``coord`` : The true map coordinates including angles on the sky (longitude and latitude). This coordinate system supports three coordinate representations: `tuple`, `dict`, and `~gammapy.maps.MapCoord`. The tuple representation should contain longitude and latitude in degrees followed by one coordinate array for each non-spatial dimension. The coordinate system accepted by a given accessor method can be inferred from the suffix of the method name (e.g. `~gammapy.maps.Map.get_by_idx`). The following demonstrates how one can access the same pixels of a WCS map using each of the three coordinate systems: .. testcode:: from gammapy.maps import Map m = Map.create(binsz=0.1, map_type='wcs', width=10.0) vals = m.get_by_idx( ([49,50],[49,50]) ) vals = m.get_by_pix( ([49.0,50.0],[49.0,50.0]) ) vals = m.get_by_coord( ([-0.05,-0.05],[0.05,0.05]) ) Coordinate arguments obey normal numpy broadcasting rules. The coordinate tuple may contain any combination of scalars, lists or numpy arrays as long as they have compatible shapes. For instance a combination of scalar and vector arguments can be used to perform an operation along a slice of the map at a fixed value along that dimension. Multi-dimensional arguments can be used to broadcast a given operation across a grid of coordinate values. ..
testcode:: import numpy as np from gammapy.maps import Map m = Map.create(binsz=0.1, map_type='wcs', width=10.0) coords = np.linspace(-4.0, 4.0, 9) # Equivalent calls for accessing value at pixel (49,49) vals = m.get_by_idx( (49,49) ) vals = m.get_by_idx( ([49],[49]) ) vals = m.get_by_idx( (np.array([49]), np.array([49])) ) # Retrieve map values along latitude at fixed longitude=0.0 vals = m.get_by_coord( (0.0, coords) ) # Retrieve map values on a 2D grid of latitude/longitude points vals = m.get_by_coord( (coords[None,:], coords[:,None]) ) # Set map values along slice at longitude=0.0 to twice their existing value m.set_by_coord((0.0, coords), 2.0*m.get_by_coord((0.0, coords))) The ``set`` and ``fill`` methods can both be used to set pixel values. The following demonstrates how one can set pixel values: .. testcode:: from gammapy.maps import Map import numpy as np m = Map.create(binsz=0.1, map_type='wcs', width=10.0) m.set_by_coord(([-0.05, -0.05], [0.05, 0.05]), [0.5, 1.5]) m.fill_by_coord( ([-0.05, -0.05], [0.05, 0.05]), weights=np.array([0.5, 1.5])) Interface with MapCoord and SkyCoord ------------------------------------ The ``coord`` accessor methods accept ``dict``, `~gammapy.maps.MapCoord`, and `~astropy.coordinates.SkyCoord` arguments in addition to the standard `tuple` of `~numpy.ndarray` argument. When using a `tuple` argument a `~astropy.coordinates.SkyCoord` can be used instead of longitude and latitude arrays. The coordinate frame of the `~astropy.coordinates.SkyCoord` will be transformed to match the coordinate system of the map. .. testcode:: import numpy as np from astropy.coordinates import SkyCoord from gammapy.maps import Map, MapCoord, MapAxis lon = [0, 1] lat = [1, 2] energy = [100, 1000] energy_axis = MapAxis.from_bounds(100, 1E5, 12, interp='log', name='energy') skycoord = SkyCoord(lon, lat, unit='deg', frame='galactic') m = Map.create(binsz=0.1, map_type='wcs', width=10.0, frame="galactic", axes=[energy_axis]) m.set_by_coord((skycoord, energy), [0.5, 1.5]) m.get_by_coord((skycoord, energy)) A `~gammapy.maps.MapCoord` or ``dict`` argument can be used to interact with a map object without reference to the axis ordering of the map geometry: .. testcode:: coord = MapCoord.create(dict(lon=lon, lat=lat, energy=energy)) m.set_by_coord(coord, [0.5, 1.5]) m.get_by_coord(coord) m.set_by_coord(dict(lon=lon, lat=lat, energy=energy), [0.5, 1.5]) m.get_by_coord(dict(lon=lon, lat=lat, energy=energy)) However when using the named axis interface the axis name string (e.g. as given by `MapAxis.name`) must match the name given in the method argument. The two spatial axes must always be named ``lon`` and ``lat``. .. _mapcoord: MapCoord -------- `~gammapy.maps.MapCoord` is an N-dimensional coordinate object that stores both spatial and non-spatial coordinates and is accepted by all ``coord`` methods. A `~gammapy.maps.MapCoord` can be created with or without explicitly named axes with `~gammapy.maps.MapCoord.create`. Axes of a `~gammapy.maps.MapCoord` can be accessed by index, name, or attribute. A `~gammapy.maps.MapCoord` without explicit axis names can be created by calling `~gammapy.maps.MapCoord.create` with a `tuple` argument: .. 
testcode:: import numpy as np from astropy.coordinates import SkyCoord from gammapy.maps import MapCoord lon = [0.0, 1.0] lat = [1.0, 2.0] energy = [100, 1000] skycoord = SkyCoord(lon, lat, unit='deg', frame='galactic') # Create a MapCoord from a tuple (no explicit axis names) c = MapCoord.create((lon, lat, energy)) print(c[0], c['lon'], c.lon) print(c[1], c['lat'], c.lat) print(c[2], c['axis0']) # Create a MapCoord from a tuple + SkyCoord (no explicit axis names) c = MapCoord.create((skycoord, energy)) print(c[0], c['lon'], c.lon) print(c[1], c['lat'], c.lat) print(c[2], c['axis0']) .. testoutput:: :hide: [0. 1.] [0. 1.] [0. 1.] [1. 2.] [1. 2.] [1. 2.] [ 100 1000] [ 100 1000] [0. 1.] [0. 1.] [0. 1.] [1. 2.] [1. 2.] [1. 2.] [ 100 1000] [ 100 1000] The first two elements of the tuple argument must contain longitude and latitude. Non-spatial axes are assigned a default name ``axis{I}`` where ``{I}`` is the index of the non-spatial dimension. `~gammapy.maps.MapCoord` objects created without named axes must have the same axis ordering as the map geometry. A `~gammapy.maps.MapCoord` with named axes can be created by calling `~gammapy.maps.MapCoord.create` with a ``dict``: .. testcode:: c = MapCoord.create(dict(lon=lon, lat=lat, energy=energy)) print(c[0], c['lon'], c.lon) print(c[1], c['lat'], c.lat) print(c[2], c['energy']) c = MapCoord.create({'energy': energy, 'lon': lon, 'lat': lat}) print(c[0], c['energy']) print(c[1], c['lon'], c.lon) print(c[2], c['lat'], c.lat) c = MapCoord.create(dict(skycoord=skycoord, energy=energy)) print(c[0], c['lon'], c.lon) print(c[1], c['lat'], c.lat) print(c[2], c['energy']) .. testoutput:: :hide: [0. 1.] [0. 1.] [0. 1.] [1. 2.] [1. 2.] [1. 2.] [ 100 1000] [ 100 1000] [ 100 1000] [ 100 1000] [0. 1.] [0. 1.] [0. 1.] [1. 2.] [1. 2.] [1. 2.] [0. 1.] [0. 1.] [0. 1.] [1. 2.] [1. 2.] [1. 2.] [ 100 1000] [ 100 1000] Spatial axes must be named ``lon`` and ``lat``. `~gammapy.maps.MapCoord` objects created with named axes do not need to have the same axis ordering as the map geometry. However the name of the axis must match the name of the corresponding map geometry axis. Note: it is possible to have an arbitrary number of additional non-spatial dimensions, however, the user must ensure that the array has been correctly broadcasted. .. testcode:: energy = np.array([1000, 3000, 5000])[:, np.newaxis, np.newaxis] energy_true = np.arange(500, 6000, 500)[np.newaxis, :, np.newaxis] print(energy.shape) print(energy_true.shape) c = MapCoord.create(dict(skycoord=skycoord, energy=energy, energy_true=energy_true)) print(c.shape) .. testoutput:: :hide: (3, 1, 1) (1, 11, 1) (3, 11, 2) Using gammapy.maps ------------------ .. minigallery:: ../examples/tutorials/api/maps.py ../examples/tutorials/api/mask_maps.py .. toctree:: :maxdepth: 1 :hidden: hpxmap regionmap ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/maps/regionmap.rst0000644000175100001770000004522614721316200021076 0ustar00runnerdocker.. include:: ../../references.txt .. _regionmap: ************************** RegionGeom and RegionNDMap ************************** This page provides examples and documentation specific to the Region classes. These objects are used to bundle the energy distribution - or any other non-spatial axis - of quantities (counts, exposure, ...) inside of a given region in the sky while retaining the information of the chosen spatial region. In particular, they are suited for so-called 1D analysis (see :ref:`references`). 
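As a minimal sketch of this use case, one can histogram the events of a DL3 event list into such a region-based counts spectrum (the file below is one of the H.E.S.S. DL3 DR1 event lists shipped with ``$GAMMAPY_DATA``; any DL3 event list would do):

.. code-block:: python

    from gammapy.data import EventList
    from gammapy.maps import MapAxis, RegionNDMap

    events = EventList.read("$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz")

    # Counts spectrum inside a circular ON region around the Crab position
    energy_axis = MapAxis.from_energy_bounds("0.5 TeV", "10 TeV", nbin=10)
    counts = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.11)", axes=[energy_axis])
    counts.fill_events(events)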
RegionGeom ========== A `~RegionGeom` describes the underlying geometry of a region in the sky with any number of non-spatial axes associated with it. It is analogous to a map geometry `~Geom`, but instead of a fine spatial grid on a rectangular region, the spatial dimension is reduced to a single bin with an arbitrary shape, which describes a region in the sky with that same shape. Besides the spatial region, a `~RegionGeom` can also have any number of non-spatial dimensions, the most common case being an additional energy axis. The `~RegionGeom` object defines the structure into which the data contained in a `~RegionNDMap` is distributed. Region geometries have an associated WCS projection object, which is used to project into the tangential plane for certain operations, such as convolution with a PSF. This projection is defined using the region center, and might introduce deformations for very large regions. This is why the use of regions with sizes larger than a few degrees is not recommended. Creating a RegionGeom --------------------- A `~RegionGeom` can be created via a DS9 region string (see http://ds9.si.edu/doc/ref/region.html for a list of options) or an Astropy Region (https://astropy-regions.readthedocs.io/en/latest/shapes.html). Note that region geometries have an associated WCS projection object. This requires the region to have a defined center, which is not the case for all the shapes defined in DS9. Hence only certain shapes are supported for constructing a `~RegionGeom`, such as circles, boxes, ellipses and annuli. .. testcode:: from astropy.coordinates import SkyCoord from regions import CircleSkyRegion, RectangleSkyRegion, EllipseAnnulusSkyRegion from gammapy.maps import RegionGeom import astropy.units as u # Create a circular region with radius 1 deg centered around # the Galactic Center # from DS9 string geom = RegionGeom.create("galactic;circle(0, 0, 1)") # using the regions package center = SkyCoord("0 deg", "0 deg", frame="galactic") region = CircleSkyRegion(center=center, radius=1*u.deg) geom = RegionGeom(region) # Create a rectangular region with a 45 degree tilt # from DS9 string geom = RegionGeom.create("galactic;box(0, 0, 1,2,45)") # using the regions package region = RectangleSkyRegion(center=center, width=1*u.deg, height=2*u.deg, angle=45*u.deg) geom = RegionGeom(region) # Equivalent factory method call geom = RegionGeom.create(region) # Something a bit more complicated: an elliptical annulus center_sky = SkyCoord(42, 43, unit='deg', frame='fk5') region = EllipseAnnulusSkyRegion(center=center_sky, inner_width=3 * u.deg, outer_width=4 * u.deg, inner_height=6 * u.deg, outer_height=7 * u.deg, angle=6 * u.deg) geom = RegionGeom(region) Higher dimensional region geometries (cubes and hypercubes) can be constructed in exactly the same way as a `~WcsGeom` by passing a list of `~MapAxis` objects for non-spatial dimensions with the axes parameter: .. testcode:: from astropy.coordinates import SkyCoord from regions import CircleSkyRegion from gammapy.maps import MapAxis, RegionGeom import astropy.units as u energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') # from a DS9 string geom = RegionGeom.create("galactic;circle(0, 0, 1)", axes=[energy_axis]) # using the regions package center = SkyCoord("0 deg", "0 deg", frame="galactic") region = CircleSkyRegion(center=center, radius=1*u.deg) geom = RegionGeom(region, axes=[energy_axis]) print(geom) ..
testoutput:: RegionGeom region : CircleSkyRegion axes : ['lon', 'lat', 'energy'] shape : (1, 1, 12) ndim : 3 frame : galactic center : 0.0 deg, 0.0 deg The resulting `~RegionGeom` object has `ndim = 3`, two spatial dimensions with one single bin and the chosen energy axis with 12 bins. RegionGeom and coordinates -------------------------- A `~RegionGeom` defines a single spatial bin with arbitrary shape. The spatial coordinates are then given by the center of the region geometry. If one or more non-spatial axes are present, they can have any number of bins. There are different methods that can be used to access or modify the coordinates of a `~RegionGeom`. Bin volume and angular size ^^^^^^^^^^^^^^^^^^^^^^^^^^^ The angular size of the region geometry is given by the method `~RegionGeom.solid_angle()`. If a region geometry has any number of non-spatial axes, then the volume of each bin is given by `~RegionGeom.bin_volume()`. If there are no non-spatial axes, both return the same quantity. .. testcode:: from gammapy.maps import MapAxis, RegionGeom # Create a circular region geometry with an energy axis energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') geom = RegionGeom.create("galactic;circle(0, 0, 1)", axes=[energy_axis]) # Get angular size and bin volume angular_size = geom.solid_angle() bin_volume = geom.bin_volume() Coordinates defined by the `~RegionGeom` ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Given a map coordinate or `~MapCoord` object, the method `~RegionGeom.contains()` checks whether they are contained in the region geometry. One can also retrieve the coordinates of the region geometry with `~RegionGeom.get_coord()` and `~RegionGeom.get_idx()`, which return the sky coordinates and indexes, respectively. Note that the spatial coordinate will always be a single entry, namely the center, while any non-spatial axes can have as many bins as desired. .. testcode:: from astropy.coordinates import SkyCoord from gammapy.maps import RegionGeom # Create a circular region geometry geom = RegionGeom.create("galactic;circle(0, 0, 1)") # Get coordinates and indexes defined by the region geometry coord = geom.get_coord() indexes = geom.get_idx() # Check if a coordinate is contained in the region center = SkyCoord(0, 0, unit='deg', frame='galactic') geom.contains(center) # Check if an array of coordinates are contained in the region coordinates = SkyCoord(l = [0, 0.5, 1.5], b = [0.5,2,0], frame='galactic', unit='deg') geom.contains(coordinates) Upsampling and downsampling non-spatial axes ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ The spatial binning of a `~RegionGeom` is made up of a single bin, which cannot be modified as it defines the region. However, if any non-spatial axes are present, they can be modified using the `~RegionGeom.upsample()` and `~RegionGeom.downsample()` methods, which take as input a factor by which the indicated axis is to be up- or downsampled. ..
testcode:: from astropy.coordinates import SkyCoord from regions import CircleSkyRegion from gammapy.maps import MapAxis, RegionGeom import astropy.units as u # Create a circular region geometry with an energy axis energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') geom = RegionGeom.create("galactic;circle(0, 0, 1)", axes=[energy_axis]) # Upsample the energy axis by a factor 2 geom_24_energy_bins = geom.upsample(2, "energy") # Downsample the energy axis by a factor 2 geom_6_energy_bins = geom.downsample(2, "energy") Relation to WCS geometries ^^^^^^^^^^^^^^^^^^^^^^^^^^ If a `~RegionGeom` has any number of non-spatial axes, the corresponding region geometry with just the spatial dimensions is given by the method `~RegionGeom.to_image()`. If the region geometry only has spatial dimensions, a copy of it is returned. Conversely, non-spatial axes can be added to an existing `~RegionGeom` by `~RegionGeom.to_cube()`, which takes a list of non-spatial axes with unique names to add to the region geometry. Region geometries are made of a single spatial bin, but are constructed on top of a finer `WcsGeom`. The method `~RegionGeom.to_wcs_geom()` returns the minimal equivalent geometry that contains the region geometry. It can also be given as an argument a minimal width for the resulting geometry. .. testcode:: from gammapy.maps import MapAxis, RegionGeom # Create a circular region geometry geom = RegionGeom.create("galactic;circle(0, 0, 1)") # Add an energy axis to the region geometry energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') geom_energy = geom.to_cube([energy_axis]) # Get the image region geometry without the energy axis. # Note that geom_image == geom geom_image = geom_energy.to_image() # Get the minimal wcs geometry that contains the region wcs_geom = geom.to_wcs_geom() Plotting a RegionGeom --------------------- It can be useful to plot the region that defines a `~RegionGeom`, on its own or on top of an existing `~Map`. This is done via `~RegionGeom.plot_region()`: .. plot:: :include-source: from gammapy.maps import RegionGeom geom = RegionGeom.create("icrs;circle(83.63, 22.01, 0.5)") geom.plot_region() One can also plot the region on top of an existing map, and change the properties of the different regions by passing keyword arguments forwarded to `~regions.PixelRegion.as_artist`. .. plot:: :include-source: from gammapy.maps import RegionGeom, Map import numpy as np m = Map.create(npix=100,binsz=3/100, skydir=(83.63, 22.01), frame='icrs') m.data = np.add(*np.indices((100, 100))) # A circle centered in the Crab position circle = RegionGeom.create("icrs;circle(83.63, 22.01, 0.5)") # A box centered in the same position box = RegionGeom.create("icrs;box(83.63, 22.01, 1,2,45)") # An ellipse in a different location ellipse = RegionGeom.create("icrs;ellipse(84.63, 21.01, 0.3,0.6,-45)") # An annulus in a different location annulus = RegionGeom.create("icrs;annulus(82.8, 22.91, 0.1,0.3)") m.plot(add_cbar=True) # Default plotting settings circle.plot_region() # Different line styles, widths and colors box.plot_region(lw=2, linestyle='--', ec='k') ellipse.plot_region(lw=2, linestyle=':', ec='white') # Filling the region with a color annulus.plot_region(lw=2, ec='purple', fc='purple') RegionNDMap =========== A `~RegionNDMap` owns a `~RegionGeom` instance as well as a data array containing the values associated to that region in the sky along the non-spatial axis, which is usually an energy axis. 
The spatial dimensions of a `~RegionNDMap` are reduced to a single spatial bin with an arbitrary shape, and any extra dimensions are described by an arbitrary number of non-spatial axes. It is to a `~RegionGeom` what a `~Map` is to a `~Geom`: it contains the data which is distributed in the structure defined by the `~RegionGeom` axes. Creating a RegionNDMap ---------------------- A region map can be created either from a DS9 region string, a `regions.SkyRegion` object or an existing `~RegionGeom`: .. testcode:: from gammapy.maps import RegionGeom, RegionNDMap from astropy.coordinates import SkyCoord from regions import CircleSkyRegion import astropy.units as u # Create a map of a circular region with radius 0.5 deg centered around # the Crab Nebula # from DS9 string region_map = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)") # using the regions package center = SkyCoord("83.63 deg", "22.01 deg", frame="icrs") region = CircleSkyRegion(center=center, radius=0.5*u.deg) region_map = RegionNDMap.create(region) # from an existing RegionGeom, perhaps the one corresponding # to another, existing RegionNDMap geom = region_map.geom region_map_2 = RegionNDMap.from_geom(geom) Higher dimensional region map objects (cubes and hypercubes) can be constructed by passing a list of `~MapAxis` objects for non-spatial dimensions with the axes parameter: .. testcode:: from gammapy.maps import MapAxis, RegionNDMap energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') region_map = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis]) Filling a RegionNDMap --------------------- All the region maps created above are empty. In order to fill or access the data contained in a `~RegionNDMap`, the `~RegionNDMap.data` attribute is used. In case the region map is being created from an existing `~RegionGeom`, this can be done in the same step: .. testcode:: from gammapy.maps import MapAxis, RegionNDMap import numpy as np # Create a RegionNDMap with 12 energy bins energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') region_map = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis]) # Fill the region map # with an entry for each of the 12 energy bins region_map.data = np.logspace(-2,3,12) # Create another region map with the same RegionGeom but different data, # with the same value for each of the 12 energy bins geom = region_map.geom region_map_1 = RegionNDMap.from_geom(geom, data=1.) # Access the data print(region_map_1.data) .. testoutput:: :hide: [[[1.]] [[1.]] [[1.]] [[1.]] [[1.]] [[1.]] [[1.]] [[1.]] [[1.]] [[1.]] [[1.]] [[1.]]] The data contained in a region map is a `~numpy.ndarray` with shape defined by the underlying `~RegionGeom.data_shape`. In the case of only spatial dimensions, the shape is just (1,1), one single spatial bin. If the associated `~RegionGeom` has a non-spatial axis with N bins, the data shape is then (N, 1, 1), and similarly for additional non-spatial axes. Visualizing a RegionNDMap ------------------------- Visualizing a `~RegionNDMap` can be done in two different ways. One is to plot a sky map that contains the region, indicating the area of the sky encompassed by the spatial component of the region map. This is done via `~RegionNDMap.plot_region()`. Another option is to plot the contents of the region map, which would be either a single value for the case of only spatial axes, or a function of the non-spatial axis bins.
This is done by `~RegionNDMap.plot()` and `~RegionNDMap.plot_hist()`. Plotting the underlying region ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ This is equivalent to the `~RegionGeom.plot_region()` described above, and, in fact, the `~RegionNDMap` method simply calls it on the associated region geometry, `~RegionNDMap.geom`. Consequently, the use of this method is already described by the section above. .. plot:: :include-source: from gammapy.maps import RegionNDMap region_map = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)") region_map.plot_region() Plotting the map content ^^^^^^^^^^^^^^^^^^^^^^^^ This is only possible if the region map has a non-spatial axis. .. plot:: :include-source: from gammapy.maps import MapAxis, RegionNDMap import numpy as np # Create a RegionNDMap with 12 energy bins energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') region_map = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis]) # Fill the data along the energy axis and plot region_map.data = np.logspace(-2,3,12) region_map.plot() Similarly, the map contents can also be plotted as a histogram: .. plot:: :include-source: from gammapy.maps import MapAxis, RegionNDMap import numpy as np # Create a RegionNDMap with 12 energy bins energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') region_map = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis]) # Fill the data along the energy axis and plot region_map.data = np.logspace(-2,3,12) region_map.plot_hist() Writing and reading a RegionNDMap to/from a FITS file ----------------------------------------------------- Region maps can be written to and read from a FITS file with the `~RegionNDMap.write()` and `~RegionNDMap.read()` methods. Currently the following formats are supported: - "gadf": a generic serialisation format with support for ND axes - "ogip" / "ogip-sherpa": an ogip-like counts spectrum with reconstructed energy axis - "ogip-arf" / "ogip-arf-sherpa": an ogip-like effective area with true energy axis The "sherpa" variants are equivalent, except energies are stored in "keV" and areas in "cm2". For data with an `energy` axis, i.e. reconstructed energy, the formats `ogip` and `ogip-sherpa` store the data along with the `REGION` and `EBOUNDS` HDUs. .. testcode:: from gammapy.maps import RegionNDMap,MapAxis energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy', unit='GeV') m = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis]) m.write("file.fits", overwrite=True, format="ogip") m = RegionNDMap.read("file.fits", format="ogip") For data with an `energy_true` axis, i.e. true energy, the formats `ogip-arf` and `ogip-arf-sherpa` store the data in true energy, with the definition of the energy bins. The region information is however lost. .. testcode:: from gammapy.maps import RegionNDMap,MapAxis energy_axis = MapAxis.from_bounds(100., 1e5, 12, interp='log', name='energy_true', unit='GeV') m = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis]) m.write("file.fits", overwrite=True, format="ogip-arf") m = RegionNDMap.read("file.fits", format="ogip-arf") Again the "sherpa" format is equivalent, except energies are stored in "keV" and areas in "cm2". The "gadf" format also allows one to serialise a region map with arbitrary extra axes: ..
testcode:: from gammapy.maps import RegionNDMap, MapAxis energy_axis = MapAxis.from_energy_bounds("1 TeV", "100 TeV", nbin=12) time_axis = MapAxis.from_bounds(0., 12, nbin=12, interp='lin', unit='h', name='time') m = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis, time_axis]) m.write("file.fits", overwrite=True, format="gadf") m = RegionNDMap.read("file.fits", format="gadf") .. minigallery:: gammapy.maps.RegionNDMap :add-heading:././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/modeling.rst0000644000175100001770000000520714721316200017746 0ustar00runnerdocker.. _modeling: Modeling and Fitting (DL4 to DL5) ================================= `gammapy.modeling` contains all the functionality related to modeling and fitting data. This includes spectral, spatial and temporal model classes, as well as the fit and parameter API. Assuming you have prepared your gamma-ray data as a set of `~gammapy.datasets.Dataset` objects, and stored one or more datasets in a `~gammapy.datasets.Datasets` container, you are all set for modeling and fitting. Either via a YAML config file, or via Python code, define the `~gammapy.modeling.models.Models` to use, which is a list of `~gammapy.modeling.models.SkyModel` objects representing additive emission components, usually sources or diffuse emission, although a single source can also be modeled by multiple components if you want. The `~gammapy.modeling.models.SkyModel` is a factorised model with a `~gammapy.modeling.models.SpectralModel` component and a `~gammapy.modeling.models.SpatialModel` component. Most commonly used models in gamma-ray astronomy are built-in, see the :ref:`model-gallery`. It is easy to create user-defined models and datasets, Gammapy is very flexible. The `~gammapy.modeling.Fit` class provides methods to fit, i.e. optimise parameters and estimate parameter errors and correlations. It interfaces with a `~gammapy.datasets.Datasets` object, which in turn is connected to a `~gammapy.modeling.models.Models` object, which has a `~gammapy.modeling.Parameters` object, which contains the model parameters. Currently ``iminuit`` is used as modeling and fitting backend; in the future we plan to support other optimiser and error estimation methods, e.g. from ``scipy``. Models can be unique for a given dataset, or contribute to multiple datasets and thus provide links, allowing e.g. a joint fit to multiple IACT datasets, or to a joint IACT and Fermi-LAT dataset. Many examples are given in the tutorials. Built-in models --------------- Gammapy provides a large choice of spatial, spectral and temporal models. You may check out the whole list of built-in models in the :ref:`model-gallery`. Custom models --------------- Gammapy provides an easy interface to :ref:`custom-model`. Using gammapy.modeling ---------------------- .. minigallery:: ../examples/tutorials/api/fitting.py ../examples/tutorials/api/models.py ../examples/tutorials/api/model_management.py ../examples/tutorials/analysis-1d/spectral_analysis.py ../examples/tutorials/analysis-3d/analysis_3d.py ../examples/tutorials/analysis-3d/analysis_mwl.py ../examples/tutorials/analysis-1d/sed_fitting.py ../examples/tutorials/api/priors.py .. include:: ../references.txt ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/package.rst0000644000175100001770000000734614721316200017551 0ustar00runnerdocker.. include:: ../references.txt ..
_package_structure: Gammapy analysis workflow and package structure =============================================== .. toctree:: :hidden: :includehidden: dl3 irf/index makers/index maps/index datasets/index modeling estimators hli scripts/index catalog astro/index stats/index visualization/index utils Analysis workflow ----------------- :ref:`Fig. 1 <data_flow>` illustrates the standard analysis flow and the corresponding sub-package structure of Gammapy. Gammapy can typically be used with the configuration-based high level analysis API or as a standard Python library by importing the functionality from sub-packages. The different data levels and data reduction steps and how they map to the Gammapy API are explained in more detail in the following. .. _data_flow: .. figure:: ../_static/data-flow-gammapy.png :width: 100% Fig. 1 Data flow and sub-package structure of Gammapy. The folder icons represent the corresponding sub-packages. The direction of the data flow is illustrated with shaded arrows. The top section shows the data levels as defined by `CTAO`_. Analysis steps -------------- :ref:`data` The analysis of gamma-ray data with Gammapy starts at the "data level 3". At this level the data is stored as lists of gamma-like events and the corresponding instrument response functions (IRFs). :ref:`irf` Gammapy supports various forms of instrument response functions (IRFs), which are represented as multidimensional data classes. :ref:`makers` The events and instrument response are projected and binned onto the selected geometry. To limit uncertainties, additional background estimation methods are applied and a "safe" data analysis range is determined. :ref:`maps` Gammapy represents data on multi-dimensional maps which are defined with a geometry representing spatial and spectral coordinates. The former can be a spherical map projection system or a simple sky region. :ref:`datasets` The dataset classes bundle reduced data in the form of maps, reduced IRFs, models and fit statistics, and allow likelihood fitting to be performed. Different classes support different analysis methods and fit statistics. The datasets are used to perform joint-likelihood fitting, allowing different measurements to be combined. :ref:`modeling` Gammapy provides a uniform interface to multiple fitting backends to fit the model parameters of the datasets on the reduced data with maximum likelihood techniques. :ref:`estimators` In addition to the global modelling and fitting, Gammapy provides utility classes to compute and store flux points, light curves and flux as well as significance maps in energy bands. Configurable analysis --------------------- :ref:`analysis` To define and execute a full data analysis process from a YAML configuration file, Gammapy implements a high level analysis interface; a minimal sketch is shown at the end of this page. It exposes a subset of the functionality that is available in the sub-packages to support standard analysis use cases in a convenient way. :ref:`CLI` A minimal command line interface (CLI) is provided for commonly used and easy-to-configure analysis tasks. Additional utilities -------------------- :ref:`catalog` Access to a variety of GeV-TeV gamma-ray catalogs. :ref:`stats` Statistical estimators, fit statistics and algorithms commonly used in gamma-ray astronomy. :ref:`astro` Support for simulation of TeV source populations and dark matter models. :ref:`visualization` Helper functions and classes to create publication-quality images. :ref:`utils` Utility functions that are used in many places or don’t fit in one of the other packages.
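As a minimal sketch of the configurable workflow described above (``"config.yaml"`` and ``"models.yaml"`` are hypothetical files defining the observation selection, geometry and model):

.. code-block:: python

    from gammapy.analysis import Analysis, AnalysisConfig

    config = AnalysisConfig.read("config.yaml")  # hypothetical YAML config
    analysis = Analysis(config)

    analysis.get_observations()        # select DL3 data
    analysis.get_datasets()            # DL3 -> DL4 data reduction
    analysis.read_models("models.yaml")  # hypothetical model definitions
    analysis.run_fit()                 # DL4 -> DL5 modeling and fitting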
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/references.rst0000644000175100001770000002376114721316200020276 0ustar00runnerdocker.. include:: ../references.txt .. _references: Glossary and references ======================= .. _glossary: Glossary -------- .. glossary:: 1D Analysis 1D analysis or spectral analysis where data are reduced to a simple 1D geometry along the reconstructed energy axis. In Cherenkov astronomy, this is classically performed with an OFF background measurement, see WStat. 3D Analysis 3D analysis or cube analysis, where data are reduced to a 3D cube with spatial coordinates and energy axes [Stewart2009]_. In Gammapy, these cubes are represented by `Map` objects (see :ref:`maps`) and contained in a `MapDataset` object. Aeff Short for "effective area": it is the IRF representing the detector collection area. See :ref:`irf` and :ref:`irf-aeff`. Bkg Short for "background": it is the IRF representing the residual background rate per solid angle. See :ref:`irf` and :ref:`irf-bkg`. Cash The Cash statistic is a Poisson fit statistic usually used when signal and background can be modeled. It is defined as :math:`-2 \times \log(L)`. See :ref:`cash` in :ref:`fit statistics <fit-statistics>`. Dataset In Gammapy a dataset bundles the data, IRFs, model and a likelihood function. Based on the model and IRFs, the predicted number of counts is computed and compared to the measured counts using the likelihood. See :ref:`datasets` DL3 Short for "data level 3": it is used to mention or describe science-ready data, i.e. the event list (with basically a time, an arrival direction and an energy) and the four IRFs (Aeff, EDisp, PSF, Bkg). See :ref:`data flow <data_flow>`. DL4 Short for "data level 4": it is used to mention or describe binned-science data, i.e. N-dim maps (multi-dimensional histograms) of the spatial, temporal and/or spectral components of the DL3 data in instrumental units (e.g. counts). See :ref:`maps` and :ref:`data flow <data_flow>`. DL5 Short for "data level 5": it is used to mention or describe advanced-science data, i.e. N-dim maps of the spatial, temporal and/or spectral astrophysical quantities in physical units (e.g. cm-2 s-1). See :ref:`data flow <data_flow>`. DL6 Short for "data level 6": it is used to mention or describe catalogs of sources, with their spectral and spatial properties. See :ref:`data flow <data_flow>`. EDisp Short for "energy dispersion": it is the IRF that represents the probability of measuring a given reconstructed energy as a function of the true photon energy. See :ref:`irf` and :ref:`irf-edisp`. Estimator Gammapy utility classes to compute astrophysical quantities based on a model. They are used for the computation of DL5 products from the DL4 ones. See :ref:`modeling` and the :ref:`data flow <data_flow>`. FoV Short for "field of view": it indicates the angular aperture (sometimes also the solid angle) on the sky that is visible by the instrument with a single pointing. FoV Background Background estimation method typically used for the 3D analysis. See :ref:`makers`. HLI Short for "high level interface": this API aims to realize most of the analysis types using YAML configuration files. See :ref:`analysis` GADF Short for "Gamma Astro Data Format". The `open initiative GADF <https://gamma-astro-data-formats.readthedocs.io/>`_ provides a Data Format for gamma-ray data that is currently used by many IACT experiments and by HAWC. Gammapy I/O functions are compliant with this format. GTI Short for "good time interval": it indicates a continuous time interval of data acquisition. 
In the GADF DL3 format, it also represents a time interval in which the IRFs are supposed to be constant. IRF Short for "instrument response function": they are used to model the probability to detect a photon with a number of measured characteristics. See :ref:`irf`. Joint Analysis A joint fit across multiple datasets implies that each dataset is handled independently during the data reduction stage, and the statistics combined during the likelihood fit. The likelihood is computed for each dataset and summed to get the total fit statistic. See :ref:`joint` Maker Gammapy utility classes performing data reduction of the DL3 data to binned datasets (DL4). See :ref:`makers` and the :ref:`data flow `. MCMC Short for "Markov chain Monte Carlo". See the `following recipe `_. MET Short for "mission elapsed time". See also :ref:`MET_definition` in :ref:`time_handling`. PSF Short for "point spread function": it is the IRF representing the probability density of the angular separation between true and reconstructed directions. See :ref:`irf` and :ref:`irf-psf`. Reco Energy The reconstructed (or measured) energy (often written `e_reco`) is the energy of the measured photon by contrast with its actual true energy. Measured quantities such as counts are represented along a reco energy axis. Reflected Background Background estimation method typically used for spectral analysis. See :ref:`makers`. Ring Background Background estimation method typically used for image analysis. See :ref:`makers`. RoI Short for "region of interest": it indicates the spatial region in which the data are analyzed. In practice, at each energy it corresponds with the sky region in which the dataset mask is True. SED Short for "spectral energy distribution". For a spectral model or flux points object, the type of plot (e.g. :math:`dN/dE`, :math:`E^2\ dN/dE`) is typically adjusted through the `sed_type` quantity. See :ref:`sedtypes` for a list of options. .. STI Short for "stable time interval": it indicates a continuous time interval of data acquisition for which the instrument response files are supposed to be constant. Stacked Analysis In a stacked analysis individual observations are reduced to datasets which are then stacked to produce a single reduced dataset. The latter is then used to obtain physical information through model fitting. Some approximations must be made to perform dataset stacking (e.g. loss of individual background normalization, averaging of instrument responses, loss of information outside region of interest etc), but this can reduce very significantly the computing and memory cost. For details, see :ref:`stack` True Energy The true energy (often written `e_true`) is the energy of the incident photon by contrast with the energy reconstructed by the instrument. Instrument response functions are represented along a true energy axis. TS Short for "test statistics". See :ref:`ts` and :ref:`fit-statistics`. WStat The WStat is a Poisson fit statistic usually used for ON-OFF analysis. It is based on the profile likelihood method where the unknown background parameters are marginalized. See :ref:`wstat` in :ref:`fit statistics `. .. _publications: References ---------- This is the bibliography containing the literature references for the implemented methods referenced from the Gammapy docs. .. [Abdalla2018] `H.E.S.S. Collaboration (2018) `_, "The H.E.S.S. Galactic plane survey" .. [Albert2007] `Albert et al. (2007) `_, "Unfolding of differential energy spectra in the MAGIC experiment", .. 
[Berge2007] `Berge et al. (2007) `_, "Background modelling in very-high-energy gamma-ray astronomy" .. [Cash1979] `Cash (1979) `_, "Parameter estimation in astronomy through application of the likelihood ratio" .. [Cousins2007] `Cousins et al. (2007) `_, "Evaluation of three methods for calculating statistical significance when incorporating a systematic uncertainty into a test of the background-only hypothesis for a Poisson process" .. [Feldman1998] `Feldman & Cousins (1998) `_, "Unified approach to the classical statistical analysis of small signals" .. [Lafferty1994] `Lafferty & Wyatt (1994) `_, "Where to stick your data points: The treatment of measurements within wide bins" .. [LiMa1983] `Li & Ma (1983) `_, "Analysis methods for results in gamma-ray astronomy" .. [Meyer2010] `Meyer et al. (2010) `_, "The Crab Nebula as a standard candle in very high-energy astrophysics" .. [Mohrmann2019] `Mohrmann et al. (2019) `_, "Validation of open-source science tools and background model construction in γ-ray astronomy" .. [Naurois2012] `de Naurois (2012) `_, "Very High Energy astronomy from H.E.S.S. to CTA. Opening of a new astronomical window on the non-thermal Universe", .. [Piron2001] `Piron et al. (2001) `_, "Temporal and spectral gamma-ray properties of Mkn 421 above 250 GeV from CAT observations between 1996 and 2000", .. [Rolke2005] `Rolke et al. (2005) `_, "Limits and confidence intervals in the presence of nuisance parameters", .. [Stewart2009] `Stewart (2009) `_, "Maximum-likelihood detection of sources among Poissonian noise" ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.160642 gammapy-1.3/docs/user-guide/scripts/0000755000175100001770000000000014721316215017107 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/scripts/index.rst0000644000175100001770000003061314721316200020745 0ustar00runnerdocker.. include:: ../../references.txt .. _CLI: Command line tools ================== .. warning:: The Gammapy command line interface (CLI) described here is experimental and only supports a small subset of the functionality available via the Gammapy Python package. Currently, Gammapy is first and foremost a Python package. This means that to use it you have to write a Python script or Jupyter notebook, where you import the functions and classes needed for a given analysis, and then call them, passing parameters to configure the analysis. We also have a :ref:`analysis` that provides high level Python functions for the most common analysis needs. That said, for some very commonly used and easy to configure analysis tasks we have implemented a **command line interface (CLI)**. It is automatically installed together with the Gammapy Python package. Execution --------- To execute the Gammapy CLI, type the command ``gammapy`` at your terminal shell (not in Python): .. code-block:: bash gammapy --help or equivalently, just type this: .. code-block:: bash gammapy Either way, the command should print some help text to the console and then exit: .. code-block:: text Usage: gammapy [OPTIONS] COMMAND [ARGS]... Gammapy command line interface (CLI). Gammapy is a Python package for gamma-ray astronomy. Use ``--help`` to see available sub-commands, as well as the available arguments and options for each sub-command. 
For further information, see https://gammapy.org/ and https://docs.gammapy.org/ Examples -------- gammapy --help gammapy --version gammapy info --help gammapy info Options: --log-level [debug|info|warning|error] Logging verbosity level. --ignore-warnings Ignore warnings? --version Print version and exit. -h, --help Show this message and exit. Commands: analysis Automation of configuration driven data reduction process. check Run checks for Gammapy download Download datasets and notebooks info Display information about Gammapy All CLI functionality for Gammapy is implemented as sub-commands of the main ``gammapy`` command. If a command has sub-commands, they are listed in the help output. E.g. the help output from ``gammapy`` above shows that there is a sub-command called ``gammapy analysis``. Actually, ``gammapy analysis`` itself isn't a command that does something, but another command group that is used to group sub-commands. So now you know how the Gammapy CLI is structured and how to discover all available sub-commands, arguments and options. Running config driven data reduction ------------------------------------ Here's the main usage of the Gammapy CLI for data processing: use the ``gammapy analysis`` command to first create a configuration file with default values and then perform a simple automated data reduction process (i.e. fetching observations from a datastore and producing the reduced datasets). .. code-block:: bash gammapy analysis --help Usage: gammapy analysis [OPTIONS] COMMAND [ARGS]... Automation of configuration driven data reduction process. Examples -------- gammapy analysis config gammapy analysis run gammapy analysis config --overwrite gammapy analysis config --filename myconfig.yaml gammapy analysis run --filename myconfig.yaml Options: -h, --help Show this message and exit. Commands: config Writes default configuration file. run Performs automated data reduction process. gammapy analysis config INFO:gammapy.scripts.analysis:Configuration file produced: config.yaml You can manually edit the produced configuration file and then run the data reduction process: .. code-block:: bash gammapy analysis run INFO:gammapy.analysis.config:Setting logging config: {'level': 'INFO', 'filename': None, 'filemode': None, 'format': None, 'datefmt': None} INFO:gammapy.analysis.core:Fetching observations. INFO:gammapy.analysis.core:Number of selected observations: 4 INFO:gammapy.analysis.core:Reducing spectrum datasets. INFO:gammapy.analysis.core:Processing observation 23592 INFO:gammapy.analysis.core:Processing observation 23523 INFO:gammapy.analysis.core:Processing observation 23526 INFO:gammapy.analysis.core:Processing observation 23559 Datasets stored in datasets folder. .. _CLI_write: Write your own CLI ================== This section explains how to write your own command line interface (CLI). We will focus on the command line aspect, and use a very simple example where we just call `gammapy.stats.CashCountsStatistic.sqrt_ts`. From the interactive Python or IPython prompt or from a Jupyter notebook you just import the functionality you need and call it, like this: >>> from gammapy.stats import CashCountsStatistic >>> CashCountsStatistic(n_on=10, mu_bkg=4.2).sqrt_ts np.float64(2.397918129147546) If you imagine that the actual computation involves many lines of code (and not just a one-line function call), and that you need to do this computation frequently, you will probably write a Python script that looks something like this: .. 
testcode:: # Compute significance for a Poisson count observation from gammapy.stats import CashCountsStatistic n_observed = 10 mu_background = 4.2 s = CashCountsStatistic(n_observed, mu_background).sqrt_ts print(f"{s:.4f}") .. testoutput:: 2.3979 We have introduced variables that hold the parameters for the analysis and put them before the computation. Let's say this script is in a file called ``significance.py``; then to use it you set the parameters you like and execute it via: .. code-block:: bash python significance.py If you want, you can also put the line ``#!/usr/bin/env python`` at the top of the script, make it executable via ``chmod +x significance.py`` and then you'll be able to execute it via ``./significance.py`` if you prefer to execute it like this. This works on Linux and Mac OS, but not on Windows. It is also possible to omit the ``.py`` extension from the filename, i.e. to simply call the file ``significance``. Either way has some advantages and disadvantages; it's a matter of taste. Omitting the ``.py`` is nice because users calling the tool usually don't care that it's a Python script, and it's shorter. But omitting the ``.py`` also means that some advanced users who open up the file in an editor have a harder time (because the editor might not recognise it as a Python file and syntax-highlight it appropriately), or more importantly that importing functions or classes from that script from other Python files or Jupyter notebooks is not easily possible, leading some people to rename it or copy & paste from it. We're explaining these details, because if you work with colleagues and share scripts, you'll encounter the ``#!/usr/bin/env python`` and scripts with and without ``.py`` and will need to know how to work with them. Writing and using such scripts is perfectly fine and a common way to run science analyses. However, if you use it very frequently it might become annoying to have to open up and edit the ``significance.py`` file every time to use it. In that case, you can change your script into a command line interface that allows you to set analysis parameters without having to edit the file, like this: .. code-block:: bash python significance.py --help Usage: significance.py [OPTIONS] N_OBSERVED MU_BACKGROUND Compute significance for a Poisson count observation. The significance is the tail probability to observe N_OBSERVED counts or more, given a known background level MU_BACKGROUND. Options: --value [sqrt_ts|p_value] Square root TS or p_value --help Show this message and exit. python significance.py 10 4.2 2.39791813 python significance.py 10 4.2 --value p_value 0.01648855015875024 In Python, there are several ways to do command line argument parsing and to create command line interfaces. Of course you're free to do whatever you like, but if you're not sure what to use to build your own CLIs, we suggest you give `click`_ a try. Here is how you'd rewrite your ``significance.py`` as a click CLI: .. literalinclude:: significance.py We use `click`_ in Gammapy itself. We also use `click`_ frequently for our own projects if we choose to add a CLI (no matter if Gammapy is used or not). Putting the CLI in a file called ``make.py`` makes it easy to go back to a project after a while and to remember or quickly figure out again how it works (as opposed to just having a bunch of Python scripts or Jupyter notebooks where it's harder to remember where to edit parameters and which ones to run in which order). One example is the `gamma-cat make.py`_. .. 
_gamma-cat make.py: https://github.com/gammapy/gamma-cat/blob/master/make.py .. _gamma-sky.net make.py: https://github.com/gammapy/gamma-sky/blob/master/make.py If you find that you don't like `click`_, another popular alternative to create CLIs is `argparse`_ from the Python standard library. To learn argparse, either read the official documentation, or the `PYMOTW argparse`_ tutorial. For basic use cases ``argparse`` is similar to ``click``, the main difference being that ``click`` uses decorators (``@command``, ``@argument``, ``@option``) attached to a callback function to execute, whereas ``argparse`` uses classes and method calls to create a parser object, and then you have to call ``parse_args`` yourself and also pass the ``args`` to the code or function to execute yourself. So for basic use cases, but also for more advanced use cases where you define a CLI with sub-commands, ``argparse`` can be used just as well; it's just a little harder to learn and use than ``click`` (of course that's a matter of opinion). A short ``argparse`` sketch is shown at the end of this page. Another advantage of choosing Click is that once you've learned it, you'll be able to quickly read and understand, or even contribute to the Gammapy CLI. .. _argparse: https://docs.python.org/3/library/argparse.html .. _PYMOTW argparse: https://pymotw.com/3/argparse/index.html Troubleshooting =============== Command not found ----------------- Usually tools that install Gammapy (e.g. setuptools via ``python setup.py install`` or ``pip`` or package managers like ``conda``) will put the ``gammapy`` command line tool in a directory that is on your ``PATH``, and if you type ``gammapy`` the command is found and executed. However, due to the large number of supported systems (Linux, Mac OS, Windows) and different ways to install Python packages like Gammapy (e.g. system install, user install, virtual environments, conda environments) and environments to launch command line tools like ``gammapy`` (e.g. bash, csh, Windows command prompt, Jupyter, ...) it is not unheard of that users have trouble running ``gammapy`` after installing it. This usually looks like this: .. code-block:: bash gammapy -bash: gammapy: command not found If you just installed Gammapy, search the install log for the message "Installing gammapy script" to see where ``gammapy`` was installed, and check that this location is on your PATH: .. code-block:: bash echo $PATH If you don't manage to figure out where the ``gammapy`` command line tool is installed, you can try calling it like this instead: .. code-block:: bash python -m gammapy This also has the advantage that it avoids issues where users have multiple versions of Python and Gammapy installed and accidentally launch one they don't want because it comes first on their ``PATH``. For the same reason these days the recommended way to use e.g. ``pip`` is via ``python -m pip``. If this still doesn't work, check if you are using the right Python and have Gammapy installed: .. code-block:: bash which python python -c 'import gammapy' To see more information about your shell environment, these commands might be helpful: .. code-block:: bash python -m site python -m gammapy info echo $PATH conda info -a # if you're using conda If you're still stuck or have any questions, feel free to ask for help with installation issues on the Gammapy mailing list or Slack any time! Reference ========= You may find the auto-generated documentation for all available sub-commands, arguments and options of the ``gammapy`` command line interface (CLI) in the :ref:`API ref docs `. 
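For illustration, here is a sketch of how the ``significance.py`` tool from this page could look when written with ``argparse`` instead of `click`_. This variant is not part of Gammapy; it only shows the parser-object pattern described above.

.. code-block:: python

    """Example how to write a command line tool with argparse (sketch)."""
    import argparse

    from gammapy.stats import CashCountsStatistic

    parser = argparse.ArgumentParser(
        description="Compute significance for a Poisson count observation."
    )
    parser.add_argument("n_observed", type=float)
    parser.add_argument("mu_background", type=float)
    parser.add_argument(
        "--value",
        choices=["sqrt_ts", "p_value"],
        default="sqrt_ts",
        help="Significance or p_value",
    )
    args = parser.parse_args()

    # Look up the requested statistic on the counts-statistic object
    stat = CashCountsStatistic(args.n_observed, args.mu_background)
    print(getattr(stat, args.value))

Note that you call ``parse_args`` yourself and pass the parsed values on yourself, which is exactly the difference to the decorator-based `click`_ version in ``significance.py``.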
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/scripts/significance.py0000644000175100001770000000167214721316200022103 0ustar00runnerdocker"""Example how to write a command line tool with Click""" import click from gammapy.stats import CashCountsStatistic # You can call the callback function for the click command anything you like. # `cli` is just a commonly used generic term for "command line interface". @click.command() @click.argument("n_observed", type=float) @click.argument("mu_background", type=float) @click.option( "--value", type=click.Choice(["sqrt_ts", "p_value"]), default="sqrt_ts", help="Significance or p_value", ) def cli(n_observed, mu_background, value): """Compute significance for a Poisson count observation. The significance is the tail probability to observe N_OBSERVED counts or more, given a known background level MU_BACKGROUND.""" stat = CashCountsStatistic(n_observed, mu_background) if value == "sqrt_ts": s = stat.sqrt_ts else: s = stat.p_value print(s) if __name__ == "__main__": cli() ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.160642 gammapy-1.3/docs/user-guide/stats/0000755000175100001770000000000014721316215016556 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/stats/fit_statistics.rst0000644000175100001770000001334214721316200022341 0ustar00runnerdocker.. include:: ../../references.txt .. _fit-statistics: Fit statistics ============== Introduction ------------ This page describes the fit statistics used in gammapy. These fit statistics are used by datasets to perform model fitting and parameter estimation. Fit statistics in gammapy are all log-likelihood functions normalized like chi-squares, i.e. if :math:`L` is the likelihood function used, they follow the expression :math:`-2 \times \log{L}`. All functions compute per-bin statistics. If you want the summed statistics for all bins, call sum on the output array yourself. .. _cash: Cash : Poisson data with background model ----------------------------------------- The number of counts, :math:`n`, is a Poisson random variable of mean value :math:`\mu_{\mathrm{sig}} + \mu_{\mathrm{bkg}}`. The former is the expected number of counts from the source (the signal); the latter is the expected number of background counts, which is assumed to be known. We can write down the likelihood :math:`L` and, applying the expression above, we obtain the following formula for the Cash fit statistic: .. math:: C = 2 \times \left(\mu_{\mathrm{sig}} + \mu_{\mathrm{bkg}} - n \times \log{(\mu_{\mathrm{sig}} + \mu_{\mathrm{bkg}})} \right) The Cash statistic is implemented in `~gammapy.stats.cash` and is used as a `stat` function by the `~gammapy.datasets.MapDataset` and the `~gammapy.datasets.SpectrumDataset`. Example ^^^^^^^ Here's an example for the `~gammapy.stats.cash` statistic:: >>> from gammapy.stats import cash >>> data = [3, 5, 9] >>> model = [3.3, 6.8, 9.2] >>> cash(data, model) array([ -0.56353481, -5.56922612, -21.54566271]) >>> total_stat = cash(data, model).sum() >>> print(f"{total_stat:.4f}") -27.6784 .. _wstat: WStat : Poisson data with background measurement ------------------------------------------------ In the absence of a reliable background model, it is possible to use a second measurement containing only background to estimate it. 
In the OFF region, which contains background only, the number of counts :math:`n_{\mathrm{off}}` is a Poisson random variable of mean value :math:`\mu_{\mathrm{bkg}}/\alpha`. In the ON region, which contains both signal and background contributions, the number of counts, :math:`n_{\mathrm{on}}`, is a Poisson random variable of mean value :math:`\mu_{\mathrm{sig}} + \mu_{\mathrm{bkg}}`, where :math:`\alpha` is the ratio of the ON and OFF region acceptances, and :math:`\mu_{\mathrm{bkg}}` the mean background counts in the ON region. It is possible to define a likelihood function and marginalize it over the unknown :math:`\mu_{\mathrm{bkg}}` to obtain :math:`\mu_{\mathrm{sig}}`. This yields the so-called WStat or ON-OFF statistic, which is traditionally used for ON-OFF measurements in ground based gamma-ray astronomy. The WStat fit statistic is given by the following formula: .. math:: W = 2 \big(\mu_{\mathrm{sig}} + (1 + 1/\alpha)\mu_{\mathrm{bkg}} - n_{\mathrm{on}} \log{(\mu_{\mathrm{sig}} + \mu_{\mathrm{bkg}})} - n_{\mathrm{off}} \log{(\mu_{\mathrm{bkg}}/\alpha)}\big) To see how to derive it see the :ref:`wstat derivation <wstat_derivation>`. The WStat statistic is implemented in `~gammapy.stats.wstat` and is used as a `stat` function by the `~gammapy.datasets.MapDatasetOnOff` and the `~gammapy.datasets.SpectrumDatasetOnOff`. Caveat ^^^^^^ - Since WStat takes into account background estimation uncertainties and makes no assumption such as a background model, it usually gives larger statistical uncertainties on the fitted parameters. If a background model exists, to properly compare with parameters estimated using the Cash statistic, one should include some systematic uncertainty on the background model. - Note also that at very low counts, WStat is known to result in biased estimates. This can be an issue when studying the high energy behaviour of faint sources. When performing spectral fits with WStat, it is recommended to randomize observations and check whether the resulting fitted parameter distributions are consistent with the input values. Example ^^^^^^^ The following table gives an overview of the values WStat takes in different scenarios: >>> from gammapy.stats import wstat >>> from astropy.table import Table >>> table = Table() >>> table['mu_sig'] = [0.1, 0.1, 1.4, 0.2, 0.1, 5.2, 6.2, 4.1, 6.4, 4.9, 10.2, ... 16.9, 102.5] >>> table['n_on'] = [0, 0, 0, 0, 0, 5, 5, 5, 5, 5, 10, 20, 100] >>> table['n_off'] = [0, 1, 1, 10 , 10, 0, 5, 5, 20, 40, 2, 70, 10] >>> table['alpha'] = [0.01, 0.01, 0.5, 0.1 , 0.2, 0.2, 0.2, 0.01, 0.4, 0.4, ... 0.2, 0.1, 0.6] >>> table['wstat'] = wstat(n_on=table['n_on'], ... n_off=table['n_off'], ... alpha=table['alpha'], ... mu_sig=table['mu_sig']) >>> table['wstat'].format = '.3f' >>> table.pprint() mu_sig n_on n_off alpha wstat ------ ---- ----- ----- ------ 0.1 0 0 0.01 0.200 0.1 0 1 0.01 0.220 1.4 0 1 0.5 3.611 0.2 0 10 0.1 2.306 0.1 0 10 0.2 3.846 5.2 5 0 0.2 0.008 6.2 5 5 0.2 0.736 4.1 5 5 0.01 0.163 6.4 5 20 0.4 7.125 4.9 5 40 0.4 14.578 10.2 10 2 0.2 0.034 16.9 20 70 0.1 0.656 102.5 100 10 0.6 0.663 Notes ^^^^^ All above formulae are equivalent to what is given on the `XSpec manual statistics page`_ with the substitutions: .. 
math:: \mu_{\mathrm{sig}} = t_s \cdot m_i \\ \mu_{\mathrm{bkg}} = t_b \cdot m_b \\ \alpha = t_s / t_b Further references ------------------ * `Sherpa statistics page`_ * `XSpec manual statistics page`_ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/stats/index.rst0000644000175100001770000002647214721316200020424 0ustar00runnerdocker.. _stats: Statistical utility functions ============================= `gammapy.stats` holds statistical estimators, fit statistics and algorithms commonly used in gamma-ray astronomy. It is mostly concerned with the evaluation of one or several observations that count events in a given region and time window, i.e. with Poisson-distributed counts measurements. .. _stats_notation: Notations --------- For statistical analysis we use the following variable names, mostly following the notation in [LiMa1983]_. For the `~gammapy.datasets.MapDataset` attributes we chose more verbose equivalents: ================= ====================== ==================================================== Variable Dataset attribute name Definition ================= ====================== ==================================================== ``n_on`` ``counts`` Observed counts ``n_off`` ``counts_off`` Observed counts in the off region ``n_bkg`` ``background`` Estimated background counts, defined as ``alpha * n_off`` ``n_sig`` ``excess`` Estimated signal counts defined as ``n_on`` - ``n_bkg`` ``mu_on`` ``npred`` Predicted counts ``mu_off`` ``npred_off`` Predicted counts in the off region ``mu_bkg`` ``npred_background`` Predicted background counts in the on region ``mu_sig`` ``npred_signal`` Predicted signal counts ``a_on`` ``acceptance`` Relative background exposure ``a_off`` ``acceptance_off`` Relative background exposure in the off region ``alpha`` ``alpha`` Background efficiency ratio ``a_on`` / ``a_off`` ================= ====================== ==================================================== The on measurement, assumed to contain signal and background counts, :math:`n_{on}`, is a Poisson random variable with expected value :math:`\mu_{on} = \mu_{sig} + \mu_{bkg}`. The off measurement is assumed to contain only background counts, with an acceptance to background :math:`a_{off}`. This off measurement can be used to estimate the number of background counts in the on region: :math:`n_{bkg} = \alpha\ n_{off}` with :math:`\alpha = a_{on}/a_{off}` the ratio of on and off acceptances. Therefore, :math:`n_{off}` follows a Poisson distribution with expected value :math:`\mu_{off} = \mu_{bkg} / \alpha`. The expectation or predicted values :math:`\mu_X` are in general derived using maximum likelihood estimation. Counts and fit statistics ------------------------- Gamma-ray measurements are counts, :math:`n_{on}`, containing both signal and background events. Estimation of the number of signal events or of quantities in physical models is done through Poisson likelihood functions, the fit statistics. In gammapy, they are all log-likelihood functions normalized like chi-squares, i.e. if :math:`L` is the likelihood function used, they follow the expression :math:`-2 \times \log{L}`. When the expected number of background events, :math:`\mu_{bkg}`, is known, the statistic function is ``Cash`` (see :ref:`cash`). When the number of background events is unknown, one has to use a background estimate :math:`n_{bkg}` taken from an off measurement where only background events are expected. 
In this case, the statistic function is ``WStat`` (see :ref:`wstat`). These statistic functions are at the heart of the model fitting approach in gammapy. They are used to estimate the best fit values of model parameters and their associated confidence intervals. They are also used to estimate the excess counts significance, i.e. the probability that a given number of measured events :math:`n_{on}` actually contains some signal events :math:`n_{sig}`, as well as the errors associated with this estimated number of signal counts. .. _ts: Estimating TS ------------- A classical approach to modeling and fitting relies on hypothesis testing. One wants to estimate whether an hypothesis :math:`H_1` is statistically preferred over the reference, or null-hypothesis, :math:`H_0`. The maximum log-likelihood ratio test provides a way to estimate the p-value of the data following :math:`H_1` rather than :math:`H_0`, when the two hypotheses are nested. We denote this ratio :math:`\lambda = \frac{\max L(X|H_1)}{\max L(X|H_0)}`. The Wilks theorem shows that under some hypotheses, :math:`2 \log \lambda` asymptotically follows a :math:`\chi^2` distribution with :math:`n_{dof}` degrees of freedom, where :math:`n_{dof}` is the difference of free parameters between :math:`H_1` and :math:`H_0`. With the definition of the fit statistics above, :math:`-2 \log \lambda` is simply the difference of the fit statistic values for the two hypotheses, the delta TS (short for test statistic). Hence, :math:`\Delta TS` follows a :math:`\chi^2` distribution with :math:`n_{dof}` degrees of freedom. This can be used to convert :math:`\Delta TS` into a "classical significance" using the following recipe: .. code:: from scipy.stats import chi2, norm def sigma_to_ts(sigma, df=1): """Convert sigma to delta ts""" p_value = 2 * norm.sf(sigma) return chi2.isf(p_value, df=df) def ts_to_sigma(ts, df=1): """Convert delta ts to sigma""" p_value = chi2.sf(ts, df=df) return norm.isf(0.5 * p_value) In particular, with only one degree of freedom (e.g. flux amplitude), one can estimate the statistical significance in terms of number of :math:`\sigma` as :math:`\sqrt{\Delta TS}`. In case the excess is negative, which can happen if the background is overestimated, the following convention is used: .. math:: \sqrt{\Delta TS} = \left \{ \begin{array}{ll} -\sqrt{\Delta TS} & : \text{if Excess} < 0 \\ \sqrt{\Delta TS} & : \text{else} \end{array} \right. Counts statistics classes ========================= To estimate the excess counts significance and errors, gammapy uses two classes for Poisson counts with and without known background: `~gammapy.stats.CashCountsStatistic` and `~gammapy.stats.WStatCountsStatistic`. We show below how to use them. Cash counts statistic --------------------- Excess and Significance ~~~~~~~~~~~~~~~~~~~~~~~ Assume one measured :math:`n_{on} = 13` counts in a region where one suspects a source might be present. If the expected number of background events is known (here e.g. :math:`\mu_{bkg}=5.5`), one can use the Cash statistic to estimate the signal or excess number, its statistical significance as well as the confidence interval on the true signal counts number value. .. testcode:: from gammapy.stats import CashCountsStatistic stat = CashCountsStatistic(n_on=13, mu_bkg=5.5) print(f"Excess : {stat.n_sig:.2f}") print(f"Error : {stat.error:.2f}") print(f"TS : {stat.ts:.2f}") print(f"sqrt(TS): {stat.sqrt_ts:.2f}") print(f"p-value : {stat.p_value:.4f}") .. 
testoutput:: Excess : 7.50 Error : 3.61 TS : 7.37 sqrt(TS): 2.71 p-value : 0.0033 The error is the symmetric error obtained from the covariance of the statistic function, here :math:`\sqrt{n_{on}}`. The `sqrt_ts` is the square root of the :math:`TS`, multiplied by the sign of the excess, which is equivalent to the Li & Ma significance for known background. The p-value is now computed taking into account only positive fluctuations. To see how the :math:`TS` relates to the statistic function, we plot below the profile of the Cash statistic as a function of the expected signal events number. .. plot:: user-guide/stats/plot_cash_significance.py Excess errors ~~~~~~~~~~~~~ You can also compute the confidence interval for the true excess based on the statistic value: If you are interested in 68% (1 :math:`\sigma`) and 95% (2 :math:`\sigma`) confidence ranges: .. testcode:: from gammapy.stats import CashCountsStatistic count_statistic = CashCountsStatistic(n_on=13, mu_bkg=5.5) excess = count_statistic.n_sig errn = count_statistic.compute_errn(1.) errp = count_statistic.compute_errp(1.) print(f"68% confidence range: {excess - errn:.3f} < mu < {excess + errp:.3f}") .. testoutput:: 68% confidence range: 4.220 < mu < 11.446 .. testcode:: errn_2sigma = count_statistic.compute_errn(2.) errp_2sigma = count_statistic.compute_errp(2.) print(f"95% confidence range: {excess - errn_2sigma:.3f} < mu < {excess + errp_2sigma:.3f}") .. testoutput:: 95% confidence range: 1.556 < mu < 16.102 The 68% confidence interval (1 :math:`\sigma`) is obtained by finding the expected signal values for which the TS variation is 1. The 95% confidence interval (2 :math:`\sigma`) is obtained by finding the expected signal values for which the TS variation is :math:`2^2 = 4`. On the following plot, we show how the 1 :math:`\sigma` and 2 :math:`\sigma` confidence errors relate to the Cash statistic profile. .. plot:: user-guide/stats/plot_cash_errors.py WStat counts statistic ---------------------- Excess and Significance ~~~~~~~~~~~~~~~~~~~~~~~ To measure the significance of an excess, one can directly use the TS of the measurement with and without the excess. Taking the square root of the result yields the so-called Li & Ma significance [LiMa1983]_ (see equation 17). As an example, assume you measured :math:`n_{on} = 13` counts in a region where you suspect a source might be present and :math:`n_{off} = 11` counts in a background control region where you assume no source is present and that is :math:`a_{off}/a_{on}=2` times larger than the on-region. Here's how you compute the statistical significance of your detection: .. testcode:: from gammapy.stats import WStatCountsStatistic stat = WStatCountsStatistic(n_on=13, n_off=11, alpha=1./2) print(f"Excess : {stat.n_sig:.2f}") print(f"sqrt(TS): {stat.sqrt_ts:.2f}") .. testoutput:: Excess : 7.50 sqrt(TS): 2.09 .. plot:: user-guide/stats/plot_wstat_significance.py Excess errors ~~~~~~~~~~~~~ You can also compute the confidence interval for the true excess based on the statistic value: If you are interested in 68% (1 :math:`\sigma`) and 95% (2 :math:`\sigma`) confidence errors: .. testcode:: from gammapy.stats import WStatCountsStatistic count_statistic = WStatCountsStatistic(n_on=13, n_off=11, alpha=1./2) excess = count_statistic.n_sig errn = count_statistic.compute_errn(1.) errp = count_statistic.compute_errp(1.) print(f"68% confidence range: {excess - errn:.3f} < mu < {excess + errp:.3f}") .. testoutput:: 68% confidence range: 3.750 < mu < 11.736 .. 
testcode:: errn_2sigma = count_statistic.compute_errn(2.) errp_2sigma = count_statistic.compute_errp(2.) print(f"95% confidence range: {excess - errn_2sigma:.3f} < mu < {excess + errp_2sigma:.3f}") .. testoutput:: 95% confidence range: 0.311 < mu < 16.580 As above, the 68% confidence interval (1 :math:`\sigma`) is obtained by finding the expected signal values for which the TS variation is 1. The 95% confidence interval (2 :math:`\sigma`) is obtained by finding the expected signal values for which the TS variation is :math:`2^2 = 4`. On the following plot, we show how the 1 :math:`\sigma` and 2 :math:`\sigma` confidence errors relate to the WStat statistic profile. .. plot:: user-guide/stats/plot_wstat_errors.py These are references describing the available methods: [LiMa1983]_, [Cash1979]_, [Stewart2009]_, [Rolke2005]_, [Feldman1998]_, [Cousins2007]_. .. toctree:: :maxdepth: 1 :hidden: fit_statistics wstat_derivation ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/stats/plot_cash_errors.py0000644000175100001770000000331614721316200022475 0ustar00runnerdocker"""Example plot showing the profile of the Cash statistic and its connection to excess errors.""" import numpy as np import matplotlib.pyplot as plt from gammapy.stats import CashCountsStatistic count_statistic = CashCountsStatistic(n_on=13, mu_bkg=5.5) excess = count_statistic.n_sig errn = count_statistic.compute_errn(1.0) errp = count_statistic.compute_errp(1.0) errn_2sigma = count_statistic.compute_errn(2.0) errp_2sigma = count_statistic.compute_errp(2.0) # We compute the Cash statistic profile mu_signal = np.linspace(-1.5, 25, 100) stat_values = count_statistic._stat_fcn(mu_signal) xmin, xmax = -1.5, 25 ymin, ymax = -42, -28.0 plt.figure(figsize=(5, 5)) plt.plot(mu_signal, stat_values, color="k") plt.xlim(xmin, xmax) plt.ylim(ymin, ymax) plt.xlabel(r"Number of expected signal event, $\mu_{sig}$") plt.ylabel(r"Cash statistic value, TS ") plt.hlines( count_statistic.stat_max + 1, xmin=excess - errn, xmax=excess + errp, linestyle="dotted", color="r", label="1 sigma (68% C.L.)", ) plt.vlines( excess - errn, ymin=ymin, ymax=count_statistic.stat_max + 1, linestyle="dotted", color="r", ) plt.vlines( excess + errp, ymin=ymin, ymax=count_statistic.stat_max + 1, linestyle="dotted", color="r", ) plt.hlines( count_statistic.stat_max + 4, xmin=excess - errn_2sigma, xmax=excess + errp_2sigma, linestyle="dashed", color="b", label="2 sigma (95% C.L.)", ) plt.vlines( excess - errn_2sigma, ymin=ymin, ymax=count_statistic.stat_max + 4, linestyle="dashed", color="b", ) plt.vlines( excess + errp_2sigma, ymin=ymin, ymax=count_statistic.stat_max + 4, linestyle="dashed", color="b", ) plt.legend() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/stats/plot_cash_significance.py0000644000175100001770000000250714721316200023604 0ustar00runnerdocker"""Example plot showing the profile of the Cash statistic and its connection to significance.""" import numpy as np import matplotlib.pyplot as plt from gammapy.stats import CashCountsStatistic count_statistic = CashCountsStatistic(n_on=13, mu_bkg=5.5) excess = count_statistic.n_sig # We compute the Cash statistic profile mu_signal = np.linspace(-1.5, 25, 100) stat_values = count_statistic._stat_fcn(mu_signal) xmin, xmax = -1.5, 25 ymin, ymax = -42, -28.0 plt.figure(figsize=(5, 5)) plt.plot(mu_signal, stat_values, color="k") plt.xlim(xmin, xmax) plt.ylim(ymin, ymax) 
plt.xlabel(r"Number of expected signal event, $\mu_{sig}$") plt.ylabel(r"Cash statistic value, TS ") plt.vlines( excess, ymin=ymin, ymax=count_statistic.stat_max, linestyle="dashed", color="k", label="Best fit", ) plt.hlines( count_statistic.stat_max, xmin=xmin, xmax=excess, linestyle="dashed", color="k" ) plt.hlines( count_statistic.stat_null, xmin=xmin, xmax=0, linestyle="dotted", color="k", label="Null hypothesis", ) plt.vlines(0, ymin=ymin, ymax=count_statistic.stat_null, linestyle="dotted", color="k") plt.vlines( excess, ymin=count_statistic.stat_max, ymax=count_statistic.stat_null, color="r" ) plt.hlines( count_statistic.stat_null, xmin=0, xmax=excess, linestyle="dotted", color="r" ) plt.legend() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/stats/plot_wstat_errors.py0000644000175100001770000000331514721316200022720 0ustar00runnerdocker"""Example plot showing the profile of the WStat statistic and its connection to excess errors.""" import numpy as np import matplotlib.pyplot as plt from gammapy.stats import WStatCountsStatistic count_statistic = WStatCountsStatistic(n_on=13, n_off=11, alpha=0.5) excess = count_statistic.n_sig errn = count_statistic.compute_errn(1.0) errp = count_statistic.compute_errp(1.0) errn_2sigma = count_statistic.compute_errn(2.0) errp_2sigma = count_statistic.compute_errp(2.0) # We compute the WStat statistic profile mu_signal = np.linspace(-2.5, 26, 100) stat_values = count_statistic._stat_fcn(mu_signal) xmin, xmax = -2.5, 26 ymin, ymax = 0, 15 plt.figure(figsize=(5, 5)) plt.plot(mu_signal, stat_values, color="k") plt.xlim(xmin, xmax) plt.ylim(ymin, ymax) plt.xlabel(r"Number of expected signal event, $\mu_{sig}$") plt.ylabel(r"WStat value, TS ") plt.hlines( count_statistic.stat_max + 1, xmin=excess - errn, xmax=excess + errp, linestyle="dotted", color="r", label="1 sigma (68% C.L.)", ) plt.vlines( excess - errn, ymin=ymin, ymax=count_statistic.stat_max + 1, linestyle="dotted", color="r", ) plt.vlines( excess + errp, ymin=ymin, ymax=count_statistic.stat_max + 1, linestyle="dotted", color="r", ) plt.hlines( count_statistic.stat_max + 4, xmin=excess - errn_2sigma, xmax=excess + errp_2sigma, linestyle="dashed", color="b", label="2 sigma (95% C.L.)", ) plt.vlines( excess - errn_2sigma, ymin=ymin, ymax=count_statistic.stat_max + 4, linestyle="dashed", color="b", ) plt.vlines( excess + errp_2sigma, ymin=ymin, ymax=count_statistic.stat_max + 4, linestyle="dashed", color="b", ) plt.legend() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/stats/plot_wstat_significance.py0000644000175100001770000000250614721316200024027 0ustar00runnerdocker"""Example plot showing the profile of the Wstat statistic and its connection to significance.""" import numpy as np import matplotlib.pyplot as plt from gammapy.stats import WStatCountsStatistic count_statistic = WStatCountsStatistic(n_on=13, n_off=11, alpha=0.5) excess = count_statistic.n_sig # We compute the WStat statistic profile mu_signal = np.linspace(-2.5, 26, 100) stat_values = count_statistic._stat_fcn(mu_signal) xmin, xmax = -2.5, 26 ymin, ymax = 0, 15 plt.figure(figsize=(5, 5)) plt.plot(mu_signal, stat_values, color="k") plt.xlim(xmin, xmax) plt.ylim(ymin, ymax) plt.xlabel(r"Number of expected signal event, $\mu_{sig}$") plt.ylabel(r"WStat value, TS ") plt.vlines( excess, ymin=ymin, ymax=count_statistic.stat_max, linestyle="dashed", color="k", label="Best fit", ) plt.hlines( 
count_statistic.stat_max, xmin=xmin, xmax=excess, linestyle="dashed", color="k" ) plt.hlines( count_statistic.stat_null, xmin=xmin, xmax=0, linestyle="dotted", color="k", label="Null hypothesis", ) plt.vlines(0, ymin=ymin, ymax=count_statistic.stat_null, linestyle="dotted", color="k") plt.vlines( excess, ymin=count_statistic.stat_max, ymax=count_statistic.stat_null, color="r" ) plt.hlines( count_statistic.stat_null, xmin=0, xmax=excess, linestyle="dotted", color="r" ) plt.legend() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/stats/wstat_derivation.rst0000644000175100001770000001513414721316200022674 0ustar00runnerdocker.. include:: ../../references.txt .. _wstat_derivation: Derivation of the WStat formula ------------------------------- You can write down the likelihood formula as .. math:: L (n_{\mathrm{on}}, n_{\mathrm{off}}, \alpha; \mu_{\mathrm{sig}}, \mu_{\mathrm{bkg}}) = \frac{(\mu_{\mathrm{sig}}+ \mu_{\mathrm{bkg}})^{n_{\mathrm{on}}}}{n_{\mathrm{on}} !} \exp{(-(\mu_{\mathrm{sig}}+ \mu_{\mathrm{bkg}}))}\times \frac{(\mu_{\mathrm{bkg}}/\alpha)^{n_{\mathrm{off}}}}{n_{\mathrm{off}} !}\exp{(-\mu_{\mathrm{bkg}}/\alpha)}, where :math:`\mu_{\mathrm{sig}}` and :math:`\mu_{\mathrm{bkg}}` are respectively the number of expected signal and background counts in the ON region, as defined in the :ref:`stats_notation`. By taking twice the negative log likelihood and neglecting model-independent and thus constant terms, we define the **WStat**. .. math:: W = 2 \big(\mu_{\mathrm{sig}} + (1 + 1/\alpha)\mu_{\mathrm{bkg}} - n_{\mathrm{on}} \log{(\mu_{\mathrm{sig}} + \mu_{\mathrm{bkg}})} - n_{\mathrm{off}} \log{(\mu_{\mathrm{bkg}}/\alpha)}\big) In the most general case, where :math:`\mu_{\mathrm{sig}}` and :math:`\mu_{\mathrm{bkg}}` are free, the minimum of :math:`W` is at .. math:: \mu_{\mathrm{sig}} = n_{\mathrm{on}} - \alpha\,n_{\mathrm{off}} \\ \mu_{\mathrm{bkg}} = \alpha\,n_{\mathrm{off}} Profile Likelihood ^^^^^^^^^^^^^^^^^^ Most of the time you probably won't have a model to obtain :math:`\mu_{\mathrm{bkg}}`. The strategy in this case is to treat :math:`\mu_{\mathrm{bkg}}` as a so-called nuisance parameter, i.e. a free parameter that is of no physical interest. Of course you don't want an additional free parameter for each bin during a fit. Therefore one calculates an estimator for :math:`\mu_{\mathrm{bkg}}` by analytically minimizing the likelihood function. This is called 'profile likelihood'. .. math:: \frac{\mathrm d \log L}{\mathrm d \mu_{\mathrm{bkg}}} = 0 This yields a quadratic equation for :math:`\mu_{\mathrm{bkg}}` .. math:: \frac{\alpha n_{\mathrm{on}}}{\mu_{\mathrm{sig}}+\mu_{\mathrm{bkg}}} + \frac{\alpha n_{\mathrm{off}}}{\mu_{\mathrm{bkg}}} - (\alpha + 1) = 0 with the solution .. math:: \mu_{\mathrm{bkg}} = \frac{C + D}{2(\alpha + 1)} where .. math:: C = \alpha(n_{\mathrm{on}} + n_{\mathrm{off}}) - (\alpha+1)\mu_{\mathrm{sig}} \\ D^2 = C^2 + 4 (\alpha+1)\alpha n_{\mathrm{off}} \mu_{\mathrm{sig}} Goodness of fit ^^^^^^^^^^^^^^^ The best-fit value of the WStat as defined now contains no information about the goodness of the fit. We consider the likelihood of the data :math:`n_{\mathrm{on}}` and :math:`n_{\mathrm{off}}` under the expectation of :math:`n_{\mathrm{on}}` and :math:`n_{\mathrm{off}}`. .. 
math:: L (n_{\mathrm{on}}, n_{\mathrm{off}}, \alpha; n_{\mathrm{on}} - \alpha n_{\mathrm{off}}, \alpha n_{\mathrm{off}}) = \frac{n_{\mathrm{on}}^{n_{\mathrm{on}}}}{n_{\mathrm{on}} !} \exp{(-n_{\mathrm{on}})}\times \frac{n_{\mathrm{off}}^{n_{\mathrm{off}}}}{n_{\mathrm{off}} !} \exp{(-n_{\mathrm{off}})} and add twice the log likelihood .. math:: 2 \log L (n_{\mathrm{on}}, n_{\mathrm{off}}; \alpha; n_{\mathrm{on}} - \alpha n_{\mathrm{off}}, \alpha n_{\mathrm{off}}) = 2 (n_{\mathrm{on}} ( \log{(n_{\mathrm{on}})} - 1 ) + n_{\mathrm{off}} ( \log{(n_{\mathrm{off}})} - 1)) to WStat. In doing so, we are computing the likelihood ratio: .. math:: -2 \log \frac{L(n_{\mathrm{on}},n_{\mathrm{off}},\alpha; \mu_{\mathrm{sig}},\mu_{\mathrm{bkg}})} {L(n_{\mathrm{on}},n_{\mathrm{off}}, \alpha; n_{\mathrm{on}} - \alpha n_{\mathrm{off}}, \alpha n_{\mathrm{off}})} Intuitively, this log-likelihood ratio should asymptotically behave like a chi-square with ``m-n`` degrees of freedom, where ``m`` is the number of measurements and ``n`` the number of model parameters. Final result ^^^^^^^^^^^^ .. math:: W = 2 \big(\mu_{\mathrm{sig}} + (1 + \frac{1}{\alpha})\mu_{\mathrm{bkg}} - n_{\mathrm{on}} - n_{\mathrm{off}} - n_{\mathrm{on}} (\log{(\mu_{\mathrm{sig}} + \mu_{\mathrm{bkg}})} - \log{(n_{\mathrm{on}})}) - n_{\mathrm{off}} (\log{(\frac{\mu_{\mathrm{bkg}}}{\alpha})} - \log{(n_{\mathrm{off}})})\big) Special cases ^^^^^^^^^^^^^ The above formula is undefined if :math:`n_{\mathrm{on}}` or :math:`n_{\mathrm{off}}` are equal to zero, because of the :math:`n\log{n}` terms, that were introduced by adding the goodness of fit terms. These cases are treated as follows. If :math:`n_{\mathrm{on}} = 0`, the likelihood formulae read .. math:: L (0, n_{\mathrm{off}}, \alpha; \mu_{\mathrm{sig}}, \mu_{\mathrm{bkg}}) = \exp{(-(\mu_{\mathrm{sig}}+ \mu_{\mathrm{bkg}}))}\times \frac{(\mu_{\mathrm{bkg}}/\alpha)^{n_{\mathrm{off}}}}{n_{\mathrm{off}} !}\exp{(-\mu_{\mathrm{bkg}}/\alpha)}, and .. math:: L (0, n_{\mathrm{off}}, \alpha; 0 - \alpha n_{\mathrm{off}}, \alpha n_{\mathrm{off}} ) = \frac{n_{\mathrm{off}}^{n_{\mathrm{off}}}}{n_{\mathrm{off}} !} \exp{(-n_{\mathrm{off}})} WStat is derived by taking 2 times the negative log likelihood and adding the goodness of fit term as before .. math:: W = 2 \big(\mu_{\mathrm{sig}} + (1 + \frac{1}{\alpha})\mu_{\mathrm{bkg}} - n_{\mathrm{off}} - n_{\mathrm{off}} (\log{(\mu_{\mathrm{bkg}}/\alpha)} - \log{(n_{\mathrm{off}})})\big) Note that this is the limit of the original WStat formula for :math:`n_{\mathrm{on}} \rightarrow 0`. The analytical result for :math:`\mu_{\mathrm{bkg}}` in this case reads: .. math:: \mu_{\mathrm{bkg}} = \frac{\alpha n_{\mathrm{off}}}{\alpha + 1} When inserting this into the WStat we find the simplified expression: .. math:: W = 2\big(\mu_{\mathrm{sig}} + n_{\mathrm{off}} \log{(1 + \alpha)}\big) If :math:`n_{\mathrm{off}} = 0`, WStat becomes .. math:: W = 2 \big(\mu_{\mathrm{sig}} + (1 + \frac{1}{\alpha})\mu_{\mathrm{bkg}} - n_{\mathrm{on}} - n_{\mathrm{on}} (\log{(\mu_{\mathrm{sig}} + \mu_{\mathrm{bkg}})} - \log{(n_{\mathrm{on}})})\big) and .. math:: \mu_{\mathrm{bkg}} = \frac{\alpha n_{\mathrm{on}}}{1+\alpha} - \mu_{\mathrm{sig}} For :math:`\mu_{\mathrm{sig}} > n_{\mathrm{on}} (\frac{\alpha}{1 + \alpha})`, :math:`\mu_{\mathrm{bkg}}` becomes negative, which is unphysical. Therefore we distinguish two cases. The physical one, where :math:`\mu_{\mathrm{sig}} < n_{\mathrm{on}} (\frac{\alpha}{1 + \alpha})`, is straightforward and gives .. 
math:: W = -2\big(\mu_{\mathrm{sig}} \left(\frac{1}{\alpha}\right) + n_{\mathrm{on}} \log{\left(\frac{\alpha}{1 + \alpha}\right)}\big) For the unphysical case, we set :math:`\mu_{\mathrm{bkg}}=0` and arrive at .. math:: W = 2\big(\mu_{\mathrm{sig}} + n_{\mathrm{on}}(\log{(n_{\mathrm{on}})} - \log{(\mu_{\mathrm{sig}})} - 1)\big) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/utils.rst0000644000175100001770000000717214721316200017313 0ustar00runnerdocker.. include:: ../references.txt .. _utils: Utility functions ================= `gammapy.utils` is a collection of utility functions that are used in many places or don't fit in one of the other packages. Since the various sub-modules of `gammapy.utils` are mostly unrelated, they are not imported into the top-level namespace. Here are some examples of how to import functionality from the `gammapy.utils` sub-modules: .. testcode:: from gammapy.utils.random import sample_sphere sample_sphere(size=10) from gammapy.utils import random random.sample_sphere(size=10) .. _time_handling: Time handling in Gammapy ------------------------ See `gammapy.utils.time`. Time format and scale +++++++++++++++++++++ In Gammapy, `astropy.time.Time` objects are used to represent times: .. testcode:: from astropy.time import Time Time(['1999-01-01T00:00:00.123456789', '2010-01-01T00:00:00']) Note that Astropy chose ``format='isot'`` and ``scale='utc'`` as defaults, and in Gammapy these are also the recommended format and time scale. .. .. warning:: .. Actually what's written here is not true. In CTAO, it hasn't been decided if times will be in ``utc`` or ``tt`` (terrestrial time) format. .. Here's a reminder that this needs to be settled / updated: https://github.com/gammapy/gammapy/issues/284 When other time formats are needed it's easy to convert; see the :ref:`time format section and table in the Astropy docs `. E.g. sometimes in astronomy the modified Julian date ``mjd`` is used, and for passing times to matplotlib for plotting, the ``plot_date`` format should be used: .. testcode:: from astropy.time import Time time = Time(['1999-01-01T00:00:00.123456789', '2010-01-01T00:00:00']) print(time.mjd) print(time.plot_date) .. testoutput:: [51179.00000143 55197. ] [10592.00000143 14610. ] Converting to other time scales is also easy, see the :ref:`time scale section, diagram and table in the Astropy docs `. E.g. when converting celestial (RA/DEC) to horizontal (ALT/AZ) coordinates, the `sidereal time `__ is needed. This is done automatically by `astropy.coordinates.AltAz` when the `astropy.coordinates.AltAz.obstime` is set with a `~astropy.time.Time` object in any scale, no need for explicit time scale transformations in Gammapy (although if you do want to explicitly compute it, it's easy, see `here `__). The `Fermi-LAT time systems in a nutshell`_ page gives a good, brief explanation of the differences between the relevant time scales ``UT1``, ``UTC`` and ``TT``. .. _MET_definition: Mission elapsed times (MET) +++++++++++++++++++++++++++ :term:`MET` time references are times representing UTC seconds after a specific origin. Each experiment may have a different MET origin that should be included in the header of the corresponding data files. For more details see `Fermi-LAT time systems in a nutshell`_. 
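As an illustration of the helper functions mentioned below, here is a minimal sketch converting an example MET value into an absolute time. The header values are hypothetical stand-ins for what a real FITS header would contain.

.. testcode::

    from astropy.time import TimeDelta
    from gammapy.utils.time import absolute_time, time_ref_from_dict

    # Hypothetical reference-time header entries (MJD split into integer and fractional part)
    meta = {"MJDREFI": 51910, "MJDREFF": 0.00074287037037037}
    time_ref = time_ref_from_dict(meta)  # the MET origin as an astropy Time

    met = TimeDelta(1e5, format="sec")  # an example MET value in seconds
    time = absolute_time(met, meta)  # absolute astropy Time object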
It's not clear yet how best to implement METs in Gammapy; it's one of the tasks here: https://github.com/gammapy/gammapy/issues/284 For now, we use the `gammapy.utils.time.time_ref_from_dict`, `gammapy.utils.time.time_relative_to_ref` and `gammapy.utils.time.absolute_time` functions to convert MET floats to `~astropy.time.Time` objects via the reference times stored in FITS headers. Time differences ++++++++++++++++ TODO: discuss when to use `~astropy.time.TimeDelta` or `~astropy.units.Quantity` or :term:`MET` floats and where one needs to convert between those and what to watch out for. ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.160642 gammapy-1.3/docs/user-guide/visualization/0000755000175100001770000000000014721316215020321 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/visualization/colormap_example.py0000644000175100001770000000206614721316200024220 0ustar00runnerdocker"""Plot significance image with HESS and MILAGRO colormap.""" from astropy.visualization import LinearStretch from astropy.visualization.mpl_normalize import ImageNormalize import matplotlib.pyplot as plt from gammapy.maps import Map from gammapy.visualization import colormap_hess, colormap_milagro filename = "$GAMMAPY_DATA/tests/unbundled/poisson_stats_image/expected_ts_0.000.fits.gz" image = Map.read(filename, hdu="SQRT_TS") # Plot with the HESS and Milagro colormap vmin, vmax, vtransition = -5, 15, 5 fig = plt.figure(figsize=(15.5, 6)) normalize = ImageNormalize(vmin=vmin, vmax=vmax, stretch=LinearStretch()) transition = normalize(vtransition) ax = fig.add_subplot(121, projection=image.geom.wcs) cmap = colormap_hess(transition=transition) image.plot(ax=ax, cmap=cmap, norm=normalize, add_cbar=True) plt.title("HESS-style colormap") ax = fig.add_subplot(122, projection=image.geom.wcs) cmap = colormap_milagro(transition=transition) image.plot(ax=ax, cmap=cmap, norm=normalize, add_cbar=True) plt.title("MILAGRO-style colormap") plt.tight_layout() plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/user-guide/visualization/index.rst0000644000175100001770000000254514721316200022162 0ustar00runnerdocker.. include:: ../../references.txt .. _visualization: Visualization ============= `gammapy.visualization` provides a few helper functions and classes to create publication-quality images. Colormaps --------- The following example shows how to plot images using colormaps that are commonly used in gamma-ray astronomy (`colormap_hess` and `colormap_milagro`). .. plot:: user-guide/visualization/colormap_example.py Survey panel plots ------------------ The `~gammapy.visualization.MapPanelPlotter` allows splitting, e.g., a Galactic plane survey image with a large aspect ratio into multiple panels. Here is a short example: .. 
plot:: import matplotlib.pyplot as plt from astropy.coordinates import SkyCoord, Angle from gammapy.maps import Map from gammapy.data import EventList from gammapy.visualization import MapPanelPlotter skydir = SkyCoord("0d", "0d", frame="galactic") counts = Map.create(skydir=skydir, width=(180, 10), frame="galactic") events = EventList.read("$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_events_selected.fits.gz") counts.fill_events(events) smoothed = counts.smooth("0.1 deg") fig = plt.figure(figsize=(12, 6)) xlim, ylim = Angle(["90d", "270d"]), Angle(["-5d", "5d"]) plotter = MapPanelPlotter(figure=fig, xlim=xlim, ylim=ylim, npanels=3) axes = plotter.plot(smoothed, vmax=10) plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/environment-dev.yml0000644000175100001770000000254014721316200016255 0ustar00runnerdocker# Conda environment for Gammapy development # # Install: conda env create -f environment-dev.yml # Update: conda env update -f environment-dev.yml # Activate: conda activate gammapy-dev # Deactivate: conda deactivate name: gammapy-dev channels: - conda-forge - https://cxc.cfa.harvard.edu/conda/sherpa variables: PYTHONNOUSERSITE: "1" dependencies: # core dependencies - python=3.11 - pip - astropy - click - cython - numpy>1.20 - pydantic>=2.5 - pyyaml - regions>=0.5 - matplotlib>=3.4 - scipy!=1.10 - iminuit>=2.8.0 - extension-helpers # test dependencies - codecov - pytest - pytest-astropy - pytest-cov - pytest-xdist - coverage - requests - tqdm # extra dependencies - healpy - ipython - jupyter - jupyterlab - naima - pandas - reproject # dev dependencies - black=22.6.0 - codespell - flake8 - isort - jinja2 - jupytext - nbsphinx - numdifftools - pandoc - pydocstyle - pylint - setuptools_scm - sphinx - sphinx-astropy - sphinx-click - sphinx-gallery - sphinx-design - sphinx-copybutton - tox - pydata-sphinx-theme - pre-commit - twine - yamllint - nbformat - h5py - ruamel.yaml - cffconvert - pyinstrument - memray - pip: - sherpa>=4.17 - pytest-sphinx - ray[default]>=2.9 - PyGithub - pypandoc ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.160642 gammapy-1.3/examples/0000755000175100001770000000000014721316215014235 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/README.rst0000644000175100001770000000123214721316200015714 0ustar00runnerdockerGammapy examples folder ======================= This folder contains the following: * Python scripts that could be used as example scripts tutorials in the documentation. * Python scripts needed by the sphinx-gallery extension to produce collections of examples use cases. Only the Python scripts declared in the ``scripts.yaml`` file were downloaded by the ``gammapy download scripts`` command. This list was versioned for each release of Gammapy as it was also the case for the Jupyter notebooks tutorials. The Python scripts needed by sphinx-gallery extension are placed in folders declared in the ``sphinx_gallery_conf`` variable in ``docs/conf.py`` script. ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.160642 gammapy-1.3/examples/models/0000755000175100001770000000000014721316215015520 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/README.txt0000644000175100001770000000114514721316200017211 0ustar00runnerdocker.. 
_model-gallery: Model gallery ============= The model gallery provides a visual overview of the available models in Gammapy. In general the models are grouped into the following categories: - `~gammapy.modeling.models.SpectralModel`: models to describe spectral shapes of sources - `~gammapy.modeling.models.SpatialModel`: models to describe spatial shapes (morphologies) of sources - `~gammapy.modeling.models.TemporalModel`: models to describe temporal flux evolution of sources, such as light and phase curves The models follow a naming scheme which contains the category as a suffix to the class name. ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.160642 gammapy-1.3/examples/models/spatial/0000755000175100001770000000000014721316215017155 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/README.rst0000644000175100001770000000007314721316200020636 0ustar00runnerdockerSpatial models -------------- .. _spatial_models_gallery: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_constant.py0000644000175100001770000000143014721316200022410 0ustar00runnerdockerr""" .. _constant-spatial-model: Constant spatial model ====================== This model is a spatially constant model. """ # %% # Example plot # ------------ # Here is an example plot of the model: from gammapy.maps import WcsGeom from gammapy.modeling.models import ( ConstantSpatialModel, Models, PowerLawSpectralModel, SkyModel, ) geom = WcsGeom.create(npix=(100, 100), binsz=0.1) model = ConstantSpatialModel(value="42 sr-1") model.plot(geom=geom, add_cbar=True) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: pwl = PowerLawSpectralModel() constant = ConstantSpatialModel() model = SkyModel(spectral_model=pwl, spatial_model=constant, name="pwl-constant-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_disk.py0000644000175100001770000000702514721316200021515 0ustar00runnerdockerr""" .. _disk-spatial-model: Disk spatial model ================== This is a spatial model parametrising a disk. By default, the model is symmetric, i.e. a disk: .. math:: \phi(lon, lat) = \frac{1}{2 \pi (1 - \cos{r_0}) } \cdot \begin{cases} 1 & \text{for } \theta \leq r_0 \\ 0 & \text{for } \theta > r_0 \end{cases} where :math:`\theta` is the sky separation. To improve fit convergence of the model, the sharp edges are smoothed using `~scipy.special.erf`. In case an eccentricity (`e`) and rotation angle (:math:`\phi`) are passed, then the model is an elongated disk (i.e. an ellipse), with a major semiaxis of length :math:`r_0` and position angle :math:`\phi` (increasing counter-clockwise from the North direction). The model is defined on the celestial sphere, with a normalization defined by: .. math:: \int_{4\pi}\phi(\text{lon}, \text{lat}) \,d\Omega = 1\,.
""" # %% # Example plot # ------------ # Here is an example plot of the model: import numpy as np from astropy.coordinates import Angle from gammapy.modeling.models import ( DiskSpatialModel, Models, PowerLawSpectralModel, SkyModel, ) phi = Angle("30 deg") model = DiskSpatialModel( lon_0="2 deg", lat_0="2 deg", r_0="1 deg", e=0.8, phi=phi, edge_width=0.1, frame="galactic", ) ax = model.plot(add_cbar=True) # illustrate size parameter region = model.to_region().to_pixel(ax.wcs) artist = region.as_artist(facecolor="none", edgecolor="red") ax.add_artist(artist) transform = ax.get_transform("galactic") ax.scatter(2, 2, transform=transform, s=20, edgecolor="red", facecolor="red") ax.text(1.7, 1.85, r"$(l_0, b_0)$", transform=transform, ha="center") ax.plot([2, 2 + np.sin(phi)], [2, 2 + np.cos(phi)], color="r", transform=transform) ax.vlines(x=2, color="r", linestyle="--", transform=transform, ymin=0, ymax=5) ax.text(2.15, 2.3, r"$\phi$", transform=transform) # %% # This plot illustrates the definition of the edge parameter: import numpy as np from astropy import units as u from astropy.visualization import quantity_support import matplotlib.pyplot as plt from gammapy.modeling.models import DiskSpatialModel lons = np.linspace(0, 0.3, 500) * u.deg r_0, edge_width = 0.2 * u.deg, 0.5 disk = DiskSpatialModel(lon_0="0 deg", lat_0="0 deg", r_0=r_0, edge_width=edge_width) profile = disk(lons, 0 * u.deg) plt.plot(lons, profile / profile.max(), alpha=0.5) plt.xlabel("Radius (deg)") plt.ylabel("Profile (A.U.)") edge_min, edge_max = r_0 * (1 - edge_width / 2.0), r_0 * (1 + edge_width / 2.0) with quantity_support(): plt.vlines([edge_min, edge_max], 0, 1, linestyles=["--"], color="k") plt.annotate( "", xy=(edge_min, 0.5), xytext=(edge_min + r_0 * edge_width, 0.5), arrowprops=dict(arrowstyle="<->", lw=2), ) plt.text(0.2, 0.53, "Edge width", ha="center", size=12) margin = 0.02 * u.deg plt.hlines( [0.95], edge_min - margin, edge_min + margin, linestyles=["-"], color="k" ) plt.text(edge_min + margin, 0.95, "95%", size=12, va="center") plt.hlines( [0.05], edge_max - margin, edge_max + margin, linestyles=["-"], color="k" ) plt.text(edge_max - margin, 0.05, "5%", size=12, va="center", ha="right") plt.show() # %% # YAML representation # ------------------- # Here is an example YAML file using the model: pwl = PowerLawSpectralModel() gauss = DiskSpatialModel() model = SkyModel(spectral_model=pwl, spatial_model=gauss, name="pwl-disk-model") models = Models([model]) print(models.to_yaml())././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_gauss.py0000644000175100001770000000641114721316200021703 0ustar00runnerdockerr""" .. _gaussian-spatial-model: Gaussian spatial model ====================== This is a spatial model parametrising a Gaussian function. By default, the Gaussian is symmetric: .. math:: \phi(\text{lon}, \text{lat}) = N \times \exp\left\{-\frac{1}{2} \frac{1-\cos \theta}{1-\cos \sigma}\right\}\,, where :math:`\theta` is the sky separation to the model center. In this case, the Gaussian is normalized to 1 on the sphere: .. math:: N = \frac{1}{4\pi a\left[1-\exp(-1/a)\right]}\,,\,\,\,\, a = 1-\cos \sigma\,. In the limit of small :math:`\theta` and :math:`\sigma`, this definition reduces to the usual form: .. math:: \phi(\text{lon}, \text{lat}) = \frac{1}{2\pi\sigma^2} \exp{\left(-\frac{1}{2} \frac{\theta^2}{\sigma^2}\right)}\,. 
In case an eccentricity (:math:`e`) and rotation angle (:math:`\phi`) are passed, then the model is an elongated Gaussian, whose evaluation is performed as in the symmetric case but using the effective radius of the Gaussian: .. math:: \sigma_{\text{eff}}(\text{lon}, \text{lat}) = \sqrt{ (\sigma_M \sin(\Delta \phi))^2 + (\sigma_m \cos(\Delta \phi))^2 }. Here, :math:`\sigma_M` (:math:`\sigma_m`) is the major (minor) semiaxis of the Gaussian, and :math:`\Delta \phi` is the difference between `phi`, the position angle of the Gaussian, and the position angle of the evaluation point. **Caveat:** For the asymmetric Gaussian, the model is normalized to 1 on the plane, i.e. in small angle approximation: :math:`N = 1/(2 \pi \sigma_M \sigma_m)`. This means that for huge elongated Gaussians on the sky this model is not correctly normalized. However, this approximation is perfectly acceptable for the more common case of models with modest dimensions: indeed, the error introduced by normalizing on the plane rather than on the sphere is below 0.1\% for Gaussians with radii smaller than ~ 5 deg. """ # %% # Example plot # ------------ # Here is an example plot of the model: import numpy as np from astropy.coordinates import Angle from gammapy.maps import WcsGeom from gammapy.modeling.models import ( GaussianSpatialModel, Models, PowerLawSpectralModel, SkyModel, ) phi = Angle("30 deg") model = GaussianSpatialModel( lon_0="2 deg", lat_0="2 deg", sigma="1 deg", e=0.7, phi=phi, frame="galactic", ) geom = WcsGeom.create( skydir=model.position, frame=model.frame, width=(4, 4), binsz=0.02 ) ax = model.plot(geom=geom, add_cbar=True) # illustrate size parameter region = model.to_region().to_pixel(ax.wcs) artist = region.as_artist(facecolor="none", edgecolor="red") ax.add_artist(artist) transform = ax.get_transform("galactic") ax.scatter(2, 2, transform=transform, s=20, edgecolor="red", facecolor="red") ax.text(1.5, 1.85, r"$(l_0, b_0)$", transform=transform, ha="center") ax.plot([2, 2 + np.sin(phi)], [2, 2 + np.cos(phi)], color="r", transform=transform) ax.vlines(x=2, color="r", linestyle="--", transform=transform, ymin=-5, ymax=5) ax.text(2.25, 2.45, r"$\phi$", transform=transform) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: pwl = PowerLawSpectralModel() gauss = GaussianSpatialModel() model = SkyModel(spectral_model=pwl, spatial_model=gauss, name="pwl-gauss-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_gen_gauss.py0000644000175100001770000000532414721316200022536 0ustar00runnerdockerr""" .. _generalized-gaussian-spatial-model: Generalized gaussian spatial model ================================== This is a spatial model parametrising a generalized Gaussian function. By default, the Generalized Gaussian is defined as: .. math:: \phi(\text{lon}, \text{lat}) = \phi(\text{r}) = N \times \exp \left[ - \left( \frac{r}{r_{\rm eff}} \right)^{1/\eta} \right] \,, the normalization is expressed as: .. math:: N = \frac{1}{ 2 \pi \sqrt{1-e^2} r_{0}^2 \eta \Gamma(2\eta)}\, where :math:`\Gamma` is the gamma function. This analytical norm is an approximation, so the model may not integrate to unity in extreme cases, e.g. if the ellipticity tends to one while the radius is large, or if :math:`\eta` is much larger than one (outside the default range). The effective radius is given by: ..
math:: r_{\rm eff}(\text{lon}, \text{lat}) = \sqrt{ (r_M \sin(\Delta \phi))^2 + (r_m \cos(\Delta \phi))^2 }. where :math:`r_M` (:math:`r_m`) is the major (minor) semiaxis, and :math:`\Delta \phi` is the difference between `phi`, the position angle of the model, and the position angle of the evaluation point. If the eccentricity (:math:`e`) is null, it reduces to :math:`r_0`. """ # %% # Example plot # ------------ # Here is an example plot of the model for different shape parameters: from astropy import units as u import matplotlib.pyplot as plt from gammapy.maps import Map, WcsGeom from gammapy.modeling.models import ( GeneralizedGaussianSpatialModel, Models, PowerLawSpectralModel, SkyModel, ) lon_0 = 20 lat_0 = 0 reval = 3 dr = 0.02 geom = WcsGeom.create( skydir=(lon_0, lat_0), binsz=dr, width=(2 * reval, 2 * reval), frame="galactic", ) tags = [r"Disk, $\eta=0.01$", r"Gaussian, $\eta=0.5$", r"Laplace, $\eta=1$"] eta_range = [0.01, 0.5, 1] r_0 = 1 e = 0.5 phi = 45 * u.deg fig, axes = plt.subplots(1, 3, figsize=(9, 6)) for ax, eta, tag in zip(axes, eta_range, tags): model = GeneralizedGaussianSpatialModel( lon_0=lon_0 * u.deg, lat_0=lat_0 * u.deg, eta=eta, r_0=r_0 * u.deg, e=e, phi=phi, frame="galactic", ) meval = model.evaluate_geom(geom) Map.from_geom(geom=geom, data=meval.value, unit=meval.unit).plot(ax=ax) pixreg = model.to_region().to_pixel(geom.wcs) pixreg.plot(ax=ax, edgecolor="g", facecolor="none", lw=2) ax.set_title(tag) ax.set_xticks([]) ax.set_yticks([]) plt.tight_layout() # %% # YAML representation # ------------------- # Here is an example YAML file using the model: pwl = PowerLawSpectralModel() gengauss = GeneralizedGaussianSpatialModel() model = SkyModel(spectral_model=pwl, spatial_model=gengauss, name="pwl-gengauss-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_piecewise_norm_spatial.py0000644000175100001770000000202514721316200025303 0ustar00runnerdockerr""" .. _piecewise-norm-spatial: Piecewise norm spatial model ============================ This model parametrises a piecewise spatial correction with a free norm parameter at each fixed node in longitude, latitude and, optionally, energy. """ # %% # Example plot # ------------ # Here is an example plot of the model: import numpy as np from astropy import units as u from gammapy.maps import MapCoord, WcsGeom from gammapy.modeling.models import ( FoVBackgroundModel, Models, PiecewiseNormSpatialModel, ) geom = WcsGeom.create(skydir=(50, 0), npix=(120, 120), binsz=0.03, frame="galactic") coords = MapCoord.create(geom.footprint) coords["lon"] *= u.deg coords["lat"] *= u.deg model = PiecewiseNormSpatialModel( coords, norms=np.array([0.5, 3, 2, 1]), frame="galactic" ) model.plot(geom=geom) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: bkg_model = FoVBackgroundModel(spatial_model=model, dataset_name="dataset") models = Models([bkg_model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_point.py0000644000175100001770000000236514721316200021716 0ustar00runnerdockerr""" .. _point-spatial-model: Point spatial model =================== This model is a delta function centered at the *lon_0* and *lat_0* parameters provided: ..
math:: \phi(lon, lat) = \delta{(lon - lon_0, lat - lat_0)} The model is defined on the celestial sphere in the coordinate frame provided by the user. If the point source is not centered on a pixel, the flux is re-distributed across 4 neighbouring pixels. This ensures that the center-of-mass position is conserved. """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy.coordinates import SkyCoord from gammapy.maps import WcsGeom from gammapy.modeling.models import ( Models, PointSpatialModel, PowerLawSpectralModel, SkyModel, ) model = PointSpatialModel( lon_0="0.01 deg", lat_0="0.01 deg", frame="galactic", ) geom = WcsGeom.create( skydir=SkyCoord("0d 0d", frame="galactic"), width=(1, 1), binsz=0.1 ) model.plot(geom=geom, add_cbar=True) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: pwl = PowerLawSpectralModel() point = PointSpatialModel() model = SkyModel(spectral_model=pwl, spatial_model=point, name="pwl-point-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_shell.py0000644000175100001770000000275714721316200021701 0ustar00runnerdockerr""" .. _shell-spatial-model: Shell spatial model =================== This is a spatial model parametrizing a projected radiating shell. The shell spatial model is defined by the following equations: .. math:: \phi(lon, lat) = \frac{3}{2 \pi (r_{out}^3 - r_{in}^3)} \cdot \begin{cases} \sqrt{r_{out}^2 - \theta^2} - \sqrt{r_{in}^2 - \theta^2} & \text{for } \theta \lt r_{in} \\ \sqrt{r_{out}^2 - \theta^2} & \text{for } r_{in} \leq \theta \lt r_{out} \\ 0 & \text{for } \theta > r_{out} \end{cases} where :math:`\theta` is the sky separation and :math:`r_{\text{out}} = r_{\text{in}} + \text{width}` Note that the normalization is a small angle approximation, although that approximation is still very good even for 10 deg radius shells. """ # %% # Example plot # ------------ # Here is an example plot of the model: from gammapy.modeling.models import ( Models, PowerLawSpectralModel, ShellSpatialModel, SkyModel, ) model = ShellSpatialModel( lon_0="10 deg", lat_0="20 deg", radius="2 deg", width="0.5 deg", frame="galactic", ) model.plot(add_cbar=True) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: pwl = PowerLawSpectralModel() shell = ShellSpatialModel() model = SkyModel(spectral_model=pwl, spatial_model=shell, name="pwl-shell-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_shell2.py0000644000175100001770000000443414721316200021755 0ustar00runnerdockerr""" .. _shell2-spatial-model: Shell2 spatial model ==================== This is a spatial model parametrizing a projected radiating shell. The shell spatial model is defined by the following equations: .. math:: \phi(lon, lat) = \frac{3}{2 \pi (r_{out}^3 - r_{in}^3)} \cdot \begin{cases} \sqrt{r_{out}^2 - \theta^2} - \sqrt{r_{in}^2 - \theta^2} & \text{for } \theta \lt r_{in} \\ \sqrt{r_{out}^2 - \theta^2} & \text{for } r_{in} \leq \theta \lt r_{out} \\ 0 & \text{for } \theta > r_{out} \end{cases} where :math:`\theta` is the sky separation, :math:`r_{\text{out}}` is the outer radius and :math:`r_{\text{in}}` is the inner radius. For Shell2SpatialModel, the radius parameter ``r_0`` corresponds to :math:`r_{\text{out}}`.
The relative width parameter, ``eta``, is given as :math:`\eta = (r_{\text{out}} - r_{\text{in}})/r_{\text{out}}`, so we have :math:`r_{\text{in}} = (1-\eta) r_{\text{out}}`. For example, :math:`\eta = 0.25` with :math:`r_0 = 2` deg gives :math:`r_{\text{in}} = 1.5` deg. Note that the normalization is a small angle approximation, although that approximation is still very good even for 10 deg radius shells. """ # %% # Example plot # ------------ # Here is an example plot of the shell model for the parametrization using outer radius and relative width. # In this case the relative width, eta, acts as a shape parameter. import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, PowerLawSpectralModel, Shell2SpatialModel, SkyModel, ) tags = [ r"Disk-like, $\eta \rightarrow 0$", r"Shell, $\eta=0.25$", r"Peaked, $\eta\rightarrow 1$", ] eta_range = [0.001, 0.25, 1] fig, axes = plt.subplots(1, 3, figsize=(9, 6)) for ax, eta, tag in zip(axes, eta_range, tags): model = Shell2SpatialModel( lon_0="10 deg", lat_0="20 deg", r_0="2 deg", eta=eta, frame="galactic", ) model.plot(ax=ax) ax.set_title(tag) ax.set_xticks([]) ax.set_yticks([]) plt.tight_layout() # %% # YAML representation # ------------------- # Here is an example YAML file using the model: pwl = PowerLawSpectralModel() shell2 = Shell2SpatialModel() model = SkyModel(spectral_model=pwl, spatial_model=shell2, name="pwl-shell2-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spatial/plot_template.py0000644000175100001770000000164014721316200022373 0ustar00runnerdockerr""" .. _template-spatial-model: Template spatial model ====================== This is a spatial model based on a 2D sky map provided as a template. """ # %% # Example plot # ------------ # Here is an example plot of the model: from gammapy.maps import Map from gammapy.modeling.models import ( Models, PowerLawSpectralModel, SkyModel, TemplateSpatialModel, ) filename = "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v18/Templates/RXJ1713_2016_250GeV.fits" m = Map.read(filename) m = m.copy(unit="sr^-1") model = TemplateSpatialModel(m, filename=filename) model.plot(add_cbar=True) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: pwl = PowerLawSpectralModel() template = TemplateSpatialModel(m, filename=filename) model = SkyModel(spectral_model=pwl, spatial_model=template, name="pwl-template-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1646419 gammapy-1.3/examples/models/spectral/0000755000175100001770000000000014721316215017335 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/README.rst0000644000175100001770000000007614721316200021021 0ustar00runnerdockerSpectral models --------------- .. _spectral_models_gallery: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_absorbed.py0000644000175100001770000000613714721316200022527 0ustar00runnerdockerr""" .. _absorption-spectral-model: EBL absorption spectral model ============================= This model evaluates an absorbed spectral model. The EBL absorption factor is given by ..
math:: \exp{ \left ( -\alpha \times \tau(E, z) \right )} where :math:`\tau(E, z)` is the optical depth predicted by the model (`~gammapy.modeling.models.EBLAbsorptionNormSpectralModel`), which depends on the energy of the gamma-rays and the redshift z of the source, and :math:`\alpha` is a scale factor (default: 1) for the optical depth. The available EBL models are defined in `~gammapy.modeling.models.EBL_DATA_BUILTIN`. """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( EBL_DATA_BUILTIN, EBLAbsorptionNormSpectralModel, Models, PowerLawSpectralModel, SkyModel, ) # Print the available EBL models print(EBL_DATA_BUILTIN.keys()) # Here we illustrate how to create and plot EBL absorption models for a redshift of 0.5 # sphinx_gallery_thumbnail_number = 1 redshift = 0.5 dominguez = EBLAbsorptionNormSpectralModel.read_builtin("dominguez", redshift=redshift) franceschini = EBLAbsorptionNormSpectralModel.read_builtin( "franceschini", redshift=redshift ) finke = EBLAbsorptionNormSpectralModel.read_builtin("finke", redshift=redshift) franceschini17 = EBLAbsorptionNormSpectralModel.read_builtin( "franceschini17", redshift=redshift ) saldana21 = EBLAbsorptionNormSpectralModel.read_builtin( "saldana-lopez21", redshift=redshift ) fig, (ax_ebl, ax_model) = plt.subplots( nrows=1, ncols=2, figsize=(10, 4), gridspec_kw={"left": 0.08, "right": 0.96} ) energy_bounds = [0.08, 3] * u.TeV opts = dict(energy_bounds=energy_bounds, xunits=u.TeV) franceschini.plot(ax=ax_ebl, label="Franceschini 2008", **opts) finke.plot(ax=ax_ebl, label="Finke 2010", **opts) dominguez.plot(ax=ax_ebl, label="Dominguez 2011", **opts) franceschini17.plot(ax=ax_ebl, label="Franceschini 2017", **opts) saldana21.plot(ax=ax_ebl, label="Saldana-Lopez 2021", **opts) ax_ebl.set_ylabel(r"Absorption coefficient [$\exp{(-\tau(E))}$]") ax_ebl.set_xlim(energy_bounds.value) ax_ebl.set_ylim(1e-4, 2) ax_ebl.set_title(f"EBL models (z={redshift})") ax_ebl.grid(which="both") ax_ebl.legend(loc="best") # Spectral model corresponding to PKS 2155-304 (quiescent state) index = 3.53 amplitude = 1.81 * 1e-12 * u.Unit("cm-2 s-1 TeV-1") reference = 1 * u.TeV pwl = PowerLawSpectralModel(index=index, amplitude=amplitude, reference=reference) # The power-law model is multiplied by the EBL norm spectral model redshift = 0.117 absorption = EBLAbsorptionNormSpectralModel.read_builtin("dominguez", redshift=redshift) model = pwl * absorption energy_bounds = [0.1, 100] * u.TeV model.plot(energy_bounds, ax=ax_model) ax_model.grid(which="both") ax_model.set_ylim(1e-24, 1e-8) ax_model.set_title("Absorbed Power Law") plt.show() # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="absorbed-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_broken_powerlaw.py0000644000175100001770000000217314721316200024142 0ustar00runnerdockerr""" .. _broken-powerlaw-spectral-model: Broken power law spectral model =============================== This model parametrises a broken power law spectrum. It is defined by the following equation: ..
math:: \phi(E) = \phi_0 \cdot \begin{cases} \left( \frac{E}{E_{break}} \right)^{-\Gamma_1} & \text{if } E < E_{break} \\ \left( \frac{E}{E_{break}} \right)^{-\Gamma_2} & \text{otherwise} \end{cases} """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import BrokenPowerLawSpectralModel, Models, SkyModel energy_bounds = [0.1, 100] * u.TeV model = BrokenPowerLawSpectralModel( index1=1.5, index2=2.5, amplitude="1e-12 TeV-1 cm-2 s-1", ebreak="1 TeV", ) model.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="broken-power-law-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_compound.py0000644000175100001770000000204714721316200022566 0ustar00runnerdockerr""" .. _compound-spectral-model: Compound spectral model ======================= This model is formed by the arithmetic combination of any two other spectral models. """ # %% # Example plot # ------------ # Here is an example plot of the model: import operator from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( CompoundSpectralModel, LogParabolaSpectralModel, Models, PowerLawSpectralModel, SkyModel, ) energy_bounds = [0.1, 100] * u.TeV pwl = PowerLawSpectralModel( index=2.0, amplitude="1e-12 cm-2 s-1 TeV-1", reference="1 TeV" ) lp = LogParabolaSpectralModel( amplitude="1e-12 cm-2 s-1 TeV-1", reference="10 TeV", alpha=2.0, beta=1.0 ) model_add = CompoundSpectralModel(pwl, lp, operator.add) model_add.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: sky_model = SkyModel(spectral_model=model_add, name="add-compound-model") models = Models([sky_model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_constant_spectral.py0000644000175100001770000000134614721316200024471 0ustar00runnerdockerr""" .. _constant-spectral-model: Constant spectral model ======================= This model takes a constant value along the spectral range. .. math:: \phi(E) = k """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ConstantSpectralModel, Models, SkyModel energy_bounds = [0.1, 100] * u.TeV model = ConstantSpectralModel(const="1 / (cm2 s TeV)") model.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="constant-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_exp_cutoff_powerlaw.py0000644000175100001770000000201514721316200025019 0ustar00runnerdockerr""" .. _exp-cutoff-powerlaw-spectral-model: Exponential cutoff power law spectral model =========================================== This model parametrises a cutoff power law spectrum. It is defined by the following equation: ..
math:: \phi(E) = \phi_0 \cdot \left(\frac{E}{E_0}\right)^{-\Gamma} \exp \left( -(\lambda E)^{\alpha} \right) """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ExpCutoffPowerLawSpectralModel, Models, SkyModel energy_bounds = [0.1, 100] * u.TeV model = ExpCutoffPowerLawSpectralModel( amplitude=1e-12 * u.Unit("cm-2 s-1 TeV-1"), index=2, lambda_=0.1 * u.Unit("TeV-1"), reference=1 * u.TeV, ) model.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="exp-cutoff-power-law-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_exp_cutoff_powerlaw_3fgl.py0000644000175100001770000000213314721316200025733 0ustar00runnerdockerr""" .. _exp-cutoff-powerlaw-3fgl-spectral-model: Exponential cutoff power law spectral model used for 3FGL ========================================================= This model parametrises a cutoff power law spectrum used for 3FGL. It is defined by the following equation: .. math:: \phi(E) = \phi_0 \cdot \left(\frac{E}{E_0}\right)^{-\Gamma} \exp \left( \frac{E_0 - E}{E_{C}} \right) """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ExpCutoffPowerLaw3FGLSpectralModel, Models, SkyModel energy_bounds = [0.1, 100] * u.TeV model = ExpCutoffPowerLaw3FGLSpectralModel( index=2.3 * u.Unit(""), amplitude=4 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, ecut=10 * u.TeV, ) model.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="exp-cutoff-power-law-3fgl-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_exp_cutoff_powerlaw_norm_spectral.py0000644000175100001770000000234014721316200027750 0ustar00runnerdockerr""" .. _exp-cutoff-powerlaw-norm-spectral-model: Exponential cutoff power law norm spectral model ================================================ This model parametrises a cutoff power law spectral correction with a norm parameter.
""" # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( ExpCutoffPowerLawNormSpectralModel, Models, SkyModel, TemplateSpectralModel, ) energy_bounds = [0.1, 100] * u.TeV energy = [0.3, 1, 3, 10, 30] * u.TeV values = [40, 30, 20, 10, 1] * u.Unit("TeV-1 s-1 cm-2") template = TemplateSpectralModel(energy, values) norm = ExpCutoffPowerLawNormSpectralModel( norm=2, reference=1 * u.TeV, ) template.plot(energy_bounds=energy_bounds, label="Template model") ecpl_norm = template * norm ecpl_norm.plot( energy_bounds, label="Template model with ExpCutoffPowerLaw norm correction" ) plt.legend(loc="best") plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=ecpl_norm, name="exp-cutoff-power-law-norm-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_gauss_spectral.py0000644000175100001770000000161714721316200023763 0ustar00runnerdockerr""" .. _gaussian-spectral-model: Gaussian spectral model ======================= This model parametrises a gaussian spectrum. It is defined by the following equation: .. math:: \phi(E) = \frac{N_0}{\sigma \sqrt{2\pi}} \exp{ \frac{- \left( E-\bar{E} \right)^2 }{2 \sigma^2} } """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import GaussianSpectralModel, Models, SkyModel energy_bounds = [0.1, 100] * u.TeV model = GaussianSpectralModel(norm="1e-2 cm-2 s-1", mean=2 * u.TeV, sigma=0.2 * u.TeV) model.plot(energy_bounds) plt.grid(which="both") plt.ylim(1e-24, 1e-1) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="gaussian-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_logparabola.py0000644000175100001770000000277514721316200023235 0ustar00runnerdockerr""" .. _logparabola-spectral-model: Log parabola spectral model =========================== This model parametrises a log parabola spectrum. It is defined by the following equation: .. math:: \phi(E) = \phi_0 \left( \frac{E}{E_0} \right) ^ { - \alpha - \beta \log{ \left( \frac{E}{E_0} \right) } } Note that :math:`log` refers to the natural logarithm. This is consistent with the `Fermi Science Tools `_ and `ctools `_. The `Sherpa `_ package, however, uses :math:`log_{10}`. If you have parametrization based on :math:`log_{10}` you can use the :func:`~gammapy.modeling.models.LogParabolaSpectralModel.from_log10` method. 
""" # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import LogParabolaSpectralModel, Models, SkyModel energy_bounds = [0.1, 100] * u.TeV model = LogParabolaSpectralModel( alpha=2.3, amplitude="1e-12 cm-2 s-1 TeV-1", reference=1 * u.TeV, beta=0.5, ) model.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="log-parabola-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_logparabola_norm_spectral.py0000644000175100001770000000222114721316200026147 0ustar00runnerdockerr""" .. _logparabola-spectral-norm-model: Log parabola spectral norm model ================================ This model parametrises a log parabola spectral correction with a norm parameter. """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( LogParabolaNormSpectralModel, Models, SkyModel, TemplateSpectralModel, ) energy_bounds = [0.1, 100] * u.TeV energy = [0.3, 1, 3, 10, 30] * u.TeV values = [40, 30, 20, 10, 1] * u.Unit("TeV-1 s-1 cm-2") template = TemplateSpectralModel(energy, values) norm = LogParabolaNormSpectralModel( norm=1.5, reference=1 * u.TeV, ) template.plot(energy_bounds=energy_bounds, label="Template model") lp_norm = template * norm lp_norm.plot(energy_bounds, label="Template model with LogParabola norm correction") plt.legend(loc="best") plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=lp_norm, name="log-parabola-norm-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_naima.py0000644000175100001770000000510514721316200022025 0ustar00runnerdockerr""" .. _naima-spectral-model: Naima spectral model ==================== This class provides an interface with the models defined in the naima models module. The model accepts as a positional argument a `Naima `_ radiative `~naima.models` instance, used to compute the non-thermal emission from populations of relativistic electrons or protons due to interactions with the ISM or with radiation and magnetic fields. One of the advantages provided by this class consists in the possibility of performing a maximum likelihood spectral fit of the model's parameters directly on observations, as opposed to the MCMC `fit to flux points `_ featured in Naima. All the parameters defining the parent population of charged particles are stored as `~gammapy.modeling.Parameter` and left free by default. In case that the radiative model is `~naima.radiative.Synchrotron`, the magnetic field strength may also be fitted. Parameters can be freezed/unfreezed before the fit, and maximum/minimum values can be set to limit the parameters space to the physically interesting region. """ # %% # Example plot # ------------ # Here we create and plot a spectral model that convolves an `~gammapy.modeling.models.ExpCutoffPowerLawSpectralModel` # electron distribution with an `InverseCompton` radiative model, in the presence of multiple seed photon fields. 
from astropy import units as u import matplotlib.pyplot as plt import naima from gammapy.modeling.models import Models, NaimaSpectralModel, SkyModel particle_distribution = naima.models.ExponentialCutoffPowerLaw( 1e30 / u.eV, 10 * u.TeV, 3.0, 30 * u.TeV ) radiative_model = naima.radiative.InverseCompton( particle_distribution, seed_photon_fields=["CMB", ["FIR", 26.5 * u.K, 0.415 * u.eV / u.cm**3]], Eemin=100 * u.GeV, ) model = NaimaSpectralModel(radiative_model, distance=1.5 * u.kpc) opts = { "energy_bounds": [10 * u.GeV, 80 * u.TeV], "sed_type": "e2dnde", } # Plot the total inverse Compton emission model.plot(label="IC (total)", **opts) # Plot the separate contributions from each seed photon field for seed, ls in zip(["CMB", "FIR"], ["-", "--"]): model = NaimaSpectralModel(radiative_model, seed=seed, distance=1.5 * u.kpc) model.plot(label=f"IC ({seed})", ls=ls, color="gray", **opts) plt.legend(loc="best") plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="naima-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_piecewise_norm_spectral.py0000644000175100001770000000167114721316200025651 0ustar00runnerdockerr""" .. _piecewise-norm-spectral: Piecewise norm spectral model ============================== This model parametrises a piecewise spectral correction with a free norm parameter at each fixed energy node. """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, PiecewiseNormSpectralModel, PowerLawSpectralModel, SkyModel, ) energy_bounds = [0.1, 100] * u.TeV model = PiecewiseNormSpectralModel( energy=[0.1, 1, 3, 10, 30, 100] * u.TeV, norms=[1, 3, 8, 10, 8, 2], ) model.plot(energy_bounds, yunits=u.Unit("")) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = model * PowerLawSpectralModel() model = SkyModel(spectral_model=model, name="piecewise-norm-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_powerlaw.py0000644000175100001770000000155614721316200022606 0ustar00runnerdockerr""" .. _powerlaw-spectral-model: Power law spectral model ======================== This model parametrises a power law spectrum. It is defined by the following equation: .. 
math:: \phi(E) = \phi_0 \cdot \left( \frac{E}{E_0} \right)^{-\Gamma} """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import Models, PowerLawSpectralModel, SkyModel energy_bounds = [0.1, 100] * u.TeV model = PowerLawSpectralModel( index=2, amplitude="1e-12 TeV-1 cm-2 s-1", reference=1 * u.TeV, ) model.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="power-law-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_powerlaw2.py0000644000175100001770000000207414721316200022664 0ustar00runnerdockerr""" .. _powerlaw2-spectral-model: Power law 2 spectral model ========================== This model parametrises a power law spectrum with integral as amplitude parameter. It is defined by the following equation: .. math:: \phi(E) = F_0 \cdot \frac{-\Gamma + 1}{E_{0, max}^{-\Gamma + 1} - E_{0, min}^{-\Gamma + 1}} \cdot E^{-\Gamma} so that the integral of :math:`\phi(E)` from ``emin`` to ``emax`` equals the amplitude :math:`F_0`. See also: https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/source_models.html """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import Models, PowerLaw2SpectralModel, SkyModel energy_bounds = [0.1, 100] * u.TeV model = PowerLaw2SpectralModel( amplitude=u.Quantity(1e-12, "cm-2 s-1"), index=2.3, emin=1 * u.TeV, emax=10 * u.TeV, ) model.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="power-law2-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_powerlaw_norm_spectral.py0000644000175100001770000000220214721316200025525 0ustar00runnerdockerr""" .. _powerlaw-spectral-norm-model: Power law norm spectral model ============================= This model parametrises a power law spectral correction with a norm and tilt parameter. """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, PowerLawNormSpectralModel, SkyModel, TemplateSpectralModel, ) energy_bounds = [0.1, 100] * u.TeV energy = [0.3, 1, 3, 10, 30] * u.TeV values = [40, 30, 20, 10, 1] * u.Unit("TeV-1 s-1 cm-2") template = TemplateSpectralModel(energy, values) norm = PowerLawNormSpectralModel( norm=5, reference=1 * u.TeV, ) template.plot(energy_bounds=energy_bounds, label="Template model") pwl_norm = template * norm pwl_norm.plot(energy_bounds, label="Template model with PowerLaw norm correction") plt.legend(loc="best") plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=pwl_norm, name="power-law-norm-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_smooth_broken_powerlaw.py0000644000175100001770000000211314721316200025525 0ustar00runnerdockerr""" ..
_smooth-broken-powerlaw-spectral-model: Smooth broken power law spectral model ====================================== This model parametrises a smooth broken power law spectrum. It is defined by the following equation: .. math:: \phi(E) = \phi_0 \cdot \left( \frac{E}{E_0} \right)^{-\Gamma_1}\left(1 + \left( \frac{E}{E_{break}} \right)^{\frac{\Gamma_2-\Gamma_1}{\beta}} \right)^{-\beta} """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import Models, SkyModel, SmoothBrokenPowerLawSpectralModel energy_bounds = [0.1, 100] * u.TeV model = SmoothBrokenPowerLawSpectralModel( index1=1.5, index2=2.5, amplitude="1e-12 TeV-1 cm-2 s-1", ebreak="1 TeV", reference="1 TeV", beta=1, ) model.plot(energy_bounds) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="smooth-broken-power-law-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_super_exp_cutoff_powerlaw_3fgl.py0000644000175100001770000000243214721316200027153 0ustar00runnerdockerr""" .. _super-exp-cutoff-powerlaw-3fgl-spectral-model: Super exponential cutoff power law model used for 3FGL ====================================================== This model parametrises super exponential cutoff power-law model spectrum used for 3FGL. It is defined by the following equation: .. math:: \phi(E) = \phi_0 \cdot \left(\frac{E}{E_0}\right)^{-\Gamma_1} \exp \left( \left(\frac{E_0}{E_{C}} \right)^{\Gamma_2} - \left(\frac{E}{E_{C}} \right)^{\Gamma_2} \right) """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, SkyModel, SuperExpCutoffPowerLaw3FGLSpectralModel, ) energy_bounds = [0.1, 100] * u.TeV model = SuperExpCutoffPowerLaw3FGLSpectralModel( index_1=1, index_2=2, amplitude="1e-12 TeV-1 s-1 cm-2", reference="1 TeV", ecut="10 TeV", ) model.plot(energy_bounds) plt.grid(which="both") plt.ylim(1e-24, 1e-10) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="super-exp-cutoff-power-law-3fgl-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_super_exp_cutoff_powerlaw_4fgl.py0000644000175100001770000000311514721316200027153 0ustar00runnerdockerr""" .. _super-exp-cutoff-powerlaw-4fgl-dr3-spectral-model: Super Exponential Cutoff Power Law Model used for 4FGL-DR3 ========================================================== This model parametrises super exponential cutoff power-law model spectrum used for 4FGL. It is defined by the following equation: ..
math:: \phi(E) = \begin{cases} \phi_0 \cdot \left(\frac{E}{E_0}\right)^{\frac{a}{\Gamma_2} -\Gamma_1} \cdot \exp \left( \frac{a}{\Gamma_2^2}\left( 1 - \left(\frac{E}{E_0}\right)^{\Gamma_2} \right) \right) \\ \phi_0 \cdot \left(\frac{E}{E_0}\right)^{ -\Gamma_1 - \frac{a}{2} \ln \frac{E}{E_0} - \frac{a \Gamma_2}{6} \ln^2 \frac{E}{E_0} - \frac{a \Gamma_2^2}{24} \ln^3 \frac{E}{E_0}} & \text{for } \left| \Gamma_2 \ln \frac{E}{E_0} \right| < 10^{-2} \end{cases} See Equation (2) and (3) in https://arxiv.org/pdf/2201.11184.pdf """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, SkyModel, SuperExpCutoffPowerLaw4FGLDR3SpectralModel, ) energy_range = [0.1, 100] * u.TeV model = SuperExpCutoffPowerLaw4FGLDR3SpectralModel( index_1=1, index_2=2, amplitude="1e-12 TeV-1 cm-2 s-1", reference="1 TeV", expfactor=1e-2, ) model.plot(energy_range) plt.grid(which="both") plt.ylim(1e-24, 1e-10) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="super-exp-cutoff-power-law-4fgl-dr3-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_super_exp_cutoff_powerlaw_4fgl_dr1.py0000644000175100001770000000246514721316200027730 0ustar00runnerdockerr""" .. _super-exp-cutoff-powerlaw-4fgl-spectral-model: Super Exponential Cutoff Power Law Model used for 4FGL-DR1 (and DR2) ==================================================================== This model parametrises super exponential cutoff power-law model spectrum used for 4FGL. It is defined by the following equation: .. math:: \phi(E) = \phi_0 \cdot \left(\frac{E}{E_0}\right)^{-\Gamma_1} \exp \left( a \left( E_0 ^{\Gamma_2} - E^{\Gamma_2} \right) \right) See Equation (4) in https://arxiv.org/pdf/1902.10045.pdf """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, SkyModel, SuperExpCutoffPowerLaw4FGLSpectralModel, ) energy_range = [0.1, 100] * u.TeV model = SuperExpCutoffPowerLaw4FGLSpectralModel( index_1=1, index_2=2, amplitude="1e-12 TeV-1 cm-2 s-1", reference="1 TeV", expfactor=1e-2, ) model.plot(energy_range) plt.grid(which="both") plt.ylim(1e-24, 1e-10) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=model, name="super-exp-cutoff-power-law-4fgl-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/spectral/plot_template_spectral.py0000644000175100001770000000436414721316200024456 0ustar00runnerdockerr""" .. _template-spectral-model: Template spectral model ======================= This model is defined by custom tabular values. The units returned will be the units of the values array provided at initialization. The model will return values interpolated in log-space, returning 0 for energies outside of the limits of the provided energy array. 
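As a sketch of this behaviour (assuming the default linear interpolation of the log of the values versus the log of the energy, not an exact statement of the implementation), two adjacent nodes :math:`(E_1, v_1)` and :math:`(E_2, v_2)` are effectively connected by a power law: .. math:: v(E) = v_1 \left( \frac{E}{E_1} \right)^{s} \,, \qquad s = \frac{\log{(v_2 / v_1)}}{\log{(E_2 / E_1)}} for :math:`E_1 \leq E \leq E_2`.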
The class implementation follows closely what has been done in `naima.models.TemplateSpectralModel` """ # %% # Example plot # ------------ # Here is an example plot of the model: import numpy as np from astropy import units as u import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, PowerLawNormSpectralModel, SkyModel, TemplateSpectralModel, ) energy_bounds = [0.1, 1] * u.TeV energy = np.array([1e6, 3e6, 1e7, 3e7]) * u.MeV values = np.array([4.4e-38, 2.0e-38, 8.8e-39, 3.9e-39]) * u.Unit("MeV-1 s-1 cm-2") template = TemplateSpectralModel(energy=energy, values=values) template.plot(energy_bounds) plt.grid(which="both") # %% # Example of extrapolation # ------------------------ # The following shows how to implement extrapolation of a template spectral model: energy = [0.5, 1, 3, 10, 20] * u.TeV values = [40, 30, 20, 10, 1] * u.Unit("TeV-1 s-1 cm-2") template_noextrapolate = TemplateSpectralModel( energy=energy, values=values, interp_kwargs={"extrapolate": False}, ) template_extrapolate = TemplateSpectralModel( energy=energy, values=values, interp_kwargs={"extrapolate": True} ) energy_bounds = [0.2, 80] * u.TeV template_extrapolate.plot(energy_bounds, label="Extrapolated", alpha=0.4, color="blue") template_noextrapolate.plot( energy_bounds, label="Not extrapolated", ls="--", color="black" ) plt.legend() # %% # Spectral corrections to templates can be applied by multiplication with a normalized spectral model, # for example `gammapy.modeling.models.PowerLawNormSpectralModel`. # This operation creates a new `gammapy.modeling.models.CompoundSpectralModel` new_model = template * PowerLawNormSpectralModel(norm=2, tilt=0) print(new_model) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel(spectral_model=template, name="template-model") models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.168642 gammapy-1.3/examples/models/temporal/0000755000175100001770000000000014721316215017343 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/README.rst0000644000175100001770000000007614721316200021027 0ustar00runnerdockerTemporal models --------------- .. _temporal-models-gallery: ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_constant_temporal.py0000644000175100001770000000155114721316200024503 0ustar00runnerdockerr""" .. _constant-temporal-model: Constant temporal model ======================= This model parametrises a constant time model. .. 
math:: F(t) = k """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u from astropy.time import Time import matplotlib.pyplot as plt from gammapy.modeling.models import ( ConstantTemporalModel, Models, PowerLawSpectralModel, SkyModel, ) time_range = [Time.now(), Time.now() + 1 * u.d] constant_model = ConstantTemporalModel(const=1) constant_model.plot(time_range) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=constant_model, name="constant-model", ) models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_expdecay_temporal.py0000644000175100001770000000171114721316200024452 0ustar00runnerdockerr""" .. _expdecay-temporal-model: ExpDecay temporal model ======================= This model parametrises an ExpDecay time model. .. math:: F(t) = \exp \left( -\frac{t - t_{\rm{ref}}}{t_0} \right) """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u from astropy.time import Time import matplotlib.pyplot as plt from gammapy.modeling.models import ( ExpDecayTemporalModel, Models, PowerLawSpectralModel, SkyModel, ) t0 = "5 h" t_ref = Time("2020-10-01") time_range = [t_ref, t_ref + 1 * u.d] expdecay_model = ExpDecayTemporalModel(t_ref=t_ref.mjd * u.d, t0=t0) expdecay_model.plot(time_range) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=expdecay_model, name="expdecay_model", ) models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_gaussian_temporal.py0000644000175100001770000000177514721316200024474 0ustar00runnerdockerr""" .. _gaussian-temporal-model: Gaussian temporal model ======================= This model parametrises a gaussian time model. .. math:: F(t) = \exp \left( -0.5 \cdot \frac{ (t - t_{\rm{ref}})^2 } { \sigma^2 } \right) """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u from astropy.time import Time import matplotlib.pyplot as plt from gammapy.modeling.models import ( GaussianTemporalModel, Models, PowerLawSpectralModel, SkyModel, ) sigma = "3 h" t_ref = Time("2020-10-01") time_range = [t_ref - 0.5 * u.d, t_ref + 0.5 * u.d] gaussian_model = GaussianTemporalModel(t_ref=t_ref.mjd * u.d, sigma=sigma) gaussian_model.plot(time_range) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=gaussian_model, name="gaussian_model", ) models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_generalized_gaussian_temporal.py0000644000175100001770000000266014721316200027037 0ustar00runnerdockerr""" .. _generalized-gaussian-temporal-model: Generalized Gaussian temporal model =================================== This model parametrises a generalized Gaussian time model. ..
math:: F(t) = \exp \left( - 0.5 \cdot \left( \frac{|t - t_{\rm{ref}}|}{t_{\rm{rise}}} \right) ^ {1 / \eta} \right) \text{ for } t < t_{\rm{ref}} F(t) = \exp \left( - 0.5 \cdot \left( \frac{|t - t_{\rm{ref}}|}{t_{\rm{decay}}} \right) ^ {1 / \eta} \right) \text{ for } t > t_{\rm{ref}} """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u from astropy.time import Time from astropy.units import Quantity import matplotlib.pyplot as plt from gammapy.modeling.models import ( GeneralizedGaussianTemporalModel, Models, PowerLawSpectralModel, SkyModel, ) t_rise = Quantity(0.1, "d") t_decay = Quantity(1, "d") eta = Quantity(2 / 3, "") t_ref = Time("2020-10-01") time_range = [t_ref - 1 * u.d, t_ref + 1 * u.d] gen_gaussian_model = GeneralizedGaussianTemporalModel( t_ref=t_ref.mjd * u.d, t_rise=t_rise, t_decay=t_decay, eta=eta ) gen_gaussian_model.plot(time_range) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=gen_gaussian_model, name="generalized_gaussian_model", ) models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_linear_temporal.py0000644000175100001770000000167214721316200024130 0ustar00runnerdockerr""" .. _linear-temporal-model: Linear temporal model ======================= This model parametrises a linear time model. .. math:: F(t) = \alpha + \beta \cdot (t - t_{\rm{ref}}) """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u from astropy.time import Time import matplotlib.pyplot as plt from gammapy.modeling.models import ( LinearTemporalModel, Models, PowerLawSpectralModel, SkyModel, ) time_range = [Time.now(), Time.now() + 2 * u.d] linear_model = LinearTemporalModel( alpha=1, beta=0.5 / u.d, t_ref=(time_range[0].mjd - 0.1) * u.d ) linear_model.plot(time_range) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=linear_model, name="linear-model", ) models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_powerlaw_temporal.py0000644000175100001770000000170514721316200024513 0ustar00runnerdockerr""" .. _powerlaw-temporal-model: PowerLaw temporal model ======================= This model parametrises a power-law time model. .. 
math:: F(t) = \left( \frac{t - t_{\rm{ref}}}{t_0} \right)^\alpha """ # %% # Example plot # ------------ # Here is an example plot of the model: from astropy import units as u from astropy.time import Time import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, PowerLawSpectralModel, PowerLawTemporalModel, SkyModel, ) time_range = [Time.now(), Time.now() + 2 * u.d] pl_model = PowerLawTemporalModel(alpha=-2.0, t_ref=(time_range[0].mjd - 0.1) * u.d) pl_model.plot(time_range) plt.grid(which="both") plt.yscale("log") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=pl_model, name="powerlaw-model", ) models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_sine_temporal.py0000644000175100001770000000176714721316200023621 0ustar00runnerdockerr""" .. _sine-temporal-model: Sine temporal model =================== This model parametrises a time model of sinusoidal modulation. .. math:: F(t) = 1 + amp \cdot \sin(\omega \cdot (t-t_{\rm{ref}})) """ # %% # Example plot # ------------ # Here is an example plot of the model: import numpy as np from astropy import units as u from astropy.time import Time import matplotlib.pyplot as plt from gammapy.modeling.models import ( Models, PowerLawSpectralModel, SineTemporalModel, SkyModel, ) time_range = [Time.now(), Time.now() + 16 * u.d] omega = np.pi / 4.0 * u.rad / u.day sine_model = SineTemporalModel( amp=0.5, omega=omega, t_ref=(time_range[0].mjd - 0.1) * u.d ) sine_model.plot(time_range) plt.grid(which="both") # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=sine_model, name="sine-model", ) models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_template_phase_temporal.py0000644000175100001770000000167014721316200025647 0ustar00runnerdockerr""" .. _PhaseCurve-temporal-model: Phase curve temporal model ========================== This model parametrises a PhaseCurve time model, i.e. with a template phasogram and timing parameters """ import astropy.units as u from astropy.time import Time from gammapy.modeling.models import ( Models, PowerLawSpectralModel, SkyModel, TemplatePhaseCurveTemporalModel, ) path = "$GAMMAPY_DATA/tests/phasecurve_LSI_DC.fits" t_ref = 43366.275 * u.d f0 = 1.0 / (26.7 * u.d) phase_model = TemplatePhaseCurveTemporalModel.read(path, t_ref, 0.0, f0) time_range = [Time("59100", format="mjd"), Time("59200", format="mjd")] phase_model.plot(time_range, n_points=400) # %% # YAML representation # ------------------- # Here is an example YAML file using the model: model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=phase_model, name="phase_curve_model", ) models = Models([model]) print(models.to_yaml()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/models/temporal/plot_template_temporal.py0000644000175100001770000000252314721316200024465 0ustar00runnerdockerr""" .. _LightCurve-temporal-model: Light curve temporal model ========================== This model parametrises a LightCurve time model. 
The gammapy internal lightcurve model format is a `~gammapy.maps.RegionNDMap`
with `time` and, optionally, `energy` axes. The times are defined with respect
to a reference time.

For serialisation, a `table` and a `map` format are supported.
The `table` format is a `~astropy.table.Table` with the `reference_time`
serialised as a dictionary in the table meta. Only maps without an energy
axis can be serialised to this format.

In `map` format, a `~gammapy.maps.RegionNDMap` is serialised, with the
`reference_time` in the SKYMAP_BANDS HDU.
"""

from astropy.time import Time
from gammapy.modeling.models import (
    LightCurveTemplateTemporalModel,
    Models,
    PowerLawSpectralModel,
    SkyModel,
)

time_range = [Time("59100", format="mjd"), Time("59365", format="mjd")]
path = "$GAMMAPY_DATA/tests/models/light_curve/lightcrv_PKSB1222+216.fits"
light_curve_model = LightCurveTemplateTemporalModel.read(path)
light_curve_model.plot(time_range)

# %%
# YAML representation
# -------------------
# Here is an example YAML file using the model:

model = SkyModel(
    spectral_model=PowerLawSpectralModel(),
    temporal_model=light_curve_model,
    name="light_curve_model",
)
models = Models([model])

print(models.to_yaml())
././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.168642
gammapy-1.3/examples/tutorials/0000755000175100001770000000000014721316215016263 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/README.rst0000644000175100001770000000161014721316200017752 0ustar00runnerdocker.. include:: ../references.txt

.. _tutorials:

=========
Tutorials
=========

**Notice:** it is advised to first read :ref:`package_structure` of the User Guide
before using the tutorials.

This page lists the Gammapy tutorials that are available as `Jupyter`_ notebooks.
You can read them here, or execute them using a temporary cloud server in Binder.

To execute them locally, you have to first install Gammapy locally (see
:ref:`installation`) and download the tutorial notebooks and example datasets (see
:ref:`getting-started`).

Once Gammapy is installed, remember that you can always use ``gammapy info`` to
check your setup.

Gammapy is a Python package built on `Numpy`_ and `Astropy`_, so to use it
effectively, you have to learn the basics. Many good free resources are
available, e.g. `A Whirlwind tour of Python`_, the `Python data science
handbook`_ and the `Astropy Hands-On Tutorial`_.
././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.168642
gammapy-1.3/examples/tutorials/analysis-1d/0000755000175100001770000000000014721316215020410 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/analysis-1d/README.rst0000644000175100001770000000046614721316200022077 0ustar00runnerdockerData analysis
=============

The following set of tutorials is devoted to data analysis, grouped according
to the specific use cases covered: spectral analysis and flux fitting, image
and cube analysis modelling and fitting, as well as time-dependent analysis
with light curves.
1D Spectral -----------././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-1d/cta_sensitivity.py0000644000175100001770000002155214721316200024202 0ustar00runnerdocker""" Point source sensitivity ======================== Estimate the CTAO sensitivity for a point-like IRF at a fixed zenith angle and fixed offset. Introduction ------------ This notebook explains how to estimate the CTAO sensitivity for a point-like IRF at a fixed zenith angle and fixed offset, using the full containment IRFs distributed for the CTA 1DC. The significance is computed for a 1D analysis (ON-OFF regions) with the Li&Ma formula. We use here an approximate approach with an energy dependent integration radius to take into account the variation of the PSF. We will first determine the 1D IRFs including a containment correction. We will be using the following Gammapy class: - `~gammapy.estimators.SensitivityEstimator` """ from cycler import cycler import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord # %matplotlib inline from regions import CircleSkyRegion import matplotlib.pyplot as plt ###################################################################### # Setup # ----- # # As usual, we’ll start with some setup … # from IPython.display import display from gammapy.data import FixedPointingInfo, Observation, observatory_locations from gammapy.datasets import SpectrumDataset, SpectrumDatasetOnOff from gammapy.estimators import FluxPoints, SensitivityEstimator from gammapy.irf import load_irf_dict_from_file from gammapy.makers import SpectrumDatasetMaker from gammapy.maps import MapAxis, RegionGeom from gammapy.maps.axes import UNIT_STRING_FORMAT ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Define analysis region and energy binning # ----------------------------------------- # # Here we assume a source at 0.5 degree from pointing position. We perform # a simple energy independent extraction for now with a radius of 0.1 # degree. # energy_axis = MapAxis.from_energy_bounds(0.03 * u.TeV, 30 * u.TeV, nbin=20) energy_axis_true = MapAxis.from_energy_bounds( 0.01 * u.TeV, 100 * u.TeV, nbin=100, name="energy_true" ) pointing = SkyCoord(ra=0 * u.deg, dec=0 * u.deg) pointing_info = FixedPointingInfo(fixed_icrs=pointing) offset = 0.5 * u.deg source_position = pointing.directional_offset_by(0 * u.deg, offset) on_region_radius = 0.1 * u.deg on_region = CircleSkyRegion(source_position, radius=on_region_radius) geom = RegionGeom.create(on_region, axes=[energy_axis]) empty_dataset = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true) ###################################################################### # Load IRFs and prepare dataset # ----------------------------- # # We extract the 1D IRFs from the full 3D IRFs provided by CTAO. # irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) location = observatory_locations["cta_south"] livetime = 50.0 * u.h obs = Observation.create( pointing=pointing_info, irfs=irfs, livetime=livetime, location=location ) ###################################################################### # Initiate and run the `~gammapy.makers.SpectrumDatasetMaker`. 
#
# Note that here we ensure ``containment_correction=False``, which allows us to
# apply our own containment correction in the next part of the tutorial.
#

spectrum_maker = SpectrumDatasetMaker(
    selection=["exposure", "edisp", "background"],
    containment_correction=False,
)

dataset = spectrum_maker.run(empty_dataset, obs)

######################################################################
# Now we correct for the energy dependent region size.
#
# **Note**: In the calculation of the containment radius, we use the point spread function,
# which is defined as a function of true energy, to compute the correction we apply in
# reconstructed energy, thus neglecting the energy dispersion in this step.
#
# Start by correcting the exposure:
#

containment = 0.68
dataset.exposure *= containment

######################################################################
# Next, correct the background estimation.
#
# Warning: this neglects the energy dispersion by computing the containment
# radius from the PSF in true energy but using the reco energy axis.
#

on_radii = obs.psf.containment_radius(
    energy_true=energy_axis.center, offset=offset, fraction=containment
)
factor = (1 - np.cos(on_radii)) / (1 - np.cos(on_region_radius))
dataset.background *= factor.value.reshape((-1, 1, 1))

######################################################################
# Finally, define a `~gammapy.datasets.SpectrumDatasetOnOff` with an alpha of 0.2.
# The off counts are created from the background model:
#

dataset_on_off = SpectrumDatasetOnOff.from_spectrum_dataset(
    dataset=dataset, acceptance=1, acceptance_off=5
)

######################################################################
# Compute sensitivity
# -------------------
#
# We impose a minimal number of expected signal counts of 10 per bin and a
# minimal significance of 5 per bin. The excess must also be larger than 5% of the background.
#
# We assume an alpha of 0.2 (ratio between ON and OFF area). We then run the sensitivity estimator.
#
# These are the conditions imposed in standard CTAO sensitivity computations.

sensitivity_estimator = SensitivityEstimator(
    gamma_min=10,
    n_sigma=5,
    bkg_syst_fraction=0.05,
)
sensitivity_table = sensitivity_estimator.run(dataset_on_off)

######################################################################
# Results
# -------
#
# The results are given as a `~astropy.table.Table`, which can be written to
# disk utilising the usual `~astropy.table.Table.write` method.
# A ``criterion`` column allows us to distinguish bins where the sensitivity
# is limited by the statistical significance of the signal from bins where it
# is limited by the number of signal counts. This is visible in the plot below.
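######################################################################
# For instance, the sensitivity table could be saved with the standard
# `~astropy.table.Table.write` method; the file name below is only an
# illustrative choice:
#
# sensitivity_table.write("sensitivity.ecsv", format="ascii.ecsv", overwrite=True)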
#

display(sensitivity_table)

######################################################################
# Plot the sensitivity curve
#

fig, ax = plt.subplots()

ax.set_prop_cycle(cycler("marker", "s*v") + cycler("color", "rgb"))

for criterion in ("significance", "gamma", "bkg"):
    mask = sensitivity_table["criterion"] == criterion
    t = sensitivity_table[mask]

    ax.errorbar(
        t["e_ref"],
        t["e2dnde"],
        xerr=0.5 * (t["e_max"] - t["e_min"]),
        label=criterion,
        linestyle="",
    )

ax.loglog()

ax.set_xlabel(f"Energy [{t['e_ref'].unit.to_string(UNIT_STRING_FORMAT)}]")
ax.set_ylabel(f"Sensitivity [{t['e2dnde'].unit.to_string(UNIT_STRING_FORMAT)}]")

ax.legend()

plt.show()

######################################################################
# We add some control plots showing the expected number of background
# counts per bin and the ON region size cut (here the 68% containment
# radius of the PSF).
#
# Plot expected number of counts for signal and background.
#

fig, ax1 = plt.subplots()
ax1.plot(
    sensitivity_table["e_ref"],
    sensitivity_table["background"],
    "o-",
    color="black",
    label="background",
)
ax1.loglog()
ax1.set_xlabel(f"Energy [{t['e_ref'].unit.to_string(UNIT_STRING_FORMAT)}]")
ax1.set_ylabel("Expected number of bkg counts")

ax2 = ax1.twinx()
ax2.set_ylabel(
    f"ON region radius [{on_radii.unit.to_string(UNIT_STRING_FORMAT)}]", color="red"
)
ax2.semilogy(sensitivity_table["e_ref"], on_radii, color="red", label="PSF68")
ax2.tick_params(axis="y", labelcolor="red")
ax2.set_ylim(0.01, 0.5)
plt.show()

######################################################################
# Obtaining an integral flux sensitivity
# --------------------------------------
#
# It is often useful to obtain the integral sensitivity above a certain
# threshold. In this case, it is simplest to use a dataset with one energy bin
# while setting the high energy edge to a very large value.
# Here, we simply squash the previously created dataset into one with a single
# energy bin.
#

dataset_on_off1 = dataset_on_off.to_image()
sensitivity_estimator = SensitivityEstimator(
    gamma_min=5, n_sigma=3, bkg_syst_fraction=0.10
)
sensitivity_table = sensitivity_estimator.run(dataset_on_off1)
print(sensitivity_table)

######################################################################
# To get the integral flux, we convert to a `~gammapy.estimators.FluxPoints` object
# that does the conversion internally.
#

flux_points = FluxPoints.from_table(
    sensitivity_table,
    sed_type="e2dnde",
    reference_model=sensitivity_estimator.spectral_model,
)
print(
    f"Integral sensitivity in {livetime:.2f} above {energy_axis.edges[0]:.2e} "
    f"is {np.squeeze(flux_points.flux.quantity):.2e}"
)

######################################################################
# Exercises
# ---------
#
# - Compute the sensitivity for a 20 hour observation.
# - Compare how the sensitivity differs between 5 and 20 hours by
#   plotting the ratio as a function of energy.
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/analysis-1d/ebl.py0000644000175100001770000002050014721316200021513 0ustar00runnerdocker"""
Account for spectral absorption due to the EBL
==============================================

Gamma rays emitted from extra-galactic objects, e.g. blazars, interact
with the photons of the Extragalactic Background Light (EBL) through
pair production and are attenuated, thus modifying the intrinsic
spectrum.

Various models of the EBL are supplied in `GAMMAPY_DATA`.
This notebook shows how to use these models to correct for this interaction.
"""

######################################################################
# Setup
# -----
#
# As usual, we’ll start with the standard imports …
#

import astropy.units as u
import matplotlib.pyplot as plt
from gammapy.catalog import SourceCatalog4FGL
from gammapy.datasets import SpectrumDatasetOnOff
from gammapy.estimators import FluxPointsEstimator
from gammapy.modeling import Fit
from gammapy.modeling.models import (
    EBL_DATA_BUILTIN,
    EBLAbsorptionNormSpectralModel,
    GaussianPrior,
    PowerLawSpectralModel,
    SkyModel,
)

######################################################################
# Load the data
# -------------
#
# We will use 6 observations of the blazar PKS 2155-304 taken in 2008 by
# H.E.S.S. when it was in a steady state. The data have already been
# reduced to OGIP format `SpectrumDatasetOnOff` following the procedure
# of the :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial, using a
# `ReflectedRegions` background estimation. The spectra and IRFs from the
# 6 observations have been stacked together.
#
# We will load this dataset as a `~gammapy.datasets.SpectrumDatasetOnOff` and proceed with
# the modeling. You can do a 3D analysis as well.
#

dataset = SpectrumDatasetOnOff.read(
    "$GAMMAPY_DATA/PKS2155-steady/pks2155-304_steady.fits.gz"
)
print(dataset)

######################################################################
# Model the observed spectrum
# ---------------------------
#
# The observed spectrum is already attenuated due to the EBL. Assuming
# that the intrinsic spectrum is a power law, the observed spectrum is a
# `gammapy.modeling.models.CompoundSpectralModel` given by the product of an EBL model with the
# intrinsic model.
#

######################################################################
# For a list of available models, see
# :doc:`/api/gammapy.modeling.models.EBL_DATA_BUILTIN`.
#

print(EBL_DATA_BUILTIN.keys())

######################################################################
# To use other EBL models, you need to save the optical depth as a
# function of energy and redshift as an XSPEC model.
# Alternatively, you can use packages like `ebltable `_,
# which show how to interface other EBL models with Gammapy.
#

######################################################################
# Define the power law
#

index = 2.3
amplitude = 1.81 * 1e-12 * u.Unit("cm-2 s-1 TeV-1")
reference = 1 * u.TeV
pwl = PowerLawSpectralModel(index=index, amplitude=amplitude, reference=reference)
pwl.index.frozen = False
# Specify the redshift of the source
redshift = 0.116

# Load the EBL model. Here we use the model from Dominguez, 2011
absorption = EBLAbsorptionNormSpectralModel.read_builtin("dominguez", redshift=redshift)

# The power-law model is multiplied by the EBL to get the final model
spectral_model = pwl * absorption
print(spectral_model)

######################################################################
# Now, create a sky model and proceed with the fit
#

sky_model = SkyModel(spatial_model=None, spectral_model=spectral_model, name="pks2155")

dataset.models = sky_model

######################################################################
# Note that since this dataset has been produced
# by a reflected region analysis, it uses the ON-OFF statistic
# and does not require a background model.
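######################################################################
# As a quick check, the fit statistic the dataset will use can be inspected
# via its ``stat_type`` attribute:
#
# print(dataset.stat_type)  # expected to be "wstat" for ON-OFF data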
#

fit = Fit()
result = fit.run(datasets=[dataset])

# we make a copy here to compare it later
model_best = sky_model.copy()

print(result.models.to_parameters_table())

######################################################################
# Get the flux points
# ===================
#
# To get the observed flux points, just run the `~gammapy.estimators.FluxPointsEstimator`
# normally
#

energy_edges = dataset.counts.geom.axes["energy"].edges
fpe = FluxPointsEstimator(
    energy_edges=energy_edges, source="pks2155", selection_optional="all"
)
flux_points_obs = fpe.run(datasets=[dataset])

######################################################################
# To get the deabsorbed flux points (i.e. the intrinsic points), we simply need
# to set the reference model to the best fit power law instead of the
# compound model.
#

flux_points_intrinsic = flux_points_obs.copy(
    reference_model=SkyModel(spectral_model=pwl)
)

#
print(flux_points_obs.reference_model)

#
print(flux_points_intrinsic.reference_model)

######################################################################
# Plot the observed and intrinsic fluxes
# --------------------------------------
#

plt.figure()
sed_type = "e2dnde"
energy_bounds = [0.2, 20] * u.TeV
ax = flux_points_obs.plot(sed_type=sed_type, label="observed", color="navy")
flux_points_intrinsic.plot(ax=ax, sed_type=sed_type, label="intrinsic", color="red")

model_best.spectral_model.plot(
    ax=ax, energy_bounds=energy_bounds, sed_type=sed_type, color="blue"
)
model_best.spectral_model.plot_error(
    ax=ax, energy_bounds=energy_bounds, sed_type="e2dnde", facecolor="blue"
)

pwl.plot(ax=ax, energy_bounds=energy_bounds, sed_type=sed_type, color="tomato")
pwl.plot_error(
    ax=ax, energy_bounds=energy_bounds, sed_type=sed_type, facecolor="tomato"
)
plt.ylim(bottom=1e-13)
plt.legend()
plt.show()

# sphinx_gallery_thumbnail_number = 2

######################################################################
# Further extensions
# ------------------
#
# In this notebook, we have kept the parameters of the EBL model,
# `alpha_norm` and `redshift`, frozen. Under reasonable assumptions
# on the intrinsic spectrum, it may be possible to constrain these
# parameters.
#
# Example: We now assume that the Fermi-LAT 4FGL catalog spectrum of the
# source is a good representation of the intrinsic spectrum.
#
# *NOTE*: This is a very simplified assumption and in reality, EBL
# absorption can affect the Fermi spectrum significantly. Also, blazar
# spectra vary with time and long term averaged states may not be
# representative of a specific steady state.
#

catalog = SourceCatalog4FGL()
src = catalog["PKS 2155-304"]

# Get the intrinsic model
intrinsic_model = src.spectral_model()
print(intrinsic_model)

######################################################################
# We add Gaussian priors on the `alpha` and `beta` parameters based on the 4FGL
# measurements and the associated errors.
For more details on using priors, see
# :doc:`/tutorials/api/priors`
#

intrinsic_model.alpha.prior = GaussianPrior(
    mu=intrinsic_model.alpha.value, sigma=intrinsic_model.alpha.error
)
intrinsic_model.beta.prior = GaussianPrior(
    mu=intrinsic_model.beta.value, sigma=intrinsic_model.beta.error
)

######################################################################
# As before, multiply the intrinsic model with the EBL model
#

obs_model = intrinsic_model * absorption

######################################################################
# Now, free the redshift of the source
#

obs_model.parameters["redshift"].frozen = False

print(obs_model.parameters.to_table())

sky_model = SkyModel(spectral_model=obs_model, name="observed")

dataset.models = sky_model

result1 = fit.run([dataset])
print(result1.parameters.to_table())

######################################################################
# Get a fit stat profile for the redshift
# ---------------------------------------
#
# For more information about stat profiles, see
# :doc:`/tutorials/api/fitting`
#

total_stat = result1.total_stat

par = sky_model.parameters["redshift"]
par.scan_max = par.value + 5.0 * par.error
par.scan_min = max(0, par.value - 5.0 * par.error)
par.scan_n_values = 31

# %time
profile = fit.stat_profile(
    datasets=[dataset], parameter=sky_model.parameters["redshift"], reoptimize=True
)

plt.figure()
ax = plt.gca()
ax.plot(
    profile["observed.spectral.model2.redshift_scan"], profile["stat_scan"] - total_stat
)
ax.set_title("TS profile")
ax.set_xlabel("Redshift")
ax.set_ylabel(r"$\Delta$ TS")
plt.show()

######################################################################
# We see that the redshift is well constrained.
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/analysis-1d/extended_source_spectral_analysis.py0000644000175100001770000002441114721316200027736 0ustar00runnerdocker"""
Spectral analysis of extended sources
=====================================

Perform a spectral analysis of an extended source.

Prerequisites
-------------

- Understanding of spectral analysis techniques in classical Cherenkov
  astronomy.
- Understanding the basic data reduction and modeling/fitting processes
  with the gammapy library API as shown in the tutorial
  :doc:`/tutorials/starting/analysis_2`

Context
-------

Many VHE sources in the Galaxy are extended. Studying them with a 1D
spectral analysis is more complex than studying point sources. One often
has to use complex (i.e. non-circular) regions and more importantly, one
has to take into account the fact that the instrument response is
non-uniform over the selected region. A typical example is given by the
supernova remnant RX J1713-3945, which is nearly 1 degree in diameter. See
the `following article `__.

**Objective: Measure the spectrum of RX J1713-3945 in a 1 degree region
fully enclosing it.**

Proposed approach
-----------------

We have seen in the general presentation of the spectrum extraction for
point sources (see :doc:`/tutorials/analysis-1d/spectral_analysis`
tutorial) that Gammapy uses specific dataset makers to first produce
reduced spectral data and then to extract OFF measurements with
reflected background techniques: the `~gammapy.makers.SpectrumDatasetMaker`
and the `~gammapy.makers.ReflectedRegionsBackgroundMaker`. However, if
the flag `use_region_center` is not set to `False`, the former simply
computes the reduced IRFs at the center of the ON region (assumed to be
circular).
This is no longer valid for extended sources. To be able to compute average responses in the ON region, we can set `use_region_center=False` with the `~gammapy.makers.SpectrumDatasetMaker`, in which case the values of the IRFs are averaged over the entire region. In summary, we have to: - Define an ON region (a `~regions.SkyRegion`) fully enclosing the source we want to study. - Define a `~gammapy.maps.RegionGeom` with the ON region and the required energy range (in particular, beware of the true energy range). - Create the necessary makers : - the spectrum dataset maker : `~gammapy.makers.SpectrumDatasetMaker` with `use_region_center=False` - the OFF background maker, here a `~gammapy.makers.ReflectedRegionsBackgroundMaker` - and usually the safe range maker : `~gammapy.makers.SafeMaskMaker` - Perform the data reduction loop. And for every observation: - Produce a spectrum dataset - Extract the OFF data to produce a `~gammapy.datasets.SpectrumDatasetOnOff` and compute a safe range for it. - Stack or store the resulting spectrum dataset. - Finally proceed with model fitting on the dataset as usual. Here, we will use the RX J1713-3945 observations from the H.E.S.S. first public test data release. The tutorial is implemented with the intermediate level API. Setup ----- As usual, we’ll start with some general imports… """ import astropy.units as u from astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion # %matplotlib inline import matplotlib.pyplot as plt from IPython.display import display from gammapy.data import DataStore from gammapy.datasets import Datasets, SpectrumDataset from gammapy.makers import ( ReflectedRegionsBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, ) from gammapy.maps import MapAxis, RegionGeom from gammapy.modeling import Fit from gammapy.modeling.models import PowerLawSpectralModel, SkyModel ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Select the data # --------------- # # We first set the datastore and retrieve a few observations from our # source. # datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs_ids = [20326, 20327, 20349, 20350, 20396, 20397] # In case you want to use all RX J1713 data in the H.E.S.S. DR1 # other_ids=[20421, 20422, 20517, 20518, 20519, 20521, 20898, 20899, 20900] observations = datastore.get_observations(obs_ids) ###################################################################### # Prepare the datasets creation # ----------------------------- # ###################################################################### # Select the ON region # ~~~~~~~~~~~~~~~~~~~~ # # Here we take a simple 1 degree circular region because it fits well with # the morphology of RX J1713-3945. More complex regions could be used # e.g. `~regions.EllipseSkyRegion` or `~regions.RectangleSkyRegion`. # target_position = SkyCoord(347.3, -0.5, unit="deg", frame="galactic") radius = Angle("0.5 deg") on_region = CircleSkyRegion(target_position, radius) ###################################################################### # Define the geometries # ~~~~~~~~~~~~~~~~~~~~~ # # This part is especially important. # # - We have to define first energy axes. They define the axes of the resulting # `~gammapy.datasets.SpectrumDatasetOnOff`. 
In particular, we have to be
# careful with the true energy axis: it has to cover a larger range than the
# reconstructed energy one.
# - Then we define the region geometry itself from the on region.
#
# The binning of the final spectrum is defined here.

energy_axis = MapAxis.from_energy_bounds(0.1, 40.0, 10, unit="TeV")

# Reduced IRFs are defined in true energy (i.e. not measured energy).
energy_axis_true = MapAxis.from_energy_bounds(
    0.05, 100, 30, unit="TeV", name="energy_true"
)

geom = RegionGeom(on_region, axes=[energy_axis])

######################################################################
# Create the makers
# ~~~~~~~~~~~~~~~~~
#
# First we instantiate the target `~gammapy.datasets.SpectrumDataset`.
#

dataset_empty = SpectrumDataset.create(
    geom=geom,
    energy_axis_true=energy_axis_true,
)

######################################################################
# Now we create its associated maker. Here we need to produce counts,
# exposure and edisp (energy dispersion) entries. PSF and IRF background
# are not needed, therefore we don’t compute them.
#
# **IMPORTANT**: Note that `use_region_center` is set to `False`. This
# is necessary so that the `~gammapy.makers.SpectrumDatasetMaker`
# considers the whole region in the IRF computation and not only the
# center.
#

maker = SpectrumDatasetMaker(
    selection=["counts", "exposure", "edisp"], use_region_center=False
)

######################################################################
# Now we create the OFF background maker for the spectra. If we have an
# exclusion region, we have to pass it here. We also define the safe range
# maker.
#

bkg_maker = ReflectedRegionsBackgroundMaker()
safe_mask_maker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10)

######################################################################
# Perform the data reduction loop.
# --------------------------------
#
# We can now run over selected observations. For each of them, we:
#
# - Create the `~gammapy.datasets.SpectrumDataset`
# - Run the safe mask maker on it
# - Compute the OFF via the reflected background method and create a `~gammapy.datasets.SpectrumDatasetOnOff` object
# - Add the `~gammapy.datasets.SpectrumDatasetOnOff` to the list.
#

# %%time
datasets = Datasets()

for obs in observations:
    # A SpectrumDataset is filled in this geometry
    dataset = maker.run(dataset_empty.copy(name=f"obs-{obs.obs_id}"), obs)

    # Define safe mask
    dataset = safe_mask_maker.run(dataset, obs)

    # Compute OFF
    dataset = bkg_maker.run(dataset, obs)

    # Append dataset to the list
    datasets.append(dataset)

display(datasets.meta_table)

######################################################################
# Explore the results
# -------------------
#
# We can peek at the content of the spectrum datasets:
#

datasets[0].peek()
plt.show()

######################################################################
# Cumulative excess and significance
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Finally, we can look at the cumulative significance and excess counts.
# This is done with the `info_table` method of
# `~gammapy.datasets.Datasets`.
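######################################################################
# Note that with ``cumulative=True`` each row of the table corresponds to
# the stack of all observations up to that row, so the final row summarises
# the complete dataset list.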
#

info_table = datasets.info_table(cumulative=True)

display(info_table)

######################################################################
# And make the corresponding plots

fig, (ax_excess, ax_sqrt_ts) = plt.subplots(figsize=(10, 4), ncols=2, nrows=1)
ax_excess.plot(
    info_table["livetime"].to("h"),
    info_table["excess"],
    marker="o",
    ls="none",
)

ax_excess.set_title("Excess")
ax_excess.set_xlabel("Livetime [h]")
ax_excess.set_ylabel("Excess events")

ax_sqrt_ts.plot(
    info_table["livetime"].to("h"),
    info_table["sqrt_ts"],
    marker="o",
    ls="none",
)

ax_sqrt_ts.set_title("Sqrt(TS)")
ax_sqrt_ts.set_xlabel("Livetime [h]")
ax_sqrt_ts.set_ylabel("Sqrt(TS)")
plt.show()

######################################################################
# Perform spectral model fitting
# ------------------------------
#
# Here we perform a joint fit.
#
# We first create the model, here a simple power law, and assign it to
# every dataset in the `~gammapy.datasets.Datasets`.
#

spectral_model = PowerLawSpectralModel(
    index=2, amplitude=2e-11 * u.Unit("cm-2 s-1 TeV-1"), reference=1 * u.TeV
)
model = SkyModel(spectral_model=spectral_model, name="RXJ 1713")

datasets.models = [model]

######################################################################
# Now we can run the fit
#

fit_joint = Fit()
result_joint = fit_joint.run(datasets=datasets)
print(result_joint)

######################################################################
# Explore the fit results
# ~~~~~~~~~~~~~~~~~~~~~~~
#
# First, the fitted parameter values and their errors.
#

display(datasets.models.to_parameters_table())

######################################################################
# Then plot the fit result to compare measured and expected counts. Rather
# than plotting them for each individual dataset, we stack all datasets
# and plot the fit result on the stacked dataset.
#

# First stack them all
reduced = datasets.stack_reduce()
# Assign the fitted model
reduced.models = model
# Plot the result
ax_spectrum, ax_residuals = reduced.plot_fit()
plt.show()
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/analysis-1d/sed_fitting.py0000644000175100001770000001673514721316200023267 0ustar00runnerdocker"""
Flux point fitting
==================

Fit spectral models to combined Fermi-LAT and IACT flux point tables.

Prerequisites
-------------

- Some knowledge about retrieving information from catalogs, see the
  :doc:`/tutorials/api/catalog` tutorial.

Context
-------

Some high-level studies do not rely on reduced datasets with their IRFs
but directly on higher level products such as flux points. This is not
ideal because flux points already rely on assumptions about the
underlying spectral shape, and the uncertainties they carry are usually
simplified (e.g. symmetric gaussian errors). Yet, this is an efficient
way to combine heterogeneous data.

**Objective: fit spectral models to combined Fermi-LAT and IACT flux
points.**

Proposed approach
-----------------

Here we will load the spectral points from Fermi-LAT and TeV catalogs
and fit them with various spectral models to find the best
representation of the wide-band spectrum.
The central class we’re going to use for this example analysis is: - `~gammapy.datasets.FluxPointsDataset` In addition we will work with the following data classes: - `~gammapy.estimators.FluxPoints` - `~gammapy.catalog.SourceCatalogGammaCat` - `~gammapy.catalog.SourceCatalog3FHL` - `~gammapy.catalog.SourceCatalog3FGL` And the following spectral model classes: - `~gammapy.modeling.models.PowerLawSpectralModel` - `~gammapy.modeling.models.ExpCutoffPowerLawSpectralModel` - `~gammapy.modeling.models.LogParabolaSpectralModel` """ ###################################################################### # Setup # ----- # # Let us start with the usual IPython notebook and Python imports: # from astropy import units as u # %matplotlib inline import matplotlib.pyplot as plt from gammapy.catalog import CATALOG_REGISTRY from gammapy.datasets import Datasets, FluxPointsDataset from gammapy.modeling import Fit from gammapy.modeling.models import ( ExpCutoffPowerLawSpectralModel, LogParabolaSpectralModel, PowerLawSpectralModel, SkyModel, ) ###################################################################### # Load spectral points # -------------------- # # For this analysis we choose to work with the source ‘HESS J1507-622’ and # the associated Fermi-LAT sources ‘3FGL J1506.6-6219’ and ‘3FHL # J1507.9-6228e’. We load the source catalogs, and then access source of # interest by name: # catalog_3fgl = CATALOG_REGISTRY.get_cls("3fgl")() catalog_3fhl = CATALOG_REGISTRY.get_cls("3fhl")() catalog_gammacat = CATALOG_REGISTRY.get_cls("gamma-cat")() source_fermi_3fgl = catalog_3fgl["3FGL J1506.6-6219"] source_fermi_3fhl = catalog_3fhl["3FHL J1507.9-6228e"] source_gammacat = catalog_gammacat["HESS J1507-622"] ###################################################################### # The corresponding flux points data can be accessed with ``.flux_points`` # attribute: # dataset_gammacat = FluxPointsDataset(data=source_gammacat.flux_points, name="gammacat") dataset_gammacat.data.to_table(sed_type="dnde", formatted=True) dataset_3fgl = FluxPointsDataset(data=source_fermi_3fgl.flux_points, name="3fgl") dataset_3fgl.data.to_table(sed_type="dnde", formatted=True) dataset_3fhl = FluxPointsDataset(data=source_fermi_3fhl.flux_points, name="3fhl") dataset_3fhl.data.to_table(sed_type="dnde", formatted=True) ###################################################################### # Power Law Fit # ------------- # # First we start with fitting a simple # `~gammapy.modeling.models.PowerLawSpectralModel`. 
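######################################################################
# For reference, this model evaluates to
#
# .. math:: \phi(E) = \phi_0 \cdot \left( \frac{E}{E_0} \right)^{-\Gamma}
#
# where ``amplitude`` is :math:`\phi_0`, ``reference`` is :math:`E_0`
# and ``index`` is :math:`\Gamma`.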
# pwl = PowerLawSpectralModel( index=2, amplitude="1e-12 cm-2 s-1 TeV-1", reference="1 TeV" ) model = SkyModel(spectral_model=pwl, name="j1507-pl") ###################################################################### # After creating the model we run the fit by passing the ``flux_points`` # and ``model`` objects: # datasets = Datasets([dataset_gammacat, dataset_3fgl, dataset_3fhl]) datasets.models = model print(datasets) fitter = Fit() result_pwl = fitter.run(datasets=datasets) ###################################################################### # And print the result: # print(result_pwl) print(model) ###################################################################### # Finally we plot the data points and the best fit model: # ax = plt.subplot() ax.yaxis.set_units(u.Unit("TeV cm-2 s-1")) kwargs = {"ax": ax, "sed_type": "e2dnde"} for d in datasets: d.data.plot(label=d.name, **kwargs) energy_bounds = [1e-4, 1e2] * u.TeV pwl.plot(energy_bounds=energy_bounds, color="k", **kwargs) pwl.plot_error(energy_bounds=energy_bounds, **kwargs) ax.set_ylim(1e-13, 1e-11) ax.set_xlim(energy_bounds) ax.legend() plt.show() ###################################################################### # Exponential Cut-Off Powerlaw Fit # -------------------------------- # # Next we fit an # `~gammapy.modeling.models.ExpCutoffPowerLawSpectralModel` law to the # data. # ecpl = ExpCutoffPowerLawSpectralModel( index=1.8, amplitude="2e-12 cm-2 s-1 TeV-1", reference="1 TeV", lambda_="0.1 TeV-1", ) model = SkyModel(spectral_model=ecpl, name="j1507-ecpl") ###################################################################### # We run the fitter again by passing the flux points and the model # instance: # datasets.models = model result_ecpl = fitter.run(datasets=datasets) print(model) ###################################################################### # We plot the data and best fit model: # ax = plt.subplot() kwargs = {"ax": ax, "sed_type": "e2dnde"} ax.yaxis.set_units(u.Unit("TeV cm-2 s-1")) for d in datasets: d.data.plot(label=d.name, **kwargs) ecpl.plot(energy_bounds=energy_bounds, color="k", **kwargs) ecpl.plot_error(energy_bounds=energy_bounds, **kwargs) ax.set_ylim(1e-13, 1e-11) ax.set_xlim(energy_bounds) ax.legend() plt.show() ###################################################################### # Log-Parabola Fit # ---------------- # # Finally we try to fit a # `~gammapy.modeling.models.LogParabolaSpectralModel` model: # log_parabola = LogParabolaSpectralModel( alpha=2, amplitude="1e-12 cm-2 s-1 TeV-1", reference="1 TeV", beta=0.1 ) model = SkyModel(spectral_model=log_parabola, name="j1507-lp") datasets.models = model result_log_parabola = fitter.run(datasets=datasets) print(model) ax = plt.subplot() kwargs = {"ax": ax, "sed_type": "e2dnde"} ax.yaxis.set_units(u.Unit("TeV cm-2 s-1")) for d in datasets: d.data.plot(label=d.name, **kwargs) log_parabola.plot(energy_bounds=energy_bounds, color="k", **kwargs) log_parabola.plot_error(energy_bounds=energy_bounds, **kwargs) ax.set_ylim(1e-13, 1e-11) ax.set_xlim(energy_bounds) ax.legend() plt.show() ###################################################################### # Exercises # --------- # # - Fit a `~gammapy.modeling.models.PowerLaw2SpectralModel` and # `~gammapy.modeling.models.ExpCutoffPowerLaw3FGLSpectralModel` to # the same data. 
# - Fit a `~gammapy.modeling.models.ExpCutoffPowerLawSpectralModel`
#   model to Vela X (‘HESS J0835-455’) only and check if the best fit
#   values correspond to the values given in the Gammacat catalog.
#

######################################################################
# What next?
# ----------
#
# This was an introduction to SED fitting in Gammapy.
#
# - If you would like to learn how to perform a full Poisson maximum
#   likelihood spectral fit, please check out the
#   :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial.
# - To learn how to combine heterogeneous datasets to perform a
#   multi-instrument forward-folding fit see the
#   :doc:`/tutorials/analysis-3d/analysis_mwl` tutorial.
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/analysis-1d/spectral_analysis.py0000644000175100001770000004555214721316200024507 0ustar00runnerdocker"""
Spectral analysis
=================

Perform a full region-based on-off spectral analysis and fit the
resulting datasets.

Prerequisites
-------------

- Understanding how spectral extraction is performed in Cherenkov
  astronomy, in particular regarding OFF background measurements.
- Understanding the basic data reduction and modeling/fitting process
  with the gammapy library API as shown in the
  :doc:`/tutorials/starting/analysis_2` tutorial.

Context
-------

While 3D analyses in principle allow us to consider complex fields of
view containing overlapping gamma-ray sources, in many cases we might
have an observation with a single, strong, point-like source in the
field of view. A spectral analysis, in that case, might consider all the
events inside a source (or ON) region and bin them in energy only,
obtaining 1D datasets.

In classical Cherenkov astronomy, the background estimation technique
associated with this method measures the number of events in OFF regions
taken in regions of the field-of-view devoid of gamma-ray emitters,
where the background rate is assumed to be equal to the one in the ON
region. This allows the use of a specific fit statistic for ON-OFF
measurements, wstat (see `~gammapy.stats.wstat`), where no background
model is assumed. Background is treated as a set of nuisance parameters.
This removes some systematic effects connected to the choice or the
quality of the background model. But this comes at the expense of larger
statistical uncertainties on the fitted model parameters.

**Objective: perform a full region-based spectral analysis of 4 Crab
observations of H.E.S.S. data release 1 and fit the resulting
datasets.**

Introduction
------------

Here, as usual, we use the `~gammapy.data.DataStore` to retrieve a list
of selected observations (`~gammapy.data.Observations`). Then, we define
the ON region containing the source and the geometry of the
`~gammapy.datasets.SpectrumDataset` object we want to produce. We then
create the corresponding dataset Maker.

We have to define the Maker object that will extract the OFF counts from
reflected regions in the field-of-view. To ensure we use data in an
energy range where the quality of the IRFs is good enough, we also
create a safe range Maker.

We can then proceed with data reduction with a loop over all selected
observations to produce datasets in the relevant geometry.

We can then explore the resulting datasets and look at the cumulative
signal and significance of our source. We finally proceed with model
fitting.
In practice, we have to:

- Create a `~gammapy.data.DataStore` pointing to the relevant data
- Apply an observation selection to produce a list of observations,
  a `~gammapy.data.Observations` object.
- Define a geometry of the spectrum we want to produce:

  - Create a `~regions.CircleSkyRegion` for the ON extraction region
  - Create a `~gammapy.maps.MapAxis` for the energy binnings: one for
    the reconstructed (i.e. measured) energy, the other for the true
    energy (i.e. the one used by IRFs and models)

- Create the necessary makers:

  - the spectrum dataset maker: `~gammapy.makers.SpectrumDatasetMaker`
  - the OFF background maker, here a
    `~gammapy.makers.ReflectedRegionsBackgroundMaker`
  - and the safe range maker: `~gammapy.makers.SafeMaskMaker`

- Perform the data reduction loop. And for every observation:

  - Apply the makers sequentially to produce a
    `~gammapy.datasets.SpectrumDatasetOnOff`
  - Append it to the list of datasets

- Define the `~gammapy.modeling.models.SkyModel` to apply to the dataset.
- Create a `~gammapy.modeling.Fit` object and run it to fit the model
  parameters
- Apply a `~gammapy.estimators.FluxPointsEstimator` to compute flux
  points for the spectral part of the fit.
"""

from pathlib import Path

# Check package versions
import astropy.units as u
from astropy.coordinates import Angle, SkyCoord
from regions import CircleSkyRegion

# %matplotlib inline
import matplotlib.pyplot as plt

######################################################################
# Setup
# -----
#
# As usual, we’ll start with some setup …
#

from IPython.display import display
from gammapy.data import DataStore
from gammapy.datasets import (
    Datasets,
    FluxPointsDataset,
    SpectrumDataset,
    SpectrumDatasetOnOff,
)
from gammapy.estimators import FluxPointsEstimator
from gammapy.estimators.utils import resample_energy_edges
from gammapy.makers import (
    ReflectedRegionsBackgroundMaker,
    SafeMaskMaker,
    SpectrumDatasetMaker,
)
from gammapy.maps import MapAxis, RegionGeom, WcsGeom
from gammapy.modeling import Fit
from gammapy.modeling.models import (
    ExpCutoffPowerLawSpectralModel,
    SkyModel,
    create_crab_spectral_model,
)

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup
from gammapy.visualization import plot_spectrum_datasets_off_regions

check_tutorials_setup()

######################################################################
# Load Data
# ---------
#
# First, we select and load some H.E.S.S. observations of the Crab nebula.
#
# We will access the events, effective area, energy dispersion, livetime
# and PSF for containment correction.
#

datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/")
obs_ids = [23523, 23526, 23559, 23592]
observations = datastore.get_observations(obs_ids)

######################################################################
# Define Target Region
# --------------------
#
# The next step is to define a signal extraction region, also known as
# the ON region. In the simplest case this is just a
# `CircleSkyRegion `__.
#

target_position = SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs")
on_region_radius = Angle("0.11 deg")
on_region = CircleSkyRegion(center=target_position, radius=on_region_radius)

######################################################################
# Create exclusion mask
# ---------------------
#
# We will use the reflected regions method to place off regions to
# estimate the background level in the on region.
To make sure the off # regions don’t contain gamma-ray emission, we create an exclusion mask. # # Using http://gamma-sky.net/ we find that there’s only one known # gamma-ray source near the Crab nebula: the AGN called `RGB # J0521+212 `__ at GLON = 183.604 deg # and GLAT = -8.708 deg. # exclusion_region = CircleSkyRegion( center=SkyCoord(183.604, -8.708, unit="deg", frame="galactic"), radius=0.5 * u.deg, ) skydir = target_position.galactic geom = WcsGeom.create( npix=(150, 150), binsz=0.05, skydir=skydir, proj="TAN", frame="icrs" ) exclusion_mask = ~geom.region_mask([exclusion_region]) exclusion_mask.plot() plt.show() ###################################################################### # Run data reduction chain # ------------------------ # # We begin with the configuration of the maker classes: # energy_axis = MapAxis.from_energy_bounds( 0.1, 40, nbin=10, per_decade=True, unit="TeV", name="energy" ) energy_axis_true = MapAxis.from_energy_bounds( 0.05, 100, nbin=20, per_decade=True, unit="TeV", name="energy_true" ) geom = RegionGeom.create(region=on_region, axes=[energy_axis]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true) dataset_maker = SpectrumDatasetMaker( containment_correction=True, selection=["counts", "exposure", "edisp"] ) bkg_maker = ReflectedRegionsBackgroundMaker(exclusion_mask=exclusion_mask) safe_mask_maker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) # %%time datasets = Datasets() for obs_id, observation in zip(obs_ids, observations): dataset = dataset_maker.run(dataset_empty.copy(name=str(obs_id)), observation) dataset_on_off = bkg_maker.run(dataset, observation) dataset_on_off = safe_mask_maker.run(dataset_on_off, observation) datasets.append(dataset_on_off) print(datasets) ###################################################################### # Plot off regions # ---------------- # plt.figure() ax = exclusion_mask.plot() on_region.to_pixel(ax.wcs).plot(ax=ax, edgecolor="k") plot_spectrum_datasets_off_regions(ax=ax, datasets=datasets) plt.show() ###################################################################### # Source statistic # ---------------- # # Next we’re going to look at the overall source statistics in our signal # region. 
# info_table = datasets.info_table(cumulative=True) display(info_table) ###################################################################### # And make the corresponding plots fig, (ax_excess, ax_sqrt_ts) = plt.subplots(figsize=(10, 4), ncols=2, nrows=1) ax_excess.plot( info_table["livetime"].to("h"), info_table["excess"], marker="o", ls="none", ) ax_excess.set_title("Excess") ax_excess.set_xlabel("Livetime [h]") ax_excess.set_ylabel("Excess events") ax_sqrt_ts.plot( info_table["livetime"].to("h"), info_table["sqrt_ts"], marker="o", ls="none", ) ax_sqrt_ts.set_title("Sqrt(TS)") ax_sqrt_ts.set_xlabel("Livetime [h]") ax_sqrt_ts.set_ylabel("Sqrt(TS)") plt.show() ###################################################################### # Finally you can write the extracted datasets to disk using the OGIP # format (PHA, ARF, RMF, BKG, see # `here `__ # for details): # path = Path("spectrum_analysis") path.mkdir(exist_ok=True) for dataset in datasets: dataset.write(filename=path / f"obs_{dataset.name}.fits.gz", overwrite=True) ###################################################################### # If you want to read back the datasets from disk you can use: # datasets = Datasets() for obs_id in obs_ids: filename = path / f"obs_{obs_id}.fits.gz" datasets.append(SpectrumDatasetOnOff.read(filename)) ###################################################################### # Fit spectrum # ------------ # # Now we’ll fit a global model to the spectrum. First we do a joint # likelihood fit to all observations. If you want to stack the # observations see below. We will also produce a debug plot in order to # show how the global fit matches one of the individual observations. # spectral_model = ExpCutoffPowerLawSpectralModel( amplitude=1e-12 * u.Unit("cm-2 s-1 TeV-1"), index=2, lambda_=0.1 * u.Unit("TeV-1"), reference=1 * u.TeV, ) model = SkyModel(spectral_model=spectral_model, name="crab") datasets.models = [model] fit_joint = Fit() result_joint = fit_joint.run(datasets=datasets) # we make a copy here to compare it later model_best_joint = model.copy() ###################################################################### # Fit quality and model residuals # ------------------------------- # ###################################################################### # We can access the results dictionary to see if the fit converged: # print(result_joint) ###################################################################### # and check the best-fit parameters # display(result_joint.models.to_parameters_table()) ###################################################################### # A simple way to inspect the model residuals is using the function # `~SpectrumDataset.plot_fit()` # ax_spectrum, ax_residuals = datasets[0].plot_fit() ax_spectrum.set_ylim(0.1, 40) plt.show() ###################################################################### # For more ways of assessing fit quality, please refer to the dedicated # :doc:`/tutorials/api/fitting` tutorial. # ###################################################################### # Compute Flux Points # ------------------- # # To round up our analysis we can compute flux points by fitting the norm # of the global model in energy bands. 
# We create an instance of the # `~gammapy.estimators.FluxPointsEstimator`, by passing the dataset and # the energy binning: # fpe = FluxPointsEstimator( energy_edges=energy_axis.edges, source="crab", selection_optional="all" ) flux_points = fpe.run(datasets=datasets) ###################################################################### # Here is the table of the resulting flux points: # display(flux_points.to_table(sed_type="dnde", formatted=True)) ###################################################################### # Now we plot the flux points and their likelihood profiles. For the # plotting of upper limits we choose a threshold of TS < 4. # fig, ax = plt.subplots() flux_points.plot(ax=ax, sed_type="e2dnde", color="darkorange") flux_points.plot_ts_profiles(ax=ax, sed_type="e2dnde") ax.set_xlim(0.6, 40) plt.show() ###################################################################### # Note: it is also possible to plot the flux distribution with the spectral model overlaid, # but you must ensure the axis binning is identical for the flux points and # the integral flux. # ###################################################################### # The final plot with the best-fit model, flux points and residuals can be # quickly made like this: # flux_points_dataset = FluxPointsDataset( data=flux_points, models=model_best_joint.copy() ) ax, _ = flux_points_dataset.plot_fit() ax.set_xlim(0.6, 40) plt.show() ###################################################################### # Stack observations # ------------------ # # An alternative approach to fitting the spectrum is stacking all # observations first and then fitting a model. For this we first stack the # individual datasets: # dataset_stacked = Datasets(datasets).stack_reduce() ###################################################################### # Again we set the model on the dataset we would like to fit (in this case # it’s only a single one) and pass it to the `~gammapy.modeling.Fit` # object: # dataset_stacked.models = model stacked_fit = Fit() result_stacked = stacked_fit.run([dataset_stacked]) # Make a copy to compare later model_best_stacked = model.copy() print(result_stacked) ###################################################################### # And display the parameter table display(model_best_joint.parameters.to_table()) display(model_best_stacked.parameters.to_table()) ###################################################################### # Finally, we compare the results of our stacked analysis to a previously # published Crab Nebula spectrum for reference. This is available in # `~gammapy.modeling.models.create_crab_spectral_model`.
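#
# As a quick sanity check before plotting, we can instantiate the reference
# model and evaluate it at 1 TeV; a small sketch ("hess_ecpl" is the key used
# in the comparison plot below):

crab_ref = create_crab_spectral_model("hess_ecpl")
print(crab_ref(1 * u.TeV))

######################################################################
# Now the comparison plot itself: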
# fig, ax = plt.subplots() plot_kwargs = { "energy_bounds": [0.1, 30] * u.TeV, "sed_type": "e2dnde", "yunits": u.Unit("erg cm-2 s-1"), "ax": ax, } # plot stacked model model_best_stacked.spectral_model.plot(**plot_kwargs, label="Stacked analysis result") model_best_stacked.spectral_model.plot_error(facecolor="blue", alpha=0.3, **plot_kwargs) # plot joint model model_best_joint.spectral_model.plot( **plot_kwargs, label="Joint analysis result", ls="--" ) model_best_joint.spectral_model.plot_error(facecolor="orange", alpha=0.3, **plot_kwargs) create_crab_spectral_model("hess_ecpl").plot( **plot_kwargs, label="Crab reference", ) ax.legend() plt.show() # sphinx_gallery_thumbnail_number = 5 ###################################################################### # A note on statistics # -------------------- # # Different statistics are available for the `~gammapy.datasets.FluxPointsDataset`: # # - chi2 : estimate from chi2 statistics. # - profile : estimate from interpolation of the likelihood profile. # - distrib : estimate from probability distributions, # assuming that flux points correspond to asymmetric Gaussians # and upper limits to complementary error functions. # # The default is `chi2`; in that case upper limits are ignored and the mean of the asymmetric errors is used. # It is therefore recommended to use `profile` if `stat_scan` is available on the flux points. # The `distrib` case provides an approximation if the `profile` is not available, # which allows one to take into account upper limits and asymmetric errors. # # In the example below we can see that the `profile` case matches exactly the result # from the joint analysis of the ON/OFF datasets using `wstat` (as labelled). def plot_stat(fp_dataset): fig, ax = plt.subplots() plot_kwargs = { "energy_bounds": [0.1, 30] * u.TeV, "sed_type": "e2dnde", "ax": ax, } fp_dataset.data.plot(energy_power=2, ax=ax) model_best_joint.spectral_model.plot( color="b", lw=0.5, **plot_kwargs, label="wstat" ) stat_types = ["chi2", "profile", "distrib"] colors = ["red", "g", "c"] lss = ["--", ":", "--"] for ks, stat in enumerate(stat_types): fp_dataset.stat_type = stat fit = Fit() fit.run([fp_dataset]) fp_dataset.models[0].spectral_model.plot( color=colors[ks], ls=lss[ks], **plot_kwargs, label=stat ) fp_dataset.models[0].spectral_model.plot_error( facecolor=colors[ks], **plot_kwargs ) plt.legend() plot_stat(flux_points_dataset) ###################################################################### # # In order to avoid discrepancies due to the treatment of upper limits # we can utilise `~gammapy.estimators.utils.resample_energy_edges` # to define energy bins in which the minimum `sqrt_ts` is 2. # In that case all the statistics definitions give equivalent results. # energy_edges = resample_energy_edges(dataset_stacked, conditions={"sqrt_ts_min": 2}) fpe_no_ul = FluxPointsEstimator( energy_edges=energy_edges, source="crab", selection_optional="all" ) flux_points_no_ul = fpe_no_ul.run(datasets=datasets) flux_points_dataset_no_ul = FluxPointsDataset( data=flux_points_no_ul, models=model_best_joint.copy(), ) plot_stat(flux_points_dataset_no_ul) ###################################################################### # Exercises # --------- # # Now you have learned the basics of a spectral analysis with Gammapy. To # practice you can continue with the following exercises: # # - Fit a different spectral model to the data. You could try # `~gammapy.modeling.models.ExpCutoffPowerLawSpectralModel` or # `~gammapy.modeling.models.LogParabolaSpectralModel`.
# - Compute flux points for the stacked dataset. # - Create a `~gammapy.datasets.FluxPointsDataset` with the flux points # you have computed for the stacked dataset and fit the flux points # again with one of the spectral models. How does the result compare to # the best-fit model that was directly fitted to the counts data? # ###################################################################### # What next? # ---------- # # The method shown in this tutorial is valid for point-like or mildly # extended sources where we can assume that the IRF taken at the region # center is valid over the whole region. If one wants to extract the 1D # spectrum of a large source and properly average the response over the # extraction region, one has to use a different approach explained in # the :doc:`/tutorials/analysis-1d/extended_source_spectral_analysis` # tutorial. # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-1d/spectral_analysis_hli.py0000644000175100001770000003257314721316200025342 0ustar00runnerdocker""" Spectral analysis with the HLI ============================== Introduction to 1D analysis using the Gammapy high level interface. Prerequisites ------------- - Understanding the gammapy data workflow, in particular what DL3 events and instrument response functions (IRFs) are. Context ------- This notebook is an introduction to gammapy analysis using the high level interface. Gammapy analysis consists of two main steps. The first one is data reduction: user-selected observations are reduced to a geometry defined by the user. It can be 1D (spectrum from a given extraction region) or 3D (with a sky projection and an energy axis). The resulting reduced data and instrument response functions (IRF) are called datasets in Gammapy. The second step consists of setting a physical model on the datasets and fitting it to obtain relevant physical information. **Objective: Create a 1D dataset of the Crab using the H.E.S.S. DL3 data release 1 and perform a simple model fitting of the Crab nebula.** Proposed approach ----------------- This notebook uses the high level `~gammapy.analysis.Analysis` class to orchestrate data reduction and run the data fits. In its current state, `Analysis` supports the standard analysis cases of joint or stacked 3D and 1D analyses. It is instantiated with an `~gammapy.analysis.AnalysisConfig` object that gives access to analysis parameters either directly or via a YAML config file. To see what is happening under the hood and to get an idea of the internal API, a second notebook performs the same analysis without using the `~gammapy.analysis.Analysis` class. In summary, we have to: - Create an `~gammapy.analysis.AnalysisConfig` object and the analysis configuration: - Define what observations to use - Define the geometry of the dataset (data and IRFs) - Define the model we want to fit on the dataset.
- Instantiate a `~gammapy.analysis.Analysis` from this configuration and run the different analysis steps - Observation selection - Data reduction - Model fitting - Estimating flux points """ from pathlib import Path # %matplotlib inline import matplotlib.pyplot as plt ###################################################################### # Setup # ----- # from IPython.display import display from gammapy.analysis import Analysis, AnalysisConfig from gammapy.modeling.models import Models ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Analysis configuration # ---------------------- # # For configuration of the analysis we use the # `YAML `__ data format. YAML is a # machine-readable serialisation format that is also friendly for humans # to read. In this tutorial we will write the configuration file just # using Python strings, but of course the file can be created and modified # with any text editor of your choice. # # Here is what the configuration for our analysis looks like: # yaml_str = """ observations: datastore: $GAMMAPY_DATA/hess-dl3-dr1 obs_cone: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 5 deg} datasets: type: 1d stack: true geom: axes: energy: {min: 0.5 TeV, max: 30 TeV, nbins: 20} energy_true: {min: 0.1 TeV, max: 50 TeV, nbins: 40} on_region: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 0.11 deg} containment_correction: true safe_mask: methods: ['aeff-default', 'aeff-max'] parameters: {aeff_percent: 0.1} background: method: reflected fit: fit_range: {min: 1 TeV, max: 20 TeV} flux_points: energy: {min: 1 TeV, max: 20 TeV, nbins: 8} source: 'crab' """ config = AnalysisConfig.from_yaml(yaml_str) print(config) ###################################################################### # Note that you can save this string into a yaml file and load it as # follows: # # config = AnalysisConfig.read("config-1d.yaml") # # the AnalysisConfig gives access to the various parameters used, from logging to reduced dataset geometries # print(config) ###################################################################### # Using data stored on your computer # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # ###################################################################### # Here, we want to use Crab runs from the H.E.S.S. DL3-DR1. We have # defined the datastore and a cone search of observations pointing within 5 # degrees of the Crab nebula. Parameters can be set directly or as a # Python dict. # # PS: do not forget to set up your environment variable `$GAMMAPY_DATA` to # your local directory containing the H.E.S.S. DL3-DR1 as described in # :ref:`quickstart-setup`. # ###################################################################### # Setting the exclusion mask # ~~~~~~~~~~~~~~~~~~~~~~~~~~ # ###################################################################### # In order to properly adjust the background normalisation on regions # without gamma-ray signal, one needs to define an exclusion mask for the # background normalisation. For this tutorial, we use the following one # `$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz` # config.datasets.background.exclusion = ( "$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz" ) ###################################################################### # We’re all set.
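#
# Note that the attribute paths on the configuration object mirror the YAML
# structure; a minimal sketch of direct (read) access:

# print the flux point settings defined in the YAML string above
print(config.flux_points.energy)

######################################################################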
But before we go on let’s see how to save or import # `~gammapy.analysis.AnalysisConfig` objects through YAML files. # ###################################################################### # Using YAML configuration files for setting/writing the Data Reduction parameters # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # One can export/import the `~gammapy.analysis.AnalysisConfig` to/from a YAML file. # config.write("config.yaml", overwrite=True) config = AnalysisConfig.read("config.yaml") print(config) ###################################################################### # Running the first step of the analysis: the Data Reduction # ---------------------------------------------------------- # ###################################################################### # Configuration of the analysis # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # We first create an `~gammapy.analysis.Analysis` object from our # configuration. # analysis = Analysis(config) ###################################################################### # Observation selection # ~~~~~~~~~~~~~~~~~~~~~ # # We can directly select and load the observations from disk using # `~gammapy.analysis.Analysis.get_observations()`: # analysis.get_observations() ###################################################################### # The observations are now available on the `Analysis` object. The # selection corresponds to the following ids: # print(analysis.observations.ids) ###################################################################### # To see how to explore observations, please refer to the following # notebook: :doc:`/tutorials/data/cta` or :doc:`/tutorials/data/hess` # ###################################################################### # Running the Data Reduction # ~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Now we proceed to the data reduction. In the config file we have chosen # a WCS map geometry and energy axis, and decided to stack the maps. We can # run the reduction using `.get_datasets()`: # # %%time analysis.get_datasets() ###################################################################### # Results exploration # ~~~~~~~~~~~~~~~~~~~ # # As we have chosen to stack the data, one can print the contents of the # unique entry of the datasets: # print(analysis.datasets[0]) ###################################################################### # As you can see the dataset uses WStat with the background computed with # the Reflected Background method during the data reduction, but no source # model has been set yet. # # The counts, exposure, background, etc. are directly available on the # dataset and can be printed: # info_table = analysis.datasets.info_table() info_table print( f"Tobs={info_table['livetime'].to('h')[0]:.1f} Excess={info_table['excess'].value[0]:.1f} \ Significance={info_table['sqrt_ts'][0]:.2f}" ) ###################################################################### # Save dataset to disk # ~~~~~~~~~~~~~~~~~~~~ # # It is common to run the preparation step independently of the likelihood # fit, because often the preparation of counts, collection area and energy # dispersion is slow if you have a lot of data.
We first create a folder: # path = Path("hli_spectrum_analysis") path.mkdir(exist_ok=True) ###################################################################### # And then write the stacked dataset to disk by calling the dedicated # `write()` method: # filename = path / "crab-stacked-dataset.fits.gz" analysis.datasets.write(filename, overwrite=True) ###################################################################### # Model fitting # ------------- # ###################################################################### # Creation of the model # ~~~~~~~~~~~~~~~~~~~~~ # # First, let’s create a model to be adjusted. As we are performing a 1D # Analysis, only a spectral model is needed within the `SkyModel` # object. Here is a pre-defined YAML configuration file created for this # 1D analysis: # model_str = """ components: - name: crab type: SkyModel spectral: type: PowerLawSpectralModel parameters: - name: index frozen: false scale: 1.0 unit: '' value: 2.6 - name: amplitude frozen: false scale: 1.0 unit: cm-2 s-1 TeV-1 value: 5.0e-11 - name: reference frozen: true scale: 1.0 unit: TeV value: 1.0 """ model_1d = Models.from_yaml(model_str) print(model_1d) ###################################################################### # Or from a yaml file, e.g. # # model_1d = Models.read("model-1d.yaml") # print(model_1d) ###################################################################### # Now we set the model on the analysis object: # analysis.set_models(model_1d) ###################################################################### # Setting fitting parameters # ~~~~~~~~~~~~~~~~~~~~~~~~~~ # # `Analysis` can perform a few modeling and fitting tasks besides data # reduction. Parameters have then to be passed to the configuration # object. # ###################################################################### # Running the fit # ~~~~~~~~~~~~~~~ # # %%time analysis.run_fit() ###################################################################### # Exploration of the fit results # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # print(analysis.fit_result) display(model_1d.to_parameters_table()) ###################################################################### # To check the fit is correct, we compute the excess spectrum with the # predicted counts. 
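#
# As a first quantitative check, one can also print the total fit statistic
# of the dataset with the best-fit model set; a minimal sketch:

# total WStat fit statistic of the stacked dataset
print(analysis.datasets[0].stat_sum())

######################################################################
# The excess spectrum with the predicted counts is plotted below: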
# ax_spectrum, ax_residuals = analysis.datasets[0].plot_fit() ax_spectrum.set_ylim(0.1, 200) ax_spectrum.set_xlim(0.2, 60) ax_residuals.set_xlim(0.2, 60) plt.show() ###################################################################### # Serialisation of the fit result # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # This is how we can write the model back to file again: # filename = path / "model-best-fit.yaml" analysis.models.write(filename, overwrite=True) with filename.open("r") as f: print(f.read()) ###################################################################### # Creation of the Flux points # --------------------------- # ###################################################################### # Running the estimation # ~~~~~~~~~~~~~~~~~~~~~~ # analysis.get_flux_points() crab_fp = analysis.flux_points.data crab_fp_table = crab_fp.to_table(sed_type="dnde", formatted=True) display(crab_fp_table) ###################################################################### # Let’s plot the flux points with their likelihood profiles # fig, ax_sed = plt.subplots() crab_fp.plot(ax=ax_sed, sed_type="e2dnde", color="darkorange") ax_sed.set_ylim(1.0e-12, 2.0e-10) ax_sed.set_xlim(0.5, 40) crab_fp.plot_ts_profiles(ax=ax_sed, sed_type="e2dnde") plt.show() ###################################################################### # Serialisation of the results # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # The flux points can be exported to a FITS table following the format # defined # `here `__ # filename = path / "flux-points.fits" analysis.flux_points.write(filename, overwrite=True) ###################################################################### # Plotting the final results of the 1D Analysis # --------------------------------------------- # ###################################################################### # We can plot the spectral fit with its error band overlaid with the # flux points: # ax_sed, ax_residuals = analysis.flux_points.plot_fit() ax_sed.set_ylim(1.0e-12, 1.0e-9) ax_sed.set_xlim(0.5, 40) plt.show() ###################################################################### # What’s next? # ------------ # # You can look at the same analysis without the high level interface in # :doc:`/tutorials/analysis-1d/spectral_analysis` # # As we can store the best-fit model, you can overlay the fit results of # both methods on a single plot. # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-1d/spectral_analysis_rad_max.py0000644000175100001770000003216314721316200026174 0ustar00runnerdocker""" Spectral analysis with energy-dependent directional cuts ======================================================== Perform a point-like spectral analysis with an energy-dependent offset cut. Prerequisites ------------- - Understanding the basic data reduction performed in the :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial. - Understanding the difference between a `point-like `__ and a `full-enclosure `__ IRF. Context ------- As already explained in the :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial, the background is estimated from the field of view of the observation. In particular, the source and background events are counted within a circular ON region enclosing the source. The background to be subtracted is then estimated from one or more OFF regions with an expected background rate similar to the one in the ON region (i.e. from regions with similar acceptance).
*Full-containment* IRFs have no directional cut applied. When employed for a 1D analysis, a correction to the IRF, accounting for the flux leaking out of the ON region, must be applied. This correction is typically obtained by integrating the PSF within the ON region. When computing *point-like* IRFs, a directional cut around the assumed source position is applied to the simulated events. For this IRF type, no PSF component is provided. The size of the ON and OFF regions used for the spectrum extraction should then reflect this cut, since a response computed within a specific region around the source is being provided. The directional cut is typically an angular distance from the assumed source position, :math:`\\theta`. The `gamma-astro-data-format `__ specifications offer two different ways to store this information: * if the same :math:`\\theta` cut is applied at all energies and offsets, a `RAD_MAX `__ keyword is added to the header of the data units containing IRF components. This should be used to define the size of the ON and OFF regions; * in case an energy-dependent (and offset-dependent) :math:`\\theta` cut is applied, its values are stored in an additional `FITS` data unit, named `RAD_MAX_2D `__. `Gammapy` provides a class to automatically read these values, `~gammapy.irf.RadMax2D`, for both cases (fixed or energy-dependent :math:`\\theta` cut). In this notebook we will focus on how to perform a spectral extraction with a point-like IRF with an energy-dependent :math:`\\theta` cut. We remark that in this case a `~regions.PointSkyRegion` (and not a `~regions.CircleSkyRegion`) should be used to define the ON region. If a geometry based on a `~regions.PointSkyRegion` is fed to the spectra and the background `Makers`, the latter will automatically use the values stored in the `RAD_MAX` keyword / table for defining the size of the ON and OFF regions. Besides the definition of the ON region during the data reduction, the remaining steps are identical to the other :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial, so we will not detail the data reduction steps already presented in the other tutorial. **Objective: perform the data reduction and analysis of 2 Crab Nebula observations of MAGIC and fit the resulting datasets.** Introduction ------------ We load two MAGIC observations in the `gammapy-data `__ containing IRF components with a :math:`\\theta` cut. We define the ON region, this time as a `~regions.PointSkyRegion` instead of a `CircleSkyRegion`, i.e. we specify only the center of our ON region. We create a `RegionGeom` adding to the region the estimated energy axis of the `~gammapy.datasets.SpectrumDataset` object we want to produce. The corresponding dataset maker will automatically use the :math:`\\theta` values in `~gammapy.irf.RadMax2D` to set the appropriate ON region sizes (based on the offset of the observation and on the estimated energy binning). In order to define the OFF regions it is recommended to use a `~gammapy.makers.WobbleRegionsFinder`, which uses fixed positions for the OFF regions. In the different estimated energy bins we will have OFF regions centered at the same positions, but with changing size. As for the `~gammapy.makers.SpectrumDatasetMaker`, the `~gammapy.makers.ReflectedRegionsBackgroundMaker` will use the values in `~gammapy.irf.RadMax2D` to define the sizes of the OFF regions. Once the datasets with the ON and OFF counts are created, we can perform a 1D likelihood fit, exactly as illustrated in the previous example.
""" import astropy.units as u from astropy.coordinates import SkyCoord from regions import PointSkyRegion # %matplotlib inline import matplotlib.pyplot as plt ###################################################################### # Setup # ----- # # As usual, we’ll start with some setup … # from IPython.display import display from gammapy.data import DataStore from gammapy.datasets import Datasets, SpectrumDataset from gammapy.makers import ( ReflectedRegionsBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, WobbleRegionsFinder, ) from gammapy.maps import Map, MapAxis, RegionGeom from gammapy.modeling import Fit from gammapy.modeling.models import ( LogParabolaSpectralModel, SkyModel, create_crab_spectral_model, ) ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup from gammapy.visualization import plot_spectrum_datasets_off_regions check_tutorials_setup() ###################################################################### # Load data # --------- # # We load the two MAGIC observations of the Crab Nebula containing the # `RAD_MAX_2D` table. # data_store = DataStore.from_dir("$GAMMAPY_DATA/magic/rad_max/data") observations = data_store.get_observations(required_irf="point-like") ###################################################################### # A `RadMax2D` attribute, containing the `RAD_MAX_2D` table, is # automatically loaded in the observation. As we can see from the IRF # component axes, the table has a single offset value and 28 estimated # energy values. # rad_max = observations["5029747"].rad_max print(rad_max) ###################################################################### # We can also plot the rad max value against the energy: # fig, ax = plt.subplots() rad_max.plot_rad_max_vs_energy(ax=ax) plt.show() ###################################################################### # Define the ON region # -------------------- # # To use the `RAD_MAX_2D` values to define the sizes of the ON and OFF # regions it is necessary to specify the ON region as # a `~regions.PointSkyRegion`: # target_position = SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs") on_region = PointSkyRegion(target_position) ###################################################################### # Run data reduction chain # ------------------------ # # We begin with the configuration of the dataset maker classes: # # true and estimated energy axes energy_axis = MapAxis.from_energy_bounds( 50, 1e5, nbin=5, per_decade=True, unit="GeV", name="energy" ) energy_axis_true = MapAxis.from_energy_bounds( 10, 1e5, nbin=10, per_decade=True, unit="GeV", name="energy_true" ) # geometry defining the ON region and SpectrumDataset based on it geom = RegionGeom.create(region=on_region, axes=[energy_axis]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true) ###################################################################### # The `SpectrumDataset` is now based on a geometry consisting of a # single coordinate and an estimated energy axis. The # `SpectrumDatasetMaker` and `ReflectedRegionsBackgroundMaker` will # take care of producing ON and OFF with the proper sizes, automatically # adopting the :math:`\theta` values in `Observation.rad_max`. # # As explained in the introduction, we use a `WobbleRegionsFinder`, to # determine the OFF positions. The parameter `n_off_positions` specifies # the number of OFF regions to be considered. 
# dataset_maker = SpectrumDatasetMaker( containment_correction=False, selection=["counts", "exposure", "edisp"] ) # tell the background maker to use the WobbleRegionsFinder, let us use 3 off regions region_finder = WobbleRegionsFinder(n_off_regions=3) bkg_maker = ReflectedRegionsBackgroundMaker(region_finder=region_finder) # use the energy threshold specified in the DL3 files safe_mask_masker = SafeMaskMaker(methods=["aeff-default"]) # %%time datasets = Datasets() # create a counts map for visualisation later... counts = Map.create(skydir=target_position, width=3) for observation in observations: dataset = dataset_maker.run( dataset_empty.copy(name=str(observation.obs_id)), observation ) counts.fill_events(observation.events) dataset_on_off = bkg_maker.run(dataset, observation) dataset_on_off = safe_mask_masker.run(dataset_on_off, observation) datasets.append(dataset_on_off) ###################################################################### # Now we can plot the off regions and target positions on top of the counts # map: # ax = counts.plot(cmap="viridis") geom.plot_region(ax=ax, kwargs_point={"color": "k", "marker": "*"}) plot_spectrum_datasets_off_regions(ax=ax, datasets=datasets) plt.show() ###################################################################### # Fit spectrum # ------------ # # | We perform a joint likelihood fit of the two datasets. # | For these particular datasets we select a fit range between # :math:`80\,{\rm GeV}` and :math:`20\,{\rm TeV}`. # e_min = 80 * u.GeV e_max = 20 * u.TeV for dataset in datasets: dataset.mask_fit = dataset.counts.geom.energy_mask(e_min, e_max) spectral_model = LogParabolaSpectralModel( amplitude=1e-12 * u.Unit("cm-2 s-1 TeV-1"), alpha=2, beta=0.1, reference=1 * u.TeV, ) model = SkyModel(spectral_model=spectral_model, name="crab") datasets.models = [model] fit = Fit() result = fit.run(datasets=datasets) # we make a copy here to compare it later best_fit_model = model.copy() ###################################################################### # Fit quality and model residuals # ------------------------------- # ###################################################################### # We can access the results dictionary to see if the fit converged: # print(result) ###################################################################### # and check the best-fit parameters # display(datasets.models.to_parameters_table()) ###################################################################### # A simple way to inspect the model residuals is using the function # `~SpectrumDataset.plot_fit()` # ax_spectrum, ax_residuals = datasets[0].plot_fit() ax_spectrum.set_ylim(0.1, 120) plt.show() ###################################################################### # For more ways of assessing fit quality, please refer to the dedicated # :doc:`/tutorials/api/fitting` tutorial. # ###################################################################### # Compare against the literature # ------------------------------ # # Let us compare the spectrum we obtained against a `previous measurement # by # MAGIC `__.
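#
# As a numeric preview (a minimal sketch), we can print the two spectral
# models before overlaying them:

print(best_fit_model.spectral_model)
print(create_crab_spectral_model("magic_lp"))

######################################################################
# And the comparison plot: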
# fig, ax = plt.subplots() plot_kwargs = { "energy_bounds": [0.08, 20] * u.TeV, "sed_type": "e2dnde", "yunits": u.Unit("TeV cm-2 s-1"), "xunits": u.GeV, "ax": ax, } crab_magic_lp = create_crab_spectral_model("magic_lp") best_fit_model.spectral_model.plot( ls="-", lw=1.5, color="crimson", label="best fit", **plot_kwargs ) best_fit_model.spectral_model.plot_error(facecolor="crimson", alpha=0.4, **plot_kwargs) crab_magic_lp.plot(ls="--", lw=1.5, color="k", label="MAGIC reference", **plot_kwargs) ax.legend() ax.set_ylim([1e-13, 1e-10]) plt.show() ###################################################################### # Dataset simulations # ------------------- # # A common way to check if a fit is biased is to simulate multiple datasets with # the obtained best-fit model, and check the distribution of the fitted parameters. # Here, we show how to perform one such simulation assuming the measured off counts # provide a good distribution of the background. # dataset_simulated = datasets.stack_reduce().copy(name="simulated_ds") simulated_model = best_fit_model.copy(name="simulated") dataset_simulated.models = simulated_model dataset_simulated.fake( npred_background=dataset_simulated.counts_off * dataset_simulated.alpha ) dataset_simulated.peek() plt.show() # The important thing to note here is that while this samples the on-counts, the off counts are # not sampled. If you have multiple measurements of the off counts, they should be used. # Alternatively, you can try to create a parametric model of the background. result = fit.run(datasets=[dataset_simulated]) print(result.models.to_parameters_table()) # sphinx_gallery_thumbnail_number = 4 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-1d/spectrum_simulation.py0000644000175100001770000002036114721316200025064 0ustar00runnerdocker""" 1D spectrum simulation ====================== Simulate a number of spectral on-off observations of a source with a power-law spectral model using the CTA 1DC response and fit them with the assumed spectral model. Prerequisites ------------- - Knowledge of spectral extraction and datasets used in gammapy, see for instance the :doc:`spectral analysis tutorial ` Context ------- To simulate a specific observation, it is not always necessary to simulate the full photon list. For many use cases, directly simulating a reduced binned dataset is enough: the IRFs reduced in the correct geometry are combined with a source model to predict an actual number of counts per bin. The latter is then used to simulate a reduced dataset using the Poisson probability distribution. This can be done to check the feasibility of a measurement, to test whether fitted parameters really provide a good fit to the data, etc. Here we will see how to perform a 1D spectral simulation of a CTAO observation; in particular, we will generate OFF observations following the template background stored in the CTAO IRFs.
**Objective: simulate a number of spectral ON-OFF observations of a source with a power-law spectral model with CTAO using the CTA 1DC response, fit them with the assumed spectral model and check that the distribution of fitted parameters is consistent with the input values.** Proposed approach ----------------- We will use the following classes and functions: - `~gammapy.datasets.SpectrumDatasetOnOff` - `~gammapy.datasets.SpectrumDataset` - `~gammapy.irf.load_irf_dict_from_file` - `~gammapy.modeling.models.PowerLawSpectralModel` """ import numpy as np import astropy.units as u from astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion # %matplotlib inline import matplotlib.pyplot as plt ###################################################################### # Setup # ----- # from IPython.display import display from gammapy.data import FixedPointingInfo, Observation, observatory_locations from gammapy.datasets import Datasets, SpectrumDataset, SpectrumDatasetOnOff from gammapy.irf import load_irf_dict_from_file from gammapy.makers import SpectrumDatasetMaker from gammapy.maps import MapAxis, RegionGeom from gammapy.modeling import Fit from gammapy.modeling.models import PowerLawSpectralModel, SkyModel ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Simulation of a single spectrum # ------------------------------- # # To do a simulation, we need to define the observational parameters like # the livetime, the offset, the assumed integration radius, the energy # range to perform the simulation for, and the choice of spectral model. We # then use an in-memory observation which is convolved with the IRFs to # get the predicted number of counts. This is Poisson fluctuated using # the `fake()` method to get the simulated counts for each observation. # # Define simulation parameters livetime = 1 * u.h pointing_position = SkyCoord(0, 0, unit="deg", frame="galactic") # We want to simulate an observation pointing at a fixed position in the sky. # For this, we use the `FixedPointingInfo` class pointing = FixedPointingInfo( fixed_icrs=pointing_position.icrs, ) offset = 0.5 * u.deg # Reconstructed and true energy axis energy_axis = MapAxis.from_edges( np.logspace(-0.5, 1.0, 10), unit="TeV", name="energy", interp="log" ) energy_axis_true = MapAxis.from_edges( np.logspace(-1.2, 2.0, 31), unit="TeV", name="energy_true", interp="log" ) on_region_radius = Angle("0.11 deg") center = pointing_position.directional_offset_by( position_angle=0 * u.deg, separation=offset ) on_region = CircleSkyRegion(center=center, radius=on_region_radius) # Define spectral model - a simple Power Law in this case model_simu = PowerLawSpectralModel( index=3.0, amplitude=2.5e-12 * u.Unit("cm-2 s-1 TeV-1"), reference=1 * u.TeV, ) print(model_simu) # we set the sky model used in the dataset model = SkyModel(spectral_model=model_simu, name="source") ###################################################################### # Load the IRFs # In this simulation, we use the CTA-1DC IRFs shipped with Gammapy.
irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) location = observatory_locations["cta_south"] obs = Observation.create( pointing=pointing, livetime=livetime, irfs=irfs, location=location, ) print(obs) ###################################################################### # Simulate spectra # # Make the SpectrumDataset geom = RegionGeom.create(region=on_region, axes=[energy_axis]) dataset_empty = SpectrumDataset.create( geom=geom, energy_axis_true=energy_axis_true, name="obs-0" ) maker = SpectrumDatasetMaker(selection=["exposure", "edisp", "background"]) dataset = maker.run(dataset_empty, obs) # Set the model on the dataset, and fake dataset.models = model dataset.fake(random_state=42) print(dataset) ###################################################################### # You can see that background counts are now simulated # ###################################################################### # On-Off analysis # ~~~~~~~~~~~~~~~ # # To do an on-off spectral analysis, which is the usual science case, the # standard would be to use `SpectrumDatasetOnOff`, which uses the # acceptance to fake off-counts. Please also refer to the `Dataset simulations` # section in the :doc:`/tutorials/analysis-1d/spectral_analysis_rad_max` tutorial, # dealing with simulations based on observations of real off counts. # dataset_on_off = SpectrumDatasetOnOff.from_spectrum_dataset( dataset=dataset, acceptance=1, acceptance_off=5 ) dataset_on_off.fake(npred_background=dataset.npred_background()) print(dataset_on_off) ###################################################################### # You can see that off counts are now simulated as well. We now simulate # several spectra using the same set of observation conditions. # # %%time n_obs = 100 datasets = Datasets() for idx in range(n_obs): dataset_on_off.fake(random_state=idx, npred_background=dataset.npred_background()) dataset_fake = dataset_on_off.copy(name=f"obs-{idx}") dataset_fake.meta_table["OBS_ID"] = [idx] datasets.append(dataset_fake) table = datasets.info_table() display(table) ###################################################################### # Before moving on to the fit let’s have a look at the simulated # observations. # fig, axes = plt.subplots(1, 3, figsize=(12, 4)) axes[0].hist(table["counts"]) axes[0].set_xlabel("Counts") axes[1].hist(table["counts_off"]) axes[1].set_xlabel("Counts Off") axes[2].hist(table["excess"]) axes[2].set_xlabel("excess") plt.show() ###################################################################### # Now, we fit each simulated spectrum individually # # %%time results = [] fit = Fit() for dataset in datasets: dataset.models = model.copy() result = fit.optimize(dataset) results.append( { "index": result.parameters["index"].value, "amplitude": result.parameters["amplitude"].value, } ) ###################################################################### # We take a look at the distribution of the fitted indices. This matches # very well with the spectrum that we initially injected. # fig, ax = plt.subplots() index = np.array([_["index"] for _ in results]) ax.hist(index, bins=10, alpha=0.5) ax.axvline(x=model_simu.parameters["index"].value, color="red") ax.set_xlabel("Reconstructed spectral index") print(f"index: {index.mean()} +/- {index.std()}") plt.show() ###################################################################### # Exercises # --------- # # - Change the observation time to something longer or shorter.
Do the # observation and spectrum results change as you expected? # - Change the spectral model, e.g. add a cutoff at 5 TeV, or put a # steep-spectrum source with a spectral index of 4.0 # - Simulate spectra with the spectral model we just defined. How much # observation time do you need to get back the injected parameters? # ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.168642 gammapy-1.3/examples/tutorials/analysis-2d/0000755000175100001770000000000014721316215020411 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-2d/README.rst0000644000175100001770000000002114721316200022063 0ustar00runnerdocker2D Image --------././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-2d/detect.py0000644000175100001770000002306114721316200022227 0ustar00runnerdocker""" Source detection and significance maps ====================================== Build a list of significant excesses in a Fermi-LAT map. Context ------- The first task in a source catalog production is to identify significant excesses in the data that can be associated with unknown sources and provide a preliminary parametrization in terms of position, extent, and flux. In this notebook we will use Fermi-LAT data to illustrate how to detect candidate sources in counts images with known background. **Objective: build a list of significant excesses in a Fermi-LAT map** Proposed approach ----------------- This notebook shows how to do source detection with Gammapy using the methods available in `~gammapy.estimators`. We will use images from a Fermi-LAT 3FHL high-energy Galactic center dataset to do this: - perform adaptive smoothing on the counts image - produce 2-dimensional test-statistics (TS) - run a peak finder to detect point-source candidates - compute Li & Ma significance images - estimate source candidate radius and excess counts Note that what we do here is a quick-look analysis; the production of real source catalogs uses more elaborate procedures.
We will work with the following functions and classes: - `~gammapy.maps.WcsNDMap` - `~gammapy.estimators.ASmoothMapEstimator` - `~gammapy.estimators.TSMapEstimator` - `~gammapy.estimators.utils.find_peaks` """ ###################################################################### # Setup # ----- # # As always, let’s get started with some setup … # import numpy as np import astropy.units as u # %matplotlib inline import matplotlib.pyplot as plt from IPython.display import display from gammapy.datasets import MapDataset from gammapy.estimators import ASmoothMapEstimator, TSMapEstimator from gammapy.estimators.utils import find_peaks, find_peaks_in_flux_map from gammapy.irf import EDispKernelMap, PSFMap from gammapy.maps import Map from gammapy.modeling.models import PointSpatialModel, PowerLawSpectralModel, SkyModel ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Read in input images # -------------------- # # We first read the relevant maps: # counts = Map.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts-cube.fits.gz") background = Map.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-background-cube.fits.gz" ) exposure = Map.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-exposure-cube.fits.gz") psfmap = PSFMap.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-psf-cube.fits.gz", format="gtpsf", ) edisp = EDispKernelMap.from_diagonal_response( energy_axis=counts.geom.axes["energy"], energy_axis_true=exposure.geom.axes["energy_true"], ) dataset = MapDataset( counts=counts, background=background, exposure=exposure, psf=psfmap, name="fermi-3fhl-gc", edisp=edisp, ) ###################################################################### # Adaptive smoothing # ------------------ # # For visualisation purposes it can be nice to look at a smoothed counts # image. This can be performed using the adaptive smoothing algorithm from # `Ebeling et # al. (2006) `__. # # In the following example the `ASmoothMapEstimator.threshold` argument gives the minimum # significance expected; values below are clipped. # # %%time scales = u.Quantity(np.arange(0.05, 1, 0.05), unit="deg") smooth = ASmoothMapEstimator(threshold=3, scales=scales, energy_edges=[10, 500] * u.GeV) images = smooth.run(dataset) plt.figure(figsize=(9, 5)) images["flux"].plot(add_cbar=True, stretch="asinh") plt.show() ###################################################################### # TS map estimation # ----------------- # # The Test Statistic, :math:`TS = 2 \Delta \log L` (`Mattox et # al. 1996 `__), # compares the likelihood function L optimized with and without a given # source. The TS map is computed by fitting a single amplitude # parameter on each pixel as described in Appendix A of `Stewart # (2009) `__. # The fit is simplified by finding roots of the derivative of the fit # statistics (default settings use `Brent’s # method `__). # # We first need to define the model that will be used to test for the # existence of a source. Here, we use a point source. # spatial_model = PointSpatialModel() # We choose units consistent with the map units here... spectral_model = PowerLawSpectralModel(amplitude="1e-22 cm-2 s-1 keV-1", index=2) model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) ###################################################################### # Here we show a full configuration of the estimator.
We remind the user of the meaning # of the various quantities: # # - ``model``: a `~gammapy.modeling.models.SkyModel` which is converted to a source model kernel # - ``kernel_width``: the width for the above kernel # - ``n_sigma``: number of sigma for the flux error # - ``n_sigma_ul``: the number of sigma for the flux upper limits # - ``selection_optional``: what optional maps to compute # - ``n_jobs``: for running in parallel, the number of processes used for the computation # - ``sum_over_energy_groups``: to sum over the energy groups or fit the `norm` on the full energy cube # %%time estimator = TSMapEstimator( model=model, kernel_width="1 deg", energy_edges=[10, 500] * u.GeV, n_sigma=1, n_sigma_ul=2, selection_optional=None, n_jobs=1, sum_over_energy_groups=True, ) maps = estimator.run(dataset) ###################################################################### # Accessing and visualising results # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Below we print the result of the `~gammapy.estimators.TSMapEstimator`. We have access to a number of # different quantities, as shown below. We can also access the quantity names # through ``map_result.available_quantities``. # print(maps) ###################################################################### # fig, (ax1, ax2, ax3) = plt.subplots( ncols=3, figsize=(20, 3), subplot_kw={"projection": counts.geom.wcs}, gridspec_kw={"left": 0.1, "right": 0.98}, ) maps["sqrt_ts"].plot(ax=ax1, add_cbar=True) ax1.set_title("Significance map") maps["flux"].plot(ax=ax2, add_cbar=True, stretch="sqrt", vmin=0) ax2.set_title("Flux map") maps["niter"].plot(ax=ax3, add_cbar=True) ax3.set_title("Iteration map") plt.show() ###################################################################### # The flux in each pixel is obtained by multiplying a reference model with a # normalisation factor: print(maps.reference_model) ###################################################################### # maps.norm.plot(add_cbar=True, stretch="sqrt") plt.show() ###################################################################### # Source candidates # ----------------- # # Let’s run a peak finder on the `sqrt_ts` image to get a list of # point-source candidates (positions and peak `sqrt_ts` values). The # `~gammapy.estimators.utils.find_peaks` function performs a local maximum search in a sliding # window; the argument `min_distance` is the minimum pixel distance # between peaks (smallest possible value and default is 1 pixel). # sources = find_peaks(maps["sqrt_ts"], threshold=5, min_distance="0.25 deg") nsou = len(sources) display(sources) # Plot sources on top of significance sky image plt.figure(figsize=(9, 5)) ax = maps["sqrt_ts"].plot(add_cbar=True) ax.scatter( sources["ra"], sources["dec"], transform=ax.get_transform("icrs"), color="none", edgecolor="w", marker="o", s=600, lw=1.5, ) plt.show() # sphinx_gallery_thumbnail_number = 3 ###################################################################### # We can also utilise `~gammapy.estimators.utils.find_peaks_in_flux_map` # to display various parameters from the FluxMaps sources_flux_map = find_peaks_in_flux_map(maps, threshold=5, min_distance="0.25 deg") display(sources_flux_map) ###################################################################### # Note that we used the instrument point-spread-function (PSF) as kernel, # so the hypothesis we test is the presence of a point source. In order to # test for extended sources we would have to use as kernel an extended # template convolved with the PSF, as sketched below.
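#
# A minimal sketch of such an extended test (hypothetical parameters; a disk
# morphology stands in for the extended template):

from gammapy.modeling.models import DiskSpatialModel

extended_model = SkyModel(
    spatial_model=DiskSpatialModel(r_0="0.3 deg"),  # assumed source radius
    spectral_model=PowerLawSpectralModel(amplitude="1e-22 cm-2 s-1 keV-1", index=2),
)
# the estimator call would mirror the point-source case above, e.g.
# TSMapEstimator(model=extended_model, kernel_width="2 deg", ...)

######################################################################
#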
Alternatively, we can compute the # significance of an extended excess using the Li & Ma formalism, which is # faster as no fitting is involved. # ###################################################################### # What next? # ---------- # # In this notebook, we have seen how to work with images and compute TS # and significance images from counts data, if a background estimate is # already available. # # Here are some suggestions for what to do next: # # - Look at how background estimation is performed for IACTs with and # without the high level interface in # :doc:`/tutorials/starting/analysis_1` and # :doc:`/tutorials/starting/analysis_2` notebooks, # respectively # - Learn about 2D model fitting in the :doc:`/tutorials/analysis-2d/modeling_2D` notebook # - Find more about Fermi-LAT data analysis in the # :doc:`/tutorials/data/fermi_lat` notebook # - Use source candidates to build a model and perform a 3D fitting (see # :doc:`/tutorials/analysis-3d/analysis_3d`, # :doc:`/tutorials/analysis-3d/analysis_mwl` notebooks for some hints) # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-2d/modeling_2D.py0000644000175100001770000001315514721316200023105 0ustar00runnerdocker""" 2D map fitting ============== Source modelling and fitting in stacked observations using the high level interface. Prerequisites ------------- - To understand how general modelling and fitting work in gammapy, please refer to the :doc:`/tutorials/analysis-3d/analysis_3d` tutorial. Context ------- We often want to determine the position and morphology of an object. To do so, we don’t necessarily have to resort to a full 3D fit but can perform a simple image fit, in particular in an energy range where the PSF does not vary strongly, or if we want to explore a possible energy dependence of the morphology. Objective --------- To localize a source and/or constrain its morphology. Proposed approach ----------------- The first step here, as in most analyses with DL3 data, is to create reduced datasets. For this, we will use the `Analysis` class to create a single set of stacked maps with a single bin in energy (thus, an *image* which behaves as a *cube*). We will then model this with a spatial model of our choice, while keeping the spectral model fixed to an integrated power law. """ # %matplotlib inline import astropy.units as u import matplotlib.pyplot as plt ###################################################################### # Setup # ----- # # As usual, we’ll start with some general imports… # from IPython.display import display from gammapy.analysis import Analysis, AnalysisConfig ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Creating the config file # ------------------------ # # Now, we create a config file for our analysis. You may load this from # disk if you have a pre-defined config file. # # Here, we use 3 simulated CTAO runs of the Galactic center. # config = AnalysisConfig() # Selecting the observations config.observations.datastore = "$GAMMAPY_DATA/cta-1dc/index/gps/" config.observations.obs_ids = [110380, 111140, 111159] ###################################################################### # Technically, gammapy implements 2D analysis as a special case of 3D # analysis (one bin in energy).
So, we must specify the type of # analysis as *3D*, and define the geometry of the analysis. # config.datasets.type = "3d" config.datasets.geom.wcs.skydir = { "lon": "0 deg", "lat": "0 deg", "frame": "galactic", } # The WCS geometry - centered on the galactic center config.datasets.geom.wcs.width = {"width": "8 deg", "height": "6 deg"} config.datasets.geom.wcs.binsize = "0.02 deg" # The FoV radius to use for cutouts config.datasets.geom.selection.offset_max = 2.5 * u.deg config.datasets.safe_mask.methods = ["offset-max"] config.datasets.safe_mask.parameters = {"offset_max": "2.5 deg"} config.datasets.background.method = "fov_background" config.fit.fit_range = {"min": "0.1 TeV", "max": "30.0 TeV"} # We now fix the energy axis for the counts map - (the reconstructed energy binning) config.datasets.geom.axes.energy.min = "0.1 TeV" config.datasets.geom.axes.energy.max = "10 TeV" config.datasets.geom.axes.energy.nbins = 1 config.datasets.geom.wcs.binsize_irf = "0.2 deg" print(config) ###################################################################### # Getting the reduced dataset # --------------------------- # ###################################################################### # We now use the config file and create a single `MapDataset` containing # `counts`, `background`, `exposure`, `psf` and `edisp` maps. # # %%time analysis = Analysis(config) analysis.get_observations() analysis.get_datasets() print(analysis.datasets["stacked"]) ###################################################################### # The counts and background maps have only one bin in reconstructed # energy. The exposure and IRF maps are in true energy, and hence, have a # different binning based upon the binning of the IRFs. We need not bother # about them presently. # print(analysis.datasets["stacked"].counts) print(analysis.datasets["stacked"].background) print(analysis.datasets["stacked"].exposure) ###################################################################### # We can have a quick look of these maps in the following way: # analysis.datasets["stacked"].counts.reduce_over_axes().plot(vmax=10, add_cbar=True) plt.show() ###################################################################### # Modelling # --------- # # Now, we define a model to be fitted to the dataset. **The important # thing to note here is the dummy spectral model - an integrated powerlaw # with only free normalisation**. 
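#
# Equivalently, the same model could be built directly in Python; a minimal
# sketch using the same values as the YAML definition that follows:
#
# from gammapy.modeling.models import PointSpatialModel, PowerLaw2SpectralModel, SkyModel
#
# spatial_model = PointSpatialModel(lon_0="0.02 deg", lat_0="0.01 deg", frame="galactic")
# spectral_model = PowerLaw2SpectralModel(
#     amplitude="1e-12 cm-2 s-1", index=2, emin="0.1 TeV", emax="10 TeV"
# )
# spectral_model.parameters["index"].frozen = True
# model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model, name="GC-1")
#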
Here, we use its YAML definition to load # it: # model_config = """ components: - name: GC-1 type: SkyModel spatial: type: PointSpatialModel frame: galactic parameters: - name: lon_0 value: 0.02 unit: deg - name: lat_0 value: 0.01 unit: deg spectral: type: PowerLaw2SpectralModel parameters: - name: amplitude value: 1.0e-12 unit: cm-2 s-1 - name: index value: 2.0 unit: '' frozen: true - name: emin value: 0.1 unit: TeV frozen: true - name: emax value: 10.0 unit: TeV frozen: true """ analysis.set_models(model_config) ###################################################################### # We will freeze the parameters of the background # analysis.datasets["stacked"].background_model.parameters["tilt"].frozen = True # To run the fit analysis.run_fit() # To see the best-fit values along with the errors display(analysis.models.to_parameters_table()) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-2d/ring_background.py0000644000175100001770000002201014721316200024110 0ustar00runnerdocker""" Ring background map =================== Create an excess (gamma-ray events) and a significance map extracting a ring background. Context ------- One of the challenges of IACT analysis is accounting for the large residual hadronic emission. An excess map, assumed to be a map of only gamma-ray events, requires a good estimate of the background. However, in the absence of a solid template background model it is not possible to obtain a reliable background model a priori. It was often found necessary in classical Cherenkov astronomy to perform a local renormalization of the existing templates, usually with a ring kernel. This assumes that most of the events are background and requires an exclusion mask to remove regions with bright signal from the estimation. To read more about this method, see `here. `__ Objective --------- Create an excess (gamma-ray events) map of MSH 15-52 as well as a significance map to determine how solid the signal is. Proposed approach ----------------- The analysis workflow is roughly: - Compute the sky maps keeping each observation separately using the `~gammapy.analysis.Analysis` class - Estimate the background using the `~gammapy.makers.RingBackgroundMaker` - Compute the correlated excess and significance maps using the `~gammapy.estimators.ExcessMapEstimator` The normalised background thus obtained can be used for general modelling and fitting. """ ###################################################################### # Setup # ----- # # As usual, we’ll start with some general imports… # import logging # %matplotlib inline import astropy.units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion import matplotlib.pyplot as plt from gammapy.analysis import Analysis, AnalysisConfig from gammapy.datasets import MapDatasetOnOff from gammapy.estimators import ExcessMapEstimator from gammapy.makers import RingBackgroundMaker from gammapy.visualization import plot_distribution from gammapy.utils.check import check_tutorials_setup log = logging.getLogger(__name__) ###################################################################### # Check setup # ----------- check_tutorials_setup() ###################################################################### # Creating the config file # ------------------------ # # Now, we create a config file for our analysis. You may load this from # disk if you have a pre-defined config file. # # In this example, we will use a few H.E.S.S.
# runs on the pulsar wind nebula,
# MSH 15-52.
#

# source_pos = SkyCoord.from_name("MSH 15-52")
source_pos = SkyCoord(228.32, -59.08, unit="deg")

config = AnalysisConfig()
# Select observations - 2.5 degrees from the source position
config.observations.datastore = "$GAMMAPY_DATA/hess-dl3-dr1/"
config.observations.obs_cone = {
    "frame": "icrs",
    "lon": source_pos.ra,
    "lat": source_pos.dec,
    "radius": 2.5 * u.deg,
}

config.datasets.type = "3d"
config.datasets.geom.wcs.skydir = {
    "lon": source_pos.ra,
    "lat": source_pos.dec,
    "frame": "icrs",
}  # The WCS geometry - centered on MSH 15-52
config.datasets.geom.wcs.width = {"width": "3 deg", "height": "3 deg"}
config.datasets.geom.wcs.binsize = "0.02 deg"

# Cutout size (for the run-wise event selection)
config.datasets.geom.selection.offset_max = 2.5 * u.deg

# We now fix the energy axis for the counts map - (the reconstructed energy binning)
config.datasets.geom.axes.energy.min = "0.5 TeV"
config.datasets.geom.axes.energy.max = "10 TeV"
config.datasets.geom.axes.energy.nbins = 10

# We need to extract the ring for each observation separately, hence, no stacking at this stage
config.datasets.stack = False

print(config)

######################################################################
# Getting the reduced dataset
# ---------------------------
#
# We now use the config file to do the initial data reduction, which will
# then be used for the ring extraction.
#

# Create the config:
analysis = Analysis(config)

# for this specific case, we do not need fine bins in true energy
analysis.config.datasets.geom.axes.energy_true = (
    analysis.config.datasets.geom.axes.energy
)

# First get the required observations
analysis.get_observations()

print(analysis.config)

# %%time
# Data extraction:
analysis.get_datasets()

######################################################################
# Extracting the ring background
# ------------------------------
#
# Since the ring background is extracted from real off events, we need to
# use Wstat statistics in this case. For this, we will use the
# `~gammapy.datasets.MapDatasetOnOff` and the `~gammapy.makers.RingBackgroundMaker` classes.
#

######################################################################
# Create exclusion mask
# ~~~~~~~~~~~~~~~~~~~~~
#
# First, we need to create an exclusion mask on the known sources. In
# this case, we need to mask only MSH 15-52, but this depends on the
# sources present in our field of view.
#

# get the geom that we use
geom = analysis.datasets[0].counts.geom
energy_axis = analysis.datasets[0].counts.geom.axes["energy"]
geom_image = geom.to_image().to_cube([energy_axis.squash()])

# Make the exclusion mask
regions = CircleSkyRegion(center=source_pos, radius=0.4 * u.deg)
exclusion_mask = ~geom_image.region_mask([regions])
exclusion_mask.sum_over_axes().plot()
plt.show()

######################################################################
# For the present analysis, we use a ring with an inner radius of 0.5 deg
# and width of 0.3 deg.
# ring_maker = RingBackgroundMaker( r_in="0.5 deg", width="0.3 deg", exclusion_mask=exclusion_mask ) ###################################################################### # Create a stacked dataset # ~~~~~~~~~~~~~~~~~~~~~~~~ # # Now, we extract the background for each dataset and then stack the maps # together to create a single stacked map for further analysis # energy_axis_true = analysis.datasets[0].exposure.geom.axes["energy_true"] stacked_on_off = MapDatasetOnOff.create( geom=geom_image, energy_axis_true=energy_axis_true, name="stacked" ) # %%time for dataset in analysis.datasets: # Ring extracting makes sense only for 2D analysis dataset_on_off = ring_maker.run(dataset.to_image()) stacked_on_off.stack(dataset_on_off) ###################################################################### # This `stacked_on_off` has `on` and `off` counts and acceptance # maps which we will use in all further analysis. The `acceptance` and # `acceptance_off` maps are the system acceptance of gamma-ray like # events in the `on` and `off` regions respectively. # print(stacked_on_off) ###################################################################### # Compute correlated significance and correlated excess maps # ---------------------------------------------------------- # # We need to convolve our maps with an appropriate smoothing kernel. The # significance is computed according to the Li & Ma expression for ON and # OFF Poisson measurements, see # `here `__. # # # Also, since the off counts are obtained with a ring background estimation, and are thus already correlated, we specify `correlate_off=False` # to avoid over correlation. # Using a convolution radius of 0.04 degrees estimator = ExcessMapEstimator(0.04 * u.deg, selection_optional=[], correlate_off=False) lima_maps = estimator.run(stacked_on_off) significance_map = lima_maps["sqrt_ts"] excess_map = lima_maps["npred_excess"] # We can plot the excess and significance maps fig, (ax1, ax2) = plt.subplots( figsize=(11, 4), subplot_kw={"projection": lima_maps.geom.wcs}, ncols=2 ) ax1.set_title("Significance map") significance_map.plot(ax=ax1, add_cbar=True) ax2.set_title("Excess map") excess_map.plot(ax=ax2, add_cbar=True) plt.show() ###################################################################### # It is often important to look at the significance distribution outside # the exclusion region to check that the background estimation is not # contaminated by gamma-ray events. This can be the case when exclusion # regions are not large enough. Typically, we expect the off distribution # to be a standard normal distribution. To compute the significance distribution outside the exclusion region, # we can recompute the maps after adding a `mask_fit` to our dataset. 
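######################################################################
# As a quick aside: the same Li & Ma statistic used by the estimator can
# be evaluated for a single ON/OFF measurement with
# `~gammapy.stats.WStatCountsStatistic`. This is only a sketch - the
# counts and `alpha` below are illustrative numbers, not values taken
# from the maps above:
#

from gammapy.stats import WStatCountsStatistic

# illustrative single-bin ON/OFF measurement with exposure ratio alpha
stat_check = WStatCountsStatistic(n_on=125, n_off=580, alpha=0.2)
print(f"Excess: {stat_check.n_sig:.1f}, significance: {stat_check.sqrt_ts:.2f}")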
# # Mask the regions with gamma-ray emission stacked_on_off.mask_fit = exclusion_mask lima_maps2 = estimator.run(stacked_on_off) significance_map_off = lima_maps2["sqrt_ts"] kwargs_axes = {"xlabel": "Significance", "yscale": "log", "ylim": (1e-3, 1)} ax, _ = plot_distribution( significance_map, kwargs_hist={ "density": True, "alpha": 0.5, "color": "red", "label": "all bins", "bins": 51, }, kwargs_axes=kwargs_axes, ) ax, res = plot_distribution( significance_map_off, ax=ax, func="norm", kwargs_hist={ "density": True, "alpha": 0.5, "color": "blue", "label": "off bins", "bins": 51, }, kwargs_axes=kwargs_axes, ) plt.show() # sphinx_gallery_thumbnail_number = 2 ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.168642 gammapy-1.3/examples/tutorials/analysis-3d/0000755000175100001770000000000014721316215020412 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-3d/README.rst0000644000175100001770000000001714721316200022071 0ustar00runnerdocker3D Cube -------././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-3d/analysis_3d.py0000644000175100001770000003405014721316200023171 0ustar00runnerdocker""" 3D detailed analysis ==================== Perform detailed 3D stacked and joint analysis. This tutorial does a 3D map based analysis on the galactic center, using simulated observations from the CTA-1DC. We will use the high level interface for the data reduction, and then do a detailed modelling. This will be done in two different ways: - stacking all the maps together and fitting the stacked maps - handling all the observations separately and doing a joint fitting on all the maps """ from pathlib import Path import astropy.units as u from regions import CircleSkyRegion # %matplotlib inline import matplotlib.pyplot as plt from IPython.display import display from gammapy.analysis import Analysis, AnalysisConfig from gammapy.estimators import ExcessMapEstimator from gammapy.modeling import Fit from gammapy.modeling.models import ( ExpCutoffPowerLawSpectralModel, FoVBackgroundModel, Models, PointSpatialModel, SkyModel, ) ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup from gammapy.visualization import plot_distribution check_tutorials_setup() ###################################################################### # Analysis configuration # ---------------------- # # In this section we select observations and define the analysis # geometries, irrespective of joint/stacked analysis. For configuration of # the analysis, we will programmatically build a config file from scratch. # config = AnalysisConfig() # The config file is now empty, with only a few defaults specified. 
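######################################################################
# As an alternative to filling the configuration programmatically, as we
# do below, the same settings can be provided as YAML text via
# `AnalysisConfig.from_yaml`. A minimal sketch (the values shown are
# illustrative and this object is not used further):
#

config_from_yaml = AnalysisConfig.from_yaml(
    """
    observations:
        datastore: $GAMMAPY_DATA/cta-1dc/index/gps/
    datasets:
        type: 3d
    """
)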
print(config) # Selecting the observations config.observations.datastore = "$GAMMAPY_DATA/cta-1dc/index/gps/" config.observations.obs_ids = [110380, 111140, 111159] # Defining a reference geometry for the reduced datasets config.datasets.type = "3d" # Analysis type is 3D config.datasets.geom.wcs.skydir = { "lon": "0 deg", "lat": "0 deg", "frame": "galactic", } # The WCS geometry - centered on the galactic center config.datasets.geom.wcs.width = {"width": "10 deg", "height": "8 deg"} config.datasets.geom.wcs.binsize = "0.02 deg" # Cutout size (for the run-wise event selection) config.datasets.geom.selection.offset_max = 3.5 * u.deg config.datasets.safe_mask.methods = ["aeff-default", "offset-max"] # We now fix the energy axis for the counts map - (the reconstructed energy binning) config.datasets.geom.axes.energy.min = "0.1 TeV" config.datasets.geom.axes.energy.max = "10 TeV" config.datasets.geom.axes.energy.nbins = 10 # We now fix the energy axis for the IRF maps (exposure, etc) - (the true energy binning) config.datasets.geom.axes.energy_true.min = "0.08 TeV" config.datasets.geom.axes.energy_true.max = "12 TeV" config.datasets.geom.axes.energy_true.nbins = 14 print(config) ###################################################################### # Configuration for stacked and joint analysis # -------------------------------------------- # # This is done just by specifying the flag on `config.datasets.stack`. # Since the internal machinery will work differently for the two cases, we # will write it as two config files and save it to disc in YAML format for # future reference. # config_stack = config.copy(deep=True) config_stack.datasets.stack = True config_joint = config.copy(deep=True) config_joint.datasets.stack = False # To prevent unnecessary cluttering, we write it in a separate folder. 
path = Path("analysis_3d") path.mkdir(exist_ok=True) config_joint.write(path=path / "config_joint.yaml", overwrite=True) config_stack.write(path=path / "config_stack.yaml", overwrite=True) ###################################################################### # Stacked analysis # ---------------- # # Data reduction # ~~~~~~~~~~~~~~ # # We first show the steps for the stacked analysis and then repeat the # same for the joint analysis later # # Reading yaml file: config_stacked = AnalysisConfig.read(path=path / "config_stack.yaml") analysis_stacked = Analysis(config_stacked) # %%time # select observations: analysis_stacked.get_observations() # run data reduction analysis_stacked.get_datasets() ###################################################################### # We have one final dataset, which we can print and explore # dataset_stacked = analysis_stacked.datasets["stacked"] print(dataset_stacked) ###################################################################### # To plot a smooth counts map # dataset_stacked.counts.smooth(0.02 * u.deg).plot_interactive(add_cbar=True) plt.show() ###################################################################### # And the background map # dataset_stacked.background.plot_interactive(add_cbar=True) plt.show() ###################################################################### # We can quickly check the PSF # dataset_stacked.psf.peek() plt.show() ###################################################################### # And the energy dispersion in the center of the map # dataset_stacked.edisp.peek() plt.show() ###################################################################### # You can also get an excess image with a few lines of code: # excess = dataset_stacked.excess.sum_over_axes() excess.smooth("0.06 deg").plot(stretch="sqrt", add_cbar=True) plt.show() ###################################################################### # Modeling and fitting # ~~~~~~~~~~~~~~~~~~~~ # # Now comes the interesting part of the analysis - choosing appropriate # models for our source and fitting them. # # We choose a point source model with an exponential cutoff power-law # spectrum. # ###################################################################### # To perform the fit on a restricted energy range, we can create a # specific *mask*. On the dataset, the `mask_fit` is a `Map` sharing # the same geometry as the `MapDataset` and containing boolean data. # # To create a mask to limit the fit within a restricted energy range, one # can rely on the `~gammapy.maps.Geom.energy_mask()` method. # # For more details on masks and the techniques to create them in gammapy, # please checkout the dedicated :doc:`/tutorials/api/mask_maps` tutorial. 
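######################################################################
# As an aside, an energy mask can also be combined with a spatial mask.
# The following is only a sketch - the 3 deg circle is an illustrative
# region, and in this tutorial we apply a pure energy mask, as shown
# below:
#

from astropy.coordinates import SkyCoord

geom_check = dataset_stacked.counts.geom
mask_energy_check = geom_check.energy_mask(energy_min=0.3 * u.TeV)
mask_space_check = geom_check.region_mask(
    [CircleSkyRegion(SkyCoord(0, 0, unit="deg", frame="galactic"), 3 * u.deg)]
)
# boolean AND of the two masks, assuming boolean Map arithmetic
mask_combined_check = mask_energy_check & mask_space_check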
# dataset_stacked.mask_fit = dataset_stacked.counts.geom.energy_mask( energy_min=0.3 * u.TeV, energy_max=None ) spatial_model = PointSpatialModel( lon_0="-0.05 deg", lat_0="-0.05 deg", frame="galactic" ) spectral_model = ExpCutoffPowerLawSpectralModel( index=2.3, amplitude=2.8e-12 * u.Unit("cm-2 s-1 TeV-1"), reference=1.0 * u.TeV, lambda_=0.02 / u.TeV, ) model = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="gc-source", ) bkg_model = FoVBackgroundModel(dataset_name="stacked") bkg_model.spectral_model.norm.value = 1.3 models_stacked = Models([model, bkg_model]) dataset_stacked.models = models_stacked # %%time fit = Fit(optimize_opts={"print_level": 1}) result = fit.run(datasets=[dataset_stacked]) ###################################################################### # Fit quality assessment and model residuals for a `MapDataset` # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # ###################################################################### # We can access the results dictionary to see if the fit converged: # print(result) ###################################################################### # Check best-fit parameters and error estimates: # display(models_stacked.to_parameters_table()) ###################################################################### # A quick way to inspect the model residuals is using the function # `~MapDataset.plot_residuals_spatial()`. This function computes and # plots a residual image (by default, the smoothing radius is `0.1 deg` # and `method=diff`, which corresponds to a simple `data - model` # plot): # dataset_stacked.plot_residuals_spatial(method="diff/sqrt(model)", vmin=-1, vmax=1) plt.show() ###################################################################### # The more general function `~MapDataset.plot_residuals()` can also # extract and display spectral residuals in a region: # region = CircleSkyRegion(spatial_model.position, radius=0.15 * u.deg) dataset_stacked.plot_residuals( kwargs_spatial=dict(method="diff/sqrt(model)", vmin=-1, vmax=1), kwargs_spectral=dict(region=region), ) plt.show() ###################################################################### # This way of accessing residuals is quick and handy, but comes with # limitations. For example: - In case a fitting energy range was defined # using a `MapDataset.mask_fit`, it won’t be taken into account. 
#   Residuals will be summed up over the whole reconstructed energy range
# - In order to make a proper statistical treatment, a residual
#   significance map should be computed instead of simple residuals
#
# A more accurate way to inspect spatial residuals is the following:
#

estimator = ExcessMapEstimator(
    correlation_radius="0.1 deg",
    selection_optional=[],
    energy_edges=[0.1, 1, 10] * u.TeV,
)

result = estimator.run(dataset_stacked)
result["sqrt_ts"].plot_grid(
    figsize=(12, 4), cmap="coolwarm", add_cbar=True, vmin=-5, vmax=5, ncols=2
)
plt.show()

######################################################################
# Distribution of residuals significance in the full map geometry:
#

significance_map = result["sqrt_ts"]

kwargs_hist = {"density": True, "alpha": 0.9, "color": "red", "bins": 40}

ax, res = plot_distribution(
    significance_map,
    func="norm",
    kwargs_hist=kwargs_hist,
    kwargs_axes={"xlim": (-5, 5)},
)
plt.show()

######################################################################
# Here we could also plot the number of predicted counts for each model
# and for the background in our dataset by using the
# `~gammapy.visualization.plot_npred_signal` function.
#

######################################################################
# Joint analysis
# --------------
#
# In this section, we perform a joint analysis of the same data. Of
# course, joint fitting is considerably heavier than a stacked one, and
# should always be handled with care. For brevity, we only show the
# analysis for a point-source fit, without re-adding a diffuse
# component.
#
# Data reduction
# ~~~~~~~~~~~~~~
#

# %%time

# Read the yaml file from disk
config_joint = AnalysisConfig.read(path=path / "config_joint.yaml")
analysis_joint = Analysis(config_joint)

# select observations:
analysis_joint.get_observations()

# run data reduction
analysis_joint.get_datasets()

# You can see there are 3 datasets now
print(analysis_joint.datasets)

######################################################################
# You can access each one by name or by index, e.g.:
#

print(analysis_joint.datasets[0])

######################################################################
# After the data reduction stage, it is useful to get a quick summary of
# the datasets. Here, we look at the statistics in the center of the map,
# by passing an appropriate `region`. To get info on the entire spatial
# map, omit the region argument.
# display(analysis_joint.datasets.info_table()) models_joint = Models() model_joint = model.copy(name="source-joint") models_joint.append(model_joint) for dataset in analysis_joint.datasets: bkg_model = FoVBackgroundModel(dataset_name=dataset.name) models_joint.append(bkg_model) print(models_joint) # and set the new model analysis_joint.datasets.models = models_joint # %%time fit_joint = Fit() result_joint = fit_joint.run(datasets=analysis_joint.datasets) ###################################################################### # Fit quality assessment and model residuals for a joint `Datasets` # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # ###################################################################### # We can access the results dictionary to see if the fit converged: # print(result_joint) ###################################################################### # Check best-fit parameters and error estimates: # print(models_joint) ###################################################################### # Since the joint dataset is made of multiple datasets, we can either: - # Look at the residuals for each dataset separately. In this case, we can # directly refer to the section # `Fit quality and model residuals for a MapDataset` in this notebook - # Or, look at a stacked residual map. # stacked = analysis_joint.datasets.stack_reduce() stacked.models = [model_joint] plt.figure() stacked.plot_residuals_spatial(vmin=-1, vmax=1) ###################################################################### # Then, we can access the stacked model residuals as previously shown in # the section `Fit quality and model residuals for a MapDataset` in this # notebook. # ###################################################################### # Finally, let us compare the spectral results from the stacked and joint # fit: # def plot_spectrum(model, ax, label, color): spec = model.spectral_model energy_bounds = [0.3, 10] * u.TeV spec.plot( ax=ax, energy_bounds=energy_bounds, energy_power=2, label=label, color=color ) spec.plot_error(ax=ax, energy_bounds=energy_bounds, energy_power=2, color=color) fig, ax = plt.subplots() plot_spectrum(model, ax=ax, label="stacked", color="tab:blue") plot_spectrum(model_joint, ax=ax, label="joint", color="tab:orange") ax.legend() plt.show() ###################################################################### # Summary # ------- # # Note that this notebook aims to show you the procedure of a 3D analysis # using just a few observations. Results get much better for a more # complete analysis considering the GPS dataset from the CTA First Data # Challenge (DC-1) and also the CTA model for the Galactic diffuse # emission, as shown in the next image: # ###################################################################### # .. image:: ../../_static/DC1_3d.png # ###################################################################### # Exercises # --------- # # - Analyse the second source in the field of view: G0.9+0.1 and add it # to the combined model. # - Perform modeling in more details - Add diffuse component, get flux # points. # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-3d/analysis_mwl.py0000644000175100001770000002061714721316200023466 0ustar00runnerdocker""" Multi instrument joint 3D and 1D analysis ========================================= Joint 3D analysis using 3D Fermi datasets, a H.E.S.S. reduced spectrum and HAWC flux points. 
Prerequisites
-------------

- Handling of Fermi-LAT data with Gammapy, see the
  :doc:`/tutorials/data/fermi_lat` tutorial.
- Knowledge of spectral analysis to produce 1D On-Off datasets, see the
  following :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial.
- Using flux points to directly fit a model (without forward-folding)
  from the :doc:`/tutorials/analysis-1d/sed_fitting` tutorial.

Context
-------

Some science studies require combining heterogeneous data from various
instruments to extract physical information. In particular, it is often
useful to add flux measurements of a source at different energies to an
analysis to better constrain the wide-band spectral parameters. This can
be done using a joint fit of heterogeneous datasets.

**Objectives: Constrain the spectral parameters of the gamma-ray emission
from the Crab nebula between 10 GeV and 100 TeV, using a 3D Fermi
dataset, a H.E.S.S. reduced spectrum and HAWC flux points.**

Proposed approach
-----------------

This tutorial illustrates how to perform a joint modeling and fitting of
the Crab Nebula spectrum using different datasets. The spectral
parameters are optimized by combining a 3D analysis of Fermi-LAT data,
an ON/OFF spectral analysis of H.E.S.S. data, and flux points from HAWC.

In this tutorial we are going to use pre-made datasets. We prepared maps
of the Crab region as seen by Fermi-LAT using the same event selection
as the `3FHL catalog `__ (7 years of data with energy from 10 GeV to
2 TeV). For the H.E.S.S. ON/OFF analysis we used two observations from
the `first public data release `__ with a significant signal from about
600 GeV to 10 TeV. These observations have an offset of 0.5° and a
zenith angle of 45-48°.

The HAWC flux points data are taken from a `recent analysis `__ based on
2.5 years of data with energy between 300 GeV and 300 TeV.

The setup
---------
"""

from pathlib import Path
from astropy import units as u
import matplotlib.pyplot as plt
from gammapy.datasets import Datasets, FluxPointsDataset, SpectrumDatasetOnOff
from gammapy.estimators import FluxPoints, FluxPointsEstimator
from gammapy.maps import MapAxis
from gammapy.modeling import Fit
from gammapy.modeling.models import Models, create_crab_spectral_model

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup
from gammapy.utils.scripts import make_path

check_tutorials_setup()

######################################################################
# Data and models files
# ---------------------
#
# The datasets serialization produces YAML files listing the datasets and
# models. In the following cells we show an example containing only the
# Fermi-LAT dataset and the Crab model.
#
# Fermi-LAT-3FHL_datasets.yaml:
#

path = make_path("$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_datasets.yaml")

with path.open("r") as f:
    print(f.read())

######################################################################
# We used a point source with a log-parabola spectrum as the model. The
# initial parameters were taken from the latest Fermi-LAT catalog
# `4FGL `__, and we then re-optimized
# the spectral parameters for our dataset in the 10 GeV - 2 TeV energy
# range (fixing the source position).
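######################################################################
# For reference, a log-parabola spectral model of this kind can also be
# constructed directly with the Python API. This is only a sketch - the
# parameter values are illustrative, not the fitted ones stored in the
# YAML file:
#

from gammapy.modeling.models import LogParabolaSpectralModel

log_parabola_check = LogParabolaSpectralModel(
    amplitude="3e-12 cm-2 s-1 TeV-1",
    reference="10 GeV",
    alpha=2.2,
    beta=0.1,
)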
#
# Fermi-LAT-3FHL_models.yaml:
#

path = make_path("$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_models.yaml")

with path.open("r") as f:
    print(f.read())

######################################################################
# Reading different datasets
# --------------------------
#
# Fermi-LAT 3FHL: map dataset for 3D analysis
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# For now, let’s use the datasets serialization only to read the 3D
# `MapDataset` associated with the Fermi-LAT 3FHL data and models.
#

path = Path("$GAMMAPY_DATA/fermi-3fhl-crab")
filename = path / "Fermi-LAT-3FHL_datasets.yaml"

datasets = Datasets.read(filename=filename)

models = Models.read(path / "Fermi-LAT-3FHL_models.yaml")

print(models)

######################################################################
# We get the Crab model in order to share it with the other datasets
#

print(models["Crab Nebula"])

######################################################################
# HESS-DL3: 1D ON/OFF dataset for spectral fitting
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# The ON/OFF datasets can be read from PHA files following the `OGIP
# standards `__.
# We read the PHA files from each observation, and compute a stacked
# dataset for simplicity. Then the Crab spectral model previously defined
# is added to the dataset.
#

datasets_hess = Datasets()

for obs_id in [23523, 23526]:
    dataset = SpectrumDatasetOnOff.read(
        f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits"
    )
    datasets_hess.append(dataset)

dataset_hess = datasets_hess.stack_reduce(name="HESS")

datasets.append(dataset_hess)

print(datasets)

######################################################################
# HAWC: 1D dataset for flux point fitting
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# The HAWC flux points are taken from https://arxiv.org/pdf/1905.12518.pdf.
# These flux points are read from a pre-made FITS file and passed to
# a `FluxPointsDataset` together with the source spectral model.
#

# read flux points from https://arxiv.org/pdf/1905.12518.pdf
filename = "$GAMMAPY_DATA/hawc_crab/HAWC19_flux_points.fits"
flux_points_hawc = FluxPoints.read(
    filename, reference_model=create_crab_spectral_model("meyer")
)

dataset_hawc = FluxPointsDataset(data=flux_points_hawc, name="HAWC")

datasets.append(dataset_hawc)

print(datasets)

######################################################################
# Datasets serialization
# ----------------------
#
# The `datasets` object contains each dataset previously defined. It can
# be saved on disk as datasets.yaml, models.yaml, and several data files
# specific to each dataset. Then the `datasets` can be rebuilt later
# from these files.
#

path = Path("crab-3datasets")
path.mkdir(exist_ok=True)

filename = path / "crab_10GeV_100TeV_datasets.yaml"

datasets.write(filename, overwrite=True)
datasets = Datasets.read(filename)
datasets.models = models

print(datasets)

######################################################################
# Joint analysis
# --------------
#
# We run the fit on the `Datasets` object that includes a dataset for
# each instrument
#

# %%time
fit_joint = Fit()
results_joint = fit_joint.run(datasets=datasets)

print(results_joint)

######################################################################
# Let’s display only the parameters of the Crab spectral model
#

crab_spec = datasets[0].models["Crab Nebula"].spectral_model

print(crab_spec)

######################################################################
# We can compute flux points for Fermi-LAT and H.E.S.S.
# datasets in order to plot
# them together with the HAWC flux points.
#

# compute Fermi-LAT and H.E.S.S. flux points
energy_edges = MapAxis.from_energy_bounds("10 GeV", "2 TeV", nbin=5).edges

flux_points_fermi = FluxPointsEstimator(
    energy_edges=energy_edges,
    source="Crab Nebula",
).run([datasets["Fermi-LAT"]])

energy_edges = MapAxis.from_bounds(1, 15, nbin=6, interp="log", unit="TeV").edges

flux_points_hess = FluxPointsEstimator(
    energy_edges=energy_edges, source="Crab Nebula", selection_optional=["ul"]
).run([datasets["HESS"]])

######################################################################
# Now, let’s plot the Crab spectrum fitted and the flux points of each
# instrument.
#

# display spectrum and flux points
fig, ax = plt.subplots(figsize=(8, 6))

energy_bounds = [0.01, 300] * u.TeV
sed_type = "e2dnde"

crab_spec.plot(ax=ax, energy_bounds=energy_bounds, sed_type=sed_type, label="Model")
crab_spec.plot_error(ax=ax, energy_bounds=energy_bounds, sed_type=sed_type)

flux_points_fermi.plot(ax=ax, sed_type=sed_type, label="Fermi-LAT")
flux_points_hess.plot(ax=ax, sed_type=sed_type, label="H.E.S.S.")
flux_points_hawc.plot(ax=ax, sed_type=sed_type, label="HAWC")

ax.set_xlim(energy_bounds)
ax.legend()
plt.show()
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/analysis-3d/cta_data_analysis.py0000644000175100001770000002644714721316200024436 0ustar00runnerdocker"""
Basic image exploration and fitting
===================================

Detect sources, produce a sky image and a spectrum using CTA-1DC data.

Introduction
------------

**This notebook shows an example of how to make a sky image and spectrum
for simulated CTAO data with Gammapy.**

The dataset we will use is three observation runs on the Galactic
Center. This is a tiny (and thus quick to process, play with, and learn
from) subset of the simulated CTAO dataset that was produced for the
first data challenge in August 2017.
"""

######################################################################
# Setup
# -----
#
# As usual, we’ll start with some setup …
#

# Configure the logger, so that the spectral analysis
# isn't so chatty about what it's doing.
import logging

import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import CircleSkyRegion
import matplotlib.pyplot as plt
from IPython.display import display
from gammapy.data import DataStore
from gammapy.datasets import Datasets, FluxPointsDataset, MapDataset, SpectrumDataset
from gammapy.estimators import FluxPointsEstimator, TSMapEstimator
from gammapy.estimators.utils import find_peaks
from gammapy.makers import (
    MapDatasetMaker,
    ReflectedRegionsBackgroundMaker,
    SafeMaskMaker,
    SpectrumDatasetMaker,
)
from gammapy.maps import MapAxis, RegionGeom, WcsGeom
from gammapy.modeling import Fit
from gammapy.modeling.models import (
    GaussianSpatialModel,
    PowerLawSpectralModel,
    SkyModel,
)
from gammapy.visualization import plot_npred_signal, plot_spectrum_datasets_off_regions
from gammapy.utils.check import check_tutorials_setup

logging.basicConfig()
log = logging.getLogger("gammapy.spectrum")
log.setLevel(logging.ERROR)

######################################################################
# Check setup
# -----------
check_tutorials_setup()

######################################################################
# Select observations
# -------------------
#
# A Gammapy analysis usually starts by creating a
# `~gammapy.data.DataStore` and selecting observations.
#
# This is shown in detail in other notebooks (see e.g. the
# :doc:`/tutorials/starting/analysis_2` tutorial); here, we choose three
# observations near the Galactic Center.
#

data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps")

# Just as a reminder: this is how to select observations
# from astropy.coordinates import SkyCoord
# table = data_store.obs_table
# pos_obs = SkyCoord(table['GLON_PNT'], table['GLAT_PNT'], frame='galactic', unit='deg')
# pos_target = SkyCoord(0, 0, frame='galactic', unit='deg')
# offset = pos_target.separation(pos_obs).deg
# mask = (1 < offset) & (offset < 2)
# table = table[mask]
# table.show_in_browser(jsviewer=True)

obs_id = [110380, 111140, 111159]
observations = data_store.get_observations(obs_id)

obs_cols = ["OBS_ID", "GLON_PNT", "GLAT_PNT", "LIVETIME"]
display(data_store.obs_table.select_obs_id(obs_id)[obs_cols])

######################################################################
# Make sky images
# ---------------
#
# Define map geometry
# ~~~~~~~~~~~~~~~~~~~
#
# Select the target position and define an ON region for the spectral
# analysis
#

axis = MapAxis.from_energy_bounds(
    0.1,
    10,
    nbin=10,
    unit="TeV",
    name="energy",
)
axis_true = MapAxis.from_energy_bounds(
    0.05,
    20,
    nbin=20,
    name="energy_true",
    unit="TeV",
)
geom = WcsGeom.create(
    skydir=(0, 0), npix=(500, 400), binsz=0.02, frame="galactic", axes=[axis]
)
print(geom)

######################################################################
# Compute images
# ~~~~~~~~~~~~~~
#

# %%time
stacked = MapDataset.create(geom=geom, energy_axis_true=axis_true)
maker = MapDatasetMaker(selection=["counts", "background", "exposure", "psf"])
maker_safe_mask = SafeMaskMaker(methods=["offset-max"], offset_max=2.5 * u.deg)

for obs in observations:
    cutout = stacked.cutout(obs.get_pointing_icrs(obs.tmid), width="5 deg")
    dataset = maker.run(cutout, obs)
    dataset = maker_safe_mask.run(dataset, obs)
    stacked.stack(dataset)

#
# The maps are cubes, with an energy axis.
# Let's also make some images:
#

dataset_image = stacked.to_image()
geom_image = dataset_image.geoms["geom"]

######################################################################
# Show images
# ~~~~~~~~~~~
#
# Let’s have a quick look at the images we computed …
#

fig, (ax1, ax2, ax3) = plt.subplots(
    figsize=(15, 5),
    ncols=3,
    subplot_kw={"projection": geom_image.wcs},
    gridspec_kw={"left": 0.1, "right": 0.9},
)

ax1.set_title("Counts map")
dataset_image.counts.smooth(2).plot(ax=ax1, vmax=5)

ax2.set_title("Background map")
dataset_image.background.plot(ax=ax2, vmax=5)

ax3.set_title("Excess map")
dataset_image.excess.smooth(3).plot(ax=ax3, vmax=2)
plt.show()

######################################################################
# Source Detection
# ----------------
#
# Use the class `~gammapy.estimators.TSMapEstimator` and function
# `~gammapy.estimators.utils.find_peaks` to detect sources on the images.
# We search for Gaussian sources with a sigma of 0.05 deg in the dataset.
# spatial_model = GaussianSpatialModel(sigma="0.05 deg") spectral_model = PowerLawSpectralModel(index=2) model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) ts_image_estimator = TSMapEstimator( model, kernel_width="0.5 deg", selection_optional=[], downsampling_factor=2, sum_over_energy_groups=False, energy_edges=[0.1, 10] * u.TeV, ) # %%time images_ts = ts_image_estimator.run(stacked) sources = find_peaks( images_ts["sqrt_ts"], threshold=5, min_distance="0.2 deg", ) display(sources) ###################################################################### # To get the position of the sources, simply # source_pos = SkyCoord(sources["ra"], sources["dec"]) print(source_pos) ###################################################################### # Plot sources on top of significance sky image # fig, ax = plt.subplots(figsize=(8, 6), subplot_kw={"projection": geom_image.wcs}) images_ts["sqrt_ts"].plot(ax=ax, add_cbar=True) ax.scatter( source_pos.ra.deg, source_pos.dec.deg, transform=ax.get_transform("icrs"), color="none", edgecolor="white", marker="o", s=200, lw=1.5, ) plt.show() ###################################################################### # Spatial analysis # ---------------- # # See other notebooks for how to run a 3D cube or 2D image based analysis. # ###################################################################### # Spectrum # -------- # # We’ll run a spectral analysis using the classical reflected regions # background estimation method, and using the on-off (often called WSTAT) # likelihood function. # target_position = SkyCoord(0, 0, unit="deg", frame="galactic") on_radius = 0.2 * u.deg on_region = CircleSkyRegion(center=target_position, radius=on_radius) exclusion_mask = ~geom.to_image().region_mask([on_region]) exclusion_mask.plot() plt.show() ###################################################################### # Configure spectral analysis energy_axis = MapAxis.from_energy_bounds(0.1, 40, 40, unit="TeV", name="energy") energy_axis_true = MapAxis.from_energy_bounds( 0.05, 100, 200, unit="TeV", name="energy_true" ) geom = RegionGeom.create(region=on_region, axes=[energy_axis]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true) dataset_maker = SpectrumDatasetMaker( containment_correction=False, selection=["counts", "exposure", "edisp"] ) bkg_maker = ReflectedRegionsBackgroundMaker(exclusion_mask=exclusion_mask) safe_mask_masker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) ###################################################################### # Run data reduction # %%time datasets = Datasets() for observation in observations: dataset = dataset_maker.run( dataset_empty.copy(name=f"obs-{observation.obs_id}"), observation ) dataset_on_off = bkg_maker.run(dataset, observation) dataset_on_off = safe_mask_masker.run(dataset_on_off, observation) datasets.append(dataset_on_off) ###################################################################### # Plot results plt.figure(figsize=(8, 6)) ax = dataset_image.counts.smooth("0.03 deg").plot(vmax=8) on_region.to_pixel(ax.wcs).plot(ax=ax, edgecolor="white") plot_spectrum_datasets_off_regions(datasets, ax=ax) plt.show() ###################################################################### # Model fit # ~~~~~~~~~ # # The next step is to fit a spectral model, using all data (i.e. a # “global” fit, using all energies). 
#

# %%time
spectral_model = PowerLawSpectralModel(
    index=2, amplitude=1e-11 * u.Unit("cm-2 s-1 TeV-1"), reference=1 * u.TeV
)

model = SkyModel(spectral_model=spectral_model, name="source-gc")

datasets.models = model

fit = Fit()
result = fit.run(datasets=datasets)
print(result)

######################################################################
# Here we can plot the predicted number of counts for each model and
# for the background in the dataset. This is especially useful when
# studying complex fields with a lot of sources. There is a function
# in the visualization sub-package of gammapy that does this
# automatically.
#

# First we need to stack our datasets.
stacked_dataset = datasets.stack_reduce(name="stacked")
stacked_dataset.models = model

print(stacked_dataset)

######################################################################
# Call `~gammapy.visualization.plot_npred_signal` to plot the predicted counts.
#

plot_npred_signal(stacked_dataset)
plt.show()

######################################################################
# Spectral points
# ~~~~~~~~~~~~~~~
#
# Finally, let’s compute spectral points. The method used is to first
# choose an energy binning, and then to do a 1-dim likelihood fit /
# profile to compute the flux and flux error.
#

# Flux points are computed on stacked datasets
energy_edges = MapAxis.from_energy_bounds("1 TeV", "30 TeV", nbin=5).edges

fpe = FluxPointsEstimator(energy_edges=energy_edges, source="source-gc")
flux_points = fpe.run(datasets=[stacked_dataset])
flux_points.to_table(sed_type="dnde", formatted=True)

######################################################################
# Plot
# ~~~~
#
# Let’s plot the spectral model and points. You could do it directly, but
# for convenience we bundle the model and the flux points in a
# `~gammapy.datasets.FluxPointsDataset`:
#

flux_points_dataset = FluxPointsDataset(data=flux_points, models=model)
flux_points_dataset.plot_fit()
plt.show()

######################################################################
# Exercises
# ---------
#
# - Re-run the analysis above, varying some analysis parameters, e.g.
#
#   - Select a few other observations
#   - Change the energy band for the map
#   - Change the spectral model for the fit
#   - Change the energy binning for the spectral points
#
# - Change the target. Make a sky image and spectrum for your favourite
#   source.
#
#   - If you don’t know any, the Crab nebula is the “hello world!”
#     analysis of gamma-ray astronomy.
#
# print('hello world')
# SkyCoord.from_name('crab')

######################################################################
# What next?
# ----------
#
# - This notebook showed an example of a first CTAO analysis with
#   Gammapy, using simulated 1DC data.
# - Let us know if you have any questions or issues!
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/analysis-3d/energy_dependent_estimation.py0000644000175100001770000002205314721316200026533 0ustar00runnerdocker"""
Morphological energy dependence estimation
==========================================

Learn how to test for energy-dependent morphology in your dataset.

Prerequisites
-------------
Knowledge of data reduction and datasets used in Gammapy, for example see
the :doc:`/tutorials/data/hess` and :doc:`/tutorials/analysis-2d/ring_background` tutorials.

Context
-------

This tutorial introduces a method to investigate the potential of
energy-dependent morphology from spatial maps.
It is possible for gamma-ray sources to exhibit energy-dependent morphology, in which the spatial morphology of the gamma rays varies across different energy bands. This is plausible for different source types, including pulsar wind nebulae (PWNe) and supernova remnants. HESS J1825−137 is a well-known example of a PWNe which shows a clear energy-dependent gamma-ray morphology (see `Aharonian et al., 2006 `__, `H.E.S.S. Collaboration et al., 2019 `__ and `Principe et al., 2020 `__.) Many different techniques of measuring this energy-dependence have been utilised over the years. The method shown here is to perform a morphological fit of extension and position in various energy slices and compare this with a global morphology fit. **Objective: Perform an energy-dependent morphology study of a mock source.** Tutorial overview ----------------- This tutorial consists of two main steps. Firstly, the user defines the initial `~gammapy.modeling.models.SkyModel` based on previous investigations and selects the energy bands of interest to test for energy dependence. The null hypothesis is defined as only the background component being free (norm). The alternative hypothesis introduces the source model. The results of this first step show the significance of the source above the background in each energy band. The second step is to quantify any energy-dependent morphology. The null hypothesis is determined by performing a joint fit of the parameters. In the alternative hypothesis, the free parameters of the model are fit individually within each energy band. Setup ----- """ from astropy import units as u from astropy.coordinates import SkyCoord from astropy.table import Table import matplotlib.pyplot as plt from IPython.display import display from gammapy.datasets import Datasets, MapDataset from gammapy.estimators import EnergyDependentMorphologyEstimator from gammapy.estimators.energydependentmorphology import weighted_chi2_parameter from gammapy.maps import Map from gammapy.modeling.models import ( GaussianSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.stats.utils import ts_to_sigma from gammapy.utils.check import check_tutorials_setup ###################################################################### # Check setup # ----------- check_tutorials_setup() ###################################################################### # Obtain the data to use # ---------------------- # # Utilise the pre-defined dataset within `$GAMMAPY_DATA`. # # P.S.: do not forget to set up your environment variable `$GAMMAPY_DATA` # to your local directory. stacked_dataset = MapDataset.read( "$GAMMAPY_DATA/estimators/mock_DL4/dataset_energy_dependent.fits.gz" ) datasets = Datasets([stacked_dataset]) ###################################################################### # Define the energy edges of interest. These will be utilised to # investigate the potential of energy-dependent morphology in the dataset. energy_edges = [1, 3, 5, 20] * u.TeV ###################################################################### # Define the spectral and spatial models of interest. We utilise # a `~gammapy.modeling.models.PowerLawSpectralModel` and a # `~gammapy.modeling.models.GaussianSpatialModel` to test the energy-dependent # morphology component in each energy band. A standard 3D fit (see the # :doc:`/tutorials/analysis-3d/analysis_3d` tutorial) # is performed, then the best fit model is utilised here for the initial parameters # in each model. 
source_position = SkyCoord(5.58, 0.2, unit="deg", frame="galactic") spectral_model = PowerLawSpectralModel( index=2.5, amplitude=9.8e-12 * u.Unit("cm-2 s-1 TeV-1"), reference=1.0 * u.TeV ) spatial_model = GaussianSpatialModel( lon_0=source_position.l, lat_0=source_position.b, frame="galactic", sigma=0.2 * u.deg, ) # Limit the search for the position on the spatial model spatial_model.lon_0.min = source_position.galactic.l.deg - 0.8 spatial_model.lon_0.max = source_position.galactic.l.deg + 0.8 spatial_model.lat_0.min = source_position.galactic.b.deg - 0.8 spatial_model.lat_0.max = source_position.galactic.b.deg + 0.8 model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model, name="src") ###################################################################### # Run Estimator # ------------- # # We can now run the energy-dependent estimation tool and explore the results. # # Let's start with the initial hypothesis, in which the source is introduced # to compare with the background. We specify which parameters we # wish to use for testing the energy dependence. # To test for the energy dependence, it is recommended to keep the position and # extension parameters free. This allows them to be used for fitting the spatial model # in each energy band. # model.spatial_model.lon_0.frozen = False model.spatial_model.lat_0.frozen = False model.spatial_model.sigma.frozen = False model.spectral_model.amplitude.frozen = False model.spectral_model.index.frozen = True datasets.models = model estimator = EnergyDependentMorphologyEstimator(energy_edges=energy_edges, source="src") ###################################################################### # Show the results tables # ----------------------- # # The results of the source signal above the background in each energy bin # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Firstly, the estimator is run to produce the results. # The table here shows the ∆(TS) value, the number of degrees of freedom (df) # and the significance (σ) in each energy bin. The significance values here show that each # energy band has significant signal above the background. # results = estimator.run(datasets) table_bkg_src = Table(results["src_above_bkg"]) display(table_bkg_src) ###################################################################### # The results for testing energy dependence # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Next, the results of the energy-dependent estimator are shown. # The table shows the various free parameters for the joint fit for :math:`H_0` across the entire # energy band and for each energy bin shown for :math:`H_1`. ts = results["energy_dependence"]["delta_ts"] df = results["energy_dependence"]["df"] sigma = ts_to_sigma(ts, df=df) print(f"The delta_ts for the energy-dependent study: {ts:.3f}.") print(f"Converting this to a significance gives: {sigma:.3f} \u03C3") results_table = Table(results["energy_dependence"]["result"]) display(results_table) ###################################################################### # The chi-squared value for each parameter of interest # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # We can also utilise the `~gammapy.estimators.energydependence.weighted_chi2_parameter` # function for each parameter. # # The weighted chi-squared significance for the ``sigma``, ``lat_0`` and ``lon_0`` values. 
#

display(
    Table(
        weighted_chi2_parameter(
            results["energy_dependence"]["result"],
            parameters=["sigma", "lat_0", "lon_0"],
        )
    )
)

######################################################################
# Note: The chi-squared parameter does not include potential correlation
# between the parameters, so it should be used cautiously.
#

######################################################################
# Plotting the results
# --------------------

empty_map = Map.create(
    skydir=spatial_model.position, frame=spatial_model.frame, width=1, binsz=0.02
)

colors = ["red", "blue", "green", "magenta"]

fig = plt.figure(figsize=(6, 4))
ax = empty_map.plot()

lat_0 = results["energy_dependence"]["result"]["lat_0"][1:]
lat_0_err = results["energy_dependence"]["result"]["lat_0_err"][1:]
lon_0 = results["energy_dependence"]["result"]["lon_0"][1:]
lon_0_err = results["energy_dependence"]["result"]["lon_0_err"][1:]
sigma = results["energy_dependence"]["result"]["sigma"][1:]
sigma_err = results["energy_dependence"]["result"]["sigma_err"][1:]

for i in range(len(lat_0)):
    model_plot = GaussianSpatialModel(
        lat_0=lat_0[i], lon_0=lon_0[i], sigma=sigma[i], frame=spatial_model.frame
    )
    model_plot.lat_0.error = lat_0_err[i]
    model_plot.lon_0.error = lon_0_err[i]
    model_plot.sigma.error = sigma_err[i]

    model_plot.plot_error(
        ax=ax,
        which="all",
        kwargs_extension={"facecolor": colors[i], "edgecolor": colors[i]},
        kwargs_position={"color": colors[i]},
    )
plt.show()

# sphinx_gallery_thumbnail_number = 1
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/analysis-3d/event_sampling.py0000644000175100001770000004134114721316200023774 0ustar00runnerdocker"""
Event sampling
==============

Learn to sample events from a given sky model and IRFs.

Prerequisites
-------------

To understand how to generate a model and a `~gammapy.datasets.MapDataset`
and how to fit the data, please refer to the `~gammapy.modeling.models.SkyModel`
and :doc:`/tutorials/analysis-3d/simulate_3d` tutorial.

Context
-------

This tutorial describes how to sample events from an observation of one
(or more) gamma-ray source(s). The main aim of the tutorial will be to
set the minimal configuration needed to deal with the Gammapy
event-sampler and how to obtain an output photon event list.

The core of the event sampling lies in the Gammapy
`~gammapy.datasets.MapDatasetEventSampler` class, which is based on the
inverse cumulative distribution function `(Inverse CDF) `__.

The `~gammapy.datasets.MapDatasetEventSampler` takes as input a
`~gammapy.datasets.Dataset` object containing the spectral, spatial and
temporal properties of the source(s) of interest.

The `~gammapy.datasets.MapDatasetEventSampler` class evaluates the map of
predicted counts (`npred`) per bin of the given Sky model, and the
`npred` map is then used to sample the events. In particular, the output
of the event-sampler will be a set of events having information about
their true coordinates, true energies and times of arrival. To these
events, IRF corrections (i.e. PSF and energy dispersion) can also be
applied in order to obtain reconstructed coordinates and energies of the
sampled events.

At the end of this process, you will obtain an event-list in FITS format.

Objective
---------

Describe the process of sampling events from a given Sky model and
obtain an output event-list.

Proposed approach
-----------------

In this section, we will show how to define an observation and create a
Dataset object.
These are both necessary for the event sampling. Then, we will define the Sky model from which we sample events. In this tutorial, we propose examples for sampling events of: - `a point-like source <#sampling-the-source-and-background-events>`__ - `a time variable point-like source <#time-variable-source-using-a-lightcurve>`__ - `an extended source using a template map <#extended-source-using-a-template>`__ - `a set of observations <#simulate-multiple-event-lists>`__ We will work with the following functions and classes: - `~gammapy.data.Observations` - `~gammapy.datasets.Dataset` - `~gammapy.modeling.models.SkyModel` - `~gammapy.datasets.MapDatasetEventSampler` - `~gammapy.data.EventList` """ ###################################################################### # Setup # ----- # # As usual, let’s start with some general imports… # from pathlib import Path import numpy as np import astropy.units as u from astropy.coordinates import Angle, SkyCoord from astropy.io import fits from astropy.time import Time from regions import CircleSkyRegion import matplotlib.pyplot as plt from IPython.display import display from gammapy.data import ( DataStore, FixedPointingInfo, Observation, observatory_locations, ) from gammapy.datasets import MapDataset, MapDatasetEventSampler from gammapy.irf import load_irf_dict_from_file from gammapy.makers import MapDatasetMaker from gammapy.maps import MapAxis, WcsGeom from gammapy.modeling.models import ( ExpDecayTemporalModel, FoVBackgroundModel, Models, PointSpatialModel, PowerLawNormSpectralModel, PowerLawSpectralModel, SkyModel, TemplateSpatialModel, ) ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Define an Observation # --------------------- # # You can firstly create a `~gammapy.data.Observations` object that # contains the pointing position, the GTIs and the IRF you want to # consider. # # Hereafter, we chose the IRF of the South configuration used for the CTA # DC1 and we set the pointing position of the simulated field at the # Galactic Center. We also fix the exposure time to 1 hr. # # Let’s start with some initial settings: path = Path("$GAMMAPY_DATA/cta-caldb") irf_filename = "Prod5-South-20deg-AverageAz-14MSTs37SSTs.180000s-v0.1.fits.gz" # telescope is pointing at a fixed position in ICRS for the observation pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0.0, 0.0, frame="galactic", unit="deg").icrs, ) livetime = 1 * u.hr location = observatory_locations["cta_south"] irfs = load_irf_dict_from_file(path / irf_filename) ###################################################################### # Now you can create the observation: # observation = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=location, ) print(observation) ###################################################################### # Define the MapDataset # --------------------- # # Let’s generate the `~gammapy.datasets.Dataset` object (for more info # on `~gammapy.datasets.Dataset` objects, please checkout # :doc:`/tutorials/api/datasets` tutorial): # we define the energy axes (true and reconstructed), the migration axis # and the geometry of the observation. # # *This is a crucial point for the correct configuration of the event # sampler. Indeed, the spatial and energetic binning should be treated # carefully and… the finer the better. 
# For this reason, we suggest
# defining the energy axes (true and reconstructed) by setting a minimum
# binning of at least 10-20 bins per decade for all the sources of
# interest. The spatial binning may instead differ from source to source
# and, to first order, a binning significantly smaller than the expected
# source size should be adopted.*
#
# For the examples that will be shown hereafter, we set the geometry of
# the dataset to a field of view of 2 deg x 2 deg and we bin the spatial
# map with pixels of 0.02 deg.
#

energy_axis = MapAxis.from_energy_bounds("0.1 TeV", "100 TeV", nbin=10, per_decade=True)
energy_axis_true = MapAxis.from_energy_bounds(
    "0.03 TeV", "300 TeV", nbin=20, per_decade=True, name="energy_true"
)
migra_axis = MapAxis.from_bounds(0.5, 2, nbin=150, node_type="edges", name="migra")

geom = WcsGeom.create(
    skydir=pointing.fixed_icrs,
    width=(2, 2),
    binsz=0.02,
    frame="galactic",
    axes=[energy_axis],
)

######################################################################
# In the following, the dataset is created by selecting the effective
# area, background model, the PSF and the Edisp from the IRF. The dataset
# thus produced can be saved into a FITS file just using the `write()`
# function. We put it into the `event_sampling` sub-folder:
#

# %%time
empty = MapDataset.create(
    geom,
    energy_axis_true=energy_axis_true,
    migra_axis=migra_axis,
    name="my-dataset",
)
maker = MapDatasetMaker(selection=["exposure", "background", "psf", "edisp"])
dataset = maker.run(empty, observation)

Path("event_sampling").mkdir(exist_ok=True)
dataset.write("./event_sampling/dataset.fits", overwrite=True)

######################################################################
# Define the Sky model: a point-like source
# -----------------------------------------
#
# Now let’s define a sky model for a point-like source centered 0.5
# deg from the Galactic Center and with a power-law spectrum. We then
# save the model into a yaml file.
#

spectral_model_pwl = PowerLawSpectralModel(
    index=2, amplitude="1e-12 TeV-1 cm-2 s-1", reference="1 TeV"
)
spatial_model_point = PointSpatialModel(
    lon_0="0 deg", lat_0="0.5 deg", frame="galactic"
)

sky_model_pntpwl = SkyModel(
    spectral_model=spectral_model_pwl,
    spatial_model=spatial_model_point,
    name="point-pwl",
)

bkg_model = FoVBackgroundModel(dataset_name="my-dataset")

models = Models([sky_model_pntpwl, bkg_model])

file_model = "./event_sampling/point-pwl.yaml"
models.write(file_model, overwrite=True)

######################################################################
# Sampling the source and background events
# -----------------------------------------
#
# Now, we can finally add the `~gammapy.modeling.models.SkyModel` we
# want to event-sample to the `~gammapy.datasets.Dataset` container:
#

dataset.models = models

print(dataset.models)

######################################################################
# The next step shows how to sample the events with the
# `~gammapy.datasets.MapDatasetEventSampler` class. The class requires a
# random number seed (that we set with `random_state=0`), the
# `~gammapy.datasets.Dataset` and the `~gammapy.data.Observations`
# object. From the latter, the
# `~gammapy.datasets.MapDatasetEventSampler` class takes all the
# metadata information.
# # %%time sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset, observation) ###################################################################### # The output of the event-sampler is an event list with coordinates, # energies (true and reconstructed) and time of arrivals of the source and # background events. `events` is a `~gammapy.data.EventList` object # (for details see e.g. :doc:`/tutorials/data/cta` tutorial.). # Source and background events are flagged by the MC_ID identifier (where # 0 is the default identifier for the background). # print(f"Source events: {(events.table['MC_ID'] == 1).sum()}") print(f"Background events: {(events.table['MC_ID'] == 0).sum()}") ###################################################################### # We can inspect the properties of the simulated events as follows: # events.select_offset([0, 1] * u.deg).peek() plt.show() ###################################################################### # By default, the `~gammapy.datasets.MapDatasetEventSampler` fills the # metadata keyword `OBJECT` in the event list using the first model of # the SkyModel object. You can change it with the following commands: # events.table.meta["OBJECT"] = dataset.models[0].name ###################################################################### # Let’s write the event list and its GTI extension to a FITS file. We make # use of `fits` library in `astropy`: # primary_hdu = fits.PrimaryHDU() hdu_evt = fits.BinTableHDU(events.table) hdu_gti = fits.BinTableHDU(dataset.gti.table, name="GTI") hdu_all = fits.HDUList([primary_hdu, hdu_evt, hdu_gti]) hdu_all.writeto("./event_sampling/events_0001.fits", overwrite=True) ###################################################################### # Time variable source using a lightcurve # --------------------------------------- # # The event sampler can also handle temporal variability of the simulated # sources. In this example, we show how to sample a source characterized # by an exponential decay, with decay time of 2800 seconds, during the # observation. # # First of all, let’s create a lightcurve: # t0 = 2800 * u.s t_ref = Time("2000-01-01T00:01:04.184") times = t_ref + livetime * np.linspace(0, 1, 100) expdecay_model = ExpDecayTemporalModel(t_ref=t_ref.mjd * u.d, t0=t0) ###################################################################### # where we defined the time axis starting from the reference time # `t_ref` up to the requested exposure (`livetime`). The bin size of # the time-axis is quite arbitrary but, as above for spatial and energy # binning, the finer the better. # ###################################################################### # Then, we can create the sky model. 
Just for the sake of the example, # let’s boost the flux of the simulated source by an order of magnitude: # spectral_model_pwl.amplitude.value = 2e-11 sky_model_pntpwl = SkyModel( spectral_model=spectral_model_pwl, spatial_model=spatial_model_point, temporal_model=expdecay_model, name="point-pwl", ) bkg_model = FoVBackgroundModel(dataset_name="my-dataset") models = Models([sky_model_pntpwl, bkg_model]) file_model = "./event_sampling/point-pwl_decay.yaml" models.write(file_model, overwrite=True) ###################################################################### # For simplicity, we use the same dataset defined for the previous # example: # dataset.models = models print(dataset.models) ###################################################################### # And now, let’s simulate the variable source: # # %%time sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset, observation) print(f"Source events: {(events.table['MC_ID'] == 1).sum()}") print(f"Background events: {(events.table['MC_ID'] == 0).sum()}") ###################################################################### # We can now inspect the properties of the simulated source. To do that, # we adopt the `select_region` function that extracts only the events # falling within a given `SkyRegion` from a `~gammapy.data.EventList` object: # src_position = SkyCoord(0.0, 0.5, frame="galactic", unit="deg") on_region_radius = Angle("0.15 deg") on_region = CircleSkyRegion(center=src_position, radius=on_region_radius) src_events = events.select_region(on_region) ###################################################################### # Then we can have a quick look at the data with the `peek` function: # src_events.peek() plt.show() ###################################################################### # The right figure of the bottom panel shows the source # lightcurve, which follows the expected decay trend. # ###################################################################### # Extended source using a template # -------------------------------- # # The event sampler can also work with a template model. Here we use the # interstellar emission model map of the Fermi 3FHL, which can be found in # the `$GAMMAPY_DATA` repository. # # We proceed following the same steps shown above and we finally have a # look at the event’s properties: # template_model = TemplateSpatialModel.read( "$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz", normalize=False ) # we make the model brighter artificially so that it becomes visible over the background diffuse = SkyModel( spectral_model=PowerLawNormSpectralModel(norm=5), spatial_model=template_model, name="template-model", ) bkg_model = FoVBackgroundModel(dataset_name="my-dataset") models_diffuse = Models([diffuse, bkg_model]) file_model = "./event_sampling/diffuse.yaml" models_diffuse.write(file_model, overwrite=True) dataset.models = models_diffuse print(dataset.models) # %%time sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset, observation) events.select_offset([0, 1] * u.deg).peek() plt.show() ###################################################################### # Simulate multiple event lists # ----------------------------- # # In some use cases, you may want to sample events from a number of # observations. In this section, we show how to simulate a set of event # lists. For simplicity, we consider only one point-like source, observed # three times for 1 hr each, with the same pointing position.
# # Let’s firstly define the time start and the livetime of each # observation: # tstarts = Time("2020-01-01 00:00:00") + [1, 5, 7] * u.hr livetimes = [1, 1, 1] * u.hr # %%time n_obs = len(tstarts) irf_paths = [path / irf_filename] * n_obs events_paths = [] for idx, tstart in enumerate(tstarts): irfs = load_irf_dict_from_file(irf_paths[idx]) observation = Observation.create( obs_id=idx, pointing=pointing, tstart=tstart, livetime=livetimes[idx], irfs=irfs, location=location, ) dataset = maker.run(empty, observation) dataset.models = models sampler = MapDatasetEventSampler(random_state=idx) events = sampler.run(dataset, observation) path = Path(f"./event_sampling/events_{idx:04d}.fits") events_paths.append(path) events.table.write(path, overwrite=True) ###################################################################### # You can now load the event list and the corresponding IRFs with # `DataStore.from_events_files`: # path = Path("./event_sampling/") events_paths = list(path.rglob("events*.fits")) data_store = DataStore.from_events_files(events_paths, irf_paths) display(data_store.obs_table) ###################################################################### # Then you can create the observations from the data store and make your own # analysis following the instructions in the # :doc:`/tutorials/starting/analysis_2` tutorial. # observations = data_store.get_observations() observations[0].peek() plt.show() ###################################################################### # Exercises # --------- # # - Try to sample events for an extended source (e.g. a radial gaussian # morphology); # - Change the spatial model and the spectrum of the simulated Sky model; # - Include a temporal model in the simulation # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-3d/event_sampling_nrg_depend_models.py0000644000175100001770000003065214721316200027527 0ustar00runnerdocker""" Sample a source with energy-dependent temporal evolution ======================================================== This notebook shows how to sample events of sources whose model evolves in energy and time. Prerequisites ------------- To understand how to generate a model and a MapDataset and how to fit the data, please refer to the `~gammapy.modeling.models.SkyModel` and :doc:`/tutorials/analysis-3d/simulate_3d` tutorial. To know how to sample events for standards sources, we suggest to visit the event sampler :doc:`/tutorials/analysis-3d/event_sampling` tutorial. Objective --------- Describe the process of sampling events of a source having an energy-dependent temporal model, and obtain an output event-list. Proposed approach ----------------- Here we will show how to create an energy dependent temporal model; then we also create an observation and define a Dataset object. Finally, we describe how to sample events from the given model. 
We will work with the following functions and classes: - `~gammapy.data.Observations` - `~gammapy.datasets.Dataset` - `~gammapy.modeling.models.SkyModel` - `~gammapy.datasets.MapDatasetEventSampler` - `~gammapy.data.EventList` - `~gammapy.maps.RegionNDMap` """ ###################################################################### # Setup # ----- # # As usual, let’s start with some general imports… # from pathlib import Path import astropy.units as u from astropy.coordinates import Angle, SkyCoord from astropy.time import Time from regions import CircleSkyRegion, PointSkyRegion import matplotlib.pyplot as plt from gammapy.data import FixedPointingInfo, Observation, observatory_locations from gammapy.datasets import MapDataset, MapDatasetEventSampler from gammapy.irf import load_irf_dict_from_file from gammapy.makers import MapDatasetMaker from gammapy.maps import MapAxis, RegionNDMap, WcsGeom from gammapy.modeling.models import ( ConstantSpectralModel, FoVBackgroundModel, LightCurveTemplateTemporalModel, PointSpatialModel, PowerLawSpectralModel, SkyModel, ) ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Create the energy-dependent temporal model # ------------------------------------------ # # The source we want to simulate has a spectrum that varies as a function of # the time. # Here we show how to create an energy dependent temporal model. If you already # have such a model, go directly to the :ref:`corresponding` section. # # # In the following example, the source spectrum will vary continuously # with time. Here we define 5 time bins and represent the spectrum # at the center of each bin as a powerlaw. The spectral evolution # is also shown in the following plot: # amplitudes = u.Quantity( [2e-10, 8e-11, 5e-11, 3e-11, 1e-11], unit="cm-2s-1TeV-1" ) # amplitude indices = u.Quantity([2.2, 2.0, 1.8, 1.6, 1.4], unit="") # index for i in range(len(amplitudes)): spec = PowerLawSpectralModel( index=indices[i], amplitude=amplitudes[i], reference="1 TeV" ) spec.plot([0.2, 100] * u.TeV, label=f"Time bin {i+1}") plt.legend() plt.show() ###################################################################### # Let's now create the temporal model (if you already have this model, # please go directly to the `Read the energy-dependent model` section), # that will be defined as a `LightCurveTemplateTemporalModel`. The latter # take as input a `RegionNDMap` with temporal and energy axes, on which # the fluxes are stored. # # To create such map, we first need to define a time axis with `MapAxis`: # here we consider 5 time bins of 720 s (i.e. 1 hr in total). # As a second step, we create an energy axis with 10 bins where the # powerlaw spectral models will be evaluated. 
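######################################################################
# As a quick sanity check (an aside, not required by the recipe), we can
# verify that five time bins of 720 s indeed cover the full 1 hr
# observation:
#

print((5 * 720 * u.s).to("h"))  # expected: 1.0 h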
# # source position pointing_position = SkyCoord("100 deg", "30 deg", frame="icrs") position = FixedPointingInfo(fixed_icrs=pointing_position.icrs) # time axis time_axis = MapAxis.from_bounds(0 * u.s, 3600 * u.s, nbin=5, name="time", interp="lin") # energy axis energy_axis = MapAxis.from_energy_bounds( energy_min=0.2 * u.TeV, energy_max=100 * u.TeV, nbin=10 ) ###################################################################### # Now let's create the `RegionNDMap` and fill it with the expected # spectral values: # # create the RegionNDMap containing fluxes m = RegionNDMap.create( region=PointSkyRegion(center=pointing_position), axes=[energy_axis, time_axis], unit="cm-2s-1TeV-1", ) # to compute the spectra as a function of time we extract the coordinates of the geometry coords = m.geom.get_coord(sparse=True) # We reshape the indices and amplitudes array to perform broadcasting indices = indices.reshape(coords["time"].shape) amplitudes = amplitudes.reshape(coords["time"].shape) # evaluate the spectra and fill the RegionNDMap m.quantity = PowerLawSpectralModel.evaluate( coords["energy"], indices, amplitudes, 1 * u.TeV ) ###################################################################### # Create the temporal model and write it to disk # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Now, we define the `LightCurveTemplateTemporalModel`. It needs the # map we created above and a reference time. The latter # is crucial to evaluate the model as a function of time. # We show also how to write the model on disk, noting that we explicitly # set the `format` to `map`. t_ref = Time(51544.00074287037, format="mjd", scale="tt") filename = "./temporal_model_map.fits" temp = LightCurveTemplateTemporalModel(m, t_ref=t_ref, filename=filename) temp.write(filename, format="map", overwrite=True) ###################################################################### # .. _read-the-energy-dependent-model: # # Read the energy-dependent model # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # We read the map written on disc again with `LightCurveTemplateTemporalModel.read`. # When the model is from a map, the arguments `format="map"` is mandatory. # The map is `fits` file, with 3 extensions: # # 1) `SKYMAP`: a table with a `CHANNEL` and `DATA` column; the number of rows is given # by the product of the energy and time bins. The `DATA` represent the values of the model # at each energy; # # 2) `SKYMAP_BANDS`: a table with `CHANNEL`, `ENERGY`, `E_MIN`, `E_MAX`, `TIME`, # `TIME_MIN` and `TIME_MAX`. `ENERGY` is the mean of `E_MIN` and `E_MAX`, as well as # `TIME` is the mean of `TIME_MIN` and `TIME_MAX`; this extension should contain the # reference time in the header, through the keywords `MJDREFI` and `MJDREFF`. # # 3) `SKYMAP_REGION`: it gives information on the spatial morphology, i.e. `SHAPE` # (only `point` is accepted), `X` and `Y` (source position), `R` (the radius if # extended; not used in our case) and `ROTANG` (the angular rotation of the spatial # model, if extended; not used in our case). # temporal_model = LightCurveTemplateTemporalModel.read(filename, format="map") ###################################################################### # We note that an interpolation scheme is also provided when loading # a map: for an energy-dependent temporal model, the `method` and # `values_scale` arguments by default are set to `linear` and `log`. # We warn the reader to carefully check the interpolation method used # for the time axis while creating the template model, as different # methods provide different results. 
# By default, we assume `linear` interpolation for the time, `log` # for the energies and values. # Users can modify the `method` and `values_scale` arguments but we # warn that this should be done only when the user knows the consequences # of the changes. Here, we show how to set them explicitly: # temporal_model.method = "linear" # default temporal_model.values_scale = "log" # default ###################################################################### # We can have a visual inspection of the temporal model at different energies: # time_range = temporal_model.reference_time + [-100, 3600] * u.s temporal_model.plot(time_range=time_range, energy=[0.1, 0.5, 1, 5] * u.TeV) plt.semilogy() plt.show() ###################################################################### # Prepare and run the event sampler # --------------------------------- # # Define the simulation source model # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Now that the temporal model is complete, we create the whole source # `SkyModel`. We define its spatial morphology as `point-like`. This # is a mandatory condition to simulate energy-dependent temporal model. # Other morphologies will raise an error! # Note also that the source `spectral_model` is a `ConstantSpectralModel`: # this is necessary and mandatory, as the real source spectrum is actually # passed through the map. # spatial_model = PointSpatialModel.from_position(pointing_position) spectral_model = ConstantSpectralModel(const="1 cm-2 s-1 TeV-1") model = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, temporal_model=temporal_model, name="test-source", ) bkg_model = FoVBackgroundModel(dataset_name="my-dataset") models = [model, bkg_model] ###################################################################### # Define an observation and make a dataset # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # In the following, we define an observation of 1 hr with CTAO in the # alpha-configuration for the south array, and we also create a dataset # to be passed to the event sampler. The full `SkyModel` created above # is passed to the dataset. 
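######################################################################
# Before building the observation, one optional step (a hedged aside; the
# file name is arbitrary and this step is not part of the original
# recipe): the full model list can be serialised to YAML for later reuse.
# The temporal template itself is referenced through the map file written
# above.
#

from gammapy.modeling.models import Models  # not imported at the top of this script

Models(models).write("./energy-dependent-model.yaml", overwrite=True)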
# path = Path("$GAMMAPY_DATA/cta-caldb") irf_filename = "Prod5-South-20deg-AverageAz-14MSTs37SSTs.180000s-v0.1.fits.gz" pointing_position = SkyCoord(ra=100 * u.deg, dec=30 * u.deg) pointing = FixedPointingInfo(fixed_icrs=pointing_position) livetime = 1 * u.hr irfs = load_irf_dict_from_file(path / irf_filename) location = observatory_locations["cta_south"] observation = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=location, ) ###################################################################### energy_axis = MapAxis.from_energy_bounds("0.2 TeV", "100 TeV", nbin=5, per_decade=True) energy_axis_true = MapAxis.from_energy_bounds( "0.05 TeV", "150 TeV", nbin=10, per_decade=True, name="energy_true" ) migra_axis = MapAxis.from_bounds(0.5, 2, nbin=150, node_type="edges", name="migra") geom = WcsGeom.create( skydir=pointing_position, width=(2, 2), binsz=0.02, frame="icrs", axes=[energy_axis], ) ###################################################################### empty = MapDataset.create( geom, energy_axis_true=energy_axis_true, migra_axis=migra_axis, name="my-dataset", ) maker = MapDatasetMaker(selection=["exposure", "background", "psf", "edisp"]) dataset = maker.run(empty, observation) dataset.models = models print(dataset.models) ###################################################################### # Let's simulate the model # ~~~~~~~~~~~~~~~~~~~~~~~~ # # Initialize and run the `MapDatasetEventSampler` class. We also define # the `oversample_energy_factor` arguments: this should be carefully # considered by the user, as a higher `oversample_energy_factor` gives # a more precise source flux estimate, at the expense of computational # time. Here we adopt an `oversample_energy_factor` of 10: # sampler = MapDatasetEventSampler(random_state=0, oversample_energy_factor=10) events = sampler.run(dataset, observation) ###################################################################### # Let's inspect the simulated events in the source region: # src_position = SkyCoord(100.0, 30.0, frame="icrs", unit="deg") on_region_radius = Angle("0.15 deg") on_region = CircleSkyRegion(center=src_position, radius=on_region_radius) src_events = events.select_region(on_region) src_events.peek() plt.show() ###################################################################### # Let's inspect the simulated events as a function of time: # time_interval = temporal_model.reference_time + [300, 700] * u.s src_events.select_time(time_interval).plot_energy(label="500 s") time_interval = temporal_model.reference_time + [1600, 2000] * u.s src_events.select_time(time_interval).plot_energy(label="1800 s") plt.legend() plt.show() ###################################################################### # Exercises # --------- # # - Try to create a temporal model with a more complex energy-dependent # evolution; # - Read your temporal model in Gammapy and simulate it; # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-3d/flux_profiles.py0000644000175100001770000002161314721316200023642 0ustar00runnerdocker""" Flux Profile Estimation ======================= Learn how to estimate flux profiles on a Fermi-LAT dataset. Prerequisites ------------- Knowledge of 3D data reduction and datasets used in Gammapy, see for instance the first analysis tutorial. Context ------- A useful tool to study and compare the spatial distribution of flux in images and data cubes is the measurement of flux profiles. 
Flux profiles can show spatial correlations of gamma-ray data with e.g. gas maps or other types of gamma-ray data. Most commonly flux profiles are measured along some preferred coordinate axis, either radially as a function of distance from a source of interest, along the longitude or latitude coordinate axes, or along a path defined by two spatial coordinates. Proposed Approach ----------------- Flux profile estimation essentially works by estimating flux points for a set of predefined spatially connected regions. For radial flux profiles the regions are annuli with a common center; for linear profiles they are typically rectangles. We will work on a pre-computed `~gammapy.datasets.MapDataset` of Fermi-LAT data, use `~regions.SkyRegion` to define the structure of the bins of the flux profile and run the flux profile extraction using the `~gammapy.estimators.FluxProfileEstimator`. """ import numpy as np from astropy import units as u from astropy.coordinates import SkyCoord # %matplotlib inline import matplotlib.pyplot as plt ###################################################################### # Setup # ----- # from IPython.display import display from gammapy.datasets import MapDataset from gammapy.estimators import FluxPoints, FluxProfileEstimator from gammapy.maps import RegionGeom from gammapy.modeling.models import PowerLawSpectralModel ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup from gammapy.utils.regions import ( make_concentric_annulus_sky_regions, make_orthogonal_rectangle_sky_regions, ) check_tutorials_setup() ###################################################################### # Read and Introduce Data # ----------------------- # dataset = MapDataset.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc.fits.gz", name="fermi-dataset" ) ###################################################################### # This is what the counts image we will work with looks like: # counts_image = dataset.counts.sum_over_axes() counts_image.smooth("0.1 deg").plot(stretch="sqrt") plt.show() ###################################################################### # There are 400x200 pixels in the dataset and 11 energy bins between 10 # GeV and 2 TeV: # print(dataset.counts) ###################################################################### # Profile Estimation # ------------------ # # Configuration # ~~~~~~~~~~~~~ # # We start by defining a list of spatially connected regions along the # galactic longitude axis. For this there is a helper function # `~gammapy.utils.regions.make_orthogonal_rectangle_sky_regions`. The individual region bins # for the profile have a height of 3 deg and in total there are 51 bins, # matching the `nbin=51` used below. It starts at lon = 10 deg and goes to lon = 350 deg.
In addition, we # have to specify the `wcs` to take into account possible projections # effects on the region definition: # regions = make_orthogonal_rectangle_sky_regions( start_pos=SkyCoord("10d", "0d", frame="galactic"), end_pos=SkyCoord("350d", "0d", frame="galactic"), wcs=counts_image.geom.wcs, height="3 deg", nbin=51, ) ###################################################################### # We can use the `~gammapy.maps.RegionGeom` object to illustrate the regions on top of # the counts image: # geom = RegionGeom.create(region=regions) ax = counts_image.smooth("0.1 deg").plot(stretch="sqrt") geom.plot_region(ax=ax, color="w") plt.show() ###################################################################### # Next we create the `~gammapy.estimators.FluxProfileEstimator`. For the estimation of the # flux profile we assume a spectral model with a power-law shape and an # index of 2.3 # flux_profile_estimator = FluxProfileEstimator( regions=regions, spectrum=PowerLawSpectralModel(index=2.3), energy_edges=[10, 2000] * u.GeV, selection_optional=["ul"], ) ###################################################################### # We can see the full configuration by printing the estimator object: # print(flux_profile_estimator) ###################################################################### # Run Estimation # ~~~~~~~~~~~~~~ # # Now we can run the profile estimation and explore the results: # # %%time profile = flux_profile_estimator.run(datasets=dataset) print(profile) ###################################################################### # We can see the flux profile is represented by a `~gammapy.estimators.FluxPoints` object # with a `projected-distance` axis, which defines the main axis the flux # profile is measured along. The `lon` and `lat` axes can be ignored. # # Plotting Results # ~~~~~~~~~~~~~~~~ # # Let us directly plot the result using `~gammapy.estimators.FluxPoints.plot`: # ax = profile.plot(sed_type="dnde") ax.set_yscale("linear") plt.show() ###################################################################### # Based on the spectral model we specified above we can also plot in any # other sed type, e.g. energy flux and define a different threshold when # to plot upper limits: # profile.sqrt_ts_threshold_ul = 2 plt.figure() ax = profile.plot(sed_type="eflux") ax.set_yscale("linear") plt.show() ###################################################################### # We can also plot any other quantity of interest, that is defined on the # `~gammapy.estimators.FluxPoints` result object. E.g. 
the predicted total counts, # background counts and excess counts: # quantities = ["npred", "npred_excess", "npred_background"] fig, ax = plt.subplots() for quantity in quantities: profile[quantity].plot(ax=ax, label=quantity.title()) ax.set_ylabel("Counts") ax.legend() plt.show() ###################################################################### # Serialisation and I/O # ~~~~~~~~~~~~~~~~~~~~~ # # The profile can be serialised using `~gammapy.estimators.FluxPoints.write`, given a # specific format: # profile.write( filename="flux_profile_fermi.fits", format="profile", overwrite=True, sed_type="dnde", ) profile_new = FluxPoints.read(filename="flux_profile_fermi.fits", format="profile") ax = profile_new.plot() ax.set_yscale("linear") plt.show() ###################################################################### # The profile can be serialised to a `~astropy.table.Table` object # using: # table = profile.to_table(format="profile", formatted=True) display(table) ###################################################################### # Now we can also estimate a radial profile starting from the Galactic # center: # regions = make_concentric_annulus_sky_regions( center=SkyCoord("0d", "0d", frame="galactic"), radius_max="1.5 deg", nbin=11, ) ###################################################################### # Again we first illustrate the regions: # geom = RegionGeom.create(region=regions) gc_image = counts_image.cutout( position=SkyCoord("0d", "0d", frame="galactic"), width=3 * u.deg ) ax = gc_image.smooth("0.1 deg").plot(stretch="sqrt") geom.plot_region(ax=ax, color="w") plt.show() ###################################################################### # This time we define two energy bins and include the fit statistic # profile in the computation: flux_profile_estimator = FluxProfileEstimator( regions=regions, spectrum=PowerLawSpectralModel(index=2.3), energy_edges=[10, 100, 2000] * u.GeV, selection_optional=["ul", "scan"], ) ###################################################################### # The configuration of the fit statistic profile is done through the norm parameter: flux_profile_estimator.norm.scan_values = np.linspace(-1, 5, 11) ###################################################################### # Now we can run the estimator, profile = flux_profile_estimator.run(datasets=dataset) ###################################################################### # and plot the result: # profile.plot(axis_name="projected-distance", sed_type="flux") plt.show() ###################################################################### # However, because of the powerlaw spectrum the flux at high energies is # much lower. To extract the profile at high energies only we can use: # profile_high = profile.slice_by_idx({"energy": slice(1, 2)}) plt.show() ###################################################################### # And now plot the points together with the likelihood profiles: # fig, ax = plt.subplots() profile_high.plot(ax=ax, sed_type="eflux", color="tab:orange") profile_high.plot_ts_profiles(ax=ax, sed_type="eflux") ax.set_yscale("linear") plt.show() # sphinx_gallery_thumbnail_number = 2 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-3d/simulate_3d.py0000644000175100001770000001573414721316200023201 0ustar00runnerdocker""" 3D map simulation ================= Simulate a 3D observation of a source with the CTA 1DC response and fit it with the assumed source model.
Prerequisites ------------- - Knowledge of 3D extraction and datasets used in gammapy, see for example the :doc:`/tutorials/starting/analysis_1` tutorial. Context ------- To simulate a specific observation, it is not always necessary to simulate the full photon list. For many uses cases, simulating directly a reduced binned dataset is enough: the IRFs reduced in the correct geometry are combined with a source model to predict an actual number of counts per bin. The latter is then used to simulate a reduced dataset using Poisson probability distribution. This can be done to check the feasibility of a measurement (performance / sensitivity study), to test whether fitted parameters really provide a good fit to the data etc. Here we will see how to perform a 3D simulation of a CTA observation, assuming both the spectral and spatial morphology of an observed source. **Objective: simulate a 3D observation of a source with CTA using the CTA 1DC response and fit it with the assumed source model.** Proposed approach ----------------- Here we can’t use the regular observation objects that are connected to a `DataStore`. Instead we will create a fake `~gammapy.data.Observation` that contain some pointing information and the CTA 1DC IRFs (that are loaded with `~gammapy.irf.load_irf_dict_from_file`). Then we will create a `~gammapy.datasets.MapDataset` geometry and create it with the `~gammapy.makers.MapDatasetMaker`. Then we will be able to define a model consisting of a `~gammapy.modeling.models.PowerLawSpectralModel` and a `~gammapy.modeling.models.GaussianSpatialModel`. We will assign it to the dataset and fake the count data. """ ###################################################################### # Imports and versions # -------------------- # import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord import matplotlib.pyplot as plt # %matplotlib inline from IPython.display import display from gammapy.data import FixedPointingInfo, Observation, observatory_locations from gammapy.datasets import MapDataset from gammapy.irf import load_irf_dict_from_file from gammapy.makers import MapDatasetMaker, SafeMaskMaker from gammapy.maps import MapAxis, WcsGeom from gammapy.modeling import Fit from gammapy.modeling.models import ( FoVBackgroundModel, GaussianSpatialModel, Models, PowerLawSpectralModel, SkyModel, ) ###################################################################### # Simulation # ---------- # ###################################################################### # We will simulate using the CTA-1DC IRFs shipped with gammapy # # Loading IRFs irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) # Define the observation parameters (typically the observation duration and the pointing position): livetime = 2.0 * u.hr pointing_position = SkyCoord(0, 0, unit="deg", frame="galactic") # We want to simulate an observation pointing at a fixed position in the sky. 
# For this, we use the `FixedPointingInfo` class pointing = FixedPointingInfo( fixed_icrs=pointing_position.icrs, ) # Define map geometry for binned simulation energy_reco = MapAxis.from_edges( np.logspace(-1.0, 1.0, 10), unit="TeV", name="energy", interp="log" ) geom = WcsGeom.create( skydir=(0, 0), binsz=0.02, width=(6, 6), frame="galactic", axes=[energy_reco], ) # It is usually useful to have a separate binning for the true energy axis energy_true = MapAxis.from_edges( np.logspace(-1.5, 1.5, 30), unit="TeV", name="energy_true", interp="log" ) empty = MapDataset.create(geom, name="dataset-simu", energy_axis_true=energy_true) # Define the sky model used to simulate the data. # Here we use a Gaussian spatial model and a Power Law spectral model. spatial_model = GaussianSpatialModel( lon_0="0.2 deg", lat_0="0.1 deg", sigma="0.3 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel( index=3, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV" ) model_simu = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="model-simu", ) bkg_model = FoVBackgroundModel(dataset_name="dataset-simu") models = Models([model_simu, bkg_model]) print(models) ###################################################################### # Now comes the main part of dataset simulation. We create an in-memory # observation and an empty dataset. We then predict the number of counts # for the given model, and Poisson fluctuate it using `fake()` to make # a simulated counts map. Keep in mind that it is important to specify # the `selection` of the maps that you want to produce. # # Create an in-memory observation location = observatory_locations["cta_south"] obs = Observation.create( pointing=pointing, livetime=livetime, irfs=irfs, location=location ) print(obs) # Make the MapDataset maker = MapDatasetMaker(selection=["exposure", "background", "psf", "edisp"]) maker_safe_mask = SafeMaskMaker(methods=["offset-max"], offset_max=4.0 * u.deg) dataset = maker.run(empty, obs) dataset = maker_safe_mask.run(dataset, obs) print(dataset) # Add the model to the dataset and Poisson fluctuate dataset.models = models dataset.fake() # Print the dataset - it now contains a counts map print(dataset) ###################################################################### # Now use this dataset as you would in any standard analysis. You can plot # the maps, or proceed with your custom analysis. In the next section, we # show the standard 3D fitting as in the :doc:`/tutorials/analysis-3d/analysis_3d` # tutorial. # # To plot, e.g., counts: dataset.counts.smooth(0.05 * u.deg).plot_interactive(add_cbar=True, stretch="linear") plt.show() ###################################################################### # Fit # --- # # In this section, we do a usual 3D fit with the same model used to # simulate the data and see the stability of the simulations. Often, it # is useful to simulate many such datasets and look at the distribution of # the reconstructed parameters.
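######################################################################
# A minimal sketch of that idea (an aside; five realisations is an
# arbitrary choice): re-run `fake()` a few times and look at the Poisson
# scatter of the total simulated counts. A full study would also refit
# the model on each realisation. Note that this leaves the dataset filled
# with the last realisation drawn.
#

totals = []
for _ in range(5):
    dataset.fake()  # draw a new Poisson realisation of the counts
    totals.append(dataset.counts.data.sum())
print(totals)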
# models_fit = models.copy() # We do not want to fit the background in this case, so we will freeze the parameters models_fit["dataset-simu-bkg"].spectral_model.norm.frozen = True models_fit["dataset-simu-bkg"].spectral_model.tilt.frozen = True dataset.models = models_fit print(dataset.models) # %%time fit = Fit(optimize_opts={"print_level": 1}) result = fit.run(datasets=[dataset]) dataset.plot_residuals_spatial(method="diff/sqrt(model)", vmin=-0.5, vmax=0.5) plt.show() ###################################################################### # Compare the injected and fitted models: # print( "True model: \n", model_simu, "\n\n Fitted model: \n", models_fit["model-simu"], ) ###################################################################### # Get the errors on the fitted parameters from the parameter table # display(result.parameters.to_table()) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.172642 gammapy-1.3/examples/tutorials/analysis-time/0000755000175100001770000000000014721316215021042 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-time/README.rst0000644000175100001770000000001114721316200022513 0ustar00runnerdockerTime ----././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-time/light_curve.py0000644000175100001770000002735014721316200023730 0ustar00runnerdocker""" Light curves ============ Compute per-observation and nightly fluxes of four Crab nebula observations. Prerequisites ------------- - Knowledge of the high level interface to perform data reduction, see the :doc:`/tutorials/starting/analysis_1` tutorial. Context ------- This tutorial presents how light curve extraction is performed in gammapy, i.e. how to measure the flux of a source in different time bins. Cherenkov telescopes usually work with observing runs and distribute data according to this basic time interval. A typical use case is to look for variability of a source on various time bins: observation run-wise binning, nightly, weekly etc. **Objective: The Crab nebula is not known to be variable at TeV energies, so we expect constant brightness within statistical and systematic errors. Compute per-observation and nightly fluxes of the four Crab nebula observations from the H.E.S.S. first public test data release.** Proposed approach ----------------- We will demonstrate how to compute a light curve from 3D reduced datasets (`~gammapy.datasets.MapDataset`) as well as 1D ON-OFF spectral datasets (`~gammapy.datasets.SpectrumDatasetOnOff`). The data reduction will be performed with the high level interface for the data reduction. Then we will use the `~gammapy.estimators.LightCurveEstimator` class, which is able to extract a light curve independently of the dataset type. 
""" ###################################################################### # Setup # ----- # # As usual, we’ll start with some general imports… # import logging import astropy.units as u from astropy.coordinates import SkyCoord from astropy.time import Time # %matplotlib inline import matplotlib.pyplot as plt from IPython.display import display from gammapy.analysis import Analysis, AnalysisConfig from gammapy.estimators import LightCurveEstimator from gammapy.modeling.models import ( Models, PointSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.check import check_tutorials_setup log = logging.getLogger(__name__) ###################################################################### # Check setup # ----------- check_tutorials_setup() ###################################################################### # Analysis configuration # ---------------------- # # For the 1D and 3D extraction, we will use the same CrabNebula # configuration than in the :doc:`/tutorials/starting/analysis_1` tutorial # using the high level interface of Gammapy. # # From the high level interface, the data reduction for those observations # is performed as follows. # ###################################################################### # Building the 3D analysis configuration # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # conf_3d = AnalysisConfig() ###################################################################### # Definition of the data selection # ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ # # Here we use the Crab runs from the # `H.E.S.S. DL3 data release 1 `__. # conf_3d.observations.obs_ids = [23523, 23526, 23559, 23592] ###################################################################### # Definition of the dataset geometry # ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ # # We want a 3D analysis conf_3d.datasets.type = "3d" # We want to extract the data by observation and therefore to not stack them conf_3d.datasets.stack = False # Here is the WCS geometry of the Maps conf_3d.datasets.geom.wcs.skydir = dict( frame="icrs", lon=83.63308 * u.deg, lat=22.01450 * u.deg ) conf_3d.datasets.geom.wcs.binsize = 0.02 * u.deg conf_3d.datasets.geom.wcs.width = dict(width=1 * u.deg, height=1 * u.deg) # We define a value for the IRF Maps binsize conf_3d.datasets.geom.wcs.binsize_irf = 0.2 * u.deg # Define energy binning for the Maps conf_3d.datasets.geom.axes.energy = dict(min=0.7 * u.TeV, max=10 * u.TeV, nbins=5) conf_3d.datasets.geom.axes.energy_true = dict(min=0.3 * u.TeV, max=20 * u.TeV, nbins=20) ###################################################################### # Run the 3D data reduction # ~~~~~~~~~~~~~~~~~~~~~~~~~ # analysis_3d = Analysis(conf_3d) analysis_3d.get_observations() analysis_3d.get_datasets() ###################################################################### # Define the model to be used # ~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Here we don’t try to fit the model parameters to the whole dataset, but # we use predefined values instead. 
# target_position = SkyCoord(ra=83.63308, dec=22.01450, unit="deg") spatial_model = PointSpatialModel( lon_0=target_position.ra, lat_0=target_position.dec, frame="icrs" ) spectral_model = PowerLawSpectralModel( index=2.702, amplitude=4.712e-11 * u.Unit("1 / (cm2 s TeV)"), reference=1 * u.TeV, ) sky_model = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="crab" ) ###################################################################### # We assign them the model to be fitted to each dataset # models = Models([sky_model]) analysis_3d.set_models(models) ###################################################################### # Light Curve estimation by observation # ------------------------------------- # # We can now create the light curve estimator. # # We pass it the list of datasets and the name of the model component for # which we want to build the light curve. In a given time bin, the only # free parameter of the source is its normalization. We can optionally ask # for parameters of other model components to be reoptimized during fit, # that is most of the time to fit background normalization in each time # bin. # # If we don’t set any time interval, the # `~gammapy.estimators.LightCurveEstimator` determines the flux of # each dataset and places it at the corresponding time in the light curve. # Here one dataset equals to one observing run. # lc_maker_3d = LightCurveEstimator( energy_edges=[1, 10] * u.TeV, source="crab", reoptimize=False ) # Example showing how to change some parameters from the object itself lc_maker_3d.n_sigma_ul = 3 # Number of sigma to use for upper limit computation lc_maker_3d.selection_optional = ( "all" # Add the computation of upper limits and likelihood profile ) lc_3d = lc_maker_3d.run(analysis_3d.datasets) ###################################################################### # The lightcurve `~gammapy.estimators.FluxPoints` object `lc_3d` contains a table which we can explore. # # Example showing how to change just before plotting the threshold on the signal significance # (points vs upper limits), even if this has no effect with this data set. fig, ax = plt.subplots( figsize=(8, 6), gridspec_kw={"left": 0.16, "bottom": 0.2, "top": 0.98, "right": 0.98}, ) lc_3d.sqrt_ts_threshold_ul = 5 lc_3d.plot(ax=ax, axis_name="time") plt.show() table = lc_3d.to_table(format="lightcurve", sed_type="flux") display(table["time_min", "time_max", "e_min", "e_max", "flux", "flux_err"]) ###################################################################### # Running the light curve extraction in 1D # ---------------------------------------- # ###################################################################### # Building the 1D analysis configuration # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # conf_1d = AnalysisConfig() ###################################################################### # Definition of the data selection # ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ # # Here we use the Crab runs from the # `H.E.S.S. 
DL3 data release 1 `__ # conf_1d.observations.obs_ids = [23523, 23526, 23559, 23592] ###################################################################### # Definition of the dataset geometry # ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ # # We want a 1D analysis conf_1d.datasets.type = "1d" # We want to extract the data by observation and therefore to not stack them conf_1d.datasets.stack = False # Here we define the ON region and make sure that PSF leakage is corrected conf_1d.datasets.on_region = dict( frame="icrs", lon=83.63308 * u.deg, lat=22.01450 * u.deg, radius=0.1 * u.deg, ) conf_1d.datasets.containment_correction = True # Finally we define the energy binning for the spectra conf_1d.datasets.geom.axes.energy = dict(min=0.7 * u.TeV, max=10 * u.TeV, nbins=5) conf_1d.datasets.geom.axes.energy_true = dict(min=0.3 * u.TeV, max=20 * u.TeV, nbins=20) ###################################################################### # Run the 1D data reduction # ~~~~~~~~~~~~~~~~~~~~~~~~~ # analysis_1d = Analysis(conf_1d) analysis_1d.get_observations() analysis_1d.get_datasets() ###################################################################### # Define the model to be used # ~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Here we don’t try to fit the model parameters to the whole dataset, but # we use predefined values instead. # target_position = SkyCoord(ra=83.63308, dec=22.01450, unit="deg") spectral_model = PowerLawSpectralModel( index=2.702, amplitude=4.712e-11 * u.Unit("1 / (cm2 s TeV)"), reference=1 * u.TeV, ) sky_model = SkyModel(spectral_model=spectral_model, name="crab") ###################################################################### # We assign the model to be fitted to each dataset. We can use the same # `~gammapy.modeling.models.SkyModel` as before. # models = Models([sky_model]) analysis_1d.set_models(models) ###################################################################### # Extracting the light curve # ~~~~~~~~~~~~~~~~~~~~~~~~~~ # lc_maker_1d = LightCurveEstimator( energy_edges=[1, 10] * u.TeV, source="crab", reoptimize=False ) lc_1d = lc_maker_1d.run(analysis_1d.datasets) print(lc_1d.geom.axes.names) display(lc_1d.to_table(sed_type="flux", format="lightcurve")) ###################################################################### # Compare results # ~~~~~~~~~~~~~~~ # # Finally we compare the results for the 1D and 3D lightcurves in a single # figure: # fig, ax = plt.subplots( figsize=(8, 6), gridspec_kw={"left": 0.16, "bottom": 0.2, "top": 0.98, "right": 0.98}, ) lc_1d.plot(ax=ax, marker="o", label="1D") lc_3d.plot(ax=ax, marker="o", label="3D") plt.legend() plt.show() ###################################################################### # Night-wise LC estimation # ------------------------ # # Here we want to extract a light curve per night. We define the time # intervals that cover the three nights. # time_intervals = [ Time([53343.5, 53344.5], format="mjd", scale="utc"), Time([53345.5, 53346.5], format="mjd", scale="utc"), Time([53347.5, 53348.5], format="mjd", scale="utc"), ] ###################################################################### # To compute the LC on the time intervals defined above, we pass the # `~gammapy.estimators.LightCurveEstimator` the list of time intervals. # # Internally, datasets are grouped per time interval and a flux extraction # is performed for each group.
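######################################################################
# Equivalently (a sketch, assuming strictly one-day windows as above; not
# part of the original recipe), the nightly intervals can be generated
# programmatically:
#

night_starts = Time([53343.5, 53345.5, 53347.5], format="mjd", scale="utc")
time_intervals_alt = [
    Time([start.mjd, start.mjd + 1], format="mjd", scale="utc")
    for start in night_starts
]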
# lc_maker_1d = LightCurveEstimator( energy_edges=[1, 10] * u.TeV, time_intervals=time_intervals, source="crab", reoptimize=False, selection_optional="all", ) nightwise_lc = lc_maker_1d.run(analysis_1d.datasets) fig, ax = plt.subplots( figsize=(8, 6), gridspec_kw={"left": 0.16, "bottom": 0.2, "top": 0.98, "right": 0.98}, ) nightwise_lc.plot(ax=ax, color="tab:orange") nightwise_lc.plot_ts_profiles(ax=ax) ax.set_ylim(1e-12, 3e-12) plt.show() ###################################################################### # What next? # ---------- # # When sources are bright enough to look for variability at small time # scales, the per-observation time binning is no longer relevant. One can # easily extend the light curve estimation approach presented above to any # time binning. This is demonstrated in the :doc:`/tutorials/analysis-time/light_curve_flare` # tutorial. which shows the extraction of the lightcurve of an AGN flare. # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-time/light_curve_flare.py0000644000175100001770000002562714721316200025106 0ustar00runnerdocker""" Light curves for flares ======================= Compute the light curve of a PKS 2155-304 flare on 10 minutes time intervals. Prerequisites ------------- - Understanding of how the light curve estimator works, please refer to the :doc:`light curve notebook `. Context ------- Frequently, especially when studying flares of bright sources, it is necessary to explore the time behaviour of a source on short time scales, in particular on time scales shorter than observing runs. A typical example is given by the flare of PKS 2155-304 during the night from July 29 to 30 2006. See the `following article `__. **Objective: Compute the light curve of a PKS 2155-304 flare on 5 minutes time intervals, i.e. smaller than the duration of individual observations.** Proposed approach ----------------- We have seen in the general presentation of the light curve estimator, see the :doc:`light curve notebook `, Gammapy produces datasets in a given time interval, by default that of the parent observation. To be able to produce datasets on smaller time steps, it is necessary to split the observations into the required time intervals. This is easily performed with the `~gammapy.data.Observations.select_time` method of `~gammapy.data.Observations`. If you pass it a list of time intervals it will produce a list of time filtered observations in a new `~gammapy.data.Observations` object. Data reduction can then be performed and will result in datasets defined on the required time intervals and light curve estimation can proceed directly. In summary, we have to: - Select relevant `~gammapy.data.Observations` from the `~gammapy.data.DataStore` - Apply the time selection in our predefined time intervals to obtain a new `~gammapy.data.Observations` - Perform the data reduction (in 1D or 3D) - Define the source model - Extract the light curve from the reduced dataset Here, we will use the PKS 2155-304 observations from the `H.E.S.S. first public test data release `__. We will use time intervals of 5 minutes duration. The tutorial is implemented with the intermediate level API. 
Setup ----- As usual, we’ll start with some general imports… """ import logging import numpy as np import astropy.units as u from astropy.coordinates import Angle, SkyCoord from astropy.time import Time from regions import CircleSkyRegion # %matplotlib inline import matplotlib.pyplot as plt log = logging.getLogger(__name__) from gammapy.data import DataStore from gammapy.datasets import Datasets, SpectrumDataset from gammapy.estimators import LightCurveEstimator from gammapy.estimators.utils import get_rebinned_axis from gammapy.makers import ( ReflectedRegionsBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, ) from gammapy.maps import MapAxis, RegionGeom from gammapy.modeling.models import PowerLawSpectralModel, SkyModel from gammapy.modeling import Fit ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Select the data # --------------- # # We first set the datastore. # data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") ###################################################################### # Now we select observations within 2 degrees of PKS 2155-304. # target_position = SkyCoord(329.71693826 * u.deg, -30.2255890 * u.deg, frame="icrs") selection = dict( type="sky_circle", frame="icrs", lon=target_position.ra, lat=target_position.dec, radius=2 * u.deg, ) obs_ids = data_store.obs_table.select_observations(selection)["OBS_ID"] observations = data_store.get_observations(obs_ids) print(f"Number of selected observations : {len(observations)}") ###################################################################### # Define time intervals # --------------------- # # We create the list of time intervals, each of duration 10 minutes. Each time interval is an # `astropy.time.Time` object, containing a start and stop time. t0 = Time("2006-07-29T20:30") duration = 10 * u.min n_time_bins = 35 times = t0 + np.arange(n_time_bins) * duration time_intervals = [Time([tstart, tstop]) for tstart, tstop in zip(times[:-1], times[1:])] print(time_intervals[0].mjd) ###################################################################### # Filter the observations list in time intervals # ---------------------------------------------- # # Here we apply the list of time intervals to the observations with # `~gammapy.data.Observations.select_time()`. # # This will return a new list of Observations filtered by ``time_intervals``. # For each time interval, a new observation is created that converts the # intersection of the GTIs and time interval. # short_observations = observations.select_time(time_intervals) # check that observations have been filtered print(f"Number of observations after time filtering: {len(short_observations)}\n") print(short_observations[1].gti) ###################################################################### # As we can see, we have now observations of duration equal to the chosen # time step. # # Now data reduction and light curve extraction can proceed exactly as # before. # ###################################################################### # Building 1D datasets from the new observations # ---------------------------------------------- # # Here we will perform the data reduction in 1D with reflected regions. 
# # *Beware, with small time intervals the background normalization with OFF # regions might become problematic.* # ###################################################################### # Defining the geometry # ~~~~~~~~~~~~~~~~~~~~~ # # We define the energy axes. As usual, the true energy axis has to cover a # wider range to ensure a good coverage of the measured energy range # chosen. # # We need to define the ON extraction region. Its size follows typical # spectral extraction regions for H.E.S.S. analyses. # # Target definition energy_axis = MapAxis.from_energy_bounds("0.4 TeV", "20 TeV", nbin=10) energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "40 TeV", nbin=20, name="energy_true" ) on_region_radius = Angle("0.11 deg") on_region = CircleSkyRegion(center=target_position, radius=on_region_radius) geom = RegionGeom.create(region=on_region, axes=[energy_axis]) ###################################################################### # Creation of the data reduction makers # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # We now create the dataset and background makers for the selected # geometry. # dataset_maker = SpectrumDatasetMaker( containment_correction=True, selection=["counts", "exposure", "edisp"] ) bkg_maker = ReflectedRegionsBackgroundMaker() safe_mask_masker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) ###################################################################### # Creation of the datasets # ~~~~~~~~~~~~~~~~~~~~~~~~ # # Now we perform the actual data reduction in the ``time_intervals``. # # %%time datasets = Datasets() dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true) for obs in short_observations: dataset = dataset_maker.run(dataset_empty.copy(), obs) dataset_on_off = bkg_maker.run(dataset, obs) dataset_on_off = safe_mask_masker.run(dataset_on_off, obs) datasets.append(dataset_on_off) ###################################################################### # Define underlying model # ----------------------- # # Since we use forward folding to obtain the flux points in each bin, exact values will depend on the underlying model. In this example, we use a power law as used in the # `reference # paper `__. # # As we are only using spectral datasets, we do not need any spatial models. # # **Note**: All time bins must have the same spectral model. To see how to investigate spectral variability, # see the :doc:`time resolved spectroscopy notebook `. spectral_model = PowerLawSpectralModel(amplitude=1e-10 * u.Unit("1 / (cm2 s TeV)")) sky_model = SkyModel(spatial_model=None, spectral_model=spectral_model, name="pks2155") ###################################################################### # Assign the model to all datasets # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # We assign each dataset its spectral model # datasets.models = sky_model # %%time fit = Fit() result = fit.run(datasets) print(result.models.to_parameters_table()) ###################################################################### # Extract the light curve # ----------------------- # # We first create the `~gammapy.estimators.LightCurveEstimator` for the # list of datasets we just produced. We give the estimator the name of the # source component to be fitted. We can directly compute the light curve in multiple energy # bins by supplying a list of `energy_edges`. # # By default the likelihood scan is computed from 0.2 to 5.0. # Here, we increase the max value to 10.0, because we are # dealing with a large flare.
lc_maker_1d = LightCurveEstimator( energy_edges=[0.7, 1, 20] * u.TeV, source="pks2155", time_intervals=time_intervals, selection_optional="all", n_jobs=4, ) lc_maker_1d.norm.scan_max = 10 ###################################################################### # We can now perform the light curve extraction itself. To compare with # the `reference # paper `__, # we select the 0.7-20 TeV range. # # %%time lc_1d = lc_maker_1d.run(datasets) ###################################################################### # Finally we plot the result for the 1D lightcurve: # plt.figure(figsize=(8, 6)) plt.tight_layout() plt.subplots_adjust(bottom=0.3) lc_1d.plot(marker="o", axis_name="time", sed_type="flux") plt.show() ###################################################################### # Once obtained, light curves can be rebinned using the likelihood profiles. # Here, we rebin 3 adjacent bins together, to get 30 minute bins. # # We will first slice `lc_1d` to obtain the lightcurve in the first energy bin. # slices = {"energy": slice(0, 1)} sliced_lc = lc_1d.slice_by_idx(slices) print(sliced_lc) axis_new = get_rebinned_axis( sliced_lc, method="fixed-bins", group_size=3, axis_name="time" ) print(axis_new) lc_new = sliced_lc.resample_axis(axis_new) plt.figure(figsize=(8, 6)) plt.tight_layout() plt.subplots_adjust(bottom=0.3) ax = sliced_lc.plot(label="original") lc_new.plot(ax=ax, label="rebinned") plt.legend() plt.show() ##################################################################### # We can use the sliced lightcurve to understand the variability, # as shown in the :doc:`/tutorials/analysis-time/variability_estimation` tutorial. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-time/light_curve_simulation.py0000644000175100001770000002551614721316200026176 0ustar00runnerdocker""" Simulating and fitting a time varying source ============================================ Simulate and fit a time decaying light curve of a source using the CTA 1DC response. Prerequisites ------------- - To understand how a single binned simulation works, please refer to :doc:`/tutorials/analysis-1d/spectrum_simulation` tutorial and :doc:`/tutorials/analysis-3d/simulate_3d` tutorial for 1D and 3D simulations, respectively. - For details of light curve extraction using gammapy, refer to the two tutorials :doc:`/tutorials/analysis-time/light_curve` and :doc:`/tutorials/analysis-time/light_curve_flare`. Context ------- Frequently, studies of variable sources (e.g. decaying GRB light curves, AGN flares, etc.) require time variable simulations. For most use cases, generating an event list is overkill, and it suffices to use binned simulations using a temporal model. **Objective: Simulate and fit a time decaying light curve of a source with CTA using the CTA 1DC response.** Proposed approach ----------------- We will simulate 10 spectral datasets within given time intervals (Good Time Intervals) following a given spectral (a power law) and temporal profile (an exponential decay, with a decay time of 6 hr). These are then analysed using the light curve estimator to obtain flux points.
Modelling and fitting of lightcurves can be done either - directly on the output of the `~gammapy.estimators.LightCurveEstimator` (at the DL5 level) - on the simulated datasets (at the DL4 level) In summary, the necessary steps are: - Choose observation parameters including a list of `gammapy.data.GTI` - Define temporal and spectral models from the :ref:`model-gallery` as per science case - Perform the simulation (in 1D or 3D) - Extract the light curve from the reduced dataset as shown in :doc:`/tutorials/analysis-time/light_curve` tutorial - Optionally, we show here how to fit the simulated datasets using a source model Setup ----- As usual, we’ll start with some general imports… """ import logging import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord from astropy.time import Time # %matplotlib inline import matplotlib.pyplot as plt from IPython.display import display log = logging.getLogger(__name__) ###################################################################### # And some Gammapy-specific imports # import warnings from gammapy.data import FixedPointingInfo, Observation, observatory_locations from gammapy.datasets import Datasets, FluxPointsDataset, SpectrumDataset from gammapy.estimators import LightCurveEstimator from gammapy.irf import load_irf_dict_from_file from gammapy.makers import SpectrumDatasetMaker from gammapy.maps import MapAxis, RegionGeom, TimeMapAxis from gammapy.modeling import Fit from gammapy.modeling.models import ( ExpDecayTemporalModel, PowerLawSpectralModel, SkyModel, ) warnings.filterwarnings( action="ignore", message="overflow encountered in exp", module="astropy" ) ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # We first define our preferred time format: # TimeMapAxis.time_format = "iso" ###################################################################### # Simulating a light curve # ------------------------ # # We will simulate 10 spectra between 300 GeV and 10 TeV using an # `~gammapy.modeling.models.PowerLawSpectralModel` and a # `~gammapy.modeling.models.ExpDecayTemporalModel`. The important # thing to note here is how to attach a different `GTI` to each dataset. # Since we use spectrum datasets here, we will use a `~gammapy.maps.RegionGeom`. # # Loading IRFs irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) # Reconstructed and true energy axis energy_axis = MapAxis.from_edges( np.logspace(-0.5, 1.0, 10), unit="TeV", name="energy", interp="log" ) energy_axis_true = MapAxis.from_edges( np.logspace(-1.2, 2.0, 31), unit="TeV", name="energy_true", interp="log" ) geom = RegionGeom.create("galactic;circle(0, 0, 0.11)", axes=[energy_axis]) # Pointing position to be supplied as a `FixedPointingInfo` pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0.5, 0.5, unit="deg", frame="galactic").icrs, ) ###################################################################### # Note that observations are usually conducted in Wobble mode, in which # the source is not in the center of the camera. This allows one to have a # symmetrical sky position from which the background can be estimated.
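###################################################################### # As a small illustrative check (the name ``source_position`` is ours, # not part of the original tutorial), the wobble offset between the # pointing defined above and the simulated source at the Galactic Centre # can be computed directly; ``separation`` handles the frame conversion # and should give roughly 0.7 deg here: source_position = SkyCoord(0, 0, unit="deg", frame="galactic") print(pointing.fixed_icrs.separation(source_position).to("deg"))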
# # Define the source model: A combination of spectral and temporal model gti_t0 = Time("2020-03-01") spectral_model = PowerLawSpectralModel( index=3, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV" ) temporal_model = ExpDecayTemporalModel(t0="6 h", t_ref=gti_t0.mjd * u.d) model_simu = SkyModel( spectral_model=spectral_model, temporal_model=temporal_model, name="model-simu", ) # Look at the model display(model_simu.parameters.to_table()) ###################################################################### # Now, define the start and observation livetime with respect to the reference # time, ``gti_t0`` # n_obs = 10 tstart = gti_t0 + [1, 2, 3, 5, 8, 10, 20, 22, 23, 24] * u.h lvtm = [55, 25, 26, 40, 40, 50, 40, 52, 43, 47] * u.min ###################################################################### # Now perform the simulations # datasets = Datasets() empty = SpectrumDataset.create( geom=geom, energy_axis_true=energy_axis_true, name="empty" ) maker = SpectrumDatasetMaker(selection=["exposure", "background", "edisp"]) for idx in range(n_obs): obs = Observation.create( pointing=pointing, livetime=lvtm[idx], tstart=tstart[idx], irfs=irfs, reference_time=gti_t0, obs_id=idx, location=observatory_locations["cta_south"], ) empty_i = empty.copy(name=f"dataset-{idx}") dataset = maker.run(empty_i, obs) dataset.models = model_simu dataset.fake() datasets.append(dataset) ###################################################################### # The reduced datasets have been successfully simulated. Let’s take a # quick look into our datasets. # display(datasets.info_table()) ###################################################################### # Extract the lightcurve # ---------------------- # # This section uses standard light curve estimation tools for a 1D # extraction. Only a spectral model needs to be defined in this case. # Since the estimator returns the integrated flux separately for each time # bin, the temporal model need not be accounted for at this stage. We # extract the lightcurve in 3 energy bins. # # Define the model: spectral_model = PowerLawSpectralModel( index=3, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV" ) model_fit = SkyModel(spectral_model=spectral_model, name="model-fit") # Attach model to all datasets datasets.models = model_fit # %%time lc_maker_1d = LightCurveEstimator( energy_edges=[0.3, 0.6, 1.0, 10] * u.TeV, source="model-fit", selection_optional=["ul"], ) lc_1d = lc_maker_1d.run(datasets) fig, ax = plt.subplots( figsize=(8, 6), gridspec_kw={"left": 0.16, "bottom": 0.2, "top": 0.98, "right": 0.98}, ) lc_1d.plot(ax=ax, marker="o", axis_name="time", sed_type="flux") plt.show() ###################################################################### # Fitting temporal models # ----------------------- # # We have the reconstructed lightcurve at this point. Now we want to fit a # profile to the obtained light curves, using a joint fitting across the # different datasets, while simultaneously minimising across the temporal # model parameters as well.
The temporal models can be applied # # - directly on the obtained lightcurve # - on the simulated datasets # ###################################################################### # Fitting the obtained light curve # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # We will first convert the obtained light curve to a `~gammapy.datasets.FluxPointsDataset` # and fit it with a spectral and temporal model. # Create the datasets by iterating over the returned lightcurve dataset_fp = FluxPointsDataset(data=lc_1d, name="dataset_lc") ###################################################################### # We will fit the amplitude, spectral index and the decay time scale. Note # that ``t_ref`` is fixed by default for the # `~gammapy.modeling.models.ExpDecayTemporalModel`. # # Define the model: spectral_model1 = PowerLawSpectralModel( index=2.0, amplitude="1e-12 cm-2 s-1 TeV-1", reference="1 TeV" ) temporal_model1 = ExpDecayTemporalModel(t0="10 h", t_ref=gti_t0.mjd * u.d) model = SkyModel( spectral_model=spectral_model1, temporal_model=temporal_model1, name="model-test", ) dataset_fp.models = model print(dataset_fp) # %%time # Fit the dataset fit = Fit() result = fit.run(dataset_fp) display(result.parameters.to_table()) ###################################################################### # Now let’s plot model and data. We plot only the normalisation of the # temporal model in relative units for one particular energy range. # dataset_fp.plot_spectrum(axis_name="time") ###################################################################### # Fit the datasets # ~~~~~~~~~~~~~~~~ # # Here, we apply the models directly to the simulated datasets. # # For modelling and fitting more complex flares, you should attach the # relevant model to each group of ``datasets``. The parameters of a model # in a given group of datasets will be tied. For more details on joint # fitting in Gammapy, see the :doc:`/tutorials/analysis-3d/analysis_3d` tutorial. # # Define the model: spectral_model2 = PowerLawSpectralModel( index=2.0, amplitude="1e-12 cm-2 s-1 TeV-1", reference="1 TeV" ) temporal_model2 = ExpDecayTemporalModel(t0="10 h", t_ref=gti_t0.mjd * u.d) model2 = SkyModel( spectral_model=spectral_model2, temporal_model=temporal_model2, name="model-test2", ) display(model2.parameters.to_table()) datasets.models = model2 # %%time # Perform a joint fit fit = Fit() result = fit.run(datasets=datasets) display(result.parameters.to_table()) ###################################################################### # We see that the fitted parameters are consistent between fitting flux # points and datasets, and match well with the simulated ones. # ###################################################################### # Exercises # --------- # # 1. Re-do the analysis with `~gammapy.datasets.MapDataset` instead of a `~gammapy.datasets.SpectrumDataset` # 2. Model the flare of PKS 2155-304 which you obtained using # the :doc:`/tutorials/analysis-time/light_curve_flare` tutorial. # Use a combination of Gaussian and exponential flare profiles. # 3. Do a joint fitting of the datasets. #././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-time/pulsar_analysis.py0000644000175100001770000003424714721316200024631 0ustar00runnerdocker""" Pulsar analysis =============== Produce a phasogram, phase-resolved maps and spectra for pulsar analysis. Introduction ------------ This notebook shows how to do a simple pulsar analysis with Gammapy.
We will produce a phasogram, a phase-resolved map and a phase-resolved spectrum of the Vela pulsar. In order to build these products, we will use the `~gammapy.makers.PhaseBackgroundMaker` which takes into account the on and off phase to compute a `~gammapy.datasets.MapDatasetOnOff` and a `~gammapy.datasets.SpectrumDatasetOnOff` in the phase space. This tutorial uses a simulated Vela observation run from the CTA DC1, which already contains a column for the pulsar phases. The phasing in itself is therefore not shown here. It requires specific packages like Tempo2 or `PINT `__. A Gammapy recipe shows how to compute phases with PINT in the framework of Gammapy. Opening the data ---------------- Let’s first do the imports and load the only observation containing Vela in the CTA 1DC dataset shipped with Gammapy. """ # Remove warnings import warnings import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord import matplotlib.pyplot as plt # %matplotlib inline from IPython.display import display from gammapy.data import DataStore from gammapy.datasets import Datasets, FluxPointsDataset, MapDataset, SpectrumDataset from gammapy.estimators import ExcessMapEstimator, FluxPointsEstimator from gammapy.makers import ( MapDatasetMaker, PhaseBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, ) from gammapy.maps import MapAxis, RegionGeom, WcsGeom from gammapy.modeling import Fit from gammapy.modeling.models import PowerLawSpectralModel, SkyModel from gammapy.stats import WStatCountsStatistic from gammapy.utils.regions import SphericalCircleSkyRegion from gammapy.utils.check import check_tutorials_setup warnings.filterwarnings("ignore") ###################################################################### # Check setup # ----------- check_tutorials_setup() ###################################################################### # Load the data store (which is a subset of CTA-DC1 data): data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps") ###################################################################### # Define observation ID and print events: id_obs_vela = [111630] obs_list_vela = data_store.get_observations(id_obs_vela) print(obs_list_vela[0].events) ###################################################################### # Now that we have our observation, let’s select the events in a 0.2° radius # around the pulsar position. pos_target = SkyCoord(ra=128.836 * u.deg, dec=-45.176 * u.deg, frame="icrs") on_radius = 0.2 * u.deg on_region = SphericalCircleSkyRegion(pos_target, on_radius) # Apply angular selection events_vela = obs_list_vela[0].events.select_region(on_region) print(events_vela) ###################################################################### # Let’s load the phases of the selected events in a dedicated array. phases = events_vela.table["PHASE"] # Let's take a look at the first 10 phases display(phases[:10]) ###################################################################### # Phasogram # --------- # # Once we have the phases, we can make a phasogram. A phasogram is a # histogram of phases. It works exactly like any other histogram (you # can set the binning, evaluate the errors based on the counts in each # bin, etc).
nbins = 30 phase_min, phase_max = (0, 1) values, bin_edges = np.histogram(phases, range=(phase_min, phase_max), bins=nbins) bin_width = (phase_max - phase_min) / nbins bin_center = (bin_edges[:-1] + bin_edges[1:]) / 2 # Poissonian uncertainty on each bin values_err = np.sqrt(values) fig, ax = plt.subplots() ax.bar( x=bin_center, height=values, width=bin_width, color="orangered", alpha=0.7, edgecolor="black", yerr=values_err, ) ax.set_xlim(0, 1) ax.set_xlabel("Phase") ax.set_ylabel("Counts") ax.set_title(f"Phasogram with angular cut of {on_radius}") plt.show() on_phase_range = (0.5, 0.6) off_phase_range = (0.7, 1) ###################################################################### # Now let’s add some fancy additions to our phasogram: a patch on the ON- # and OFF-phase regions and one for the background level. # Evaluate background level mask_off = (off_phase_range[0] < phases) & (phases < off_phase_range[1]) count_bkg = mask_off.sum() print(f"Number of Off events: {count_bkg}") # bkg level normalized by the size of the OFF zone (0.3) bkg = count_bkg / nbins / (off_phase_range[1] - off_phase_range[0]) # error on the background estimation bkg_err = np.sqrt(count_bkg) / nbins / (off_phase_range[1] - off_phase_range[0]) ###################################################################### # Let's redo the same plot for the basis fig, ax = plt.subplots(figsize=(10, 7)) ax.bar( x=bin_center, height=values, width=bin_width, color="orangered", alpha=0.7, edgecolor="black", yerr=values_err, ) # Plot background level x_bkg = np.linspace(0, 1, 50) kwargs = {"color": "black", "alpha": 0.7, "ls": "--", "lw": 2} ax.plot(x_bkg, (bkg - bkg_err) * np.ones_like(x_bkg), **kwargs) ax.plot(x_bkg, (bkg + bkg_err) * np.ones_like(x_bkg), **kwargs) ax.fill_between( x_bkg, bkg - bkg_err, bkg + bkg_err, facecolor="grey", alpha=0.5 ) # grey area for the background level # Let's make patches for the on and off phase zones on_patch = ax.axvspan( on_phase_range[0], on_phase_range[1], alpha=0.5, color="royalblue", ec="black" ) off_patch = ax.axvspan( off_phase_range[0], off_phase_range[1], alpha=0.25, color="white", hatch="x", ec="black", ) # Legends "ON" and "OFF" ax.text(0.55, 5, "ON", color="black", fontsize=17, ha="center") ax.text(0.895, 5, "OFF", color="black", fontsize=17, ha="center") ax.set_xlabel("Phase") ax.set_ylabel("Counts") ax.set_xlim(0, 1) ax.set_title(f"Phasogram with angular cut of {on_radius}") plt.show() ###################################################################### # Make a Li&Ma test over the events # --------------------------------- # # Another thing that we want to do is to compute a Li&Ma test between the on-phase and the off-phase. 
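###################################################################### # As a short illustrative aside with round numbers (not the actual # counts of this dataset): given ``n_on`` ON-phase events, ``n_off`` # OFF-phase events and an exposure ratio ``alpha``, # `~gammapy.stats.WStatCountsStatistic` returns the Li&Ma excess and # significance directly: demo_stat = WStatCountsStatistic(n_on=120, n_off=90, alpha=1 / 3) print(f"Demo excess: {demo_stat.n_sig}, significance: {demo_stat.sqrt_ts:.2f}")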
# Calculate the ratio between the on-phase and the off-phase alpha = (on_phase_range[1] - on_phase_range[0]) / ( off_phase_range[1] - off_phase_range[0] ) # Select events in the on region region_events = obs_list_vela[0].events.select_region(on_region) # Select events in phase space on_events = region_events.select_parameter("PHASE", band=on_phase_range) off_events = region_events.select_parameter("PHASE", band=off_phase_range) # Apply the WStat (Li&Ma statistic) pulse_stat = WStatCountsStatistic( len(on_events.time), len(off_events.time), alpha=alpha ) print(f"Number of excess counts: {pulse_stat.n_sig}") print(f"TS: {pulse_stat.ts}") print(f"Significance: {pulse_stat.sqrt_ts}") ###################################################################### # Phase-resolved map # ------------------ ###################################################################### # Now that we have an overview of the phasogram of the pulsar, we can do a phase-resolved sky map # : a map of the ON-phase events minus alpha times the OFF-phase events. # Alpha is the ratio between the size of the ON-phase zone (here 0.1) and # the OFF-phase zone (0.3). e_true = MapAxis.from_energy_bounds( 0.003, 10, 6, per_decade=True, unit="TeV", name="energy_true" ) e_reco = MapAxis.from_energy_bounds( 0.01, 10, 4, per_decade=True, unit="TeV", name="energy" ) geom = WcsGeom.create( binsz=0.02 * u.deg, skydir=pos_target, width="4 deg", axes=[e_reco] ) ###################################################################### # Let’s create an ON-map and an OFF-map: map_dataset_empty = MapDataset.create(geom=geom, energy_axis_true=e_true) map_dataset_maker = MapDatasetMaker() phase_bkg_maker = PhaseBackgroundMaker( on_phase=on_phase_range, off_phase=off_phase_range, phase_column_name="PHASE" ) offset_max = 5 * u.deg safe_mask_maker = SafeMaskMaker(methods=["offset-max"], offset_max=offset_max) map_datasets = Datasets() for obs in obs_list_vela: map_dataset = map_dataset_maker.run(map_dataset_empty, obs) map_dataset = safe_mask_maker.run(map_dataset, obs) map_dataset_on_off = phase_bkg_maker.run(map_dataset, obs) map_datasets.append(map_dataset_on_off) ###################################################################### # Once the data reduction is done, we can plot the map of the counts-ON (i.e. in the ON-phase) # and the map of the background (i.e. the counts-OFF, selected in the OFF-phase, multiplied by alpha). # If one wants to plot the counts-OFF instead, `~background` should be replaced by `~counts_off` in the following cell. counts = ( map_datasets[0].counts.smooth(kernel="gauss", width=0.1 * u.deg).sum_over_axes() ) background = ( map_datasets[0].background.smooth(kernel="gauss", width=0.1 * u.deg).sum_over_axes() ) fig, (ax1, ax2) = plt.subplots( figsize=(11, 4), ncols=2, subplot_kw={"projection": counts.geom.wcs} ) counts.plot(ax=ax1, add_cbar=True) ax1.set_title("Counts") background.plot(ax=ax2, add_cbar=True) ax2.set_title("Background") plt.show() ###################################################################### # Finally, we can run an `~gammapy.estimators.ExcessMapEstimator` to compute the excess and significance maps. 
excess_map_estimator = ExcessMapEstimator( correlation_radius="0.2 deg", energy_edges=[50 * u.GeV, 10 * u.TeV] ) estimator_results = excess_map_estimator.run(dataset=map_datasets[0]) npred_excess = estimator_results.npred_excess sqrt_ts = estimator_results.sqrt_ts fig, (ax1, ax2) = plt.subplots( figsize=(11, 4), ncols=2, subplot_kw={"projection": npred_excess.geom.wcs} ) npred_excess.plot(ax=ax1, add_cbar=True) ax1.set_title("Excess counts") sqrt_ts.plot(ax=ax2, add_cbar=True) ax2.set_title("Significance") plt.show() ###################################################################### # Note that here we are lacking statistics because we only use one run of CTAO. # # Phase-resolved spectrum # ----------------------- # # We can also make a phase-resolved spectrum. # In order to do that, we are going to use the `~gammapy.makers.PhaseBackgroundMaker` to create a # `~gammapy.datasets.SpectrumDatasetOnOff` with the ON and OFF taken in the phase space. # Note that this maker takes the ON and OFF in the same spatial region. # # Here to create the `~gammapy.datasets.SpectrumDatasetOnOff`, we are going to redo the whole data reduction. # However, note that one can use the `~gammapy.datasets.MapDatasetOnOff.to_spectrum_dataset()` method # (with the `containment_correction` parameter set to True) if such a `~gammapy.datasets.MapDatasetOnOff` # has been created as shown above. e_true = MapAxis.from_energy_bounds(0.003, 10, 100, unit="TeV", name="energy_true") e_reco = MapAxis.from_energy_bounds(0.01, 10, 30, unit="TeV", name="energy") geom = RegionGeom.create(region=on_region, axes=[e_reco]) spectrum_dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=e_true) spectrum_dataset_maker = SpectrumDatasetMaker() phase_bkg_maker = PhaseBackgroundMaker( on_phase=on_phase_range, off_phase=off_phase_range, phase_column_name="PHASE" ) offset_max = 5 * u.deg safe_mask_maker = SafeMaskMaker(methods=["offset-max"], offset_max=offset_max) spectrum_datasets = Datasets() for obs in obs_list_vela: spectrum_dataset = spectrum_dataset_maker.run(spectrum_dataset_empty, obs) spectrum_dataset = safe_mask_maker.run(spectrum_dataset, obs) spectrum_dataset_on_off = phase_bkg_maker.run(spectrum_dataset, obs) spectrum_datasets.append(spectrum_dataset_on_off) ###################################################################### # Now let’s take a look at the datasets we just created: spectrum_datasets[0].peek() plt.show() ###################################################################### # Now we’ll fit a model to the spectrum with the `~gammapy.modeling.Fit` class. First we # load a power law model with an initial value for the index and the # amplitude and then we do a likelihood fit. The fit results are printed # below. spectral_model = PowerLawSpectralModel( index=4, amplitude="1.3e-9 cm-2 s-1 TeV-1", reference="0.02 TeV" ) model = SkyModel(spectral_model=spectral_model, name="vela psr") emin_fit, emax_fit = (0.04 * u.TeV, 0.4 * u.TeV) mask_fit = geom.energy_mask(energy_min=emin_fit, energy_max=emax_fit) for dataset in spectrum_datasets: dataset.models = model dataset.mask_fit = mask_fit joint_fit = Fit() joint_result = joint_fit.run(datasets=spectrum_datasets) print(joint_result) ###################################################################### # Now you might want to do the stacking here even if in our case there is # only one observation which makes it superfluous. We can compute flux # points by fitting the norm of the global model in energy bands.
energy_edges = np.logspace(np.log10(0.04), np.log10(0.4), 7) * u.TeV stack_dataset = spectrum_datasets.stack_reduce() stack_dataset.models = model fpe = FluxPointsEstimator( energy_edges=energy_edges, source="vela psr", selection_optional="all" ) flux_points = fpe.run(datasets=[stack_dataset]) flux_points.meta["ts_threshold_ul"] = 1 amplitude_ref = 0.57 * 19.4e-14 * u.Unit("1 / (cm2 s MeV)") spec_model_true = PowerLawSpectralModel( index=4.5, amplitude=amplitude_ref, reference="20 GeV" ) flux_points_dataset = FluxPointsDataset(data=flux_points, models=model) ###################################################################### # Now we can plot. ax_spectrum, ax_residuals = flux_points_dataset.plot_fit() ax_spectrum.set_ylim([1e-14, 3e-11]) ax_residuals.set_ylim([-1.7, 1.7]) spec_model_true.plot( ax=ax_spectrum, energy_bounds=(emin_fit, emax_fit), label="Reference model", c="black", linestyle="dashed", energy_power=2, ) ax_spectrum.legend(loc="best") plt.show() ###################################################################### # This tutorial suffers a bit from the lack of statistics: there were 9 # Vela observations in the CTA DC1 while there is only one here. When done # on the 9 observations, the spectral analysis shows much better agreement # between the input model and the Gammapy fit. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-time/time_resolved_spectroscopy.py0000644000175100001770000002430114721316200027064 0ustar00runnerdocker""" Time resolved spectroscopy estimator ==================================== Perform spectral fits of a blazar in different time bins to investigate spectral changes during flares. Context ------- The `~gammapy.estimators.LightCurveEstimator` in Gammapy (see :doc:`light curve notebook `, and :doc:`light curve for flares notebook `.) fits the amplitude in each time/energy bin, keeping the spectral shape frozen. However, in the analysis of flaring sources, it is often interesting to study not only how the flux changes with time but how the spectral shape varies with time. Proposed approach ----------------- The main idea behind time-resolved spectroscopy is to - Select relevant `~gammapy.data.Observations` from the `~gammapy.data.DataStore` - Define time intervals in which to fit the spectral model - Apply the above time selections on the data to obtain new `~gammapy.data.Observations` - Perform standard data reduction on the above data - Define a source model - Fit the reduced data in each time bin with the source model - Extract relevant information in a table Here, we will use the PKS 2155-304 observations from the `H.E.S.S. first public test data release `__. We use time intervals of 15 minutes duration to explore spectral variability.
Setup ----- As usual, we’ll start with some general imports… """ import logging import numpy as np import astropy.units as u from astropy.coordinates import Angle, SkyCoord from astropy.table import QTable from astropy.time import Time from regions import CircleSkyRegion # %matplotlib inline import matplotlib.pyplot as plt log = logging.getLogger(__name__) from gammapy.data import DataStore from gammapy.datasets import Datasets, SpectrumDataset from gammapy.makers import ( ReflectedRegionsBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, ) from gammapy.maps import MapAxis, RegionGeom, TimeMapAxis from gammapy.modeling import Fit from gammapy.modeling.models import ( PowerLawSpectralModel, SkyModel, ) log = logging.getLogger(__name__) ###################################################################### # Data selection # ~~~~~~~~~~~~~~ # # We select all runs pointing within 2 degrees of PKS 2155-304. # data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") target_position = SkyCoord(329.71693826 * u.deg, -30.2255890 * u.deg, frame="icrs") selection = dict( type="sky_circle", frame="icrs", lon=target_position.ra, lat=target_position.dec, radius=2 * u.deg, ) obs_ids = data_store.obs_table.select_observations(selection)["OBS_ID"] observations = data_store.get_observations(obs_ids) print(f"Number of selected observations : {len(observations)}") ###################################################################### # The flaring observations were taken during July 2006. We define # 15-minute time intervals as lists of `~astropy.time.Time` start and stop # objects, and apply the intervals to the observations by using # `~gammapy.data.Observations.select_time` # t0 = Time("2006-07-29T20:30") duration = 15 * u.min n_time_bins = 25 times = t0 + np.arange(n_time_bins) * duration time_intervals = [Time([tstart, tstop]) for tstart, tstop in zip(times[:-1], times[1:])] print(time_intervals[-1].mjd) short_observations = observations.select_time(time_intervals) # check that observations have been filtered print(f"Number of observations after time filtering: {len(short_observations)}\n") print(short_observations[1].gti) ###################################################################### # Data reduction # -------------- # # In this example, we perform a 1D analysis with a reflected regions # background estimation. For details, see the # :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial. # energy_axis = MapAxis.from_energy_bounds("0.4 TeV", "20 TeV", nbin=10) energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "40 TeV", nbin=20, name="energy_true" ) on_region_radius = Angle("0.11 deg") on_region = CircleSkyRegion(center=target_position, radius=on_region_radius) geom = RegionGeom.create(region=on_region, axes=[energy_axis]) dataset_maker = SpectrumDatasetMaker( containment_correction=True, selection=["counts", "exposure", "edisp"] ) bkg_maker = ReflectedRegionsBackgroundMaker() safe_mask_masker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) datasets = Datasets() dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true) for obs in short_observations: dataset = dataset_maker.run(dataset_empty.copy(), obs) dataset_on_off = bkg_maker.run(dataset, obs) dataset_on_off = safe_mask_masker.run(dataset_on_off, obs) datasets.append(dataset_on_off) ###################################################################### # This gives us list of `~gammapy.datasets.SpectrumDatasetOnOff` which can now be # modelled. 
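###################################################################### # Before modelling, it can be useful to glance at summary statistics of # the reduced datasets (an optional sanity check, not part of the # original tutorial; the available columns depend on the Gammapy # version): print(datasets.info_table(cumulative=False))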
# print(datasets) ###################################################################### # Modeling # -------- # # We will first fit a simple power law model in each time bin. Note that # since we are using an on-off analysis here, no background model is # required. If you are doing a 3D FoV analysis, you will need to model the # background appropriately as well. # # The index and amplitude of the spectral model are kept free. You can # configure the quantities you want to freeze. # spectral_model = PowerLawSpectralModel( index=3.0, amplitude=2e-11 * u.Unit("1 / (cm2 s TeV)"), reference=1 * u.TeV ) spectral_model.parameters["index"].frozen = False sky_model = SkyModel(spatial_model=None, spectral_model=spectral_model, name="pks2155") print(sky_model) ###################################################################### # Time resolved spectroscopy algorithm # ------------------------------------ # # The following function is the crux of this tutorial. The ``sky_model`` # is fit in each bin and a list of ``fit_results`` stores the fit # information in each bin. # # If time bins are present without any available observations, those bins # are discarded and new lists of valid time intervals and fit results are # created. # def time_resolved_spectroscopy(datasets, model, time_intervals): fit = Fit() valid_intervals = [] fit_results = [] index = 0 for t_min, t_max in time_intervals: datasets_to_fit = datasets.select_time(time_min=t_min, time_max=t_max) if len(datasets_to_fit) == 0: log.info( f"No Dataset for the time interval {t_min} to {t_max}. Skipping interval." ) continue model_in_bin = model.copy(name="Model_bin_" + str(index)) datasets_to_fit.models = model_in_bin result = fit.run(datasets_to_fit) fit_results.append(result) valid_intervals.append([t_min, t_max]) index += 1 return valid_intervals, fit_results ###################################################################### # We now apply it to our data # valid_times, results = time_resolved_spectroscopy(datasets, sky_model, time_intervals) ###################################################################### # To view the results of the fit, # print(results[0]) ###################################################################### # Or, to access the fitted models, # print(results[0].models) ###################################################################### # To better visualise the data, we can create a table by extracting some # relevant information. In the following, we extract the time intervals, # information on the fit convergence and the free parameters. You can # extract more information if required, e.g. the ``total_stat`` in each # bin, etc. # def create_table(time_intervals, fit_result): t = QTable() t["tstart"] = np.array(time_intervals).T[0] t["tstop"] = np.array(time_intervals).T[1] t["convergence"] = [result.success for result in fit_result] for par in fit_result[0].models.parameters.free_parameters: t[par.name] = [ result.models.parameters[par.name].value * par.unit for result in fit_result ] t[par.name + "_err"] = [ result.models.parameters[par.name].error * par.unit for result in fit_result ] return t table = create_table(valid_times, results) print(table) ###################################################################### # Visualising the results # ~~~~~~~~~~~~~~~~~~~~~~~~ # # We can plot the spectral index and the amplitude as a function of time. # For convenience, we will convert the times into a `~gammapy.maps.TimeMapAxis`.
# time_axis = TimeMapAxis.from_time_edges( time_min=table["tstart"], time_max=table["tstop"] ) fig, axes = plt.subplots(2, 1, figsize=(8, 8)) axes[0].errorbar( x=time_axis.as_plot_center, y=table["index"], yerr=table["index_err"], fmt="o" ) axes[1].errorbar( x=time_axis.as_plot_center, y=table["amplitude"], yerr=table["amplitude_err"], fmt="o", ) axes[0].set_ylabel("index") axes[1].set_ylabel("amplitude") axes[1].set_xlabel("time") plt.show() ###################################################################### # To get the integrated flux, we can access the model stored in the fit # result object, e.g. # integral_flux = ( results[0] .models[0] .spectral_model.integral_error(energy_min=1 * u.TeV, energy_max=10 * u.TeV) ) print("Integral flux in the first bin:", integral_flux) ###################################################################### # To plot hysteresis curves, i.e. the spectral index as a function of # amplitude # plt.errorbar( table["amplitude"], table["index"], xerr=table["amplitude_err"], yerr=table["index_err"], linestyle=":", linewidth=0.5, ) plt.scatter(table["amplitude"], table["index"], c=time_axis.center.value) plt.xlabel("amplitude") plt.ylabel("index") plt.colorbar().set_label("time") plt.show() ###################################################################### # Exercises # --------- # # 1. Quantify the variability in the spectral index # 2. Rerun the algorithm using a different spectral shape, such as a # broken power law. # 3. Compare the significance of the new model with the simple power law. # Take note of any fit non-convergence in the bins. # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/analysis-time/variability_estimation.py0000644000175100001770000002173114721316200026165 0ustar00runnerdocker""" Estimation of time variability in a lightcurve ============================================== Compute a series of time variability significance estimators for a lightcurve. Prerequisites ------------- For an understanding of the light curve estimator, please refer to the :doc:`/tutorials/analysis-time/light_curve` tutorial. For a more in-depth explanation of the creation of smaller observations for exploring time variability, refer to the :doc:`/tutorials/analysis-time/light_curve_flare` tutorial. Context ------- Frequently, after computing a lightcurve, we need to quantify its variability in the time domain, for example in the case of a flare, burst, decaying light curve in GRBs or heightened activity in general. There are many ways to define the significance of the variability. **Objective: Estimate the level of time variability in a lightcurve through different methods.** Proposed approach ----------------- We will start by reading the pre-computed light curve for PKS 2155-304 that is stored in `$GAMMAPY_DATA`. To learn how to compute such an object, see the :doc:`/tutorials/analysis-time/light_curve_flare` tutorial. This tutorial will demonstrate how to compute different estimates which measure the significance of variability. These estimators range from basic ones that calculate the peak-to-trough variation, to more complex ones like fractional excess and point-to-point fractional variance, which consider the entire light curve. We also show an approach which utilises the change points in Bayesian blocks as indicators of variability.
""" ###################################################################### # Setup # ----- # As usual, we’ll start with some general imports… import numpy as np from astropy.stats import bayesian_blocks from astropy.time import Time import matplotlib.pyplot as plt from gammapy.estimators import FluxPoints from gammapy.estimators.utils import ( compute_lightcurve_doublingtime, compute_lightcurve_fpp, compute_lightcurve_fvar, ) from gammapy.maps import TimeMapAxis ###################################################################### # Load the light curve for the PKS 2155-304 flare directly from `$GAMMAPY_DATA/estimators`. lc_1d = FluxPoints.read( "$GAMMAPY_DATA/estimators/pks2155_hess_lc/pks2155_hess_lc.fits", format="lightcurve" ) plt.figure(figsize=(8, 6)) plt.subplots_adjust(bottom=0.2, left=0.2) lc_1d.plot(marker="o") plt.show() ###################################################################### # Methods to characterize variability # ----------------------------------- # # The three methods shown here are: # # - amplitude maximum variation # - relative variability amplitude # - variability amplitude. # # The amplitude maximum variation is the simplest method to define variability (as described in # `Boller et al., 2016 `__) # as it just computes # the level of tension between the lowest and highest measured fluxes in the lightcurve. # This estimator requires fully Gaussian errors. flux = lc_1d.flux.quantity flux_err = lc_1d.flux_err.quantity f_mean = np.mean(flux) f_mean_err = np.mean(flux_err) f_max = flux.max() f_max_err = flux_err[flux.argmax()] f_min = flux.min() f_min_err = flux_err[flux.argmin()] amplitude_maximum_variation = (f_max - f_max_err) - (f_min + f_min_err) amplitude_maximum_significance = amplitude_maximum_variation / np.sqrt( f_max_err**2 + f_min_err**2 ) print(amplitude_maximum_significance) ###################################################################### # There are other methods based on the peak-to-trough difference to assess the variability in a lightcurve. 
# Here we present as example the relative variability amplitude as presented in # `Kovalev et al., 2004 `__: relative_variability_amplitude = (f_max - f_min) / (f_max + f_min) relative_variability_error = ( 2 * np.sqrt((f_max * f_min_err) ** 2 + (f_min * f_max_err) ** 2) / (f_max + f_min) ** 2 ) relative_variability_significance = ( relative_variability_amplitude / relative_variability_error ) print(relative_variability_significance) ###################################################################### # The variability amplitude as presented in # `Heidt & Wagner, 1996 `__ is: variability_amplitude = np.sqrt((f_max - f_min) ** 2 - 2 * f_mean_err**2) variability_amplitude_100 = 100 * variability_amplitude / f_mean variability_amplitude_error = ( 100 * ((f_max - f_min) / (f_mean * variability_amplitude_100 / 100)) * np.sqrt( (f_max_err / f_mean) ** 2 + (f_min_err / f_mean) ** 2 + ((np.std(flux, ddof=1) / np.sqrt(len(flux))) / (f_max - f_mean)) ** 2 * (variability_amplitude_100 / 100) ** 4 ) ) variability_amplitude_significance = ( variability_amplitude_100 / variability_amplitude_error ) print(variability_amplitude_significance) ###################################################################### # Fractional excess variance, point-to-point fractional variance and doubling/halving time # ----------------------------------------------------------------------------------------- # The fractional excess variance, as presented by # `Vaughan et al., 2003 `__, is # a simple but effective method to assess the significance of a time variability feature in an object, # for example an AGN flare. It is important to note that it requires Gaussian errors to be applicable. # The excess variance computation is implemented in `~gammapy.estimators.utils`. fvar_table = compute_lightcurve_fvar(lc_1d) print(fvar_table) ###################################################################### # A similar estimator is the point-to-point fractional variance, as defined by # `Edelson et al., 2002 `__, # which samples the lightcurve on smaller time granularity. # In general, the point-to-point fractional variance being higher than the fractional excess variance is indicative # of the presence of very short timescale variability. fpp_table = compute_lightcurve_fpp(lc_1d) print(fpp_table) ###################################################################### # The characteristic doubling and halving time of the light curve, as introduced by # `Brown, 2013 `__, can also be computed. # This provides information on the shape of the variability feature, in particular how quickly it rises and falls. dtime_table = compute_lightcurve_doublingtime(lc_1d, flux_quantity="flux") print(dtime_table) ###################################################################### # Bayesian blocks # --------------- # The presence of temporal variability in a lightcurve can be assessed by using bayesian blocks # (`Scargle et al., 2013 `__). # A good and simple-to-use implementation of the algorithm is found in # `astropy.stats.bayesian_blocks`. # This implementation uses Gaussian statistics, as opposed to the # `first introductory paper `__ # which is based on Poissonian statistics. # # By passing the flux and error on the flux as ``measures`` to the method we can obtain the list of optimal bin edges # defined by the bayesian blocks algorithm. 
time = lc_1d.geom.axes["time"].time_mid.mjd bayesian_edges = bayesian_blocks( t=time, x=flux.flatten(), sigma=flux_err.flatten(), fitness="measures" ) ###################################################################### # The result giving a significance estimation for variability in the lightcurve is the number of *change points*, # i.e. the number of internal bin edges: if at least one change point is identified by the algorithm, # there is significant variability. ncp = len(bayesian_edges) - 2 print(ncp) ###################################################################### # We can rebin the lightcurve to compute the one expected with the Bayesian edges. # First, we adjust the first and last bins of the ``bayesian_edges`` to coincide # with the original light curve start and end points. ###################################################################### # Create a new axis: axis_original = lc_1d.geom.axes["time"] bayesian_edges[0] = axis_original.time_edges[0].value bayesian_edges[-1] = axis_original.time_edges[-1].value edges = Time(bayesian_edges, format="mjd", scale=axis_original.reference_time.scale) axis_new = TimeMapAxis.from_time_edges(edges[:-1], edges[1:]) ###################################################################### # Rebin the lightcurve: resample = lc_1d.resample_axis(axis_new) ###################################################################### # Plot the new lightcurve on top of the old one: plt.figure(figsize=(8, 6)) plt.subplots_adjust(bottom=0.2, left=0.2) ax = lc_1d.plot(label="original") resample.plot(ax=ax, marker="s", label="rebinned") plt.legend() ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.172642 gammapy-1.3/examples/tutorials/api/0000755000175100001770000000000014721316215017034 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/README.rst0000644000175100001770000000023314721316200020513 0ustar00runnerdockerPackage / API ============= The following tutorials demonstrate different dimensions of the Gammapy API or expose how to perform more specific use cases. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/astro_dark_matter.py0000644000175100001770000001762314721316200023116 0ustar00runnerdocker""" Dark matter spatial and spectral models ======================================= Convenience methods for dark matter high level analyses. Introduction ------------ Gammapy has some convenience methods for dark matter analyses in `~gammapy.astro.darkmatter`. These include J-Factor computation and calculation of the expected gamma-ray flux for a number of annihilation channels. They are presented in this notebook. The basic concepts of indirect dark matter searches, however, are not explained. So this is aimed at people who already know what they want to do. A good introduction to indirect dark matter searches is given for example in https://arxiv.org/pdf/1012.4515.pdf (Chapter 1 and 5) """ ###################################################################### # Setup # ----- # # As always, we start with some setup for the notebook, and with imports.
# import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion, RectangleSkyRegion # %matplotlib inline import matplotlib.pyplot as plt from matplotlib.colors import LogNorm from gammapy.astro.darkmatter import ( DarkMatterAnnihilationSpectralModel, DarkMatterDecaySpectralModel, JFactory, PrimaryFlux, profiles, ) from gammapy.maps import WcsGeom, WcsNDMap ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Profiles # -------- # # The following dark matter profiles are currently implemented. Each model # can be scaled to a given density at a certain distance. These parameters # are controlled by `~gammapy.astro.darkmatter.profiles.DMProfile.LOCAL_DENSITY` and # `~gammapy.astro.darkmatter.profiles.DMProfile.DISTANCE_GC` # profiles.DMProfile.__subclasses__() for profile in profiles.DMProfile.__subclasses__(): p = profile() p.scale_to_local_density() radii = np.logspace(-3, 2, 100) * u.kpc plt.plot(radii, p(radii), label=p.__class__.__name__) plt.loglog() plt.axvline(8.5, linestyle="dashed", color="black", label="local density") plt.legend() plt.show() print("LOCAL_DENSITY:", profiles.DMProfile.LOCAL_DENSITY) print("DISTANCE_GC:", profiles.DMProfile.DISTANCE_GC) ###################################################################### # J Factors # --------- # # There are utilities to compute J-Factor maps that can serve as a basis # to compute J-Factors for certain regions. In the following we compute a # J-Factor annihilation map for the Galactic Centre region # profile = profiles.NFWProfile(r_s=20 * u.kpc) # Adopt standard values used in H.E.S.S. profiles.DMProfile.DISTANCE_GC = 8.5 * u.kpc profiles.DMProfile.LOCAL_DENSITY = 0.39 * u.Unit("GeV / cm3") profile.scale_to_local_density() position = SkyCoord(0.0, 0.0, frame="galactic", unit="deg") geom = WcsGeom.create(binsz=0.05, skydir=position, width=3.0, frame="galactic") jfactory = JFactory(geom=geom, profile=profile, distance=profiles.DMProfile.DISTANCE_GC) jfact = jfactory.compute_jfactor() jfact_map = WcsNDMap(geom=geom, data=jfact.value, unit=jfact.unit) plt.figure() ax = jfact_map.plot(cmap="viridis", norm=LogNorm(), add_cbar=True) plt.title(f"J-Factor [{jfact_map.unit}]") # 1 deg circle usually used in H.E.S.S. 
analyses without the +/- 0.3 deg band around the plane sky_reg = CircleSkyRegion(center=position, radius=1 * u.deg) pix_reg = sky_reg.to_pixel(wcs=geom.wcs) pix_reg.plot(ax=ax, facecolor="none", edgecolor="red", label="1 deg circle") sky_reg_rec = RectangleSkyRegion(center=position, height=0.6 * u.deg, width=2 * u.deg) pix_reg_rec = sky_reg_rec.to_pixel(wcs=geom.wcs) pix_reg_rec.plot(ax=ax, facecolor="none", edgecolor="orange", label="+/- 0.3 deg band") plt.legend() plt.show() # NOTE: https://arxiv.org/abs/1607.08142 quote 2.67e21 total_jfact = ( pix_reg.to_mask().multiply(jfact).sum() - pix_reg_rec.to_mask().multiply(jfact).sum() ) print( "J-factor in 1 deg circle without the +/- 0.3 deg band around GC assuming a " f"{profile.__class__.__name__} is {total_jfact:.3g}" ) ###################################################################### # The J-Factor can also be computed for dark matter decay jfactory = JFactory( geom=geom, profile=profile, distance=profiles.DMProfile.DISTANCE_GC, annihilation=False, ) jfact_decay = jfactory.compute_jfactor() jfact_map = WcsNDMap(geom=geom, data=jfact_decay.value, unit=jfact_decay.unit) plt.figure() ax = jfact_map.plot(cmap="viridis", norm=LogNorm(), add_cbar=True) plt.title(f"J-Factor [{jfact_map.unit}]") # 1 deg circle usually used in H.E.S.S. analyses without the +/- 0.3 deg band around the plane sky_reg = CircleSkyRegion(center=position, radius=1 * u.deg) pix_reg = sky_reg.to_pixel(wcs=geom.wcs) pix_reg.plot(ax=ax, facecolor="none", edgecolor="red", label="1 deg circle") sky_reg_rec = RectangleSkyRegion(center=position, height=0.6 * u.deg, width=2 * u.deg) pix_reg_rec = sky_reg_rec.to_pixel(wcs=geom.wcs) pix_reg_rec.plot(ax=ax, facecolor="none", edgecolor="orange", label="+/- 0.3 deg band") plt.legend() plt.show() total_jfact_decay = ( pix_reg.to_mask().multiply(jfact_decay).sum() - pix_reg_rec.to_mask().multiply(jfact_decay).sum() ) print( "J-factor in 1 deg circle without the +/- 0.3 deg band around GC assuming a " f"{profile.__class__.__name__} is {total_jfact_decay:.3g}" ) ###################################################################### # Gamma-ray spectra at production # ------------------------------- # # The gamma-ray spectrum per annihilation is a further ingredient for a # dark matter analysis. The following annihilation channels are supported.
# For more info see https://arxiv.org/pdf/1012.4515.pdf # fluxes = PrimaryFlux(mDM="1 TeV", channel="eL") print(fluxes.allowed_channels) fig, axes = plt.subplots(4, 1, figsize=(4, 16)) mDMs = [0.01, 0.1, 1, 10] * u.TeV for mDM, ax in zip(mDMs, axes): fluxes.mDM = mDM ax.set_title(rf"m$_{{\mathrm{{DM}}}}$ = {mDM}") ax.set_yscale("log") ax.set_ylabel("dN/dE") for channel in ["tau", "mu", "b", "Z"]: fluxes = PrimaryFlux(mDM=mDM, channel=channel) fluxes.plot( energy_bounds=[mDM / 100, mDM], ax=ax, label=channel, yunits=u.Unit("1/GeV"), ) axes[0].legend() plt.subplots_adjust(hspace=0.9) plt.show() ###################################################################### # Flux maps for annihilation # -------------------------- # # Finally flux maps can be produced like this: # channel = "Z" massDM = 10 * u.TeV diff_flux = DarkMatterAnnihilationSpectralModel(mass=massDM, channel=channel) int_flux = ( jfact * diff_flux.integral(energy_min=0.1 * u.TeV, energy_max=10 * u.TeV) ).to("cm-2 s-1") flux_map = WcsNDMap(geom=geom, data=int_flux.value, unit="cm-2 s-1") plt.figure() ax = flux_map.plot(cmap="viridis", norm=LogNorm(), add_cbar=True) plt.title( f"Flux [{int_flux.unit}]\n m$_{{DM}}$={fluxes.mDM.to('TeV')}, channel={fluxes.channel}" ) plt.show() ###################################################################### # Flux maps for decay # ------------------- # # Finally flux maps for decay can be produced like this: # channel = "Z" massDM = 10 * u.TeV diff_flux = DarkMatterDecaySpectralModel(mass=massDM, channel=channel) int_flux = ( jfact_decay * diff_flux.integral(energy_min=0.1 * u.TeV, energy_max=10 * u.TeV) ).to("cm-2 s-1") flux_map = WcsNDMap(geom=geom, data=int_flux.value, unit="cm-2 s-1") plt.figure() ax = flux_map.plot(cmap="viridis", norm=LogNorm(), add_cbar=True) plt.title( f"Flux [{int_flux.unit}]\n m$_{{DM}}$={fluxes.mDM.to('TeV')}, channel={fluxes.channel}" ) plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/catalog.py0000644000175100001770000003130214721316200021013 0ustar00runnerdocker""" Source catalogs =============== Access and explore the most common gamma-ray source catalogs. Introduction ------------ `~gammapy.catalog` provides convenient access to common gamma-ray source catalogs. This module is mostly independent of the rest of Gammapy. Typically, you use it to compare new analyses against catalog results, e.g. overplot the spectral model, or compare the source position. Moreover, as creating a source model and flux points for a given catalog from the FITS table is tedious, `~gammapy.catalog` has this already implemented. So you can create initial source models for your analyses. This is very common for Fermi-LAT, to start with a catalog model. For TeV analysis, especially in crowded Galactic regions, using the HGPS, gamma-cat or 2HWC catalog in this way can also be useful.
In this tutorial you will learn how to: - List available catalogs - Load a catalog - Access the source catalog table data - Select a catalog subset or a single source - Get source spectral and spatial models - Get flux points (if available) - Get lightcurves (if available) - Pretty-print the source information In this tutorial we will show examples using the following catalogs: - `~gammapy.catalog.SourceCatalogHGPS` - `~gammapy.catalog.SourceCatalogGammaCat` - `~gammapy.catalog.SourceCatalog3FHL` - `~gammapy.catalog.SourceCatalog4FGL` All catalog and source classes work the same, as long as some information is available. E.g. trying to access a lightcurve from a catalog and source that does not have that information will return `None`. Further information is available at `~gammapy.catalog`. """ import numpy as np import astropy.units as u # %matplotlib inline import matplotlib.pyplot as plt from IPython.display import display from gammapy.catalog import SourceCatalog4FGL from gammapy.catalog import CATALOG_REGISTRY ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # List available catalogs # ----------------------- # # `~gammapy.catalog` contains a catalog registry `~gammapy.catalog.CATALOG_REGISTRY`, # which maps catalog names (e.g. “3fhl”) to catalog classes # (e.g. `SourceCatalog3FHL`). # print(CATALOG_REGISTRY) ###################################################################### # Load catalogs # ------------- # # If you have run `gammapy download datasets` or # `gammapy download tutorials`, you have a copy of the catalogs as FITS # files in `$GAMMAPY_DATA/catalogs`, and that is the default location # where `~gammapy.catalog` loads from. # # # # !ls -1 $GAMMAPY_DATA/catalogs # %% # # # !ls -1 $GAMMAPY_DATA/catalogs/fermi ###################################################################### # So a catalog can be loaded directly from its corresponding class # catalog = SourceCatalog4FGL() print("Number of sources :", len(catalog.table)) ###################################################################### # Note that it loads the default catalog from `$GAMMAPY_DATA/catalogs`; # you could pass a different `filename` when creating the catalog. For # example here we load an older version of the 4FGL catalog: # catalog = SourceCatalog4FGL("$GAMMAPY_DATA/catalogs/fermi/gll_psc_v20.fit.gz") print("Number of sources :", len(catalog.table)) ###################################################################### # Alternatively you can load a catalog by name via # `CATALOG_REGISTRY.get_cls(name)()` (note the `()` to instantiate a # catalog object from the catalog class - only this will load the catalog # and be useful), or by importing the catalog class # (e.g. `~gammapy.catalog.SourceCatalog3FGL`) directly. The two ways are equivalent: the # result will be the same.
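###################################################################### # A minimal sketch of the equivalent direct-import route (assuming the # catalog files are present in ``$GAMMAPY_DATA/catalogs``; the variable # name is ours): from gammapy.catalog import SourceCatalog3FGL catalog_3fgl_direct = SourceCatalog3FGL() print("Number of sources :", len(catalog_3fgl_direct.table))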
# FITS file is loaded
catalog = CATALOG_REGISTRY.get_cls("3fgl")()
print(catalog)

# %%
# Let's load the source catalogs we will use throughout this tutorial

catalog_gammacat = CATALOG_REGISTRY.get_cls("gamma-cat")()
catalog_3fhl = CATALOG_REGISTRY.get_cls("3fhl")()
catalog_4fgl = CATALOG_REGISTRY.get_cls("4fgl")()
catalog_hgps = CATALOG_REGISTRY.get_cls("hgps")()

######################################################################
# Catalog table
# -------------
#
# Source catalogs are given as `FITS` files that contain one or multiple
# tables.
#
# However, you can also access the underlying `astropy.table.Table` for
# a catalog, and the row data as a Python `dict`. This can be useful if
# you want to do something that is not pre-scripted by the
# `~gammapy.catalog.SourceCatalog` classes, such as e.g. selecting sources by sky
# position or association class, or accessing special source information.
#

print(type(catalog_3fhl.table))

# %%
print(len(catalog_3fhl.table))

# %%
display(catalog_3fhl.table[:3][["Source_Name", "RAJ2000", "DEJ2000"]])

######################################################################
# Note that the catalog objects include a helper property that directly
# gives the source positions as a `~astropy.coordinates.SkyCoord` object
# (we will show a usage example in the following).
#

print(catalog_3fhl.positions[:3])

######################################################################
# Source object
# -------------
#
# Select a source
# ~~~~~~~~~~~~~~~
#
# The catalog entries for a single source are represented by a
# `~gammapy.catalog.SourceCatalogObject`. To select a source object, index into
# the catalog using `[]` with a catalog table row index (zero-based,
# the first row is `[0]`) or a source name. If a name is given, catalog
# table columns with source names and association names (“ASSOC1” in the
# example below) are searched top to bottom. There is no name resolution
# web query.
#

source = catalog_4fgl[49]
print(source)

# %%
print(source.row_index, source.name)

# %%
source = catalog_4fgl["4FGL J0010.8-2154"]
print(source.row_index, source.name)

# %%
print(source.data["ASSOC1"])

# %%
source = catalog_4fgl["PKS 0008-222"]
print(source.row_index, source.name)

######################################################################
# Note that you can also do a `for source in catalog` loop, to find or
# process sources of interest.
#
# Source information
# ~~~~~~~~~~~~~~~~~~
#
# The source objects have a `data` property that contains the
# information of the catalog row corresponding to the source.
#

print(source.data["Npred"])

# %%
print(source.data["GLON"], source.data["GLAT"])

######################################################################
# As for the catalog object, the source object has a `position`
# property.
#

print(source.position.galactic)

######################################################################
# Select a catalog subset
# -----------------------
#
# The catalog objects support selection using boolean arrays (of the same
# length), so one can create a new catalog as a subset of the main catalog
# that satisfies a set of conditions.
#
# In the next example we select only a few of the brightest sources
# in the 100 to 200 GeV energy band.
#

mask_bright = np.zeros(len(catalog_3fhl.table), dtype=bool)
for k, source in enumerate(catalog_3fhl):
    flux = source.spectral_model().integral(100 * u.GeV, 200 * u.GeV).to("cm-2 s-1")
    if flux > 1e-10 * u.Unit("cm-2 s-1"):
        mask_bright[k] = True
        print(f"{source.row_index:<7d} {source.name:20s} {flux:.3g}")

# %%
catalog_3fhl_bright = catalog_3fhl[mask_bright]
print(catalog_3fhl_bright)

# %%
print(catalog_3fhl_bright.table["Source_Name"])

######################################################################
# Similarly we can select only sources within a region of interest. Here
# for example we use the `position` property of the catalog object to
# select sources within 5 degrees from “PKS 0008-222”:
#

source = catalog_4fgl["PKS 0008-222"]
mask_roi = source.position.separation(catalog_4fgl.positions) < 5 * u.deg

catalog_4fgl_roi = catalog_4fgl[mask_roi]
print("Number of sources :", len(catalog_4fgl_roi.table))

######################################################################
# Source models
# -------------
#
# The `~gammapy.catalog.SourceCatalogObject` classes have a
# `~gammapy.catalog.SourceCatalogObject.sky_model()` method which creates a
# `~gammapy.modeling.models.SkyModel` object, with model parameter values
# and parameter errors from the catalog filled in.
#
# In most cases, the `~gammapy.catalog.SourceCatalogObject.spectral_model()` method provides the
# `~gammapy.modeling.models.SpectralModel` part of the sky model, and the
# `~gammapy.catalog.SourceCatalogObject.spatial_model()` method the `~gammapy.modeling.models.SpatialModel`
# part individually.
#
# We use the `~gammapy.catalog.SourceCatalog4FGL` catalog for the examples in
# this section.
#

source = catalog_4fgl["PKS 2155-304"]

model = source.sky_model()
print(model)

# %%
print(model.spatial_model)

# %%
print(model.spectral_model)

# %%
energy_bounds = (100 * u.MeV, 100 * u.GeV)
opts = dict(sed_type="e2dnde", yunits=u.Unit("TeV cm-2 s-1"))
model.spectral_model.plot(energy_bounds, **opts)
model.spectral_model.plot_error(energy_bounds, **opts)
plt.show()

######################################################################
# You can create initial source models for your analyses using the
# `~gammapy.catalog.SourceCatalog.to_models()` method of the catalog objects. Here for example we
# create a `~gammapy.modeling.models.Models` object from the 4FGL catalog subset we previously
# defined:
#

models_4fgl_roi = catalog_4fgl_roi.to_models()
print(models_4fgl_roi)

######################################################################
# Specificities of the HGPS catalog
# ---------------------------------
#
# Using the `~gammapy.catalog.SourceCatalog.to_models()` method for the
# `~gammapy.catalog.SourceCatalogHGPS` will return only the model
# components of the sources retained in the main catalog; several
# candidate objects appear only in the Gaussian components table (see
# section 4.9 of the HGPS paper, https://arxiv.org/abs/1804.02432).
To access these components you can do the following:
#

discarded_ind = np.where(
    ["Discarded" in _ for _ in catalog_hgps.table_components["Component_Class"]]
)[0]
discarded_table = catalog_hgps.table_components[discarded_ind]

######################################################################
# There is no spectral model available for these components but you can
# access their spatial models:
#

discarded_spatial = [
    catalog_hgps.gaussian_component(idx).spatial_model() for idx in discarded_ind
]

######################################################################
# In addition to the source components, the HGPS catalog includes a large
# scale diffuse component built by fitting a Gaussian model in a sliding
# window along the Galactic plane. Information on this model can be
# accessed via the properties `~gammapy.catalog.SourceCatalogHGPS.table_large_scale_component` and
# `~gammapy.catalog.SourceCatalogHGPS.large_scale_component` of `~gammapy.catalog.SourceCatalogHGPS`.
#

# Here we show the first 5 elements of the table
display(catalog_hgps.table_large_scale_component[:5])
# you can also try:
# help(catalog_hgps.large_scale_component)

######################################################################
# Flux points
# -----------
#
# The flux points are available via the `~gammapy.catalog.SourceCatalogObject.flux_points` property as a
# `~gammapy.estimators.FluxPoints` object.
#

source = catalog_4fgl["PKS 2155-304"]
flux_points = source.flux_points

print(flux_points)

# %%
display(flux_points.to_table(sed_type="flux"))

# %%
flux_points.plot(sed_type="e2dnde")
plt.show()

######################################################################
# Lightcurves
# -----------
#
# The Fermi catalogs contain lightcurves for each source. They are available
# via the `source.lightcurve` method as a
# `~gammapy.estimators.FluxPoints` object with a time axis.
#

lightcurve = catalog_4fgl["4FGL J0349.8-2103"].lightcurve()
print(lightcurve)

# %%
display(lightcurve.to_table(format="lightcurve", sed_type="flux"))

# %%
plt.figure(figsize=(8, 6))
plt.subplots_adjust(bottom=0.2, left=0.2)
lightcurve.plot()
plt.show()

######################################################################
# Pretty-print source information
# -------------------------------
#
# A source object has a nice string representation that you can print.
#

source = catalog_hgps["MSH 15-52"]
print(source)

######################################################################
# You can also call `source.info()` instead and pass as an option what
# information to print. The options available depend on the catalog; you
# can learn about them using `help()`
#

help(source.info)

# %%
print(source.info("associations"))
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/datasets.py0000644000175100001770000004345414721316200021220 0ustar00runnerdocker"""
Datasets - Reduced data, IRFs, models
=====================================

Learn how to work with datasets

Introduction
------------

`~gammapy.datasets` are a crucial part of the gammapy API.
`~gammapy.datasets.Dataset` objects constitute `DL4` data - binned counts,
IRFs, models and the associated likelihoods.
`~gammapy.datasets.Datasets` form the end product of the data reduction stage
(see the :doc:`/tutorials/api/makers` tutorial) and are passed on to the
`~gammapy.modeling.Fit` or estimator classes for modelling and fitting
purposes.
To find the different types of `~gammapy.datasets.Dataset` objects that
are supported, see :ref:`datasets-types`.

Setup
-----
"""

import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import CircleSkyRegion
import matplotlib.pyplot as plt
from IPython.display import display
from gammapy.data import GTI
from gammapy.datasets import (
    Datasets,
    FluxPointsDataset,
    MapDataset,
    SpectrumDatasetOnOff,
)
from gammapy.estimators import FluxPoints
from gammapy.maps import MapAxis, WcsGeom
from gammapy.modeling.models import FoVBackgroundModel, PowerLawSpectralModel, SkyModel

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup
from gammapy.utils.scripts import make_path

# %matplotlib inline

check_tutorials_setup()

######################################################################
# MapDataset
# ----------
#
# The counts, exposure, background, masks, and IRF maps are bundled
# together in a data structure named `~gammapy.datasets.MapDataset`. While the `counts`
# and `background` maps are binned in reconstructed energy and must have
# the same geometry, the IRF maps can have a different spatial (coarsely
# binned and larger) geometry and spectral range (binned in true
# energies). It is usually recommended that the true energy axis covers a
# larger range, and is more finely sampled, than the reco energy axis.
#
# Creating an empty dataset
# ~~~~~~~~~~~~~~~~~~~~~~~~~
#
# An empty `~gammapy.datasets.MapDataset` can be directly instantiated from any
# `~gammapy.maps.WcsGeom` object:
#

energy_axis = MapAxis.from_energy_bounds(1, 10, nbin=11, name="energy", unit="TeV")

geom = WcsGeom.create(
    skydir=(83.63, 22.01),
    axes=[energy_axis],
    width=5 * u.deg,
    binsz=0.05 * u.deg,
    frame="icrs",
)

dataset_empty = MapDataset.create(geom=geom, name="my-dataset")

######################################################################
# It is good practice to define a name for the dataset, such that you can
# identify it later by name. However, if you define a name it **must** be
# unique. Now we can already print the dataset:
#

print(dataset_empty)

######################################################################
# The printout shows the key summary information of the dataset, such as
# total counts, fit statistics, model information etc.
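######################################################################
# As a minimal illustration of why unique names matter (a sketch, not part
# of the original example): the name lets you retrieve a dataset from a
# `~gammapy.datasets.Datasets` container, which is introduced later in this
# tutorial.

datasets_demo = Datasets([dataset_empty])  # illustrative container
print(datasets_demo["my-dataset"].name)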
######################################################################
# `~gammapy.datasets.MapDataset.create` has additional keywords that can be used to
# define the binning of the IRF related maps:
#

# choose a different true energy binning for the exposure, psf and edisp
energy_axis_true = MapAxis.from_energy_bounds(
    0.1, 100, nbin=11, name="energy_true", unit="TeV", per_decade=True
)

# choose a different rad axis binning for the psf
rad_axis = MapAxis.from_bounds(0, 5, nbin=50, unit="deg", name="rad")

gti = GTI.create(0 * u.s, 1000 * u.s)

dataset_empty = MapDataset.create(
    geom=geom,
    energy_axis_true=energy_axis_true,
    rad_axis=rad_axis,
    binsz_irf=0.1,
    gti=gti,
    name="dataset-empty",
)

######################################################################
# To see the geometry of each map, we can use:
#

print(dataset_empty.geoms)

######################################################################
# Another way to create a `~gammapy.datasets.MapDataset` is to just read an existing one
# from a FITS file:
#

dataset_cta = MapDataset.read(
    "$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz", name="dataset-cta"
)

print(dataset_cta)

######################################################################
# Accessing contents of a dataset
# -------------------------------
#

######################################################################
# To further explore the contents of a `Dataset`, you can use
# e.g. `~gammapy.datasets.MapDataset.info_dict()`
#

######################################################################
# For a quick info, use
#

print(dataset_cta.info_dict())

######################################################################
# For a quick view, use
#

dataset_cta.peek()
plt.show()

######################################################################
# And access individual maps like:
#

plt.figure()
counts_image = dataset_cta.counts.sum_over_axes()
counts_image.smooth("0.1 deg").plot(add_cbar=True)
plt.show()

######################################################################
# Of course you can also access IRF related maps, e.g. the psf as
# `~gammapy.irf.PSFMap`:
#

print(dataset_cta.psf)

######################################################################
# And use any method on the `~gammapy.irf.PSFMap` object:
#

radius = dataset_cta.psf.containment_radius(energy_true=1 * u.TeV, fraction=0.95)
print(radius)

# %%
edisp_kernel = dataset_cta.edisp.get_edisp_kernel()
edisp_kernel.plot_matrix()
plt.show()

######################################################################
# The `~gammapy.datasets.MapDataset` typically also contains the information on the
# residual hadronic background, stored in `~gammapy.datasets.MapDataset.background` as a
# map:
#

print(dataset_cta.background)

######################################################################
# As a next step we define a minimal model on the dataset using the
# `~gammapy.datasets.MapDataset.models` setter:
#

model = SkyModel.create("pl", "point", name="gc")
model.spatial_model.position = SkyCoord("0d", "0d", frame="galactic")
model_bkg = FoVBackgroundModel(dataset_name="dataset-cta")

dataset_cta.models = [model, model_bkg]

######################################################################
# Assigning models to datasets is covered in more detail in :doc:`/tutorials/api/model_management`.
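######################################################################
# As a small aside (not part of the original example), the models just
# assigned can be read back from the same property:

print(dataset_cta.models.names)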
# Printing the dataset will now show the model components: # print(dataset_cta) ###################################################################### # Now we can use the `~gammapy.datasets.MapDataset.npred` method to get a map of the total predicted counts # of the model: # npred = dataset_cta.npred() npred.sum_over_axes().plot(add_cbar=True) plt.show() ###################################################################### # To get the predicted counts from an individual model component we can # use: # npred_source = dataset_cta.npred_signal(model_names=["gc"]) npred_source.sum_over_axes().plot(add_cbar=True) plt.show() ###################################################################### # `~gammapy.datasets.MapDataset.background` contains the background map computed from the # IRF. Internally it will be combined with a `~gammapy.modeling.models.FoVBackgroundModel`, to # allow for adjusting the background model during a fit. To get the model # corrected background, one can use `~gammapy.datasets.MapDataset.npred_background`. # npred_background = dataset_cta.npred_background() npred_background.sum_over_axes().plot() plt.show() ###################################################################### # Using masks # ~~~~~~~~~~~ # # There are two masks that can be set on a `~gammapy.datasets.MapDataset`, `~gammapy.datasets.MapDataset.mask_safe` and # `~gammapy.datasets.MapDataset.mask_fit`. # # - The `~gammapy.datasets.MapDataset.mask_safe` is computed during the data reduction process # according to the specified selection cuts, and should not be changed # by the user. # - During modelling and fitting, the user might want to additionally # ignore some parts of a reduced dataset, e.g. to restrict the fit to a # specific energy range or to ignore parts of the region of interest. # This should be done by applying the `~gammapy.datasets.MapDataset.mask_fit`. To see details of # applying masks, please refer to :ref:`masks-for-fitting`. # # Both the `~gammapy.datasets.MapDataset.mask_fit` and `~gammapy.datasets.MapDataset.mask_safe` must # have the same `~gammapy.maps.Map.geom` as the `~gammapy.datasets.MapDataset.counts` and # `~gammapy.datasets.MapDataset.background` maps. # # For example to see the safe data range: # dataset_cta.mask_safe.plot_grid(add_cbar=True) plt.show() ###################################################################### # In addition it is possible to define a custom `~gammapy.datasets.MapDataset.mask_fit`. 
#
# Here we apply a mask fit in energy and space:

region = CircleSkyRegion(SkyCoord("0d", "0d", frame="galactic"), 1.5 * u.deg)

geom = dataset_cta.counts.geom

mask_space = geom.region_mask([region])
mask_energy = geom.energy_mask(0.3 * u.TeV, 8 * u.TeV)
dataset_cta.mask_fit = mask_space & mask_energy
dataset_cta.mask_fit.plot_grid(vmin=0, vmax=1, add_cbar=True)
plt.show()

######################################################################
# To access the energy range defined by the mask you can use:
#
# - `~gammapy.datasets.MapDataset.energy_range_safe` : energy range defined by the `~gammapy.datasets.MapDataset.mask_safe`
# - `~gammapy.datasets.MapDataset.energy_range_fit` : energy range defined by the `~gammapy.datasets.MapDataset.mask_fit`
# - `~gammapy.datasets.MapDataset.energy_range` : the final energy range used in likelihood computation
#
# These methods return two maps, with the `min` and `max` energy
# values at each spatial pixel
#

e_min, e_max = dataset_cta.energy_range

# To see the low energy threshold at each point
e_min.plot(add_cbar=True)
plt.show()

# %%
# To see the high energy threshold at each point
e_max.plot(add_cbar=True)
plt.show()

######################################################################
# Just as for `~gammapy.maps.Map` objects, it is possible to cut out a whole
# `~gammapy.datasets.MapDataset`, which will perform the cutout for all maps in
# parallel. Optionally one can provide a new name to the resulting dataset:
#

cutout = dataset_cta.cutout(
    position=SkyCoord("0d", "0d", frame="galactic"),
    width=2 * u.deg,
    name="cta-cutout",
)

cutout.counts.sum_over_axes().plot(add_cbar=True)
plt.show()

######################################################################
# It is also possible to slice a `~gammapy.datasets.MapDataset` in energy:
#

sliced = dataset_cta.slice_by_energy(
    energy_min=1 * u.TeV, energy_max=5 * u.TeV, name="slice-energy"
)
sliced.counts.plot_grid(add_cbar=True)
plt.show()

######################################################################
# The same operation will be applied to all other maps contained in the
# datasets such as `~gammapy.datasets.MapDataset.mask_fit`:
#

sliced.mask_fit.plot_grid()
plt.show()

######################################################################
# Resampling datasets
# ~~~~~~~~~~~~~~~~~~~
#
# It can often be useful to coarsely rebin an initially computed dataset
# by a specified factor.
This can be done along either the spatial or energy
# axes:
#

plt.figure()
downsampled = dataset_cta.downsample(factor=8)
downsampled.counts.sum_over_axes().plot(add_cbar=True)
plt.show()

######################################################################
# And the same down-sampling process is possible along the energy axis:
#

downsampled_energy = dataset_cta.downsample(
    factor=5, axis_name="energy", name="downsampled-energy"
)
downsampled_energy.counts.plot_grid(add_cbar=True)
plt.show()

######################################################################
# In the printout one can see that the actual number of counts is
# preserved during the down-sampling:
#

print(downsampled_energy, dataset_cta)

######################################################################
# We can also resample the finer binned datasets to an arbitrary coarser
# energy binning using:
#

energy_axis_new = MapAxis.from_energy_edges([0.1, 0.3, 1, 3, 10] * u.TeV)
resampled = dataset_cta.resample_energy_axis(energy_axis=energy_axis_new)
resampled.counts.plot_grid(ncols=2, add_cbar=True)
plt.show()

######################################################################
# To squash the whole dataset into a single energy bin there is the
# `~gammapy.datasets.MapDataset.to_image()` convenience method:
#

dataset_image = dataset_cta.to_image()
dataset_image.counts.plot(add_cbar=True)
plt.show()

######################################################################
# SpectrumDataset
# ---------------
#
# `~gammapy.datasets.SpectrumDataset` inherits from a `~gammapy.datasets.MapDataset` and is specially
# adapted for 1D spectral analysis; it uses a `~gammapy.maps.RegionGeom` instead of a
# `~gammapy.maps.WcsGeom`. A `~gammapy.datasets.MapDataset` can be converted to a `~gammapy.datasets.SpectrumDataset`,
# by summing the `counts` and `background` inside the `on_region`,
# which can then be used for classical spectral analysis. Containment
# correction is feasible only for circular regions.
#

region = CircleSkyRegion(SkyCoord(0, 0, unit="deg", frame="galactic"), 0.5 * u.deg)
spectrum_dataset = dataset_cta.to_spectrum_dataset(
    region, containment_correction=True, name="spectrum-dataset"
)

# For a quick look
spectrum_dataset.peek()
plt.show()

######################################################################
# A `~gammapy.datasets.MapDataset` can also be integrated over the `on_region` to create
# a `~gammapy.datasets.MapDataset` with a `~gammapy.maps.RegionGeom`. Complex regions can be handled
# and since the full IRFs are used, containment correction is not
# required.
#

reg_dataset = dataset_cta.to_region_map_dataset(region, name="region-map-dataset")
print(reg_dataset)

######################################################################
# FluxPointsDataset
# -----------------
#
# `~gammapy.datasets.FluxPointsDataset` is a `~gammapy.datasets.Dataset` container for precomputed flux
# points, which can then be used in fitting.
#

flux_points = FluxPoints.read(
    "$GAMMAPY_DATA/tests/spectrum/flux_points/diff_flux_points.fits"
)
model = SkyModel(spectral_model=PowerLawSpectralModel(index=2.3))
fp_dataset = FluxPointsDataset(data=flux_points, models=model)

fp_dataset.plot_spectrum()
plt.show()

######################################################################
# The masks on `~gammapy.datasets.FluxPointsDataset` are `~numpy.ndarray` objects
# and the data is a
# `~gammapy.estimators.FluxPoints` object. The `~gammapy.datasets.FluxPointsDataset.mask_safe`,
# by default, masks the upper limit points.
# print(fp_dataset.mask_safe) # Note: the mask here is simply a numpy array # %% print(fp_dataset.data) # is a `FluxPoints` object # %% print(fp_dataset.data_shape()) # number of data points ###################################################################### # # For an example of fitting `~gammapy.estimators.FluxPoints`, see :doc:`/tutorials/analysis-1d/sed_fitting`, # and for using source catalogs see :doc:`/tutorials/api/catalog`. # ###################################################################### # Datasets # -------- # # `~gammapy.datasets.Datasets` are a collection of `~gammapy.datasets.Dataset` objects. They can be of the # same type, or of different types, eg: mix of `~gammapy.datasets.FluxPointsDataset`, # `~gammapy.datasets.MapDataset` and `~gammapy.datasets.SpectrumDataset`. # # For modelling and fitting of a list of `~gammapy.datasets.Dataset` objects, you can # either: # # - (A) Do a joint fitting of all the datasets together **OR** # - (B) Stack the datasets together, and then fit them. # # `~gammapy.datasets.Datasets` is a convenient tool to handle joint fitting of # simultaneous datasets. As an example, please see :doc:`/tutorials/analysis-3d/analysis_mwl`. # # To see how stacking is performed, please see :ref:`stack`. # # To create a `~gammapy.datasets.Datasets` object, pass a list of `~gammapy.datasets.Dataset` on init, eg # datasets = Datasets([dataset_empty, dataset_cta]) print(datasets) ###################################################################### # If all the datasets have the same type we can also print an info table, # collecting all the information from the individual calls to # `~gammapy.datasets.Dataset.info_dict()`: # display(datasets.info_table()) # quick info of all datasets print(datasets.names) # unique name of each dataset ###################################################################### # We can access individual datasets in `Datasets` object by name: # print(datasets["dataset-empty"]) # extracts the first dataset ###################################################################### # Or by index: # print(datasets[0]) ###################################################################### # Other list type operations work as well such as: # # Use python list convention to remove/add datasets, eg: datasets.remove("dataset-empty") print(datasets.names) ###################################################################### # Or # datasets.append(spectrum_dataset) print(datasets.names) ###################################################################### # Let’s create a list of spectrum datasets to illustrate some more # functionality: # datasets = Datasets() path = make_path("$GAMMAPY_DATA/joint-crab/spectra/hess") for filename in path.glob("pha_*.fits"): dataset = SpectrumDatasetOnOff.read(filename) datasets.append(dataset) print(datasets) ###################################################################### # Now we can stack all datasets using `~gammapy.datasets.Datasets.stack_reduce()`: # stacked = datasets.stack_reduce(name="stacked") print(stacked) ###################################################################### # Or slice all datasets by a given energy range: # datasets_sliced = datasets.slice_by_energy(energy_min="1 TeV", energy_max="10 TeV") plt.show() print(datasets_sliced.energy_ranges) # %% ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/estimators.py0000644000175100001770000002470214721316200021577 0ustar00runnerdocker""" Estimators 
==========

This tutorial provides an overview of the `Estimator` API. All estimators live in the
`gammapy.estimators` sub-module, offering a range of algorithms and classes for high-level flux and
significance estimation. Flux points, light curves, flux maps and profiles are all
estimated through shared functionality and a common API.

Key Features
------------

-  **Hypothesis Testing**: Estimations are based on testing a reference model
   against a null hypothesis, deriving flux and significance values.

-  **Estimation via Two Methods**:

   -   **Model Fitting (Forward Folding)**: Refit the flux of a model component
       within specified energy, time, or spatial regions.

   -   **Excess Calculation (Backward Folding)**: Use the analytical solution by Li and Ma
       for significance based on excess counts, currently available in `~gammapy.estimators.ExcessMapEstimator`.

For further information on these details, please refer to :doc:`/user-guide/estimators`.

The setup
---------
"""

import numpy as np
import matplotlib.pyplot as plt
from astropy import units as u
from IPython.display import display
from gammapy.datasets import SpectrumDatasetOnOff, Datasets, MapDataset
from gammapy.estimators import (
    FluxPointsEstimator,
    ExcessMapEstimator,
    FluxPoints,
)
from gammapy.modeling import Fit, Parameter
from gammapy.modeling.models import SkyModel, PowerLawSpectralModel
from gammapy.utils.scripts import make_path

######################################################################
# Flux Points Estimation
# ----------------------
#
# We start with a simple example for flux points estimation taking multiple datasets into account.
# In this section we show the steps to estimate the flux points.
# First we read the pre-computed datasets from `$GAMMAPY_DATA`.
#

datasets = Datasets()
path = make_path("$GAMMAPY_DATA/joint-crab/spectra/hess/")

for filename in path.glob("pha_obs*.fits"):
    dataset = SpectrumDatasetOnOff.read(filename)
    datasets.append(dataset)

######################################################################
# Next we define a spectral model and set it on the datasets:
#

pwl = PowerLawSpectralModel(index=2.7, amplitude="5e-11 cm-2 s-1 TeV-1")
datasets.models = SkyModel(spectral_model=pwl, name="crab")

######################################################################
# Before using the estimators, it is necessary to first ensure that the model is properly
# fitted. This applies to all scenarios, including light curve estimation. To optimize the
# model parameters to best fit the data we utilise the following:
#

fit = Fit()
fit_result = fit.optimize(datasets=datasets)
print(fit_result)

######################################################################
# A fully configured Flux Points Estimation
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# The `~gammapy.estimators.FluxPointsEstimator` estimates flux points for a given list of datasets,
# energies and spectral model. The simplest way to call the estimator is by defining both
# the name of the ``source`` and its ``energy_edges``.
# Here we prepare a full configuration of the flux point estimation.
# Firstly we define the ``backend`` for the fit:
#

fit = Fit(
    optimize_opts={"backend": "minuit"},
    confidence_opts={"backend": "scipy"},
)

######################################################################
# Define the fully configured flux points estimator:
#

energy_edges = np.geomspace(0.7, 100, 9) * u.TeV
norm = Parameter(name="norm", value=1.0)

fp_estimator = FluxPointsEstimator(
    source="crab",
    energy_edges=energy_edges,
    n_sigma=1,
    n_sigma_ul=2,
    selection_optional="all",
    fit=fit,
    norm=norm,
)

######################################################################
# The ``norm`` parameter can be adjusted in a few different ways. For example, we can change
# the minimum and maximum values that it scans over, as follows.
#

fp_estimator.norm.scan_min = 0.1
fp_estimator.norm.scan_max = 10

######################################################################
# Note: The default scan range of the norm parameter is from 0.2 to 5. In case the upper
# limit values lie outside this range, nan values will be returned. It may thus be useful to
# increase this range, especially for the computation of upper limits from weak sources.
#
# The various quantities utilised in this tutorial are described here:
#
# - ``source``: which source from the model to compute the flux points for
# - ``energy_edges``: edges of the flux points energy bins
# - ``n_sigma``: number of sigma for the flux error
# - ``n_sigma_ul``: the number of sigma for the flux upper limits
# - ``selection_optional``: what additional maps to compute
# - ``fit``: the fit instance (as defined above)
# - ``reoptimize``: whether to reoptimize the flux points with other model parameters, aside from the ``norm``
# - ``norm``: normalisation parameter for the fit
#
# **Important note**: the output ``energy_edges`` are taken from the parent dataset energy bins,
# selecting the bins closest to the requested ``energy_edges``. To match the input bins directly,
# specific binning must be defined based on the parent dataset geometry. This could be done in the following way:
# `energy_edges = datasets[0].geoms["geom"].axes["energy"].downsample(factor=5).edges`
#

# %%time
fp_result = fp_estimator.run(datasets=datasets)

######################################################################
# Accessing and visualising the results
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

print(fp_result)

######################################################################
# We can specify the SED type to plot:
#

fp_result.plot(sed_type="dnde")
plt.show()

######################################################################
# We can also access
# the quantity names through ``fp_result.available_quantities``.
# Here we show how you can plot a different plot type and define the axes units;
# we also overlay the TS profile.

ax = plt.subplot()
ax.xaxis.set_units(u.eV)
ax.yaxis.set_units(u.Unit("TeV cm-2 s-1"))
fp_result.plot(ax=ax, sed_type="e2dnde", color="tab:orange")
fp_result.plot_ts_profiles(sed_type="e2dnde")
plt.show()

######################################################################
# The actual data members are N-dimensional `~gammapy.maps.region.ndmap.RegionNDMap` objects. So you can
# also plot them:

print(type(fp_result.dnde))

######################################################################
#

fp_result.dnde.plot()
plt.show()

######################################################################
# From the above, we can see that we have access to many quantities.
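######################################################################
# For instance, the available quantities can be listed by name (the
# ``available_quantities`` attribute was mentioned above):

print(fp_result.available_quantities)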
######################################################################
# Access the data:

print(fp_result.e2dnde.quantity.to("TeV cm-2 s-1"))

######################################################################
#

print(fp_result.dnde.quantity.shape)

######################################################################
#

print(fp_result.dnde.quantity[:, 0, 0])

######################################################################
# Or even extract an energy range:

fp_result.dnde.slice_by_idx({"energy": slice(3, 10)})

######################################################################
# A note on the internal representation
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# The result contains a reference spectral model, which defines the spectral shape.
# Typically, it is the best fit model:

print(fp_result.reference_model)

######################################################################
# `~gammapy.estimators.FluxPoints` are represented by the "norm" scaling factor with
# respect to the reference model:

fp_result.norm.plot()
plt.show()

######################################################################
# Dataset specific quantities ("counts like")
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# While the flux estimate and associated errors are common to all datasets,
# the result also stores some dataset specific quantities, which can be useful
# for debugging.
# Here we remind the user of the meaning of the forthcoming quantities:
#
# - ``counts``: predicted counts from the null hypothesis,
# - ``npred``: predicted number of counts from the best fit hypothesis,
# - ``npred_excess``: predicted number of excess counts from the best fit hypothesis.
#
# The `~gammapy.maps.region.ndmap.RegionNDMap` allows for plotting of multidimensional data
# as well, by specifying the primary ``axis_name``:

fp_result.counts.plot(axis_name="energy")
plt.show()

######################################################################
#

fp_result.npred.plot(axis_name="energy")
plt.show()

######################################################################
#

fp_result.npred_excess.plot(axis_name="energy")
plt.show()

######################################################################
# Table conversion
# ~~~~~~~~~~~~~~~~
#
# Flux points can be converted to tables:
#

table = fp_result.to_table(sed_type="flux", format="gadf-sed")
display(table)

######################################################################
#

table = fp_result.to_table(sed_type="likelihood", format="gadf-sed", formatted=True)
display(table)

######################################################################
# Common API
# ----------
# In `GAMMAPY_DATA` we have access to other `~gammapy.estimators.FluxPoints` objects
# which have been created utilising the above method. Here we read the PKS 2155-304 light curve
# and create a `~gammapy.estimators.FluxMaps` object and show the data structure of such objects.
# We emphasize that these follow a very similar structure.
#

######################################################################
# Load the light curve for PKS 2155-304 as a `~gammapy.estimators.FluxPoints` object.
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#

lightcurve = FluxPoints.read(
    "$GAMMAPY_DATA/estimators/pks2155_hess_lc/pks2155_hess_lc.fits", format="lightcurve"
)

display(lightcurve.available_quantities)

######################################################################
# Create a `~gammapy.estimators.FluxMaps` object through one of the estimators.
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#

dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz")

estimator = ExcessMapEstimator(correlation_radius="0.1 deg")

result = estimator.run(dataset)
display(result)

######################################################################
#

display(result.available_quantities)
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/fitting.py0000644000175100001770000004435014721316200021052 0ustar00runnerdocker"""
Fitting
=======

Learn how the model, dataset and fit Gammapy classes work together in a detailed
modeling and fitting use-case.

Prerequisites
-------------

-  Knowledge of spectral analysis to produce 1D On-Off datasets, see
   the :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial.
-  Reading of pre-computed datasets, see e.g. the
   :doc:`/tutorials/analysis-3d/analysis_mwl` tutorial.
-  General knowledge on statistics and optimization methods

Proposed approach
-----------------

This is a hands-on tutorial to `~gammapy.modeling`, showing how to
perform a fit in Gammapy. The emphasis here is on interfacing the
`Fit` class and inspecting the errors. To see an analysis example of
how datasets and models interact, see the :doc:`/tutorials/api/model_management` tutorial.
As an example, in this notebook, we are going to work with H.E.S.S. data of the
Crab Nebula and show in particular how to:

- perform a spectral analysis
- use different fitting backends
- access covariance matrix information and parameter errors
- compute likelihood profile
- compute confidence contours

See also: :doc:`/tutorials/api/models` and :ref:`modeling`.

The setup
---------
"""

from itertools import combinations
import numpy as np
from astropy import units as u
import matplotlib.pyplot as plt
from IPython.display import display
from gammapy.datasets import Datasets, SpectrumDatasetOnOff
from gammapy.modeling import Fit
from gammapy.modeling.models import LogParabolaSpectralModel, SkyModel

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup
from gammapy.visualization.utils import plot_contour_line

check_tutorials_setup()

######################################################################
# Model and dataset
# -----------------
#
# First we define the source model; here we need only a spectral model,
# for which we choose a log-parabola:
#

crab_spectrum = LogParabolaSpectralModel(
    amplitude=1e-11 / u.cm**2 / u.s / u.TeV,
    reference=1 * u.TeV,
    alpha=2.3,
    beta=0.2,
)

crab_spectrum.alpha.max = 3
crab_spectrum.alpha.min = 1
crab_model = SkyModel(spectral_model=crab_spectrum, name="crab")

######################################################################
# The data and background are read from pre-computed ON/OFF datasets of
# H.E.S.S. observations; for simplicity we stack them together. Then we set
# the model and fit range to the resulting dataset.
#

datasets = []

for obs_id in [23523, 23526]:
    dataset = SpectrumDatasetOnOff.read(
        f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits"
    )
    datasets.append(dataset)

dataset_hess = Datasets(datasets).stack_reduce(name="HESS")
datasets = Datasets(datasets=[dataset_hess])


# Set model and fit range
dataset_hess.models = crab_model
e_min = 0.66 * u.TeV
e_max = 30 * u.TeV
dataset_hess.mask_fit = dataset_hess.counts.geom.energy_mask(e_min, e_max)

######################################################################
# Fitting options
# ---------------
#
# First let’s create a `Fit` instance:
#

scipy_opts = {
    "method": "L-BFGS-B",
    "options": {"ftol": 1e-4, "gtol": 1e-05},
    "backend": "scipy",
}
fit_scipy = Fit(store_trace=True, optimize_opts=scipy_opts)

######################################################################
# By default the fit is performed using MINUIT; you can select alternative
# optimizers and set their options using the `optimize_opts` argument of
# the `Fit.run()` method. In addition we have specified to store the
# trace of parameter values of the fit.
#
# Note that, for now, covariance matrix and errors are computed only for
# the fitting with MINUIT. However, depending on the problem other
# optimizers can perform better, so sometimes it can be useful to run a
# pre-fit with alternative optimization methods.
#
# | For the “scipy” backend the available options are described in detail
#   here:
# | https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html
#

# %%time
result_scipy = fit_scipy.run(datasets)

######################################################################
# | For the “sherpa” backend you can choose the optimization algorithm
#   between method = {“simplex”, “levmar”, “moncar”, “gridsearch”}.
# | Those methods are described and compared in detail on
#   http://cxc.cfa.harvard.edu/sherpa/methods/index.html. The available
#   options of the optimization methods are described on the following
#   page https://cxc.cfa.harvard.edu/sherpa/methods/opt_methods.html
#

# %%time
sherpa_opts = {"method": "simplex", "ftol": 1e-3, "maxfev": int(1e4)}
fit_sherpa = Fit(store_trace=True, backend="sherpa", optimize_opts=sherpa_opts)
results_simplex = fit_sherpa.run(datasets)

######################################################################
# For the “minuit” backend see
# https://iminuit.readthedocs.io/en/latest/reference.html for a detailed
# description of the available options. If there is an entry
# ‘migrad_opts’, those options will be passed to
# `iminuit.Minuit.migrad `__.
# Additionally you can set the fit tolerance using the
# `tol `__
# option. The minimization will stop when the estimated distance to the
# minimum is less than 0.001*tol (by default tol=0.1). The
# `strategy `__
# option changes the speed and accuracy of the optimizer: 0 fast, 1
# default, 2 slow but accurate. If you want more reliable error estimates,
# you should run the final fit with strategy 2.
#

# %%time
fit = Fit(store_trace=True)
minuit_opts = {"tol": 0.001, "strategy": 1}
fit.backend = "minuit"
fit.optimize_opts = minuit_opts
result_minuit = fit.run(datasets)

######################################################################
# Fit quality assessment
# ----------------------
#
# There are various ways to check the convergence and quality of a fit.
# Among them:
#
# Refer to the automatically-generated results dictionary:
#

print(result_scipy)

# %%
print(results_simplex)

# %%
print(result_minuit)

######################################################################
# If the fit is performed with MINUIT you can print detailed information
# to check the convergence
#

print(result_minuit.minuit)

######################################################################
# Check the trace of the fit, e.g. in case the fit did not converge
# properly
#

display(result_minuit.trace)

######################################################################
# The fitted models are copied on the `~gammapy.modeling.FitResult` object.
# They can be inspected to check that the fitted values and errors
# for all parameters are reasonable, and no fitted parameter value is “too close”
# to - or even outside - its allowed min-max range
#

display(result_minuit.models.to_parameters_table())

######################################################################
# Plot fit statistic profiles for all fitted parameters, using
# `~gammapy.modeling.Fit.stat_profile`. For a good fit and error
# estimate each profile should be parabolic. The specification for each
# fit statistic profile can be changed on the
# `~gammapy.modeling.Parameter` object, which has `~gammapy.modeling.Parameter.scan_min`,
# `~gammapy.modeling.Parameter.scan_max`, `~gammapy.modeling.Parameter.scan_n_values` and `~gammapy.modeling.Parameter.scan_n_sigma` attributes.
#

total_stat = result_minuit.total_stat

fig, axes = plt.subplots(nrows=1, ncols=3, figsize=(14, 4))

for ax, par in zip(axes, crab_model.parameters.free_parameters):
    par.scan_n_values = 17
    idx = crab_model.parameters.index(par)
    name = crab_model.parameters_unique_names[idx]
    profile = fit.stat_profile(datasets=datasets, parameter=par)
    ax.plot(
        profile[f"{crab_model.name}.{name}_scan"], profile["stat_scan"] - total_stat
    )
    ax.set_xlabel(f"{par.name} [{par.unit}]")
    ax.set_ylabel("Delta TS")
    ax.set_title(f"{name}:\n {par.value:.1e} +- {par.error:.1e}")
plt.show()

######################################################################
# Inspect model residuals. Those can always be accessed using
# `~gammapy.datasets.Dataset.residuals()`. For more details, we refer here to the dedicated
# :doc:`/tutorials/analysis-3d/analysis_3d` (for `~gammapy.datasets.MapDataset` fitting) and
# :doc:`/tutorials/analysis-1d/spectral_analysis` (for `SpectrumDataset` fitting).
#

######################################################################
# Covariance and parameter errors
# --------------------------------
#
# After the fit the covariance matrix is attached to the models copy
# stored on the `~gammapy.modeling.FitResult` object.
# You can access it directly with:

print(result_minuit.models.covariance)

######################################################################
# And you can plot the total parameter correlation as well:
#

result_minuit.models.covariance.plot_correlation(figsize=(7, 5))
plt.show()

######################################################################
# The covariance information is also propagated to the individual models.
# Therefore, one can also get the error on a specific parameter by directly
# accessing the `~gammapy.modeling.Parameter.error` attribute:
#

print(crab_model.spectral_model.alpha.error)

######################################################################
# As an example, this step is needed to produce a butterfly plot showing
# the envelope of the model taking into account parameter uncertainties.
#

energy_bounds = [1, 10] * u.TeV
crab_spectrum.plot(energy_bounds=energy_bounds, energy_power=2)
ax = crab_spectrum.plot_error(energy_bounds=energy_bounds, energy_power=2)
plt.show()

######################################################################
# Confidence contours
# -------------------
#
# In most studies, one wishes to estimate parameter distributions using
# observed sample data. A 1-dimensional confidence interval gives an
# estimated range of values which is likely to include an unknown
# parameter. A confidence contour is a 2-dimensional generalization of a
# confidence interval, often represented as an ellipsoid around the
# best-fit value.
#
# Gammapy offers two ways of computing confidence contours, in the
# dedicated methods `~gammapy.modeling.Fit.stat_contour` and `~gammapy.modeling.Fit.stat_profile`. In
# the following sections we will describe them.
#

######################################################################
# An important point to keep in mind is: *what does a* :math:`N\sigma`
# *confidence contour really mean?* The answer is that it represents the points
# of the parameter space for which the model likelihood is :math:`N\sigma`
# above the minimum. But one always has to keep in mind that **1 standard
# deviation in two dimensions has a smaller coverage probability than
# 68%**, and similarly for all other levels. In particular, in
# 2 dimensions the probability enclosed by the :math:`N\sigma` confidence
# contour is :math:`P(N)=1-e^{-N^2/2}`.
#

######################################################################
# Computing contours using `~gammapy.modeling.Fit.stat_contour`
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# After the fit, MINUIT offers the possibility to compute the confidence
# contours. Gammapy provides an interface to this functionality through
# the `~gammapy.modeling.Fit` object using the `~gammapy.modeling.Fit.stat_contour` method. Here we define a
# function to automate the contour production for the different
# parameter and confidence levels (expressed in terms of sigma):
#


def make_contours(fit, datasets, model, params, npoints, sigmas):
    cts_sigma = []
    for sigma in sigmas:
        contours = dict()
        for par_1, par_2 in combinations(params, r=2):
            idx1, idx2 = model.parameters.index(par_1), model.parameters.index(par_2)
            name1 = model.parameters_unique_names[idx1]
            name2 = model.parameters_unique_names[idx2]
            contour = fit.stat_contour(
                datasets=datasets,
                x=model.parameters[par_1],
                y=model.parameters[par_2],
                numpoints=npoints,
                sigma=sigma,
            )
            contours[f"contour_{par_1}_{par_2}"] = {
                par_1: contour[f"{model.name}.{name1}"].tolist(),
                par_2: contour[f"{model.name}.{name2}"].tolist(),
            }
        cts_sigma.append(contours)
    return cts_sigma


######################################################################
# Now we can compute a few contours.
#

# %%time
params = ["alpha", "beta", "amplitude"]
sigmas = [1, 2]
cts_sigma = make_contours(
    fit=fit,
    datasets=datasets,
    model=crab_model,
    params=params,
    npoints=10,
    sigmas=sigmas,
)

#####################################################################
#
# Define the combinations of parameters to plot
param_pairs = list(combinations(params, r=2))

#####################################################################
#
# Labels for plotting
labels = {
    "amplitude": r"$\phi_0 \,/\,({\rm TeV}^{-1} \, {\rm cm}^{-2} {\rm s}^{-1})$",
    "alpha": r"$\alpha$",
    "beta": r"$\beta$",
}

#####################################################################
# Produce the confidence contours figures.
#

fig, axes = plt.subplots(1, 3, figsize=(10, 3))

colors = ["m", "b", "c"]

for (par_1, par_2), ax in zip(param_pairs, axes):
    for ks, sigma in enumerate(sigmas):
        contour = cts_sigma[ks][f"contour_{par_1}_{par_2}"]
        plot_contour_line(
            ax,
            contour[par_1],
            contour[par_2],
            lw=2.5,
            color=colors[ks],
            label=f"{sigmas[ks]}" + r"$\sigma$",
        )
    ax.set_xlabel(labels[par_1])
    ax.set_ylabel(labels[par_2])
plt.legend()
plt.tight_layout()

######################################################################
# Computing contours using `~gammapy.modeling.Fit.stat_surface`
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# This alternative method for the computation of confidence contours,
# although more time consuming than `~gammapy.modeling.Fit.stat_contour()`, is expected
# to be more stable. It consists of a generalization of
# `~gammapy.modeling.Fit.stat_profile()` to a 2-dimensional parameter space. The algorithm
# is very simple:
#
# - First, passing two arrays of parameter values, a 2-dimensional discrete
#   parameter space is defined.
# - For each node of the parameter space, the two parameters of interest are
#   frozen. This way, a likelihood value
#   (:math:`-2\mathrm{ln}\,\mathcal{L}`, actually) is computed, by either
#   freezing (default) or fitting all nuisance parameters.
# - Finally, a 2-dimensional surface of :math:`-2\mathrm{ln}(\mathcal{L})`
#   values is returned.
#
# Using that surface, one can easily compute a surface of
# :math:`TS = -2\Delta\mathrm{ln}(\mathcal{L})` and compute confidence
# contours.
#
# Let’s see it step by step.
#
# First of all, we can notice that this method is “backend-agnostic”,
# meaning that it can be run with MINUIT, sherpa or scipy as fitting
# tools. Here we will stick with MINUIT, which is the default choice:
#
# As an example, we can compute the confidence contour for the `alpha`
# and `beta` parameters of the `dataset_hess`. Here we define the
# parameter space:
#

result = result_minuit
par_alpha = crab_model.parameters["alpha"]
par_beta = crab_model.parameters["beta"]

par_alpha.scan_values = np.linspace(1.55, 2.7, 20)
par_beta.scan_values = np.linspace(-0.05, 0.55, 20)

######################################################################
# Then we run the algorithm, by choosing `reoptimize=False` for the sake
# of time saving. In real life applications, we strongly recommend using
# `reoptimize=True`, so that all free nuisance parameters will be fit at
# each grid node. This is the correct way, statistically speaking, of
# computing confidence contours, but is expected to be time consuming.
#

fit = Fit(backend="minuit", optimize_opts={"print_level": 0})
stat_surface = fit.stat_surface(
    datasets=datasets,
    x=par_alpha,
    y=par_beta,
    reoptimize=False,
)

######################################################################
# In order to easily inspect the results, we can convert the
# :math:`-2\mathrm{ln}(\mathcal{L})` surface to a surface of statistical
# significance (in units of Gaussian standard deviations from the surface
# minimum):
#

# Compute TS
TS = stat_surface["stat_scan"] - result.total_stat

# Compute the corresponding statistical significance surface
stat_surface = np.sqrt(TS.T)

######################################################################
# Notice that, as explained before, the :math:`1\sigma` contour obtained this
# way will not contain 68% of the probability, but rather
# :math:`P(1)=1-e^{-1/2}\approx 39\%` (see the formula given above):
#

# Compute the corresponding statistical significance surface
# p_value = 1 - st.chi2(df=1).cdf(TS)
# gaussian_sigmas = st.norm.isf(p_value / 2).T

######################################################################
# Finally, we can plot the surface values together with contours:
#

fig, ax = plt.subplots(figsize=(8, 6))
x_values = par_alpha.scan_values
y_values = par_beta.scan_values

# plot surface
im = ax.pcolormesh(x_values, y_values, stat_surface, shading="auto")
fig.colorbar(im, label="sqrt(TS)")
ax.set_xlabel(f"{par_alpha.name}")
ax.set_ylabel(f"{par_beta.name}")

# We choose to plot 1 and 2 sigma confidence contours
levels = [1, 2]
contours = ax.contour(x_values, y_values, stat_surface, levels=levels, colors="white")
ax.clabel(contours, fmt="%.0f $\\sigma$", inline=3, fontsize=15)

plt.show()

######################################################################
# Note that, if computed with `reoptimize=True`, this plot would be
# completely consistent with the third panel of the plot produced with
# `~gammapy.modeling.Fit.stat_contour` (try!).
#

######################################################################
# Finally, it is always worth remembering that confidence contours are
# approximations. In particular, when the parameter range boundaries are
# close to the contour lines, it is expected that the statistical meaning
# of the contours is not well defined. That’s why we advise always
# choosing a parameter space that contains the contours you’re interested
# in.
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/irfs.py0000644000175100001770000003331014721316200020343 0ustar00runnerdocker"""
Using Gammapy IRFs
==================

`gammapy.irf` contains classes for handling Instrument Response
Functions typically stored as multi-dimensional tables. For a list of
IRF classes internally supported, see
https://gamma-astro-data-formats.readthedocs.io/en/v0.3/irfs/full_enclosure/index.html

This tutorial is intended for advanced users typically creating IRFs.
""" import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord from astropy.visualization import quantity_support import matplotlib.pyplot as plt from gammapy.irf import ( IRF, Background3D, EffectiveAreaTable2D, EnergyDependentMultiGaussPSF, EnergyDispersion2D, ) from gammapy.irf.io import COMMON_IRF_HEADERS, IRF_DL3_HDU_SPECIFICATION from gammapy.makers.utils import ( make_edisp_kernel_map, make_map_exposure_true_energy, make_psf_map, ) from gammapy.maps import MapAxis, WcsGeom ###################################################################### # Inbuilt Gammapy IRFs # -------------------- # irf_filename = ( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) aeff = EffectiveAreaTable2D.read(irf_filename, hdu="EFFECTIVE AREA") print(aeff) ###################################################################### # We can see that the Effective Area Table is defined in terms of # `energy_true` and `offset` from the camera center # # To see the IRF axes binning, eg, offset print(aeff.axes["offset"]) # To get the IRF data print(aeff.data) # the aeff is evaluated at a given energy and offset print(aeff.evaluate(energy_true=[1, 10] * u.TeV, offset=[0.2, 2.5] * u.deg)) # The peek method gives a quick look into the IRF aeff.peek() ###################################################################### # Similarly, we can access other IRFs as well # psf = EnergyDependentMultiGaussPSF.read(irf_filename, hdu="Point Spread Function") bkg = Background3D.read(irf_filename, hdu="BACKGROUND") edisp = EnergyDispersion2D.read(irf_filename, hdu="ENERGY DISPERSION") print(bkg) ###################################################################### # Note that the background is given in FoV coordiantes with `fov_lon` # and `fov_lat` axis, and not in `offset` from the camera center. We # can also check the Field of view alignment. Currently, two possible # alignments are supported: alignment with the horizontal coordinate # system (ALTAZ) and alignment with the equatorial coordinate system # (RADEC). # print(bkg.fov_alignment) ###################################################################### # To evaluate the IRFs, pass the values for each axis. To know the default # interpolation scheme for the data # print(bkg.interp_kwargs) # Evaluate background # Note that evaluate functions support numpy-style broadcasting energy = [1, 10, 100, 1000] * u.TeV fov_lon = [1, 2] * u.deg fov_lat = [1, 2] * u.deg ev = bkg.evaluate( energy=energy.reshape(-1, 1, 1), fov_lat=fov_lat.reshape(1, -1, 1), fov_lon=fov_lon.reshape(1, 1, -1), ) print(ev) ###################################################################### # We can customise the interpolation scheme. Here, we adapt to fill # `nan` instead of `0` for extrapolated values # bkg.interp_kwargs["fill_value"] = np.nan ev2 = bkg.evaluate( energy=energy.reshape(-1, 1, 1), fov_lat=fov_lat.reshape(1, -1, 1), fov_lon=fov_lon.reshape(1, 1, -1), ) print(ev2) ###################################################################### # The interpolation scheme along each axis is taken from the `MapAxis` # specification. 
######################################################################
# The interpolation scheme along each axis is taken from the
# `~gammapy.maps.MapAxis` specification, e.g.:
#

print(
    "Interpolation scheme for energy axis is: ",
    bkg.axes["energy"].interp,
    "and for the fov_lon axis is: ",
    bkg.axes["fov_lon"].interp,
)

# Evaluate energy dispersion
ev = edisp.evaluate(energy_true=1 * u.TeV, offset=[0, 1] * u.deg, migra=[1, 1.2])
print(ev)

edisp.peek()

print(psf)

######################################################################
# The point spread function (PSF) for the CTA DC1 is stored as a
# combination of 3 Gaussians. Other PSFs, such as `PSF_TABLE`, and
# analytic expressions, such as the King function, are also supported.
# All PSF classes inherit from a common base `PSF` class.
#

print(psf.axes.names)

# To get the containment radius for a fixed fraction at a given position
print(
    psf.containment_radius(
        fraction=0.68, energy_true=1.0 * u.TeV, offset=[0.2, 4.0] * u.deg
    )
)

# Alternatively, to get the containment fraction at a given position
print(
    psf.containment(
        rad=0.05 * u.deg, energy_true=1.0 * u.TeV, offset=[0.2, 4.0] * u.deg
    )
)

######################################################################
# Support for Asymmetric IRFs
# ---------------------------
#

######################################################################
# While Gammapy does not have inbuilt classes for supporting asymmetric
# IRFs (except for `Background3D`), custom classes can be created. For
# this to work correctly with the `MapDatasetMaker`, only variations
# with `fov_lon` and `fov_lat` are allowed.
#
# The main idea is that the list of required axes must be correctly
# specified in the class definition.
#

######################################################################
# Effective Area
# ~~~~~~~~~~~~~~
#


class EffectiveArea3D(IRF):
    tag = "aeff_3d"
    required_axes = ["energy_true", "fov_lon", "fov_lat"]
    default_unit = u.m**2


energy_axis = MapAxis.from_energy_edges(
    [0.1, 0.3, 1.0, 3.0, 10.0] * u.TeV, name="energy_true"
)

nbin = 7
fov_lon_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lon")
fov_lat_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lat")

data = np.ones((4, 3, 3))
for i in range(1, 4):
    data[i] = data[i - 1] * 2.0
    data[i][-1] = data[i][0] * 1.2

aeff_3d = EffectiveArea3D(
    [energy_axis, fov_lon_axis, fov_lat_axis], data=data, unit=u.m**2
)
print(aeff_3d)

res = aeff_3d.evaluate(
    fov_lon=[-0.5, 0.8] * u.deg,
    fov_lat=[-0.5, 1.0] * u.deg,
    energy_true=[0.2, 8.0] * u.TeV,
)
print(res)

# To visualise at a given energy
# sphinx_gallery_thumbnail_number = 1
aeff_eval = aeff_3d.evaluate(energy_true=[1.0] * u.TeV)

ax = plt.subplot()
with quantity_support():
    caxes = ax.pcolormesh(
        fov_lat_axis.edges, fov_lon_axis.edges, aeff_eval.value.squeeze()
    )
fov_lat_axis.format_plot_xaxis(ax)
fov_lon_axis.format_plot_yaxis(ax)
ax.set_title("Asymmetric effective area")

######################################################################
# Unless specified otherwise, these IRFs are assumed to be in the RADEC
# frame:
#

print(aeff_3d.fov_alignment)

######################################################################
# Serialisation
# ~~~~~~~~~~~~~
#
# For serialisation, we need to add the class definition to the
# `IRF_DL3_HDU_SPECIFICATION` dictionary:
#

IRF_DL3_HDU_SPECIFICATION["aeff_3d"] = {
    "extname": "EFFECTIVE AREA",
    "column_name": "EFFAREA",
    "mandatory_keywords": {
        **COMMON_IRF_HEADERS,
        "HDUCLAS2": "EFF_AREA",
        "HDUCLAS3": "FULL-ENCLOSURE",  # added here to have HDUCLASN in order
        "HDUCLAS4": "AEFF_3D",
    },
}

aeff_3d.write("test_aeff3d.fits", overwrite=True)

aeff_new = EffectiveArea3D.read("test_aeff3d.fits")
print(aeff_new)
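######################################################################
# As a quick consistency check (a sketch, not part of the original
# recipe), we can compare an evaluation of the original and the re-read
# IRF at the same coordinates:
#

res_new = aeff_new.evaluate(
    fov_lon=[-0.5, 0.8] * u.deg,
    fov_lat=[-0.5, 1.0] * u.deg,
    energy_true=[0.2, 8.0] * u.TeV,
)
# The two evaluations are expected to agree to numerical precision
print(np.allclose(res.to_value("m2"), res_new.to_value("m2")))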
###################################################################### # Create exposure map # ~~~~~~~~~~~~~~~~~~~ # # DL4 data products can be created from these IRFs. # axis = MapAxis.from_energy_bounds(0.1 * u.TeV, 10 * u.TeV, 6, name="energy_true") pointing = SkyCoord(2, 1, unit="deg") geom = WcsGeom.create(npix=(4, 3), binsz=2, axes=[axis], skydir=pointing) print(geom) exposure_map = make_map_exposure_true_energy( pointing=pointing, livetime="42 h", aeff=aeff_3d, geom=geom ) exposure_map.plot_grid(add_cbar=True, figsize=(17, 7)) plt.show() ###################################################################### # Energy Dispersion # ----------------- # class EnergyDispersion3D(IRF): tag = "edisp_3d" required_axes = ["energy_true", "migra", "fov_lon", "fov_lat"] default_unit = u.one ###################################################################### # Note that most functions defined on the inbuilt IRF classes can be # easily generalised to higher dimensions. # # Make a test case energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "100 TeV", nbin=3, name="energy_true" ) migra_axis = MapAxis.from_bounds(0, 4, nbin=2, node_type="edges", name="migra") fov_lon_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lon") fov_lat_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lat") data = np.array( [ [ [ [5.00e-01, 5.10e-01, 5.20e-01], [6.00e-01, 6.10e-01, 6.30e-01], [6.00e-01, 6.00e-01, 6.00e-01], ], [ [2.0e-02, 2.0e-02, 2.0e-03], [2.0e-02, 2.0e-02, 2.0e-03], [2.0e-02, 2.0e-02, 2.0e-03], ], ], [ [ [5.00e-01, 5.10e-01, 5.20e-01], [6.00e-01, 6.10e-01, 6.30e-01], [6.00e-01, 6.00e-01, 6.00e-01], ], [ [2.0e-02, 2.0e-02, 2.0e-03], [2.0e-02, 2.0e-02, 2.0e-03], [2.0e-02, 2.0e-02, 2.0e-03], ], ], [ [ [5.00e-01, 5.10e-01, 5.20e-01], [6.00e-01, 6.10e-01, 6.30e-01], [6.00e-01, 6.00e-01, 6.00e-01], ], [ [3.0e-02, 6.0e-02, 2.0e-03], [3.0e-02, 5.0e-02, 2.0e-03], [3.0e-02, 4.0e-02, 2.0e-03], ], ], ] ) edisp3d = EnergyDispersion3D( [energy_axis_true, migra_axis, fov_lon_axis, fov_lat_axis], data=data ) print(edisp3d) energy = [1, 2] * u.TeV migra = np.array([0.98, 0.97, 0.7]) fov_lon = [0.1, 1.5] * u.deg fov_lat = [0.0, 0.3] * u.deg edisp_eval = edisp3d.evaluate( energy_true=energy.reshape(-1, 1, 1, 1), migra=migra.reshape(1, -1, 1, 1), fov_lon=fov_lon.reshape(1, 1, -1, 1), fov_lat=fov_lat.reshape(1, 1, 1, -1), ) print(edisp_eval[0]) ###################################################################### # Serialisation # ~~~~~~~~~~~~~ # IRF_DL3_HDU_SPECIFICATION["edisp_3d"] = { "extname": "ENERGY DISPERSION", "column_name": "MATRIX", "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "EDISP", "HDUCLAS3": "FULL-ENCLOSURE", # added here to have HDUCLASN in order "HDUCLAS4": "EDISP_3D", }, } edisp3d.write("test_edisp.fits", overwrite=True) edisp_new = EnergyDispersion3D.read("test_edisp.fits") edisp_new ###################################################################### # Create edisp kernel map # ~~~~~~~~~~~~~~~~~~~~~~~ # migra = MapAxis.from_edges(np.linspace(0.5, 1.5, 50), unit="", name="migra") etrue = MapAxis.from_energy_bounds(0.5, 2, 6, unit="TeV", name="energy_true") ereco = MapAxis.from_energy_bounds(0.5, 2, 3, unit="TeV", name="energy") geom = WcsGeom.create(10, binsz=0.5, axes=[ereco, etrue], skydir=pointing) edispmap = make_edisp_kernel_map(edisp3d, pointing, geom) edispmap.peek() plt.show() # print(edispmap.edisp_map.data[3][1][3]) ###################################################################### # PSF # --- # # There are two types of 
asymmetric PSFs that can be considered # - Asymmetry about the camera center: Such PSF Tables can be supported # - Asymmetry about the source position: These PSF models cannot be supported # correctly within the data reduction scheme at present # Also, analytic PSF models defined within the GADF scheme cannot be # directly generalised to the 3D case for use within Gammapy. # class PSF_assym(IRF): tag = "psf_assym" required_axes = ["energy_true", "fov_lon", "fov_lat", "rad"] default_unit = u.sr**-1 energy_axis = MapAxis.from_energy_edges( [0.1, 0.3, 1.0, 3.0, 10.0] * u.TeV, name="energy_true" ) nbin = 7 fov_lon_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lon") fov_lat_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lat") rad_axis = MapAxis.from_edges([0, 1, 2], unit="deg", name="rad") data = 0.1 * np.ones((4, 3, 3, 2)) for i in range(1, 4): data[i] = data[i - 1] * 1.5 psf_assym = PSF_assym( axes=[energy_axis, fov_lon_axis, fov_lat_axis, rad_axis], data=data, ) print(psf_assym) energy = [1, 2] * u.TeV rad = np.array([0.98, 0.97, 0.7]) * u.deg fov_lon = [0.1, 1.5] * u.deg fov_lat = [0.0, 0.3] * u.deg psf_assym.evaluate( energy_true=energy.reshape(-1, 1, 1, 1), rad=rad.reshape(1, -1, 1, 1), fov_lon=fov_lon.reshape(1, 1, -1, 1), fov_lat=fov_lat.reshape(1, 1, 1, -1), ) ###################################################################### # Serialisation # ~~~~~~~~~~~~~ # IRF_DL3_HDU_SPECIFICATION["psf_assym"] = { "extname": "POINT SPREAD FUNCTION", "column_name": "MATRIX", "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "PSF", "HDUCLAS3": "FULL-ENCLOSURE", "HDUCLAS4": "PSFnD", }, } psf_assym.write("test_psf.fits.gz", overwrite=True) psf_new = PSF_assym.read("test_psf.fits.gz") ###################################################################### # Create DL4 product - `PSFMap` # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # rad = MapAxis.from_edges(np.linspace(0.5, 3.0, 10), unit="deg", name="rad") etrue = MapAxis.from_energy_bounds(0.5, 2, 6, unit="TeV", name="energy_true") geom = WcsGeom.create(10, binsz=0.5, axes=[rad, etrue], skydir=pointing) psfmap = make_psf_map(psf_assym, pointing, geom) psfmap.peek() plt.show() ###################################################################### # Containers for asymmetric analytic PSFs are not supported at present. # # **NOTE**: Support for asymmetric IRFs is preliminary at the moment, and # will evolve depending on feedback. # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/makers.py0000644000175100001770000004012314721316200020662 0ustar00runnerdocker""" Makers - Data reduction ======================= Data reduction: from observations to binned datasets Introduction ------------ The `gammapy.makers` sub-package contains classes to perform data reduction tasks from DL3 data to binned datasets. In the data reduction step the DL3 data is prepared for modeling and fitting, by binning events into a counts map and interpolating the exposure, background, psf and energy dispersion on the chosen analysis geometry. 
Setup ----- """ import numpy as np from astropy import units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion import matplotlib.pyplot as plt from gammapy.data import DataStore from gammapy.datasets import Datasets, MapDataset, SpectrumDataset from gammapy.makers import ( DatasetsMaker, FoVBackgroundMaker, MapDatasetMaker, ReflectedRegionsBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, ) from gammapy.makers.utils import make_effective_livetime_map, make_observation_time_map from gammapy.maps import MapAxis, RegionGeom, WcsGeom ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Dataset # ------- # # The counts, exposure, background and IRF maps are bundled together in a # data structure named `~gammapy.datasets.MapDataset`. # # The first step of the data reduction is to create an empty dataset. A # `~gammapy.datasets.MapDataset` can be created from any `~gammapy.maps.WcsGeom` # object. This is illustrated in the following example: # energy_axis = MapAxis.from_bounds( 1, 10, nbin=11, name="energy", unit="TeV", interp="log" ) geom = WcsGeom.create( skydir=(83.63, 22.01), axes=[energy_axis], width=5 * u.deg, binsz=0.05 * u.deg, frame="icrs", ) dataset_empty = MapDataset.create(geom=geom) print(dataset_empty) ###################################################################### # It is possible to compute the instrument response functions with # different spatial and energy bins as compared to the counts and # background maps. For example, one can specify a true energy axis which # defines the energy binning of the IRFs: # energy_axis_true = MapAxis.from_bounds( 0.3, 10, nbin=31, name="energy_true", unit="TeV", interp="log" ) dataset_empty = MapDataset.create(geom=geom, energy_axis_true=energy_axis_true) ###################################################################### # For the detail of the other options available, you can always call the # help: # help(MapDataset.create) ###################################################################### # Once this empty “reference” dataset is defined, it can be filled with # observational data using the `~gammapy.makers.MapDatasetMaker`: # # get observation data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") obs = data_store.get_observations([23592])[0] # fill dataset maker = MapDatasetMaker() dataset = maker.run(dataset_empty, obs) print(dataset) dataset.counts.sum_over_axes().plot(stretch="sqrt", add_cbar=True) plt.show() ###################################################################### # The `~gammapy.makers.MapDatasetMaker` fills the corresponding `counts`, # `exposure`, `background`, `psf` and `edisp` map per observation. # The `~gammapy.makers.MapDatasetMaker` has a `selection` parameter, in case some of # the maps should not be computed. There is also a # `background_oversampling` parameter that defines the oversampling # factor in energy used to compute the background (default is None). # # .. _safe-data-range: # Safe data range handling # ------------------------ # # To exclude the data range from a `~gammapy.datasets.MapDataset`, that is associated with # high systematics on instrument response functions, a `~gammapy.datasets.MapDataset.mask_safe` # can be defined. 
The `~gammapy.datasets.MapDataset.mask_safe` is a `~gammapy.maps.Map` object
# with `bool` data type, which indicates for each pixel whether it should be included in
# the analysis. The convention is that a value of `True` or `1`
# includes the pixel, while a value of `False` or `0` excludes a
# pixel from the analysis. To compute safe data range masks according to
# certain criteria, Gammapy provides a `~gammapy.makers.SafeMaskMaker` class. The
# different criteria are given by the `methods` argument; the available
# options are:
#
# - aeff-default, uses the energy range specified in the DL3 data files,
#   if available.
# - aeff-max, the lower energy threshold is determined such that the
#   effective area is above a given percentage of its maximum
# - edisp-bias, the lower energy threshold is determined such that the
#   energy bias is below a given percentage
# - offset-max, the data beyond a given offset radius from the
#   observation center are excluded
# - bkg-peak, the energy threshold is defined as the upper edge of the
#   energy bin with the highest predicted background rate. This method
#   was introduced in the
#   `H.E.S.S. DL3 validation paper `__
#
# Note that currently some methods computing a safe energy range
# ("aeff-default", "aeff-max" and "edisp-bias") determine a true energy range and
# apply it to reconstructed energy, effectively neglecting the energy dispersion.
#
# Multiple methods can be combined. Here is an example:
#

safe_mask_maker = SafeMaskMaker(
    methods=["aeff-default", "offset-max"], offset_max="3 deg"
)

dataset = maker.run(dataset_empty, obs)
dataset = safe_mask_maker.run(dataset, obs)
print(dataset.mask_safe)

dataset.mask_safe.sum_over_axes().plot()
plt.show()

######################################################################
# The `~gammapy.makers.SafeMaskMaker` does not modify any data, but only defines the
# `~gammapy.datasets.MapDataset.mask_safe` attribute. This means that the safe data range
# can be defined and modified in between the data reduction and the stacking
# and fitting. For a joint-likelihood analysis of multiple observations
# the safe mask is applied to the counts and to the predicted number of counts
# map during fitting. This correctly accounts for contributions
# (spill-over) by the PSF from outside the field of view.
#
# Background estimation
# ---------------------
#
# The background computed by the `~gammapy.makers.MapDatasetMaker` gives the number of
# counts predicted by the background IRF of the observation. Because its
# actual normalization, or even its spectral shape, might be poorly
# constrained, it is necessary to correct it with the data themselves.
# This is the role of the background estimation Makers.
#
# FoV background
# ~~~~~~~~~~~~~~
#
# If the energy-dependent morphology of the background is well reproduced by the
# background model stored in the IRF, it might still be that its normalization
# is incorrect and that some spectral corrections are necessary. This is
# made possible thanks to the `~gammapy.makers.FoVBackgroundMaker`. This
# technique is recommended in most 3D data reductions. For more details
# and usage, see the :doc:`FoV background `.
#
# Here we are going to use a `~gammapy.makers.FoVBackgroundMaker` that
# will rescale the background model to the data, excluding the region where
# a known source is present. For more details on the way to create
# exclusion masks see the :doc:`mask maps ` notebook.
#

circle = CircleSkyRegion(center=geom.center_skydir, radius=0.2 * u.deg)
exclusion_mask = geom.region_mask([circle], inside=False)
fov_bkg_maker = FoVBackgroundMaker(method="scale", exclusion_mask=exclusion_mask)
dataset = fov_bkg_maker.run(dataset)

######################################################################
# Other background production methods are listed below.
#
# Ring background
# ~~~~~~~~~~~~~~~
#
# If the background model does not reproduce the morphology well, a
# classical approach consists of applying local corrections by smoothing
# the data with a ring kernel. This makes it possible to build a set of OFF counts
# taking into account the imperfect knowledge of the background. This is
# implemented in the `~gammapy.makers.RingBackgroundMaker`, which
# transforms the dataset into a `~gammapy.datasets.MapDatasetOnOff`. This technique is
# mostly used for imaging, and should not be applied for 3D modeling and
# fitting.
#
# For more details and usage, see
# :doc:`Ring background `
#
# Reflected regions background
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# In the absence of a solid background model, a classical technique in
# Cherenkov astronomy for 1D spectral analysis is to estimate the
# background in a number of OFF regions. When the background can be safely
# estimated as radially symmetric w.r.t. the pointing direction, one can
# apply the reflected regions background technique. This is implemented in
# the `~gammapy.makers.ReflectedRegionsBackgroundMaker`, which transforms
# a `~gammapy.datasets.SpectrumDataset` into a `~gammapy.datasets.SpectrumDatasetOnOff`.
# This method is only used for 1D spectral analysis.
#
# For more details and usage, see
# the :doc:`Reflected background `
#
# Data reduction loop
# -------------------
#
# The data reduction steps can be combined in a single loop to run a full
# data reduction chain. For this the `~gammapy.makers.MapDatasetMaker` is run first, and
# the output dataset is then passed on to the next maker step. Finally, the
# dataset per observation is stacked into a larger map.
#

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
observations = data_store.get_observations([23523, 23592, 23526, 23559])

energy_axis = MapAxis.from_bounds(
    1, 10, nbin=11, name="energy", unit="TeV", interp="log"
)
geom = WcsGeom.create(skydir=(83.63, 22.01), axes=[energy_axis], width=5, binsz=0.02)

dataset_maker = MapDatasetMaker()
safe_mask_maker = SafeMaskMaker(
    methods=["aeff-default", "offset-max"], offset_max="3 deg"
)

stacked = MapDataset.create(geom)

for obs in observations:
    local_dataset = stacked.cutout(obs.get_pointing_icrs(obs.tmid), width="6 deg")
    dataset = dataset_maker.run(local_dataset, obs)
    dataset = safe_mask_maker.run(dataset, obs)
    dataset = fov_bkg_maker.run(dataset)
    stacked.stack(dataset)

print(stacked)

######################################################################
# To maintain good performance it is always recommended to do a cutout of
# the `~gammapy.datasets.MapDataset` as shown above. In case you want to increase the
# offset cut later, you can also choose a larger width of the cutout than
# `2 * offset_max`.
#
# Note that we stack the individual `~gammapy.datasets.MapDataset` objects, which are
# computed per observation, into a larger dataset. During the stacking the safe data
# range mask (`~gammapy.datasets.MapDataset.mask_safe`) is applied by setting data outside
# of it to zero; then the data is added to the larger map dataset. To stack multiple
# observations, the larger dataset must be created first.
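######################################################################
# As a quick sanity check (a sketch; we assume here that
# `~gammapy.datasets.MapDataset.info_dict` exposes ``counts`` and
# ``livetime`` entries), one can inspect the summed quantities of the
# stacked dataset:
#

info = stacked.info_dict()
print("Stacked counts: ", info["counts"])
print("Stacked livetime: ", info["livetime"])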
######################################################################
# The data reduction loop shown above can be done through the
# `~gammapy.makers.DatasetsMaker` class, which takes a list of makers as argument. **Note
# that the order of the makers list is important as it determines their
# execution order.** Moreover the `stack_datasets` option controls whether
# the output datasets are stacked or not, and the `n_jobs` option
# allows running with multiple processes.
#

global_dataset = MapDataset.create(geom)
makers = [dataset_maker, safe_mask_maker, fov_bkg_maker]  # the order matters
datasets_maker = DatasetsMaker(makers, stack_datasets=False, n_jobs=1)
datasets = datasets_maker.run(global_dataset, observations)
print(datasets)

######################################################################
# Spectrum dataset
# ----------------
#
# The spectrum datasets represent 1D spectra along an energy axis within a
# given ON region. The `~gammapy.datasets.SpectrumDataset` contains a counts spectrum and
# a background model. The `~gammapy.datasets.SpectrumDatasetOnOff` contains ON and OFF
# count spectra; the background is implicitly modeled via the OFF counts
# spectrum.
#
# The `~gammapy.makers.SpectrumDatasetMaker` makes a spectrum dataset for a single
# observation. In that case the IRFs and background are computed at a
# single fixed offset, which is recommended only for point sources.
#
# Here is an example of a data reduction loop to create
# `~gammapy.datasets.SpectrumDatasetOnOff` datasets:
#

# The ON region is given by the CircleSkyRegion previously defined
geom = RegionGeom.create(region=circle, axes=[energy_axis])
exclusion_mask_2d = exclusion_mask.reduce_over_axes(np.logical_or, keepdims=False)

spectrum_dataset_empty = SpectrumDataset.create(
    geom=geom, energy_axis_true=energy_axis_true
)

spectrum_dataset_maker = SpectrumDatasetMaker(
    containment_correction=False, selection=["counts", "exposure", "edisp"]
)
reflected_bkg_maker = ReflectedRegionsBackgroundMaker(exclusion_mask=exclusion_mask_2d)
safe_mask_masker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10)

datasets = Datasets()

for observation in observations:
    dataset = spectrum_dataset_maker.run(
        spectrum_dataset_empty.copy(name=f"obs-{observation.obs_id}"),
        observation,
    )
    dataset_on_off = reflected_bkg_maker.run(dataset, observation)
    dataset_on_off = safe_mask_masker.run(dataset_on_off, observation)
    datasets.append(dataset_on_off)

print(datasets)
plt.show()

######################################################################
# Observation duration and effective livetime
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# It can often be useful to know the total number of hours spent
# in the given field of view (without correcting for the acceptance
# variation). This can be computed using `~gammapy.makers.utils.make_observation_time_map`
# as shown below.
#

# Get the observations
obs_id = data_store.obs_table["OBS_ID"][data_store.obs_table["OBJECT"] == "MSH 15-5-02"]
observations = data_store.get_observations(obs_id)
print("No. of observations: ", len(observations))
# Define an energy range
energy_min = 100 * u.GeV
energy_max = 10.0 * u.TeV

# Define an offset cut (the camera field of view)
offset_max = 2.5 * u.deg

# Define the geom
source_pos = SkyCoord(228.32, -59.08, unit="deg")
energy_axis_true = MapAxis.from_energy_bounds(
    energy_min, energy_max, nbin=2, name="energy_true"
)
geom = WcsGeom.create(
    skydir=source_pos,
    binsz=0.02,
    width=(6, 6),
    frame="icrs",
    proj="CAR",
    axes=[energy_axis_true],
)

total_obstime = make_observation_time_map(observations, geom, offset_max=offset_max)

plt.figure(figsize=(5, 5))
ax = total_obstime.plot(add_cbar=True)
# Add the pointing position on top
for obs in observations:
    ax.plot(
        obs.get_pointing_icrs(obs.tmid).to_pixel(wcs=ax.wcs)[0],
        obs.get_pointing_icrs(obs.tmid).to_pixel(wcs=ax.wcs)[1],
        "+",
        color="black",
    )
ax.set_title("Total observation time")
plt.show()

######################################################################
# As the acceptance of IACT cameras varies within the field of
# view, it can also be interesting to plot the on-axis-equivalent
# number of hours.
#

effective_livetime = make_effective_livetime_map(
    observations, geom, offset_max=offset_max
)

axs = effective_livetime.plot_grid(add_cbar=True)
# Add the pointing position on top
for ax in axs:
    for obs in observations:
        ax.plot(
            obs.get_pointing_icrs(obs.tmid).to_pixel(wcs=ax.wcs)[0],
            obs.get_pointing_icrs(obs.tmid).to_pixel(wcs=ax.wcs)[1],
            "+",
            color="black",
        )
plt.show()

######################################################################
# To get the value of the observation time at a particular position,
# use ``get_by_coord``:

obs_time_src = total_obstime.get_by_coord(source_pos)
effective_times_src = effective_livetime.get_by_coord(
    (source_pos, energy_axis_true.center)
)

print(f"Time spent on position {source_pos}")
print(f"Total observation time: {obs_time_src} * {total_obstime.unit}")
print(
    f"Effective livetime at {energy_axis_true.center[0]}: {effective_times_src[0]} * {effective_livetime.unit}"
)
print(
    f"Effective livetime at {energy_axis_true.center[1]}: {effective_times_src[1]} * {effective_livetime.unit}"
)
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/examples/tutorials/api/maps.py0000644000175100001770000011067114721316200020346 0ustar00runnerdocker"""
Maps
====

A thorough tutorial to work with WCS maps.

.. figure:: ../../_static/gammapy_maps.png
   :alt: Gammapy Maps Illustration

   Gammapy Maps Illustration

Introduction
------------

The `~gammapy.maps` submodule contains classes for representing
pixelised data on the sky with an arbitrary number of non-spatial
dimensions such as energy, time, event class or any possible
user-defined dimension (illustrated in the image above). The main
`Map` data structure features a uniform API for
`WCS `__ as well as
`HEALPix `__ based images. The API
also generalizes simple image-based operations such as smoothing,
interpolation and reprojection to the arbitrary extra dimensions and
makes working with (2 + N)-dimensional hypercubes as easy as working
with a simple 2D image. Further information is also provided on the
`~gammapy.maps` docs page.

In the following introduction we will learn all the basics of working
with WCS-based maps. HEALPix-based maps will be covered in a future
tutorial. Make sure you have worked through the :doc:`Gammapy overview
`, because a solid knowledge of working with `SkyCoord`
and `Quantity` objects as well as `Numpy `__ is
required for this tutorial.
This notebook is rather lengthy, but getting to know the `Map` data
structure in detail is essential for working with Gammapy and will allow
you to fulfill complex analysis tasks with very little and simple code in
the future!

"""

######################################################################
# Setup
# -----
#

import os

# %matplotlib inline
import numpy as np
from astropy import units as u
from astropy.convolution import convolve
from astropy.coordinates import SkyCoord
from astropy.io import fits
from astropy.table import Table
from astropy.time import Time
import matplotlib.pyplot as plt
from IPython.display import display
from gammapy.data import EventList
from gammapy.maps import (
    LabelMapAxis,
    Map,
    MapAxes,
    MapAxis,
    TimeMapAxis,
    WcsGeom,
    WcsNDMap,
)

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup

check_tutorials_setup()

######################################################################
# Creating WCS Maps
# -----------------
#
# Using Factory Methods
# ~~~~~~~~~~~~~~~~~~~~~
#
# Maps are most easily created using the `~gammapy.maps.Map.create`
# factory method:
#

m_allsky = Map.create()

######################################################################
# Calling `~gammapy.maps.Map.create` without any further arguments creates by
# default an allsky WCS map using a CAR projection, ICRS coordinates and a
# pixel size of 1 deg. This can be easily checked by printing the
# `~gammapy.maps.Map.geom` attribute of the map:
#

print(m_allsky.geom)

######################################################################
# The `~gammapy.maps.Map.geom` attribute is a `~gammapy.maps.Geom` object that defines the basic
# geometry of the map, such as the size of the pixels, the width and height of the
# image, the coordinate system etc., but we will learn more about this object
# later.
#
# Besides the ``.geom`` attribute the map also has a ``.data`` attribute,
# which is just a plain `~numpy.ndarray` that stores the data associated
# with this map:
#

print(m_allsky.data)

######################################################################
# By default maps are filled with zeros.
#
# The ``map_type`` argument can be used to control the pixelization scheme
# (WCS or HPX).
#

position = SkyCoord(0.0, 5.0, frame="galactic", unit="deg")

# Create a WCS Map
m_wcs = Map.create(binsz=0.1, map_type="wcs", skydir=position, width=10.0)

# Create a HPX Map
m_hpx = Map.create(binsz=0.1, map_type="hpx", skydir=position, width=10.0)

######################################################################
# Here is an example that creates a WCS map centered on the Galactic
# center, this time using Galactic coordinates:
#

skydir = SkyCoord(0, 0, frame="galactic", unit="deg")

m_gc = Map.create(
    binsz=0.02, width=(10, 5), skydir=skydir, frame="galactic", proj="TAN"
)
print(m_gc.geom)

######################################################################
# In addition we have defined a TAN projection, a pixel size of ``0.02``
# deg and a map width of ``10 deg x 5 deg``. The `width` argument
# also takes a scalar value instead of a tuple, which is interpreted as both
# the width and height of the map, so that a square map is created.
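######################################################################
# For instance (a minimal sketch of the scalar ``width`` case), the
# following creates a square ``5 deg x 5 deg`` map with the same center
# and projection as above:
#

m_gc_square = Map.create(
    binsz=0.02, width=5.0, skydir=skydir, frame="galactic", proj="TAN"
)
print(m_gc_square.geom)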
######################################################################
# Creating from a Map Geometry
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# As we have seen in the first examples, the `~gammapy.maps.Map` object couples the
# data (stored as a `~numpy.ndarray`) with a `~gammapy.maps.Geom` object. The
# `~gammapy.maps.Geom` object can be seen as a generalization of an
# `astropy.wcs.WCS` object, providing the information on how the data
# maps to physical coordinate systems. In some cases, e.g. when creating
# many maps with the same WCS geometry, it can be advantageous to first
# create the map geometry independent of the map object itself:
#

wcs_geom = WcsGeom.create(binsz=0.02, width=(10, 5), skydir=(0, 0), frame="galactic")

######################################################################
# And then create the map objects from the ``wcs_geom`` geometry
# specification:
#

maps = {}
for name in ["counts", "background"]:
    maps[name] = Map.from_geom(wcs_geom)

######################################################################
# The `~gammapy.maps.Geom` object also has a few helpful methods. E.g. we can check
# whether a given position on the sky is contained in the map geometry:
#

# define the position of the Galactic center and anti-center
positions = SkyCoord([0, 180], [0, 0], frame="galactic", unit="deg")
wcs_geom.contains(positions)

######################################################################
# Or get the image center of the map:
#

print(wcs_geom.center_skydir)

######################################################################
# Or we can also retrieve the solid angle per pixel of the map:
#

print(wcs_geom.solid_angle())

######################################################################
# Adding Non-Spatial Axes
# ~~~~~~~~~~~~~~~~~~~~~~~
#
# In many analysis scenarios we would like to add extra dimensions to the
# maps to study e.g. the energy or time dependence of the data. Those
# non-spatial dimensions are handled with the `~gammapy.maps.MapAxis` object. Let us
# first define an energy axis with 4 bins:
#

energy_axis = MapAxis.from_bounds(
    1, 100, nbin=4, unit="TeV", name="energy", interp="log"
)
print(energy_axis)

######################################################################
# Here ``interp='log'`` specifies that a logarithmic spacing is used
# between the bins, with bin edges equivalent to ``np.logspace(0, 2, 5)``.
# We can now pass this `~gammapy.maps.MapAxis` object to `~gammapy.maps.Map.create()` using the
# ``axes=`` argument:
#

m_cube = Map.create(binsz=0.02, width=(10, 5), frame="galactic", axes=[energy_axis])
print(m_cube.geom)

######################################################################
# Now we see that besides ``lon`` and ``lat`` the map has an additional
# axis named ``energy`` with 4 bins. The total dimension of the map is now
# ``ndim=3``.
#
# We can also add further axes by passing a list of `~gammapy.maps.MapAxis` objects.
# To demonstrate this we create a time axis with linearly spaced bins and
# pass both axes to `Map.create()`:
#

time_axis = MapAxis.from_bounds(0, 24, nbin=24, unit="hour", name="time", interp="lin")

m_4d = Map.create(
    binsz=0.02, width=(10, 5), frame="galactic", axes=[energy_axis, time_axis]
)
print(m_4d.geom)
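######################################################################
# As an aside (a minimal sketch): for energy axes there is also a
# dedicated convenience constructor, which sets up the logarithmic
# interpolation and unit handling for you:
#

energy_axis_alt = MapAxis.from_energy_bounds("1 TeV", "100 TeV", nbin=4, name="energy")
print(energy_axis_alt)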
######################################################################
# The `~gammapy.maps.MapAxis` object internally stores the coordinates or "position
# values" associated with every map axis bin or "node". We distinguish
# between two node types: ``"edges"`` and ``"center"``. The node type
# ``"edges"`` (which is also the default) specifies that the data
# associated with this axis is integrated between the edges of the bin
# (e.g. counts data). The node type ``"center"`` specifies that the data is
# given at the center of the bin (e.g. exposure or differential fluxes).
#
# The edges of the bins can be checked with the `~gammapy.maps.MapAxis.edges` attribute:
#

print(energy_axis.edges)

######################################################################
# The numbers are given in the units we specified above, which can be
# checked again with:
#

print(energy_axis.unit)

######################################################################
# The centers of the axis bins can be checked with the `~gammapy.maps.MapAxis.center`
# attribute:
#

print(energy_axis.center)

######################################################################
# Adding Non-contiguous axes
# ~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Non-spatial map axes can also be handled through two other objects known as the `~gammapy.maps.TimeMapAxis`
# and the `~gammapy.maps.LabelMapAxis`.
#

######################################################################
# TimeMapAxis
# ^^^^^^^^^^^
#
# The `~gammapy.maps.TimeMapAxis` object provides an axis for non-adjacent
# time intervals.
#

time_map_axis = TimeMapAxis(
    edges_min=[1, 5, 10, 15] * u.day,
    edges_max=[2, 7, 13, 18] * u.day,
    reference_time=Time("2020-03-19"),
)
print(time_map_axis)

######################################################################
# This ``time_map_axis`` can then be utilised in a similar way to a regular
# `~gammapy.maps.MapAxis` to create a `~gammapy.maps.Map`.
#

map_4d = Map.create(
    binsz=0.02, width=(10, 5), frame="galactic", axes=[energy_axis, time_map_axis]
)
print(map_4d.geom)

######################################################################
# It is possible to utilise the `~gammapy.maps.TimeMapAxis.slice` method
# to create a new `~gammapy.maps.TimeMapAxis`. Here we slice
# between the first and third bins to extract the subsection of the axis
# between indices 0 and 2.

print(time_map_axis.slice([0, 2]))

######################################################################
# It is also possible to `~gammapy.maps.TimeMapAxis.squash` the axis,
# which squashes the existing axis into one bin. This creates a new axis
# between the extreme edges of the initial axis.

print(time_map_axis.squash())

######################################################################
# The `~gammapy.maps.TimeMapAxis.is_contiguous` property returns a boolean
# which indicates whether the `~gammapy.maps.TimeMapAxis` is contiguous or not.

print(time_map_axis.is_contiguous)

######################################################################
# As we have a non-contiguous axis we can print the array of bin edges for both
# the minimum axis edges (`~gammapy.maps.TimeMapAxis.edges_min`) and the maximum axis
# edges (`~gammapy.maps.TimeMapAxis.edges_max`).

print(time_map_axis.edges_min)
print(time_map_axis.edges_max)

######################################################################
# Next, we use the `~gammapy.maps.TimeMapAxis.to_contiguous` functionality to
# create a contiguous axis and expose `~gammapy.maps.TimeMapAxis.edges`. This
# method returns a `~astropy.units.Quantity` with respect to the reference time.
time_map_axis_contiguous = time_map_axis.to_contiguous() print(time_map_axis_contiguous.is_contiguous) print(time_map_axis_contiguous.edges) ###################################################################### # The `~gammapy.maps.TimeMapAxis.time_edges` will return the `~astropy.time.Time` object directly print(time_map_axis_contiguous.time_edges) ###################################################################### # `~gammapy.maps.TimeMapAxis` also has both functionalities of # `~gammapy.maps.TimeMapAxis.coord_to_pix` and `~gammapy.maps.TimeMapAxis.coord_to_idx`. # The `~gammapy.maps.TimeMapAxis.coord_to_idx` attribute will give the index of the # ``time`` specified, similarly for `~gammapy.maps.TimeMapAxis.coord_to_pix` which returns # the pixel. A linear interpolation is assumed. # # Start by choosing a time which we know is within the `~gammapy.maps.TimeMapAxis` and see the results. time = Time(time_map_axis.time_max.mjd[0], format="mjd") print(time_map_axis.coord_to_pix(time)) print(time_map_axis.coord_to_idx(time)) ###################################################################### # This functionality can also be used with an array of `~astropy.time.Time` values. times = Time(time_map_axis.time_max.mjd, format="mjd") print(time_map_axis.coord_to_pix(times)) print(time_map_axis.coord_to_idx(times)) ###################################################################### # Note here we take a `~astropy.time.Time` which is outside the edges. # A linear interpolation is assumed for both methods, therefore for a time # outside the ``time_map_axis`` there is no extrapolation and -1 is returned. # # Note: due to this, the `~gammapy.maps.TimeMapAxis.coord_to_pix` method will # return ``nan`` and the `~gammapy.maps.TimeMapAxis.coord_to_idx` method returns -1. print(time_map_axis.coord_to_pix(Time(time.mjd + 1, format="mjd"))) print(time_map_axis.coord_to_idx(Time(time.mjd + 1, format="mjd"))) ###################################################################### # LabelMapAxis # ^^^^^^^^^^^^ # # The `~gammapy.maps.LabelMapAxis` object allows for handling of labels for map axes. # It provides an axis for non-numeric entries. # label_axis = LabelMapAxis( labels=["dataset-1", "dataset-2", "dataset-3"], name="dataset" ) print(label_axis) ###################################################################### # The labels can be checked using the `~gammapy.maps.LabelMapAxis.center` attribute: print(label_axis.center) ###################################################################### # To obtain the position of the label, one can utilise the `~gammapy.maps.LabelMapAxis.coord_to_pix` attribute print(label_axis.coord_to_pix(["dataset-3"])) ###################################################################### # To adapt and create new axes the following attributes can be utilised: # `~gammapy.maps.LabelMapAxis.concatenate`, `~gammapy.maps.LabelMapAxis.slice` and # `~gammapy.maps.LabelMapAxis.squash`. # # Combining two different `~gammapy.maps.LabelMapAxis` is done in the following way: label_axis2 = LabelMapAxis(labels=["dataset-a", "dataset-b"], name="dataset") print(label_axis.concatenate(label_axis2)) ###################################################################### # A new `~gammapy.maps.LabelMapAxis` can be created by slicing an already existing one. # Here we are slicing between the second and third bins to extract the subsection. 
print(label_axis.slice([1, 2]))

######################################################################
# A new axis object can be created by squashing the axis into a single bin.

print(label_axis.squash())

######################################################################
# Mixing the three previous axis types (`~gammapy.maps.MapAxis`,
# `~gammapy.maps.TimeMapAxis` and `~gammapy.maps.LabelMapAxis`)
# would be done like so:
#

axes = MapAxes(axes=[energy_axis, time_map_axis, label_axis])
hdu = axes.to_table_hdu(format="gadf")

table = Table.read(hdu)
display(table)

######################################################################
# Reading and Writing
# -------------------
#
# Gammapy `~gammapy.maps.Map` objects are serialized using the Flexible Image
# Transport System (FITS) format. Depending on the pixelization scheme (HEALPix
# or WCS) and the presence of non-spatial dimensions, the actual convention to
# write the FITS file differs. By default Gammapy uses a generic
# convention named ``"gadf"``, which supports WCS and HEALPix formats as
# well as an arbitrary number of non-spatial axes. The convention is
# documented in detail on the `Gamma Astro Data
# Formats `__
# page.
#
# Other conventions required by specific software (e.g. the Fermi Science
# Tools) are supported as well. At the moment those are the following:
#
# - ``"fgst-ccube"``: Fermi counts cube format.
# - ``"fgst-ltcube"``: Fermi livetime cube format.
# - ``"fgst-bexpcube"``: Fermi exposure cube format.
# - ``"fgst-template"``: Fermi Galactic diffuse and source template format.
# - ``"fgst-srcmap"`` and ``"fgst-srcmap-sparse"``: Fermi source map and
#   sparse source map format.
#
# The conventions listed above only support an additional energy axis.
#
# Reading Maps
# ~~~~~~~~~~~~
#
# Reading FITS files is mainly exposed via the `~gammapy.maps.Map.read()` method. Let
# us take a look at a first example:
#

filename = "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts.fits.gz"
m_3fhl_gc = Map.read(filename)
print(m_3fhl_gc)

######################################################################
# If the ``map_type`` argument is not given when calling ``read``, a map
# object will be instantiated with the pixelization of the input HDU.
#
# By default ``Map.read()`` will try to find the first valid data HDU in
# the file and read the data from there. If multiple HDUs are present
# in the FITS file, the desired one can be chosen with the additional
# `hdu=` argument:
#

m_3fhl_gc = Map.read(filename, hdu="PRIMARY")
print(m_3fhl_gc)

######################################################################
# In rare cases, e.g. when the FITS file is not valid or metadata is
# missing from the header, it can be necessary to modify the header of a
# certain HDU before creating the `Map` object. In this case we can use
# `astropy.io.fits` directly to read the FITS file:
#

filename = os.environ["GAMMAPY_DATA"] + "/fermi-3fhl-gc/fermi-3fhl-gc-exposure.fits.gz"
hdulist = fits.open(filename)
print(hdulist.info())

######################################################################
# And then modify the header keyword and use `Map.from_hdulist()` to
# create the `Map` object afterwards:
#

hdulist["PRIMARY"].header["BUNIT"] = "cm2 s"
print(Map.from_hdulist(hdulist=hdulist))

######################################################################
# Writing Maps
# ~~~~~~~~~~~~
#
# Writing FITS files to disk is done via the `Map.write()` method.
# Here is a first example:
#

m_cube.write("example_cube.fits", overwrite=True)

######################################################################
# By default Gammapy does not overwrite files. In this example we set
# `overwrite=True` in case the cell gets executed multiple times. Now we
# can read back the cube from disk using `Map.read()`:
#

m_cube = Map.read("example_cube.fits")
print(m_cube)

######################################################################
# We can also choose a different FITS convention to write the example cube
# in a format compatible with the Fermi Galactic diffuse background model:
#

m_cube.write("example_cube_fgst.fits", format="fgst-template", overwrite=True)

######################################################################
# To understand the generic ``gadf`` convention a little better, we use
# `Map.to_hdulist()` to generate a list of FITS HDUs first:
#

hdulist = m_4d.to_hdulist(format="gadf")
print(hdulist.info())

######################################################################
# As we can see, the `HDUList` object contains two HDUs. The first one,
# named `PRIMARY`, contains the data array with a shape corresponding to
# our data and the WCS information stored in the header:
#

print(hdulist["PRIMARY"].header)

######################################################################
# The second HDU is a `BinTableHDU` named `PRIMARY_BANDS`, which contains the
# information on the non-spatial axes, such as name, order, unit, min, max
# and center values of the axis bins. We use an `astropy.table.Table` to
# show the information:
#

print(Table.read(hdulist["PRIMARY_BANDS"]))

######################################################################
# Maps can be serialized to a sparse data format by calling write with
# `sparse=True`. This will write all non-zero pixels in the map to a
# data table appropriate to the pixelization scheme.
#

m = Map.create(binsz=0.1, map_type="wcs", width=10.0)
m.write("file.fits", hdu="IMAGE", sparse=True, overwrite=True)
m = Map.read("file.fits", hdu="IMAGE", map_type="wcs")

######################################################################
# Accessing Data
# --------------
#
# How to get data values
# ~~~~~~~~~~~~~~~~~~~~~~
#
# All map objects have a set of accessor methods, which can be used to
# access or update the contents of the map irrespective of its underlying
# representation. Those accessor methods accept as their first argument a
# coordinate `tuple` containing scalars, a `list`, or a `numpy.ndarray`,
# with one tuple element for each dimension. Some methods additionally
# accept a `dict` or `MapCoord` argument, both of which allow assigning
# coordinates by axis name.
#
# Let us first begin with the `~gammapy.maps.Map.get_by_idx()` method, which accepts a
# tuple of indices. The order of the indices corresponds to the axis order
# of the map:
#

print(m_gc.get_by_idx((50, 30)))

######################################################################
# **Important:** Gammapy uses a reversed index order in the map API with
# the longitude axes first. To achieve the same by directly indexing into
# the numpy array, we have to call:
#

print(m_gc.data[([30], [50])])

######################################################################
# To check the order of the axes you can always print the ``.geom``
# attribute:
#

print(m_gc.geom)

######################################################################
# To access values directly by sky coordinates we can use the
# `~gammapy.maps.Map.get_by_coord()` method.
This time we pass in a `dict`, specifying
# the axis names corresponding to the given coordinates:
#

print(m_gc.get_by_coord({"lon": [0, 180], "lat": [0, 0]}))

######################################################################
# The units of the coordinates are assumed to be degrees in the
# coordinate system used by the map. If the coordinates do not correspond
# to the exact pixel center, the value of the nearest pixel center will be
# returned. For positions outside the map geometry `np.nan` is returned.
#
# The coordinate or idx arrays follow normal `Numpy broadcasting
# rules `__. So the following works as expected:
#

lons = np.linspace(-4, 4, 10)
print(m_gc.get_by_coord({"lon": lons, "lat": 0}))

######################################################################
# Or, as an even more advanced example, we can provide `lats` as a column
# vector, and broadcasting to a 2D result array will be applied:
#

lons = np.linspace(-4, 4, 8)
lats = np.linspace(-4, 4, 8).reshape(-1, 1)
print(m_gc.get_by_coord({"lon": lons, "lat": lats}))

######################################################################
# Indexing and Slicing Sub-Maps
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# If you have worked with Numpy arrays in the past you are probably
# familiar with the concept of indexing and slicing into data arrays. To
# support slicing of non-spatial axes of `Map` objects, the `Map`
# object has a `~gammapy.maps.Map.slice_by_idx()` method, which allows extracting
# sub-maps from a larger map.
#
# The following example demonstrates how to get the map at the energy bin
# number 3:
#

m_sub = m_cube.slice_by_idx({"energy": 3})
print(m_sub)

######################################################################
# Note that the returned object is again a `~gammapy.maps.Map` with updated axes
# information. In this case, because we extracted only a single image, the
# energy axis is dropped from the map.
#
# To extract a sub-cube with a sliced energy axis, we can use a normal
# ``slice()`` object:
#

m_sub = m_cube.slice_by_idx({"energy": slice(1, 3)})
print(m_sub)

######################################################################
# Note that the returned object is also a `~gammapy.maps.Map` object, but this time
# with an updated energy axis specification.
#
# Slicing of multiple dimensions is supported by adding further entries to
# the dict passed to `~gammapy.maps.Map.slice_by_idx()`:
#

m_sub = m_4d.slice_by_idx({"energy": slice(1, 3), "time": slice(4, 10)})
print(m_sub)

######################################################################
# For convenience there is also a `~gammapy.maps.Map.get_image_by_coord()` method, which
# allows accessing image planes at given non-spatial physical coordinates.
# This method also supports `~astropy.units.Quantity` objects:
#

image = m_4d.get_image_by_coord({"energy": 4 * u.TeV, "time": 5 * u.h})
print(image.geom)

######################################################################
# Iterating by image
# ~~~~~~~~~~~~~~~~~~
#
# For maps with non-spatial dimensions the `~gammapy.maps.Map.iter_by_image_data`
# method can be used to loop over image slices. The image plane index
# `idx` is returned in data order, so that the data array can be indexed
# directly.
Here is an example for an in-place convolution of an image # using `~astropy.convolution.convolve` to interpolate NaN values: # axis1 = MapAxis([1, 10, 100], interp="log", name="energy") axis2 = MapAxis([1, 2, 3], interp="lin", name="time") m = Map.create(width=(5, 3), axes=[axis1, axis2], binsz=0.1) m.data[:, :, 15:18, 20:25] = np.nan for img, idx in m.iter_by_image_data(): kernel = np.ones((5, 5)) m.data[idx] = convolve(img, kernel) assert not np.isnan(m.data).any() ###################################################################### # Modifying Data # -------------- # # How to set data values # ~~~~~~~~~~~~~~~~~~~~~~ # # To modify and set map data values the `Map` object features as well a # `~gammapy.maps.Map.set_by_idx()` method: # m_cube.set_by_idx(idx=(10, 20, 3), vals=42) ###################################################################### # here we check that data have been updated: # print(m_cube.get_by_idx((10, 20, 3))) ###################################################################### # Of course there is also a `~gammapy.maps.Map.set_by_coord()` method, which allows to # set map data values in physical coordinates. # m_cube.set_by_coord({"lon": 0, "lat": 0, "energy": 2 * u.TeV}, vals=42) ###################################################################### # Again the `lon` and `lat` values are assumed to be given in degrees # in the coordinate system used by the map. For the energy axis, the unit # is the one specified on the axis (use ``m_cube.geom.axes[0].unit`` to # check if needed…). # # All ``.xxx_by_coord()`` methods accept `~astropy.coordinates.SkyCoord` objects as well. In # this case we have to use the ``"skycoord"`` keyword instead of ``"lon"`` and # ``"lat"``: # skycoords = SkyCoord([1.2, 3.4], [-0.5, 1.1], frame="galactic", unit="deg") m_cube.set_by_coord({"skycoord": skycoords, "energy": 2 * u.TeV}, vals=42) ###################################################################### # Filling maps from event lists # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # This example shows how to fill a counts cube from an event list: # energy_axis = MapAxis.from_bounds( 10.0, 2e3, 12, interp="log", name="energy", unit="GeV" ) counts_3d = WcsNDMap.create( binsz=0.1, width=10.0, skydir=(0, 0), frame="galactic", axes=[energy_axis] ) events = EventList.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-events.fits.gz") counts_3d.fill_by_coord({"skycoord": events.radec, "energy": events.energy}) counts_3d.write("ccube.fits", format="fgst-ccube", overwrite=True) ###################################################################### # Alternatively you can use the `~gammapy.maps.Map.fill_events` method: # counts_3d = WcsNDMap.create( binsz=0.1, width=10.0, skydir=(0, 0), frame="galactic", axes=[energy_axis] ) counts_3d.fill_events(events) ###################################################################### # If you have a given map already, and want to make a counts image with # the same geometry (not using the pixel data from the original map), you # can also use the `~gammapy.maps.Map.fill_events` method. # events = EventList.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-events.fits.gz") reference_map = Map.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts.fits.gz") counts = Map.from_geom(reference_map.geom) counts.fill_events(events) ###################################################################### # It works for IACT and Fermi-LAT events, for WCS or HEALPix map # geometries, and also for extra axes. Especially energy axes are # automatically handled correctly. 
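######################################################################
# For a quick look at the result (a sketch, not required by the recipe),
# we can sum the filled counts cube over its energy axis and plot the
# resulting sky image:
#

counts_3d.sum_over_axes().plot(stretch="sqrt", add_cbar=True)
plt.show()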
######################################################################
# Filling maps from interpolation
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Maps support interpolation via the `~gammapy.maps.Map.interp_by_coord` and
# `~gammapy.maps.Map.interp_by_pix` methods. Currently, the following interpolation
# methods are supported:
#
# - ``"nearest"`` : Return the value of the nearest pixel (no interpolation).
# - ``"linear"`` : Interpolation with a first order polynomial. This is the
#   only interpolation method that is supported for all map types.
# - ``"quadratic"`` : Interpolation with a second order polynomial.
# - ``"cubic"`` : Interpolation with a third order polynomial.
#
# Note that ``"quadratic"`` and ``"cubic"`` interpolation are currently only
# supported for WCS-based maps with regular geometry (e.g. 2D or ND with
# the same geometry in every image plane). ``"linear"`` and higher order
# interpolation by pixel coordinates is only supported for WCS-based maps.
#
# In the following example we create a new map and fill it by
# interpolating another map:
#

# read map
filename = "$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz"
m_iem_gc = Map.read(filename)

# create new geometry
skydir = SkyCoord(266.4, -28.9, frame="icrs", unit="deg")
wcs_geom_cel = WcsGeom.create(skydir=skydir, binsz=0.1, frame="icrs", width=(8, 4))

# create new empty map from geometry
m_iem_10GeV = Map.from_geom(wcs_geom_cel)
coords = m_iem_10GeV.geom.get_coord()

# fill new map using interpolation
m_iem_10GeV.data = m_iem_gc.interp_by_coord(
    {"skycoord": coords.skycoord, "energy_true": 10 * u.GeV},
    method="linear",
    fill_value=np.nan,
)

######################################################################
# Interpolating onto a different geometry
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#

######################################################################
# For 3D geometries this operation can be performed directly using the
# `~gammapy.maps.Map.interp_to_geom()` method. This is very useful,
# e.g. when using map arithmetic.
#

# create new geometry
energy_axis = MapAxis.from_bounds(
    10.0, 2e3, 6, interp="log", name="energy_true", unit="GeV"
)
skydir = SkyCoord(266.4, -28.9, frame="icrs", unit="deg")
wcs_geom_3d = WcsGeom.create(
    skydir=skydir, binsz=0.1, frame="icrs", width=(8, 4), axes=[energy_axis]
)

# create the interpolated map
m_iem_interp = m_iem_gc.interp_to_geom(
    wcs_geom_3d, preserve_counts=False, method="linear", fill_value=np.nan
)
print(m_iem_interp)

######################################################################
# Note that the ``preserve_counts`` option should be `True` if the map is an
# integral quantity (e.g. counts) and `False` if the map is a differential
# quantity (e.g. intensity).
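######################################################################
# As a final illustration for this section (a sketch), point-wise lookups
# without interpolation can be done by passing ``method="nearest"``, which
# simply returns the value of the closest pixel:
#

val = m_iem_gc.interp_by_coord(
    {"skycoord": skydir, "energy_true": 10 * u.GeV}, method="nearest"
)
print(val)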
# ###################################################################### # Maps operations # --------------- # # Basic operators # ~~~~~~~~~~~~~~~ # # One can perform simple arithmetic on maps using the `+`, `-`, `*`, # `/` operators; this works only for maps with the same geometry: # iem_plus_iem = m_iem_10GeV + m_iem_10GeV iem_minus_iem = m_iem_10GeV - m_iem_10GeV ###################################################################### # These operations can be applied between a Map and a scalar in that # specific order: # iem_times_two = m_iem_10GeV * 2 # iem_times_two = 2 * m_iem_10GeV # this won't work ###################################################################### # The logical operators can also be applied on maps (the result is a map of # boolean type): # is_null = iem_minus_iem == 0 print(is_null) ###################################################################### # Here we check that the result is `True` for all the well-defined # pixels (not `NaN`): # print(np.all(is_null.data[~np.isnan(iem_minus_iem.data)])) ###################################################################### # Cutouts # ~~~~~~~ # # The `WcsNDMap` object features a `~gammapy.maps.Map.cutout()` method, which allows # you to cut out a smaller part of a larger map. This can be useful, # e.g. when working with all-sky diffuse maps. Here is an example: # position = SkyCoord(0, 0, frame="galactic", unit="deg") m_iem_cutout = m_iem_gc.cutout(position=position, width=(4 * u.deg, 2 * u.deg)) ###################################################################### # The returned object is again a `~gammapy.maps.Map` object with updated WCS # information and data size. As one can see, the cutout is automatically # applied to all the non-spatial axes as well. The cutout width is given # in the order of `(lon, lat)` and can be specified with units that will # be handled correctly. # ###################################################################### # Visualizing and Plotting # ------------------------ # # All map objects provide a `~gammapy.maps.Map.plot` method for generating a visualization # of a map. This method returns figure, axes, and image objects that can # be used to further tweak/customize the image. The `~gammapy.maps.Map.plot` method should # be used with 2D maps, while 3D maps can be displayed with the # `~gammapy.maps.Map.plot_interactive()` or `~gammapy.maps.Map.plot_grid()` methods. # # Image Plotting # ~~~~~~~~~~~~~~ # # For debugging and inspecting the map data it is useful to plot or # visualize the image planes contained in the map. # filename = "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts.fits.gz" m_3fhl_gc = Map.read(filename) ###################################################################### # After reading the map we can now plot it on the screen by calling the # ``.plot()`` method: # m_3fhl_gc.plot() plt.show() ###################################################################### # We can easily improve the plot by calling `~gammapy.maps.Map.smooth()` first and # providing additional arguments to `~gammapy.maps.Map.plot()`.
Most of them are passed # further to # `plt.imshow() <https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.imshow.html>`__: # smoothed = m_3fhl_gc.smooth(width=0.2 * u.deg, kernel="gauss") smoothed.plot(stretch="sqrt", add_cbar=True, vmax=4, cmap="inferno") plt.show() ###################################################################### # We can use the # `plt.rc_context() <https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.rc_context.html>`__ # context manager to further tweak the plot by adapting the figure and # font size: # rc_params = {"figure.figsize": (12, 5.4), "font.size": 12} with plt.rc_context(rc=rc_params): smoothed = m_3fhl_gc.smooth(width=0.2 * u.deg, kernel="gauss") smoothed.plot(stretch="sqrt", add_cbar=True, vmax=4) plt.show() ###################################################################### # Cube plotting # ~~~~~~~~~~~~~ # # For maps with non-spatial dimensions the `~gammapy.maps.Map` object features an # interactive plotting method that works in Jupyter notebooks only (note: # it requires the package `ipywidgets` to be installed). We first read a # small example cutout from the Fermi Galactic diffuse model and display # the data cube by calling `~gammapy.maps.Map.plot_interactive()`: # rc_params = { "figure.figsize": (12, 5.4), "font.size": 12, "axes.formatter.limits": (2, -2), } m_iem_gc.plot_interactive(add_cbar=True, stretch="sqrt", rc_params=rc_params) plt.show() ###################################################################### # Now you can use the interactive slider to select an energy range and the # corresponding image is displayed on the screen. You can also use the # radio buttons to select your preferred image stretching. We have passed # additional keywords using the `rc_params` argument to improve the # figure and font size. Those keywords are directly passed to the # `plt.rc_context() <https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.rc_context.html>`__ # context manager. # # Additionally all the slices of a 3D `~gammapy.maps.Map` can be displayed using the # `~gammapy.maps.Map.plot_grid()` method. By default the colorbar bounds of the subplots # are not the same; we can make them consistent using the `vmin` and # `vmax` options: # counts_3d.plot_grid(ncols=4, figsize=(16, 12), vmin=0, vmax=100, stretch="log") plt.show() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/mask_maps.py0000644000175100001770000003672214721316200021365 0ustar00runnerdocker""" Mask maps ========= Create and apply mask maps. Prerequisites ------------- - Understanding of basic analyses in 1D or 3D. - Usage of `~regions` and catalogs, see the :doc:`/tutorials/api/catalog` tutorial. Context ------- There are two main categories of masks in Gammapy for different use cases. - Fitting often requires ignoring some parts of a reduced dataset, e.g. to restrict the fit to a specific energy range or to ignore parts of the region of interest that the user does not want to model, or both. Gammapy’s `Datasets` therefore contain a `mask_fit` sharing the same geometry as the data (i.e. `counts`). - During data reduction, some background makers will normalize the background model template on the data themselves. To limit contamination by real photons, one has to exclude parts of the field-of-view where signal is expected to be large. To do so, one needs to provide an exclusion mask. The latter can be provided in a different geometry as it will be reprojected by the `~gammapy.makers.Makers`. We explain these two types of masks in more detail below: Masks for fitting ~~~~~~~~~~~~~~~~~ The region of interest used for the fit can be defined through the dataset `mask_fit` attribute.
The `mask_fit` is a map containing boolean values where pixels used in the fit are stored as True. A spectral fit (1D or 3D) can be restricted to a specific energy range where e.g. the background is well estimated or where the number of counts is large enough. Similarly, 2D and 3D analyses usually require working with a wider map than the region of interest so that sources lying outside but reconstructed inside because of the PSF are correctly taken into account. The `mask_fit` then has to include a margin that takes the PSF width into account. We will show an example in the boundary mask sub-section. The `mask_fit` can also be used to exclude sources or complex regions for which we don’t have good enough models. In that case the masking is an extra safety measure; it is preferable to include the available models even if the sources are masked and frozen. Note that a dataset also contains a `mask_safe` attribute that is created and filled during data reduction. It is not to be modified directly by users. The `mask_safe` is defined only from the options passed to the `~gammapy.makers.SafeMaskMaker`. Exclusion masks ~~~~~~~~~~~~~~~ Background templates stored in the DL3 IRF are often not reliable enough to be used without some corrections. A set of common techniques to estimate the background or its normalisation from the data is implemented in Gammapy: reflected regions for 1D spectrum analysis, field-of-view (FoV) background or ring background for 2D and 3D analyses. To avoid contamination of the background estimate from gamma-ray bright regions, these methods require excluding those regions from the data used for the estimation. To do so, we use exclusion masks. They are maps containing boolean values where excluded pixels are stored as False. Proposed approach ----------------- Even if the use cases for exclusion masks and fit masks are different, the way to create these masks is exactly the same, so in the following we show how to work with masks in general: - Creating masks from scratch - Combining multiple masks - Extending and reducing an existing mask - Reading and writing masks """ ###################################################################### # Setup # ----- # import numpy as np import astropy.units as u from astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion, Regions # %matplotlib inline import matplotlib.pyplot as plt from gammapy.catalog import CATALOG_REGISTRY from gammapy.datasets import Datasets from gammapy.estimators import ExcessMapEstimator from gammapy.maps import Map, WcsGeom ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # .. _masks-for-fitting: # # Creating a mask for fitting # --------------------------- # # One can build a `mask_fit` to restrict the energy range of pixels used # to fit a `Dataset`. The mask being a `Map`, it needs to use the same # geometry (i.e. a `Geom` object) as the `Dataset` it will be applied # to. # # We show here how to proceed on a `MapDataset` taken from Fermi data # used in the 3FHL catalog. The dataset is already in the form of a # `Datasets` object. We read it from disk.
# filename = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_datasets.yaml" datasets = Datasets.read(filename=filename) dataset = datasets["Fermi-LAT"] ###################################################################### # We can check the default energy range of the dataset. In the absence of # a `mask_fit` it is equal to the safe energy range. # print(f"Fit range : {dataset.energy_range_total}") ###################################################################### # Create a mask in energy # ~~~~~~~~~~~~~~~~~~~~~~~ # # We show first how to use a simple helper function # `~gammapy.maps.Geom.energy_mask()`. # # We obtain the `Geom` that is stored on the `counts` map inside the # `Dataset` and we can directly create the `Map`. # mask_energy = dataset.counts.geom.energy_mask(10 * u.GeV, 700 * u.GeV) ###################################################################### # We can now set the dataset `mask_fit` attribute. # # And we check that the total fit range has changed accordingly. The bin # edges closest to the requested range provide the actual fit range. # dataset.mask_fit = mask_energy print(f"Fit range : {dataset.energy_range_total}") ###################################################################### # Mask some sky regions # ~~~~~~~~~~~~~~~~~~~~~ # # One might also exclude some specific part of the sky for the fit. For # instance, if one does not want to model a specific source in the region of # interest, or if one wants to reduce the region of interest in the dataset # `Geom`. # # In the following we restrict the fit region to a square around the Crab # nebula. **Note**: the dataset geometry is aligned on the galactic frame; # we use the same frame to define the box to ensure a correct alignment. # We can now create the map. We use the `WcsGeom.region_mask` method, # setting all pixels outside the regions to False (because we only want to # consider pixels inside the region). For convenience, we can directly pass # a ds9 region string to the method: # regions = "galactic;box(184.55, -5.78, 3.0, 3.0)" mask_map = dataset.counts.geom.region_mask(regions) ###################################################################### # We can now combine this mask with the energy mask using the logical AND # operator: # dataset.mask_fit &= mask_map ###################################################################### # Let’s check the result and plot the full mask. # dataset.mask_fit.plot_grid(ncols=5, vmin=0, vmax=1, figsize=(14, 3)) plt.show() ###################################################################### # Creating a mask manually # ~~~~~~~~~~~~~~~~~~~~~~~~ # # If you are more familiar with the `Geom` and `Map` API, you can also # create the mask manually from the coordinates of all pixels in the # geometry. Here we simply show how to obtain the same behaviour as the # `energy_mask` helper method. # # In practice, this allows one to create complex energy dependent masks. # coords = dataset.counts.geom.get_coord() mask_data = (coords["energy"] >= 10 * u.GeV) & (coords["energy"] < 700 * u.GeV) mask_energy = Map.from_geom(dataset.counts.geom, data=mask_data) ###################################################################### # Creating an exclusion mask # -------------------------- # # Exclusion masks are typically used for background estimation to mask out # regions where gamma-ray signal is expected. An exclusion mask is usually # a simple 2D boolean `Map` where excluded positions are stored as # `False`.
Their actual geometries are independent of the target # datasets that a user might want to build. The first thing to do is to # build the geometry. # # Define the geometry # ~~~~~~~~~~~~~~~~~~~ # # Masks are stored in `Map` objects. We must first define the geometry # and then we can determine which pixels to exclude. Here we consider a # region at the Galactic anti-centre around the Crab nebula. # position = SkyCoord(83.633083, 22.0145, unit="deg", frame="icrs") geom = WcsGeom.create(skydir=position, width="5 deg", binsz=0.02, frame="galactic") ###################################################################### # Create the mask from a list of regions # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # One can build an exclusion mask from regions. We show here how to # proceed. # # We can rely on known source positions and properties to build a list of # regions (here `~regions.SkyRegion` objects) enclosing most of the signal that # our detector would see from these objects. # # A useful function to create region objects is # `~regions.Regions.parse`. It can take strings defining regions, # e.g. following the “ds9” format, and convert them to `regions`. # # Here we use a region enclosing the Crab nebula with a radius of 0.3 degrees. The # actual region size should depend on the expected PSF of the data used. # We also add another region with a different shape as an example. # regions_ds9 = "galactic;box(185,-4,1.0,0.5, 45);icrs;circle(83.633083, 22.0145, 0.3)" regions = Regions.parse(regions_ds9, format="ds9") print(regions) ###################################################################### # Equivalently the regions can be read from a ds9 file, this time using # `Regions.read`. # # regions = Regions.read('ds9.reg', format="ds9") ###################################################################### # Create the mask map # ^^^^^^^^^^^^^^^^^^^ # # We can now create the map. We use the `WcsGeom.region_mask` method, # setting all pixels inside the regions to False. # # to define the exclusion mask we take the inverse mask_map = ~geom.region_mask(regions) mask_map.plot() plt.show() ###################################################################### # Create the mask from a catalog of sources # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # We can also build our list of regions from a list of catalog sources. # Here we use the Fermi 4FGL catalog which we read using # `~gammapy.catalog.SourceCatalog`. # fgl = CATALOG_REGISTRY.get_cls("4fgl")() ###################################################################### # We now select sources that are contained in the region we are interested # in. # inside_geom = geom.contains(fgl.positions) positions = fgl.positions[inside_geom] ###################################################################### # We now create the list of regions using our a priori radius of 0.3 degrees. # If the sources were extended, one would have to adapt the sizes # to account for their larger extent. # exclusion_radius = Angle("0.3 deg") regions = [CircleSkyRegion(position, exclusion_radius) for position in positions] ###################################################################### # Now we can build the mask map the same way as above.
# mask_map_catalog = ~geom.region_mask(regions) mask_map_catalog.plot() plt.show() ###################################################################### # Create the mask from statistically significant pixels in a dataset # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # Here we want to determine an exclusion mask from the data directly. We will # estimate the significance of the data using the `ExcessMapEstimator`, # and exclude all pixels above a given threshold. # # Here we use the `MapDataset` taken from the Fermi data used above. # ###################################################################### # We apply a significance estimation. We integrate the counts using a # correlation radius of 0.4 degree and apply a regular significance # estimate. # estimator = ExcessMapEstimator("0.4 deg", selection_optional=[]) result = estimator.run(dataset) ###################################################################### # Finally, we create the mask map by applying a threshold of 5 sigma to # remove pixels. # significance_mask = result["sqrt_ts"] < 5.0 ###################################################################### # Because the `ExcessMapEstimator` returns NaN for masked pixels, we # need to set the NaN values to `True` to avoid incorrectly excluding # them. # invalid_pixels = np.isnan(result["sqrt_ts"].data) significance_mask.data[invalid_pixels] = True significance_mask.plot() plt.show() ###################################################################### # This method frequently yields isolated pixels or weakly significant # features if one places the threshold too low. # # To overcome this issue, one can use # `~skimage.filters.apply_hysteresis_threshold`. This filter allows one to # define two thresholds and mask only the pixels between the low and high # thresholds if they are not continuously connected to a pixel above the # high threshold. This better preserves the structure of the # excesses. # # Note that scikit-image is not a required dependency of Gammapy; you # might need to install it. # ###################################################################### # Masks operations # ---------------- # # If two masks share the same geometry it is easy to combine them with # `Map` arithmetic. # # The OR condition is represented by the `|` operator: # mask = mask_map | mask_map_catalog mask.plot() plt.show() ###################################################################### # The AND condition is represented by the `&` or `*` operators: # mask_map &= mask_map_catalog mask_map.plot() plt.show() ###################################################################### # The NOT operator is represented by the ``~`` symbol: # significance_mask_inv = ~significance_mask significance_mask_inv.plot() plt.show() ###################################################################### # Mask modifications # ------------------ # # Mask dilation and erosion # ~~~~~~~~~~~~~~~~~~~~~~~~~ # # One can reduce or extend a mask using `binary_erode` and # `binary_dilate` methods, respectively.
# fig, (ax1, ax2) = plt.subplots( figsize=(11, 5), ncols=2, subplot_kw={"projection": significance_mask_inv.geom.wcs} ) mask = significance_mask_inv.binary_erode(width=0.2 * u.deg, kernel="disk") mask.plot(ax=ax1) mask = significance_mask_inv.binary_dilate(width=0.2 * u.deg) mask.plot(ax=ax2) plt.show() ###################################################################### # Boundary mask # ~~~~~~~~~~~~~ # # In the following example we use the Fermi dataset previously loaded and # add its `mask_fit` taking into account a margin based on the PSF # width. The margin width is determined using the `containment_radius` # method of the PSF object and the mask is created using the # `boundary_mask` method available on the geometry object. # # get PSF 95% containment radius energy_true = dataset.exposure.geom.axes[0].center psf_r95 = dataset.psf.containment_radius(fraction=0.95, energy_true=energy_true) # create mask_fit with margin based on PSF mask_fit = dataset.counts.geom.boundary_mask(psf_r95.max()) dataset.mask_fit = mask_fit dataset.mask_fit.sum_over_axes().plot() plt.show() ###################################################################### # Reading and writing masks # ------------------------- # # `gammapy.maps` can directly read/write maps with boolean content as # follows: # # To save masks to disk mask_map.write("exclusion_mask.fits", overwrite=True) # To read maps from disk mask_map = Map.read("exclusion_mask.fits") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/model_management.py0000644000175100001770000003647014721316200022706 0ustar00runnerdocker""" Modelling ========= Interaction of multiple datasets and models in Gammapy. Aim --- The main aim of this tutorial is to illustrate model management in Gammapy, especially how to distribute multiple models across multiple datasets. We also show some convenience functions built into Gammapy for handling multiple model components. **Note: Since gammapy v0.18, the responsibility of model management rests entirely with the user. All models, including background models, have to be explicitly defined.** To keep track of the models used, we define a global `~gammapy.modeling.models.Models` object (which is a collection of `~gammapy.modeling.models.SkyModel` objects) to which we append and delete models. Prerequisites ------------- - Knowledge of 3D analysis, dataset reduction and fitting; see the :doc:`/tutorials/starting/analysis_2` tutorial. - Understanding of Gammapy models, see the :doc:`/tutorials/api/models` tutorial. - Analysis of the Galactic Center with Fermi-LAT, shown in the :doc:`/tutorials/data/fermi_lat` tutorial. - Analysis of the Galactic Center with CTA-DC1, shown in the :doc:`/tutorials/analysis-3d/analysis_3d` tutorial. Proposed approach ----------------- To show how datasets interact with models, we use two pre-computed datasets on the Galactic Center, one from Fermi-LAT and the other from simulated CTA (DC1) data. We demonstrate - Adding background models for each dataset - Sharing a model between multiple datasets We then load models from the Fermi 3FHL catalog to show some convenience handling for multiple `~gammapy.modeling.models.Models` together - accessing models from a catalog - selecting models contributing to a given region - adding and removing models - freezing and thawing multiple model parameters together - serialising models For computational purposes, we do not perform any fitting in this notebook.
Setup ----- """ from astropy import units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion import matplotlib.pyplot as plt # %matplotlib inline from IPython.display import display from gammapy.catalog import SourceCatalog3FHL from gammapy.datasets import Datasets, MapDataset from gammapy.maps import Map from gammapy.modeling.models import ( FoVBackgroundModel, Models, PowerLawNormSpectralModel, SkyModel, TemplateSpatialModel, create_fermi_isotropic_diffuse_model, ) ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Read the datasets # ----------------- # # First, we read some precomputed Fermi and CTA datasets, and create a # `~gammapy.datasets.Datasets` object containing the two. # fermi_dataset = MapDataset.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc.fits.gz", name="fermi_dataset" ) cta_dataset = MapDataset.read( "$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz", name="cta_dataset" ) datasets = Datasets([fermi_dataset, cta_dataset]) ###################################################################### # Plot the counts maps to see the region # plt.figure(figsize=(15, 5)) ax1 = plt.subplot(121, projection=fermi_dataset.counts.geom.wcs) ax2 = plt.subplot(122, projection=cta_dataset.counts.geom.wcs) datasets[0].counts.sum_over_axes().smooth(0.05 * u.deg).plot( ax=ax1, stretch="sqrt", add_cbar=True ) datasets[1].counts.sum_over_axes().smooth(0.05 * u.deg).plot( ax=ax2, stretch="sqrt", add_cbar=True ) ax1.set_title("Fermi counts") ax2.set_title("CTA counts") plt.show() ###################################################################### # display(datasets.info_table(cumulative=False)) ###################################################################### # print(datasets) ###################################################################### # Note that while the datasets have an associated background map, they # currently do not have any associated background model. This will be # added in the following section # ###################################################################### # Assigning background models to datasets # --------------------------------------- # # For any IACT dataset (in this case `cta_dataset`) , we have to create # a `~gammapy.modeling.models.FoVBackgroundModel`. Note that # `~gammapy.modeling.models.FoVBackgroundModel` must be # specified to one dataset only # # For Fermi-LAT, the background contribution is taken from a diffuse # isotropic template. To convert this into a gammapy `~gammapy.modeling.models.SkyModel`, use the # helper function `~gammapy.modeling.models.create_fermi_isotropic_diffuse_model` # # To attach a model on a particular dataset it is necessary to specify the # ``datasets_names``. 
Otherwise, by default, the model will be applied to # all the datasets in ``datasets`` # ###################################################################### # First, we must create a global `~gammapy.modeling.models.Models` object which acts as the # container for all models used in a particular analysis # models = Models() # global models object # Create the FoV background model for CTA data bkg_model = FoVBackgroundModel(dataset_name=cta_dataset.name) models.append(bkg_model) # Add the bkg_model to models() # Read the fermi isotropic diffuse background model diffuse_iso = create_fermi_isotropic_diffuse_model( filename="$GAMMAPY_DATA/fermi_3fhl/iso_P8R2_SOURCE_V6_v06.txt", ) diffuse_iso.datasets_names = fermi_dataset.name # specifying the dataset name models.append(diffuse_iso) # Add the fermi_bkg_model to models() # Now, add the models to datasets datasets.models = models # You can see that each dataset lists the correct associated models print(datasets) ###################################################################### # Add a model on multiple datasets # -------------------------------- # # In this section, we show how to add a model to multiple datasets. For # this, we specify a list of `datasets_names` to the model. # Alternatively, not specifying any `datasets_names` will add it to all # the datasets. # # For this example, we use a template model of the galactic diffuse # emission to be shared between the two datasets. # # Create the diffuse model diffuse_galactic_fermi = Map.read("$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz") template_diffuse = TemplateSpatialModel( diffuse_galactic_fermi, normalize=False ) # the template model in this case is already a full 3D model, it should not be normalised diffuse_iem = SkyModel( spectral_model=PowerLawNormSpectralModel(), spatial_model=template_diffuse, name="diffuse-iem", datasets_names=[ cta_dataset.name, fermi_dataset.name, ], # specifying list of dataset names ) # A power law spectral correction is applied in this case # Now, add the diffuse model to the global models list models.append(diffuse_iem) # add it to the datasets, and inspect datasets.models = models print(datasets) ###################################################################### # The `diffuse-iem` model is correctly present on both. Now, you can # proceed with the fit. For computational purposes, we skip it in this # notebook # # %%time # fit2 = Fit() # result2 = fit2.run(datasets) # print(result2.success) ###################################################################### # Loading models from a catalog # ----------------------------- # # We now load the Fermi 3FHL catalog and demonstrate some convenience # functions. For more details on using Gammapy catalog, see the # :doc:`/tutorials/api/catalog` tutorial. # catalog = SourceCatalog3FHL() ###################################################################### # We first choose some relevant models from the catalog and create a new # `~gammapy.modeling.models.Models` object. # gc_sep = catalog.positions.separation(SkyCoord(0, 0, unit="deg", frame="galactic")) models_3fhl = [_.sky_model() for k, _ in enumerate(catalog) if gc_sep[k].value < 8] models_3fhl = Models(models_3fhl) print(len(models_3fhl)) ###################################################################### # Selecting models contributing to a given region # ----------------------------------------------- # # We now use `~gammapy.modeling.models.Models.select_region` to get a subset of models # contributing to a particular region. 
You can also use # `~gammapy.modeling.models.Models.select_mask` to get models lying inside the `True` region # of a mask map. # region = CircleSkyRegion( center=SkyCoord(0, 0, unit="deg", frame="galactic"), radius=3.0 * u.deg ) models_selected = models_3fhl.select_region(region) print(len(models_selected)) ###################################################################### # We now want to assign `models_3fhl` to the Fermi dataset, and # `models_selected` to both the CTA and Fermi datasets. For this, we # explicitly set the `datasets_names` for the former, and leave it # `None` (default) for the latter. # for model in models_3fhl: if model not in models_selected: model.datasets_names = fermi_dataset.name # assign the models to datasets datasets.models = models_3fhl ###################################################################### # To see the models on a particular dataset, you can simply print them: # print("Fermi dataset models: ", datasets[0].models.names) print("\n CTA dataset models: ", datasets[1].models.names) ###################################################################### # Combining two Models # -------------------- # # `~gammapy.modeling.models.Models` can be extended simply, like Python lists: # models.extend(models_selected) print(len(models)) ###################################################################### # Selecting models from a list # ---------------------------- # # A `~gammapy.modeling.models.Model` can be selected from a list of # `~gammapy.modeling.models.Models` by specifying its index or its name. # model = models_3fhl[0] print(model) # Alternatively model = models_3fhl["3FHL J1731.7-3003"] print(model) ###################################################################### # `~gammapy.modeling.models.Models.select` can be used to select all models satisfying a list of # conditions. To select all models applied on the ``cta_dataset`` with the # characters `1748` in the name: # models = models_3fhl.select(datasets_names=cta_dataset.name, name_substring="1748") print(models) ###################################################################### # Note that `~gammapy.modeling.models.Models.select` combines the different conditions with an # `AND` operator. If one needs to combine conditions with an `OR` # operator, the `~gammapy.modeling.models.Models.selection_mask` method can generate a boolean # array that can be used for selection. For example: # selection_mask = models_3fhl.selection_mask( name_substring="1748" ) | models_3fhl.selection_mask(name_substring="1731") models_OR = models_3fhl[selection_mask] print(models_OR) ###################################################################### # Removing a model from a dataset # ------------------------------- # # Any addition or removal of a model must happen through the global models # object, which must then be re-applied to the dataset(s). Note that # operations **cannot** be directly performed on `dataset.models`.
# # cta_dataset.models.remove() # * this is forbidden * # Remove the model '3FHL J1744.5-2609' models_3fhl.remove("3FHL J1744.5-2609") len(models_3fhl) # After any operation on models, it must be re-applied on the datasets datasets.models = models_3fhl ###################################################################### # To see the models applied on a dataset, you can simply # print(datasets.models.names) ###################################################################### # Plotting models on a (counts) map # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # The spatial regions of `~gammapy.modeling.models.Models` can be plotted on a given geom using # `~gammapy.modeling.models.Models.plot_regions`. You can also use # `~gammapy.modeling.models.Models.plot_positions` # to plot the centers of each model. # plt.figure(figsize=(16, 5)) ax1 = plt.subplot(121, projection=fermi_dataset.counts.geom.wcs) ax2 = plt.subplot(122, projection=cta_dataset.counts.geom.wcs) for ax, dataset in zip([ax1, ax2], datasets): dataset.counts.sum_over_axes().smooth(0.05 * u.deg).plot( ax=ax, stretch="sqrt", add_cbar=True, cmap="afmhot" ) dataset.models.plot_regions(ax=ax, color="white") ax.set_title(dataset.name) plt.show() ###################################################################### # Freezing and unfreezing model parameters # ---------------------------------------- # # For a given model, any parameter can be (un)frozen individually. # Additionally, `model.freeze` and `model.unfreeze` can be used to # freeze and unfreeze all parameters in one go. # model = models_3fhl[0] print(model) # To freeze a single parameter model.spectral_model.index.frozen = True print(model) # index is now frozen # To unfreeze a parameter model.spectral_model.index.frozen = False # To freeze all parameters of a model model.freeze() print(model) # To unfreeze all parameters (except parameters which must remain frozen) model.unfreeze() print(model) ###################################################################### # Only spectral or spatial or temporal components of a model can also be # frozen # # To freeze spatial components model.freeze("spatial") print(model) ###################################################################### # To check if all the parameters of a model are frozen, # print(model.frozen) # False because spectral components are not frozen print(model.spatial_model.frozen) # all spatial components are frozen ###################################################################### # The same operations can be performed on `~gammapy.modeling.models.Models` # directly - to perform on a list of models at once, e.g. # models_selected.freeze() # freeze all parameters of all models models_selected.unfreeze() # unfreeze all parameters of all models # print the free parameters in the models print(models_selected.parameters.free_parameters.names) ###################################################################### # There are more functionalities which you can explore. In general, using # `help()` on any function is a quick and useful way to access the # documentation. For ex, `Models.unfreeze_all` will unfreeze all # parameters, even those which are fixed by default. 
To see its usage, you # can simply type # help(models_selected.unfreeze) ###################################################################### # Serialising models # ------------------ # ###################################################################### # `~gammapy.modeling.models.Models` can be (independently of # `~gammapy.datasets.Datasets`) written to / read from # disk as YAML files. Datasets are always serialised along with their # associated models, i.e. with YAML and FITS files, e.g.: # # To save only the models models_3fhl.write("3fhl_models.yaml", overwrite=True) # To save datasets and models datasets.write( filename="datasets-gc.yaml", filename_models="models_gc.yaml", overwrite=True ) # To read only models models = Models.read("3fhl_models.yaml") print(models) # To read datasets with models datasets_read = Datasets.read("datasets-gc.yaml", filename_models="models_gc.yaml") print(datasets_read) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/models.py0000644000175100001770000006516414721316200020677 0ustar00runnerdocker""" Models ====== This is an introduction and overview of how to work with models in Gammapy. The sub-package `~gammapy.modeling` contains all the functionality related to modeling and fitting data. This includes spectral, spatial and temporal model classes, as well as the fit and parameter API. The models follow a naming scheme which contains the category as a suffix to the class name. An overview of all the available models can be found in the :ref:`model-gallery`. Note that there are separate tutorials, :doc:`/tutorials/api/model_management` and :doc:`/tutorials/api/fitting`, that explain `~gammapy.modeling`, the Gammapy modeling and fitting framework. You should read them to learn how to work with models in order to analyse data. """ ###################################################################### # Setup # ----- # # %matplotlib inline import numpy as np from astropy import units as u import matplotlib.pyplot as plt from IPython.display import display from gammapy.maps import Map, MapAxis, WcsGeom ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Spectral models # --------------- # # All models are imported from the `~gammapy.modeling.models` namespace. # Let’s start with a `~gammapy.modeling.models.PowerLawSpectralModel`: # from gammapy.modeling.models import PowerLawSpectralModel pwl = PowerLawSpectralModel() print(pwl) ###################################################################### # To get a list of all available spectral models you can import and print # the spectral model registry or take a look at the :ref:`model-gallery` # from gammapy.modeling.models import SPECTRAL_MODEL_REGISTRY print(SPECTRAL_MODEL_REGISTRY) ###################################################################### # Spectral models all come with default parameters.
Different parameter # values can be passed on creation of the model, either as a string # defining the value and unit or as an `astropy.units.Quantity` object # directly: # amplitude = 1e-12 * u.Unit("TeV-1 cm-2 s-1") pwl = PowerLawSpectralModel(amplitude=amplitude, index=2.2) ###################################################################### # For convenience a `str` specifying the value and unit can be passed as # well: # pwl = PowerLawSpectralModel(amplitude="2.7e-12 TeV-1 cm-2 s-1", index=2.2) print(pwl) ###################################################################### # The model can be evaluated at given energies by calling the model # instance: # energy = [1, 3, 10, 30] * u.TeV dnde = pwl(energy) print(dnde) ###################################################################### # The returned quantity is a differential photon flux. # # For spectral models you can additionally compute the integrated and # energy flux in a given energy range: # flux = pwl.integral(energy_min=1 * u.TeV, energy_max=10 * u.TeV) print(flux) eflux = pwl.energy_flux(energy_min=1 * u.TeV, energy_max=10 * u.TeV) print(eflux) ###################################################################### # This also works for a list or an array of integration boundaries: # energy = [1, 3, 10, 30] * u.TeV flux = pwl.integral(energy_min=energy[:-1], energy_max=energy[1:]) print(flux) ###################################################################### # In some cases it can be useful to use the inverse of a spectral # model, to find the energy at which a given flux is reached: # dnde = 2.7e-12 * u.Unit("TeV-1 cm-2 s-1") energy = pwl.inverse(dnde) print(energy) ###################################################################### # As a convenience you can also plot any spectral model in a given energy # range: # pwl.plot(energy_bounds=[1, 100] * u.TeV) plt.show() ###################################################################### # Norm Spectral Models # ~~~~~~~~~~~~~~~~~~~~ # ###################################################################### # Normed spectral models are a special class of Spectral Models, which # have a dimension-less normalisation. These spectral models feature a # norm parameter instead of amplitude and are named using the # `NormSpectralModel` suffix. They **must** be used along with another # spectral model, as a multiplicative correction factor according to their # spectral shape. They are typically used for adjusting template based # models, or adding an EBL correction to some analytic model. # # To check if a given `~gammapy.modeling.models.SpectralModel` is a norm model, you can simply # look at the `is_norm_spectral_model` property # # To see the available norm models shipped with gammapy: for model in SPECTRAL_MODEL_REGISTRY: if model.is_norm_spectral_model: print(model) ###################################################################### # As an example, consider the `~gammapy.modeling.models.PowerLawNormSpectralModel` # from gammapy.modeling.models import PowerLawNormSpectralModel pwl_norm = PowerLawNormSpectralModel(tilt=0.1) print(pwl_norm) ###################################################################### # We can check the correction introduced at each energy # energy = [0.3, 1, 3, 10, 30] * u.TeV print(pwl_norm(energy)) ###################################################################### # A typical use case of a norm model would be in applying a spectral # correction to a `~gammapy.modeling.models.TemplateSpectralModel`.
A template model is defined # by custom tabular values provided at initialization. # from gammapy.modeling.models import TemplateSpectralModel energy = [0.3, 1, 3, 10, 30] * u.TeV values = [40, 30, 20, 10, 1] * u.Unit("TeV-1 s-1 cm-2") template = TemplateSpectralModel(energy, values) template.plot(energy_bounds=[0.2, 50] * u.TeV, label="template model") normed_template = template * pwl_norm normed_template.plot(energy_bounds=[0.2, 50] * u.TeV, label="normed_template model") plt.legend() plt.show() ###################################################################### # Compound Spectral Model # ~~~~~~~~~~~~~~~~~~~~~~~ # # A `CompoundSpectralModel` is an arithmetic combination of two spectral # models. The model `normed_template` created in the preceding example # is an example of a `CompoundSpectralModel` # print(normed_template) ###################################################################### # To create an additive model, you can do simply: # model_add = pwl + template print(model_add) ###################################################################### # Spatial models # -------------- # ###################################################################### # Spatial models are imported from the same `~gammapy.modeling.models` # namespace, let’s start with a `~gammapy.modeling.models.GaussianSpatialModel`: # from gammapy.modeling.models import GaussianSpatialModel gauss = GaussianSpatialModel(lon_0="0 deg", lat_0="0 deg", sigma="0.2 deg") print(gauss) ###################################################################### # Again you can check the `SPATIAL_MODELS` registry to see which models # are available or take a look at the :ref:`model-gallery` # from gammapy.modeling.models import SPATIAL_MODEL_REGISTRY print(SPATIAL_MODEL_REGISTRY) ###################################################################### # The default coordinate frame for all spatial models is `"icrs"`, but # the frame can be modified using the `frame` argument: # gauss = GaussianSpatialModel( lon_0="0 deg", lat_0="0 deg", sigma="0.2 deg", frame="galactic" ) ###################################################################### # You can specify any valid `astropy.coordinates` frame. The center # position of the model can be retrieved as a # `astropy.coordinates.SkyCoord` object using `SpatialModel.position`: # print(gauss.position) ###################################################################### # Spatial models can be evaluated again by calling the instance: # lon = [0, 0.1] * u.deg lat = [0, 0.1] * u.deg flux_per_omega = gauss(lon, lat) print(flux_per_omega) ###################################################################### # The returned quantity corresponds to a surface brightness. Spatial model # can be also evaluated using `~gammapy.maps.Map` and # `~gammapy.maps.Geom` objects: # m = Map.create(skydir=(0, 0), width=(1, 1), binsz=0.02, frame="galactic") m.quantity = gauss.evaluate_geom(m.geom) m.plot(add_cbar=True) plt.show() ###################################################################### # Again for convenience the model can be plotted directly: # gauss.plot(add_cbar=True) plt.show() ###################################################################### # All spatial models have an associated sky region to it e.g. to # illustrate the extension of the model on a sky image. 
The returned object # is an `~regions.SkyRegion` object: # print(gauss.to_region()) ###################################################################### # Now we can plot the region on a sky image: # plt.figure() gauss_elongated = GaussianSpatialModel( lon_0="0 deg", lat_0="0 deg", sigma="0.2 deg", e=0.7, phi="45 deg" ) ax = gauss_elongated.plot(add_cbar=True) region = gauss_elongated.to_region() region_pix = region.to_pixel(ax.wcs) ax.add_artist(region_pix.as_artist(ec="w", fc="None")) plt.show() ###################################################################### # The `~gammapy.modeling.models.SpatialModel.to_region()` method can also be useful to write e.g. ds9 region # files using `write_ds9` from the `regions` package: # from regions import Regions regions = Regions([gauss.to_region(), gauss_elongated.to_region()]) filename = "regions.reg" regions.write( filename, format="ds9", overwrite=True, ) # !cat regions.reg ###################################################################### # Temporal models # --------------- # ###################################################################### # Temporal models are imported from the same `~gammapy.modeling.models` # namespace, let’s start with a `~gammapy.modeling.models.GaussianTemporalModel`: # from gammapy.modeling.models import GaussianTemporalModel gauss_temp = GaussianTemporalModel(t_ref=59240.0 * u.d, sigma=2.0 * u.d) print(gauss_temp) ###################################################################### # Check the `TEMPORAL_MODELS` registry to see which models are # available: # from gammapy.modeling.models import TEMPORAL_MODEL_REGISTRY print(TEMPORAL_MODEL_REGISTRY) ###################################################################### # Temporal models can be evaluated on `astropy.time.Time` objects. The # returned quantity is a dimensionless number # from astropy.time import Time time = Time("2021-01-29 00:00:00.000") gauss_temp(time) ###################################################################### # As for other models, they can be plotted in a given time range # time = Time([59233.0, 59250], format="mjd") gauss_temp.plot(time) plt.show() ###################################################################### # SkyModel # -------- # # The `~gammapy.modeling.models.SkyModel` class combines a spectral model and, # optionally, a spatial and a temporal model. It can be created from # existing spectral, spatial and temporal model components: # from gammapy.modeling.models import SkyModel model = SkyModel( spectral_model=pwl, spatial_model=gauss, temporal_model=gauss_temp, name="my-source", ) print(model) ###################################################################### # It is good practice to specify a name for your sky model, so that you # can access it later by name and have a meaningful identifier for # serialisation.
If you don’t define a name, a unique random name is # generated: # model_without_name = SkyModel(spectral_model=pwl, spatial_model=gauss) print(model_without_name.name) ###################################################################### # The individual components of the source model can be accessed using # ``.spectral_model``, ``.spatial_model`` and ``.temporal_model``: # print(model.spectral_model) # %% print(model.spatial_model) # %% print(model.temporal_model) ###################################################################### # And can be used as you have already seen above: # model.spectral_model.plot(energy_bounds=[1, 10] * u.TeV) plt.show() ###################################################################### # Note that the Gammapy fitting framework can interface only with a `~gammapy.modeling.models.SkyModel` and # **not** its individual components. So, it is customary to work with # `~gammapy.modeling.models.SkyModel` even if you are not doing a 3D fit. Since the amplitude # parameter resides on the `~gammapy.modeling.models.SpectralModel`, specifying a spectral # component is compulsory. The temporal and spatial components are # optional. The temporal model needs to be specified only for timing # analysis. In some cases (e.g. when doing a spectral analysis) there is # no need for a spatial component either, and only a spectral model is # associated with the source. # model_spectrum = SkyModel(spectral_model=pwl, name="source-spectrum") print(model_spectrum) ###################################################################### # Additionally the spatial model of `~gammapy.modeling.models.SkyModel` # can be used to represent source models based on templates, where the # spatial and energy axes are correlated. It can be created e.g. from an # existing FITS file: # from gammapy.modeling.models import PowerLawNormSpectralModel, TemplateSpatialModel diffuse_cube = TemplateSpatialModel.read( "$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz", normalize=False ) diffuse = SkyModel(PowerLawNormSpectralModel(), diffuse_cube) print(diffuse) ###################################################################### # Note that if the spatial model is not normalized over the sky it has to # be combined with a normalized spectral model, for example # `~gammapy.modeling.models.PowerLawNormSpectralModel`. This is the only # case in `~gammapy.modeling.models.SkyModel` where the unit is fully attached to # the spatial model. # ###################################################################### # Modifying model parameters # -------------------------- # # Model parameters can be modified (e.g. frozen, values changed, etc.) at any # point, e.g.: # # Freezing a parameter model.spectral_model.index.frozen = True # Making a parameter free model.spectral_model.index.frozen = False # Changing a value model.spectral_model.index.value = 3 # Setting min and max ranges on parameters model.spectral_model.index.min = 1.0 model.spectral_model.index.max = 5.0 # Visualise the model as a table display(model.parameters.to_table()) ###################################################################### # You can use the interactive boxes to choose model parameters by name, # type or other attributes mentioned in the column names. # ###################################################################### # Model lists and serialisation # ----------------------------- # # In a typical analysis scenario a model consists of multiple model # components, or a “catalog” or “source library”.
To handle this list of # multiple model components, Gammapy has a `~gammapy.modeling.models.Models` class: # from gammapy.modeling.models import Models models = Models([model, diffuse]) print(models) ###################################################################### # Individual model components in the list can be accessed by their name: # print(models["my-source"]) ###################################################################### # **Note:** To make the access by name unambiguous, models are required to # have a unique name, otherwise an error will be thrown. # # To see which models are available you can use the ``.names`` attribute: # print(models.names) ###################################################################### # Note that a `~gammapy.modeling.models.SkyModel` object can be evaluated for a given longitude, # latitude, and energy, but the `~gammapy.modeling.models.Models` object cannot. # This `~gammapy.modeling.models.Models` # container object will be assigned to `~gammapy.datasets.Dataset` or `~gammapy.datasets.Datasets` # together with the data to be fitted. Check out e.g. the # :doc:`/tutorials/api/model_management` tutorial for details. # # The `~gammapy.modeling.models.Models` class also has in-place ``.append()`` and ``.extend()`` # methods: # model_copy = model.copy(name="my-source-copy") models.append(model_copy) ###################################################################### # This list of models can also be serialised to a custom YAML based # format: # models_yaml = models.to_yaml() print(models_yaml) ###################################################################### # The structure of the YAML files follows the structure of the Python # objects. The ``components`` listed correspond to the `~gammapy.modeling.models.SkyModel` and # components of the `~gammapy.modeling.models.Models`. For each `~gammapy.modeling.models.SkyModel` # we have information about its ``name``, ``type`` (corresponding to the # tag attribute) and sub-models (i.e. the ``spectral`` model and possibly the # ``spatial`` model). Then the spatial and spectral models are defined by # their type and parameters. The ``parameters`` keys name/value/unit are # mandatory, while the keys min/max/frozen are optional (so you can # prepare shorter files). # # If you want to write this list of models to disk and read it back later # you can use: # models.write("models.yaml", overwrite=True) models_read = Models.read("models.yaml") ###################################################################### # Additionally the models can be exported and imported together with the data # using the `~gammapy.datasets.Datasets.read()` and `~gammapy.datasets.Datasets.write()` methods as shown # in the :doc:`/tutorials/analysis-3d/analysis_mwl` # notebook.
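###################################################################### # As a short illustrative sketch (assuming a `~gammapy.datasets.Datasets` # object called ``datasets`` exists; the file names here are hypothetical), # the combined serialisation looks like: # # datasets.write( # filename="datasets.yaml", filename_models="models.yaml", overwrite=True # ) # datasets_read = Datasets.read("datasets.yaml", filename_models="models.yaml")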
# # Models with shared parameter # ---------------------------- # # A model parameter can be shared with other models, for example we can # define two power-law models with the same spectral index but different # amplitudes: # pwl2 = PowerLawSpectralModel() pwl2.index = pwl.index pwl.index.value = ( 2.3 # also update pwl2 as the parameter object is now the same as shown below ) print(pwl.index) print(pwl2.index) ###################################################################### # In the YAML files the shared parameter is flagged by the additional # ``link`` entry that follows the convention ``parameter.name@unique_id``: # models = Models([SkyModel(pwl, name="source1"), SkyModel(pwl2, name="source2")]) models_yaml = models.to_yaml() print(models_yaml) ###################################################################### # .. _custom-model: # # Implementing a custom model # --------------------------- # # In order to add a user defined spectral model you have to create a # `~gammapy.modeling.models.SpectralModel` subclass. This new model class should include: # # - a tag used for serialization (it can be the same as the class name) # - an instantiation of each Parameter with their unit, default values # and frozen status # - the evaluate function where the mathematical expression for the model # is defined. # # As an example we will use a PowerLawSpectralModel plus a Gaussian (with # fixed width). First we define the new custom model class that we name # ``MyCustomSpectralModel``: # from gammapy.modeling import Parameter from gammapy.modeling.models import SpectralModel class MyCustomSpectralModel(SpectralModel): """My custom spectral model, parametrizing a power law plus a Gaussian spectral line. Parameters ---------- amplitude : `astropy.units.Quantity` Amplitude of the spectra model. index : `astropy.units.Quantity` Spectral index of the model. reference : `astropy.units.Quantity` Reference energy of the power law. mean : `astropy.units.Quantity` Mean value of the Gaussian. width : `astropy.units.Quantity` Sigma width of the Gaussian line. """ tag = "MyCustomSpectralModel" amplitude = Parameter("amplitude", "1e-12 cm-2 s-1 TeV-1", min=0) index = Parameter("index", 2, min=0) reference = Parameter("reference", "1 TeV", frozen=True) mean = Parameter("mean", "1 TeV", min=0) width = Parameter("width", "0.1 TeV", min=0, frozen=True) @staticmethod def evaluate(energy, index, amplitude, reference, mean, width): pwl = PowerLawSpectralModel.evaluate( energy=energy, index=index, amplitude=amplitude, reference=reference, ) gauss = amplitude * np.exp(-((energy - mean) ** 2) / (2 * width**2)) return pwl + gauss ###################################################################### # It is good practice to also implement a docstring for the model, # defining the parameters and also defining a ``.tag``, which specifies the # name of the model for serialisation. Also note that gammapy assumes that # all `~gammapy.modeling.models.SpectralModel` evaluate functions return a flux in unit of # `"cm-2 s-1 TeV-1"` (or equivalent dimensions). 
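###################################################################### # As a quick check of this unit convention (a sketch using the class just # defined; not part of the original tutorial), evaluating an instance # returns a quantity with the expected unit: # print(MyCustomSpectralModel()(1 * u.TeV).unit)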
#
# This model can now be used as any other spectral model in Gammapy:
#

my_custom_model = MyCustomSpectralModel(mean="3 TeV")

print(my_custom_model)

print(my_custom_model.integral(1 * u.TeV, 10 * u.TeV))

my_custom_model.plot(energy_bounds=[1, 10] * u.TeV)
plt.show()


######################################################################
# As a next step we can also register the custom model in the
# ``SPECTRAL_MODELS`` registry, so that it becomes available for
# serialization:
#

SPECTRAL_MODEL_REGISTRY.append(MyCustomSpectralModel)

model = SkyModel(spectral_model=my_custom_model, name="my-source")
models = Models([model])
models.write("my-custom-models.yaml", overwrite=True)

# !cat my-custom-models.yaml


######################################################################
# Similarly you can also create custom spatial models and add them to the
# ``SPATIAL_MODELS`` registry. In that case Gammapy assumes that the
# evaluate function returns a normalized quantity in "sr-1", such that the
# integral of the model over the whole sky is one.
#

######################################################################
# Models with energy dependent morphology
# ---------------------------------------
#
# A common science case in the study of extended sources is to probe for
# energy dependent morphology, e.g. in supernova remnants or pulsar wind
# nebulae. Traditionally, this has been done by splitting the data into
# energy bands and doing individual fits of the morphology in these energy
# bands.
#
# `~gammapy.modeling.models.SkyModel` offers a natural framework to simultaneously model the
# energy and morphology, e.g. spatial extent described by a parametric
# model expression with energy dependent parameters.
#
# The models shipped within Gammapy use a "factorised" representation of
# the source model, where the spatial (:math:`l,b`), energy (:math:`E`)
# and time (:math:`t`) dependence are independent model components and not
# correlated:
#
# .. math::
#    \begin{align}f(l, b, E, t) = F(l, b) \cdot G(E) \cdot H(t)\end{align}
#
# To use full 3D models, i.e. :math:`f(l, b, E) = F(l, b, E) \cdot G(E)`,
# you have to implement your own custom
# `SpatialModel`. Note that it is still necessary to multiply by a
# `SpectralModel`, :math:`G(E)`, to be dimensionally consistent.
#
# In this example, we create a Gaussian spatial model with the extension
# varying with energy. For simplicity, we assume a linear dependency on
# energy in log-log space and parameterize this by specifying the
# extension at two energies. You can add more complex dependencies,
# probably motivated by physical models.
#

from astropy.coordinates import angular_separation
from gammapy.modeling.models import SpatialModel


class MyCustomGaussianModel(SpatialModel):
    """My custom Energy Dependent Gaussian model.
    Parameters
    ----------
    lon_0, lat_0 : `~astropy.coordinates.Angle`
        Center position
    sigma_1TeV : `~astropy.coordinates.Angle`
        Width of the Gaussian at 1 TeV
    sigma_10TeV : `~astropy.coordinates.Angle`
        Width of the Gaussian at 10 TeV
    """

    tag = "MyCustomGaussianModel"
    is_energy_dependent = True
    lon_0 = Parameter("lon_0", "0 deg")
    lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90)

    sigma_1TeV = Parameter("sigma_1TeV", "2.0 deg", min=0)
    sigma_10TeV = Parameter("sigma_10TeV", "0.2 deg", min=0)

    @staticmethod
    def evaluate(lon, lat, energy, lon_0, lat_0, sigma_1TeV, sigma_10TeV):
        sep = angular_separation(lon, lat, lon_0, lat_0)

        # Compute sigma for the given energy using log-log interpolation
        # between the values at the two energy nodes
        sigma_nodes = u.Quantity([sigma_1TeV, sigma_10TeV])
        energy_nodes = [1, 10] * u.TeV
        log_s = np.log(sigma_nodes.to("deg").value)
        log_en = np.log(energy_nodes.to("TeV").value)
        log_e = np.log(energy.to("TeV").value)
        sigma = np.exp(np.interp(log_e, log_en, log_s)) * u.deg

        exponent = -0.5 * (sep / sigma) ** 2
        norm = 1 / (2 * np.pi * sigma**2)
        return norm * np.exp(exponent)


######################################################################
# Serialisation of this model can be achieved as explained in the previous
# section. You can now use it as a standard `~gammapy.modeling.models.SpatialModel` in your
# analysis. Note that this is still a `~gammapy.modeling.models.SpatialModel` and not a
# `~gammapy.modeling.models.SkyModel`, so it needs to be multiplied by a
# `~gammapy.modeling.models.SpectralModel` as before.
#

spatial_model = MyCustomGaussianModel()
spectral_model = PowerLawSpectralModel()
sky_model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model)

print(spatial_model.evaluation_radius)


######################################################################
# To visualise it, we evaluate it on a 3D geom.
#

energy_axis = MapAxis.from_energy_bounds(
    energy_min=0.1 * u.TeV, energy_max=10.0 * u.TeV, nbin=3, name="energy_true"
)
geom = WcsGeom.create(skydir=(0, 0), width=5.0 * u.deg, binsz=0.1, axes=[energy_axis])

spatial_model.plot_grid(geom=geom, add_cbar=True, figsize=(14, 3))
plt.show()


######################################################################
# For computational purposes, it is useful to specify an
# ``evaluation_radius`` for a `~gammapy.modeling.models.SpatialModel` - this gives a size on which
# to compute the model. Though optional, it is highly recommended for
# custom spatial models. This can be done, for example, by defining the
# following function inside the above class:
#


@property
def evaluation_radius(self):
    """Evaluation radius (`~astropy.coordinates.Angle`)."""
    return 5 * np.max([self.sigma_1TeV.value, self.sigma_10TeV.value]) * u.deg
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/observation_clustering.py0000644000175100001770000002006514721316200024175 0ustar00runnerdocker"""
Observational clustering
========================

Clustering observations into specific groups.

Context
-------

Typically, observations from gamma-ray telescopes can span a number of
different observation periods; therefore, it is likely that the observation
conditions and quality are not always the same. This tutorial aims to provide
a way in which observations can be grouped such that similar observations are
grouped together, and then the data reduction is performed.

Objective
---------

To cluster similar observations based on various quantities.
Proposed approach
-----------------

For completeness two different methods for grouping of observations are
shown here.

- A simple grouping based on zenith angle from an existing observations table.
- Grouping the observations depending on the IRF quality, by means of
  hierarchical clustering.
"""

import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
import matplotlib.pyplot as plt
from gammapy.data import DataStore
from gammapy.data.utils import get_irfs_features
from gammapy.utils.cluster import hierarchical_clustering, standard_scaler

######################################################################
# Obtain the observations
# -----------------------
#
# First, we need to define the `~gammapy.data.DataStore` object for the H.E.S.S. DL3 DR1
# data. Next, we utilise a cone search to select only the observations of interest.
# In this case, we choose PKS 2155-304 as the object of interest.
#
# The `~gammapy.data.ObservationTable` is then filtered using the `select_observations` tool.
#

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")

selection = dict(
    type="sky_circle",
    frame="icrs",
    lon="329.71693826 deg",
    lat="-30.2255 deg",
    radius="2 deg",
)
obs_table = data_store.obs_table
selected_obs_table = obs_table.select_observations(selection)

######################################################################
# More complex selections can be done by utilising the obs_table entries directly.
# We can now retrieve the relevant observations by passing their obs_id to the
# `~gammapy.data.DataStore.get_observations` method.
#

obs_ids = selected_obs_table["OBS_ID"]
observations = data_store.get_observations(obs_ids)

######################################################################
# Show various observation quantities
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Here we print the range of zenith angles and muon efficiencies, to see
# if there is a sensible way to group the observations.
#

obs_zenith = selected_obs_table["ZEN_PNT"].to(u.deg)
obs_muoneff = selected_obs_table["MUONEFF"]

print(f"{np.min(obs_zenith):.2f} < zenith angle < {np.max(obs_zenith):.2f}")
print(f"{np.min(obs_muoneff):.2f} < muon efficiency < {np.max(obs_muoneff):.2f}")

######################################################################
# Manual grouping of observations
# -------------------------------
#
# Here we can plot the zenith angle vs muon efficiency of the observations.
# We decide to group the observations according to their zenith angle.
# This is done manually, as per a user-defined cut; in this case, we take the
# median value of the zenith angles to define each observation group.
#
# This type of grouping can be utilised according to different parameters, e.g.
# zenith angle, muon efficiency, offset angle. The quantity chosen can therefore
# be adjusted according to each specific science case.
#

median_zenith = np.median(obs_zenith)

labels = []
for obs in observations:
    zenith = obs.get_pointing_altaz(time=obs.tmid).zen
    labels.append("low_zenith" if zenith < median_zenith else "high_zenith")
grouped_observations = observations.group_by_label(labels)

print(grouped_observations)

######################################################################
#
# The results for each group of observations are shown visually below.
#

fig, ax = plt.subplots(1, 1, figsize=(7, 5))
for obs in grouped_observations["group_low_zenith"]:
    ax.plot(
        obs.get_pointing_altaz(time=obs.tmid).zen,
        obs.meta.optional["MUONEFF"],
        "d",
        color="red",
    )
for obs in grouped_observations["group_high_zenith"]:
    ax.plot(
        obs.get_pointing_altaz(time=obs.tmid).zen,
        obs.meta.optional["MUONEFF"],
        "o",
        color="blue",
    )
ax.set_ylabel("Muon efficiency")
ax.set_xlabel("Zenith angle (deg)")
ax.axvline(median_zenith.value, ls="--", color="black")
plt.show()

######################################################################
# This shows the observations grouped by zenith angle. The diamonds
# are observations which have a zenith angle less than the median value,
# whilst the circles are observations above the median.
#
# The `grouped_observations` provides a list of `~gammapy.data.Observations`
# which can be utilised in the usual way to show the various properties
# of the observations, e.g. see the :doc:`/tutorials/data/cta` tutorial.
#

######################################################################
# Hierarchical clustering of observations
# ---------------------------------------
#
# This method shows how to cluster observations based on their IRF quantities,
# in this case those that have a similar edisp and psf. The
# `~gammapy.data.utils.get_irfs_features` function is utilised to achieve this. The
# observations are then clustered based on these criteria using
# `~gammapy.utils.cluster.hierarchical_clustering`. The idea here is to minimise
# the variance of both edisp and psf within a specific group to limit the error
# on the quantity when they are stacked at the dataset level.
#
# In this example, the IRF features are computed for the `edisp-res` and
# `psf-radius` at 1 TeV. This is stored as a `~astropy.table.Table`, as shown below.
#

source_position = SkyCoord(329.71693826 * u.deg, -30.2255890 * u.deg, frame="icrs")
names = ["edisp-res", "psf-radius"]
features_irfs = get_irfs_features(
    observations, energy_true="1 TeV", position=source_position, names=names
)
print(features_irfs)

######################################################################
# Compute standardized features by removing the mean and scaling to unit
# variance:
#

scaled_features_irfs = standard_scaler(features_irfs)
print(scaled_features_irfs)

######################################################################
# The `~gammapy.utils.cluster.hierarchical_clustering` then clusters
# this table into ``t=2`` groups with a corresponding label for each group.
# In this case, we choose to cluster the observations into two groups.
# We can print this table to show the corresponding label which has been
# added to the previous ``features_irfs`` table.
#
# The arguments for `~scipy.cluster.hierarchy.fcluster` are passed as
# a dictionary here.
#

features = hierarchical_clustering(scaled_features_irfs, fcluster_kwargs={"t": 2})
print(features)

######################################################################
# Finally, ``observations.group_by_label`` creates a dictionary containing ``t``
# `~gammapy.data.Observations` objects by grouping the similar labels.
#

obs_clusters = observations.group_by_label(features["labels"])
print(obs_clusters)


mask_1 = features["labels"] == 1
mask_2 = features["labels"] == 2
fig, ax = plt.subplots(1, 1, figsize=(7, 5))
ax.set_xlabel("edisp-res")
ax.set_ylabel("psf-radius")
ax.plot(
    features_irfs[mask_1]["edisp-res"],
    features_irfs[mask_1]["psf-radius"],
    "d",
    color="green",
    label="Group 1",
)
ax.plot(
    features_irfs[mask_2]["edisp-res"],
    features_irfs[mask_2]["psf-radius"],
    "o",
    color="magenta",
    label="Group 2",
)
ax.legend()
plt.show()

######################################################################
# The groups here are divided by the quality of the IRF values ``edisp-res``
# and ``psf-radius``. The diamond and circular points indicate how the observations
# are grouped.
#
#
# In both examples we have a set of `~gammapy.data.Observation` objects which
# can be reduced using the `~gammapy.makers.DatasetsMaker` to create two (in this
# specific case) separate datasets. These can then be jointly fitted using the
# :doc:`/tutorials/analysis-3d/analysis_mwl` tutorial.
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/api/priors.py0000644000175100001770000004040314721316200020717 0ustar00runnerdocker"""
Priors
======

Learn how you can include prior knowledge into the fitting by setting
priors on single parameters.

Prerequisites
-------------

- Knowledge of spectral analysis to produce 1D spectral datasets, see
  the :doc:`/tutorials/analysis-1d/spectral_analysis` tutorial.
- Knowledge of fitting models to datasets, see the
  :doc:`/tutorials/api/fitting` tutorial
- General knowledge of statistics and priors

Context
-------

Generally, the set of parameters describing the data best is where the fit
statistic :math:`-2 \log L` has its global minimum. Depending on the type
of background estimation, it is either the Cash fit statistic or the WStat
(see :doc:`/user-guide/stats/fit_statistics` for more details). A prior on
the value of some of the parameters can be added to this fit statistic. The
prior is again a probability density function of the model parameters and
can take different forms, including Gaussian distributions, uniform
distributions, etc. The prior includes information or knowledge about the
dataset or the parameters of the fit.

Setting priors on multiple parameters simultaneously is supported;
however, for now, they should not be correlated.

The spectral dataset used here contains a simulated power-law source and
its IRFs are based on H.E.S.S. data of the Crab Nebula (similar to
:doc:`/tutorials/api/fitting`). We are simulating the source here, so that
we can control the input and check the correctness of the fit results.

The tutorial addresses three examples:

1. Including prior information about the source's index
2. Encouraging positive amplitude values
3. Adding a custom prior class

In the first example, the Gaussian prior is used. It is shown how to set a
prior on a model parameter and how it modifies the fit statistic. A source
is simulated without statistical fluctuations and fitted with and without
the priors. The different fit results are discussed.

For the second example, 100 datasets containing a very weak source are
simulated. Due to the statistical fluctuations, the amplitude’s best-fit
value is negative for some draws. By setting a uniform prior on the
amplitude, this can be avoided.
The setup
---------
"""

import numpy as np
from astropy import units as u
import matplotlib.pyplot as plt
from gammapy.datasets import SpectrumDatasetOnOff
from gammapy.modeling import Fit
from gammapy.modeling.models import (
    GaussianPrior,
    PowerLawSpectralModel,
    SkyModel,
    UniformPrior,
)
from gammapy.utils.check import check_tutorials_setup


######################################################################
# Check setup
# -----------
#

check_tutorials_setup()


######################################################################
# Model and dataset
# -----------------
#
# First, we define the source model, a power law with an index of
# :math:`2.3`
#

pl_spectrum = PowerLawSpectralModel(
    index=2.3,
    amplitude=1e-11 / u.cm**2 / u.s / u.TeV,
)
model = SkyModel(spectral_model=pl_spectrum, name="simu-source")


######################################################################
# The data and background are read from pre-computed ON/OFF datasets of
# H.E.S.S. observations. For simplicity, we stack them together and transform
# the dataset to a `~gammapy.datasets.SpectrumDataset`. Then we set the model and create
# an Asimov dataset (a dataset without statistical fluctuations) by setting the counts as
# the model predictions.
#

dataset = SpectrumDatasetOnOff.read(
    "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits"
)

# Set model and fit range
e_min = 0.66 * u.TeV
e_max = 30 * u.TeV
dataset.mask_fit = dataset.counts.geom.energy_mask(e_min, e_max)

dataset = dataset.to_spectrum_dataset()
dataset1 = dataset.copy()
dataset1.models = model.copy(name="model")
dataset1.counts = dataset1.npred()


######################################################################
# Example 1: Including Prior Information about the Source's Index
# ----------------------------------------------------------------
#
# The index was assumed to be :math:`2.3`. However, let us assume you
# have reasons to believe that the index value of the source is actually
# :math:`2.1`. This can be due to theoretical predictions, other
# instruments’ results, etc. We can now create a Gaussian-distributed
# prior centred on the expected value of :math:`2.1`. The standard
# deviation of the Gaussian quantifies how much we believe the prior
# knowledge to be true. The smaller the standard deviation, the stronger
# the constraining ability of the prior. For now, we set it to the
# arbitrary value of :math:`0.1`.
#

# initialising the Gaussian prior
gaussianprior = GaussianPrior(mu=2.1, sigma=0.1)

# setting the Gaussian prior on the index parameter
model_prior = model.copy()
model_prior.parameters["index"].prior = gaussianprior


######################################################################
# The value of the prior depends on the value of the index. If the index
# value equals the Gaussian's mean (here 2.1), the prior is zero. This
# means that nothing is added to the Cash statistic, and this value is
# favoured in the fit. If the value of the index deviates from the mean of
# 2.1, a prior value > 0 is added to the Cash statistic.
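######################################################################
# As a minimal check, the prior penalty can be evaluated directly for a
# parameter (the same calling convention is used for the custom prior at
# the end of this tutorial). Here the index still has its simulated value
# of 2.3, so the penalty is positive:
#

print(gaussianprior(model_prior.parameters["index"]))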
#
# For the visualisation, the values are appended to a list; this is not
# necessary for the fitting
prior_stat_sums = []

with model_prior.parameters.restore_status():
    i_scan = np.linspace(1.8, 2.3, 100)
    for a in i_scan:
        model_prior.parameters["index"].value = a
        prior_stat_sums.append(model_prior.parameters.prior_stat_sum())

plt.plot(
    i_scan,
    prior_stat_sums,
    color="tab:orange",
    linestyle="dashed",
    label=f"Gaussian Prior: \n$\\mu = {gaussianprior.mu.value}$; $\\sigma = {gaussianprior.sigma.value}$",
)
plt.axvline(x=gaussianprior.mu.value, color="red")
plt.xlabel("Index Value")
plt.ylabel("Prior")
plt.legend()
plt.xlim(2.0, 2.2)
plt.ylim(-0.05, 1.1)
plt.show()


######################################################################
# Fitting a Dataset with and without the Prior
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# Now, we copy the dataset consisting of the power-law source and assign
# the model with the Gaussian prior set on its index to it. Both of the
# datasets are fitted.
#

dataset1.fake()
dataset1_prior = dataset1.copy()
dataset1_prior.models = model_prior.copy(name="prior-model")

fit = Fit()
results = fit.run(dataset1)
results_prior = fit.run(dataset1_prior)


######################################################################
# The parameters table will mention the type of prior associated with each model

print(results_prior.models.to_parameters_table())

######################################################################
# To see the details of the priors, e.g.:

print(results_prior.models.parameters["index"].prior)

######################################################################
# The likelihood profiles can be computed for both datasets. Here, the
# likelihood is computed for different values of the index. At each
# step, the other free parameters are re-optimised.
#

dataset1.models.parameters["index"].scan_n_values = 20
dataset1_prior.models.parameters["index"].scan_n_values = 20

scan = fit.stat_profile(datasets=dataset1, parameter="index", reoptimize=True)
scan_prior = fit.stat_profile(
    datasets=dataset1_prior, parameter="index", reoptimize=True
)


######################################################################
# Now, we can compare the two likelihood scans. In the first case, we did
# not set any prior. This means that only the Cash statistic itself is
# minimised. The minimum of the Cash statistic is at the actual value of
# the index we used for the simulation (:math:`2.3`). Therefore, the
# best-fit value was found to be :math:`2.3`. Note how the error bars
# correspond to the :math:`1\sigma` error, i.e. where the stat sum equals
# the minimum + 1.
#
# The plot also shows the prior we set on the index for the second
# dataset, as computed in the scan above. If the logarithm of the prior is
# added to the Cash statistic, one ends up with the posterior function.
# The posterior is minimised during the second fit to obtain the maximum a
# posteriori estimation. Its minimum lies between the prior's and the Cash
# statistic's minima. The best-fit value is :math:`2.16`. We weighed the
# truth with our prior beliefs and ended up with a compromise between the
# values. The uncertainty of the parameter is again where the posterior
# distribution equals its minimum + 1.
#
# The best-fit index value and uncertainty depend strongly on the standard
# deviation of the Gaussian prior. You can vary :math:`\sigma` and see how
# the likelihood profiles and corresponding minima will change.
#

plt.plot(scan["model.spectral.index_scan"], scan["stat_scan"], label="Cash Statistics")
plt.plot(
    i_scan,
    np.array(prior_stat_sums) + np.min(scan["stat_scan"]),
    linestyle="dashed",
    label="Prior",
)
plt.plot(
    scan_prior["prior-model.spectral.index_scan"],
    scan_prior["stat_scan"],
    label="Posterior\n(Cash Stat. + Prior)",
)
par = dataset1.models.parameters["index"]
plt.errorbar(
    x=par.value,
    y=np.min(scan["stat_scan"]) + 1,
    xerr=par.error,
    fmt="x",
    markersize=6,
    capsize=8,
    color="darkblue",
    label="Best-Fit w/o Prior",
)
par = dataset1_prior.models.parameters["index"]
plt.errorbar(
    x=par.value,
    y=np.min(scan_prior["stat_scan"]) + 1,
    xerr=par.error,
    fmt="x",
    markersize=6,
    capsize=8,
    color="darkgreen",
    label="Best-Fit w/ Prior",
)
plt.legend()
# plt.ylim(31.5,35)
plt.xlabel("Index")
plt.ylabel("Fit Statistics [arb. unit]")
plt.show()


######################################################################
# This example shows a critical point about using priors: we were able to
# manipulate the best-fit index. This can have multiple advantages if one
# wants to include additional information. But it should always be used
# carefully, so as not to falsify or bias any results!
#
# Note how the :math:`\Delta`\ TS of the dataset with the prior set is
# larger (:math:`-53.91`) than the one without the prior (:math:`-55.03`),
# since the index is not fitted to the underlying true value. If the
# Gaussian prior's mean were the true value of :math:`2.3`, the index would
# be fitted correctly, and the :math:`\Delta`\ TS values would be the
# same.
#

######################################################################
# Example 2: Encouraging Positive Amplitude Values
# ------------------------------------------------
#
# In the next example, we want to encourage the amplitude to have
# positive, i.e. physical, values. Instead of setting hard bounds, we can
# also set a uniform prior, which prefers positive values over negative ones.
#
# We set the amplitude of the power law used to simulate the source to a very
# small value. Together with statistical fluctuations, this could result in some
# negative amplitude best-fit values.
#

model_weak = SkyModel(
    PowerLawSpectralModel(
        amplitude=1e-13 / u.cm**2 / u.s / u.TeV,
    ),
    name="weak-model",
)

model_weak_prior = model_weak.copy(name="weak-model-prior")
uniform = UniformPrior(min=0)
uniform.weight = 2
model_weak_prior.parameters["amplitude"].prior = uniform


######################################################################
# We set the minimum value to zero and, by default, the maximum value
# is set to positive infinity. Therefore, the uniform prior penalty
# is zero, i.e. it has no influence on the fit at all, if the amplitude
# value is positive, while a penalty (the weight) in the form of a prior
# likelihood is added for negative values.
# Here, we are setting it to 2. This value is only applied if the
# amplitude value goes below zero.
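######################################################################
# As a minimal check, we can again evaluate the prior penalty directly
# for the amplitude parameter; for the positive amplitude value set
# above, the penalty is zero:
#

print(uniform(model_weak_prior.parameters["amplitude"]))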
#

uni_prior_stat_sums = []
with model_weak_prior.parameters.restore_status():
    a_scan = np.linspace(-1, 1, 100)
    for a in a_scan:
        model_weak_prior.parameters["amplitude"].value = a
        uni_prior_stat_sums.append(model_weak_prior.parameters.prior_stat_sum())

plt.plot(
    a_scan,
    uni_prior_stat_sums,
    color="tab:orange",
    linestyle="dashed",
    label=f"Uniform Prior\n $min={uniform.min.value}$, weight={uniform.weight}",
)
plt.xlabel("Amplitude Value [1 / (TeV s cm2)]")
plt.ylabel("Prior")
plt.legend()
plt.show()


######################################################################
# Fitting Multiple Datasets with and without the Prior
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# To showcase how the uniform prior affects the fit results, :math:`100`
# datasets are created and fitted without and with the prior

results, results_prior = [], []
N = 100
dataset2 = dataset.copy()
for n in range(N):
    # simulating the dataset
    dataset2.models = model_weak.copy()
    dataset2.fake()
    dataset2_prior = dataset2.copy()
    dataset2_prior.models = model_weak_prior.copy()

    # fitting without the prior
    fit = Fit()
    result = fit.optimize(dataset2)
    results.append(
        {
            "index": result.parameters["index"].value,
            "amplitude": result.parameters["amplitude"].value,
        }
    )

    # fitting with the prior
    fit_prior = Fit()
    result = fit_prior.optimize(dataset2_prior)
    results_prior.append(
        {
            "index": result.parameters["index"].value,
            "amplitude": result.parameters["amplitude"].value,
        }
    )

fig, axs = plt.subplots(1, 2, figsize=(7, 4))
for i, parname in enumerate(["index", "amplitude"]):
    par = np.array([_[parname] for _ in results])
    c, bins, _ = axs[i].hist(par, bins=20, alpha=0.5, label="Without Prior")
    par = np.array([_[parname] for _ in results_prior])
    axs[i].hist(par, bins=bins, alpha=0.5, color="tab:green", label="With Prior")
    axs[i].axvline(x=model_weak.parameters[parname].value, color="red")
    axs[i].set_xlabel(f"Reconstructed spectral\n {parname}")
    axs[i].legend()
plt.tight_layout()
plt.show()


######################################################################
# The distribution of the best-fit amplitudes shows how fewer best-fit
# amplitudes have negative values. This also has an effect on the
# distribution of the best-fit indices. How exactly the distribution
# changes depends on the weight assigned to the uniform prior. The
# stronger the weight, the fewer negative amplitudes.
#
# Note that the model parameters' uncertainties are, by default, computed
# symmetrically. This can lead to incorrect
# uncertainties, especially with asymmetrical priors like the uniform prior
# above. Calculating the uncertainties from the profile likelihood
# is advised. For more details see the :doc:`/tutorials/api/fitting` tutorial.
#

######################################################################
# Implementing a custom prior
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^
#
# For now, only ``GaussianPrior`` and ``UniformPrior`` are implemented.
# To add a use-case-specific prior one has to create a prior subclass
# containing:
#
# - a tag and a _type used for the serialization
# - an instantiation of each PriorParameter with its unit and default value
# - the evaluate function, where the mathematical expression for the prior is defined.
#
# As an example for a custom prior, a Jeffreys prior for a scale parameter is chosen.
# The only parameter is ``sigma``, and the evaluation method returns the
# parameter value divided by ``sigma`` squared.

from gammapy.modeling import PriorParameter
from gammapy.modeling.models import Model, Prior


class MyCustomPrior(Prior):
    """Custom Prior.
    Parameters
    ----------
    sigma : float
        Scale parameter of the prior. Default is 1.
    """

    tag = ["MyCustomPrior"]
    sigma = PriorParameter(name="sigma", value=1, unit="")
    _type = "prior"

    @staticmethod
    def evaluate(value, sigma):
        """Evaluate the custom prior."""
        return value / sigma**2


# The custom prior is added to the PRIOR_REGISTRY so that it can be serialised.
from gammapy.modeling.models import PRIOR_REGISTRY

PRIOR_REGISTRY.append(MyCustomPrior)

# The custom prior is set on the index of a power-law spectral model and is evaluated.
customprior = MyCustomPrior(sigma=0.5)
pwl = PowerLawSpectralModel()
pwl.parameters["index"].prior = customprior
print(customprior(pwl.parameters["index"]))

# The power-law spectral model can be written into a dictionary.
# If a model is read in from this dictionary, the custom prior is still set on the index.
print(pwl.to_dict())

model_read = Model.from_dict(pwl.to_dict())
print(model_read.parameters.prior)
././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.176642 gammapy-1.3/examples/tutorials/data/0000755000175100001770000000000014721316215017174 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/data/README.rst0000644000175100001770000000055714721316200020664 0ustar00runnerdockerData exploration
================

These tutorials show how to perform data exploration with Gammapy,
providing an introduction to the CTA, HAWC, H.E.S.S. and Fermi-LAT data
and instrument response functions (IRFs). You will be able to explore and
filter event lists according to different criteria, as well as to get a
quick look at the multidimensional IRF files.
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/data/cta.py0000644000175100001770000003517214721316200020317 0ustar00runnerdocker"""
CTAO with Gammapy
=================

Access and inspect CTAO data and instrument response functions (IRFs) using Gammapy.

Introduction
------------

The `Cherenkov Telescope Array Observatory (CTAO) `__ is the next generation
ground-based observatory for gamma-ray astronomy. Gammapy is the core
library for the Cherenkov Telescope Array Observatory (CTAO) science tools
(`2017ICRC…35..766D `__ and `CTAO Press Release `__).

CTAO will start taking data in the coming years. For now, to learn how to
analyse CTAO data and to use Gammapy, if you are a member of the CTAO
consortium, you can use the simulated dataset from:

- the CTA first data challenge, which ran in 2017 and 2018 (see `here `__ for CTAO members)
- the CTAO Science Data Challenge of 2024 (see `here `__ for CTAO members)

Gammapy fully supports the FITS data formats (events, IRFs) used in CTA
1DC and SDC. The XML sky model format is not supported, but it is also not
needed to analyse the data; you have to specify your model via the Gammapy
YAML model format, or Python code, as shown below.

You can also use Gammapy to simulate CTAO data and evaluate CTAO
performance using the CTAO response files. Two sets of responses are
available for different array layouts:

- the Omega configuration (prod3b, 2016): https://zenodo.org/records/5163273,
- the Alpha configuration (prod5, 2021): https://zenodo.org/records/5499840.

They are all fully supported by Gammapy.
Tutorial overview
-----------------

This notebook shows how to access CTAO data and instrument response
functions (IRFs) using Gammapy, and gives some examples of how to take a
quick look at the content of CTAO files, especially to see the shape of
CTAO IRFs.

At the end of the notebook, we give several links to other tutorial
notebooks that show how to simulate CTAO data and how to evaluate CTAO
observability and sensitivity, or how to analyse CTAO data.

Note that the FITS data and IRF format currently used by CTAO is the one
documented at https://gamma-astro-data-formats.readthedocs.io/, and is
also used by H.E.S.S. and other Imaging Atmospheric Cherenkov Telescopes
(IACTs). So if you see other Gammapy tutorials using e.g. H.E.S.S.
example data, know that they also apply to CTAO; all you have to do is
change the loaded data or IRFs to CTAO.

Setup
-----
"""

import os
from pathlib import Path
from astropy import units as u

# %matplotlib inline
import matplotlib.pyplot as plt
from IPython.display import display
from gammapy.data import DataStore, EventList
from gammapy.irf import EffectiveAreaTable2D, load_irf_dict_from_file

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup

check_tutorials_setup()


######################################################################
# CTA 1DC
# -------
#
# The CTA first data challenge (1DC) ran in 2017 and 2018. It is described
# in detail
# `here `__
# and a description of the data and how to download it is
# `here `__.
#
# You should download `caldb.tar.gz` (1.2 MB), `models.tar.gz` (0.9
# GB), `index.tar.gz` (0.5 MB), as well as optionally the simulated
# survey data you are interested in: Galactic plane survey `gps.tar.gz`
# (8.3 GB), Galactic center `gc.tar.gz` (4.4 MB), Extragalactic survey
# `egal.tar.gz` (2.5 GB), AGN monitoring `agn.wobble.tar.gz` (4.7 GB).
# After the download, follow the instructions on how to `untar` the files,
# and set a `CTADATA` environment variable to point to the data.
#
# **For convenience**, since the 1DC data files are large and not publicly
# available, we have taken a tiny subset of the CTA 1DC data,
# four observations with the southern array from the GPS survey, pointing
# near the Galactic center, and **included them at `$GAMMAPY_DATA/cta-1dc`**
# which you get via `gammapy download datasets`.
#
# Files
# ~~~~~
#
# Next we will show a quick overview of the files and how to load them,
# and some quick look plots showing the shape of the CTAO IRFs. How to do
# CTAO simulations and analyses is shown in other tutorials, see links at
# the end of this notebook.
#

# !ls -1 $GAMMAPY_DATA/cta-1dc

# !ls -1 $GAMMAPY_DATA/cta-1dc/data/baseline/gps

# !ls -1 $GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h

# !ls -1 $GAMMAPY_DATA/cta-1dc/index/gps

######################################################################
# Access to the IRF files requires defining a `CALDB` environment
# variable. We are going to define it only for this notebook so it won’t
# overwrite the one you may have already defined.
#

os.environ["CALDB"] = os.environ["GAMMAPY_DATA"] + "/cta-1dc/caldb"

######################################################################
# Datastore
# ~~~~~~~~~
#
# You can use the `~gammapy.data.DataStore` to load data via the index files
#

data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps")
print(data_store)

######################################################################
# If you can’t download the index files, or get errors related to the data
# access using them, you can generate the `DataStore` directly from the
# event files.
#

path = Path(os.environ["GAMMAPY_DATA"]) / "cta-1dc/data"
paths = list(path.rglob("*.fits"))
data_store = DataStore.from_events_files(paths)
print(data_store)

data_store.obs_table[["OBS_ID", "GLON_PNT", "GLAT_PNT", "IRF"]]

observation = data_store.obs(110380)
print(observation)

######################################################################
# Events
# ------
#
# We can load event data via the data store and observation, or
# equivalently via the `~gammapy.data.EventList` class by specifying the
# EVENTS filename.
#
# The quick-look `events.peek()` plot below shows that CTAO has a field
# of view of a few degrees, and two energy thresholds, one significantly
# below 100 GeV where the CTAO Large-Sized Telescopes (LSTs) detect
# events, and a second one near 100 GeV where the Medium-Sized Telescopes
# (MSTs) start to detect events.
#
# Note that most events are “hadronic background” due to cosmic-ray
# showers in the atmosphere that pass the gamma-hadron selection cuts for
# this analysis configuration. Since this is simulated data, a column
# `MC_ID` is available that gives an emission component identifier code,
# and the EVENTS header in `events.table.meta` can be used to look up
# which `MC_ID` corresponds to which emission component.
#

# Events can be accessed from the observation object like:
events = observation.events

######################################################################
# Or read directly from an event file:
#

events = EventList.read(
    "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits"
)

######################################################################
# Here we print the data from the first 5 events listed in the table:
#

display(events.table[:5])

######################################################################
# And show a summary plot:
#

events.peek()
plt.show()

######################################################################
# IRFs
# ----
#
# The CTAO instrument response functions (IRFs) are given as FITS files in
# the `caldb` folder, the following IRFs are available:
#
# - effective area
# - energy dispersion
# - point spread function
# - background
#
# Notes:
#
# - The IRFs contain the energy and offset dependence of the CTAO response
# - CTA 1DC was based on an early version of the CTAO FITS responses
#   produced in 2017, improvements have been made since.
# - The point spread function was approximated by a Gaussian shape
# - The background is from hadronic and electron air shower events that
#   pass CTAO selection cuts. It was given as a function of field of view
#   coordinates, although it is radially symmetric.
# - The energy dispersion in CTA 1DC is noisy at low energy, leading to
#   unreliable spectral points for some analyses.
# - The CTA 1DC response files have the first node at field of view
#   offset 0.5 deg, so to get the on-axis response at offset 0 deg,
#   Gammapy has to extrapolate.
#   Furthermore, because diffuse gamma-rays
#   in the FOV were used to derive the IRFs, and the solid angle at small
#   FOV offset circles is small, the IRFs at the center of the FOV are
#   somewhat noisy. This leads to unstable analysis and simulation issues
#   when using the DC1 IRFs for some analyses.
#

print(observation.aeff)

irf_filename = (
    "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits"
)
irfs = load_irf_dict_from_file(irf_filename)
print(irfs)

######################################################################
# Effective area
# ~~~~~~~~~~~~~~
#

# An equivalent, alternative way to load IRFs directly
aeff = EffectiveAreaTable2D.read(irf_filename, hdu="EFFECTIVE AREA")
print(aeff)

irfs["aeff"].peek()
plt.show()

# What is the on-axis effective area at 10 TeV?
print(aeff.evaluate(energy_true="10 TeV", offset="0 deg").to("km2"))

######################################################################
# Energy dispersion
# ~~~~~~~~~~~~~~~~~
#

irfs["edisp"].peek()
plt.show()

######################################################################
# Point spread function
# ~~~~~~~~~~~~~~~~~~~~~
#

irfs["psf"].peek()
plt.show()

# %%
# This is how, for analysis, you could slice out the PSF
# at a given field of view offset
plt.figure(figsize=(8, 5))
irfs["psf"].plot_containment_radius_vs_energy(
    offset=[1] * u.deg, fraction=[0.68, 0.8, 0.95]
)
plt.show()

######################################################################
# Background
# ~~~~~~~~~~
#
# The background is given as a rate in units of `MeV-1 s-1 sr-1`.
#

irfs["bkg"].peek()
plt.show()

print(irfs["bkg"].evaluate(energy="3 TeV", fov_lon="1 deg", fov_lat="0 deg"))

######################################################################
# To visualise the background at particular energies:
#

irfs["bkg"].plot_at_energy(
    ["100 GeV", "500 GeV", "1 TeV", "3 TeV", "10 TeV", "100 TeV"]
)
plt.show()

######################################################################
# Source models
# -------------
#
# The 1DC sky model is distributed as a set of XML files, which in turn
# link to a ton of other FITS and text files. Gammapy doesn’t support this
# XML model file format. We are currently developing a YAML-based format
# that improves upon the XML format, to be easier to write and read, add
# relevant information (units for physical quantities), and omit useless
# information (e.g. parameter scales in addition to values).
#
# If you must or want to read the XML model files, you can use
# e.g. `ElementTree `__
# from the Python standard library, or
# `xmltodict `__ if you
# `pip install xmltodict`. Here’s an example of how to load the information
# for a given source, and to convert it into the sky model format Gammapy
# understands.
#

# This is what the XML file looks like
# !tail -n 20 $CTADATA/models/models_gps.xml

# TODO: write this example!
# Read XML file and access spectrum parameters
# from gammapy.extern import xmltodict

# filename = os.path.join(os.environ["CTADATA"], "models/models_gps.xml")
# data = xmltodict.parse(open(filename).read())
# data = data["source_library"]["source"][-1]
# data = data["spectrum"]["parameter"]
# data

# Create a spectral model with the right units
# from astropy import units as u
# from gammapy.modeling.models import PowerLawSpectralModel

# par_to_val = lambda par: float(par["@value"]) * float(par["@scale"])
# spec = PowerLawSpectralModel(
#     amplitude=par_to_val(data[0]) * u.Unit("cm-2 s-1 MeV-1"),
#     index=par_to_val(data[1]),
#     reference=par_to_val(data[2]) * u.Unit("MeV"),
# )
# print(spec)

######################################################################
# Latest CTAO performance files
# -----------------------------
#
# CTA 1DC is useful to learn how to analyse CTAO data. But to do
# simulations and studies for CTAO now, you should get the most recent
# CTAO IRFs in FITS format from
# https://www.ctao.org/for-scientists/performance/.
#
# If you want to use other response files, the following code cells (remove the # to uncomment)
# explain how to proceed. This example is made with the Alpha configuration (Prod5).
#

# !curl -o cta-prod5-zenodo-fitsonly-v0.1.zip https://zenodo.org/records/5499840/files/cta-prod5-zenodo-fitsonly-v0.1.zip
# !unzip cta-prod5-zenodo-fitsonly-v0.1.zip
# !ls fits/

# !tar xf fits/CTA-Performance-prod5-v0.1-South-40deg.FITS.tar.gz -C fits/.
# !ls fits/*.fits.gz

# irfs1 = load_irf_dict_from_file("fits/Prod5-South-40deg-SouthAz-14MSTs37SSTs.180000s-v0.1.fits.gz")
# irfs1["aeff"].plot_energy_dependence()

# irfs2 = load_irf_dict_from_file("fits/Prod5-South-40deg-SouthAz-14MSTs37SSTs.1800s-v0.1.fits.gz")
# irfs2["aeff"].plot_energy_dependence()

######################################################################
# Exercises
# ---------
#
# - Load the EVENTS file for `obs_id=111159` as a
#   `~gammapy.data.EventList` object.
# - Use `~gammapy.data.EventList.table` to find the energy, sky coordinate and time of
#   the highest-energy event.
# - Use `~gammapy.data.EventList.pointing_radec` to find the pointing position of this
#   observation, and use `astropy.coordinates.SkyCoord` methods to find
#   the field of view offset of the highest-energy event.
# - What is the effective area and PSF 68% containment radius of CTAO at 1
#   TeV for the `South_z20_50h` configuration used for the CTA 1DC
#   simulation?
# - Get the latest CTAO FITS performance files from
#   https://www.ctao.org/for-scientists/performance/ and run the
#   code example above. Make an effective area ratio plot of 40 deg
#   zenith versus 20 deg zenith for the `South_z40_50h` and
#   `South_z20_50h` configurations.
#

# start typing here ...

######################################################################
# Next steps
# ----------
#
# - Learn how to analyse data with
#   :doc:`/tutorials/starting/analysis_1` and
#   :doc:`/tutorials/starting/analysis_2` or any other
#   Gammapy analysis tutorial.
# - Learn how to evaluate CTAO observability and sensitivity with
#   :doc:`/tutorials/analysis-3d/simulate_3d`,
#   :doc:`/tutorials/analysis-1d/spectrum_simulation`
#   or :doc:`/tutorials/analysis-1d/cta_sensitivity`.
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/data/fermi_lat.py0000644000175100001770000003440114721316200021504 0ustar00runnerdocker"""
Fermi-LAT with Gammapy
======================

Data inspection and preliminary analysis with Fermi-LAT data.

Introduction
------------

This tutorial will show you how to work with Fermi-LAT data with Gammapy.
As an example, we will look at the Galactic center region using the
high-energy dataset that was used for the 3FHL catalog, in the energy
range 10 GeV to 2 TeV.

We note that support for Fermi-LAT data analysis in Gammapy is very
limited. For most tasks, we recommend you use
`Fermipy `__, which is based on the `Fermi Science Tools `__
(Fermi ST).

Using Gammapy with Fermi-LAT data could be an option for you if you want
to do an analysis that is not easily possible with Fermipy and the Fermi
Science Tools. For example, a joint likelihood fit of Fermi-LAT data with
data e.g. from H.E.S.S., MAGIC, VERITAS or some other instrument, or
analysis of Fermi-LAT data with a complex spatial or spectral model that
is not available in Fermipy or the Fermi ST.

Besides Gammapy, you might want to look at `Sherpa `__ or
`3ML `__. Or just use Python to roll your own analysis using
several existing analysis packages. E.g. it is possible to use Fermipy and
the Fermi ST to evaluate the likelihood on Fermi-LAT data, and Gammapy to
evaluate it e.g. for IACT data, and to do a joint likelihood fit using
e.g. `iminuit `__ or `emcee `__.

To use Fermi-LAT data with Gammapy, you first have to use the Fermi ST to
prepare an event list (using `gtselect` and `gtmktime`), an exposure cube
(using `gtexpcube2`) and a PSF (using `gtpsf`). You can then use
`~gammapy.data.EventList`, `~gammapy.maps` and the `~gammapy.irf.PSFMap`
to read the Fermi-LAT maps and PSF, i.e. support for these high-level
analysis products from the Fermi ST is built in.

To do a 3D map analysis, you can use Fit for Fermi-LAT data in the same
way that it’s used for IACT data. This is illustrated in this notebook. A
1D region-based spectral analysis is also possible; this will be
illustrated in a future tutorial.

Setup
-----

**IMPORTANT**: For this notebook you have to get the prepared `3fhl`
dataset provided in your `$GAMMAPY_DATA`.

Note that the `3fhl` dataset is high-energy only, ranging from 10 GeV to
2 TeV.
""" from astropy import units as u from astropy.coordinates import SkyCoord # %matplotlib inline import matplotlib.pyplot as plt from IPython.display import display from gammapy.data import EventList from gammapy.datasets import Datasets, MapDataset from gammapy.irf import EDispKernelMap, PSFMap from gammapy.maps import Map, MapAxis, WcsGeom from gammapy.modeling import Fit from gammapy.modeling.models import ( Models, PointSpatialModel, PowerLawNormSpectralModel, PowerLawSpectralModel, SkyModel, TemplateSpatialModel, create_fermi_isotropic_diffuse_model, ) ###################################################################### # Check setup # ----------- from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() ###################################################################### # Events # ------ # # To load up the Fermi-LAT event list, use the `~gammapy.data.EventList` # class: # events = EventList.read("$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_events_selected.fits.gz") print(events) ###################################################################### # The event data is stored in a # `astropy.table.Table `__ # object. In case of the Fermi-LAT event list this contains all the # additional information on position, zenith angle, earth azimuth angle, # event class, event type etc. # print(events.table.colnames) display(events.table[:5][["ENERGY", "RA", "DEC"]]) print(events.time[0].iso) print(events.time[-1].iso) energy = events.energy energy.info("stats") ###################################################################### # As a short analysis example we will count the number of events above a # certain minimum energy: # for e_min in [10, 100, 1000] * u.GeV: n = (events.energy > e_min).sum() print(f"Events above {e_min:4.0f}: {n:5.0f}") ###################################################################### # Counts # ------ # # Let us start to prepare things for an 3D map analysis of the Galactic # center region with Gammapy. The first thing we do is to define the map # geometry. We chose a TAN projection centered on position # `(glon, glat) = (0, 0)` with pixel size 0.1 deg, and four energy bins. # gc_pos = SkyCoord(0, 0, unit="deg", frame="galactic") energy_axis = MapAxis.from_edges( [1e4, 3e4, 1e5, 3e5, 2e6], name="energy", unit="MeV", interp="log" ) counts = Map.create( skydir=gc_pos, npix=(100, 80), proj="TAN", frame="galactic", binsz=0.1, axes=[energy_axis], dtype=float, ) # We put this call into the same Jupyter cell as the Map.create # because otherwise we could accidentally fill the counts # multiple times when executing the `fill_by_coord` multiple times. counts.fill_events(events) print(counts.geom.axes[0]) counts.sum_over_axes().smooth(2).plot(stretch="sqrt", vmax=30) plt.show() ###################################################################### # Exposure # -------- # # The Fermi-LAT dataset contains the energy-dependent exposure for the # whole sky as a HEALPix map computed with `gtexpcube2`. This format is # supported by `~gammapy.maps.Map` directly. # # Interpolating the exposure cube from the Fermi ST to get an exposure # cube matching the spatial geometry and energy axis defined above with # Gammapy is easy. The only point to watch out for is how exactly you want # the energy axis and binning handled. # # Below we just use the default behaviour, which is linear interpolation # in energy on the original exposure cube. Probably log interpolation # would be better, but it doesn’t matter much here, because the energy # binning is fine. 
Finally, we reuse the spatial geometry of the counts
# map and define a true-energy axis for the exposure. The counts map itself
# contains an energy axis with `node_type="edges"`; this is non-ideal
# for exposure cubes, but again, acceptable because exposure doesn’t vary
# much from bin to bin, so the exact way interpolation occurs in later use
# of that exposure cube doesn’t matter a lot. Of course you could define
# any energy axis for your exposure cube that you like.
#

exposure_hpx = Map.read("$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_exposure_cube_hpx.fits.gz")
print(exposure_hpx.geom)
print(exposure_hpx.geom.axes[0])

exposure_hpx.plot()
plt.show()

######################################################################
# For the exposure, we choose a geometry with node_type='center',
axis = MapAxis.from_energy_bounds(
    "10 GeV",
    "2 TeV",
    nbin=10,
    per_decade=True,
    name="energy_true",
)
geom = WcsGeom(wcs=counts.geom.wcs, npix=counts.geom.npix, axes=[axis])

exposure = exposure_hpx.interp_to_geom(geom)

print(exposure.geom)
print(exposure.geom.axes[0])

######################################################################
# Exposure is almost constant across the field of view
exposure.slice_by_idx({"energy_true": 0}).plot(add_cbar=True)
plt.show()

######################################################################
# Exposure varies very little with energy at these high energies
energy = [10, 100, 1000] * u.GeV
print(exposure.get_by_coord({"skycoord": gc_pos, "energy_true": energy}))

######################################################################
# Galactic diffuse background
# ---------------------------
#

######################################################################
# The Fermi-LAT collaboration provides a galactic diffuse emission model,
# that can be used as a background model for Fermi-LAT source analysis.
#
# Diffuse model maps are very large (100s of MB), so as an example here,
# we just load one that represents a small cutout for the Galactic center
# region.
#
# In this case, the maps are already in differential units, so we do not
# want to normalise them again.
#

template_diffuse = TemplateSpatialModel.read(
    filename="$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz", normalize=False
)

print(template_diffuse.map)

diffuse_iem = SkyModel(
    spectral_model=PowerLawNormSpectralModel(),
    spatial_model=template_diffuse,
    name="diffuse-iem",
)

######################################################################
# Let’s look at the map of the first energy band of the cube:
#

template_diffuse.map.slice_by_idx({"energy_true": 0}).plot(add_cbar=True)
plt.show()

######################################################################
# Here is the spectrum at the Galactic center:
#

dnde = template_diffuse.map.to_region_nd_map(region=gc_pos)
dnde.plot()
plt.xlabel("Energy (GeV)")
plt.ylabel("Flux (cm-2 s-1 MeV-1 sr-1)")
plt.show()

######################################################################
# Isotropic diffuse background
# ----------------------------
#
# To load the isotropic diffuse model with Gammapy, use the
# `~gammapy.modeling.models.TemplateSpectralModel`.
We are using
# `'extrapolate': True` to extrapolate the model above 500 GeV:
#

filename = "$GAMMAPY_DATA/fermi_3fhl/iso_P8R2_SOURCE_V6_v06.txt"

diffuse_iso = create_fermi_isotropic_diffuse_model(
    filename=filename, interp_kwargs={"extrapolate": True}
)

######################################################################
# We can plot the model in the energy range between 50 GeV and 2000 GeV:
#

energy_bounds = [50, 2000] * u.GeV
diffuse_iso.spectral_model.plot(energy_bounds, yunits=u.Unit("1 / (cm2 MeV s)"))
plt.show()

######################################################################
# PSF
# ---
#
# Next we will take a look at the PSF. It was computed using `gtpsf`, in
# this case for the Galactic center position. Note that generally for
# Fermi-LAT, the PSF only varies little within a given region of the sky,
# especially at high energies like what we have here. We use the
# `~gammapy.irf.PSFMap` class to load the PSF and use some of its
# methods to get some information about it.
#

psf = PSFMap.read("$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_psf_gc.fits.gz", format="gtpsf")
print(psf)

######################################################################
# To get an idea of the size of the PSF we check how the containment radii
# of the Fermi-LAT PSF vary with energy and different containment
# fractions:
#

plt.figure(figsize=(8, 5))
psf.plot_containment_radius_vs_energy()
plt.show()

######################################################################
# In addition we can check how the actual shape of the PSF varies with
# energy and compare it against the mean PSF between 50 GeV and 2000 GeV:
#

plt.figure(figsize=(8, 5))

energy = [100, 300, 1000] * u.GeV
psf.plot_psf_vs_rad(energy_true=energy)

spectrum = PowerLawSpectralModel(index=2.3)
psf_mean = psf.to_image(spectrum=spectrum)
psf_mean.plot_psf_vs_rad(c="k", ls="--", energy_true=[500] * u.GeV)

plt.xlim(1e-3, 0.3)
plt.ylim(1e3, 1e6)
plt.legend()
plt.show()

######################################################################
# This is what the corresponding PSF kernel looks like:
#

psf_kernel = psf.get_psf_kernel(
    position=geom.center_skydir, geom=geom, max_radius="1 deg"
)
psf_kernel.to_image().psf_kernel_map.plot(stretch="log", add_cbar=True)
plt.show()

######################################################################
# Energy Dispersion
# ~~~~~~~~~~~~~~~~~
#
# For simplicity we assume a diagonal energy dispersion:
#

e_true = exposure.geom.axes["energy_true"]
edisp = EDispKernelMap.from_diagonal_response(
    energy_axis_true=e_true, energy_axis=energy_axis
)

edisp.get_edisp_kernel().plot_matrix()
plt.show()

######################################################################
# Fit
# ---
#
# Now, the big finale: let’s do a 3D map fit for the source at the
# Galactic center, to measure its position and spectrum. We keep the
# background normalization free.
spatial_model = PointSpatialModel(lon_0="0 deg", lat_0="0 deg", frame="galactic")

spectral_model = PowerLawSpectralModel(
    index=2.7, amplitude="5.8e-10 cm-2 s-1 TeV-1", reference="100 GeV"
)

source = SkyModel(
    spectral_model=spectral_model,
    spatial_model=spatial_model,
    name="source-gc",
)

models = Models([source, diffuse_iem, diffuse_iso])

dataset = MapDataset(
    models=models,
    counts=counts,
    exposure=exposure,
    psf=psf,
    edisp=edisp,
    name="fermi-dataset",
)

# %%time
fit = Fit()
result = fit.run(datasets=[dataset])

print(result)

print(models)

residual = counts - dataset.npred()

residual.sum_over_axes().smooth("0.1 deg").plot(
    cmap="coolwarm", vmin=-3, vmax=3, add_cbar=True
)
plt.show()

######################################################################
# Serialisation
# -------------
#
# To serialise the created dataset, you must proceed through the
# Datasets API:
#

Datasets([dataset]).write(
    filename="fermi_dataset.yaml", filename_models="fermi_models.yaml", overwrite=True
)
datasets_read = Datasets.read(
    filename="fermi_dataset.yaml", filename_models="fermi_models.yaml"
)
print(datasets_read)

######################################################################
# Exercises
# ---------
#
# - Fit the position and spectrum of the source `SNR
#   G0.9+0.1 `__.
# - Make maps and fit the position and spectrum of the `Crab
#   nebula `__.
#

######################################################################
# Summary
# -------
#
# In this tutorial you have seen how to work with Fermi-LAT data with
# Gammapy. You have to use the Fermi ST to prepare the exposure cube and
# PSF, and then you can use Gammapy for any event or map analysis using
# the same methods that are used to analyse IACT data.
#
# This works very well at high energies (here above 10 GeV), where the
# exposure and PSF are almost constant spatially and vary only a little
# with energy. It is not expected to give good results for low-energy
# data, where the Fermi-LAT PSF is very large. If you are interested in
# helping us validate down to what energy Fermi-LAT analysis with Gammapy
# works well (e.g. by re-computing results from 3FHL or other published
# analysis results), or in extending the Gammapy capabilities (e.g. to work
# with energy-dependent multi-resolution maps and PSF), that would be very
# welcome!
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/data/hawc.py0000644000175100001770000002454614721316200020475 0ustar00runnerdocker"""
HAWC with Gammapy
=================

Explore HAWC event lists and instrument response functions (IRFs), then
perform the data reduction steps.

Introduction
------------

`HAWC `__ is an array of water Cherenkov detectors located
in Mexico that detects gamma-rays in the range between hundreds of GeV
and hundreds of TeV.

Gammapy recently added support for HAWC high-level data analysis, after
export to the current `open data level 3 format `__.

The HAWC data is largely private. However, in 2022, a small sub-set of
HAWC Pass4 event lists from the Crab nebula region was publicly
`released `__. This dataset is 1 GB in size, so only
a subset will be used here as an example.

Tutorial overview
-----------------

This notebook is a quick introduction to HAWC data analysis with Gammapy.
It briefly describes the HAWC data and how to access it, and then uses a
subset of the data to produce a `~gammapy.datasets.MapDataset`, to show
how the data reduction is performed.
The HAWC data release contains events where the energy is estimated using
two different algorithms, referred to as "NN" and "GP" (see this
`paper `__ for a detailed description). These two event
classes are not independent, meaning that events are repeated between the
NN and GP datasets. Therefore, these data should never be analysed
jointly, and one of the two estimators needs to be chosen before
proceeding.

Once the data has been reduced to a `~gammapy.datasets.MapDataset`, there
are no differences in the way that HAWC data is handled with respect to
data from any other observatory, such as H.E.S.S. or CTAO.

HAWC data access and reduction
------------------------------

This is how to access data and IRFs from the HAWC Crab event data release.

"""

import astropy.units as u
from astropy.coordinates import SkyCoord
import matplotlib.pyplot as plt
from gammapy.data import DataStore, HDUIndexTable, ObservationTable
from gammapy.datasets import MapDataset
from gammapy.estimators import ExcessMapEstimator
from gammapy.makers import MapDatasetMaker, SafeMaskMaker
from gammapy.maps import Map, MapAxis, WcsGeom

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup

check_tutorials_setup()

######################################################################
# Choose which estimator we will use

energy_estimator = "NN"

######################################################################
# A useful way to organize the relevant files is with index tables. The
# observation index table contains information on each particular observation,
# such as the run ID. The HDU index table has a row per
# relevant file (i.e., events, effective area, psf…) and contains the path
# to said file.
# The HAWC data is divided into different event types, classified using
# the fraction of the array that was triggered by an event, a quantity
# usually referred to as "fHit". These event types are fully independent,
# meaning that an event will have a unique event type identifier, which
# is usually a number indicating which fHit bin the event corresponds to.
# For this reason, a single HAWC observation has several HDU index tables
# associated with it, one per event type. In each table, there will be
# paths to a distinct event list file and IRFs.
# In the HAWC event data release, all the HDU tables are saved into
# the same FITS file, and can be accessed through the choice of the HDU index.

######################################################################
# Load the tables
# ~~~~~~~~~~~~~~~

data_path = "$GAMMAPY_DATA/hawc/crab_events_pass4/"
hdu_filename = f"hdu-index-table-{energy_estimator}-Crab.fits.gz"
obs_filename = f"obs-index-table-{energy_estimator}-Crab.fits.gz"

######################################################################
# There is only one observation table

obs_table = ObservationTable.read(data_path + obs_filename)

######################################################################
# The remainder of this tutorial utilises just one fHit value; however,
# for a regular analysis you should combine all fHit bins. Here,
# we utilise fHit bin number 6. We start by reading the HDU index table
# of this fHit bin

fHit = 6
hdu_table = HDUIndexTable.read(data_path + hdu_filename, hdu=fHit)
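######################################################################
# As a quick look (a minimal sketch), we can list which HDU types and
# classes are indexed for this fHit bin. `~gammapy.data.HDUIndexTable` is
# an `~astropy.table.Table` subclass, and the column names are assumed to
# follow the GADF index-table specification:
#

print(hdu_table["HDU_TYPE", "HDU_CLASS", "FILE_NAME"])

######################################################################
# From the tables, we can create a `~gammapy.data.DataStore`.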
data_store = DataStore(hdu_table=hdu_table, obs_table=obs_table) data_store.info() ###################################################################### # There is only one observation obs = data_store.get_observations()[0] ###################################################################### # Peek events from this observation obs.events.peek() plt.show() ###################################################################### # Peek the energy dispersion: obs.edisp.peek() plt.show() ###################################################################### # Peek the psf: obs.psf.peek() plt.show() ###################################################################### # Peek the background for one transit: plt.figure() obs.bkg.reduce_over_axes().plot(add_cbar=True) plt.show() ###################################################################### # Peek the effective exposure for one transit: plt.figure() obs.aeff.reduce_over_axes().plot(add_cbar=True) plt.show() ###################################################################### # Data reduction into a MapDataset # -------------------------------- # # We will now produce a `~gammapy.datasets.MapDataset` using the data from one of the # fHit bins. In order to use all bins, one just needs to repeat this # process inside a for loop that modifies the variable fHit. ###################################################################### # First we define the geometry and axes, starting with the energy reco axis: energy_axis = MapAxis.from_edges( [1.00, 1.78, 3.16, 5.62, 10.0, 17.8, 31.6, 56.2, 100, 177, 316] * u.TeV, name="energy", interp="log", ) ###################################################################### # Note: this axis is the one used to create the background model map. If # different edges are used, the `~gammapy.makers.MapDatasetMaker` will interpolate between # them, which might lead to unexpected behaviour. ###################################################################### # Define the energy true axis: energy_axis_true = MapAxis.from_energy_bounds( 1e-3, 1e4, nbin=140, unit="TeV", name="energy_true" ) ###################################################################### # Finally, create a geometry around the Crab location: geom = WcsGeom.create( skydir=SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs"), width=6 * u.deg, axes=[energy_axis], binsz=0.05, ) ###################################################################### # Define the makers we will use: maker = MapDatasetMaker(selection=["counts", "background", "exposure", "edisp", "psf"]) safe_mask_maker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) ###################################################################### # Create an empty `~gammapy.datasets.MapDataset`. # The keyword ``reco_psf=True`` is needed because the HAWC PSF is # derived in reconstructed energy. dataset_empty = MapDataset.create( geom, energy_axis_true=energy_axis_true, name=f"fHit {fHit}", reco_psf=True ) dataset = maker.run(dataset_empty, obs) ###################################################################### # The livetime information is used by the `~gammapy.makers.SafeMaskMaker` to retrieve the # effective area from the exposure. The HAWC effective area is computed # for one source transit above 45º zenith, which is around 6h. # Since the effective area condition used here is relative to # the maximum, this value does not actually impact the result. 
dataset.exposure.meta["livetime"] = "6 h" dataset = safe_mask_maker.run(dataset) ###################################################################### # Now we have a dataset that has background and exposure quantities for # one single transit, but our dataset might comprise more. The number # of transits can be derived using the good time intervals (GTI) stored # with the event list. For convenience, the HAWC data release already # included this quantity as a map. transit_map = Map.read(data_path + "irfs/TransitsMap_Crab.fits.gz") transit_number = transit_map.get_by_coord(geom.center_skydir) ###################################################################### # Correct the background and exposure quantities: dataset.background.data *= transit_number dataset.exposure.data *= transit_number ###################################################################### # Check the dataset we produced # ----------------------------- # # We will now check the contents of the dataset. # We can use the ``.peek()`` method to quickly get a glimpse of the contents dataset.peek() plt.show() ###################################################################### # Create significance maps to check that the Crab is visible: excess_estimator = ExcessMapEstimator( correlation_radius="0.2 deg", selection_optional=[], energy_edges=energy_axis.edges ) excess = excess_estimator.run(dataset) (dataset.mask * excess["sqrt_ts"]).plot_grid( add_cbar=True, cmap="coolwarm", vmin=-5, vmax=5 ) plt.show() ###################################################################### # Combining all energies excess_estimator_integrated = ExcessMapEstimator( correlation_radius="0.2 deg", selection_optional=[] ) excess_integrated = excess_estimator_integrated.run(dataset) excess_integrated["sqrt_ts"].plot(add_cbar=True) plt.show() ###################################################################### # Exercises # --------- # # - Repeat the process for a different fHit bin. # - Repeat the process for all the fHit bins provided in the data # release and fit a model to the result. # ###################################################################### # Next steps # ---------- # # Now you know how to access and work with HAWC data. All other # tutorials and documentation concerning 3D analysis techniques and # the `~gammapy.datasets.MapDataset` object can be used from this step on. # ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/data/hess.py0000644000175100001770000001176214721316200020511 0ustar00runnerdocker""" H.E.S.S. with Gammapy ===================== Explore H.E.S.S. event lists and IRFs. Introduction ------------ `H.E.S.S. `__ is an array of gamma-ray telescopes located in Namibia. Gammapy is regularly used and fully supports H.E.S.S. high level data analysis, after export to the current `open data level 3 format `__. The H.E.S.S. data is private, and H.E.S.S. analysis is mostly documented and discussed in the internal Wiki pages and in H.E.S.S.-internal communication channels. However, in 2018, a small sub-set of archival H.E.S.S. data was publicly released, called the `H.E.S.S. DL3 DR1 `__, the data level 3, data release number 1. This dataset is 50 MB in size and is used in many Gammapy analysis tutorials, and can be downloaded via `gammapy download `__. This notebook is a quick introduction to this specific DR1 release. It briefly describes H.E.S.S. 
data and instrument responses and shows a simple exploration of the data
with the creation of a theta-squared plot.

H.E.S.S. members can find details on the DL3 FITS production on this
`Confluence page `__ and access more detailed tutorials
in the BitBucket `hess-open-tools` repository.

DL3 DR1
-------

This is how to access data and IRFs from the H.E.S.S. data level 3, data
release 1.

"""

import astropy.units as u
from astropy.coordinates import SkyCoord

# %matplotlib inline
import matplotlib.pyplot as plt
from IPython.display import display
from gammapy.data import DataStore
from gammapy.makers.utils import make_theta_squared_table
from gammapy.maps import MapAxis

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup
from gammapy.visualization import plot_theta_squared_table

check_tutorials_setup()

######################################################################
# A useful way to organize the relevant files is the index tables. The
# observation index table contains information on each particular run,
# such as the pointing, or the run ID. The HDU index table has a row per
# relevant file (i.e., events, effective area, psf…) and contains the path
# to said file. Together they can be loaded into a `~gammapy.data.DataStore`
# by indicating the directory in which they can be found, in this case
# `$GAMMAPY_DATA/hess-dl3-dr1`:
#

######################################################################
# Create and get info on the data store

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")

data_store.info()

######################################################################
# Preview an excerpt from the observation table

display(data_store.obs_table[:2][["OBS_ID", "DATE-OBS", "RA_PNT", "DEC_PNT", "OBJECT"]])

######################################################################
# Get a single observation

obs = data_store.obs(23523)

######################################################################
# Select and peek events

obs.events.select_offset([0, 2.5] * u.deg).peek()

######################################################################
# Peek the effective area

obs.aeff.peek()

######################################################################
# Peek the energy dispersion

obs.edisp.peek()

######################################################################
# Peek the psf

obs.psf.peek()

######################################################################
# Peek the background rate

obs.bkg.to_2d().plot()
plt.show()

######################################################################
# Theta squared event distribution
# --------------------------------
#
# As a quick-look plot, it can be helpful to plot the quadratic offset
# (theta squared) distribution of the events.
#

position = SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs")
theta2_axis = MapAxis.from_bounds(0, 0.2, nbin=20, interp="lin", unit="deg2")

observations = data_store.get_observations([23523, 23526])
theta2_table = make_theta_squared_table(
    observations=observations,
    position=position,
    theta_squared_axis=theta2_axis,
)

plt.figure(figsize=(10, 5))
plot_theta_squared_table(theta2_table)
plt.show()

######################################################################
# Exercises
# ---------
#
# - Find the `OBS_ID` for the runs of the Crab nebula
# - Compute the expected number of background events in the whole RoI for
#   `OBS_ID=23523` in the 1 TeV to 3 TeV energy band, from the
#   background IRF.
#

######################################################################
# Next steps
# ----------
#
# Now you know how to access and work with H.E.S.S. data. All other
# tutorials and documentation apply to H.E.S.S. and CTAO or any other IACT
# that provides DL3 data and IRFs in the standard format.
#
././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.176642 gammapy-1.3/examples/tutorials/scripts/0000755000175100001770000000000014721316215017752 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/scripts/README.rst0000644000175100001770000000050014721316200021436 0ustar00runnerdocker.. _tutorials_scripts:

Scripts
=======

For interactive use, IPython and Jupyter are great, and most Gammapy
examples use those. However, for long-running, non-interactive tasks like
data reduction or survey maps, you might prefer a Python script. The
following example shows how to run Gammapy within a Python script.
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/scripts/survey_map.py0000644000175100001770000000142314721316200022510 0ustar00runnerdocker"""
Survey Map Script
=================

Make a survey counts map using a script.

We create an all-sky map in AIT projection for the
`H.E.S.S. DL3 DR1 `__ dataset.

"""
import logging
from gammapy.data import DataStore
from gammapy.maps import Map

log = logging.getLogger(__name__)


def main():
    data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
    obs_id = data_store.obs_table["OBS_ID"]
    observations = data_store.get_observations(obs_id)

    m = Map.create()
    for obs in observations:
        log.info(f"Processing obs_id: {obs.obs_id}")
        m.fill_events(obs.events)

    m.write("survey_map.fits.gz", overwrite=True)


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    main()
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/scripts/survey_map.rst0000644000175100001770000000037614721316200022676 0ustar00runnerdocker.. include:: ../references.txt

.. _survey_map:

Survey map
==========

Here's an example of a Gammapy analysis as a Python script:

.. literalinclude:: ../../examples/survey_map.py

To execute it run:

.. code-block:: bash

    python survey_map.py
././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.176642 gammapy-1.3/examples/tutorials/starting/0000755000175100001770000000000014721316215020116 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/starting/README.rst0000644000175100001770000000120614721316200021576 0ustar00runnerdockerIntroduction
============

The following three tutorials show different ways to use Gammapy to
perform a complete data analysis, from data selection to data reduction
and finally modeling and fitting.

The first tutorial is an overview of how to perform a standard analysis
workflow using the high level interface in a configuration-driven
approach, whilst the second deals with the same use-case using the low
level API and shows what is happening *under-the-hood*. The third
tutorial shows a glimpse of how to handle different basic data structures
like event lists, source catalogs, sky maps, spectral models and flux
points tables.
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/starting/analysis_1.py0000644000175100001770000003254614721316200022537 0ustar00runnerdocker"""
High level interface
====================

Introduction to 3D analysis using the Gammapy high level interface.

Prerequisites
-------------

- Understanding the gammapy data workflow, in particular what are DL3
  events and instrument response functions (IRF).

Context
-------

This notebook is an introduction to gammapy analysis using the high level
interface. Gammapy analysis consists of two main steps.

The first one is data reduction: user-selected observations are reduced
to a geometry defined by the user. It can be 1D (spectrum from a given
extraction region) or 3D (with a sky projection and an energy axis). The
resulting reduced data and instrument response functions (IRF) are called
datasets in Gammapy.

The second step consists of setting a physical model on the datasets and
fitting it to obtain relevant physical information.

**Objective: Create a 3D dataset of the Crab using the H.E.S.S. DL3 data
release 1 and perform a simple model fitting of the Crab nebula.**

Proposed approach
-----------------

This notebook uses the high level `~gammapy.analysis.Analysis` class to
orchestrate data reduction. In its current state, `~gammapy.analysis.Analysis`
supports the standard analysis cases of joint or stacked 3D and 1D
analyses. It is instantiated with an `~gammapy.analysis.AnalysisConfig`
object that gives access to analysis parameters either directly or via a
YAML config file. To see what is happening under-the-hood and to get an
idea of the internal API, a second notebook performs the same analysis
without using the `~gammapy.analysis.Analysis` class.

In summary, we have to:

- Create an `~gammapy.analysis.AnalysisConfig` object and edit it to
  define the analysis configuration:

  - Define what observations to use
  - Define the geometry of the dataset (data and IRFs)
  - Define the model we want to fit on the dataset.

- Instantiate a `~gammapy.analysis.Analysis` from this configuration and
  run the different analysis steps

  - Observation selection
  - Data reduction
  - Model fitting
  - Estimating flux points

Finally, we will compare the results against a reference model.

"""

######################################################################
# Setup
# -----
#

from pathlib import Path
from astropy import units as u

# %matplotlib inline
import matplotlib.pyplot as plt
from gammapy.analysis import Analysis, AnalysisConfig

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup

check_tutorials_setup()


######################################################################
# Analysis configuration
# ----------------------
#
# For configuration of the analysis we use the
# `YAML `__ data format. YAML is a
# machine-readable serialisation format that is also friendly for humans
# to read. In this tutorial we will write the configuration file just
# using Python strings, but of course the file can be created and modified
# with any text editor of your choice.
#
# Here is what the configuration for our analysis looks like:
#

config = AnalysisConfig()
# the AnalysisConfig gives access to the various parameters used from logging to reduced dataset geometries
print(config)


######################################################################
# Setting the data to use
# ~~~~~~~~~~~~~~~~~~~~~~~
#

######################################################################
# We want to use Crab runs from the H.E.S.S. DL3-DR1. We define here the
# datastore and a cone search of observations pointing within 5 degrees of
# the Crab nebula. Parameters can be set directly or as a python dict.
#
# PS: do not forget to set up your environment variable `$GAMMAPY_DATA` to
# your local directory containing the H.E.S.S. DL3-DR1 as described in
# :ref:`quickstart-setup`.
#

# We define the datastore containing the data
config.observations.datastore = "$GAMMAPY_DATA/hess-dl3-dr1"

# We define the cone search parameters
config.observations.obs_cone.frame = "icrs"
config.observations.obs_cone.lon = "83.633 deg"
config.observations.obs_cone.lat = "22.014 deg"
config.observations.obs_cone.radius = "5 deg"

# Equivalently we could have set parameters with a python dict
# config.observations.obs_cone = {"frame": "icrs", "lon": "83.633 deg", "lat": "22.014 deg", "radius": "5 deg"}


######################################################################
# Setting the reduced datasets geometry
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#

# We want to perform a 3D analysis
config.datasets.type = "3d"
# We want to stack the data into a single reduced dataset
config.datasets.stack = True

# We fix the WCS geometry of the datasets
config.datasets.geom.wcs.skydir = {
    "lon": "83.633 deg",
    "lat": "22.014 deg",
    "frame": "icrs",
}
config.datasets.geom.wcs.width = {"width": "2 deg", "height": "2 deg"}
config.datasets.geom.wcs.binsize = "0.02 deg"

# We now fix the energy axis for the counts map
config.datasets.geom.axes.energy.min = "1 TeV"
config.datasets.geom.axes.energy.max = "10 TeV"
config.datasets.geom.axes.energy.nbins = 10

# We now fix the energy axis for the IRF maps (exposure, etc)
config.datasets.geom.axes.energy_true.min = "0.5 TeV"
config.datasets.geom.axes.energy_true.max = "20 TeV"
config.datasets.geom.axes.energy_true.nbins = 20


######################################################################
# Setting the background normalization maker
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#

config.datasets.background.method = "fov_background"
config.datasets.background.parameters = {"method": "scale"}


######################################################################
# Setting the exclusion mask
# ~~~~~~~~~~~~~~~~~~~~~~~~~~
#

######################################################################
# In order to properly adjust the background normalisation on regions
# without gamma-ray signal, one needs to define an exclusion mask for the
# background normalisation. For this tutorial, we use the following one
# ``$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz``
#

config.datasets.background.exclusion = (
    "$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz"
)
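######################################################################
# To see what this mask looks like, we can load and plot it. This is a
# small optional sketch; it needs one extra import (`~gammapy.maps.Map`)
# that the rest of this tutorial does not use:
#

from gammapy.maps import Map

exclusion_mask = Map.read("$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz")
exclusion_mask.plot()
plt.show()


######################################################################
# Setting modeling and fitting parameters
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# ``Analysis`` can perform a few modeling and fitting tasks besides data
# reduction. Parameters then have to be passed to the configuration
# object.
#
# Here we define the energy range on which to perform the fit.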
# We also set
# the energy edges used for flux point computation as well as the
# correlation radius to compute excess and significance maps.
#

config.fit.fit_range.min = 1 * u.TeV
config.fit.fit_range.max = 10 * u.TeV
config.flux_points.energy = {"min": "1 TeV", "max": "10 TeV", "nbins": 4}
config.excess_map.correlation_radius = 0.1 * u.deg

######################################################################
# We’re all set. But before we go on let’s see how to save or import
# ``AnalysisConfig`` objects through YAML files.
#

######################################################################
# Using YAML configuration files
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# One can export/import the `~gammapy.analysis.AnalysisConfig` to/from a YAML file.
#

config.write("config.yaml", overwrite=True)

config = AnalysisConfig.read("config.yaml")
print(config)


######################################################################
# Running the analysis
# --------------------
#
# We first create an `~gammapy.analysis.Analysis` object from our
# configuration.
#

analysis = Analysis(config)


######################################################################
# Observation selection
# ~~~~~~~~~~~~~~~~~~~~~
#
# We can directly select and load the observations from disk using
# `~gammapy.analysis.Analysis.get_observations()`:
#

analysis.get_observations()

######################################################################
# The observations are now available on the `~gammapy.analysis.Analysis` object. The
# selection corresponds to the following ids:
#

print(analysis.observations.ids)

######################################################################
# To see how to explore observations, please refer to the following
# notebook: :doc:`CTA with Gammapy ` or :doc:`H.E.S.S. with
# Gammapy `
#

######################################################################
# Data reduction
# --------------
#
# Now we proceed to the data reduction. In the config file we have chosen
# a WCS map geometry, energy axis and decided to stack the maps. We can
# run the reduction using `~gammapy.analysis.Analysis.get_datasets()`:
#

# %%time
analysis.get_datasets()

######################################################################
# As we have chosen to stack the data, there is finally only one dataset
# contained, which we can print:
#

print(analysis.datasets["stacked"])

######################################################################
# As you can see the dataset comes with a predefined background model out
# of the data reduction, but no source model has been set yet.
#
# The counts, exposure and background model maps are directly available on
# the dataset and can be printed and plotted:
#

counts = analysis.datasets["stacked"].counts
counts.smooth("0.05 deg").plot_interactive()

######################################################################
# We can also compute the map of the sqrt_ts (significance) of the excess
# counts above the background. The correlation radius to sum counts is
# defined in the config file.
#

analysis.get_excess_map()
analysis.excess_map["sqrt_ts"].plot(add_cbar=True)
plt.show()
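######################################################################
# Per-dataset statistics such as counts, excess and significance can also
# be summarised in a table. This is a minimal sketch using
# `~gammapy.datasets.Datasets.info_table`; the exact set of columns may
# differ between Gammapy versions:
#

print(analysis.datasets.info_table())


######################################################################
# Save dataset to disk
# --------------------
#
# It is common to run the preparation step independently of the likelihood
# fit, because often the preparation of maps, PSF and energy dispersion is
# slow if you have a lot of data.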
# We first create a folder:
#

path = Path("analysis_1")
path.mkdir(exist_ok=True)

######################################################################
# And then write the maps and IRFs to disk by calling the dedicated
# `~gammapy.datasets.Datasets.write` method:
#

filename = path / "crab-stacked-dataset.fits.gz"
analysis.datasets[0].write(filename, overwrite=True)

######################################################################
# Model fitting
# -------------
#
# Now we define a model to be fitted to the dataset. Here we use its YAML
# definition to load it:
#

model_config = """
components:
- name: crab
  type: SkyModel
  spatial:
    type: PointSpatialModel
    frame: icrs
    parameters:
    - name: lon_0
      value: 83.63
      unit: deg
    - name: lat_0
      value: 22.014
      unit: deg
  spectral:
    type: PowerLawSpectralModel
    parameters:
    - name: amplitude
      value: 1.0e-12
      unit: cm-2 s-1 TeV-1
    - name: index
      value: 2.0
      unit: ''
    - name: reference
      value: 1.0
      unit: TeV
      frozen: true
"""

######################################################################
# Now we set the model on the analysis object:
#

analysis.set_models(model_config)

######################################################################
# Finally we run the fit:
#

# %%time
analysis.run_fit()

print(analysis.fit_result)

######################################################################
# This is how we can write the model back to file again:
#

filename = path / "model-best-fit.yaml"
analysis.models.write(filename, overwrite=True)

with filename.open("r") as f:
    print(f.read())


######################################################################
# Flux points
# ~~~~~~~~~~~
#

analysis.config.flux_points.source = "crab"
# Example showing how to change the FluxPointsEstimator parameters:
analysis.config.flux_points.energy.nbins = 5
config_dict = {
    "selection_optional": "all",
    "n_sigma": 2,  # Number of sigma to use for asymmetric error computation
    "n_sigma_ul": 3,  # Number of sigma to use for upper limit computation
}
analysis.config.flux_points.parameters = config_dict

analysis.get_flux_points()

# Example showing how to change, just before plotting, the threshold on the
# signal significance (points vs upper limits), even if this has no effect
# with this data set.
fp = analysis.flux_points.data
fp.sqrt_ts_threshold_ul = 5

ax_sed, ax_residuals = analysis.flux_points.plot_fit()
plt.show()

######################################################################
# The flux points can be exported to a fits table following the format
# defined `here `_
#

filename = path / "flux-points.fits"
analysis.flux_points.write(filename, overwrite=True)
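######################################################################
# The underlying flux-point values can be inspected as a table as well
# (a quick sketch; see `~gammapy.estimators.FluxPoints.to_table`):
#

print(fp.to_table(sed_type="dnde", formatted=True))


######################################################################
# To check that the fit is correct, we compute the map of the sqrt_ts of
# the excess counts above the current model.
#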
analysis.get_excess_map()
analysis.excess_map["sqrt_ts"].plot(add_cbar=True, cmap="RdBu", vmin=-5, vmax=5)
plt.show()

######################################################################
# What’s next
# -----------
#
# You can look at the same analysis without the high level interface in
# :doc:`/tutorials/starting/analysis_2`
#
# You can see how to perform a 1D spectral analysis of the same data in
# :doc:`/tutorials/analysis-1d/spectral_analysis`
#
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/starting/analysis_2.py0000644000175100001770000003137314721316200022535 0ustar00runnerdocker"""
Low level API
=============

Introduction to Gammapy analysis using the low level API.

Prerequisites
-------------

- Understanding the gammapy data workflow, in particular what are DL3
  events and instrument response functions (IRF).
- Understanding of the data reduction and modeling fitting process as
  shown in the analysis with the high level interface tutorial
  :doc:`/tutorials/starting/analysis_1`

Context
-------

This notebook is an introduction to gammapy analysis, this time using the
lower level classes and functions of the library. This allows us to
understand what happens during the two main gammapy analysis steps, data
reduction and modeling/fitting.

**Objective: Create a 3D dataset of the Crab using the H.E.S.S. DL3 data
release 1 and perform a simple model fitting of the Crab nebula using
the lower level gammapy API.**

Proposed approach
-----------------

Here, we have to interact with the data archive (with the
`~gammapy.data.DataStore`) to retrieve a list of selected observations
(`~gammapy.data.Observations`). Then, we define the geometry of the
`~gammapy.datasets.MapDataset` object we want to produce and the maker
object that reduces an observation to a dataset. We can then proceed with
data reduction with a loop over all selected observations to produce
datasets in the relevant geometry and stack them together (i.e. sum them
all).

In practice, we have to:

- Create a `~gammapy.data.DataStore` pointing to the relevant data
- Apply an observation selection to produce a list of observations,
  a `~gammapy.data.Observations` object.
- Define a geometry of the Map we want to produce, with a sky projection
  and an energy range.

  - Create a `~gammapy.maps.MapAxis` for the energy
  - Create a `~gammapy.maps.WcsGeom` for the geometry

- Create the necessary makers:

  - the map dataset maker `~gammapy.makers.MapDatasetMaker`
  - the background normalization maker, here a
    `~gammapy.makers.FoVBackgroundMaker`
  - and usually the safe range maker: `~gammapy.makers.SafeMaskMaker`

- Perform the data reduction loop. And for every observation:

  - Apply the makers sequentially to produce the current
    `~gammapy.datasets.MapDataset`
  - Stack it on the target one.

- Define the `~gammapy.modeling.models.SkyModel` to apply to the dataset.
- Create a `~gammapy.modeling.Fit` object and run it to fit the model
  parameters
- Apply a `~gammapy.estimators.FluxPointsEstimator` to compute flux points
  for the spectral part of the fit.

Setup
-----

First, we set up the analysis by performing required imports.
""" from pathlib import Path from astropy import units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion # %matplotlib inline import matplotlib.pyplot as plt from gammapy.data import DataStore from gammapy.datasets import MapDataset from gammapy.estimators import FluxPointsEstimator from gammapy.makers import FoVBackgroundMaker, MapDatasetMaker, SafeMaskMaker from gammapy.maps import MapAxis, WcsGeom from gammapy.modeling import Fit from gammapy.modeling.models import ( FoVBackgroundModel, PointSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.check import check_tutorials_setup from gammapy.visualization import plot_npred_signal ###################################################################### # Check setup # ----------- check_tutorials_setup() ###################################################################### # Defining the datastore and selecting observations # ------------------------------------------------- # # We first use the `~gammapy.data.DataStore` object to access the # observations we want to analyse. Here the H.E.S.S. DL3 DR1. # data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") ###################################################################### # We can now define an observation filter to select only the relevant # observations. Here we use a cone search which we define with a python # dict. # # We then filter the `ObservationTable` with # `~gammapy.data.ObservationTable.select_observations`. # selection = dict( type="sky_circle", frame="icrs", lon="83.633 deg", lat="22.014 deg", radius="5 deg", ) selected_obs_table = data_store.obs_table.select_observations(selection) ###################################################################### # We can now retrieve the relevant observations by passing their # ``obs_id`` to the `~gammapy.data.DataStore.get_observations` # method. # observations = data_store.get_observations(selected_obs_table["OBS_ID"]) ###################################################################### # Preparing reduced datasets geometry # ----------------------------------- # # Now we define a reference geometry for our analysis, We choose a WCS # based geometry with a binsize of 0.02 deg and also define an energy # axis: # energy_axis = MapAxis.from_energy_bounds(1.0, 10.0, 4, unit="TeV") geom = WcsGeom.create( skydir=(83.633, 22.014), binsz=0.02, width=(2, 2), frame="icrs", proj="CAR", axes=[energy_axis], ) # Reduced IRFs are defined in true energy (i.e. not measured energy). energy_axis_true = MapAxis.from_energy_bounds( 0.5, 20, 10, unit="TeV", name="energy_true" ) ###################################################################### # Now we can define the target dataset with this geometry. # stacked = MapDataset.create( geom=geom, energy_axis_true=energy_axis_true, name="crab-stacked" ) ###################################################################### # Data reduction # -------------- # # Create the maker classes to be used # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ # # The `~gammapy.makers.MapDatasetMaker` object is initialized as well as # the `~gammapy.makers.SafeMaskMaker` that carries here a maximum offset # selection. The `~gammapy.makers.FoVBackgroundMaker` utilised here has the # default ``spectral_model`` but it is possible to set your own. For further # details see the :doc:`FoV background `. 
offset_max = 2.5 * u.deg
maker = MapDatasetMaker()
maker_safe_mask = SafeMaskMaker(
    methods=["offset-max", "aeff-max"], offset_max=offset_max
)

circle = CircleSkyRegion(center=SkyCoord("83.63 deg", "22.14 deg"), radius=0.2 * u.deg)
exclusion_mask = ~geom.region_mask(regions=[circle])
maker_fov = FoVBackgroundMaker(method="fit", exclusion_mask=exclusion_mask)

######################################################################
# Perform the data reduction loop
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#

# %%time

for obs in observations:
    # First a cutout of the target map is produced
    cutout = stacked.cutout(
        obs.get_pointing_icrs(obs.tmid), width=2 * offset_max, name=f"obs-{obs.obs_id}"
    )
    # A MapDataset is filled in this cutout geometry
    dataset = maker.run(cutout, obs)
    # The data quality cut is applied
    dataset = maker_safe_mask.run(dataset, obs)
    # fit background model
    dataset = maker_fov.run(dataset)
    print(
        f"Background norm obs {obs.obs_id}: {dataset.background_model.spectral_model.norm.value:.2f}"
    )
    # The resulting dataset cutout is stacked onto the final one
    stacked.stack(dataset)

print(stacked)


######################################################################
# Inspect the reduced dataset
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~
#

stacked.counts.sum_over_axes().smooth(0.05 * u.deg).plot(stretch="sqrt", add_cbar=True)
plt.show()


######################################################################
# Save dataset to disk
# --------------------
#
# It is common to run the preparation step independently of the likelihood
# fit, because often the preparation of maps, PSF and energy dispersion is
# slow if you have a lot of data. We first create a folder:
#

path = Path("analysis_2")
path.mkdir(exist_ok=True)

######################################################################
# And then write the maps and IRFs to disk by calling the dedicated
# `~gammapy.datasets.MapDataset.write` method:
#

filename = path / "crab-stacked-dataset.fits.gz"
stacked.write(filename, overwrite=True)


######################################################################
# Define the model
# ----------------
#
# We first define the model, a `~gammapy.modeling.models.SkyModel`, as the
# combination of a point source `~gammapy.modeling.models.SpatialModel`
# with a powerlaw `~gammapy.modeling.models.SpectralModel`:
#

target_position = SkyCoord(ra=83.63308, dec=22.01450, unit="deg")
spatial_model = PointSpatialModel(
    lon_0=target_position.ra, lat_0=target_position.dec, frame="icrs"
)

spectral_model = PowerLawSpectralModel(
    index=2.702,
    amplitude=4.712e-11 * u.Unit("1 / (cm2 s TeV)"),
    reference=1 * u.TeV,
)

sky_model = SkyModel(
    spatial_model=spatial_model, spectral_model=spectral_model, name="crab"
)

bkg_model = FoVBackgroundModel(dataset_name="crab-stacked")

######################################################################
# Now we assign this model to our reduced dataset:
#

stacked.models = [sky_model, bkg_model]


######################################################################
# Fit the model
# -------------
#
# The `~gammapy.modeling.Fit` class orchestrates the fit, connecting
# the ``stats`` method of the dataset to the minimizer. By default, it
# uses ``iminuit``.
#
# The ``run`` method takes a list of datasets as its argument.
#

# %%time
fit = Fit(optimize_opts={"print_level": 1})
result = fit.run([stacked])
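######################################################################
# The `~gammapy.modeling.FitResult` contains information about the
# optimization and parameter error calculation. As a quick convergence
# check before printing the full summary below, we can look at two of its
# attributes directly (a minimal sketch; ``success`` flags convergence and
# ``total_stat`` is the fit statistic at the best-fit parameters):
#

print(result.success, result.total_stat)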
print(result)

######################################################################
# The fitted parameters are visible from the
# `~gammapy.modeling.models.Models` object.
#

print(stacked.models.to_parameters_table())

######################################################################
# Here we can plot the number of predicted counts for each model and
# for the background in our dataset. In order to do this, we can use
# the `~gammapy.visualization.plot_npred_signal` function.
#

plot_npred_signal(stacked)
plt.show()


######################################################################
# Inspecting residuals
# ~~~~~~~~~~~~~~~~~~~~
#
# For any fit it is useful to inspect the residual images. We have a few
# options on the dataset object to handle this. First we can use
# `~gammapy.datasets.MapDataset.plot_residuals_spatial` to plot a residual
# image, summed over all energies:
#

stacked.plot_residuals_spatial(method="diff/sqrt(model)", vmin=-0.5, vmax=0.5)
plt.show()

######################################################################
# In addition, we can also specify a region in the map to show the
# spectral residuals:
#

region = CircleSkyRegion(center=SkyCoord("83.63 deg", "22.14 deg"), radius=0.5 * u.deg)

stacked.plot_residuals(
    kwargs_spatial=dict(method="diff/sqrt(model)", vmin=-0.5, vmax=0.5),
    kwargs_spectral=dict(region=region),
)
plt.show()

######################################################################
# We can also call ``.residuals()`` directly to get a map that we
# can plot interactively:
#

residuals = stacked.residuals(method="diff")
residuals.smooth("0.08 deg").plot_interactive(
    cmap="coolwarm", vmin=-0.2, vmax=0.2, stretch="linear", add_cbar=True
)
plt.show()


######################################################################
# Plot the fitted spectrum
# ------------------------
#

######################################################################
# Making a butterfly plot
# ~~~~~~~~~~~~~~~~~~~~~~~
#
# The `~gammapy.modeling.models.SpectralModel` component can be used to
# produce a so-called butterfly plot showing the envelope of the model,
# taking into account parameter uncertainties:
#

spec = sky_model.spectral_model

######################################################################
# Now we can actually do the plot using the ``plot_error`` method:
#

energy_bounds = [1, 10] * u.TeV

fig, ax = plt.subplots(figsize=(8, 6))
spec.plot(ax=ax, energy_bounds=energy_bounds, sed_type="e2dnde")
spec.plot_error(ax=ax, energy_bounds=energy_bounds, sed_type="e2dnde")
plt.show()
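######################################################################
# From the fitted spectral model we can also derive integral quantities
# numerically (a minimal sketch; the energy range matches the plot above):
#

print(spec.integral(energy_min=1 * u.TeV, energy_max=10 * u.TeV))


######################################################################
# Computing flux points
# ~~~~~~~~~~~~~~~~~~~~~
#
# We can now compute some flux points using the
# `~gammapy.estimators.FluxPointsEstimator`.
#
# Besides the list of datasets to use, we must provide it the energy
# intervals on which to compute flux points as well as the model component
# name.
#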
energy_edges = [1, 2, 4, 10] * u.TeV
fpe = FluxPointsEstimator(energy_edges=energy_edges, source="crab")

# %%time
flux_points = fpe.run(datasets=[stacked])

fig, ax = plt.subplots(figsize=(8, 6))
spec.plot(ax=ax, energy_bounds=energy_bounds, sed_type="e2dnde")
spec.plot_error(ax=ax, energy_bounds=energy_bounds, sed_type="e2dnde")
flux_points.plot(ax=ax, sed_type="e2dnde")
plt.show()
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/examples/tutorials/starting/overview.py0000644000175100001770000003514014721316200022333 0ustar00runnerdocker"""
Data structures
===============

Introduction to basic data structures handling.

Introduction
------------

This is a getting started tutorial for Gammapy.

In this tutorial we will use the `Third Fermi-LAT Catalog of High-Energy
Sources (3FHL) catalog `__, corresponding event list and
images to learn how to work with some of the central Gammapy data
structures.

We will cover the following topics:

- **Sky maps**

  - We will learn how to handle image based data with gammapy using a
    Fermi-LAT 3FHL example image. We will work with the following classes:

    - `~gammapy.maps.WcsNDMap`
    - `~astropy.coordinates.SkyCoord`
    - `~numpy.ndarray`

- **Event lists**

  - We will learn how to handle event lists with Gammapy. Important for
    this are the following classes:

    - `~gammapy.data.EventList`
    - `~astropy.table.Table`

- **Source catalogs**

  - We will show how to load source catalogs with Gammapy and explore the
    data using the following classes:

    - `~gammapy.catalog.SourceCatalog`, specifically
      `~gammapy.catalog.SourceCatalog3FHL`
    - `~astropy.table.Table`

- **Spectral models and flux points**

  - We will pick an example source and show how to plot its spectral
    model and flux points. For this we will use the following classes:

    - `~gammapy.modeling.models.SpectralModel`, specifically the
      `~gammapy.modeling.models.PowerLaw2SpectralModel`
    - `~gammapy.estimators.FluxPoints`
    - `~astropy.table.Table`

"""

######################################################################
# Setup
# -----
#
# **Important**: to run this tutorial the environment variable
# ``GAMMAPY_DATA`` must be defined and point to the directory on your
# machine where the datasets needed are placed. To check whether your
# setup is correct you can execute the following cell:
#

import astropy.units as u
from astropy.coordinates import SkyCoord
import matplotlib.pyplot as plt

######################################################################
# Check setup
# -----------
from gammapy.utils.check import check_tutorials_setup

# %matplotlib inline


check_tutorials_setup()


######################################################################
# Maps
# ----
#
# The `~gammapy.maps` package contains classes to work with sky images
# and cubes.
#
# In this section, we will use a simple 2D sky image and will learn how
# to:
#
# - Read sky images from FITS files
# - Smooth images
# - Plot images
# - Cutout parts from images
#

from gammapy.maps import Map

gc_3fhl = Map.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts.fits.gz")

######################################################################
# The image is a `~gammapy.maps.WcsNDMap` object:
#

print(gc_3fhl)

######################################################################
# The shape of the image is 400 x 200 pixels and it is defined using a
# cartesian projection in galactic coordinates.
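######################################################################
# As a quick check (a small sketch), the same information is available
# from the underlying array. Note that numpy arrays are indexed as
# (row, column), so the shape is reported as (ny, nx):
#

print(gc_3fhl.data.shape)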
######################################################################
# The ``geom`` attribute is a `~gammapy.maps.WcsGeom` object:
#

print(gc_3fhl.geom)

######################################################################
# Let’s take a closer look at the ``.data`` attribute:
#

print(gc_3fhl.data)

######################################################################
# That looks familiar! It is just an *ordinary* 2-dimensional numpy array,
# which means you can apply any known numpy method to it:
#

print(f"Total number of counts in the image: {gc_3fhl.data.sum():.0f}")

######################################################################
# To show the image on the screen we can use the ``plot`` method. It
# basically calls
# `plt.imshow `__,
# passing the ``gc_3fhl.data`` attribute but in addition handles axes with
# world coordinates using
# `astropy.visualization.wcsaxes `__
# and defines some defaults for nicer plots (e.g. the colormap ‘afmhot’):
#

gc_3fhl.plot(stretch="sqrt")
plt.show()

######################################################################
# To make the structures in the image more visible we will smooth the data
# using a Gaussian kernel.
#

gc_3fhl_smoothed = gc_3fhl.smooth(kernel="gauss", width=0.2 * u.deg)

gc_3fhl_smoothed.plot(stretch="sqrt")
plt.show()

######################################################################
# The smoothed plot already looks much nicer, but still the image is
# rather large. As we are mostly interested in the inner part of the
# image, we will cut out a quadratic region of the size 9 deg x 9 deg
# around the Galactic center. Therefore we use `~gammapy.maps.Map.cutout`
# to make a cutout map:
#

# define center and size of the cutout region
center = SkyCoord(0, 0, unit="deg", frame="galactic")
gc_3fhl_cutout = gc_3fhl_smoothed.cutout(center, 9 * u.deg)
gc_3fhl_cutout.plot(stretch="sqrt")
plt.show()

######################################################################
# For a more detailed introduction to `~gammapy.maps`, take a look at the
# :doc:`/tutorials/api/maps` notebook.
#
# Exercises
# ~~~~~~~~~
#
# - Add a marker and circle at the position of ``Sgr A*`` (you can find
#   examples in
#   `astropy.visualization.wcsaxes `__).
#


######################################################################
# Event lists
# -----------
#
# Almost any high level gamma-ray data analysis starts with the raw
# measured counts data, which is stored in event lists. In Gammapy event
# lists are represented by the `~gammapy.data.EventList` class.
#
# In this section we will learn how to:
#
# - Read event lists from FITS files
# - Access and work with the ``EventList`` attributes such as ``.table``
#   and ``.energy``
# - Filter events lists using convenience methods
#
# Let’s start with the import from the `~gammapy.data` submodule:
#

from gammapy.data import EventList

######################################################################
# Very similar to the sky map class, an event list can be created by
# passing a filename to the `~gammapy.data.EventList.read()` method:
#

events_3fhl = EventList.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-events.fits.gz")

######################################################################
# This time the actual data is stored as an `~astropy.table.Table`
# object. It can be accessed with the ``.table`` attribute:
#

print(events_3fhl.table)
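######################################################################
# Individual columns can be accessed like in any `~astropy.table.Table`
# (a small sketch; the column names, e.g. ``ENERGY``, follow the FITS
# event-list conventions):
#

print(events_3fhl.table["ENERGY"][:5])

######################################################################
# You can use *len* on ``events_3fhl.table`` to find the total number of
# events.
#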
print(len(events_3fhl.table))

######################################################################
# And we can access any other attribute of the ``Table`` object as well:
#

print(events_3fhl.table.colnames)

######################################################################
# For convenience we can access the most important event parameters as
# properties on the ``EventList`` objects. The attributes will return
# corresponding Astropy objects to represent the data, such as
# `~astropy.units.Quantity`,
# `~astropy.coordinates.SkyCoord`
# or
# `~astropy.time.Time`
# objects:
#

print(events_3fhl.energy.to("GeV"))

# %%
print(events_3fhl.galactic)
# events_3fhl.radec

# %%
print(events_3fhl.time)

######################################################################
# In addition ``EventList`` provides convenience methods to filter the
# event lists. One possible use case is to find the highest energy event
# within a radius of 0.5 deg around the Galactic center position:
#

# select all events within a radius of 0.5 deg around center
from gammapy.utils.regions import SphericalCircleSkyRegion

region = SphericalCircleSkyRegion(center, radius=0.5 * u.deg)
events_gc_3fhl = events_3fhl.select_region(region)

# sort events by energy
events_gc_3fhl.table.sort("ENERGY")

# and show highest energy photon
print("highest energy photon: ", events_gc_3fhl.energy[-1].to("GeV"))

######################################################################
# Exercises
# ~~~~~~~~~
#
# - Make a counts energy spectrum for the galactic center region, within
#   a radius of 10 deg.
#


######################################################################
# Source catalogs
# ---------------
#
# Gammapy provides a convenient interface to access and work with catalog
# based data.
#
# In this section we will learn how to:
#
# - Load built-in catalogs from `~gammapy.catalog`
# - Sort and index the underlying Astropy tables
# - Access data from individual sources
#
# Let’s start with importing the 3FHL catalog object from the
# `~gammapy.catalog` submodule:
#

from gammapy.catalog import SourceCatalog3FHL

######################################################################
# First we initialize the Fermi-LAT 3FHL catalog and directly take a look
# at the ``.table`` attribute:
#

fermi_3fhl = SourceCatalog3FHL()
print(fermi_3fhl.table)
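######################################################################
# For instance, the number of sources in the catalog is just the length
# of the table (a quick sketch):
#

print(len(fermi_3fhl.table))

######################################################################
# This looks very familiar again. The data is just stored as an
# `~astropy.table.Table` object. We have all the methods and attributes
# of the ``Table`` object available. E.g.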
# we can sort the underlying table by ``Signif_Avg`` to
# find the top 5 most significant sources:
#

# sort table by significance
fermi_3fhl.table.sort("Signif_Avg")

# invert the order to find the highest values and take the top 5
top_five_TS_3fhl = fermi_3fhl.table[::-1][:5]

# print the top five significant sources with association and source class
print(top_five_TS_3fhl[["Source_Name", "ASSOC1", "ASSOC2", "CLASS", "Signif_Avg"]])

######################################################################
# If you are interested in the data of an individual source you can access
# the information from the catalog using the name of the source or any
# alias source name that is defined in the catalog:
#

mkn_421_3fhl = fermi_3fhl["3FHL J1104.4+3812"]

# or use any alias source name that is defined in the catalog
mkn_421_3fhl = fermi_3fhl["Mkn 421"]
print(mkn_421_3fhl.data["Signif_Avg"])

######################################################################
# Exercises
# ~~~~~~~~~
#
# - Try to load the Fermi-LAT 2FHL catalog and check the total number of
#   sources it contains.
# - Select all the sources from the 2FHL catalog which are contained in
#   the Galactic Center region. The methods
#   `~gammapy.maps.WcsGeom.contains()` and
#   `~gammapy.catalog.SourceCatalog.positions` might be helpful for
#   this. Add markers for all these sources and try to add labels with
#   the source names.
# - Try to find the source class of the object at position ra=68.6803,
#   dec=9.3331
#


######################################################################
# Spectral models and flux points
# -------------------------------
#
# In the previous section we learned how to access basic data from
# individual sources in the catalog. Now we will go one step further and
# explore the full spectral information of sources. We will learn how to:
#
# - Plot spectral models
# - Compute integral and energy fluxes
# - Read and plot flux points
#
# As a first example we will start with the Crab Nebula:
#

crab_3fhl = fermi_3fhl["Crab Nebula"]
crab_3fhl_spec = crab_3fhl.spectral_model()
print(crab_3fhl_spec)

######################################################################
# The ``crab_3fhl_spec`` is an instance of the
# `~gammapy.modeling.models.PowerLaw2SpectralModel` model, with the
# parameter values and errors taken from the 3FHL catalog.
#
# Let’s plot the spectral model in the energy range between 10 GeV and
# 2000 GeV:
#

ax_crab_3fhl = crab_3fhl_spec.plot(energy_bounds=[10, 2000] * u.GeV, energy_power=0)
plt.show()

######################################################################
# We assign the returned axes object to a variable called ``ax_crab_3fhl``,
# because we will re-use it later to plot the flux points on top.
# # To compute the differential flux at 100 GeV we can simply call the model # like a normal Python function and convert to the desired units: # print(crab_3fhl_spec(100 * u.GeV).to("cm-2 s-1 GeV-1")) ###################################################################### # Next we can compute the integral flux of the Crab between 10 GeV and # 2000 GeV: # print( crab_3fhl_spec.integral(energy_min=10 * u.GeV, energy_max=2000 * u.GeV).to( "cm-2 s-1" ) ) ###################################################################### # We can easily convince ourselves that it corresponds to the value given # in the Fermi-LAT 3FHL catalog: # print(crab_3fhl.data["Flux"]) ###################################################################### # In addition we can compute the energy flux between 10 GeV and 2000 GeV: # print( crab_3fhl_spec.energy_flux(energy_min=10 * u.GeV, energy_max=2000 * u.GeV).to( "erg cm-2 s-1" ) ) ###################################################################### # Next we will access the flux points data of the Crab: # print(crab_3fhl.flux_points) ###################################################################### # If you want to learn more about the different flux point formats you can # read the specification # `here `__. # # Now we can check the underlying Astropy data structure again by accessing # the ``.table`` attribute: # print(crab_3fhl.flux_points.to_table(sed_type="dnde", formatted=True)) ###################################################################### # Finally let’s combine the spectral model and flux points in a single plot # and scale with ``energy_power=2`` to obtain the spectral energy # distribution: # ax = crab_3fhl_spec.plot(energy_bounds=[10, 2000] * u.GeV, sed_type="e2dnde") crab_3fhl.flux_points.plot(ax=ax, sed_type="e2dnde") plt.show() ###################################################################### # Exercises # ~~~~~~~~~ # # - Plot the spectral model and flux points for PKS 2155-304 for the 3FGL # and 2FHL catalogs. Try to plot the error of the model (aka # “Butterfly”) as well. ###################################################################### # What next? # ---------- # # This was a quick introduction to some of the high level classes in # Astropy and Gammapy. # # - To learn more about those classes, go to the API docs (links are in # the introduction at the top). # - To learn more about other parts of Gammapy (e.g. Fermi-LAT and TeV # data analysis), check out the other tutorial notebooks. # - To see what’s available in Gammapy, browse the Gammapy docs or use # the full-text search. # - If you have any questions, ask on the mailing list. # ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.176642 gammapy-1.3/gammapy/0000755000175100001770000000000014721316215014052 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/__init__.py0000644000175100001770000000432014721316200016154 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Gammapy: A Python package for gamma-ray astronomy. * Code: https://github.com/gammapy/gammapy * Docs: https://docs.gammapy.org/ The top-level `gammapy` namespace is almost empty; it only contains this: :: test --- Run Gammapy unit tests __version__ --- Gammapy version string The Gammapy functionality is available for import from the following sub-packages (e.g. 
`gammapy.makers`): :: astro --- Astrophysical source and population models catalog --- Source catalog tools makers --- Data reduction functionality data --- Data and observation handling irf --- Instrument response functions (IRFs) maps --- Sky map data structures modeling --- Models and fitting estimators --- High level flux estimation stats --- Statistics tools utils --- Utility functions and classes """ __all__ = ["__version__", "song"] from importlib.metadata import PackageNotFoundError, version try: __version__ = version(__name__) except PackageNotFoundError: # package is not installed pass del version, PackageNotFoundError def song(karaoke=False): """ Listen to the Gammapy song. Make sure you listen on good headphones or speakers. You won't be disappointed! Parameters ---------- karaoke : bool Print lyrics to sing along. """ import sys import webbrowser webbrowser.open("https://gammapy.org/gammapy_song.mp3") if karaoke: lyrics = ( "\nGammapy Song Lyrics\n" "-------------------\n\n" "Gammapy, gamma-ray data analysis package\n" "Gammapy, prototype software CTA science tools\n\n" "Supernova remnants, pulsar winds, AGN, Gamma, Gamma, Gammapy\n" "Galactic plane survey, pevatrons, Gammapy, Gamma, Gammapy\n" "Gammapy, github, continuous integration, readthedocs, travis, " "open source project\n\n" "Gammapy, Gammapy\n\n" "Supernova remnants, pulsar winds, AGN, Gamma, Gamma, Gammapy\n" ) centered = "\n".join(f"{s:^80}" for s in lyrics.split("\n")) sys.stdout.write(centered) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/__main__.py0000644000175100001770000000054614721316200016143 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Top-level script environment for Gammapy. 
This is what's executed when you run: python -m gammapy See https://docs.python.org/3/library/__main__.html """ import sys from .scripts.main import cli if __name__ == "__main__": sys.exit(cli()) # pylint:disable=no-value-for-parameter ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/_astropy_init.py0000644000175100001770000000054414721316200017304 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import os __all__ = ["__version__", "test"] try: from .version import version as __version__ except ImportError: __version__ = "" # Create the test function for self test from astropy.tests.runner import TestRunner test = TestRunner.make_test_runner_in(os.path.dirname(__file__)) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.176642 gammapy-1.3/gammapy/analysis/0000755000175100001770000000000014721316215015675 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/__init__.py0000644000175100001770000000034214721316200017777 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Gammapy high level interface (analysis).""" from .config import AnalysisConfig from .core import Analysis __all__ = [ "Analysis", "AnalysisConfig", ] ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.180642 gammapy-1.3/gammapy/analysis/config/0000755000175100001770000000000014721316215017142 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/config/docs.yaml0000644000175100001770000000434514721316200020756 0ustar00runnerdocker# Section: general # General settings for the high level interface / optional general: # logging settings for the session log: # choose one of the example values for level level: INFO # also CRITICAL, ERROR, WARNING, DEBUG filename: filename.log filemode: w format: "%(asctime)s - %(message)s" datefmt: "%d-%b-%y %H:%M:%S" # output folder where files will be stored outdir: . 
# Section: observations # Observations used in the analysis / mandatory observations: # path to data store where to fetch observations datastore: $GAMMAPY_DATA/hess-dl3-dr1/ obs_ids: [23523, 23526] obs_file: # csv file with obs_ids # spatial/time filters applied on the obs_ids obs_cone: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 3 deg} obs_time: {start: '2019-12-01', stop: '2020-03-01'} # Section: datasets # Process of data reduction / mandatory datasets: type: 3d # also 1d stack: false geom: wcs: skydir: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg} binsize: 0.1 deg width: {width: 7 deg, height: 5 deg} binsize_irf: 0.1 deg axes: energy: {min: 0.1 TeV, max: 10 TeV, nbins: 30} energy_true: {min: 0.1 TeV, max: 10 TeV, nbins: 30} map_selection: ['counts', 'exposure', 'background', 'psf', 'edisp'] background: method: ring # also fov_background, reflected for 1d exclusion: # fits file for exclusion mask parameters: {r_in: 0.7 deg, width: 0.7 deg} # ring safe_mask: methods: ['aeff-default', 'offset-max'] parameters: {offset_max: 2.5 deg} on_region: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 3 deg} containment_correction: true # Section: fit # Fitting process / optional fit: fit_range: {min: 0.1 TeV, max: 10 TeV} # Section: flux_points # Flux estimation process / optional flux_points: energy: {min: 0.1 TeV, max: 10 TeV, nbins: 30} source: "source" parameters: {} # Section: excess_map # Flux/significance map process / optional excess_map: correlation_radius: 0.1 deg energy_edges: {min: 0.1 TeV, max: 10 TeV, nbins: 1} parameters: {} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/config/example-1d.yaml0000644000175100001770000000100014721316200021754 0ustar00runnerdockerobservations: datastore: $GAMMAPY_DATA/hess-dl3-dr1 obs_cone: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 5 deg} obs_ids: [23592, 23559] datasets: type: 1d stack: false geom: axes: energy: {min: 0.01 TeV, max: 100 TeV, nbins: 73} on_region: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 0.1 deg} containment_correction: true fit: fit_range: {min: 0.1 TeV, max: 100 TeV} flux_points: energy: {min: 1 TeV, max: 10 TeV, nbins: 2} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/config/example-3d.yaml0000644000175100001770000000205014721316200021756 0ustar00runnerdockerobservations: datastore: $GAMMAPY_DATA/hess-dl3-dr1 obs_cone: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 5 deg} obs_ids: [23592, 23559] datasets: type: 3d stack: true geom: wcs: skydir: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg} binsize: 0.04 deg width: {width: 5 deg, height: 5 deg} binsize_irf: 0.2 deg selection: offset_max: 2.5 deg axes: energy: {min: 1 TeV, max: 10 TeV, nbins: 4} energy_true: {min: 0.7 TeV, max: 12 TeV, nbins: 10} map_selection: ['counts', 'exposure', 'background', 'psf', 'edisp'] safe_mask: methods: ['offset-max'] parameters: {offset_max: 2.5 deg} background: method: fov_background parameters: {method: 'scale'} fit: fit_range: {min: 1 TeV, max: 10.1 TeV} flux_points: energy: {min: 1 TeV, max: 10 TeV, nbins: 2} excess_map: correlation_radius: 0.1 deg energy_edges: {min: 1 TeV, max: 10 TeV, nbins: 1} parameters: {} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/config/model-1d.yaml0000644000175100001770000000106314721316200021422 0ustar00runnerdockercomponents: - name: 
source type: SkyModel spectral: type: PowerLawSpectralModel parameters: - frozen: false name: index scale: 1.0 unit: '' value: 2.6 - frozen: false name: amplitude scale: 1.0 unit: cm-2 s-1 TeV-1 value: 5.0e-11 - frozen: true name: reference scale: 1.0 unit: TeV value: 1.0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/config/model.yaml0000644000175100001770000000176314721316200021127 0ustar00runnerdockercomponents: - name: source spatial: parameters: - frozen: false max: 85 min: 80 name: lon_0 scale: 1.0 unit: deg value: 83.633 - frozen: false max: 24 min: 20 name: lat_0 scale: 1.0 unit: deg value: 22.14 type: PointSpatialModel spectral: parameters: - frozen: false name: index scale: 1.0 unit: '' value: 2.6 - frozen: false name: amplitude scale: 1.0 unit: cm-2 s-1 TeV-1 value: 5.0e-11 - frozen: true name: reference scale: 1.0 unit: TeV value: 1.0 type: PowerLawSpectralModel type: SkyModel ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/config.py0000644000175100001770000002166714721316200017522 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import json import logging from collections import defaultdict from collections.abc import Mapping from enum import Enum from pathlib import Path from typing import List, Optional import yaml from pydantic import BaseModel, ConfigDict from gammapy.makers import MapDatasetMaker from gammapy.utils.scripts import read_yaml, to_yaml, write_yaml from gammapy.utils.types import AngleType, EnergyType, PathType, TimeType __all__ = ["AnalysisConfig"] CONFIG_PATH = Path(__file__).resolve().parent / "config" DOCS_FILE = CONFIG_PATH / "docs.yaml" log = logging.getLogger(__name__) def deep_update(d, u): """Recursively update a nested dictionary. Taken from: https://stackoverflow.com/a/3233356/19802442 """ for k, v in u.items(): if isinstance(v, Mapping): d[k] = deep_update(d.get(k, {}), v) else: d[k] = v return d class ReductionTypeEnum(str, Enum): spectrum = "1d" cube = "3d" class FrameEnum(str, Enum): icrs = "icrs" galactic = "galactic" class RequiredHDUEnum(str, Enum): events = "events" gti = "gti" aeff = "aeff" bkg = "bkg" edisp = "edisp" psf = "psf" rad_max = "rad_max" class BackgroundMethodEnum(str, Enum): reflected = "reflected" fov = "fov_background" ring = "ring" class SafeMaskMethodsEnum(str, Enum): aeff_default = "aeff-default" aeff_max = "aeff-max" edisp_bias = "edisp-bias" offset_max = "offset-max" bkg_peak = "bkg-peak" class MapSelectionEnum(str, Enum): counts = "counts" exposure = "exposure" background = "background" psf = "psf" edisp = "edisp" class GammapyBaseConfig(BaseModel): model_config = ConfigDict( arbitrary_types_allowed=True, validate_assignment=True, extra="forbid", validate_default=True, use_enum_values=True, ) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"

<pre>{html.escape(str(self))}</pre>
" class SkyCoordConfig(GammapyBaseConfig): frame: Optional[FrameEnum] = None lon: Optional[AngleType] = None lat: Optional[AngleType] = None class EnergyAxisConfig(GammapyBaseConfig): min: Optional[EnergyType] = None max: Optional[EnergyType] = None nbins: Optional[int] = None class SpatialCircleConfig(GammapyBaseConfig): frame: Optional[FrameEnum] = None lon: Optional[AngleType] = None lat: Optional[AngleType] = None radius: Optional[AngleType] = None class EnergyRangeConfig(GammapyBaseConfig): min: Optional[EnergyType] = None max: Optional[EnergyType] = None class TimeRangeConfig(GammapyBaseConfig): start: Optional[TimeType] = None stop: Optional[TimeType] = None class FluxPointsConfig(GammapyBaseConfig): energy: EnergyAxisConfig = EnergyAxisConfig() source: str = "source" parameters: dict = {"selection_optional": "all"} class LightCurveConfig(GammapyBaseConfig): time_intervals: TimeRangeConfig = TimeRangeConfig() energy_edges: EnergyAxisConfig = EnergyAxisConfig() source: str = "source" parameters: dict = {"selection_optional": "all"} class FitConfig(GammapyBaseConfig): fit_range: EnergyRangeConfig = EnergyRangeConfig() class ExcessMapConfig(GammapyBaseConfig): correlation_radius: AngleType = "0.1 deg" parameters: dict = {} energy_edges: EnergyAxisConfig = EnergyAxisConfig() class BackgroundConfig(GammapyBaseConfig): method: Optional[BackgroundMethodEnum] = None exclusion: Optional[PathType] = None parameters: dict = {} class SafeMaskConfig(GammapyBaseConfig): methods: List[SafeMaskMethodsEnum] = [SafeMaskMethodsEnum.aeff_default] parameters: dict = {} class EnergyAxesConfig(GammapyBaseConfig): energy: EnergyAxisConfig = EnergyAxisConfig(min="1 TeV", max="10 TeV", nbins=5) energy_true: EnergyAxisConfig = EnergyAxisConfig( min="0.5 TeV", max="20 TeV", nbins=16 ) class SelectionConfig(GammapyBaseConfig): offset_max: AngleType = "2.5 deg" class WidthConfig(GammapyBaseConfig): width: AngleType = "5 deg" height: AngleType = "5 deg" class WcsConfig(GammapyBaseConfig): skydir: SkyCoordConfig = SkyCoordConfig() binsize: AngleType = "0.02 deg" width: WidthConfig = WidthConfig() binsize_irf: AngleType = "0.2 deg" class GeomConfig(GammapyBaseConfig): wcs: WcsConfig = WcsConfig() selection: SelectionConfig = SelectionConfig() axes: EnergyAxesConfig = EnergyAxesConfig() class DatasetsConfig(GammapyBaseConfig): type: ReductionTypeEnum = ReductionTypeEnum.spectrum stack: bool = True geom: GeomConfig = GeomConfig() map_selection: List[MapSelectionEnum] = MapDatasetMaker.available_selection background: BackgroundConfig = BackgroundConfig() safe_mask: SafeMaskConfig = SafeMaskConfig() on_region: SpatialCircleConfig = SpatialCircleConfig() containment_correction: bool = True class ObservationsConfig(GammapyBaseConfig): datastore: PathType = Path("$GAMMAPY_DATA/hess-dl3-dr1/") obs_ids: List[int] = [] obs_file: Optional[PathType] = None obs_cone: SpatialCircleConfig = SpatialCircleConfig() obs_time: TimeRangeConfig = TimeRangeConfig() required_irf: List[RequiredHDUEnum] = ["aeff", "edisp", "psf", "bkg"] class LogConfig(GammapyBaseConfig): level: str = "info" filename: Optional[PathType] = None filemode: Optional[str] = None format: Optional[str] = None datefmt: Optional[str] = None class GeneralConfig(GammapyBaseConfig): log: LogConfig = LogConfig() outdir: str = "." 
n_jobs: int = 1 datasets_file: Optional[PathType] = None models_file: Optional[PathType] = None class AnalysisConfig(GammapyBaseConfig): """Gammapy analysis configuration.""" general: GeneralConfig = GeneralConfig() observations: ObservationsConfig = ObservationsConfig() datasets: DatasetsConfig = DatasetsConfig() fit: FitConfig = FitConfig() flux_points: FluxPointsConfig = FluxPointsConfig() excess_map: ExcessMapConfig = ExcessMapConfig() light_curve: LightCurveConfig = LightCurveConfig() def __str__(self): """Display settings in pretty YAML format.""" info = self.__class__.__name__ + "\n\n\t" data = self.to_yaml() data = data.replace("\n", "\n\t") info += data return info.expandtabs(tabsize=4) @classmethod def read(cls, path): """Read from YAML file. Parameters ---------- path : str Input file path. """ config = read_yaml(path) config.pop("metadata", None) return AnalysisConfig(**config) @classmethod def from_yaml(cls, config_str): """Create from YAML string. Parameters ---------- config_str : str YAML string. """ settings = yaml.safe_load(config_str) return AnalysisConfig(**settings) def write(self, path, overwrite=False): """Write to YAML file. Parameters ---------- path : `pathlib.Path` or str Path to write files. overwrite : bool, optional Overwrite existing file. Default is False. """ yaml_str = self.to_yaml() write_yaml(yaml_str, path, overwrite=overwrite) def to_yaml(self): """Convert to YAML string.""" data = json.loads(self.model_dump_json()) return to_yaml(data) def set_logging(self): """Set logging config. Calls ``logging.basicConfig``, i.e. adjusts global logging state. """ self.general.log.level = self.general.log.level.upper() logging.basicConfig(**self.general.log.model_dump()) log.info("Setting logging config: {!r}".format(self.general.log.model_dump())) def update(self, config=None): """Update config with provided settings. Parameters ---------- config : str or `AnalysisConfig` object, optional Configuration settings provided as a YAML string or an `AnalysisConfig` object. Default is None. 
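Examples -------- A minimal usage sketch; the YAML snippet here is illustrative. >>> from gammapy.analysis import AnalysisConfig >>> config = AnalysisConfig() >>> config = config.update("general: {outdir: results}")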
""" if isinstance(config, str): other = AnalysisConfig.from_yaml(config) elif isinstance(config, AnalysisConfig): other = config else: raise TypeError(f"Invalid type: {config}") config_new = deep_update( self.model_dump(exclude_defaults=True), other.model_dump(exclude_defaults=True), ) return AnalysisConfig(**config_new) @staticmethod def _get_doc_sections(): """Return dictionary with commented docs from docs file.""" doc = defaultdict(str) with open(DOCS_FILE) as f: for line in filter(lambda line: not line.startswith("---"), f): line = line.strip("\n") if line.startswith("# Section: "): keyword = line.replace("# Section: ", "") doc[keyword] += line + "\n" return doc ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/core.py0000644000175100001770000005471614721316200017206 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Session class driving the high level interface API.""" import html import logging from astropy.coordinates import SkyCoord from astropy.table import Table from regions import CircleSkyRegion from gammapy.analysis.config import AnalysisConfig from gammapy.data import DataStore from gammapy.datasets import Datasets, FluxPointsDataset, MapDataset, SpectrumDataset from gammapy.estimators import ( ExcessMapEstimator, FluxPointsEstimator, LightCurveEstimator, ) from gammapy.makers import ( DatasetsMaker, FoVBackgroundMaker, MapDatasetMaker, ReflectedRegionsBackgroundMaker, RingBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, ) from gammapy.maps import Map, MapAxis, RegionGeom, WcsGeom from gammapy.modeling import Fit from gammapy.modeling.models import DatasetModels, FoVBackgroundModel, Models from gammapy.utils.pbar import progress_bar from gammapy.utils.scripts import make_path __all__ = ["Analysis"] log = logging.getLogger(__name__) class Analysis: """Config-driven high level analysis interface. It is initialized by default with a set of configuration parameters and values declared in an internal high level interface model, though the user can also provide configuration parameters passed as a nested dictionary at the moment of instantiation. In that case these parameters will overwrite the default values of those present in the configuration file. Parameters ---------- config : dict or `~gammapy.analysis.AnalysisConfig` Configuration options following `AnalysisConfig` schema. """ def __init__(self, config): self.config = config self.config.set_logging() self.datastore = None self.observations = None self.datasets = None self.fit = Fit() self.fit_result = None self.flux_points = None def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def models(self): if not self.datasets: raise RuntimeError("No datasets defined. Impossible to set models.") return self.datasets.models @models.setter def models(self, models): self.set_models(models, extend=False) @property def config(self): """Analysis configuration as an `~gammapy.analysis.AnalysisConfig` object.""" return self._config @config.setter def config(self, value): if isinstance(value, dict): self._config = AnalysisConfig(**value) elif isinstance(value, AnalysisConfig): self._config = value else: raise TypeError("config must be dict or AnalysisConfig.") def _set_data_store(self): """Set the datastore on the Analysis object.""" path = make_path(self.config.observations.datastore) if path.is_file(): log.debug(f"Setting datastore from file: {path}") self.datastore = DataStore.from_file(path) elif path.is_dir(): log.debug(f"Setting datastore from directory: {path}") self.datastore = DataStore.from_dir(path) else: raise FileNotFoundError(f"Datastore not found: {path}") def _make_obs_table_selection(self): """Return list of obs_ids after filtering on datastore observation table.""" obs_settings = self.config.observations # Reject configs with list of obs_ids and obs_file set at the same time if len(obs_settings.obs_ids) and obs_settings.obs_file is not None: raise ValueError( "Values for both parameters obs_ids and obs_file are not accepted." ) # First select input list of observations from obs_table if len(obs_settings.obs_ids): selected_obs_table = self.datastore.obs_table.select_obs_id( obs_settings.obs_ids ) elif obs_settings.obs_file is not None: path = make_path(obs_settings.obs_file) ids = list(Table.read(path, format="ascii", data_start=0).columns[0]) selected_obs_table = self.datastore.obs_table.select_obs_id(ids) else: selected_obs_table = self.datastore.obs_table # Apply cone selection if obs_settings.obs_cone.lon is not None: cone = dict( type="sky_circle", frame=obs_settings.obs_cone.frame, lon=obs_settings.obs_cone.lon, lat=obs_settings.obs_cone.lat, radius=obs_settings.obs_cone.radius, border="0 deg", ) selected_obs_table = selected_obs_table.select_observations(cone) return selected_obs_table["OBS_ID"].tolist() def get_observations(self): """Fetch observations from the data store according to criteria defined in the configuration.""" observations_settings = self.config.observations self._set_data_store() log.info("Fetching observations.") ids = self._make_obs_table_selection() required_irf = observations_settings.required_irf self.observations = self.datastore.get_observations( ids, skip_missing=True, required_irf=required_irf ) if observations_settings.obs_time.start is not None: start = observations_settings.obs_time.start stop = observations_settings.obs_time.stop if len(start.shape) == 0: time_intervals = [(start, stop)] else: time_intervals = [(tstart, tstop) for tstart, tstop in zip(start, stop)] self.observations = self.observations.select_time(time_intervals) log.info(f"Number of selected observations: {len(self.observations)}") for obs in self.observations: log.debug(obs) def get_datasets(self): """ Produce reduced datasets. Notes ----- The progress bar can be displayed for this function. """ datasets_settings = self.config.datasets if not self.observations or len(self.observations) == 0: raise RuntimeError("No observations have been selected.") if datasets_settings.type == "1d": self._spectrum_extraction() else: # 3d self._map_making() def set_models(self, models, extend=True): """Set models on datasets. 
Adds a `FoVBackgroundModel` if not already present. Parameters ---------- models : `~gammapy.modeling.models.Models` or str Models object or YAML models string. extend : bool, optional Extend the existing models on the datasets or replace them. Default is True. """ if not self.datasets or len(self.datasets) == 0: raise RuntimeError("Missing datasets") log.info("Reading model.") if isinstance(models, str): models = Models.from_yaml(models) elif isinstance(models, Models): pass elif isinstance(models, DatasetModels) or isinstance(models, list): models = Models(models) else: raise TypeError(f"Invalid type: {models!r}") if extend: models.extend(self.datasets.models) self.datasets.models = models bkg_models = [] for dataset in self.datasets: if dataset.tag == "MapDataset" and dataset.background_model is None: bkg_models.append(FoVBackgroundModel(dataset_name=dataset.name)) if bkg_models: models.extend(bkg_models) self.datasets.models = models log.info(models) def read_models(self, path, extend=True): """Read models from YAML file. Parameters ---------- path : str Path to the model file. extend : bool, optional Extend the existing models on the datasets or replace them. Default is True. """ path = make_path(path) models = Models.read(path) self.set_models(models, extend=extend) log.info(f"Models loaded from {path}.") def write_models(self, overwrite=True, write_covariance=True): """Write models to YAML file. File name is taken from the configuration file. """ filename_models = self.config.general.models_file if filename_models is not None: self.models.write( filename_models, overwrite=overwrite, write_covariance=write_covariance ) log.info(f"Models written to {filename_models}.") else: raise RuntimeError("Missing models_file in config.general") def read_datasets(self): """Read datasets from YAML file. File names are taken from the configuration file. """ filename = self.config.general.datasets_file filename_models = self.config.general.models_file if filename is not None: self.datasets = Datasets.read(filename) log.info(f"Datasets loaded from {filename}.") if filename_models is not None: self.read_models(filename_models, extend=False) else: raise RuntimeError("Missing datasets_file in config.general") def write_datasets(self, overwrite=True, write_covariance=True): """Write datasets to YAML file. File names are taken from the configuration file. Parameters ---------- overwrite : bool, optional Overwrite existing file. Default is True. write_covariance : bool, optional Save covariance or not. Default is True. 
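Examples -------- A minimal sketch; the file names here are illustrative and assume the analysis already holds reduced datasets. >>> analysis.config.general.datasets_file = "datasets.yaml" # doctest: +SKIP >>> analysis.config.general.models_file = "models.yaml" # doctest: +SKIP >>> analysis.write_datasets() # doctest: +SKIP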
""" filename = self.config.general.datasets_file filename_models = self.config.general.models_file if filename is not None: self.datasets.write( filename, filename_models, overwrite=overwrite, write_covariance=write_covariance, ) log.info(f"Datasets stored to {filename}.") log.info(f"Datasets stored to {filename_models}.") else: raise RuntimeError("Missing datasets_file in config.general") def run_fit(self): """Fitting reduced datasets to model.""" if not self.models: raise RuntimeError("Missing models") fit_settings = self.config.fit for dataset in self.datasets: if fit_settings.fit_range: energy_min = fit_settings.fit_range.min energy_max = fit_settings.fit_range.max geom = dataset.counts.geom dataset.mask_fit = geom.energy_mask(energy_min, energy_max) log.info("Fitting datasets.") result = self.fit.run(datasets=self.datasets) self.fit_result = result log.info(self.fit_result) def get_flux_points(self): """Calculate flux points for a specific model component.""" if not self.datasets: raise RuntimeError( "No datasets defined. Impossible to compute flux points." ) fp_settings = self.config.flux_points log.info("Calculating flux points.") energy_edges = self._make_energy_axis(fp_settings.energy).edges flux_point_estimator = FluxPointsEstimator( energy_edges=energy_edges, source=fp_settings.source, fit=self.fit, n_jobs=self.config.general.n_jobs, **fp_settings.parameters, ) fp = flux_point_estimator.run(datasets=self.datasets) self.flux_points = FluxPointsDataset( data=fp, models=self.models[fp_settings.source] ) cols = ["e_ref", "dnde", "dnde_ul", "dnde_err", "sqrt_ts"] table = self.flux_points.data.to_table(sed_type="dnde") log.info("\n{}".format(table[cols])) def get_excess_map(self): """Calculate excess map with respect to the current model.""" excess_settings = self.config.excess_map log.info("Computing excess maps.") # TODO: Here we could possibly stack the datasets if needed # or allow to compute the excess map for each dataset if len(self.datasets) > 1: raise ValueError("Datasets must be stacked to compute the excess map") if self.datasets[0].tag not in ["MapDataset", "MapDatasetOnOff"]: raise ValueError("Cannot compute excess map for 1D dataset") energy_edges = self._make_energy_axis(excess_settings.energy_edges) if energy_edges is not None: energy_edges = energy_edges.edges excess_map_estimator = ExcessMapEstimator( correlation_radius=excess_settings.correlation_radius, energy_edges=energy_edges, **excess_settings.parameters, ) self.excess_map = excess_map_estimator.run(self.datasets[0]) def get_light_curve(self): """Calculate light curve for a specific model component.""" lc_settings = self.config.light_curve log.info("Computing light curve.") energy_edges = self._make_energy_axis(lc_settings.energy_edges).edges if ( lc_settings.time_intervals.start is None or lc_settings.time_intervals.stop is None ): log.info( "Time intervals not defined. Extract light curve on datasets GTIs." 
) time_intervals = None else: time_intervals = [ (t1, t2) for t1, t2 in zip( lc_settings.time_intervals.start, lc_settings.time_intervals.stop ) ] light_curve_estimator = LightCurveEstimator( time_intervals=time_intervals, energy_edges=energy_edges, source=lc_settings.source, fit=self.fit, n_jobs=self.config.general.n_jobs, **lc_settings.parameters, ) lc = light_curve_estimator.run(datasets=self.datasets) self.light_curve = lc log.info( "\n{}".format( self.light_curve.to_table(format="lightcurve", sed_type="flux") ) ) def update_config(self, config): """Update the configuration.""" self.config = self.config.update(config=config) @staticmethod def _create_wcs_geometry(wcs_geom_settings, axes): """Create the WCS geometry.""" geom_params = {} skydir_settings = wcs_geom_settings.skydir if skydir_settings.lon is not None: skydir = SkyCoord( skydir_settings.lon, skydir_settings.lat, frame=skydir_settings.frame ) geom_params["skydir"] = skydir if skydir_settings.frame in ["icrs", "galactic"]: geom_params["frame"] = skydir_settings.frame else: raise ValueError( f"Incorrect skydir frame: expect 'icrs' or 'galactic'. Got {skydir_settings.frame}" ) geom_params["axes"] = axes geom_params["binsz"] = wcs_geom_settings.binsize width = wcs_geom_settings.width.width.to("deg").value height = wcs_geom_settings.width.height.to("deg").value geom_params["width"] = (width, height) return WcsGeom.create(**geom_params) @staticmethod def _create_region_geometry(on_region_settings, axes): """Create the region geometry.""" on_lon = on_region_settings.lon on_lat = on_region_settings.lat on_center = SkyCoord(on_lon, on_lat, frame=on_region_settings.frame) on_region = CircleSkyRegion(on_center, on_region_settings.radius) return RegionGeom.create(region=on_region, axes=axes) def _create_geometry(self): """Create the geometry.""" log.debug("Creating geometry.") datasets_settings = self.config.datasets geom_settings = datasets_settings.geom axes = [self._make_energy_axis(geom_settings.axes.energy)] if datasets_settings.type == "3d": geom = self._create_wcs_geometry(geom_settings.wcs, axes) elif datasets_settings.type == "1d": geom = self._create_region_geometry(datasets_settings.on_region, axes) else: raise ValueError( f"Incorrect dataset type. Expect '1d' or '3d'. Got {datasets_settings.type}." 
) return geom def _create_reference_dataset(self, name=None): """Create the reference dataset for the current analysis.""" log.debug("Creating target Dataset.") geom = self._create_geometry() geom_settings = self.config.datasets.geom geom_irf = dict(energy_axis_true=None, binsz_irf=None) if geom_settings.axes.energy_true.min is not None: geom_irf["energy_axis_true"] = self._make_energy_axis( geom_settings.axes.energy_true, name="energy_true" ) if geom_settings.wcs.binsize_irf is not None: geom_irf["binsz_irf"] = geom_settings.wcs.binsize_irf.to("deg").value if self.config.datasets.type == "1d": return SpectrumDataset.create(geom, name=name, **geom_irf) else: return MapDataset.create(geom, name=name, **geom_irf) def _create_dataset_maker(self): """Create the Dataset Maker.""" log.debug("Creating the target Dataset Maker.") datasets_settings = self.config.datasets if datasets_settings.type == "3d": maker = MapDatasetMaker(selection=datasets_settings.map_selection) elif datasets_settings.type == "1d": maker_config = {} if datasets_settings.containment_correction: maker_config["containment_correction"] = ( datasets_settings.containment_correction ) maker_config["selection"] = ["counts", "exposure", "edisp"] maker = SpectrumDatasetMaker(**maker_config) return maker def _create_safe_mask_maker(self): """Create the SafeMaskMaker.""" log.debug("Creating the mask_safe Maker.") safe_mask_selection = self.config.datasets.safe_mask.methods safe_mask_settings = self.config.datasets.safe_mask.parameters return SafeMaskMaker(methods=safe_mask_selection, **safe_mask_settings) def _create_background_maker(self): """Create the Background maker.""" log.info("Creating the background Maker.") datasets_settings = self.config.datasets bkg_maker_config = {} if datasets_settings.background.exclusion: path = make_path(datasets_settings.background.exclusion) exclusion_mask = Map.read(path) exclusion_mask.data = exclusion_mask.data.astype(bool) bkg_maker_config["exclusion_mask"] = exclusion_mask bkg_maker_config.update(datasets_settings.background.parameters) bkg_method = datasets_settings.background.method bkg_maker = None if bkg_method == "fov_background": log.debug(f"Creating FoVBackgroundMaker with arguments {bkg_maker_config}") bkg_maker = FoVBackgroundMaker(**bkg_maker_config) elif bkg_method == "ring": bkg_maker = RingBackgroundMaker(**bkg_maker_config) log.debug(f"Creating RingBackgroundMaker with arguments {bkg_maker_config}") if datasets_settings.geom.axes.energy.nbins > 1: raise ValueError( "You need to define a single-bin energy geometry for your dataset." ) elif bkg_method == "reflected": bkg_maker = ReflectedRegionsBackgroundMaker(**bkg_maker_config) log.debug( f"Creating ReflectedRegionsBackgroundMaker with arguments {bkg_maker_config}" ) else: log.warning("No background maker set. 
Check configuration.") return bkg_maker def _map_making(self): """Make maps and datasets for 3d analysis.""" datasets_settings = self.config.datasets offset_max = datasets_settings.geom.selection.offset_max log.info("Creating reference dataset and makers.") stacked = self._create_reference_dataset(name="stacked") maker = self._create_dataset_maker() maker_safe_mask = self._create_safe_mask_maker() bkg_maker = self._create_background_maker() makers = [maker, maker_safe_mask, bkg_maker] makers = [maker for maker in makers if maker is not None] log.info("Start the data reduction loop.") datasets_maker = DatasetsMaker( makers, stack_datasets=datasets_settings.stack, n_jobs=self.config.general.n_jobs, cutout_mode="trim", cutout_width=2 * offset_max, ) self.datasets = datasets_maker.run(stacked, self.observations) def _spectrum_extraction(self): """Run all steps for the spectrum extraction.""" log.info("Reducing spectrum datasets.") datasets_settings = self.config.datasets dataset_maker = self._create_dataset_maker() safe_mask_maker = self._create_safe_mask_maker() bkg_maker = self._create_background_maker() reference = self._create_reference_dataset() datasets = [] for obs in progress_bar(self.observations, desc="Observations"): log.debug(f"Processing observation {obs.obs_id}") dataset = dataset_maker.run(reference.copy(), obs) if bkg_maker is not None: dataset = bkg_maker.run(dataset, obs) if dataset.counts_off is None: log.debug( f"No OFF region found for observation {obs.obs_id}. Discarding." ) continue dataset = safe_mask_maker.run(dataset, obs) log.debug(dataset) datasets.append(dataset) self.datasets = Datasets(datasets) if datasets_settings.stack: stacked = self.datasets.stack_reduce(name="stacked") self.datasets = Datasets([stacked]) @staticmethod def _make_energy_axis(axis, name="energy"): if axis.min is None or axis.max is None: return None elif axis.nbins is None or axis.nbins < 1: return None else: return MapAxis.from_bounds( name=name, lo_bnd=axis.min.value, hi_bnd=axis.max.to_value(axis.min.unit), nbin=axis.nbins, unit=axis.min.unit, interp="log", node_type="edges", ) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.180642 gammapy-1.3/gammapy/analysis/tests/0000755000175100001770000000000014721316215017037 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/tests/__init__.py0000644000175100001770000000010014721316200021131 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/tests/test_analysis.py0000644000175100001770000003555614721316200022303 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from pathlib import Path import pytest from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion from pydantic import ValidationError from gammapy.analysis import Analysis, AnalysisConfig from gammapy.datasets import MapDataset, SpectrumDatasetOnOff from gammapy.maps import WcsGeom, WcsNDMap from gammapy.modeling.models import DatasetModels from gammapy.utils.testing import requires_data CONFIG_PATH = Path(__file__).resolve().parent / ".." 
/ "config" MODEL_FILE = CONFIG_PATH / "model.yaml" MODEL_FILE_1D = CONFIG_PATH / "model-1d.yaml" def get_example_config(which): """Example config: which can be 1d or 3d.""" return AnalysisConfig.read(CONFIG_PATH / f"example-{which}.yaml") def test_init(): cfg = {"general": {"outdir": "test"}} analysis = Analysis(cfg) assert analysis.config.general.outdir == "test" with pytest.raises(TypeError): Analysis("spam") def test_update_config(): analysis = Analysis(AnalysisConfig()) data = {"general": {"outdir": "test"}} config = AnalysisConfig(**data) analysis.update_config(config) assert analysis.config.general.outdir == "test" analysis = Analysis(AnalysisConfig()) data = """ general: outdir: test """ analysis.update_config(data) assert analysis.config.general.outdir == "test" analysis = Analysis(AnalysisConfig()) with pytest.raises(TypeError): analysis.update_config(0) def test_get_observations_no_datastore(): config = AnalysisConfig() analysis = Analysis(config) analysis.config.observations.datastore = "other" with pytest.raises(FileNotFoundError): analysis.get_observations() @requires_data() def test_get_observations_all(): config = AnalysisConfig() analysis = Analysis(config) analysis.config.observations.datastore = "$GAMMAPY_DATA/cta-1dc/index/gps/" analysis.get_observations() assert len(analysis.observations) == 4 @requires_data() def test_get_observations_obs_ids(): config = AnalysisConfig() analysis = Analysis(config) analysis.config.observations.datastore = "$GAMMAPY_DATA/cta-1dc/index/gps/" analysis.config.observations.obs_ids = ["110380"] analysis.get_observations() assert len(analysis.observations) == 1 @requires_data() def test_get_observations_obs_cone(): config = AnalysisConfig() analysis = Analysis(config) analysis.config.observations.datastore = "$GAMMAPY_DATA/hess-dl3-dr1" analysis.config.observations.obs_cone = { "frame": "icrs", "lon": "83d", "lat": "22d", "radius": "5d", } analysis.get_observations() assert len(analysis.observations) == 4 @requires_data() def test_get_observations_obs_file(tmp_path): config = AnalysisConfig() analysis = Analysis(config) analysis.get_observations() filename = tmp_path / "obs_ids.txt" filename.write_text("20136\n47829\n") analysis.config.observations.obs_file = filename analysis.get_observations() assert len(analysis.observations) == 2 @requires_data() def test_get_observations_obs_time(tmp_path): config = AnalysisConfig() analysis = Analysis(config) analysis.config.observations.obs_time = { "start": "2004-03-26", "stop": "2004-05-26", } analysis.get_observations() assert len(analysis.observations) == 40 analysis.config.observations.obs_ids = [0] with pytest.raises(KeyError): analysis.get_observations() @requires_data() def test_get_observations_missing_irf(): config = AnalysisConfig() analysis = Analysis(config) analysis.config.observations.datastore = "$GAMMAPY_DATA/joint-crab/dl3/magic/" analysis.config.observations.obs_ids = ["05029748"] analysis.config.observations.required_irf = ["aeff", "edisp"] analysis.get_observations() assert len(analysis.observations) == 1 @requires_data() def test_set_models(): config = get_example_config("3d") analysis = Analysis(config) analysis.get_observations() analysis.get_datasets() models_str = Path(MODEL_FILE).read_text() analysis.set_models(models=models_str) assert isinstance(analysis.models, DatasetModels) assert len(analysis.models) == 2 assert analysis.models.names == ["source", "stacked-bkg"] with pytest.raises(TypeError): analysis.set_models(0) new_source = 
analysis.models["source"].copy(name="source2") analysis.set_models(models=[new_source], extend=False) assert len(analysis.models) == 2 assert analysis.models.names == ["source2", "stacked-bkg"] @requires_data() def test_analysis_1d(): cfg = """ observations: datastore: $GAMMAPY_DATA/hess-dl3-dr1 obs_ids: [23523, 23526] obs_time: { start: [J2004.92654346, J2004.92658453, J2004.92663655], stop: [J2004.92658453, J2004.92663655, J2004.92670773] } datasets: type: 1d background: method: reflected geom: axes: energy_true: {min: 0.01 TeV, max: 300 TeV, nbins: 109} on_region: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 0.11 deg} safe_mask: methods: [aeff-default, edisp-bias] parameters: {bias_percent: 10.0} containment_correction: false flux_points: energy: {min: 1 TeV, max: 50 TeV, nbins: 4} light_curve: energy_edges: {min: 1 TeV, max: 50 TeV, nbins: 1} time_intervals: { start: [J2004.92654346, J2004.92658453, J2004.92663655], stop: [J2004.92658453, J2004.92663655, J2004.92670773] } """ config = get_example_config("1d") analysis = Analysis(config) analysis.update_config(cfg) analysis.get_observations() analysis.get_datasets() analysis.read_models(MODEL_FILE_1D) analysis.run_fit() analysis.get_flux_points() analysis.get_light_curve() assert len(analysis.datasets) == 3 table = analysis.flux_points.data.to_table(sed_type="dnde") assert len(table) == 4 dnde = table["dnde"].quantity assert dnde.unit == "cm-2 s-1 TeV-1" assert_allclose(dnde[0].value, 8.116854e-12, rtol=1e-2) assert_allclose(dnde[2].value, 3.444475e-14, rtol=1e-2) axis = analysis.light_curve.geom.axes["time"] assert axis.nbin == 3 assert_allclose(axis.time_min.mjd, [53343.92, 53343.935, 53343.954]) flux = analysis.light_curve.flux.data[:, :, 0, 0] assert_allclose(flux, [[1.688954e-11], [2.347870e-11], [1.604152e-11]], rtol=1e-4) @requires_data() def test_geom_analysis_1d(): cfg = """ observations: datastore: $GAMMAPY_DATA/hess-dl3-dr1 obs_ids: [23523] datasets: type: 1d background: method: reflected on_region: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 0.11 deg} geom: axes: energy: {min: 0.1 TeV, max: 30 TeV, nbins: 20} energy_true: {min: 0.03 TeV, max: 100 TeV, nbins: 50} containment_correction: false flux_points: energy: {min: 1 TeV, max: 50 TeV, nbins: 4} """ config = get_example_config("1d") analysis = Analysis(config) analysis.update_config(cfg) analysis.get_observations() analysis.get_datasets() assert len(analysis.datasets) == 1 axis = analysis.datasets[0].exposure.geom.axes["energy_true"] assert axis.nbin == 50 assert_allclose(axis.edges[0].to_value("TeV"), 0.03) assert_allclose(axis.edges[-1].to_value("TeV"), 100) @requires_data() def test_exclusion_region(tmp_path): config = get_example_config("1d") analysis = Analysis(config) region = CircleSkyRegion(center=SkyCoord("85d 23d"), radius=1 * u.deg) geom = WcsGeom.create(npix=(150, 150), binsz=0.05, skydir=SkyCoord("83d 22d")) exclusion_mask = ~geom.region_mask([region]) filename = tmp_path / "exclusion.fits" exclusion_mask.write(filename) config.datasets.background.method = "reflected" config.datasets.background.exclusion = filename analysis.get_observations() analysis.get_datasets() assert len(analysis.datasets) == 2 config = get_example_config("3d") analysis = Analysis(config) analysis.get_observations() analysis.get_datasets() geom = analysis.datasets[0]._geom exclusion_mask = ~geom.region_mask([region]) filename = tmp_path / "exclusion3d.fits" exclusion_mask.write(filename) config.datasets.background.exclusion = filename analysis.get_datasets() assert 
len(analysis.datasets) == 1 @requires_data() def test_analysis_1d_stacked_no_fit_range(): cfg = """ observations: datastore: $GAMMAPY_DATA/hess-dl3-dr1 obs_cone: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 5 deg} obs_ids: [23592, 23559] datasets: type: 1d stack: false geom: axes: energy: {min: 0.01 TeV, max: 100 TeV, nbins: 73} energy_true: {min: 0.03 TeV, max: 100 TeV, nbins: 50} on_region: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 0.1 deg} containment_correction: true background: method: reflected """ config = AnalysisConfig.from_yaml(cfg) analysis = Analysis(config) analysis.update_config(cfg) analysis.config.datasets.stack = True analysis.get_observations() analysis.get_datasets() analysis.read_models(MODEL_FILE_1D) analysis.run_fit() with pytest.raises(ValueError): analysis.get_excess_map() assert len(analysis.datasets) == 1 assert_allclose(analysis.datasets["stacked"].counts.data.sum(), 184) pars = analysis.models.parameters assert_allclose(analysis.datasets[0].mask_fit.data, True) assert_allclose(pars["index"].value, 2.76913, rtol=1e-2) assert_allclose(pars["amplitude"].value, 5.479729e-11, rtol=1e-2) @requires_data() def test_analysis_ring_background(): config = get_example_config("3d") config.datasets.background.method = "ring" config.datasets.background.parameters = {"r_in": "0.7 deg", "width": "0.7 deg"} config.datasets.geom.axes.energy.nbins = 1 analysis = Analysis(config) analysis.get_observations() analysis.get_datasets() analysis.get_excess_map() assert isinstance(analysis.datasets[0], MapDataset) assert_allclose( analysis.datasets[0].npred_background().data[0, 10, 10], 0.091799, rtol=1e-2 ) assert isinstance(analysis.excess_map["sqrt_ts"], WcsNDMap) assert_allclose(analysis.excess_map.npred_excess.data[0, 62, 62], 134.12389) @requires_data() def test_analysis_ring_3d(): config = get_example_config("3d") config.datasets.background.method = "ring" config.datasets.background.parameters = {"r_in": "0.7 deg", "width": "0.7 deg"} analysis = Analysis(config) analysis.get_observations() with pytest.raises(ValueError): analysis.get_datasets() @requires_data() def test_analysis_no_bkg_1d(caplog): config = get_example_config("1d") analysis = Analysis(config) with caplog.at_level(logging.WARNING): analysis.get_observations() analysis.get_datasets() assert not isinstance(analysis.datasets[0], SpectrumDatasetOnOff) assert "No background maker set. Check configuration." in [ _.message for _ in caplog.records ] @requires_data() def test_analysis_no_bkg_3d(caplog): config = get_example_config("3d") config.datasets.background.method = None analysis = Analysis(config) with caplog.at_level(logging.WARNING): analysis.get_observations() analysis.get_datasets() assert isinstance(analysis.datasets[0], MapDataset) assert "No background maker set. Check configuration." 
in [ _.message for _ in caplog.records ] @requires_data() def test_analysis_3d(): config = get_example_config("3d") analysis = Analysis(config) analysis.get_observations() analysis.get_datasets() analysis.read_models(MODEL_FILE) analysis.datasets["stacked"].background_model.spectral_model.tilt.frozen = False analysis.run_fit() analysis.get_flux_points() assert len(analysis.datasets) == 1 assert len(analysis.models.parameters) == 8 res = analysis.models.parameters assert res["amplitude"].unit == "cm-2 s-1 TeV-1" table = analysis.flux_points.data.to_table(sed_type="dnde") assert len(table) == 2 dnde = table["dnde"].quantity assert_allclose(dnde[0].value, 1.2722e-11, rtol=1e-2) assert_allclose(dnde[-1].value, 4.054128e-13, rtol=1e-2) assert_allclose(res["index"].value, 2.772814, rtol=1e-2) assert_allclose(res["tilt"].value, -0.133436, rtol=1e-2) @requires_data() def test_analysis_3d_joint_datasets(): config = get_example_config("3d") config.datasets.stack = False analysis = Analysis(config) analysis.get_observations() analysis.get_datasets() assert len(analysis.datasets) == 2 assert_allclose( analysis.datasets[0].background_model.spectral_model.norm.value, 1.031743694988066, rtol=1e-6, ) assert_allclose( analysis.datasets[0].background_model.spectral_model.tilt.value, 0.0, rtol=1e-6, ) assert_allclose( analysis.datasets[1].background_model.spectral_model.norm.value, 0.9776349021876344, rtol=1e-6, ) @requires_data() def test_usage_errors(): config = get_example_config("1d") analysis = Analysis(config) with pytest.raises(RuntimeError): analysis.get_datasets() with pytest.raises(RuntimeError): analysis.read_datasets() with pytest.raises(RuntimeError): analysis.write_datasets() with pytest.raises(TypeError): analysis.read_models() with pytest.raises(RuntimeError): analysis.write_models() with pytest.raises(RuntimeError): analysis.run_fit() with pytest.raises(RuntimeError): analysis.get_flux_points() with pytest.raises(ValidationError): analysis.config.datasets.type = "None" @requires_data() def test_datasets_io(tmpdir): config = get_example_config("3d") analysis = Analysis(config) analysis.get_observations() analysis.get_datasets() models_str = Path(MODEL_FILE).read_text() analysis.models = models_str config.general.datasets_file = tmpdir / "datasets.yaml" config.general.models_file = tmpdir / "models.yaml" analysis.write_datasets() analysis = Analysis(config) analysis.read_datasets() assert len(analysis.datasets.models) == 2 assert analysis.models.names == ["source", "stacked-bkg"] analysis.models[0].parameters["index"].value = 3 analysis.write_models() analysis = Analysis(config) analysis.read_datasets() assert len(analysis.datasets.models) == 2 assert analysis.models.names == ["source", "stacked-bkg"] assert analysis.models[0].parameters["index"].value == 3 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/analysis/tests/test_config.py0000644000175100001770000001164214721316200021713 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from pathlib import Path import pytest from astropy.coordinates import Angle from astropy.time import Time from astropy.units import Quantity from pydantic import ValidationError from gammapy.analysis.config import AnalysisConfig, GeneralConfig from gammapy.utils.testing import assert_allclose CONFIG_PATH = Path(__file__).resolve().parent / ".." 
/ "config" DOC_FILE = CONFIG_PATH / "docs.yaml" def test_config_default_types(): config = AnalysisConfig() assert config.observations.obs_cone.frame is None assert config.observations.obs_cone.lon is None assert config.observations.obs_cone.lat is None assert config.observations.obs_cone.radius is None assert config.observations.obs_time.start is None assert config.observations.obs_time.stop is None assert config.datasets.geom.wcs.skydir.frame is None assert config.datasets.geom.wcs.skydir.lon is None assert config.datasets.geom.wcs.skydir.lat is None assert isinstance(config.datasets.geom.wcs.binsize, Angle) assert isinstance(config.datasets.geom.wcs.binsize_irf, Angle) assert isinstance(config.datasets.geom.axes.energy.min, Quantity) assert isinstance(config.datasets.geom.axes.energy.max, Quantity) assert isinstance(config.datasets.geom.axes.energy_true.min, Quantity) assert isinstance(config.datasets.geom.axes.energy_true.max, Quantity) assert isinstance(config.datasets.geom.selection.offset_max, Angle) assert config.fit.fit_range.min is None assert config.fit.fit_range.max is None assert isinstance(config.excess_map.correlation_radius, Angle) assert config.excess_map.energy_edges.min is None assert config.excess_map.energy_edges.max is None assert config.excess_map.energy_edges.nbins is None def test_config_not_default_types(): config = AnalysisConfig() config.observations.obs_cone = { "frame": "galactic", "lon": "83.633 deg", "lat": "22.014 deg", "radius": "1 deg", } config.fit.fit_range = {"min": "0.1 TeV", "max": "10 TeV"} assert config.observations.obs_cone.frame == "galactic" assert isinstance(config.observations.obs_cone.lon, Angle) assert isinstance(config.observations.obs_cone.lat, Angle) assert isinstance(config.observations.obs_cone.radius, Angle) config.observations.obs_time.start = "2019-12-01" assert isinstance(config.observations.obs_time.start, Time) with pytest.raises(ValueError): config.flux_points.energy.min = "1 deg" assert isinstance(config.fit.fit_range.min, Quantity) assert isinstance(config.fit.fit_range.max, Quantity) def test_config_basics(): config = AnalysisConfig() assert "AnalysisConfig" in str(config) config = AnalysisConfig.read(DOC_FILE) assert config.general.outdir == "." 
def test_config_create_from_dict(): data = {"general": {"log": {"level": "warning"}}} config = AnalysisConfig(**data) assert config.general.log.level == "warning" def test_config_create_from_yaml(): config = AnalysisConfig.read(DOC_FILE) assert isinstance(config.general, GeneralConfig) config_str = Path(DOC_FILE).read_text() config = AnalysisConfig.from_yaml(config_str) assert isinstance(config.general, GeneralConfig) def test_config_to_yaml(tmp_path): config = AnalysisConfig() assert "level: info" in config.to_yaml() config = AnalysisConfig() fpath = Path(tmp_path) / "temp.yaml" config.write(fpath) text = Path(fpath).read_text() assert "stack" in text with pytest.raises(IOError): config.write(fpath) def test_get_doc_sections(): config = AnalysisConfig() doc = config._get_doc_sections() assert "general" in doc.keys() def test_safe_mask_config_validation(): config = AnalysisConfig() # Check empty list is accepted config.datasets.safe_mask.methods = [] with pytest.raises(ValidationError): config.datasets.safe_mask.methods = ["bad"] def test_time_range_iso(): cfg = """ observations: datastore: $GAMMAPY_DATA/hess-dl3-dr1 obs_ids: [23523, 23526] obs_time: { start: [2004-12-04 22:04:48.000, 2004-12-04 22:26:24.000, 2004-12-04 22:53:45.600], stop: [2004-12-04 22:26:24.000, 2004-12-04 22:53:45.600, 2004-12-04 23:31:12.000] } """ config = AnalysisConfig.from_yaml(cfg) assert_allclose( config.observations.obs_time.start.mjd, [53343.92, 53343.935, 53343.954] ) def test_time_range_jyear(): cfg = """ observations: datastore: $GAMMAPY_DATA/hess-dl3-dr1 obs_ids: [23523, 23526] obs_time: { start: [J2004.92654346, J2004.92658453, J2004.92663655], stop: [J2004.92658453, J2004.92663655, J2004.92670773] } """ config = AnalysisConfig.from_yaml(cfg) assert_allclose( config.observations.obs_time.start.mjd, [53343.92, 53343.935, 53343.954] ) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.180642 gammapy-1.3/gammapy/astro/0000755000175100001770000000000014721316215015202 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/__init__.py0000644000175100001770000000016214721316200017304 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Astrophysical source and population models.""" ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.180642 gammapy-1.3/gammapy/astro/darkmatter/0000755000175100001770000000000014721316215017340 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/darkmatter/__init__.py0000644000175100001770000000155114721316200021445 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Dark matter spatial and spectral models.""" from gammapy.modeling.models import SPECTRAL_MODEL_REGISTRY from .profiles import ( BurkertProfile, DMProfile, EinastoProfile, IsothermalProfile, MooreProfile, NFWProfile, ZhaoProfile, ) from .spectra import ( DarkMatterAnnihilationSpectralModel, DarkMatterDecaySpectralModel, PrimaryFlux, ) from .utils import JFactory __all__ = [ "DarkMatterAnnihilationSpectralModel", "DarkMatterDecaySpectralModel", "JFactory", "PrimaryFlux", "BurkertProfile", "DMProfile", "EinastoProfile", "IsothermalProfile", "MooreProfile", "NFWProfile", "ZhaoProfile", ] SPECTRAL_MODEL_REGISTRY.append(DarkMatterAnnihilationSpectralModel) 
SPECTRAL_MODEL_REGISTRY.append(DarkMatterDecaySpectralModel) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/darkmatter/profiles.py0000644000175100001770000002635314721316200021540 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Dark matter profiles.""" import abc import html import numpy as np import astropy.units as u from gammapy.modeling import Parameter, Parameters from gammapy.utils.integrate import trapz_loglog __all__ = [ "BurkertProfile", "DMProfile", "EinastoProfile", "IsothermalProfile", "MooreProfile", "NFWProfile", "ZhaoProfile", ] class DMProfile(abc.ABC): """DMProfile model base class.""" LOCAL_DENSITY = 0.3 * u.GeV / (u.cm**3) """Local dark matter density as given in reference 2""" DISTANCE_GC = 8.33 * u.kpc """Distance to the Galactic Center as given in reference 2""" def __call__(self, radius): """Call evaluate method of derived classes.""" kwargs = {par.name: par.quantity for par in self.parameters} return self.evaluate(radius, **kwargs) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def scale_to_local_density(self): """Scale to local density.""" scale = (self.LOCAL_DENSITY / self(self.DISTANCE_GC)).to_value("") self.parameters["rho_s"].value *= scale def _eval_substitution(self, radius, separation, squared): """Density at given radius together with the substitution part.""" exponent = 2 if squared else 1 return ( self(radius) ** exponent * radius / np.sqrt(radius**2 - (self.DISTANCE_GC * np.sin(separation)) ** 2) ) def integral(self, rmin, rmax, separation, ndecade, squared=True): r"""Integrate dark matter profile numerically. .. math:: F(r_{min}, r_{max}) = \int_{r_{min}}^{r_{max}}\rho(r)^\gamma dr \\ \gamma = 2 \text{for annihilation} \\ \gamma = 1 \text{for decay} Parameters ---------- rmin, rmax : `~astropy.units.Quantity` Lower and upper bound of integration range. separation : `~numpy.ndarray` Separation angle in radians. ndecade : int, optional Number of grid points per decade used for the integration. Default is 10000. squared : bool, optional Square the profile before integration. Default is True. """ integral = self.integrate_spectrum_separation( self._eval_substitution, rmin, rmax, separation, ndecade, squared ) inegral_unit = u.Unit("GeV2 cm-5") if squared else u.Unit("GeV cm-2") return integral.to(inegral_unit) def integrate_spectrum_separation( self, func, xmin, xmax, separation, ndecade, squared=True ): """Squared dark matter profile integral. Parameters ---------- xmin, xmax : `~astropy.units.Quantity` Lower and upper bound of integration range. separation : `~numpy.ndarray` Separation angle in radians. ndecade : int Number of grid points per decade used for the integration. squared : bool Square the profile before integration. Default is True. """ unit = xmin.unit xmin = xmin.value xmax = xmax.to_value(unit) logmin = np.log10(xmin) logmax = np.log10(xmax) n = np.int32((logmax - logmin) * ndecade) x = np.logspace(logmin, logmax, n) * unit y = func(x, separation, squared) val = trapz_loglog(y, x) return val.sum() class ZhaoProfile(DMProfile): r"""Zhao Profile. This is taken from equation 1 from Zhao (1996). It is a generalization of the NFW profile. The volume density is parametrized with a double power-law. Scale radii smaller than the scale radius are described with a slope of :math:`-\gamma` and scale radii larger than the scale radius are described with a slope of :math:`-\beta`. :math:`\alpha` is a measure for the width of the transition region. .. math:: \rho(r) = \rho_s \left(\frac{r_s}{r}\right)^\gamma \left(1 + \left(\frac{r}{r_s}\right)^\frac{1}{\alpha} \right)^{(\gamma - \beta) \alpha} Parameters ---------- r_s : `~astropy.units.Quantity` Scale radius, :math:`r_s`. alpha : `~astropy.units.Quantity` :math:`\alpha`. beta: `~astropy.units.Quantity` :math:`\beta`. gamma : `~astropy.units.Quantity` :math:`\gamma`. rho_s : `~astropy.units.Quantity` Characteristic density, :math:`\rho_s`. References ---------- * `1996MNRAS.278..488Z `_ * `2011JCAP...03..051C `_ """ DEFAULT_SCALE_RADIUS = 24.42 * u.kpc DEFAULT_ALPHA = 1 DEFAULT_BETA = 3 DEFAULT_GAMMA = 1 """ (alpha, beta, gamma) = (1,3,1) is NFW profile. 
Default scale radius as given in reference 2 (same as for NFW profile) """ def __init__( self, r_s=None, alpha=None, beta=None, gamma=None, rho_s=1 * u.Unit("GeV / cm3") ): r_s = self.DEFAULT_SCALE_RADIUS if r_s is None else r_s alpha = self.DEFAULT_ALPHA if alpha is None else alpha beta = self.DEFAULT_BETA if beta is None else beta gamma = self.DEFAULT_GAMMA if gamma is None else gamma self.parameters = Parameters( [ Parameter("r_s", u.Quantity(r_s)), Parameter("rho_s", u.Quantity(rho_s)), Parameter("alpha", alpha), Parameter("beta", beta), Parameter("gamma", gamma), ] ) @staticmethod def evaluate(radius, r_s, alpha, beta, gamma, rho_s): """Evaluate the profile.""" rr = radius / r_s return rho_s / (rr**gamma * (1 + rr ** (1 / alpha)) ** ((beta - gamma) * alpha)) class NFWProfile(DMProfile): r"""NFW Profile. .. math:: \rho(r) = \rho_s \frac{r_s}{r}\left(1 + \frac{r}{r_s}\right)^{-2} Parameters ---------- r_s : `~astropy.units.Quantity` Scale radius, :math:`r_s`. rho_s : `~astropy.units.Quantity` Characteristic density, :math:`\rho_s`. References ---------- * `1997ApJ...490..493 `_ * `2011JCAP...03..051C `_ """ DEFAULT_SCALE_RADIUS = 24.42 * u.kpc """Default scale radius as given in reference 2""" def __init__(self, r_s=None, rho_s=1 * u.Unit("GeV / cm3")): r_s = self.DEFAULT_SCALE_RADIUS if r_s is None else r_s self.parameters = Parameters( [Parameter("r_s", u.Quantity(r_s)), Parameter("rho_s", u.Quantity(rho_s))] ) @staticmethod def evaluate(radius, r_s, rho_s): """Evaluate the profile.""" rr = radius / r_s return rho_s / (rr * (1 + rr) ** 2) class EinastoProfile(DMProfile): r"""Einasto Profile. .. math:: \rho(r) = \rho_s \exp{ \left(-\frac{2}{\alpha}\left[ \left(\frac{r}{r_s}\right)^{\alpha} - 1\right] \right)} Parameters ---------- r_s : `~astropy.units.Quantity` Scale radius, :math:`r_s`. alpha : `~astropy.units.Quantity` :math:`\alpha`. rho_s : `~astropy.units.Quantity` Characteristic density, :math:`\rho_s`. References ---------- * `1965TrAlm...5...87E `_ * `2011JCAP...03..051C `_ """ DEFAULT_SCALE_RADIUS = 28.44 * u.kpc """Default scale radius as given in reference 2""" DEFAULT_ALPHA = 0.17 """Default scale radius as given in reference 2""" def __init__(self, r_s=None, alpha=None, rho_s=1 * u.Unit("GeV / cm3")): alpha = self.DEFAULT_ALPHA if alpha is None else alpha r_s = self.DEFAULT_SCALE_RADIUS if r_s is None else r_s self.parameters = Parameters( [ Parameter("r_s", u.Quantity(r_s)), Parameter("alpha", u.Quantity(alpha)), Parameter("rho_s", u.Quantity(rho_s)), ] ) @staticmethod def evaluate(radius, r_s, alpha, rho_s): """Evaluate the profile.""" rr = radius / r_s exponent = (2 / alpha) * (rr**alpha - 1) return rho_s * np.exp(-1 * exponent) class IsothermalProfile(DMProfile): r"""Isothermal Profile. .. math:: \rho(r) = \frac{\rho_s}{1 + (r/r_s)^2} Parameters ---------- r_s : `~astropy.units.Quantity` Scale radius, :math:`r_s`. References ---------- * `1991MNRAS.249..523B `_ * `2011JCAP...03..051C `_ """ DEFAULT_SCALE_RADIUS = 4.38 * u.kpc """Default scale radius as given in reference 2""" def __init__(self, r_s=None, rho_s=1 * u.Unit("GeV / cm3")): r_s = self.DEFAULT_SCALE_RADIUS if r_s is None else r_s self.parameters = Parameters( [Parameter("r_s", u.Quantity(r_s)), Parameter("rho_s", u.Quantity(rho_s))] ) @staticmethod def evaluate(radius, r_s, rho_s): """Evaluate the profile.""" rr = radius / r_s return rho_s / (1 + rr**2) class BurkertProfile(DMProfile): r"""Burkert Profile. .. 
math:: \rho(r) = \frac{\rho_s}{(1 + r/r_s)(1 + (r/r_s)^2)} Parameters ---------- r_s : `~astropy.units.Quantity` Scale radius, :math:`r_s`. rho_s : `~astropy.units.Quantity` Characteristic density, :math:`\rho_s`. References ---------- * `1995ApJ...447L..25B `_ * `2011JCAP...03..051C `_ """ DEFAULT_SCALE_RADIUS = 12.67 * u.kpc """Default scale radius as given in reference 2""" def __init__(self, r_s=None, rho_s=1 * u.Unit("GeV / cm3")): r_s = self.DEFAULT_SCALE_RADIUS if r_s is None else r_s self.parameters = Parameters( [Parameter("r_s", u.Quantity(r_s)), Parameter("rho_s", u.Quantity(rho_s))] ) @staticmethod def evaluate(radius, r_s, rho_s): """Evaluate the profile.""" rr = radius / r_s return rho_s / ((1 + rr) * (1 + rr**2)) class MooreProfile(DMProfile): r"""Moore Profile. .. math:: \rho(r) = \rho_s \left(\frac{r_s}{r}\right)^{1.16} \left(1 + \frac{r}{r_s} \right)^{-1.84} Parameters ---------- r_s : `~astropy.units.Quantity` Scale radius, :math:`r_s`. rho_s : `~astropy.units.Quantity` Characteristic density, :math:`\rho_s`. References ---------- * `2004MNRAS.353..624D `_ * `2011JCAP...03..051C `_ """ DEFAULT_SCALE_RADIUS = 30.28 * u.kpc """Default scale radius as given in reference 2""" def __init__(self, r_s=None, rho_s=1 * u.Unit("GeV / cm3")): r_s = self.DEFAULT_SCALE_RADIUS if r_s is None else r_s self.parameters = Parameters( [Parameter("r_s", u.Quantity(r_s)), Parameter("rho_s", u.Quantity(rho_s))] ) @staticmethod def evaluate(radius, r_s, rho_s): """Evaluate the profile.""" rr = radius / r_s rr_ = r_s / radius return rho_s * rr_**1.16 * (1 + rr) ** (-1.84) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/darkmatter/spectra.py0000644000175100001770000002704214721316200021352 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Dark matter spectra.""" import numpy as np import astropy.units as u from astropy.table import Table from gammapy.maps import Map, MapAxis, RegionGeom from gammapy.modeling import Parameter from gammapy.modeling.models import SpectralModel, TemplateNDSpectralModel from gammapy.utils.scripts import make_path __all__ = ["PrimaryFlux", "DarkMatterAnnihilationSpectralModel", "DarkMatterDecaySpectralModel"] class PrimaryFlux(TemplateNDSpectralModel): """DM-annihilation gamma-ray spectra. Based on the precomputed models by Cirelli et al. (2016). All available annihilation channels can be found there. The spectra are provided as a `~gammapy.modeling.models.TemplateNDSpectralModel` for a chosen dark matter mass and annihilation channel, which allows interpolation between the precomputed dark matter masses. Parameters ---------- mDM : `~astropy.units.Quantity` Dark matter particle mass as rest mass energy. channel : str Annihilation channel. List available channels with `~gammapy.astro.darkmatter.PrimaryFlux.allowed_channels`.
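Examples -------- This is how one might evaluate the spectrum for an illustrative mass and channel (a minimal sketch; it assumes the precomputed spectra have been downloaded to ``$GAMMAPY_DATA``):: >>> import astropy.units as u >>> from gammapy.astro.darkmatter import PrimaryFlux >>> primflux = PrimaryFlux(mDM=1 * u.TeV, channel="W") >>> dnde = primflux(500 * u.GeV)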
References ---------- * `2011JCAP...03..051 `_ * Cirelli et al (2016): http://www.marcocirelli.net/PPPC4DMID.html """ channel_registry = { "eL": "eL", "eR": "eR", "e": "e", "muL": r"\[Mu]L", "muR": r"\[Mu]R", "mu": r"\[Mu]", "tauL": r"\[Tau]L", "tauR": r"\[Tau]R", "tau": r"\[Tau]", "q": "q", "c": "c", "b": "b", "t": "t", "WL": "WL", "WT": "WT", "W": "W", "ZL": "ZL", "ZT": "ZT", "Z": "Z", "g": "g", "gamma": r"\[Gamma]", "h": "h", "nu_e": r"\[Nu]e", "nu_mu": r"\[Nu]\[Mu]", "nu_tau": r"\[Nu]\[Tau]", "V->e": "V->e", "V->mu": r"V->\[Mu]", "V->tau": r"V->\[Tau]", } table_filename = "$GAMMAPY_DATA/dark_matter_spectra/AtProduction_gammas.dat" tag = ["PrimaryFlux", "dm-pf"] def __init__(self, mDM, channel): self.table_path = make_path(self.table_filename) if not self.table_path.exists(): raise FileNotFoundError( f"\n\nFile not found: {self.table_filename}\n" "You may download the dataset needed with the following command:\n" "gammapy download datasets --src dark_matter_spectra" ) else: self.table = Table.read( str(self.table_path), format="ascii.fast_basic", guess=False, delimiter=" ", ) self.channel = channel # create RegionNDMap for channel masses = np.unique(self.table["mDM"]) log10x = np.unique(self.table["Log[10,x]"]) mass_axis = MapAxis.from_nodes(masses, name="mass", interp="log", unit="GeV") log10x_axis = MapAxis.from_nodes(log10x, name="energy_true") channel_name = self.channel_registry[self.channel] geom = RegionGeom(region=None, axes=[log10x_axis, mass_axis]) region_map = Map.from_geom( geom=geom, data=self.table[channel_name].reshape(geom.data_shape) ) interp_kwargs = {"extrapolate": True, "fill_value": 0, "values_scale": "lin"} super().__init__(region_map, interp_kwargs=interp_kwargs) self.mDM = mDM self.mass.frozen = True @property def mDM(self): """Dark matter mass.""" return u.Quantity(self.mass.value, "GeV") @mDM.setter def mDM(self, mDM): unit = self.mass.unit _mDM = u.Quantity(mDM).to(unit) _mDM_val = _mDM.to_value(unit) min_mass = u.Quantity(self.mass.min, unit) max_mass = u.Quantity(self.mass.max, unit) if _mDM_val < self.mass.min or _mDM_val > self.mass.max: raise ValueError( f"The mass {_mDM} is out of the bounds of the model. Please choose a mass between {min_mass} < `mDM` < {max_mass}" ) self.mass.value = _mDM_val @property def allowed_channels(self): """List of allowed annihilation channels.""" return list(self.channel_registry.keys()) @property def channel(self): """Annihilation channel as a string.""" return self._channel @channel.setter def channel(self, channel): if channel not in self.allowed_channels: raise ValueError( f"Invalid channel: {channel}\nAvailable: {self.allowed_channels}\n" ) else: self._channel = channel def evaluate(self, energy, **kwargs): """Evaluate the primary flux.""" mass = {"mass": self.mDM} kwargs.update(mass) log10x = np.log10(energy / self.mDM) dN_dlogx = super().evaluate(log10x, **kwargs) dN_dE = dN_dlogx / (energy * np.log(10)) return dN_dE class DarkMatterAnnihilationSpectralModel(SpectralModel): r"""Dark matter annihilation spectral model. The gamma-ray flux is computed as follows: .. math:: \frac{\mathrm d \phi}{\mathrm d E} = \frac{\langle \sigma\nu \rangle}{4\pi k m^2_{\mathrm{DM}}} \frac{\mathrm d N}{\mathrm dE} \times J(\Delta\Omega) Parameters ---------- mass : `~astropy.units.Quantity` Dark matter mass. channel : str Annihilation channel for `~gammapy.astro.darkmatter.PrimaryFlux`, e.g. "b" for "bbar". See `PrimaryFlux.channel_registry` for more. scale : float Scale parameter for model fitting. 
jfactor : `~astropy.units.Quantity` Integrated J-Factor needed when `~gammapy.modeling.models.PointSpatialModel` is used. z: float Redshift value. k: int Type of dark matter particle (k:2 Majorana, k:4 Dirac). Examples -------- This is how to instantiate a `DarkMatterAnnihilationSpectralModel` model:: >>> import astropy.units as u >>> from gammapy.astro.darkmatter import DarkMatterAnnihilationSpectralModel >>> channel = "b" >>> massDM = 5000*u.Unit("GeV") >>> jfactor = 3.41e19 * u.Unit("GeV2 cm-5") >>> modelDM = DarkMatterAnnihilationSpectralModel(mass=massDM, channel=channel, jfactor=jfactor) # noqa: E501 References ---------- * `2011JCAP...03..051 `_ """ THERMAL_RELIC_CROSS_SECTION = 3e-26 * u.Unit("cm3 s-1") """Thermally averaged annihilation cross-section""" scale = Parameter( "scale", 1, unit="", interp="log", ) tag = ["DarkMatterAnnihilationSpectralModel", "dm-annihilation"] def __init__(self, mass, channel, scale=scale.quantity, jfactor=1, z=0, k=2): self.k = k self.z = z self.mass = u.Quantity(mass) self.channel = channel self.jfactor = u.Quantity(jfactor) self.primary_flux = PrimaryFlux(mass, channel=self.channel) super().__init__(scale=scale) def evaluate(self, energy, scale): """Evaluate dark matter annihilation model.""" flux = ( scale * self.jfactor * self.THERMAL_RELIC_CROSS_SECTION * self.primary_flux(energy=energy * (1 + self.z)) / self.k / self.mass / self.mass / (4 * np.pi) ) return flux def to_dict(self, full_output=False): """Convert to dictionary.""" data = super().to_dict(full_output=full_output) data["spectral"]["channel"] = self.channel data["spectral"]["mass"] = self.mass.to_string() data["spectral"]["jfactor"] = self.jfactor.to_string() data["spectral"]["z"] = self.z data["spectral"]["k"] = self.k return data @classmethod def from_dict(cls, data): """Create spectral model from a dictionary. Parameters ---------- data : dict Dictionary with model data. Returns ------- model : `DarkMatterAnnihilationSpectralModel` Dark matter annihilation spectral model. """ data = data["spectral"] data.pop("type") parameters = data.pop("parameters") scale = [p["value"] for p in parameters if p["name"] == "scale"][0] return cls(scale=scale, **data) class DarkMatterDecaySpectralModel(SpectralModel): r"""Dark matter decay spectral model. The gamma-ray flux is computed as follows: .. math:: \frac{\mathrm d \phi}{\mathrm d E} = \frac{\Gamma}{4\pi m_{\mathrm{DM}}} \frac{\mathrm d N}{\mathrm dE} \times J(\Delta\Omega) Parameters ---------- mass : `~astropy.units.Quantity` Dark matter mass. channel : str Annihilation channel for `~gammapy.astro.darkmatter.PrimaryFlux`, e.g. "b" for "bbar". See `PrimaryFlux.channel_registry` for more. scale : float Scale parameter for model fitting jfactor : `~astropy.units.Quantity` Integrated J-Factor needed when `~gammapy.modeling.models.PointSpatialModel` is used. z: float Redshift value. 
Examples -------- This is how to instantiate a `DarkMatterDecaySpectralModel` model:: >>> import astropy.units as u >>> from gammapy.astro.darkmatter import DarkMatterDecaySpectralModel >>> channel = "b" >>> massDM = 5000*u.Unit("GeV") >>> jfactor = 3.41e19 * u.Unit("GeV cm-2") >>> modelDM = DarkMatterDecaySpectralModel(mass=massDM, channel=channel, jfactor=jfactor) # noqa: E501 References ---------- * `2011JCAP...03..051 `_ """ LIFETIME_AGE_OF_UNIVERSE = 4.3e17 * u.Unit("s") """Use age of universe as lifetime""" scale = Parameter( "scale", 1, unit="", interp="log", ) tag = ["DarkMatterDecaySpectralModel", "dm-decay"] def __init__(self, mass, channel, scale=scale.quantity, jfactor=1, z=0): self.z = z self.mass = u.Quantity(mass) self.channel = channel self.jfactor = u.Quantity(jfactor) self.primary_flux = PrimaryFlux(mass, channel=self.channel) super().__init__(scale=scale) def evaluate(self, energy, scale): """Evaluate dark matter decay model.""" flux = ( scale * self.jfactor * self.primary_flux(energy=energy * (1 + self.z)) / self.LIFETIME_AGE_OF_UNIVERSE / self.mass / (4 * np.pi) ) return flux def to_dict(self, full_output=False): """Convert to dictionary.""" data = super().to_dict(full_output=full_output) data["spectral"]["channel"] = self.channel data["spectral"]["mass"] = self.mass.to_string() data["spectral"]["jfactor"] = self.jfactor.to_string() data["spectral"]["z"] = self.z return data @classmethod def from_dict(cls, data): """Create spectral model from a dictionary. Parameters ---------- data : dict Dictionary with model data. Returns ------- model : `DarkMatterDecaySpectralModel` Dark matter decay spectral model. """ data = data["spectral"] data.pop("type") parameters = data.pop("parameters") scale = [p["value"] for p in parameters if p["name"] == "scale"][0] return cls(scale=scale, **data) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.180642 gammapy-1.3/gammapy/astro/darkmatter/tests/0000755000175100001770000000000014721316215020502 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/darkmatter/tests/__init__.py0000644000175100001770000000010014721316200022604 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/darkmatter/tests/test_profiles.py0000644000175100001770000000113514721316200023730 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from gammapy.astro.darkmatter import profiles from gammapy.utils.testing import assert_quantity_allclose dm_profiles = [ profiles.ZhaoProfile, profiles.NFWProfile, profiles.EinastoProfile, profiles.IsothermalProfile, profiles.BurkertProfile, profiles.MooreProfile, ] @pytest.mark.parametrize("profile", dm_profiles) def test_profiles(profile): p = profile() p.scale_to_local_density() actual = p(p.DISTANCE_GC) desired = p.LOCAL_DENSITY assert_quantity_allclose(actual, desired) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/darkmatter/tests/test_spectra.py0000644000175100001770000000665314721316200023560 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose import astropy.units as u from gammapy.astro.darkmatter import ( DarkMatterAnnihilationSpectralModel,
DarkMatterDecaySpectralModel, PrimaryFlux, ) from gammapy.modeling.models import Models, SkyModel from gammapy.utils.testing import assert_quantity_allclose, requires_data @requires_data() def test_primary_flux(): with pytest.raises(ValueError): PrimaryFlux(channel="Spam", mDM=1 * u.TeV) primflux = PrimaryFlux(channel="W", mDM=1 * u.TeV) actual = primflux(500 * u.GeV) desired = 9.3319318e-05 / u.GeV assert_quantity_allclose(actual, desired) @pytest.mark.parametrize( "mass, expected_flux", [(1.6, 0.00025037), (11, 0.00502079), (75, 0.02028309)] ) @requires_data() def test_primary_flux_interpolation(mass, expected_flux): primflux = PrimaryFlux(channel="W", mDM=mass * u.TeV) actual = primflux(500 * u.GeV) assert_quantity_allclose(actual, expected_flux / u.GeV, rtol=1e-5) @requires_data() def test_dm_annihilation_spectral_model(tmpdir): channel = "b" massDM = 5 * u.TeV jfactor = 3.41e19 * u.Unit("GeV2 cm-5") energy_min = 0.01 * u.TeV energy_max = 10 * u.TeV model = DarkMatterAnnihilationSpectralModel( mass=massDM, channel=channel, jfactor=jfactor ) integral_flux = model.integral(energy_min=energy_min, energy_max=energy_max).to( "cm-2 s-1" ) differential_flux = model.evaluate(energy=1 * u.TeV, scale=1).to("cm-2 s-1 TeV-1") sky_model = SkyModel( spectral_model=model, name="skymodel", ) models = Models([sky_model]) filename = tmpdir / "model.yaml" models.write(filename, overwrite=True) new_models = Models.read(filename) assert_quantity_allclose(integral_flux.value, 6.19575457e-14, rtol=1e-3) assert_quantity_allclose(differential_flux.value, 2.97831615e-16, rtol=1e-3) assert new_models[0].spectral_model.channel == model.channel assert new_models[0].spectral_model.z == model.z assert_allclose(new_models[0].spectral_model.jfactor.value, model.jfactor.value) assert new_models[0].spectral_model.mass.value == 5 assert new_models[0].spectral_model.mass.unit == u.TeV @requires_data() def test_dm_decay_spectral_model(tmpdir): channel = "b" massDM = 5 * u.TeV jfactor = 3.41e19 * u.Unit("GeV cm-2") energy_min = 0.01 * u.TeV energy_max = 10 * u.TeV model = DarkMatterDecaySpectralModel(mass=massDM, channel=channel, jfactor=jfactor) integral_flux = model.integral(energy_min=energy_min, energy_max=energy_max).to( "cm-2 s-1" ) differential_flux = model.evaluate(energy=1 * u.TeV, scale=1).to("cm-2 s-1 TeV-1") sky_model = SkyModel( spectral_model=model, name="skymodel", ) models = Models([sky_model]) filename = tmpdir / "model.yaml" models.write(filename, overwrite=True) new_models = Models.read(filename) assert_quantity_allclose(integral_flux.value, 4.80283595e-2, rtol=1e-3) assert_quantity_allclose(differential_flux.value, 2.3088e-4, rtol=1e-3) assert new_models[0].spectral_model.channel == model.channel assert new_models[0].spectral_model.z == model.z assert_allclose(new_models[0].spectral_model.jfactor.value, model.jfactor.value) assert new_models[0].spectral_model.mass.value == 5 assert new_models[0].spectral_model.mass.unit == u.TeV ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/darkmatter/tests/test_utils.py0000644000175100001770000000364514721316200023255 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import astropy.units as u from gammapy.astro.darkmatter import ( DarkMatterAnnihilationSpectralModel, DarkMatterDecaySpectralModel, JFactory, profiles, ) from gammapy.maps import WcsGeom from gammapy.utils.testing import assert_quantity_allclose, requires_data 
@pytest.fixture(scope="session") def geom(): return WcsGeom.create(binsz=0.5, npix=10) @pytest.fixture(scope="session") def jfact_annihilation(geom): jfactory = JFactory( geom=geom, profile=profiles.NFWProfile(), distance=profiles.DMProfile.DISTANCE_GC, ) return jfactory.compute_jfactor() @pytest.fixture(scope="session") def jfact_decay(geom): jfactory = JFactory( geom=geom, profile=profiles.NFWProfile(), distance=profiles.DMProfile.DISTANCE_GC, annihilation=False, ) return jfactory.compute_jfactor() @requires_data() def test_dmfluxmap_annihilation(jfact_annihilation): energy_min = 0.1 * u.TeV energy_max = 10 * u.TeV massDM = 1 * u.TeV channel = "W" diff_flux = DarkMatterAnnihilationSpectralModel(mass=massDM, channel=channel) int_flux = ( jfact_annihilation * diff_flux.integral(energy_min=energy_min, energy_max=energy_max) ).to("cm-2 s-1") actual = int_flux[5, 5] desired = 5.96827647e-12 / u.cm**2 / u.s assert_quantity_allclose(actual, desired, rtol=1e-3) @requires_data() def test_dmfluxmap_decay(jfact_decay): energy_min = 0.1 * u.TeV energy_max = 10 * u.TeV massDM = 1 * u.TeV channel = "W" diff_flux = DarkMatterDecaySpectralModel(mass=massDM, channel=channel) int_flux = ( jfact_decay * diff_flux.integral(energy_min=energy_min, energy_max=energy_max) ).to("cm-2 s-1") actual = int_flux[5, 5] desired = 7.01927e-3 / u.cm**2 / u.s assert_quantity_allclose(actual, desired, rtol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/darkmatter/utils.py0000644000175100001770000000566014721316200021053 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utilities to compute J-factor maps.""" import html import numpy as np import astropy.units as u __all__ = ["JFactory"] class JFactory: """Compute J-Factor or D-Factor maps. J-Factors are computed for annihilation and D-Factors for decay. Set the argument `annihilation` to `False` to compute D-Factors. The assumed dark matter profiles will be centered on the center of the map. Parameters ---------- geom : `~gammapy.maps.WcsGeom` Reference geometry. profile : `~gammapy.astro.darkmatter.profiles.DMProfile` Dark matter profile. distance : `~astropy.units.Quantity` Distance to convert angular scale of the map. annihilation: bool, optional Decay or annihilation. Default is True. """ def __init__(self, geom, profile, distance, annihilation=True): self.geom = geom self.profile = profile self.distance = distance self.annihilation = annihilation def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def compute_differential_jfactor(self, ndecade=1e4): r"""Compute differential J-Factor. .. math:: \frac{\mathrm d J_\text{ann}}{\mathrm d \Omega} = \int_{\mathrm{LoS}} \mathrm d l \rho(l)^2 .. math:: \frac{\mathrm d J_\text{decay}}{\mathrm d \Omega} = \int_{\mathrm{LoS}} \mathrm d l \rho(l) """ separation = self.geom.separation(self.geom.center_skydir).rad rmin = u.Quantity( value=np.tan(separation) * self.distance, unit=self.distance.unit ) rmax = self.distance val = [ ( 2 * self.profile.integral( _.value * u.kpc, rmax, np.arctan(_.value / self.distance.value), ndecade, self.annihilation, ) + self.profile.integral( self.distance, 4 * rmax, np.arctan(_.value / self.distance.value), ndecade, self.annihilation, ) ) for _ in rmin.ravel() ] integral_unit = u.Unit("GeV2 cm-5") if self.annihilation else u.Unit("GeV cm-2") jfact = u.Quantity(val).to(integral_unit).reshape(rmin.shape) return jfact / u.steradian def compute_jfactor(self, ndecade=1e4): r"""Compute astrophysical J-Factor. .. math:: J(\Delta\Omega) = \int_{\Delta\Omega} \mathrm d \Omega^{\prime} \frac{\mathrm d J}{\mathrm d \Omega^{\prime}} """ diff_jfact = self.compute_differential_jfactor(ndecade) return diff_jfact * self.geom.to_image().solid_angle() ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.180642 gammapy-1.3/gammapy/astro/population/0000755000175100001770000000000014721316215017374 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/population/__init__.py0000644000175100001770000000255214721316200021503 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Astrophysical population models.""" from .simulate import ( add_observed_parameters, add_pulsar_parameters, add_pwn_parameters, add_snr_parameters, make_base_catalog_galactic, make_catalog_random_positions_cube, make_catalog_random_positions_sphere, ) from .spatial import ( CaseBattacharya1998, Exponential, FaucherKaspi2006, FaucherSpiral, LogSpiral, Lorimer2006, Paczynski1990, ValleeSpiral, YusifovKucuk2004, YusifovKucuk2004B, radial_distributions, ) from .velocity import ( FaucherKaspi2006VelocityBimodal, FaucherKaspi2006VelocityMaxwellian, Paczynski1990Velocity, velocity_distributions, ) __all__ = [ "add_observed_parameters", "add_pulsar_parameters", "add_pwn_parameters", "add_snr_parameters", "CaseBattacharya1998", "Exponential", "FaucherKaspi2006", "FaucherKaspi2006VelocityBimodal", "FaucherKaspi2006VelocityMaxwellian", "FaucherSpiral", "LogSpiral", "Lorimer2006", "make_base_catalog_galactic", "make_catalog_random_positions_cube", "make_catalog_random_positions_sphere", "Paczynski1990", "Paczynski1990Velocity", "radial_distributions", "ValleeSpiral", "velocity_distributions", "YusifovKucuk2004", "YusifovKucuk2004B", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/population/simulate.py0000644000175100001770000004007014721316200021564 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Simulate source catalogs.""" import numpy as np from astropy.coordinates import SkyCoord from astropy.table import Column, Table from astropy.units import Quantity from gammapy.astro.source import PWN, SNR, Pulsar, SNRTrueloveMcKee from gammapy.utils import coordinates as astrometry from gammapy.utils.random import ( draw, get_random_state, pdf, sample_sphere, sample_sphere_distance, ) from .spatial import ( RMAX, 
RMIN, ZMAX, ZMIN, Exponential, FaucherSpiral, radial_distributions, ) from .velocity import VMAX, VMIN, velocity_distributions __all__ = [ "add_observed_parameters", "add_pulsar_parameters", "add_pwn_parameters", "add_snr_parameters", "make_base_catalog_galactic", "make_catalog_random_positions_cube", "make_catalog_random_positions_sphere", ] def make_catalog_random_positions_cube( size=100, dimension=3, distance_max="1 pc", random_state="random-seed" ): """Make a catalog of sources randomly distributed on a line, square or cube. This can be used to study basic source population distribution effects, e.g. what the distance distribution looks like, or for a given luminosity function what the resulting flux distributions are for different spatial configurations. Parameters ---------- size : int, optional Number of sources. Default is 100. dimension : {1, 2, 3} Number of dimensions. Default is 3. distance_max : `~astropy.units.Quantity`, optional Maximum distance. Default is "1 pc". random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Default is 'random-seed'. Passed to `~gammapy.utils.random.get_random_state`. Returns ------- table : `~astropy.table.Table` Table with 3D position cartesian coordinates. Columns: x (pc), y (pc), z (pc). """ distance_max = Quantity(distance_max).to_value("pc") random_state = get_random_state(random_state) # Generate positions 1D, 2D, or 3D if dimension == 1: x = random_state.uniform(-distance_max, distance_max, size) y, z = 0, 0 elif dimension == 2: x = random_state.uniform(-distance_max, distance_max, size) y = random_state.uniform(-distance_max, distance_max, size) z = 0 elif dimension == 3: x = random_state.uniform(-distance_max, distance_max, size) y = random_state.uniform(-distance_max, distance_max, size) z = random_state.uniform(-distance_max, distance_max, size) else: raise ValueError(f"Invalid dimension: {dimension}") table = Table() table["x"] = Column(x, unit="pc", description="Cartesian coordinate") table["y"] = Column(y, unit="pc", description="Cartesian coordinate") table["z"] = Column(z, unit="pc", description="Cartesian coordinate") return table def make_catalog_random_positions_sphere( size=100, distance_min="0 pc", distance_max="1 pc", random_state="random-seed" ): """Sample random source locations in a sphere. This can be used to generate an isotropic source population in a sphere, e.g. to represent extra-galactic sources. Parameters ---------- size : int, optional Number of sources. Default is 100. distance_min, distance_max : `~astropy.units.Quantity`, optional Minimum and maximum distance. Default is "0 pc" and "1 pc", respectively. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Default is 'random-seed'. Passed to `~gammapy.utils.random.get_random_state`. Returns ------- catalog : `~astropy.table.Table` Table with 3D position spherical coordinates. Columns: lon (rad), lat (rad), distance (pc).
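Examples -------- A short sketch drawing an isotropic toy population (positions depend on the random seed):: >>> from gammapy.astro.population import make_catalog_random_positions_sphere >>> table = make_catalog_random_positions_sphere(size=100, random_state=0) >>> len(table) 100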
""" distance_min = Quantity(distance_min).to_value("pc") distance_max = Quantity(distance_max).to_value("pc") random_state = get_random_state(random_state) lon, lat = sample_sphere(size, random_state=random_state) distance = sample_sphere_distance(distance_min, distance_max, size, random_state) table = Table() table["lon"] = Column(lon, unit="rad", description="Spherical coordinate") table["lat"] = Column(lat, unit="rad", description="Spherical coordinate") table["distance"] = Column(distance, unit="pc", description="Spherical coordinate") return table def make_base_catalog_galactic( n_sources, rad_dis="YK04", vel_dis="H05", max_age="1e6 yr", spiralarms=True, n_ISM="1 cm-3", random_state="random-seed", ): """Make a catalog of Galactic sources, with basic source parameters. Choose a radial distribution, a velocity distribution, the number of sources ``n_sources`` and the maximum age ``max_age`` in years. If ``spiralarms`` is True a spiral arm modeling after Faucher&Kaspi is included. ``max_age`` and ``n_sources`` effectively correspond to a SN rate: SN_rate = ``n_sources`` / ``max_age`` Parameters ---------- n_sources : int Number of sources to simulate. rad_dis : callable, optional Radial surface density distribution of sources. Default is "YK04". vel_dis : callable, optional Proper motion velocity distribution of sources. Default is "H05". max_age : `~astropy.units.Quantity`, optional Maximal age of the source. Default is "1e6 yr". spiralarms : bool, optional Include a spiral arm model in the catalog. Default is True. n_ISM : `~astropy.units.Quantity`, optional Density of the interstellar medium. Default is "1 cm-3". random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Default is "random-state". Passed to `~gammapy.utils.random.get_random_state`. Returns ------- table : `~astropy.table.Table` Catalog of simulated source positions and proper velocities. 
""" max_age = Quantity(max_age).to_value("yr") n_ISM = Quantity(n_ISM).to("cm-3") random_state = get_random_state(random_state) if isinstance(rad_dis, str): rad_dis = radial_distributions[rad_dis] if isinstance(vel_dis, str): vel_dis = velocity_distributions[vel_dis] # Draw random values for the age age = random_state.uniform(0, max_age, n_sources) age = Quantity(age, "yr") # Draw spatial distribution r = draw( RMIN.to_value("kpc"), RMAX.to_value("kpc"), n_sources, pdf(rad_dis()), random_state=random_state, ) r = Quantity(r, "kpc") if spiralarms: r, theta, spiralarm = FaucherSpiral()(r, random_state=random_state) else: theta = Quantity(random_state.uniform(0, 2 * np.pi, n_sources), "rad") spiralarm = None x, y = astrometry.cartesian(r, theta) z = draw( ZMIN.to_value("kpc"), ZMAX.to_value("kpc"), n_sources, Exponential(), random_state=random_state, ) z = Quantity(z, "kpc") # Draw values from velocity distribution v = draw( VMIN.to_value("km/s"), VMAX.to_value("km/s"), n_sources, vel_dis(), random_state=random_state, ) v = Quantity(v, "km/s") # Draw random direction of initial velocity theta = Quantity(random_state.uniform(0, np.pi, x.size), "rad") phi = Quantity(random_state.uniform(0, 2 * np.pi, x.size), "rad") # Compute new position dx, dy, dz, vx, vy, vz = astrometry.motion_since_birth(v, age, theta, phi) # Add displacement to birth position x_moved = x + dx y_moved = y + dy z_moved = z + dz table = Table() table["age"] = Column(age, unit="yr", description="Age of the source") table["n_ISM"] = Column(n_ISM, description="Interstellar medium density") if spiralarms: table["spiralarm"] = Column(spiralarm, description="Which spiralarm?") table["x_birth"] = Column(x, description="Galactocentric x coordinate at birth") table["y_birth"] = Column(y, description="Galactocentric y coordinate at birth") table["z_birth"] = Column(z, description="Galactocentric z coordinate at birth") table["x"] = Column(x_moved.to("kpc"), description="Galactocentric x coordinate") table["y"] = Column(y_moved.to("kpc"), description="Galactocentric y coordinate") table["z"] = Column(z_moved.to("kpc"), description="Galactocentric z coordinate") table["vx"] = Column(vx, description="Galactocentric velocity in x direction") table["vy"] = Column(vy, description="Galactocentric velocity in y direction") table["vz"] = Column(vz, description="Galactocentric velocity in z direction") table["v_abs"] = Column(v, description="Galactocentric velocity (absolute)") return table def add_snr_parameters(table): """Add SNR parameters to the table. Parameters ---------- table : `~astropy.table.Table` Table that requires at least columns "age" and "n_ISM" to be defined. 
Returns ------- table : `~astropy.table.Table` Table with the following entries: * "E_SN" : SNR kinetic energy * "r_out" : SNR outer radius * "r_in" : SNR inner radius * "L_SNR" : SNR photon rate above 1 TeV """ # Read relevant columns age = table["age"].quantity n_ISM = table["n_ISM"].quantity # Compute properties snr = SNR(n_ISM=n_ISM) E_SN = snr.e_sn * np.ones(len(table)) r_out = snr.radius(age) r_in = snr.radius_inner(age) L_SNR = snr.luminosity_tev(age) # Add columns to table table["E_SN"] = Column(E_SN, description="SNR kinetic energy") table["r_out"] = Column(r_out.to("pc"), description="SNR outer radius") table["r_in"] = Column(r_in.to("pc"), description="SNR inner radius") table["L_SNR"] = Column(L_SNR, description="SNR photon rate above 1 TeV") return table def add_pulsar_parameters( table, B_mean=12.05, B_stdv=0.55, P_mean=0.3, P_stdv=0.15, random_state="random-seed", ): """Add pulsar parameters to the table. The initial period and log10 of the magnetic field are drawn from normal distributions with the default parameters given below. Parameters ---------- table : `~astropy.table.Table` Table that requires at least the column "age" to be defined. B_mean : float, optional The mean magnetic field in log10(Gauss). Default is 12.05. B_stdv : float, optional The standard deviation of the magnetic field in log10(Gauss). Default is 0.55. P_mean : float, optional The mean period in seconds. Default is 0.3. P_stdv : float, optional The standard deviation of the period in seconds. Default is 0.15. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Returns ------- table : `~astropy.table.Table` Table with the pulsar parameter columns added. """ random_state = get_random_state(random_state) # Read relevant columns age = table["age"].quantity # Draw the initial values for the period and magnetic field def p_dist(x): return np.exp(-0.5 * ((x - P_mean) / P_stdv) ** 2) p0_birth = draw(0, 2, len(table), p_dist, random_state=random_state) p0_birth = Quantity(p0_birth, "s") log10_b_psr = random_state.normal(B_mean, B_stdv, len(table)) b_psr = Quantity(10**log10_b_psr, "G") # Compute pulsar parameters psr = Pulsar(p0_birth, b_psr) p0 = psr.period(age) p1 = psr.period_dot(age) p1_birth = psr.P_dot_0 tau = psr.tau(age) tau_0 = psr.tau_0 l_psr = psr.luminosity_spindown(age) l0_psr = psr.L_0 # Add columns to table table["P0"] = Column(p0, unit="s", description="Pulsar period") table["P1"] = Column(p1, unit="", description="Pulsar period derivative") table["P0_birth"] = Column(p0_birth, unit="s", description="Pulsar birth period") table["P1_birth"] = Column( p1_birth, unit="", description="Pulsar birth period derivative" ) table["CharAge"] = Column(tau, unit="yr", description="Pulsar characteristic age") table["Tau0"] = Column(tau_0, unit="yr", description="Pulsar spin-down timescale at birth") table["L_PSR"] = Column(l_psr, unit="erg s-1", description="Pulsar spin-down luminosity") table["L0_PSR"] = Column(l0_psr, unit="erg s-1", description="Pulsar spin-down luminosity at birth") table["B_PSR"] = Column( b_psr, unit="Gauss", description="Pulsar magnetic field at the poles" ) return table def add_pwn_parameters(table): """Add PWN parameters to the table. Parameters ---------- table : `~astropy.table.Table` Table that requires at least the following columns to be defined: "age", "E_SN", "n_ISM", "P0_birth" and "B_PSR". Returns ------- table : `~astropy.table.Table` Table with the additional entry "r_out_PWN" which is the outer radius of the PWN.
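Examples -------- A sketch of the intended call chain; the PWN computation needs the SNR and pulsar columns to be added first (arguments are illustrative):: >>> from gammapy.astro.population import add_pulsar_parameters, add_pwn_parameters, add_snr_parameters, make_base_catalog_galactic >>> table = make_base_catalog_galactic(n_sources=10, random_state=0) >>> table = add_snr_parameters(table) >>> table = add_pulsar_parameters(table, random_state=0) >>> table = add_pwn_parameters(table)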
""" # Some of the computations (specifically `pwn.radius`) aren't vectorised # across all parameters; so here we loop over source parameters explicitly results = [] for idx in range(len(table)): age = table["age"].quantity[idx] E_SN = table["E_SN"].quantity[idx] n_ISM = table["n_ISM"].quantity[idx] P0_birth = table["P0_birth"].quantity[idx] b_psr = table["B_PSR"].quantity[idx] # Compute properties pulsar = Pulsar(P0_birth, b_psr) snr = SNRTrueloveMcKee(e_sn=E_SN, n_ISM=n_ISM) pwn = PWN(pulsar, snr) r_out_pwn = pwn.radius(age).to_value("pc") results.append(dict(r_out_pwn=r_out_pwn)) # Add columns to table table["r_out_PWN"] = Column( [_["r_out_pwn"] for _ in results], unit="pc", description="PWN outer radius" ) return table def add_observed_parameters(table, obs_pos=None): """Add observable parameters (such as sky position or distance). Input table columns: x, y, z, extension, luminosity. Output table columns: distance, glon, glat, flux, angular_extension. Position of observer in cartesian coordinates. Center of galaxy as origin, x-axis goes through sun. Parameters ---------- table : `~astropy.table.Table` Input table. obs_pos : tuple, optional Observation position (X, Y, Z) in Galactocentric coordinates. Default is None, which uses the Earth's position. Returns ------- table : `~astropy.table.Table` Modified input table with columns added. """ obs_pos = obs_pos or [astrometry.D_SUN_TO_GALACTIC_CENTER, 0, 0] # Get data x, y, z = table["x"].quantity, table["y"].quantity, table["z"].quantity vx, vy, vz = table["vx"].quantity, table["vy"].quantity, table["vz"].quantity distance, glon, glat = astrometry.galactic(x, y, z, obs_pos=obs_pos) # Compute projected velocity v_glon, v_glat = astrometry.velocity_glon_glat(x, y, z, vx, vy, vz) coordinate = SkyCoord(glon, glat, unit="deg", frame="galactic").transform_to("icrs") ra, dec = coordinate.ra.deg, coordinate.dec.deg # Add columns to table table["distance"] = Column( distance, unit="pc", description="Distance observer to source center" ) table["GLON"] = Column(glon, unit="deg", description="Galactic longitude") table["GLAT"] = Column(glat, unit="deg", description="Galactic latitude") table["VGLON"] = Column( v_glon.to("deg/Myr"), unit="deg/Myr", description="Velocity in Galactic longitude", ) table["VGLAT"] = Column( v_glat.to("deg/Myr"), unit="deg/Myr", description="Velocity in Galactic latitude", ) table["RA"] = Column(ra, unit="deg", description="Right ascension") table["DEC"] = Column(dec, unit="deg", description="Declination") try: luminosity = table["luminosity"] flux = luminosity / (4 * np.pi * distance**2) table["flux"] = Column(flux.value, unit=flux.unit, description="Source flux") except KeyError: pass try: extension = table["extension"] angular_extension = np.degrees(np.arctan(extension / distance)) table["angular_extension"] = Column( angular_extension, unit="deg", description="Source angular radius (i.e. 
half-diameter)", ) except KeyError: pass return table ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/population/spatial.py0000644000175100001770000004063314721316200021403 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Galactic radial source distribution probability density functions.""" import numpy as np from astropy.modeling import Fittable1DModel, Parameter from astropy.units import Quantity from gammapy.utils.coordinates import D_SUN_TO_GALACTIC_CENTER, cartesian, polar from gammapy.utils.random import get_random_state __all__ = [ "CaseBattacharya1998", "Exponential", "FaucherKaspi2006", "FaucherSpiral", "LogSpiral", "Lorimer2006", "Paczynski1990", "radial_distributions", "ValleeSpiral", "YusifovKucuk2004", "YusifovKucuk2004B", ] # Simulation range used for random number drawing RMIN, RMAX = Quantity([0, 20], "kpc") ZMIN, ZMAX = Quantity([-0.5, 0.5], "kpc") class Paczynski1990(Fittable1DModel): r"""Radial distribution of the birth surface density of neutron stars. .. math:: f(r) = A r_{exp}^{-2} \exp \left(-\frac{r}{r_{exp}} \right) Formula (2) [Paczynski1990]_. Parameters ---------- amplitude : float See formula. r_exp : float See formula. See Also -------- CaseBattacharya1998, YusifovKucuk2004, Lorimer2006, YusifovKucuk2004B, FaucherKaspi2006, Exponential References ---------- .. [Paczynski1990] https://ui.adsabs.harvard.edu/abs/1990ApJ...348..485P """ amplitude = Parameter() r_exp = Parameter() evolved = False def __init__(self, amplitude=1, r_exp=4.5, **kwargs): super().__init__(amplitude=amplitude, r_exp=r_exp, **kwargs) @staticmethod def evaluate(r, amplitude, r_exp): """Evaluate model.""" return amplitude * r_exp**-2 * np.exp(-r / r_exp) class CaseBattacharya1998(Fittable1DModel): r"""Radial distribution of the surface density of supernova remnants in the galaxy. .. math:: f(r) = A \left( \frac{r}{r_{\odot}} \right) ^ \alpha \exp \left[ -\beta \left( \frac{ r - r_{\odot}}{r_{\odot}} \right) \right] Formula (14) [CaseBattacharya1998]_. Parameters ---------- amplitude : float See model formula. alpha : float See model formula. beta : float See model formula. See Also -------- Paczynski1990, YusifovKucuk2004, Lorimer2006, YusifovKucuk2004B, FaucherKaspi2006, Exponential References ---------- .. [CaseBattacharya1998] https://ui.adsabs.harvard.edu/abs/1998ApJ...504..761C """ amplitude = Parameter() alpha = Parameter() beta = Parameter() evolved = True def __init__(self, amplitude=1.0, alpha=2, beta=3.53, **kwargs): super().__init__(amplitude=amplitude, alpha=alpha, beta=beta, **kwargs) @staticmethod def evaluate(r, amplitude, alpha, beta): """Evaluate model.""" d_sun = D_SUN_TO_GALACTIC_CENTER.value term1 = (r / d_sun) ** alpha term2 = np.exp(-beta * (r - d_sun) / d_sun) return amplitude * term1 * term2 class YusifovKucuk2004(Fittable1DModel): r"""Radial distribution of the surface density of pulsars in the galaxy. .. math:: f(r) = A \left ( \frac{r + r_1}{r_{\odot} + r_1} \right )^a \exp \left [-b \left( \frac{r - r_{\odot}}{r_{\odot} + r_1} \right ) \right ] Used by Faucher-Guigere and Kaspi. Density at ``r = 0`` is nonzero. Formula (15) [YusifovKucuk2004]_. Parameters ---------- amplitude : float See model formula. a : float See model formula. b : float See model formula. r_1 : float See model formula. See Also -------- CaseBattacharya1998, Paczynski1990, Lorimer2006, YusifovKucuk2004B, FaucherKaspi2006, Exponential References ---------- .. 
[YusifovKucuk2004] https://ui.adsabs.harvard.edu/abs/2004A%26A...422..545Y """ amplitude = Parameter() a = Parameter() b = Parameter() r_1 = Parameter() evolved = True def __init__(self, amplitude=1, a=1.64, b=4.01, r_1=0.55, **kwargs): super().__init__(amplitude=amplitude, a=a, b=b, r_1=r_1, **kwargs) @staticmethod def evaluate(r, amplitude, a, b, r_1): """Evaluate model.""" d_sun = D_SUN_TO_GALACTIC_CENTER.value term1 = ((r + r_1) / (d_sun + r_1)) ** a term2 = np.exp(-b * (r - d_sun) / (d_sun + r_1)) return amplitude * term1 * term2 class YusifovKucuk2004B(Fittable1DModel): r"""Radial distribution of the surface density of OB stars in the galaxy. .. math:: f(r) = A \left( \frac{r}{r_{\odot}} \right) ^ a \exp \left[ -b \left( \frac{r}{r_{\odot}} \right) \right] Derived empirically from OB-stars distribution. Formula (17) [YusifovKucuk2004]_. Parameters ---------- amplitude : float See model formula. a : float See model formula. b : float See model formula. See Also -------- CaseBattacharya1998, Paczynski1990, YusifovKucuk2004, Lorimer2006, FaucherKaspi2006, Exponential References ---------- .. [YusifovKucuk2004] https://ui.adsabs.harvard.edu/abs/2004A%26A...422..545Y """ amplitude = Parameter() a = Parameter() b = Parameter() evolved = False def __init__(self, amplitude=1, a=4, b=6.8, **kwargs): super().__init__(amplitude=amplitude, a=a, b=b, **kwargs) @staticmethod def evaluate(r, amplitude, a, b): """Evaluate model.""" d_sun = D_SUN_TO_GALACTIC_CENTER.value return amplitude * (r / d_sun) ** a * np.exp(-b * (r / d_sun)) class FaucherKaspi2006(Fittable1DModel): r""" Radial distribution of the birth surface density of pulsars in the galaxy. .. math:: f(r) = A \frac{1}{\sqrt{2 \pi} \sigma} \exp \left(- \frac{(r - r_0)^2}{2 \sigma ^ 2}\right) Appendix B [FaucherKaspi2006]_. Parameters ---------- amplitude : float See model formula. r_0 : float See model formula. sigma : float See model formula. See Also -------- CaseBattacharya1998, Paczynski1990, YusifovKucuk2004, Lorimer2006, YusifovKucuk2004B, Exponential References ---------- .. [FaucherKaspi2006] https://ui.adsabs.harvard.edu/abs/2006ApJ...643..332F """ amplitude = Parameter() r_0 = Parameter() sigma = Parameter() evolved = False def __init__(self, amplitude=1, r_0=7.04, sigma=1.83, **kwargs): super().__init__(amplitude=amplitude, r_0=r_0, sigma=sigma, **kwargs) @staticmethod def evaluate(r, amplitude, r_0, sigma): """Evaluate model.""" term1 = 1.0 / np.sqrt(2 * np.pi * sigma) term2 = np.exp(-((r - r_0) ** 2) / (2 * sigma**2)) return amplitude * term1 * term2 class Lorimer2006(Fittable1DModel): r"""Radial distribution of the surface density of pulsars in the galaxy. .. math:: f(r) = A \left( \frac{r}{r_{\odot}} \right) ^ B \exp \left[ -C \left( \frac{r - r_{\odot}}{r_{\odot}} \right) \right] Formula (10) [Lorimer2006]_. Parameters ---------- amplitude : float See model formula. B : float See model formula. C : float See model formula. See Also -------- CaseBattacharya1998, Paczynski1990, YusifovKucuk2004, Lorimer2006, YusifovKucuk2004B, FaucherKaspi2006 References ---------- .. 
[Lorimer2006] https://ui.adsabs.harvard.edu/abs/2006MNRAS.372..777L """ amplitude = Parameter() B = Parameter() C = Parameter() evolved = True def __init__(self, amplitude=1, B=1.9, C=5.0, **kwargs): super().__init__(amplitude=amplitude, B=B, C=C, **kwargs) @staticmethod def evaluate(r, amplitude, B, C): """Evaluate model.""" d_sun = D_SUN_TO_GALACTIC_CENTER.value term1 = (r / d_sun) ** B term2 = np.exp(-C * (r - d_sun) / d_sun) return amplitude * term1 * term2 class Exponential(Fittable1DModel): r"""Exponential distribution. .. math:: f(z) = A \exp \left(- \frac{|z|}{z_0} \right) Usually used for height distribution above the Galactic plane, with 0.05 kpc as a commonly used birth height distribution. Parameters ---------- amplitude : float See model formula. z_0 : float Scale height of the distribution. See Also -------- CaseBattacharya1998, Paczynski1990, YusifovKucuk2004, Lorimer2006, YusifovKucuk2004B, FaucherKaspi2006, Exponential """ amplitude = Parameter() z_0 = Parameter() evolved = False def __init__(self, amplitude=1, z_0=0.05, **kwargs): super().__init__(amplitude=amplitude, z_0=z_0, **kwargs) @staticmethod def evaluate(z, amplitude, z_0): """Evaluate model.""" return amplitude * np.exp(-np.abs(z) / z_0) class LogSpiral: """Logarithmic spiral. Reference: http://en.wikipedia.org/wiki/Logarithmic_spiral """ def xy_position(self, theta=None, radius=None, spiralarm_index=0): """Compute (x, y) position for a given angle or radius. Parameters ---------- theta : `~astropy.units.Quantity`, optional Angle (deg). Default is None. radius : `~astropy.units.Quantity`, optional Radius (kpc). Default is None. spiralarm_index : int, optional Spiral arm index. Default is 0. Returns ------- x, y : `~numpy.ndarray` Position (x, y). """ if (theta is None) and radius is not None: theta = self.theta(radius, spiralarm_index=spiralarm_index) elif (radius is None) and theta is not None: radius = self.radius(theta, spiralarm_index=spiralarm_index) else: raise ValueError("Specify only one of: theta, radius") theta = np.radians(theta) x = radius * np.cos(theta) y = radius * np.sin(theta) return x, y def radius(self, theta, spiralarm_index): """Radius for a given angle. Parameters ---------- theta : `~astropy.units.Quantity` Angle (deg). spiralarm_index : int Spiral arm index. Returns ------- radius : `~numpy.ndarray` Radius (kpc). """ k = self.k[spiralarm_index] r_0 = self.r_0[spiralarm_index] theta_0 = self.theta_0[spiralarm_index] d_theta = np.radians(theta - theta_0) radius = r_0 * np.exp(d_theta / k) return radius def theta(self, radius, spiralarm_index): """Angle for a given radius. Parameters ---------- radius : `~astropy.units.Quantity` Radius (kpc). spiralarm_index : int Spiral arm index. Returns ------- theta : `~numpy.ndarray` Angle (deg). """ k = self.k[spiralarm_index] r_0 = self.r_0[spiralarm_index] theta_0 = self.theta_0[spiralarm_index] theta_0 = np.radians(theta_0) theta = k * np.log(radius / r_0) + theta_0 return np.degrees(theta) class FaucherSpiral(LogSpiral): """Milky way spiral arm used in Faucher et al (2006). Reference: https://ui.adsabs.harvard.edu/abs/2006ApJ...643..332F """ # Parameters k = Quantity([4.25, 4.25, 4.89, 4.89], "rad") r_0 = Quantity([3.48, 3.48, 4.9, 4.9], "kpc") theta_0 = Quantity([1.57, 4.71, 4.09, 0.95], "rad") spiralarms = np.array(["Norma", "Carina Sagittarius", "Perseus", "Crux Scutum"]) @staticmethod def _blur(radius, theta, amount=0.07, random_state="random-seed"): """Blur the positions around the centroid of the spiral arm. 
The given positions are blurred by drawing a displacement in radius from a normal distribution, with sigma = amount * radius, and a direction theta from a uniform distribution in the interval [0, 2 * pi]. Parameters ---------- radius : `~astropy.units.Quantity` Radius coordinate. theta : `~astropy.units.Quantity` Angle coordinate. amount : float, optional Amount of blurring of the position, given as a fraction of `radius`. Default is 0.07. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is 'random-seed'. """ random_state = get_random_state(random_state) dr = Quantity(abs(random_state.normal(0, amount * radius, radius.size)), "kpc") dtheta = Quantity(random_state.uniform(0, 2 * np.pi, radius.size), "rad") x, y = cartesian(radius, theta) dx, dy = cartesian(dr, dtheta) return polar(x + dx, y + dy) @staticmethod def _gc_correction( radius, theta, r_corr=Quantity(2.857, "kpc"), random_state="random-seed" ): """Correction of source distribution towards the galactic center. To avoid spiral arm features near the Galactic Center, the position angle theta is blurred by a certain amount towards the GC. Parameters ---------- radius : `~astropy.units.Quantity` Radius coordinate. theta : `~astropy.units.Quantity` Angle coordinate. r_corr : `~astropy.units.Quantity`, optional Scale of the correction towards the GC. Default is 2.857 * u.kpc. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is 'random-seed'. """ random_state = get_random_state(random_state) theta_corr = Quantity(random_state.uniform(0, 2 * np.pi, radius.size), "rad") return radius, theta + theta_corr * np.exp(-radius / r_corr) def __call__(self, radius, blur=True, random_state="random-seed"): """Draw random positions from the spiral arm distribution. Returns the angle theta (rad) corresponding to a given radius (kpc), together with the spiral arm each position is assigned to. Possible arms are: * Norma = 0, * Carina Sagittarius = 1, * Perseus = 2, * Crux Scutum = 3. Parameters ---------- radius : `~astropy.units.Quantity` Radius coordinate. blur : bool, optional Apply blurring and the correction towards the Galactic Center. Default is True. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is 'random-seed'. Returns ------- radius, theta, spiralarm : tuple The (optionally blurred) radius and angle, and the name of the spiral arm drawn for each position. """ random_state = get_random_state(random_state) # Choose spiral arm N = random_state.randint(0, 4, radius.size) theta = self.k[N] * np.log(radius / self.r_0[N]) + self.theta_0[N] spiralarm = self.spiralarms[N] if blur: # Apply blurring model according to Faucher radius, theta = self._blur(radius, theta, random_state=random_state) radius, theta = self._gc_correction( radius, theta, random_state=random_state ) return radius, theta, spiralarm class ValleeSpiral(LogSpiral): """Milky way spiral arm model from Vallee (2008). Reference: https://ui.adsabs.harvard.edu/abs/2008AJ....135.1301V """ # Model parameters p = Quantity(12.8, "deg") # pitch angle in deg m = 4 # number of spiral arms r_sun = Quantity(7.6, "kpc") # distance sun to Galactic center in kpc r_0 = Quantity(2.1, "kpc") # spiral inner radius in kpc theta_0 = Quantity(-20, "deg") # Norma spiral arm start angle bar_radius = Quantity(3.0, "kpc") # Radius of the galactic bar (not equal r_0!)
spiralarms = np.array(["Norma", "Perseus", "Carina Sagittarius", "Crux Scutum"]) def __init__(self): self.r_0 = self.r_0 * np.ones(4) self.theta_0 = self.theta_0 + Quantity([0, 90, 180, 270], "deg") self.k = Quantity(1.0 / np.tan(np.radians(self.p.value)) * np.ones(4), "rad") # Compute start and end point of the bar x_0, y_0 = self.xy_position(radius=self.bar_radius, spiralarm_index=0) x_1, y_1 = self.xy_position(radius=self.bar_radius, spiralarm_index=2) self.bar = dict(x=Quantity([x_0, x_1]), y=Quantity([y_0, y_1])) """Radial distribution (dict mapping names to classes).""" radial_distributions = { "CB98": CaseBattacharya1998, "F06": FaucherKaspi2006, "L06": Lorimer2006, "P90": Paczynski1990, "YK04": YusifovKucuk2004, "YK04B": YusifovKucuk2004B, } ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1846418 gammapy-1.3/gammapy/astro/population/tests/0000755000175100001770000000000014721316215020536 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/population/tests/__init__.py0000644000175100001770000000010014721316200022630 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/population/tests/test_simulate.py0000644000175100001770000001571714721316200023777 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from numpy.testing import assert_allclose, assert_equal import astropy.units as u from astropy.table import Table from gammapy.astro.population import ( add_observed_parameters, add_pulsar_parameters, add_pwn_parameters, add_snr_parameters, make_base_catalog_galactic, make_catalog_random_positions_cube, make_catalog_random_positions_sphere, ) def test_make_catalog_random_positions_cube(): table = make_catalog_random_positions_cube(random_state=0) d = table[0] assert len(table) == 100 assert len(table.colnames) == 3 assert table["x"].unit == "pc" assert_allclose(d["x"], 0.0976270078546495) assert table["y"].unit == "pc" assert_allclose(d["y"], 0.3556330735924602) assert table["z"].unit == "pc" assert_allclose(d["z"], -0.37640823601179485) table = make_catalog_random_positions_cube(dimension=2, random_state=0) assert_equal(table["z"], 0) table = make_catalog_random_positions_cube(dimension=1, random_state=0) assert_equal(table["y"], 0) assert_equal(table["z"], 0) def test_make_catalog_random_positions_sphere(): table = make_catalog_random_positions_sphere(random_state=0) d = table[0] assert len(table) == 100 assert len(table.colnames) == 3 assert table["lon"].unit == "rad" assert_allclose(d["lon"], 3.4482969442579128) assert table["lat"].unit == "rad" assert_allclose(d["lat"], 0.36359133530192267) assert table["distance"].unit == "pc" assert_allclose(d["distance"], 0.6780943487897606) def test_make_base_catalog_galactic(): table = make_base_catalog_galactic(n_sources=10, random_state=0) d = table[0] assert len(table) == 10 assert len(table.colnames) == 13 assert table["age"].unit == "yr" assert_allclose(d["age"], 548813.50392732478) assert table["n_ISM"].unit == "cm-3" assert_allclose(d["n_ISM"], 1.0) assert table["spiralarm"].unit is None assert d["spiralarm"] == "Crux Scutum" assert table["x_birth"].unit == "kpc" assert_allclose(d["x_birth"], -5.856461, atol=1e-5) assert table["y_birth"].unit == "kpc" assert_allclose(d["y_birth"], 3.017292, atol=1e-5) assert 
table["z_birth"].unit == "kpc" assert_allclose(d["z_birth"], 0.049088, atol=1e-5) assert table["x"].unit == "kpc" assert_allclose(d["x"], -5.941061, atol=1e-5) assert table["y"].unit == "kpc" assert_allclose(d["y"], 3.081642, atol=1e-5) assert table["z"].unit == "kpc" assert_allclose(d["z"], 0.023161, atol=1e-5) assert table["vx"].unit == "km/s" assert_allclose(d["vx"], -150.727104, atol=1e-5) assert table["vy"].unit == "km/s" assert_allclose(d["vy"], 114.648494, atol=1e-5) assert table["vz"].unit == "km/s" assert_allclose(d["vz"], -46.193814, atol=1e-5) assert table["v_abs"].unit == "km/s" assert_allclose(d["v_abs"], 194.927693, atol=1e-5) def test_add_snr_parameters(): table = Table() table["age"] = [100, 1000] * u.yr table["n_ISM"] = u.Quantity(1, "cm-3") table = add_snr_parameters(table) assert len(table) == 2 assert table.colnames == ["age", "n_ISM", "E_SN", "r_out", "r_in", "L_SNR"] assert table["E_SN"].unit == "erg" assert_allclose(table["E_SN"], 1e51) assert table["r_out"].unit == "pc" assert_allclose(table["r_out"], [1, 3.80730787743]) assert table["r_in"].unit == "pc" assert_allclose(table["r_in"], [0.9086, 3.45931993743]) assert table["L_SNR"].unit == "1 / s" assert_allclose(table["L_SNR"], [0, 1.0768e33]) def test_add_pulsar_parameters(): table = Table() table["age"] = [100, 1000] * u.yr table = add_pulsar_parameters(table, random_state=0) assert len(table) == 2 assert len(table.colnames) == 10 assert table["age"].unit == "yr" assert_allclose(table["age"], [100, 1000]) assert table["P0"].unit == "s" assert_allclose(table["P0"], [0.214478, 0.246349], atol=1e-5) assert table["P1"].unit == "" assert_allclose(table["P1"], [6.310423e-13, 4.198294e-16], atol=1e-5) assert table["P0_birth"].unit == "s" assert_allclose(table["P0_birth"], [0.212418, 0.246336], atol=1e-5) assert table["P1_birth"].unit == "" assert_allclose(table["P1_birth"], [6.558773e-13, 4.199198e-16], atol=1e-5) assert table["CharAge"].unit == "yr" assert_allclose(table["CharAge"], [2.207394e-21, 1.638930e-24], atol=1e-5) assert table["Tau0"].unit == "yr" assert_allclose(table["Tau0"], [5.131385e03, 9.294538e06], atol=1e-5) assert table["L_PSR"].unit == "erg / s" assert_allclose(table["L_PSR"], [2.599229e36, 1.108788e33], rtol=1e-5) assert table["L0_PSR"].unit == "erg / s" assert_allclose(table["L0_PSR"], [2.701524e36, 1.109026e33], rtol=1e-5) assert table["B_PSR"].unit == "G" assert_allclose(table["B_PSR"], [1.194420e13, 3.254597e11], rtol=1e-5) def test_add_pwn_parameters(): table = make_base_catalog_galactic(n_sources=10, random_state=0) # To compute PWN parameters we need PSR and SNR parameters first table = add_snr_parameters(table) table = add_pulsar_parameters(table, random_state=0) table = add_pwn_parameters(table) d = table[0] assert len(table) == 10 assert len(table.colnames) == 27 assert table["r_out_PWN"].unit == "pc" assert_allclose(d["r_out_PWN"], 1.378224, atol=1e-4) def test_add_observed_parameters(): table = make_base_catalog_galactic(n_sources=10, random_state=0) table = add_observed_parameters(table) d = table[0] assert len(table) == 10 assert len(table.colnames) == 20 assert table["distance"].unit == "pc" assert_allclose(d["distance"], 13016.572756, atol=1e-5) assert table["GLON"].unit == "deg" assert_allclose(d["GLON"], -27.156565, atol=1e-5) assert table["GLAT"].unit == "deg" assert_allclose(d["GLAT"], 0.101948, atol=1e-5) assert table["VGLON"].unit == "deg / Myr" assert_allclose(d["VGLON"], 0.368166, atol=1e-5) assert table["VGLAT"].unit == "deg / Myr" assert_allclose(d["VGLAT"], -0.209514, 
atol=1e-5) assert table["RA"].unit == "deg" assert_allclose(d["RA"], 244.347149, atol=1e-5) assert table["DEC"].unit == "deg" assert_allclose(d["DEC"], -50.410142, atol=1e-5) def test_chain_all(): # Test that running the simulation functions in chain works table = make_base_catalog_galactic(n_sources=10, random_state=0) table = add_snr_parameters(table) table = add_pulsar_parameters(table, random_state=0) table = add_pwn_parameters(table) table = add_observed_parameters(table) d = table[0] # Note: the individual functions are tested above. # Here we just run them in a chain and do very basic asserts # on the output so that we make sure we notice changes. assert len(table) == 10 assert len(table.colnames) == 34 assert table["r_out_PWN"].unit == "pc" assert_allclose(d["r_out_PWN"], 1.378224, atol=1e-4) assert table["RA"].unit == "deg" assert_allclose(d["RA"], 244.347149, atol=1e-5) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/population/tests/test_spatial.py0000644000175100001770000000266214721316200023604 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose from gammapy.astro.population.spatial import ( CaseBattacharya1998, Exponential, FaucherKaspi2006, Lorimer2006, Paczynski1990, YusifovKucuk2004, YusifovKucuk2004B, ) test_cases = [ { "class": FaucherKaspi2006, "x": [0.1, 1, 10], "y": [0.0002221728797095, 0.00127106525755, 0.0797205770877], }, { "class": Lorimer2006, "x": [0.1, 1, 10], "y": [0.03020158, 1.41289246, 0.56351182], }, { "class": Paczynski1990, "x": [0.1, 1, 10], "y": [0.04829743, 0.03954259, 0.00535151], }, { "class": YusifovKucuk2004, "x": [0.1, 1, 10], "y": [0.55044445, 1.5363482, 0.66157715], }, { "class": YusifovKucuk2004B, "x": [0.1, 1, 10], "y": [1.76840095e-08, 8.60773150e-05, 6.42641018e-04], }, { "class": CaseBattacharya1998, "x": [0.1, 1, 10], "y": [0.00453091, 0.31178967, 0.74237311], }, { "class": Exponential, "x": [0, 0.25, 0.5], "y": [1.00000000e00, 6.73794700e-03, 4.53999298e-05], }, ] @pytest.mark.parametrize("case", test_cases, ids=lambda _: _["class"].__name__) def test_spatial_model(case): model = case["class"]() y = model(case["x"]) assert_allclose(y, case["y"], rtol=1e-5) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/population/tests/test_velocity.py0000644000175100001770000000152114721316200023776 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose from gammapy.astro.population.velocity import ( FaucherKaspi2006VelocityBimodal, FaucherKaspi2006VelocityMaxwellian, Paczynski1990Velocity, ) test_cases = [ { "class": FaucherKaspi2006VelocityMaxwellian, "x": [1, 10], "y": [4.28745276e-08, 4.28443169e-06], }, { "class": FaucherKaspi2006VelocityBimodal, "x": [1, 10], "y": [1.754811e-07, 1.751425e-05], }, {"class": Paczynski1990Velocity, "x": [1, 10], "y": [0.00227363, 0.00227219]}, ] @pytest.mark.parametrize("case", test_cases, ids=lambda _: _["class"].__name__) def test_velocity_model(case): model = case["class"]() y = model(case["x"]) assert_allclose(y, case["y"], rtol=1e-5) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/population/velocity.py0000644000175100001770000000743114721316200021603 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see 
LICENSE.rst """Pulsar velocity distribution models.""" import numpy as np from astropy.modeling import Fittable1DModel, Parameter from astropy.units import Quantity __all__ = [ "FaucherKaspi2006VelocityBimodal", "FaucherKaspi2006VelocityMaxwellian", "Paczynski1990Velocity", "velocity_distributions", ] # Simulation range used for random number drawing VMIN, VMAX = Quantity([0, 4000], "km/s") class FaucherKaspi2006VelocityMaxwellian(Fittable1DModel): r"""Maxwellian pulsar velocity distribution. .. math:: f(v) = A \sqrt{ \frac{2}{\pi}} \frac{v ^ 2}{\sigma ^ 3 } \exp \left(-\frac{v ^ 2}{2 \sigma ^ 2} \right) Reference: https://ui.adsabs.harvard.edu/abs/2006ApJ...643..332F Parameters ---------- amplitude : float Value of the integral. sigma : float Velocity parameter (km s^-1). """ amplitude = Parameter() sigma = Parameter() def __init__(self, amplitude=1, sigma=265, **kwargs): super().__init__(amplitude=amplitude, sigma=sigma, **kwargs) @staticmethod def evaluate(v, amplitude, sigma): """One dimensional velocity model function.""" term1 = np.sqrt(2 / np.pi) * v**2 / sigma**3 term2 = np.exp(-(v**2) / (2 * sigma**2)) return term1 * term2 class FaucherKaspi2006VelocityBimodal(Fittable1DModel): r"""Bimodal pulsar velocity distribution. - Faucher & Kaspi (2006). .. math:: f(v) = A\sqrt{\frac{2}{\pi}} v^2 \left[\frac{w}{\sigma_1^3} \exp \left(-\frac{v^2}{2\sigma_1^2} \right) + \frac{1-w}{\sigma_2^3} \exp \left(-\frac{v^2}{2\sigma_2^2} \right) \right] Formula (7) [FaucherKaspi2006]_. Parameters ---------- amplitude : float Value of the integral. sigma1 : float See model formula. sigma2 : float See model formula. w : float See model formula. References ---------- .. [FaucherKaspi2006] https://ui.adsabs.harvard.edu/abs/2006ApJ...643..332F """ amplitude = Parameter() sigma_1 = Parameter() sigma_2 = Parameter() w = Parameter() def __init__(self, amplitude=1, sigma_1=160, sigma_2=780, w=0.9, **kwargs): super().__init__( amplitude=amplitude, sigma_1=sigma_1, sigma_2=sigma_2, w=w, **kwargs ) @staticmethod def evaluate(v, amplitude, sigma_1, sigma_2, w): """One dimensional Faucher-Guigere & Kaspi 2006 velocity model function.""" A = amplitude * np.sqrt(2 / np.pi) * v**2 term1 = (w / sigma_1**3) * np.exp(-(v**2) / (2 * sigma_1**2)) term2 = (1 - w) / sigma_2**3 * np.exp(-(v**2) / (2 * sigma_2**2)) return A * (term1 + term2) class Paczynski1990Velocity(Fittable1DModel): r"""Distribution by Lyne 1982 and adopted by Paczynski and Faucher. .. math:: f(v) = A\frac{4}{\pi} \frac{1}{v_0 \left[1 + (v / v_0) ^ 2 \right] ^ 2} Formula (3) [Paczynski1990]_. Parameters ---------- amplitude : float Value of the integral. v_0 : float Velocity parameter (km s^-1). References ---------- .. 
[Paczynski1990] https://ui.adsabs.harvard.edu/abs/1990ApJ...348..485P """ amplitude = Parameter() v_0 = Parameter() def __init__(self, amplitude=1, v_0=560, **kwargs): super().__init__(amplitude=amplitude, v_0=v_0, **kwargs) @staticmethod def evaluate(v, amplitude, v_0): """One dimensional Paczynski 1990 velocity model function.""" return amplitude * 4.0 / (np.pi * v_0 * (1 + (v / v_0) ** 2) ** 2) """Velocity distributions (dict mapping names to classes).""" velocity_distributions = { "H05": FaucherKaspi2006VelocityMaxwellian, "F06B": FaucherKaspi2006VelocityBimodal, "F06P": Paczynski1990Velocity, } ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1846418 gammapy-1.3/gammapy/astro/source/0000755000175100001770000000000014721316215016502 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/source/__init__.py0000644000175100001770000000044714721316200020612 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Astrophysical source models.""" from .pulsar import Pulsar, SimplePulsar from .pwn import PWN from .snr import SNR, SNRTrueloveMcKee __all__ = [ "Pulsar", "PWN", "SimplePulsar", "SNR", "SNRTrueloveMcKee", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/source/pulsar.py0000644000175100001770000001241114721316200020353 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Pulsar source models.""" import html import numpy as np from astropy.units import Quantity __all__ = ["Pulsar", "SimplePulsar"] DEFAULT_I = Quantity(1e45, "g cm2") """Pulsar default moment of inertia""" DEFAULT_R = Quantity(1e6, "cm") """Pulsar default radius of the neutron star""" B_CONST = Quantity(3.2e19, "gauss s^(-1/2)") """Pulsar default magnetic field constant""" class SimplePulsar: """Magnetic dipole spin-down model for a pulsar. Parameters ---------- P : `~astropy.units.Quantity` Rotation period (sec). P_dot : `~astropy.units.Quantity` Rotation period derivative (sec sec^-1). I : `~astropy.units.Quantity` Moment of inertia (g cm^2). R : `~astropy.units.Quantity` Radius of the pulsar (cm). """ def __init__(self, P, P_dot, I=DEFAULT_I, R=DEFAULT_R): # noqa: E741 self.P = Quantity(P, "s") self.P_dot = P_dot self.I = I # noqa: E741 self.R = R def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def luminosity_spindown(self): r"""Spin-down luminosity as a `~astropy.units.Quantity`. .. math:: \dot{L} = 4\pi^2 I \frac{\dot{P}}{P^{3}} """ return 4 * np.pi**2 * self.I * self.P_dot / self.P**3 @property def tau(self): r"""Characteristic age as a `~astropy.units.Quantity`. .. math:: \tau = \frac{P}{2\dot{P}} """ return (self.P / (2 * self.P_dot)).to("yr") @property def magnetic_field(self): r"""Magnetic field strength at the polar cap as a `~astropy.units.Quantity`. .. math:: B = 3.2 \cdot 10^{19} (P\dot{P})^{1/2} \text{ Gauss} """ return B_CONST * np.sqrt(self.P * self.P_dot) class Pulsar: """Magnetic dipole spin-down pulsar model. Parameters ---------- P_0 : float Period at birth. B : `~astropy.units.Quantity` Magnetic field strength at the poles (Gauss). n : float Spin-down braking index. I : float Moment of inertia. R : float Radius. """ def __init__( self, P_0="0.1 s", B="1e10 G", n=3, I=DEFAULT_I, # noqa: E741 R=DEFAULT_R, age=None, L_0=None, # noqa: E741 ): P_0 = Quantity(P_0, "s") B = Quantity(B, "G") self.I = I # noqa: E741 self.R = R self.P_0 = P_0 self.B = B self.P_dot_0 = (B / B_CONST) ** 2 / P_0 self.tau_0 = P_0 / (2 * self.P_dot_0) self.n = float(n) self.beta = -(n + 1.0) / (n - 1.0) if age is not None: self.age = Quantity(age, "yr") if L_0 is None: self.L_0 = 4 * np.pi**2 * self.I * self.P_dot_0 / self.P_0**3 def luminosity_spindown(self, t): r"""Spin down luminosity. .. math:: \dot{L}(t) = \dot{L}_0 \left(1 + \frac{t}{\tau_0}\right)^{-\frac{n + 1}{n - 1}} Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the pulsar. """ t = Quantity(t, "yr") return self.L_0 * (1 + (t / self.tau_0)) ** self.beta def energy_integrated(self, t): r"""Total energy released by a given time. Time-integrated spin-down luminosity since birth. .. math:: E(t) = \dot{L}_0 \tau_0 \frac{t}{t + \tau_0} Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the pulsar. """ t = Quantity(t, "yr") return self.L_0 * self.tau_0 * (t / (t + self.tau_0)) def period(self, t): r"""Rotation period. .. math:: P(t) = P_0 \left(1 + \frac{t}{\tau_0}\right)^{\frac{1}{n - 1}} Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the pulsar. """ t = Quantity(t, "yr") return self.P_0 * (1 + (t / self.tau_0)) ** (1.0 / (self.n - 1)) def period_dot(self, t): r"""Period derivative at age t. P_dot for a given period and magnetic field B, assuming a dipole spin-down. .. math:: \dot{P}(t) = \frac{B^2}{3.2 \cdot 10^{19} P(t)} Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the pulsar. """ t = Quantity(t, "yr") return self.B**2 / (self.period(t) * B_CONST**2) def tau(self, t): r"""Characteristic age at real age t. .. math:: \tau = \frac{P}{2\dot{P}} Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the pulsar. """ t = Quantity(t, "yr") return self.period(t) / 2 * self.period_dot(t) def magnetic_field(self, t): r"""Magnetic field at polar cap (assumed constant). .. math:: B = 3.2 \cdot 10^{19} (P\dot{P})^{1/2} \text{ Gauss} Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the pulsar. 
""" t = Quantity(t, "yr") return B_CONST * np.sqrt(self.period(t) * self.period_dot(t)) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/source/pwn.py0000644000175100001770000001000014721316200017641 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Pulsar wind nebula (PWN) source models.""" import html import numpy as np import scipy.optimize import astropy.constants from astropy.units import Quantity from astropy.utils import lazyproperty from .pulsar import Pulsar from .snr import SNRTrueloveMcKee __all__ = ["PWN"] class PWN: """Simple pulsar wind nebula (PWN) evolution model. Parameters ---------- pulsar : `~gammapy.astro.source.Pulsar` Pulsar model instance. snr : `~gammapy.astro.source.SNRTrueloveMcKee` SNR model instance. eta_e : float Fraction of energy going into electrons. eta_B : float Fraction of energy going into magnetic fields. age : `~astropy.units.Quantity` Age of the PWN. morphology : str Morphology model of the PWN. """ def __init__( self, pulsar=Pulsar(), snr=SNRTrueloveMcKee(), eta_e=0.999, eta_B=0.001, morphology="Gaussian2D", age=None, ): self.pulsar = pulsar if not isinstance(snr, SNRTrueloveMcKee): raise ValueError("SNR must be instance of SNRTrueloveMcKee") self.snr = snr self.eta_e = eta_e self.eta_B = eta_B self.morphology = morphology if age is not None: self.age = Quantity(age, "yr") def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def _radius_free_expansion(self, t): """Radius at age t during free expansion phase. Reference: https://ui.adsabs.harvard.edu/abs/2006ARA%26A..44...17G (Formula 8). """ term1 = (self.snr.e_sn**3 * self.pulsar.L_0**2) / (self.snr.m_ejecta**5) return (1.44 * term1 ** (1.0 / 10) * t ** (6.0 / 5)).cgs @lazyproperty def _collision_time(self): """Time of collision between the PWN and the reverse shock of the SNR. Returns ------- t_coll : `~astropy.units.Quantity` Time of collision. """ def time_coll(t): t = Quantity(t, "yr") r_pwn = self._radius_free_expansion(t).to_value("cm") r_shock = self.snr.radius_reverse_shock(t).to_value("cm") return r_pwn - r_shock # 4e3 years is a typical value that works for fsolve return Quantity(scipy.optimize.fsolve(time_coll, 4e3), "yr") def radius(self, t): r"""Radius of the PWN at age t. During the free expansion phase the radius of the PWN evolves like: .. math:: R_{PWN}(t) = 1.44 \left(\frac{E_{SN}^3\dot{E}_0^2} {M_{ej}^5}\right)^{1/10}t^{6/5} \text{pc} After the collision with the reverse shock of the SNR, the radius is assumed to be constant (See `~gammapy.astro.source.SNRTrueloveMcKee.radius_reverse_shock`). Reference: https://ui.adsabs.harvard.edu/abs/2006ARA%26A..44...17G (Formula 8). Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ t = Quantity(t, "yr") r_collision = self._radius_free_expansion(self._collision_time) r = np.where( t < self._collision_time, self._radius_free_expansion(t).value, r_collision.value, ) return Quantity(r, "cm") def magnetic_field(self, t): """Estimate of the magnetic field inside the PWN. By assuming that a certain fraction of the spin down energy is converted to magnetic field energy an estimation of the magnetic field can be derived. Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ t = Quantity(t, "yr") energy = self.pulsar.energy_integrated(t) volume = 4.0 / 3 * np.pi * self.radius(t) ** 3 return np.sqrt(2 * astropy.constants.mu0 * self.eta_B * energy / volume) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/source/snr.py0000644000175100001770000002510214721316200017650 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Supernova remnant (SNR) source models.""" import html import numpy as np import astropy.constants from astropy.units import Quantity from astropy.utils import lazyproperty __all__ = ["SNR", "SNRTrueloveMcKee"] class SNR: """Simple supernova remnant (SNR) evolution model. The model is based on the Sedov-Taylor solution for strong explosions. Reference: https://ui.adsabs.harvard.edu/abs/1950RSPSA.201..159T Parameters ---------- e_sn : `~astropy.units.Quantity` SNR energy (erg), equal to the SN energy after neutrino losses. theta : `~astropy.units.Quantity` Fraction of E_SN that goes into cosmic rays. n_ISM : `~astropy.units.Quantity` ISM density (g cm^-3). m_ejecta : `~astropy.units.Quantity` Ejecta mass (g). t_stop : `~astropy.units.Quantity` Post-shock temperature where gamma-ray emission stops. 
""" def __init__( self, e_sn="1e51 erg", theta=Quantity(0.1), n_ISM=Quantity(1, "cm-3"), m_ejecta=astropy.constants.M_sun, t_stop=Quantity(1e6, "K"), age=None, morphology="Shell2D", spectral_index=2.1, ): self.e_sn = Quantity(e_sn, "erg") self.theta = theta self.rho_ISM = n_ISM * astropy.constants.m_p self.n_ISM = n_ISM self.m_ejecta = m_ejecta self.t_stop = t_stop self.morphology = morphology self.spectral_index = spectral_index if age is not None: self.age = Quantity(age, "yr") def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def radius(self, t): r"""Outer shell radius at age t. The radius during the free expansion phase is given by: .. math:: r_{SNR}(t) \approx 0.01 \left(\frac{E_{SN}}{10^{51}erg}\right)^{1/2} \left(\frac{M_{ej}}{M_{\odot}}\right)^{-1/2} t \text{ pc} The radius during the Sedov-Taylor phase evolves like: .. math:: r_{SNR}(t) \approx \left(\frac{E_{SN}}{\rho_{ISM}}\right)^{1/5}t^{2/5} Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ t = Quantity(t, "yr") r = np.where( t > self.sedov_taylor_begin, self._radius_sedov_taylor(t).to_value("cm"), self._radius_free_expansion(t).to_value("cm"), ) return Quantity(r, "cm") def _radius_free_expansion(self, t): """Shock radius at age t during free expansion phase. Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ # proportional constant for the free expansion phase term_1 = (self.e_sn / Quantity(1e51, "erg")) ** (1.0 / 2) term_2 = (self.m_ejecta / astropy.constants.M_sun) ** (-1.0 / 2) return Quantity(0.01, "pc/yr") * term_1 * term_2 * t def _radius_sedov_taylor(self, t): """Shock radius at age t during Sedov Taylor phase. Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ R_FE = self._radius_free_expansion(self.sedov_taylor_begin) return R_FE * (t / self.sedov_taylor_begin) ** (2.0 / 5) def radius_inner(self, t, fraction=0.0914): """Inner radius at age t of the SNR shell. Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ return self.radius(t) * (1 - fraction) def luminosity_tev(self, t, energy_min="1 TeV"): r"""Gamma-ray luminosity above ``energy_min`` at age ``t``. The luminosity is assumed constant in a given age interval and zero before and after. The assumed spectral index is 2.1. The gamma-ray luminosity above 1 TeV is given by: .. math:: L_{\gamma}(\geq 1TeV) \approx 10^{34} \theta \left(\frac{E_{SN}}{10^{51} erg}\right) \left(\frac{\rho_{ISM}}{1.66\cdot 10^{-24} g/cm^{3}} \right) \text{ s}^{-1} Reference: https://ui.adsabs.harvard.edu/abs/1994A%26A...287..959D (Formula (7)). Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. energy_min : `~astropy.units.Quantity` Lower energy limit for the luminosity. """ t = Quantity(t, "yr") energy_min = Quantity(energy_min, "TeV") # Flux in 1 k distance according to Drury formula 9 term_0 = energy_min / Quantity(1, "TeV") term_1 = self.e_sn / Quantity(1e51, "erg") term_2 = self.rho_ISM / (Quantity(1, "cm-3") * astropy.constants.m_p) L = self.theta * term_0 ** (1 - self.spectral_index) * term_1 * term_2 # Corresponding luminosity L = np.select( [t <= self.sedov_taylor_begin, t <= self.sedov_taylor_end], [0, L] ) return Quantity(1.0768e34, "s-1") * L @lazyproperty def sedov_taylor_begin(self): r"""Characteristic time scale when the Sedov-Taylor phase of the SNR's evolution begins. The beginning of the Sedov-Taylor phase of the SNR is defined by the condition, that the swept up mass of the surrounding medium equals the mass of the ejected mass. The time scale is given by: .. 
math:: t_{begin} \approx 200 \left(\frac{E_{SN}}{10^{51}erg}\right)^{-1/2} \left(\frac{M_{ej}}{M_{\odot}}\right)^{5/6} \left(\frac{\rho_{ISM}}{10^{-24}g/cm^3}\right)^{-1/3} \text{yr} """ term1 = (self.e_sn / Quantity(1e51, "erg")) ** (-1.0 / 2) term2 = (self.m_ejecta / astropy.constants.M_sun) ** (5.0 / 6) term3 = (self.rho_ISM / (Quantity(1, "cm-3") * astropy.constants.m_p)) ** ( -1.0 / 3 ) return Quantity(200, "yr") * term1 * term2 * term3 @lazyproperty def sedov_taylor_end(self): r"""Characteristic time scale when the Sedov-Taylor phase of the SNR's evolution ends. The end of the Sedov-Taylor phase of the SNR is defined by the condition, that the temperature at the shock drops below T = 10^6 K. The time scale is given by: .. math:: t_{end} \approx 43000 \left(\frac{m}{1.66\cdot 10^{-24}g}\right)^{5/6} \left(\frac{E_{SN}}{10^{51}erg}\right)^{1/3} \left(\frac{\rho_{ISM}}{1.66\cdot 10^{-24}g/cm^3}\right)^{-1/3} \text{yr} """ term1 = ( 3 * astropy.constants.m_p.cgs / (100 * astropy.constants.k_B.cgs * self.t_stop) ) term2 = (self.e_sn / self.rho_ISM) ** (2.0 / 5) return ((term1 * term2) ** (5.0 / 6)).to("yr") class SNRTrueloveMcKee(SNR): """SNR model according to Truelove & McKee (1999). Reference: https://ui.adsabs.harvard.edu/abs/1999ApJS..120..299T """ def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) # Characteristic dimensions self.r_c = self.m_ejecta ** (1.0 / 3) * self.rho_ISM ** (-1.0 / 3) self.t_c = ( self.e_sn ** (-1.0 / 2) * self.m_ejecta ** (5.0 / 6) * self.rho_ISM ** (-1.0 / 3) ) def radius(self, t): r"""Outer shell radius at age t. The radius during the free expansion phase is given by: .. math:: R_{SNR}(t) = 1.12R_{ch}\left(\frac{t}{t_{ch}}\right)^{2/3} The radius during the Sedov-Taylor phase evolves like: .. math:: R_{SNR}(t) = \left[R_{SNR, ST}^{5/2} + \left(2.026\frac{E_{SN}} {\rho_{ISM}}\right)^{1/2}(t - t_{ST})\right]^{2/5} Using the characteristic dimensions: .. math:: R_{ch} = M_{ej}^{1/3}\rho_{ISM}^{-1/3} \ \ \text{and} \ \ t_{ch} = E_{SN}^{-1/2}M_{ej}^{5/6}\rho_{ISM}^{-1/3} Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ t = Quantity(t, "yr") # Evaluate `_radius_sedov_taylor` on `t > self.sedov_taylor_begin` # only to avoid a warning r = np.empty(t.shape, dtype=np.float64) mask = t > self.sedov_taylor_begin r[mask] = self._radius_sedov_taylor(t[mask]).to_value("cm") r[~mask] = self._radius_free_expansion(t[~mask]).to_value("cm") return Quantity(r, "cm") def _radius_free_expansion(self, t): """Shock radius at age t during free expansion phase. Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ return 1.12 * self.r_c * (t / self.t_c) ** (2.0 / 3) def _radius_sedov_taylor(self, t): """Shock radius at age t during Sedov Taylor phase. Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ term1 = self._radius_free_expansion(self.sedov_taylor_begin) ** (5.0 / 2) term2 = (2.026 * (self.e_sn / self.rho_ISM)) ** (1.0 / 2) return (term1 + term2 * (t - self.sedov_taylor_begin)) ** (2.0 / 5) @lazyproperty def sedov_taylor_begin(self): r"""Characteristic time scale when the Sedov-Taylor phase starts. Given by :math:`t_{ST} \approx 0.52 t_{ch}`. """ return 0.52 * self.t_c def radius_reverse_shock(self, t): r"""Reverse shock radius at age t. Initially the reverse shock co-evolves with the radius of the SNR: .. math:: R_{RS}(t) = \frac{1}{1.19}r_{SNR}(t) After a time :math:`t_{core} \simeq 0.25t_{ch}` the reverse shock reaches the core and then propagates as: .. 
math:: R_{RS}(t) = \left[1.49 - 0.16 \frac{t - t_{core}}{t_{ch}} - 0.46 \ln \left(\frac{t}{t_{core}}\right)\right]\frac{R_{ch}}{t_{ch}}t Parameters ---------- t : `~astropy.units.Quantity` Time after birth of the SNR. """ t = Quantity(t, "yr") # Time when reverse shock reaches the "core" t_core = 0.25 * self.t_c term1 = (t - t_core) / (self.t_c) term2 = 1.49 - 0.16 * term1 - 0.46 * np.log(t / t_core) R_1 = self._radius_free_expansion(t) / 1.19 R_RS = term2 * (self.r_c / self.t_c) * t r = np.where(t < t_core, R_1.to_value("cm"), R_RS.to_value("cm")) return Quantity(r, "cm") ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1846418 gammapy-1.3/gammapy/astro/source/tests/0000755000175100001770000000000014721316215017644 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/source/tests/__init__.py0000644000175100001770000000010014721316200021736 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/source/tests/test_pulsar.py0000644000175100001770000000646714721316200022572 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import scipy.integrate from numpy.testing import assert_allclose from astropy.table import Table from astropy.units import Quantity from gammapy.astro.source import Pulsar, SimplePulsar from gammapy.utils.testing import assert_quantity_allclose pulsar = Pulsar() time = Quantity([1e2, 1e4, 1e6, 1e8], "yr") def get_atnf_catalog_sample(): data = """ NUM NAME Gl Gb P0 P1 AGE BSURF EDOT 1 J0006+1834 108.172 -42.985 0.693748 2.10e-15 5.24e+06 1.22e+12 2.48e+32 2 J0007+7303 119.660 10.463 0.315873 3.60e-13 1.39e+04 1.08e+13 4.51e+35 3 B0011+47 116.497 -14.631 1.240699 5.64e-16 3.48e+07 8.47e+11 1.17e+31 7 B0021-72E 305.883 -44.883 0.003536 9.85e-20 5.69e+08 5.97e+08 8.79e+34 8 B0021-72F 305.899 -44.892 0.002624 6.45e-20 6.44e+08 4.16e+08 1.41e+35 16 J0024-7204O 305.897 -44.889 0.002643 3.04e-20 1.38e+09 2.87e+08 6.49e+34 18 J0024-7204Q 305.877 -44.899 0.004033 3.40e-20 1.88e+09 3.75e+08 2.05e+34 21 J0024-7204T 305.890 -44.894 0.007588 2.94e-19 4.09e+08 1.51e+09 2.65e+34 22 J0024-7204U 305.890 -44.905 0.004343 9.52e-20 7.23e+08 6.51e+08 4.59e+34 28 J0026+6320 120.176 0.593 0.318358 1.50e-16 3.36e+07 2.21e+11 1.84e+32 """ return Table.read(data, format="ascii") def test_SimplePulsar_atnf(): """Test functions against ATNF pulsar catalog values""" atnf = get_atnf_catalog_sample() simple_pulsar = SimplePulsar( P=Quantity(atnf["P0"], "s"), P_dot=Quantity(atnf["P1"], "") ) assert_allclose(simple_pulsar.tau.to("yr").value, atnf["AGE"], rtol=0.01) edot = simple_pulsar.luminosity_spindown.to("erg s^-1").value assert_allclose(edot, atnf["EDOT"], rtol=0.01) bsurf = simple_pulsar.magnetic_field.to("gauss").value assert_allclose(bsurf, atnf["BSURF"], rtol=0.01) def test_Pulsar_period(): """Test pulsar period""" reference = Quantity([0.1, 0.10000031, 0.10003081, 0.10303572], "s") assert_quantity_allclose(pulsar.period(time), reference) def test_Pulsar_peridod_dot(): """Test pulsar period derivative""" reference = [9.76562470e-19, 9.76559490e-19, 9.76261682e-19, 9.47790252e-19] assert_allclose(pulsar.period_dot(time), reference) def test_Pulsar_luminosity_spindown(): """Test pulsar spin down luminosity""" reference = [3.85531374e31, 3.85526669e31, 3.85056609e31, 3.42064935e31] 
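# Reference values are frozen outputs from an earlier run and act as a regression check.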
assert_allclose(pulsar.luminosity_spindown(time).value, reference) def test_Pulsar_energy_integrated(): """Test against numerical integration""" energies = [] def lumi(t): t = Quantity(t, "s") return pulsar.luminosity_spindown(t).value for t_ in time: energy = scipy.integrate.quad(lumi, 0, t_.cgs.value, epsrel=0.01)[0] energies.append(energy) # The last value is quite inaccurate, because integration is over several decades assert_allclose(energies, pulsar.energy_integrated(time).value, rtol=0.2) def test_Pulsar_magnetic_field(): b = pulsar.magnetic_field(time) assert b.unit == "G" assert pulsar.B.unit == "G" assert_allclose(b.value, pulsar.B.value) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/source/tests/test_pwn.py0000644000175100001770000000124314721316200022053 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from numpy.testing import assert_allclose from astropy.units import Quantity from gammapy.astro.source import PWN t = Quantity([0, 1, 10, 100, 1000, 10000, 100000], "yr") pwn = PWN() def test_PWN_radius(): """Test SNR luminosity""" r = [0, 1.334e14, 2.114e15, 3.350e16, 5.310e17, 6.927e17, 6.927e17] assert_allclose(pwn.radius(t).to_value("cm"), r, rtol=1e-3) def test_magnetic_field(): """Test SNR luminosity""" b = [np.nan, 1.753e-03, 8.788e-05, 4.404e-06, 2.207e-07, 4.685e-07, 1.481e-06] assert_allclose(pwn.magnetic_field(t).to_value("gauss"), b, rtol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/astro/source/tests/test_snr.py0000644000175100001770000000250114721316200022047 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from numpy.testing import assert_allclose from astropy.units import Quantity from gammapy.astro.source import SNR, SNRTrueloveMcKee t = Quantity([0, 1, 10, 100, 1000, 10000], "yr") snr = SNR() snr_mckee = SNRTrueloveMcKee() def test_SNR_luminosity_tev(): """Test SNR luminosity""" reference = [0, 0, 0, 0, 1.076e33, 1.076e33] assert_allclose(snr.luminosity_tev(t).value, reference, rtol=1e-3) def test_SNR_radius(): """Test SNR radius""" reference = [0, 3.085e16, 3.085e17, 3.085e18, 1.174e19, 2.950e19] assert_allclose(snr.radius(t).value, reference, rtol=1e-3) def test_SNR_radius_inner(): """Test SNR radius""" reference = (1 - 0.0914) * np.array( [0, 3.085e16, 3.085e17, 3.085e18, 1.174e19, 2.950e19] ) assert_allclose(snr.radius_inner(t).value, reference, rtol=1e-3) def test_SNRTrueloveMcKee_luminosity_tev(): """Test SNR Truelov McKee luminosity""" reference = [0, 0, 0, 0, 1.076e33, 1.076e33] assert_allclose(snr_mckee.luminosity_tev(t).value, reference, rtol=1e-3) def test_SNRTrueloveMcKee_radius(): """Test SNR RTruelove McKee radius""" reference = [0, 1.953e17, 9.066e17, 4.208e18, 1.579e19, 4.117e19] assert_allclose(snr_mckee.radius(t).value, reference, rtol=1e-3) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1846418 gammapy-1.3/gammapy/catalog/0000755000175100001770000000000014721316215015464 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/__init__.py0000644000175100001770000000407514721316200017575 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Source catalogs.""" from gammapy.utils.registry import Registry from .core import 
SourceCatalog, SourceCatalogObject from .fermi import ( SourceCatalog2FHL, SourceCatalog2PC, SourceCatalog3FGL, SourceCatalog3FHL, SourceCatalog3PC, SourceCatalog4FGL, SourceCatalogObject2FHL, SourceCatalogObject2PC, SourceCatalogObject3FGL, SourceCatalogObject3FHL, SourceCatalogObject3PC, SourceCatalogObject4FGL, ) from .gammacat import SourceCatalogGammaCat, SourceCatalogObjectGammaCat from .hawc import ( SourceCatalog2HWC, SourceCatalog3HWC, SourceCatalogObject2HWC, SourceCatalogObject3HWC, ) from .hess import ( SourceCatalogHGPS, SourceCatalogLargeScaleHGPS, SourceCatalogObjectHGPS, SourceCatalogObjectHGPSComponent, ) from .lhaaso import SourceCatalog1LHAASO, SourceCatalogObject1LHAASO CATALOG_REGISTRY = Registry( [ SourceCatalogGammaCat, SourceCatalogHGPS, SourceCatalog2HWC, SourceCatalog3FGL, SourceCatalog4FGL, SourceCatalog2FHL, SourceCatalog3FHL, SourceCatalog3PC, SourceCatalog3HWC, SourceCatalog2PC, SourceCatalog1LHAASO, ] ) """Registry of source catalogs in Gammapy.""" __all__ = [ "CATALOG_REGISTRY", "SourceCatalog", "SourceCatalog1LHAASO", "SourceCatalog2FHL", "SourceCatalog2HWC", "SourceCatalog3FGL", "SourceCatalog3FHL", "SourceCatalog3HWC", "SourceCatalog4FGL", "SourceCatalog2PC", "SourceCatalog3PC", "SourceCatalogGammaCat", "SourceCatalogHGPS", "SourceCatalogLargeScaleHGPS", "SourceCatalogObject", "SourceCatalogObject1LHAASO", "SourceCatalogObject2FHL", "SourceCatalogObject2HWC", "SourceCatalogObject3FGL", "SourceCatalogObject3FHL", "SourceCatalogObject3HWC", "SourceCatalogObject4FGL", "SourceCatalogObject2PC", "SourceCatalogObject3PC", "SourceCatalogObjectGammaCat", "SourceCatalogObjectHGPS", "SourceCatalogObjectHGPSComponent", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/core.py0000644000175100001770000002446414721316200016772 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Source catalog and object base classes.""" import abc import html import numbers from copy import deepcopy import numpy as np from astropy.coordinates import SkyCoord from astropy.table import Table from astropy.utils import lazyproperty from gammapy.maps import TimeMapAxis from gammapy.modeling.models import Models from gammapy.utils.table import table_row_to_dict __all__ = ["SourceCatalog", "SourceCatalogObject"] # https://pydanny.blogspot.com/2011/11/loving-bunch-class.html class Bunch(dict): def __init__(self, **kw): dict.__init__(self, kw) self.__dict__.update(kw) def format_flux_points_table(table): for column in table.colnames: if column.startswith(("dnde", "eflux", "flux", "e2dnde", "ref")): table[column].format = ".3e" elif column.startswith( ("e_min", "e_max", "e_ref", "sqrt_ts", "norm", "ts", "stat") ): table[column].format = ".3f" return table class SourceCatalogObject: """Source catalog object. This class can be used directly, but it is mostly used as a base class for the other source catalog classes. The catalog data on this source is stored in the `source.data` attribute as a dict. The source catalog object is decoupled from the source catalog, it doesn't hold a reference back to it, except for a key ``_row_index`` of type ``int`` that links to the catalog table row the source information comes from. Parameters ---------- data : dict Dictionary of data from a catalog for a given source. data_extended : dict Dictionary of data from a catalog for a given source in the case where the catalog contains an extended sources table (Fermi-LAT). 
data_spectral : dict Dictionary of data from a catalog for a given source in the case where the catalog contains a spectral table (Fermi-LAT 2PC and 3PC). """ _source_name_key = "Source_Name" _row_index_key = "_row_index" def __init__(self, data, data_extended=None, data_spectral=None): self.data = Bunch(**data) if data_extended: self.data_extended = Bunch(**data_extended) self.data_spectral = Bunch(**data_spectral) if data_spectral else None def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def name(self): """Source name as a string.""" name = self.data[self._source_name_key] return name.strip() @property def row_index(self): """Row index of source in catalog as an integer.""" return self.data[self._row_index_key] @property def position(self): """Source position as an `~astropy.coordinates.SkyCoord` object.""" table = Table([self.data]) return _skycoord_from_table(table)[0] class SourceCatalog(abc.ABC): """Generic source catalog. This class can be used directly, but it is mostly used as a base class for the other source catalog classes. This is a thin wrapper around `~astropy.table.Table`, which is stored in the ``catalog.table`` attribute. Parameters ---------- table : `~astropy.table.Table` Table with catalog data. source_name_key : str Column with source name information. source_name_alias : tuple of str Columns with source name aliases. This will allow accessing the source row by alias names as well. """ @classmethod @abc.abstractmethod def description(cls): """Catalog description as a string.""" pass @property @abc.abstractmethod def tag(self): pass source_object_class = SourceCatalogObject """Source class (`SourceCatalogObject`).""" def __init__(self, table, source_name_key="Source_Name", source_name_alias=()): self.table = table self._source_name_key = source_name_key self._source_name_alias = source_name_alias def __str__(self): return ( f"{self.__class__.__name__}:\n" f" name: {self.tag}\n" f" description: {self.description}\n" f" sources: {len(self.table)}\n" ) @lazyproperty def _name_to_index_cache(self): # Make a dict for quick lookup: source name -> row index names = {} for idx, row in enumerate(self.table): name = row[self._source_name_key] names[name.strip()] = idx for alias_column in self._source_name_alias: for alias in str(row[alias_column]).split(","): if not alias == "": names[alias.strip()] = idx return names def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def row_index(self, name): """Look up row index of source by name. Parameters ---------- name : str Source name. Returns ------- index : int Row index of source in table. """ index = self._name_to_index_cache[name] row = self.table[index] # check if name lookup is correct other wise recompute _name_to_index_cache possible_names = [row[self._source_name_key]] for alias_column in self._source_name_alias: possible_names += str(row[alias_column]).split(",") if name not in possible_names: self.__dict__.pop("_name_to_index_cache") index = self._name_to_index_cache[name] return index def source_name(self, index): """Look up source name by row index. Parameters ---------- index : int Row index of source in table. """ source_name_col = self.table[self._source_name_key] name = source_name_col[index] return name.strip() def __getitem__(self, key): """Get source by name. Parameters ---------- key : str or int Source name or row index. Returns ------- source : `SourceCatalogObject` An object representing one source. """ if isinstance(key, str): index = self.row_index(key) elif isinstance(key, numbers.Integral): index = key elif isinstance(key, np.ndarray) and key.dtype == bool: new = deepcopy(self) new.table = self.table[key] return new else: raise TypeError(f"Invalid key: {key!r}, {type(key)}\n") return self._make_source_object(index) def _make_source_object(self, index): """Make one source object. Parameters ---------- index : int Row index. Returns ------- source : `SourceCatalogObject` Source object. """ data = table_row_to_dict(self.table[index]) data[SourceCatalogObject._row_index_key] = index fp_energy_edges = getattr(self, "flux_points_energy_edges", None) if fp_energy_edges is not None: data["fp_energy_edges"] = fp_energy_edges hist_table = getattr(self, "hist_table", None) hist2_table = getattr(self, "hist2_table", None) if hist_table: try: data["time_axis"] = TimeMapAxis.from_table( hist_table, format="fermi-fgl" ) except KeyError: pass if hist2_table: try: data["time_axis_2"] = TimeMapAxis.from_table( hist2_table, format="fermi-fgl" ) except KeyError: pass if "Extended_Source_Name" in data: name_extended = data["Extended_Source_Name"].strip() elif "Source_Name" in data: name_extended = data["Source_Name"].strip() else: name_extended = None name_spectral = self._get_name_spectral(data) try: idx = self._lookup_additional_table( self.extended_sources_table[self._source_name_key] )[name_extended] data_extended = table_row_to_dict(self.extended_sources_table[idx]) except (KeyError, AttributeError): data_extended = None try: idx = self._lookup_additional_table( self.spectral_table[self._get_source_name_key] )[name_spectral] data_spectral = table_row_to_dict(self.spectral_table[idx]) except (KeyError, AttributeError): data_spectral = None source = self.source_object_class(data, data_extended, data_spectral) return source @property def _get_source_name_key(self): return self._source_name_key def _get_name_spectral(self, data): if "Source_name" in data: name_spectral = data["Source_name"].strip() else: name_spectral = None return name_spectral def _lookup_additional_table(self, selected_table): """""" names = [_.strip() for _ in selected_table] idx = range(len(names)) return dict(zip(names, idx)) @property def positions(self): """Source positions as a `~astropy.coordinates.SkyCoord` object.""" return _skycoord_from_table(self.table) def to_models(self, **kwargs): """Create Models object from catalog.""" return Models([_.sky_model(**kwargs) for _ in self]) def _skycoord_from_table(table): keys = 
table.colnames if {"RAJ2000", "DEJ2000"}.issubset(keys): lon, lat, frame = "RAJ2000", "DEJ2000", "icrs" elif {"RAJ2000", "DECJ2000"}.issubset(keys): lon, lat, frame = "RAJ2000", "DECJ2000", "fk5" elif {"RA", "DEC"}.issubset(keys): lon, lat, frame = "RA", "DEC", "icrs" elif {"ra", "dec"}.issubset(keys): lon, lat, frame = "ra", "dec", "icrs" elif {"RAJD", "DECJD"}.issubset(keys): lon, lat, frame = "RAJD", "DECJD", "icrs" else: raise KeyError("No column GLON / GLAT or RA / DEC or RAJ2000 / DEJ2000 found.") unit = table[lon].unit.to_string() if table[lon].unit else "deg" return SkyCoord(table[lon], table[lat], unit=unit, frame=frame) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/fermi.py0000644000175100001770000021553614721316200017146 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Fermi catalog and source classes.""" import abc import logging import warnings import numpy as np import astropy.units as u from astropy.table import Table from astropy.wcs import FITSFixedWarning from gammapy.estimators import FluxPoints from gammapy.maps import MapAxis, Maps, RegionGeom from gammapy.modeling.models import ( DiskSpatialModel, GaussianSpatialModel, Model, Models, PointSpatialModel, SkyModel, TemplateSpatialModel, ) from gammapy.utils.gauss import Gauss2DPDF from gammapy.utils.scripts import make_path from gammapy.utils.table import table_standardise_units_inplace from .core import SourceCatalog, SourceCatalogObject, format_flux_points_table __all__ = [ "SourceCatalog2FHL", "SourceCatalog3FGL", "SourceCatalog3FHL", "SourceCatalog4FGL", "SourceCatalog2PC", "SourceCatalog3PC", "SourceCatalogObject2FHL", "SourceCatalogObject3FGL", "SourceCatalogObject3FHL", "SourceCatalogObject4FGL", "SourceCatalogObject2PC", "SourceCatalogObject3PC", ] log = logging.getLogger(__name__) def compute_flux_points_ul(quantity, quantity_errp): """Compute UL value for fermi flux points. See https://arxiv.org/pdf/1501.02003.pdf (page 30). 
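Examples
--------
A minimal sketch with illustrative numbers::

    import astropy.units as u

    flux = u.Quantity(1e-9, "cm-2 s-1")
    flux_errp = u.Quantity(2e-10, "cm-2 s-1")
    flux_ul = compute_flux_points_ul(flux, flux_errp)  # flux + 2 * flux_errp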
""" return 2 * quantity_errp + quantity class SourceCatalogObjectFermiPCBase(SourceCatalogObject, abc.ABC): """Base class for Fermi-LAT Pulsar catalogs.""" def __str__(self): return self.info() def info(self, info="all"): if info == "all": info = "basic,more,position,pulsar,spectral,lightcurve" ss = "" ops = info.split(",") if "basic" in ops: ss += self._info_basic() if "more" in ops: ss += self._info_more() if "pulsar" in ops: ss += self._info_pulsar() if "position" in ops: ss += self._info_position() if "spectral" in ops: ss += self._info_spectral_fit() ss += self._info_spectral_points() if "lightcurve" in ops: ss += self._info_phasogram() return ss def _info_basic(self): ss = "\n*** Basic info ***\n\n" ss += "Catalog row index (zero-based) : {}\n".format(self.row_index) ss += "{:<20s} : {}\n".format("Source name", self.name) return ss def _info_more(self): return "" def _info_pulsar(self): return "\n" def _info_position(self): source_pos = self.position ss = "\n*** Position info ***\n\n" ss += "{:<20s} : {:.3f}\n".format("RA", source_pos.ra) ss += "{:<20s} : {:.3f}\n".format("DEC", source_pos.dec) ss += "{:<20s} : {:.3f}\n".format("GLON", source_pos.galactic.l) ss += "{:<20s} : {:.3f}\n".format("GLAT", source_pos.galactic.b) return ss def _info_spectral_fit(self): return "\n" def _info_spectral_points(self): ss = "\n*** Spectral points ***\n\n" if self.flux_points_table is None: ss += "No spectral points available.\n" return ss lines = format_flux_points_table(self.flux_points_table).pformat( max_width=-1, max_lines=-1 ) ss += "\n".join(lines) ss += "\n" return ss def _info_phasogram(self): return "" def spatial_model(self): source_pos = self.position ra = source_pos.ra dec = source_pos.dec model = PointSpatialModel(lon_0=ra, lat_0=dec, frame="icrs") return model def sky_model(self, name=None): """Sky model (`~gammapy.modeling.models.SkyModel`).""" spectral_model = self.spectral_model() if spectral_model is None: return None if name is None: name = self.name return SkyModel( spatial_model=self.spatial_model(), spectral_model=spectral_model, name=name, ) @property def flux_points(self): """Flux points (`~gammapy.estimators.FluxPoints`).""" if self.flux_points_table is None: return None return FluxPoints.from_table( table=self.flux_points_table, reference_model=self.sky_model(), format="gadf-sed", ) @property def lightcurve(self): """Light-curve.""" pass class SourceCatalogObjectFermiBase(SourceCatalogObject, abc.ABC): """Base class for Fermi-LAT catalogs.""" asso = ["ASSOC1", "ASSOC2", "ASSOC_TEV", "ASSOC_GAM1", "ASSOC_GAM2", "ASSOC_GAM3"] flux_points_meta = { "sed_type_init": "flux", "n_sigma": 1, "sqrt_ts_threshold_ul": 1, "n_sigma_ul": 2, } def __str__(self): return self.info() def info(self, info="all"): """Summary information string. Parameters ---------- info : {'all', 'basic', 'more', 'position', 'spectral', 'lightcurve'} Comma separated list of options. 
""" if info == "all": info = "basic,more,position,spectral,lightcurve" ss = "" ops = info.split(",") if "basic" in ops: ss += self._info_basic() if "more" in ops: ss += self._info_more() if "position" in ops: ss += self._info_position() if not self.is_pointlike: ss += self._info_morphology() if "spectral" in ops: ss += self._info_spectral_fit() ss += self._info_spectral_points() if "lightcurve" in ops: ss += self._info_lightcurve() return ss def _info_basic(self): d = self.data keys = self.asso ss = "\n*** Basic info ***\n\n" ss += "Catalog row index (zero-based) : {}\n".format(self.row_index) ss += "{:<20s} : {}\n".format("Source name", self.name) if "Extended_Source_Name" in d: ss += "{:<20s} : {}\n".format("Extended name", d["Extended_Source_Name"]) def get_nonentry_keys(keys): vals = [str(d[_]).strip() for _ in keys] return ", ".join([_ for _ in vals if _ != ""]) associations = get_nonentry_keys(keys) ss += "{:<16s} : {}\n".format("Associations", associations) try: ss += "{:<16s} : {:.3f}\n".format("ASSOC_PROB_BAY", d["ASSOC_PROB_BAY"]) ss += "{:<16s} : {:.3f}\n".format("ASSOC_PROB_LR", d["ASSOC_PROB_LR"]) except KeyError: pass try: ss += "{:<16s} : {}\n".format("Class1", d["CLASS1"]) except KeyError: ss += "{:<16s} : {}\n".format("Class", d["CLASS"]) try: ss += "{:<16s} : {}\n".format("Class2", d["CLASS2"]) except KeyError: pass ss += "{:<16s} : {}\n".format("TeVCat flag", d.get("TEVCAT_FLAG", "N/A")) return ss @abc.abstractmethod def _info_more(self): pass def _info_position(self): d = self.data ss = "\n*** Position info ***\n\n" ss += "{:<20s} : {:.3f}\n".format("RA", d["RAJ2000"]) ss += "{:<20s} : {:.3f}\n".format("DEC", d["DEJ2000"]) ss += "{:<20s} : {:.3f}\n".format("GLON", d["GLON"]) ss += "{:<20s} : {:.3f}\n".format("GLAT", d["GLAT"]) ss += "\n" ss += "{:<20s} : {:.4f}\n".format("Semimajor (68%)", d["Conf_68_SemiMajor"]) ss += "{:<20s} : {:.4f}\n".format("Semiminor (68%)", d["Conf_68_SemiMinor"]) ss += "{:<20s} : {:.2f}\n".format("Position angle (68%)", d["Conf_68_PosAng"]) ss += "{:<20s} : {:.4f}\n".format("Semimajor (95%)", d["Conf_95_SemiMajor"]) ss += "{:<20s} : {:.4f}\n".format("Semiminor (95%)", d["Conf_95_SemiMinor"]) ss += "{:<20s} : {:.2f}\n".format("Position angle (95%)", d["Conf_95_PosAng"]) ss += "{:<20s} : {:.0f}\n".format("ROI number", d["ROI_num"]) return ss def _info_morphology(self): e = self.data_extended ss = "\n*** Extended source information ***\n\n" ss += "{:<16s} : {}\n".format("Model form", e["Model_Form"]) ss += "{:<16s} : {:.4f}\n".format("Model semimajor", e["Model_SemiMajor"]) ss += "{:<16s} : {:.4f}\n".format("Model semiminor", e["Model_SemiMinor"]) ss += "{:<16s} : {:.4f}\n".format("Position angle", e["Model_PosAng"]) try: ss += "{:<16s} : {}\n".format("Spatial function", e["Spatial_Function"]) except KeyError: pass ss += "{:<16s} : {}\n\n".format("Spatial filename", e["Spatial_Filename"]) return ss def _info_spectral_fit(self): return "\n" def _info_spectral_points(self): ss = "\n*** Spectral points ***\n\n" lines = format_flux_points_table(self.flux_points_table).pformat( max_width=-1, max_lines=-1 ) ss += "\n".join(lines) return ss def _info_lightcurve(self): return "\n" @property def is_pointlike(self): return self.data["Extended_Source_Name"].strip() == "" # FIXME: this should be renamed `set_position_error`, # and `phi_0` isn't filled correctly, other parameters missing # see https://github.com/gammapy/gammapy/pull/2533#issuecomment-553329049 def _set_spatial_errors(self, model): d = self.data if "Pos_err_68" in d: percent = 0.68 semi_minor = 
d["Pos_err_68"] semi_major = d["Pos_err_68"] phi_0 = 0.0 else: percent = 0.95 semi_minor = d["Conf_95_SemiMinor"] semi_major = d["Conf_95_SemiMajor"] phi_0 = d["Conf_95_PosAng"] if np.isnan(phi_0): phi_0 = 0.0 * u.deg scale_1sigma = Gauss2DPDF().containment_radius(percent) lat_err = semi_major / scale_1sigma lon_err = semi_minor / scale_1sigma / np.cos(d["DEJ2000"]) if "TemplateSpatialModel" not in model.tag: model.parameters["lon_0"].error = lon_err model.parameters["lat_0"].error = lat_err model.phi_0 = phi_0 def sky_model(self, name=None): """Sky model as a `~gammapy.modeling.models.SkyModel` object.""" if name is None: name = self.name return SkyModel( spatial_model=self.spatial_model(), spectral_model=self.spectral_model(), name=name, ) @property def flux_points(self): """Flux points as a `~gammapy.estimators.FluxPoints` object.""" return FluxPoints.from_table( table=self.flux_points_table, reference_model=self.sky_model(), format="gadf-sed", ) class SourceCatalogObject4FGL(SourceCatalogObjectFermiBase): """One source from the Fermi-LAT 4FGL catalog. Catalog is represented by `~gammapy.catalog.SourceCatalog4FGL`. """ asso = [ "ASSOC1", "ASSOC2", "ASSOC_TEV", "ASSOC_FGL", "ASSOC_FHL", "ASSOC_GAM1", "ASSOC_GAM2", "ASSOC_GAM3", ] def _info_more(self): d = self.data ss = "\n*** Other info ***\n\n" fmt = "{:<32s} : {:.3f}\n" ss += fmt.format("Significance (100 MeV - 1 TeV)", d["Signif_Avg"]) ss += "{:<32s} : {:.1f}\n".format("Npred", d["Npred"]) ss += "\n{:<20s} : {}\n".format("Other flags", d["Flags"]) return ss def _info_spectral_fit(self): d = self.data spec_type = d["SpectrumType"].strip() ss = "\n*** Spectral info ***\n\n" ss += "{:<45s} : {}\n".format("Spectrum type", d["SpectrumType"]) fmt = "{:<45s} : {:.3f}\n" ss += fmt.format("Detection significance (100 MeV - 1 TeV)", d["Signif_Avg"]) if spec_type == "PowerLaw": tag = "PL" elif spec_type == "LogParabola": tag = "LP" ss += "{:<45s} : {:.4f} +- {:.5f}\n".format( "beta", d["LP_beta"], d["Unc_LP_beta"] ) ss += "{:<45s} : {:.1f}\n".format("Significance curvature", d["LP_SigCurv"]) elif spec_type == "PLSuperExpCutoff": tag = "PLEC" fmt = "{:<45s} : {:.4f} +- {:.4f}\n" if "PLEC_ExpfactorS" in d: ss += fmt.format( "Exponential factor", d["PLEC_ExpfactorS"], d["Unc_PLEC_ExpfactorS"] ) else: ss += fmt.format( "Exponential factor", d["PLEC_Expfactor"], d["Unc_PLEC_Expfactor"] ) ss += "{:<45s} : {:.4f} +- {:.4f}\n".format( "Super-exponential cutoff index", d["PLEC_Exp_Index"], d["Unc_PLEC_Exp_Index"], ) ss += "{:<45s} : {:.1f}\n".format( "Significance curvature", d["PLEC_SigCurv"] ) else: raise ValueError(f"Invalid spec_type: {spec_type!r}") ss += "{:<45s} : {:.0f} {}\n".format( "Pivot energy", d["Pivot_Energy"].value, d["Pivot_Energy"].unit ) fmt = "{:<45s} : {:.3f} +- {:.3f}\n" if f"{tag}_ExpfactorS" in d: ss += fmt.format( "Spectral index", d[tag + "_IndexS"], d["Unc_" + tag + "_IndexS"] ) else: ss += fmt.format( "Spectral index", d[tag + "_Index"], d["Unc_" + tag + "_Index"] ) fmt = "{:<45s} : {:.3} +- {:.3} {}\n" ss += fmt.format( "Flux Density at pivot energy", d[tag + "_Flux_Density"].value, d["Unc_" + tag + "_Flux_Density"].value, "cm-2 MeV-1 s-1", ) fmt = "{:<45s} : {:.3} +- {:.3} {}\n" ss += fmt.format( "Integral flux (1 - 100 GeV)", d["Flux1000"].value, d["Unc_Flux1000"].value, "cm-2 s-1", ) fmt = "{:<45s} : {:.3} +- {:.3} {}\n" ss += fmt.format( "Energy flux (100 MeV - 100 GeV)", d["Energy_Flux100"].value, d["Unc_Energy_Flux100"].value, "erg cm-2 s-1", ) return ss def _info_lightcurve(self): d = self.data ss = "\n*** Lightcurve 
info ***\n\n" ss += "Lightcurve measured in the energy band: 100 MeV - 100 GeV\n\n" ss += "{:<15s} : {:.3f}\n".format("Variability index", d["Variability_Index"]) if np.isfinite(d["Flux_Peak"]): ss += "{:<40s} : {:.3f}\n".format( "Significance peak (100 MeV - 100 GeV)", d["Signif_Peak"] ) fmt = "{:<40s} : {:.3} +- {:.3} cm^-2 s^-1\n" ss += fmt.format( "Integral flux peak (100 MeV - 100 GeV)", d["Flux_Peak"].value, d["Unc_Flux_Peak"].value, ) # TODO: give time as UTC string, not MET ss += "{:<40s} : {:.3} s (Mission elapsed time)\n".format( "Time peak", d["Time_Peak"].value ) peak_interval = d["Peak_Interval"].to_value("day") ss += "{:<40s} : {:.3} day\n".format("Peak interval", peak_interval) else: ss += "\nNo peak measured for this source.\n" # TODO: Add a lightcurve table with d['Flux_History'] and d['Unc_Flux_History'] return ss def spatial_model(self): """Spatial model as a `~gammapy.modeling.models.SpatialModel` object.""" d = self.data ra = d["RAJ2000"] dec = d["DEJ2000"] if self.is_pointlike: model = PointSpatialModel(lon_0=ra, lat_0=dec, frame="icrs") else: de = self.data_extended morph_type = de["Model_Form"].strip() e = (1 - (de["Model_SemiMinor"] / de["Model_SemiMajor"]) ** 2.0) ** 0.5 sigma = de["Model_SemiMajor"] phi = de["Model_PosAng"] if morph_type == "Disk": r_0 = de["Model_SemiMajor"] model = DiskSpatialModel( lon_0=ra, lat_0=dec, r_0=r_0, e=e, phi=phi, frame="icrs" ) elif morph_type in ["Map", "Ring", "2D Gaussian x2"]: filename = de["Spatial_Filename"].strip() + ".gz" if de["version"] < 28: path_extended = "$GAMMAPY_DATA/catalogs/fermi/LAT_extended_sources_8years/Templates/" elif de["version"] < 32: path_extended = ( "$GAMMAPY_DATA/catalogs/fermi/Extended_12years/Templates/" ) else: path_extended = ( "$GAMMAPY_DATA/catalogs/fermi/Extended_14years/Templates/" ) path = make_path(path_extended) with warnings.catch_warnings(): # ignore FITS units warnings warnings.simplefilter("ignore", FITSFixedWarning) model = TemplateSpatialModel.read(path / filename) elif morph_type == "2D Gaussian": model = GaussianSpatialModel( lon_0=ra, lat_0=dec, sigma=sigma, e=e, phi=phi, frame="icrs" ) else: raise ValueError(f"Invalid spatial model: {morph_type!r}") self._set_spatial_errors(model) return model def spectral_model(self): """Best fit spectral model as a `~gammapy.modeling.models.SpectralModel` object.""" spec_type = self.data["SpectrumType"].strip() if spec_type == "PowerLaw": tag = "PowerLawSpectralModel" pars = { "reference": self.data["Pivot_Energy"], "amplitude": self.data["PL_Flux_Density"], "index": self.data["PL_Index"], } errs = { "amplitude": self.data["Unc_PL_Flux_Density"], "index": self.data["Unc_PL_Index"], } elif spec_type == "LogParabola": tag = "LogParabolaSpectralModel" pars = { "reference": self.data["Pivot_Energy"], "amplitude": self.data["LP_Flux_Density"], "alpha": self.data["LP_Index"], "beta": self.data["LP_beta"], } errs = { "amplitude": self.data["Unc_LP_Flux_Density"], "alpha": self.data["Unc_LP_Index"], "beta": self.data["Unc_LP_beta"], } elif spec_type == "PLSuperExpCutoff": if "PLEC_ExpfactorS" in self.data: tag = "SuperExpCutoffPowerLaw4FGLDR3SpectralModel" expfactor = self.data["PLEC_ExpfactorS"] expfactor_err = self.data["Unc_PLEC_ExpfactorS"] index_1 = self.data["PLEC_IndexS"] index_1_err = self.data["Unc_PLEC_IndexS"] else: tag = "SuperExpCutoffPowerLaw4FGLSpectralModel" expfactor = self.data["PLEC_Expfactor"] expfactor_err = self.data["Unc_PLEC_Expfactor"] index_1 = self.data["PLEC_Index"] index_1_err = self.data["Unc_PLEC_Index"] pars = {
"reference": self.data["Pivot_Energy"], "amplitude": self.data["PLEC_Flux_Density"], "index_1": index_1, "index_2": self.data["PLEC_Exp_Index"], "expfactor": expfactor, } errs = { "amplitude": self.data["Unc_PLEC_Flux_Density"], "index_1": index_1_err, "index_2": np.nan_to_num(float(self.data["Unc_PLEC_Exp_Index"])), "expfactor": expfactor_err, } else: raise ValueError(f"Invalid spec_type: {spec_type!r}") model = Model.create(tag, "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model @property def flux_points_table(self): """Flux points as a `~astropy.table.Table`.""" table = Table() table.meta.update(self.flux_points_meta) table["e_min"] = self.data["fp_energy_edges"][:-1] table["e_max"] = self.data["fp_energy_edges"][1:] flux = self._get_flux_values("Flux_Band") flux_err = self._get_flux_values("Unc_Flux_Band") table["flux"] = flux table["flux_errn"] = np.abs(flux_err[:, 0]) table["flux_errp"] = flux_err[:, 1] nuFnu = self._get_flux_values("nuFnu_Band", "erg cm-2 s-1") table["e2dnde"] = nuFnu table["e2dnde_errn"] = np.abs(nuFnu * flux_err[:, 0] / flux) table["e2dnde_errp"] = nuFnu * flux_err[:, 1] / flux is_ul = np.isnan(table["flux_errn"]) table["is_ul"] = is_ul # handle upper limits table["flux_ul"] = np.nan * flux_err.unit flux_ul = compute_flux_points_ul(table["flux"], table["flux_errp"]) table["flux_ul"][is_ul] = flux_ul[is_ul] # handle upper limits table["e2dnde_ul"] = np.nan * nuFnu.unit e2dnde_ul = compute_flux_points_ul(table["e2dnde"], table["e2dnde_errp"]) table["e2dnde_ul"][is_ul] = e2dnde_ul[is_ul] # Square root of test statistic table["sqrt_ts"] = self.data["Sqrt_TS_Band"] return table def _get_flux_values(self, prefix, unit="cm-2 s-1"): values = self.data[prefix] return u.Quantity(values, unit) def lightcurve(self, interval="1-year"): """Lightcurve as a `~gammapy.estimators.FluxPoints` object. Parameters ---------- interval : {'1-year', '2-month'} Time interval of the lightcurve. Default is '1-year'. Note that '2-month' is not available for all catalog versions.
""" if interval == "1-year": tag = "Flux_History" if tag not in self.data or "time_axis" not in self.data: raise ValueError( "'1-year' interval is not available for this catalog version" ) time_axis = self.data["time_axis"] tag_sqrt_ts = "Sqrt_TS_History" elif interval == "2-month": tag = "Flux2_History" if tag not in self.data or "time_axis_2" not in self.data: raise ValueError( "'2-month' interval is not available for this catalog version" ) time_axis = self.data["time_axis_2"] tag_sqrt_ts = "Sqrt_TS2_History" else: raise ValueError("Time intervals available are '1-year' or '2-month'") energy_axis = MapAxis.from_energy_edges([50, 300000] * u.MeV) geom = RegionGeom.create(region=self.position, axes=[energy_axis, time_axis]) names = ["flux", "flux_errp", "flux_errn", "flux_ul", "ts"] maps = Maps.from_geom(geom=geom, names=names) maps["flux"].quantity = self.data[tag].reshape(geom.data_shape) maps["flux_errp"].quantity = self.data[f"Unc_{tag}"][:, 1].reshape( geom.data_shape ) maps["flux_errn"].quantity = -self.data[f"Unc_{tag}"][:, 0].reshape( geom.data_shape ) maps["flux_ul"].quantity = compute_flux_points_ul( maps["flux"].quantity, maps["flux_errp"].quantity ).reshape(geom.data_shape) maps["ts"].quantity = (self.data[tag_sqrt_ts] ** 2).reshape(geom.data_shape) return FluxPoints.from_maps( maps=maps, sed_type="flux", reference_model=self.sky_model(), meta=self.flux_points.meta.copy(), ) class SourceCatalogObject3FGL(SourceCatalogObjectFermiBase): """One source from the Fermi-LAT 3FGL catalog. Catalog is represented by `~gammapy.catalog.SourceCatalog3FGL`. """ _energy_edges = u.Quantity([100, 300, 1000, 3000, 10000, 100000], "MeV") _energy_edges_suffix = [ "100_300", "300_1000", "1000_3000", "3000_10000", "10000_100000", ] energy_range = u.Quantity([100, 100000], "MeV") """Energy range used for the catalog. The paper says that the analysis uses data up to 300 GeV, but results are all quoted up to 100 GeV only, to be consistent with previous catalogs.
""" def _info_more(self): d = self.data ss = "\n*** Other info ***\n\n" ss += "{:<20s} : {}\n".format("Other flags", d["Flags"]) return ss def _info_spectral_fit(self): d = self.data spec_type = d["SpectrumType"].strip() ss = "\n*** Spectral info ***\n\n" ss += "{:<45s} : {}\n".format("Spectrum type", d["SpectrumType"]) fmt = "{:<45s} : {:.3f}\n" ss += fmt.format("Detection significance (100 MeV - 300 GeV)", d["Signif_Avg"]) ss += "{:<45s} : {:.1f}\n".format("Significance curvature", d["Signif_Curve"]) if spec_type == "PowerLaw": pass elif spec_type == "LogParabola": ss += "{:<45s} : {} +- {}\n".format("beta", d["beta"], d["Unc_beta"]) elif spec_type in ["PLExpCutoff", "PlSuperExpCutoff"]: fmt = "{:<45s} : {:.0f} +- {:.0f} {}\n" ss += fmt.format( "Cutoff energy", d["Cutoff"].value, d["Unc_Cutoff"].value, d["Cutoff"].unit, ) elif spec_type == "PLSuperExpCutoff": ss += "{:<45s} : {} +- {}\n".format( "Super-exponential cutoff index", d["Exp_Index"], d["Unc_Exp_Index"] ) else: raise ValueError(f"Invalid spec_type: {spec_type!r}") ss += "{:<45s} : {:.0f} {}\n".format( "Pivot energy", d["Pivot_Energy"].value, d["Pivot_Energy"].unit ) ss += "{:<45s} : {:.3f}\n".format( "Power law spectral index", d["PowerLaw_Index"] ) fmt = "{:<45s} : {:.3f} +- {:.3f}\n" ss += fmt.format("Spectral index", d["Spectral_Index"], d["Unc_Spectral_Index"]) fmt = "{:<45s} : {:.3} +- {:.3} {}\n" ss += fmt.format( "Flux Density at pivot energy", d["Flux_Density"].value, d["Unc_Flux_Density"].value, "cm-2 MeV-1 s-1", ) fmt = "{:<45s} : {:.3} +- {:.3} {}\n" ss += fmt.format( "Integral flux (1 - 100 GeV)", d["Flux1000"].value, d["Unc_Flux1000"].value, "cm-2 s-1", ) fmt = "{:<45s} : {:.3} +- {:.3} {}\n" ss += fmt.format( "Energy flux (100 MeV - 100 GeV)", d["Energy_Flux100"].value, d["Unc_Energy_Flux100"].value, "erg cm-2 s-1", ) return ss def _info_lightcurve(self): d = self.data ss = "\n*** Lightcurve info ***\n\n" ss += "Lightcurve measured in the energy band: 100 MeV - 100 GeV\n\n" ss += "{:<15s} : {:.3f}\n".format("Variability index", d["Variability_Index"]) if np.isfinite(d["Flux_Peak"]): ss += "{:<40s} : {:.3f}\n".format( "Significance peak (100 MeV - 100 GeV)", d["Signif_Peak"] ) fmt = "{:<40s} : {:.3} +- {:.3} cm^-2 s^-1\n" ss += fmt.format( "Integral flux peak (100 MeV - 100 GeV)", d["Flux_Peak"].value, d["Unc_Flux_Peak"].value, ) # TODO: give time as UTC string, not MET ss += "{:<40s} : {:.3} s (Mission elapsed time)\n".format( "Time peak", d["Time_Peak"].value ) peak_interval = d["Peak_Interval"].to_value("day") ss += "{:<40s} : {:.3} day\n".format("Peak interval", peak_interval) else: ss += "\nNo peak measured for this source.\n" # TODO: Add a lightcurve table with d['Flux_History'] and d['Unc_Flux_History'] return ss def spectral_model(self): """Best fit spectral model as a `~gammapy.modeling.models.SpectralModel` object.""" spec_type = self.data["SpectrumType"].strip() if spec_type == "PowerLaw": tag = "PowerLawSpectralModel" pars = { "amplitude": self.data["Flux_Density"], "reference": self.data["Pivot_Energy"], "index": self.data["Spectral_Index"], } errs = { "amplitude": self.data["Unc_Flux_Density"], "index": self.data["Unc_Spectral_Index"], } elif spec_type == "PLExpCutoff": tag = "ExpCutoffPowerLaw3FGLSpectralModel" pars = { "amplitude": self.data["Flux_Density"], "reference": self.data["Pivot_Energy"], "index": self.data["Spectral_Index"], "ecut": self.data["Cutoff"], } errs = { "amplitude": self.data["Unc_Flux_Density"], "index": self.data["Unc_Spectral_Index"], "ecut": self.data["Unc_Cutoff"], } elif 
spec_type == "LogParabola": tag = "LogParabolaSpectralModel" pars = { "amplitude": self.data["Flux_Density"], "reference": self.data["Pivot_Energy"], "alpha": self.data["Spectral_Index"], "beta": self.data["beta"], } errs = { "amplitude": self.data["Unc_Flux_Density"], "alpha": self.data["Unc_Spectral_Index"], "beta": self.data["Unc_beta"], } elif spec_type == "PLSuperExpCutoff": tag = "SuperExpCutoffPowerLaw3FGLSpectralModel" pars = { "amplitude": self.data["Flux_Density"], "reference": self.data["Pivot_Energy"], "index_1": self.data["Spectral_Index"], "index_2": self.data["Exp_Index"], "ecut": self.data["Cutoff"], } errs = { "amplitude": self.data["Unc_Flux_Density"], "index_1": self.data["Unc_Spectral_Index"], "index_2": self.data["Unc_Exp_Index"], "ecut": self.data["Unc_Cutoff"], } else: raise ValueError(f"Invalid spec_type: {spec_type!r}") model = Model.create(tag, "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def spatial_model(self): """Spatial model as a `~gammapy.modeling.models.SpatialModel` object.""" d = self.data ra = d["RAJ2000"] dec = d["DEJ2000"] if self.is_pointlike: model = PointSpatialModel(lon_0=ra, lat_0=dec, frame="icrs") else: de = self.data_extended morph_type = de["Model_Form"].strip() e = (1 - (de["Model_SemiMinor"] / de["Model_SemiMajor"]) ** 2.0) ** 0.5 sigma = de["Model_SemiMajor"] phi = de["Model_PosAng"] if morph_type == "Disk": r_0 = de["Model_SemiMajor"] model = DiskSpatialModel( lon_0=ra, lat_0=dec, r_0=r_0, e=e, phi=phi, frame="icrs" ) elif morph_type in ["Map", "Ring", "2D Gaussian x2"]: filename = de["Spatial_Filename"].strip() path = make_path( "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v15/Templates/" ) model = TemplateSpatialModel.read(path / filename) elif morph_type == "2D Gaussian": model = GaussianSpatialModel( lon_0=ra, lat_0=dec, sigma=sigma, e=e, phi=phi, frame="icrs" ) else: raise ValueError(f"Invalid spatial model: {morph_type!r}") self._set_spatial_errors(model) return model @property def flux_points_table(self): """Flux points as a `~astropy.table.Table`.""" table = Table() table.meta.update(self.flux_points_meta) table["e_min"] = self._energy_edges[:-1] table["e_max"] = self._energy_edges[1:] flux = self._get_flux_values("Flux") flux_err = self._get_flux_values("Unc_Flux") table["flux"] = flux table["flux_errn"] = np.abs(flux_err[:, 0]) table["flux_errp"] = flux_err[:, 1] nuFnu = self._get_flux_values("nuFnu", "erg cm-2 s-1") table["e2dnde"] = nuFnu table["e2dnde_errn"] = np.abs(nuFnu * flux_err[:, 0] / flux) table["e2dnde_errp"] = nuFnu * flux_err[:, 1] / flux is_ul = np.isnan(table["flux_errn"]) table["is_ul"] = is_ul # handle upper limits table["flux_ul"] = np.nan * flux_err.unit flux_ul = compute_flux_points_ul(table["flux"], table["flux_errp"]) table["flux_ul"][is_ul] = flux_ul[is_ul] # handle upper limits table["e2dnde_ul"] = np.nan * nuFnu.unit e2dnde_ul = compute_flux_points_ul(table["e2dnde"], table["e2dnde_errp"]) table["e2dnde_ul"][is_ul] = e2dnde_ul[is_ul] # Square root of test statistic table["sqrt_ts"] = [self.data["Sqrt_TS" + _] for _ in self._energy_edges_suffix] return table def _get_flux_values(self, prefix, unit="cm-2 s-1"): values = [self.data[prefix + _] for _ in self._energy_edges_suffix] return u.Quantity(values, unit) def lightcurve(self): """Lightcurve as a `~gammapy.estimators.FluxPoints` object.""" time_axis = self.data["time_axis"] tag = "Flux_History" energy_axis = MapAxis.from_energy_edges(self.energy_range) geom = 
RegionGeom.create(region=self.position, axes=[energy_axis, time_axis]) names = ["flux", "flux_errp", "flux_errn", "flux_ul"] maps = Maps.from_geom(geom=geom, names=names) maps["flux"].quantity = self.data[tag].reshape(geom.data_shape) maps["flux_errp"].quantity = self.data[f"Unc_{tag}"][:, 1].reshape( geom.data_shape ) maps["flux_errn"].quantity = -self.data[f"Unc_{tag}"][:, 0].reshape( geom.data_shape ) maps["flux_ul"].quantity = compute_flux_points_ul( maps["flux"].quantity, maps["flux_errp"].quantity ).reshape(geom.data_shape) is_ul = np.isnan(maps["flux_errn"]) maps["flux_ul"].data[~is_ul] = np.nan return FluxPoints.from_maps( maps=maps, sed_type="flux", reference_model=self.sky_model(), meta=self.flux_points_meta.copy(), ) class SourceCatalogObject2FHL(SourceCatalogObjectFermiBase): """One source from the Fermi-LAT 2FHL catalog. Catalog is represented by `~gammapy.catalog.SourceCatalog2FHL`. """ asso = ["ASSOC", "3FGL_Name", "1FHL_Name", "TeVCat_Name"] _energy_edges = u.Quantity([50, 171, 585, 2000], "GeV") _energy_edges_suffix = ["50_171", "171_585", "585_2000"] energy_range = u.Quantity([0.05, 2], "TeV") """Energy range used for the catalog.""" def _info_more(self): d = self.data ss = "\n*** Other info ***\n\n" fmt = "{:<32s} : {:.3f}\n" ss += fmt.format("Test statistic (50 GeV - 2 TeV)", d["TS"]) return ss def _info_position(self): d = self.data ss = "\n*** Position info ***\n\n" ss += "{:<20s} : {:.3f}\n".format("RA", d["RAJ2000"]) ss += "{:<20s} : {:.3f}\n".format("DEC", d["DEJ2000"]) ss += "{:<20s} : {:.3f}\n".format("GLON", d["GLON"]) ss += "{:<20s} : {:.3f}\n".format("GLAT", d["GLAT"]) ss += "\n" ss += "{:<20s} : {:.4f}\n".format("Error on position (68%)", d["Pos_err_68"]) ss += "{:<20s} : {:.0f}\n".format("ROI number", d["ROI"]) return ss def _info_spectral_fit(self): d = self.data ss = "\n*** Spectral fit info ***\n\n" fmt = "{:<32s} : {:.3f} +- {:.3f}\n" ss += fmt.format( "Power-law spectral index", d["Spectral_Index"], d["Unc_Spectral_Index"] ) ss += "{:<32s} : {:.3} +- {:.3} {}\n".format( "Integral flux (50 GeV - 2 TeV)", d["Flux50"].value, d["Unc_Flux50"].value, "cm-2 s-1", ) ss += "{:<32s} : {:.3} +- {:.3} {}\n".format( "Energy flux (50 GeV - 2 TeV)", d["Energy_Flux50"].value, d["Unc_Energy_Flux50"].value, "erg cm-2 s-1", ) return ss @property def is_pointlike(self): return self.data["Source_Name"].strip()[-1] != "e" def spatial_model(self): """Spatial model as a `~gammapy.modeling.models.SpatialModel` object.""" d = self.data ra = d["RAJ2000"] dec = d["DEJ2000"] if self.is_pointlike: model = PointSpatialModel(lon_0=ra, lat_0=dec, frame="icrs") else: de = self.data_extended morph_type = de["Model_Form"].strip() e = (1 - (de["Model_SemiMinor"] / de["Model_SemiMajor"]) ** 2.0) ** 0.5 sigma = de["Model_SemiMajor"] phi = de["Model_PosAng"] if morph_type in ["Disk", "Elliptical Disk"]: r_0 = de["Model_SemiMajor"] model = DiskSpatialModel( lon_0=ra, lat_0=dec, r_0=r_0, e=e, phi=phi, frame="icrs" ) elif morph_type in ["Map", "Ring", "2D Gaussian x2"]: filename = de["Spatial_Filename"].strip() path = make_path( "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v15/Templates/" ) return TemplateSpatialModel.read(path / filename) elif morph_type in ["2D Gaussian", "Elliptical 2D Gaussian"]: model = GaussianSpatialModel( lon_0=ra, lat_0=dec, sigma=sigma, e=e, phi=phi, frame="icrs" ) else: raise ValueError(f"Invalid spatial model: {morph_type!r}") self._set_spatial_errors(model) return model def spectral_model(self): """Best fit spectral model as a 
`~gammapy.modeling.models.SpectralModel`.""" tag = "PowerLaw2SpectralModel" pars = { "amplitude": self.data["Flux50"], "emin": self.energy_range[0], "emax": self.energy_range[1], "index": self.data["Spectral_Index"], } errs = { "amplitude": self.data["Unc_Flux50"], "index": self.data["Unc_Spectral_Index"], } model = Model.create(tag, "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model @property def flux_points_table(self): """Flux points as a `~astropy.table.Table`.""" table = Table() table.meta.update(self.flux_points_meta) table["e_min"] = self._energy_edges[:-1] table["e_max"] = self._energy_edges[1:] table["flux"] = self._get_flux_values("Flux") flux_err = self._get_flux_values("Unc_Flux") table["flux_errn"] = np.abs(flux_err[:, 0]) table["flux_errp"] = flux_err[:, 1] # handle upper limits is_ul = np.isnan(table["flux_errn"]) table["is_ul"] = is_ul table["flux_ul"] = np.nan * flux_err.unit flux_ul = compute_flux_points_ul(table["flux"], table["flux_errp"]) table["flux_ul"][is_ul] = flux_ul[is_ul] return table def _get_flux_values(self, prefix, unit="cm-2 s-1"): values = [self.data[prefix + _ + "GeV"] for _ in self._energy_edges_suffix] return u.Quantity(values, unit) class SourceCatalogObject3FHL(SourceCatalogObjectFermiBase): """One source from the Fermi-LAT 3FHL catalog. Catalog is represented by `~gammapy.catalog.SourceCatalog3FHL`. """ asso = ["ASSOC1", "ASSOC2", "ASSOC_TEV", "ASSOC_GAM"] energy_range = u.Quantity([0.01, 2], "TeV") """Energy range used for the catalog.""" _energy_edges = u.Quantity([10, 20, 50, 150, 500, 2000], "GeV") def _info_position(self): d = self.data ss = "\n*** Position info ***\n\n" ss += "{:<20s} : {:.3f}\n".format("RA", d["RAJ2000"]) ss += "{:<20s} : {:.3f}\n".format("DEC", d["DEJ2000"]) ss += "{:<20s} : {:.3f}\n".format("GLON", d["GLON"]) ss += "{:<20s} : {:.3f}\n".format("GLAT", d["GLAT"]) # TODO: All sources are non-elliptical; just give one number for radius? 
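# For now, quote the 95% confidence ellipse parameters exactly as given in the catalog columns.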
ss += "\n" ss += "{:<20s} : {:.4f}\n".format("Semimajor (95%)", d["Conf_95_SemiMajor"]) ss += "{:<20s} : {:.4f}\n".format("Semiminor (95%)", d["Conf_95_SemiMinor"]) ss += "{:<20s} : {:.2f}\n".format("Position angle (95%)", d["Conf_95_PosAng"]) ss += "{:<20s} : {:.0f}\n".format("ROI number", d["ROI_num"]) return ss def _info_spectral_fit(self): d = self.data spec_type = d["SpectrumType"].strip() ss = "\n*** Spectral fit info ***\n\n" ss += "{:<32s} : {}\n".format("Spectrum type", d["SpectrumType"]) ss += "{:<32s} : {:.1f}\n".format("Significance curvature", d["Signif_Curve"]) # Power-law parameters are always given, so print them in any case fmt = "{:<32s} : {:.3f} +- {:.3f}\n" ss += fmt.format( "Power-law spectral index", d["PowerLaw_Index"], d["Unc_PowerLaw_Index"] ) if spec_type == "PowerLaw": pass elif spec_type == "LogParabola": fmt = "{:<32s} : {:.3f} +- {:.3f}\n" ss += fmt.format( "LogParabolaSpectralModel spectral index", d["Spectral_Index"], d["Unc_Spectral_Index"], ) ss += "{:<32s} : {:.3f} +- {:.3f}\n".format( "LogParabolaSpectralModel beta", d["beta"], d["Unc_beta"] ) else: raise ValueError(f"Invalid spec_type: {spec_type!r}") ss += "{:<32s} : {:.1f} {}\n".format( "Pivot energy", d["Pivot_Energy"].value, d["Pivot_Energy"].unit ) ss += "{:<32s} : {:.3} +- {:.3} {}\n".format( "Flux Density at pivot energy", d["Flux_Density"].value, d["Unc_Flux_Density"].value, "cm-2 GeV-1 s-1", ) ss += "{:<32s} : {:.3} +- {:.3} {}\n".format( "Integral flux (10 GeV - 1 TeV)", d["Flux"].value, d["Unc_Flux"].value, "cm-2 s-1", ) ss += "{:<32s} : {:.3} +- {:.3} {}\n".format( "Energy flux (10 GeV - 1 TeV)", d["Energy_Flux"].value, d["Unc_Energy_Flux"].value, "erg cm-2 s-1", ) return ss def _info_more(self): d = self.data ss = "\n*** Other info ***\n\n" fmt = "{:<32s} : {:.3f}\n" ss += fmt.format("Significance (10 GeV - 2 TeV)", d["Signif_Avg"]) ss += "{:<32s} : {:.1f}\n".format("Npred", d["Npred"]) ss += "\n{:<16s} : {:.3f} {}\n".format( "HEP Energy", d["HEP_Energy"].value, d["HEP_Energy"].unit ) ss += "{:<16s} : {:.3f}\n".format("HEP Probability", d["HEP_Prob"]) ss += "{:<16s} : {}\n".format("Bayesian Blocks", d["Variability_BayesBlocks"]) ss += "{:<16s} : {:.3f}\n".format("Redshift", d["Redshift"]) ss += "{:<16s} : {:.3} {}\n".format( "NuPeak_obs", d["NuPeak_obs"].value, d["NuPeak_obs"].unit ) return ss def spectral_model(self): """Best fit spectral model as a `~gammapy.modeling.models.SpectralModel` object.""" d = self.data spec_type = self.data["SpectrumType"].strip() if spec_type == "PowerLaw": tag = "PowerLawSpectralModel" pars = { "reference": d["Pivot_Energy"], "amplitude": d["Flux_Density"], "index": d["PowerLaw_Index"], } errs = { "amplitude": d["Unc_Flux_Density"], "index": d["Unc_PowerLaw_Index"], } elif spec_type == "LogParabola": tag = "LogParabolaSpectralModel" pars = { "reference": d["Pivot_Energy"], "amplitude": d["Flux_Density"], "alpha": d["Spectral_Index"], "beta": d["beta"], } errs = { "amplitude": d["Unc_Flux_Density"], "alpha": d["Unc_Spectral_Index"], "beta": d["Unc_beta"], } else: raise ValueError(f"Invalid spec_type: {spec_type!r}") model = Model.create(tag, "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model @property def flux_points_table(self): """Flux points as a `~astropy.table.Table`.""" table = Table() table.meta.update(self.flux_points_meta) table["e_min"] = self._energy_edges[:-1] table["e_max"] = self._energy_edges[1:] flux = self.data["Flux_Band"] flux_err = self.data["Unc_Flux_Band"] e2dnde = self.data["nuFnu"] table["flux"] 
= flux table["flux_errn"] = np.abs(flux_err[:, 0]) table["flux_errp"] = flux_err[:, 1] table["e2dnde"] = e2dnde table["e2dnde_errn"] = np.abs(e2dnde * flux_err[:, 0] / flux) table["e2dnde_errp"] = e2dnde * flux_err[:, 1] / flux is_ul = np.isnan(table["flux_errn"]) table["is_ul"] = is_ul # handle upper limits table["flux_ul"] = np.nan * flux_err.unit flux_ul = compute_flux_points_ul(table["flux"], table["flux_errp"]) table["flux_ul"][is_ul] = flux_ul[is_ul] table["e2dnde_ul"] = np.nan * e2dnde.unit e2dnde_ul = compute_flux_points_ul(table["e2dnde"], table["e2dnde_errp"]) table["e2dnde_ul"][is_ul] = e2dnde_ul[is_ul] # Square root of test statistic table["sqrt_ts"] = self.data["Sqrt_TS_Band"] return table def spatial_model(self): """Source spatial model as a `~gammapy.modeling.models.SpatialModel` object.""" d = self.data ra = d["RAJ2000"] dec = d["DEJ2000"] if self.is_pointlike: model = PointSpatialModel(lon_0=ra, lat_0=dec, frame="icrs") else: de = self.data_extended morph_type = de["Spatial_Function"].strip() e = (1 - (de["Model_SemiMinor"] / de["Model_SemiMajor"]) ** 2.0) ** 0.5 sigma = de["Model_SemiMajor"] phi = de["Model_PosAng"] if morph_type == "RadialDisk": r_0 = de["Model_SemiMajor"] model = DiskSpatialModel( lon_0=ra, lat_0=dec, r_0=r_0, e=e, phi=phi, frame="icrs" ) elif morph_type in ["SpatialMap"]: filename = de["Spatial_Filename"].strip() path = make_path( "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v18/Templates/" ) model = TemplateSpatialModel.read(path / filename) elif morph_type == "RadialGauss": model = GaussianSpatialModel( lon_0=ra, lat_0=dec, sigma=sigma, e=e, phi=phi, frame="icrs" ) else: raise ValueError(f"Invalid morph_type: {morph_type!r}") self._set_spatial_errors(model) return model class SourceCatalogObject2PC(SourceCatalogObjectFermiPCBase): """One source from the 2PC catalog.""" @property def _auxiliary_filename(self): return make_path( f"$GAMMAPY_DATA/catalogs/fermi/2PC_auxiliary/PSR{self.name}_2PC_data.fits.gz" ) def _info_more(self): d = self.data ss = "\n*** Other info ***\n\n" ss += "{:<20s} : {:s}\n".format("Binary", d["Binary"]) return ss def _info_pulsar(self): d = self.data ss = "\n*** Pulsar info ***\n\n" ss += "{:<20s} : {:.3f}\n".format("Period", d["Period"]) ss += "{:<20s} : {:.3e}\n".format("P_Dot", d["P_Dot"]) ss += "{:<20s} : {:.3e}\n".format("E_Dot", d["E_Dot"]) ss += "{:<20s} : {}\n".format("Type", d["Type"]) return ss def _info_spectral_fit(self): d = self.data_spectral ss = "\n*** Spectral info ***\n\n" if d is None: ss += "No spectral info available.\n" return ss ss += "{:<20s} : {}\n".format("On peak", d["On_Peak"]) ss += "{:<20s} : {:.0f}\n".format("TS DC", d["TS_DC"]) ss += "{:<20s} : {:.0f}\n".format("TS cutoff", d["TS_Cutoff"]) ss += "{:<20s} : {:.0f}\n".format("TS b free", d["TS_bfree"]) indentation = " " * 4 fmt_e = "{}{:<20s} : {:.3e} +- {:.3e}\n" fmt_f = "{}{:<20s} : {:.3f} +- {:.3f}\n" if not isinstance(d["PLEC1_Prefactor"], np.ma.core.MaskedConstant): ss += "\n{}* PLSuperExpCutoff b = 1 *\n\n".format(indentation) ss += fmt_e.format( indentation, "Amplitude", d["PLEC1_Prefactor"], d["Unc_PLEC1_Prefactor"] ) ss += fmt_f.format( indentation, "Index 1", d["PLEC1_Photon_Index"], d["Unc_PLEC1_Photon_Index"], ) ss += "{}{:<20s} : {:.3f}\n".format(indentation, "Index 2", 1) ss += "{}{:<20s} : {:.3f}\n".format( indentation, "Reference", d["PLEC1_Scale"] ) ss += fmt_f.format( indentation, "Ecut", d["PLEC1_Cutoff"], d["Unc_PLEC1_Cutoff"] ) if not isinstance(d["PLEC_Prefactor"], np.ma.core.MaskedConstant): ss += "\n{}* PLSuperExpCutoff b 
free *\n\n".format(indentation) ss += fmt_e.format( indentation, "Amplitude", d["PLEC_Prefactor"], d["Unc_PLEC_Prefactor"] ) ss += fmt_f.format( indentation, "Index 1", d["PLEC_Photon_Index"], d["Unc_PLEC_Photon_Index"], ) ss += fmt_f.format( indentation, "Index 2", d["PLEC_Exponential_Index"], d["Unc_PLEC_Exponential_Index"], ) ss += "{}{:<20s} : {:.3f}\n".format( indentation, "Reference", d["PLEC_Scale"] ) ss += fmt_f.format( indentation, "Ecut", d["PLEC_Cutoff"], d["Unc_PLEC_Cutoff"] ) if not isinstance(d["PL_Prefactor"], np.ma.core.MaskedConstant): ss += "\n{}* PowerLaw *\n\n".format(indentation) ss += fmt_e.format( indentation, "Amplitude", d["PL_Prefactor"], d["Unc_PL_Prefactor"] ) ss += fmt_f.format( indentation, "Index", d["PL_Photon_Index"], d["Unc_PL_Photon_Index"] ) ss += "{}{:<20s} : {:.3f}\n".format(indentation, "Reference", d["PL_Scale"]) return ss def _info_phasogram(self): d = self.data ss = "\n*** Phasogram info ***\n\n" ss += "{:<20s} : {:d}\n".format("Number of peaks", d["Num_Peaks"]) ss += "{:<20s} : {:.3f}\n".format("Peak separation", d["Peak_Sep"]) return ss def spectral_model(self): d = self.data_spectral if d is None: log.warning(f"No spectral model available for source {self.name}") return None if d["TS_Cutoff"] < 9: tag = "PowerLawSpectralModel" pars = { "reference": d["PL_Scale"], "amplitude": d["PL_Prefactor"], "index": d["PL_Photon_Index"], } errs = { "amplitude": d["Unc_PL_Prefactor"], "index": d["Unc_PL_Photon_Index"], } elif d["TS_bfree"] >= 9: tag = "SuperExpCutoffPowerLaw3FGLSpectralModel" pars = { "index_1": d["PLEC_Photon_Index"], "index_2": d["PLEC_Exponential_Index"], "amplitude": d["PLEC_Prefactor"], "reference": d["PLEC_Scale"], "ecut": d["PLEC_Cutoff"], } errs = { "index_1": d["Unc_PLEC_Photon_Index"], "index_2": d["Unc_PLEC_Exponential_Index"], "amplitude": d["Unc_PLEC_Prefactor"], "ecut": d["Unc_PLEC_Cutoff"], } elif d["TS_bfree"] < 9: tag = "SuperExpCutoffPowerLaw3FGLSpectralModel" pars = { "index_1": d["PLEC1_Photon_Index"], "index_2": 1, "amplitude": d["PLEC1_Prefactor"], "reference": d["PLEC1_Scale"], "ecut": d["PLEC1_Cutoff"], } errs = { "index_1": d["Unc_PLEC1_Photon_Index"], "amplitude": d["Unc_PLEC1_Prefactor"], "ecut": d["Unc_PLEC1_Cutoff"], } else: log.warning(f"No spectral model available for source {self.name}") return None model = Model.create(tag, "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model @property def flux_points_table(self): """Flux points (`~astropy.table.Table`).""" try: with warnings.catch_warnings(): warnings.simplefilter("ignore", u.UnitsWarning) fp_data = Table.read(self._auxiliary_filename, hdu="PULSAR_SED") except (KeyError, FileNotFoundError): log.warning(f"No flux points available for source {self.name}") return None table = Table() table["e_min"] = fp_data["Energy_Min"] table["e_max"] = fp_data["Energy_Max"] table["e_ref"] = fp_data["Center_Energy"] table["flux"] = fp_data["PhotonFlux"] table["flux_err"] = fp_data["Unc_PhotonFlux"] table["e2dnde"] = fp_data["EnergyFlux"] table["e2dnde_err"] = fp_data["Unc_EnergyFlux"] is_ul = np.where(table["e2dnde_err"] == 0, True, False) table["is_ul"] = is_ul table["flux_ul"] = np.nan * table["flux_err"].unit flux_ul = compute_flux_points_ul(table["flux"], table["flux_err"]) table["flux_ul"][is_ul] = flux_ul[is_ul] table["e2dnde_ul"] = np.nan * table["e2dnde"].unit e2dnde_ul = compute_flux_points_ul(table["e2dnde"], table["e2dnde_err"]) table["e2dnde_ul"][is_ul] = e2dnde_ul[is_ul] return table class 
SourceCatalogObject3PC(SourceCatalogObjectFermiPCBase): """One source from the 3PC catalog.""" asso = ["assoc_new"] _energy_edges = u.Quantity([50, 100, 300, 1_000, 3e3, 1e4, 3e4, 1e5, 1e6], "MeV") @property def _auxiliary_filename(self): return make_path( f"$GAMMAPY_DATA/catalogs/fermi/3PC_auxiliary_20230728/{self.name}_3PC_data.fits.gz" ) def _info_pulsar(self): d = self.data ss = "\n*** Pulsar info ***\n\n" ss += "{:<20s} : {:.3f}\n".format("Period", d["P0"]) ss += "{:<20s} : {:.3e}\n".format("P_Dot", d["P1"]) ss += "{:<20s} : {:.3e}\n".format("E_Dot", d["EDOT"]) return ss def _info_phasogram(self): d = self.data ss = "\n*** Phasogram info ***\n\n" if not isinstance(d["NPEAK"], np.ma.core.MaskedConstant): npeak = d["NPEAK"] ss += "{:<20s} : {:.3f}\n".format("Number of peaks", npeak) if npeak > 1: ss += "{:<20s} : {:.3f}\n".format("Ph1 (peak one)", d["PHI1"]) ss += "{:<20s} : {:.3f}\n".format( "Ph2 (peak two)", d["PHI1"] + d["PKSEP"] ) ss += "{:<20s} : {:.3f}\n".format("Peak separation", d["PKSEP"]) else: if not isinstance(d["PHI1"], np.ma.core.MaskedConstant): ss += "{:<20s} : {:.3f}\n".format("Ph1 (peak one)", d["PHI1"]) else: ss += "No phasogram info available.\n" return ss def _info_spectral_fit(self): d = self.data_spectral ss = "\n*** Spectral info ***\n\n" if d is None: ss += "No spectral info available.\n" return ss ss += "{:<20s} : {:.0f}\n".format("TS", d["Test_Statistic"]) ss += "{:<20s} : {:.0f}\n".format("Significance (DC)", d["Signif_Avg"]) ss += "{:<20s} : {:s}\n".format("Spectrum Type", d["SpectrumType"]) indentation = " " * 4 fmt_e = "{}{:<20s} : {:.3e} +- {:.3e}\n" fmt_f = "{}{:<20s} : {:.3f} +- {:.3f}\n" if not isinstance(d["PLEC_Flux_Density_b23"], np.ma.core.MaskedConstant): ss += "\n{}* SuperExpCutoffPowerLaw4FGLDR3 b = 2/3 *\n\n".format( indentation ) ss += fmt_e.format( indentation, "Amplitude", d["PLEC_Flux_Density_b23"], d["Unc_PLEC_Flux_Density_b23"], ) ss += fmt_f.format( indentation, "Index 1", -d["PLEC_IndexS_b23"], d["Unc_PLEC_IndexS_b23"], ) ss += "{}{:<20s} : {:.3f}\n".format(indentation, "Index 2", 0.6667) ss += "{}{:<20s} : {:.3f}\n".format( indentation, "Reference", d["Pivot_Energy_b23"] ) ss += fmt_f.format( indentation, "Expfactor", d["PLEC_ExpfactorS_b23"], d["Unc_PLEC_ExpfactorS_b23"], ) if not isinstance(d["PLEC_Flux_Density_bfr"], np.ma.core.MaskedConstant): ss += "\n{}* SuperExpCutoffPowerLaw4FGLDR3 b free *\n\n".format(indentation) ss += fmt_e.format( indentation, "Amplitude", d["PLEC_Flux_Density_bfr"], d["Unc_PLEC_Flux_Density_bfr"], ) ss += fmt_f.format( indentation, "Index 1", -d["PLEC_IndexS_bfr"], d["Unc_PLEC_IndexS_bfr"], ) ss += fmt_f.format( indentation, "Index 2", d["PLEC_Exp_Index_bfr"], d["Unc_PLEC_Exp_Index_bfr"], ) ss += "{}{:<20s} : {:.3f}\n".format( indentation, "Reference", d["Pivot_Energy_bfr"] ) ss += fmt_f.format( indentation, "Expfactor", d["PLEC_ExpfactorS_bfr"], d["Unc_PLEC_ExpfactorS_bfr"], ) return ss def spectral_model(self, fit="auto"): """ In the 3PC, the Fermi-LAT collaboration fitted a `~gammapy.modeling.models.SuperExpCutoffPowerLaw4FGLDR3SpectralModel` with the exponential index `index_2` left free, or fixed to 2/3. These two models are referred to as "b free" and "b 23". For most pulsars, both models are available. However, in some cases the "b free" model did not fit correctly. Parameters ---------- fit : str, optional Which fitted model to return. The user can choose between "auto", "b free" and "b 23". "auto" will always try to return the "b free" fit first and fall back to the "b 23" fit if "b free" is not available. 
Default is "auto". """ d = self.data_spectral if d is None or d["SpectrumType"] != "PLSuperExpCutoff4": log.warning(f"No spectral model available for source {self.name}") return None tag = "SuperExpCutoffPowerLaw4FGLDR3SpectralModel" if not ( isinstance(d["PLEC_IndexS_bfr"], np.ma.core.masked_array) or (fit == "b 23") ): pars = { "reference": d["Pivot_Energy_bfr"], "amplitude": d["PLEC_Flux_Density_bfr"], "index_1": -d["PLEC_IndexS_bfr"], "index_2": d["PLEC_Exp_Index_bfr"], "expfactor": d["PLEC_ExpfactorS_bfr"], } errs = { "amplitude": d["Unc_PLEC_Flux_Density_bfr"], "index_1": d["Unc_PLEC_IndexS_bfr"], "index_2": d["Unc_PLEC_Exp_Index_bfr"], "expfactor": d["Unc_PLEC_ExpfactorS_bfr"], } else: pars = { "reference": d["Pivot_Energy_b23"], "amplitude": d["PLEC_Flux_Density_b23"], "index_1": -d["PLEC_IndexS_b23"], "index_2": d["PLEC_Exp_Index_b23"], "expfactor": d["PLEC_ExpfactorS_b23"], } errs = { "amplitude": d["Unc_PLEC_Flux_Density_b23"], "index_1": d["Unc_PLEC_IndexS_b23"], "expfactor": d["Unc_PLEC_ExpfactorS_b23"], } model = Model.create(tag, "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model @property def flux_points_table(self): """Flux points (`~astropy.table.Table`). Flux point is an upper limit if its significance is less than 2.""" fp_data = self.data_spectral if fp_data is None: log.warning(f"No flux points available for source {self.name}") return None table = Table() table["e_min"] = self._energy_edges[:-1] table["e_max"] = self._energy_edges[1:] table["e_ref"] = np.sqrt(table["e_min"] * table["e_max"]) fgl_cols = ["Flux_Band", "Unc_Flux_Band", "Sqrt_TS_Band", "nuFnu_Band"] flux, flux_err, sig, nuFnu = [fp_data[col] for col in fgl_cols] table["flux"] = flux table["flux_errn"] = np.abs(flux_err[:, 0]) table["flux_errp"] = flux_err[:, 1] table["e2dnde"] = nuFnu table["e2dnde_errn"] = np.abs(nuFnu * flux_err[:, 0] / flux) table["e2dnde_errp"] = nuFnu * flux_err[:, 1] / flux is_ul = np.isnan(flux_err[:, 0]) | (sig < 2) table["is_ul"] = is_ul table["flux_ul"] = np.nan * flux_err.unit flux_ul = compute_flux_points_ul(table["flux"], table["flux_errp"]) table["flux_ul"][is_ul] = flux_ul[is_ul] table["e2dnde_ul"] = np.nan * table["e2dnde"].unit e2dnde_ul = compute_flux_points_ul(table["e2dnde"], table["e2dnde_errp"]) table["e2dnde_ul"][is_ul] = e2dnde_ul[is_ul] table["sqrt_ts"] = fp_data["Sqrt_TS_Band"] return table class SourceCatalog3FGL(SourceCatalog): """Fermi-LAT 3FGL source catalog. - https://ui.adsabs.harvard.edu/abs/2015ApJS..218...23A - https://fermi.gsfc.nasa.gov/ssc/data/access/lat/4yr_catalog/ One source is represented by `~gammapy.catalog.SourceCatalogObject3FGL`. 
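Examples
--------
A minimal usage sketch, assuming the catalog file is available under
``$GAMMAPY_DATA`` (the source name picks the Crab nebula):

>>> from gammapy.catalog import SourceCatalog3FGL
>>> cat = SourceCatalog3FGL()  # doctest: +SKIP
>>> source = cat["3FGL J0534.5+2201"]  # doctest: +SKIP
>>> model = source.sky_model()  # doctest: +SKIP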
""" tag = "3fgl" description = "LAT 4-year point source catalog" source_object_class = SourceCatalogObject3FGL def __init__(self, filename="$GAMMAPY_DATA/catalogs/fermi/gll_psc_v16.fit.gz"): filename = make_path(filename) with warnings.catch_warnings(): # ignore FITS units warnings warnings.simplefilter("ignore", u.UnitsWarning) table = Table.read(filename, hdu="LAT_Point_Source_Catalog") table_standardise_units_inplace(table) source_name_key = "Source_Name" source_name_alias = ( "Extended_Source_Name", "0FGL_Name", "1FGL_Name", "2FGL_Name", "1FHL_Name", "ASSOC_TEV", "ASSOC1", "ASSOC2", ) super().__init__( table=table, source_name_key=source_name_key, source_name_alias=source_name_alias, ) self.extended_sources_table = Table.read(filename, hdu="ExtendedSources") self.hist_table = Table.read(filename, hdu="Hist_Start") class SourceCatalog4FGL(SourceCatalog): """Fermi-LAT 4FGL source catalog. - https://arxiv.org/abs/1902.10045 (DR1) - https://arxiv.org/abs/2005.11208 (DR2) - https://arxiv.org/abs/2201.11184 (DR3) - https://arxiv.org/abs/2307.12546 (DR4) By default we use the file of the DR4 initial release from https://fermi.gsfc.nasa.gov/ssc/data/access/lat/14yr_catalog/ One source is represented by `~gammapy.catalog.SourceCatalogObject4FGL`. """ tag = "4fgl" description = "LAT 14-year point source catalog" source_object_class = SourceCatalogObject4FGL def __init__(self, filename="$GAMMAPY_DATA/catalogs/fermi/gll_psc_v32.fit.gz"): filename = make_path(filename) table = Table.read(filename, hdu="LAT_Point_Source_Catalog") table_standardise_units_inplace(table) source_name_key = "Source_Name" source_name_alias = ( "Extended_Source_Name", "ASSOC_FGL", "ASSOC_FHL", "ASSOC_GAM1", "ASSOC_GAM2", "ASSOC_GAM3", "ASSOC_TEV", "ASSOC1", "ASSOC2", ) super().__init__( table=table, source_name_key=source_name_key, source_name_alias=source_name_alias, ) self.extended_sources_table = Table.read(filename, hdu="ExtendedSources") self.extended_sources_table["version"] = int( "".join(filter(str.isdigit, table.meta["VERSION"])) ) try: self.hist_table = Table.read(filename, hdu="Hist_Start") if "MJDREFI" not in self.hist_table.meta: self.hist_table.meta = Table.read(filename, hdu="GTI").meta except KeyError: pass try: self.hist2_table = Table.read(filename, hdu="Hist2_Start") if "MJDREFI" not in self.hist_table.meta: self.hist2_table.meta = Table.read(filename, hdu="GTI").meta except KeyError: pass table = Table.read(filename, hdu="EnergyBounds") self.flux_points_energy_edges = np.unique( np.c_[table["LowerEnergy"].quantity, table["UpperEnergy"].quantity] ) class SourceCatalog2FHL(SourceCatalog): """Fermi-LAT 2FHL source catalog. - https://ui.adsabs.harvard.edu/abs/2016ApJS..222....5A - https://fermi.gsfc.nasa.gov/ssc/data/access/lat/2FHL/ One source is represented by `~gammapy.catalog.SourceCatalogObject2FHL`. 
""" tag = "2fhl" description = "LAT second high-energy source catalog" source_object_class = SourceCatalogObject2FHL def __init__(self, filename="$GAMMAPY_DATA/catalogs/fermi/gll_psch_v09.fit.gz"): filename = make_path(filename) with warnings.catch_warnings(): # ignore FITS units warnings warnings.simplefilter("ignore", u.UnitsWarning) table = Table.read(filename, hdu="2FHL Source Catalog") table_standardise_units_inplace(table) source_name_key = "Source_Name" source_name_alias = ("ASSOC", "3FGL_Name", "1FHL_Name", "TeVCat_Name") super().__init__( table=table, source_name_key=source_name_key, source_name_alias=source_name_alias, ) self.extended_sources_table = Table.read(filename, hdu="Extended Sources") self.rois = Table.read(filename, hdu="ROIs") class SourceCatalog3FHL(SourceCatalog): """Fermi-LAT 3FHL source catalog. - https://ui.adsabs.harvard.edu/abs/2017ApJS..232...18A - https://fermi.gsfc.nasa.gov/ssc/data/access/lat/3FHL/ One source is represented by `~gammapy.catalog.SourceCatalogObject3FHL`. """ tag = "3fhl" description = "LAT third high-energy source catalog" source_object_class = SourceCatalogObject3FHL def __init__(self, filename="$GAMMAPY_DATA/catalogs/fermi/gll_psch_v13.fit.gz"): filename = make_path(filename) with warnings.catch_warnings(): # ignore FITS units warnings warnings.simplefilter("ignore", u.UnitsWarning) table = Table.read(filename, hdu="LAT_Point_Source_Catalog") table_standardise_units_inplace(table) source_name_key = "Source_Name" source_name_alias = ("ASSOC1", "ASSOC2", "ASSOC_TEV", "ASSOC_GAM") super().__init__( table=table, source_name_key=source_name_key, source_name_alias=source_name_alias, ) self.extended_sources_table = Table.read(filename, hdu="ExtendedSources") self.rois = Table.read(filename, hdu="ROIs") self.energy_bounds_table = Table.read(filename, hdu="EnergyBounds") class SourceCatalog2PC(SourceCatalog): """Fermi-LAT 2nd pulsar catalog. - https://ui.adsabs.harvard.edu/abs/2013ApJS..208...17A - https://fermi.gsfc.nasa.gov/ssc/data/access/lat/2nd_PSR_catalog/ One source is represented by `~gammapy.catalog.SourceCatalogObject2PC`. """ tag = "2PC" description = "LAT 2nd pulsar catalog" source_object_class = SourceCatalogObject2PC def __init__(self, filename="$GAMMAPY_DATA/catalogs/fermi/2PC_catalog_v04.fits.gz"): filename = make_path(filename) with warnings.catch_warnings(): warnings.simplefilter("ignore", u.UnitsWarning) table_psr = Table.read(filename, hdu="PULSAR_CATALOG") table_spectral = Table.read(filename, hdu="SPECTRAL") table_off_peak = Table.read(filename, hdu="OFF_PEAK") table_standardise_units_inplace(table_psr) table_standardise_units_inplace(table_spectral) table_standardise_units_inplace(table_off_peak) source_name_key = "PSR_Name" super().__init__(table=table_psr, source_name_key=source_name_key) self.source_object_class._source_name_key = source_name_key self.off_peak_table = table_off_peak self.spectral_table = table_spectral def to_models(self, **kwargs): models = Models() for m in self: sky_model = m.sky_model() if sky_model is not None: models.append(sky_model) return models def _get_name_spectral(self, data): return f"{data[self._source_name_key].strip()}" class SourceCatalog3PC(SourceCatalog): """Fermi-LAT 3rd pulsar catalog. - https://arxiv.org/abs/2307.11132 - https://fermi.gsfc.nasa.gov/ssc/data/access/lat/3rd_PSR_catalog/ One source is represented by `~gammapy.catalog.SourceCatalogObject3PC`. 
""" tag = "3PC" description = "LAT 3rd pulsar catalog" source_object_class = SourceCatalogObject3PC def __init__( self, filename="$GAMMAPY_DATA/catalogs/fermi/3PC_Catalog+SEDs_20230803.fits.gz" ): filename = make_path(filename) with warnings.catch_warnings(): warnings.simplefilter("ignore", u.UnitsWarning) table_psr = Table.read(filename, hdu="PULSARS_BIGFILE") table_spectral = Table.read(filename, hdu="LAT_Point_Source_Catalog") table_bigfile_config = Table.read(filename, hdu="BIGFILE_CONFIG") table_standardise_units_inplace(table_psr) table_standardise_units_inplace(table_spectral) table_standardise_units_inplace(table_bigfile_config) source_name_key = "PSRJ" super().__init__(table=table_psr, source_name_key=source_name_key) self.source_object_class._source_name_key = source_name_key self.spectral_table = table_spectral self.off_bigfile_config = table_bigfile_config def to_models(self, **kwargs): models = Models() for m in self: sky_model = m.sky_model() if sky_model is not None: models.append(sky_model) return models @property def _get_source_name_key(self): return "NickName" def _get_name_spectral(self, data): return f"PSR{data[self._source_name_key].strip()}" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/gammacat.py0000644000175100001770000003274514721316200017615 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Gammacat open TeV source catalog. https://github.com/gammapy/gamma-cat """ import logging import numpy as np from astropy import units as u from astropy.table import Table from gammapy.estimators import FluxPoints from gammapy.modeling.models import Model, SkyModel from gammapy.utils.scripts import make_path from .core import SourceCatalog, SourceCatalogObject, format_flux_points_table __all__ = ["SourceCatalogGammaCat", "SourceCatalogObjectGammaCat"] log = logging.getLogger(__name__) class SourceCatalogObjectGammaCat(SourceCatalogObject): """One object from the gamma-cat source catalog. Catalog is represented by `~gammapy.catalog.SourceCatalogGammaCat`. """ _source_name_key = "common_name" def __str__(self): return self.info() @property def flux_points(self): """Flux points (`~gammapy.estimators.FluxPoints`).""" return FluxPoints.from_table( table=self.flux_points_table, reference_model=self.sky_model(), format="gadf-sed", ) def info(self, info="all"): """Information string. 
Parameters ---------- info : {'all', 'basic', 'position', 'model'} Comma separated list of options """ if info == "all": info = "basic,position,model" ss = "" ops = info.split(",") if "basic" in ops: ss += self._info_basic() if "position" in ops: ss += self._info_position() if "model" in ops: ss += self._info_morph() ss += self._info_spectral_fit() ss += self._info_spectral_points() return ss def _info_basic(self): """Print basic info.""" d = self.data return ( f"\n*** Basic info ***\n\n" f"Catalog row index (zero-based): {self.row_index}\n" f"Common name: {self.name}\n" f"Gamma names: {d.gamma_names}\n" f"Fermi names: {d.fermi_names}\n" f"Other names: {d.other_names}\n" f"Location: {d.where}\n" f"Class: {d.classes}\n\n" f"TeVCat ID: {d.tevcat_id}\n" f"TeVCat 2 ID: {d.tevcat2_id}\n" f"TeVCat name: {d.tevcat_name}\n\n" f"TGeVCat ID: {d.tgevcat_id}\n" f"TGeVCat name: {d.tgevcat_name}\n\n" f"Discoverer: {d.discoverer}\n" f"Discovery date: {d.discovery_date}\n" f"Seen by: {d.seen_by}\n" f"Reference: {d.reference_id}\n" ) def _info_position(self): """Print position information.""" d = self.data return ( f"\n*** Position info ***\n\n" f"SIMBAD:\n" f"RA: {d.ra:.3f}\n" f"DEC: {d.dec:.3f}\n" f"GLON: {d.glon:.3f}\n" f"GLAT: {d.glat:.3f}\n" f"\nMeasurement:\n" f"RA: {d.pos_ra:.3f}\n" f"DEC: {d.pos_dec:.3f}\n" f"GLON: {d.pos_glon:.3f}\n" f"GLAT: {d.pos_glat:.3f}\n" f"Position error: {d.pos_err:.3f}\n" ) def _info_morph(self): """Print morphology information.""" d = self.data return ( f"\n*** Morphology info ***\n\n" f"Morphology model type: {d.morph_type}\n" f"Sigma: {d.morph_sigma:.3f}\n" f"Sigma error: {d.morph_sigma_err:.3f}\n" f"Sigma2: {d.morph_sigma2:.3f}\n" f"Sigma2 error: {d.morph_sigma2_err:.3f}\n" f"Position angle: {d.morph_pa:.3f}\n" f"Position angle error: {d.morph_pa_err:.3f}\n" f"Position angle frame: {d.morph_pa_frame}\n" ) def _info_spectral_fit(self): """Print spectral information.""" d = self.data ss = "\n*** Spectral info ***\n\n" ss += f"Significance: {d.significance:.3f}\n" ss += f"Livetime: {d.livetime:.3f}\n" spec_type = d["spec_type"] ss += f"\nSpectrum type: {spec_type}\n" if spec_type == "pl": ss += f"norm: {d.spec_pl_norm:.3} +- {d.spec_pl_norm_err:.3} (stat) +- {d.spec_pl_norm_err_sys:.3} (sys) cm-2 s-1 TeV-1\n" # noqa: E501 ss += f"index: {d.spec_pl_index:.3} +- {d.spec_pl_index_err:.3} (stat) +- {d.spec_pl_index_err_sys:.3} (sys)\n" # noqa: E501 ss += f"reference: {d.spec_pl_e_ref:.3}\n" elif spec_type == "pl2": ss += f"flux: {d.spec_pl2_flux.value:.3} +- {d.spec_pl2_flux_err.value:.3} (stat) +- {d.spec_pl2_flux_err_sys.value:.3} (sys) cm-2 s-1\n" # noqa: E501 ss += f"index: {d.spec_pl2_index:.3} +- {d.spec_pl2_index_err:.3} (stat) +- {d.spec_pl2_index_err_sys:.3} (sys)\n" # noqa: E501 ss += f"e_min: {d.spec_pl2_e_min:.3}\n" ss += f"e_max: {d.spec_pl2_e_max:.3}\n" elif spec_type == "ecpl": ss += f"norm: {d.spec_ecpl_norm.value:.3g} +- {d.spec_ecpl_norm_err.value:.3g} (stat) +- {d.spec_ecpl_norm_err_sys.value:.03g} (sys) cm-2 s-1 TeV-1\n" # noqa: E501 ss += f"index: {d.spec_ecpl_index:.3} +- {d.spec_ecpl_index_err:.3} (stat) +- {d.spec_ecpl_index_err_sys:.3} (sys)\n" # noqa: E501 ss += f"e_cut: {d.spec_ecpl_e_cut.value:.3} +- {d.spec_ecpl_e_cut_err.value:.3} (stat) +- {d.spec_ecpl_e_cut_err_sys.value:.3} (sys) TeV\n" # noqa: E501 ss += f"reference: {d.spec_ecpl_e_ref:.3}\n" elif spec_type == "none": pass else: raise ValueError(f"Invalid spec_type: {spec_type}") ss += f"\nEnergy range: ({d.spec_erange_min.value:.3}, {d.spec_erange_max.value:.3}) TeV\n" ss += f"theta: 
{d.spec_theta:.3}\n" ss += "\n\nDerived fluxes:\n" ss += f"Spectral model norm (1 TeV): {d.spec_dnde_1TeV:.3} +- {d.spec_dnde_1TeV_err:.3} (stat) cm-2 s-1 TeV-1\n" # noqa: E501 ss += f"Integrated flux (>1 TeV): {d.spec_flux_1TeV.value:.3} +- {d.spec_flux_1TeV_err.value:.3} (stat) cm-2 s-1\n" # noqa: E501 ss += f"Integrated flux (>1 TeV): {d.spec_flux_1TeV_crab:.3f} +- {d.spec_flux_1TeV_crab_err:.3f} (% Crab)\n" # noqa: E501 ss += f"Integrated flux (1-10 TeV): {d.spec_eflux_1TeV_10TeV.value:.3} +- {d.spec_eflux_1TeV_10TeV_err.value:.3} (stat) erg cm-2 s-1\n" # noqa: E501 return ss def _info_spectral_points(self): """Print spectral points information.""" d = self.data ss = "\n*** Spectral points ***\n\n" ss += f"SED reference ID: {d.sed_reference_id}\n" ss += f"Number of spectral points: {d.sed_n_points}\n" ss += f"Number of upper limits: {d.sed_n_ul}\n\n" table = self.flux_points_table if table is None: ss += "\nNo spectral points available for this source." else: lines = format_flux_points_table(table).pformat(max_width=-1, max_lines=-1) ss += "\n".join(lines) return ss + "\n" def spectral_model(self): """Source spectral model (`~gammapy.modeling.models.SpectralModel`). Parameter errors are statistical errors only. """ data = self.data spec_type = data["spec_type"] if spec_type == "pl": tag = "PowerLawSpectralModel" pars = { "amplitude": data["spec_pl_norm"], "index": data["spec_pl_index"], "reference": data["spec_pl_e_ref"], } errs = { "amplitude": data["spec_pl_norm_err"], "index": data["spec_pl_index_err"], } elif spec_type == "pl2": e_max = data["spec_pl2_e_max"] DEFAULT_E_MAX = u.Quantity(1e5, "TeV") if np.isnan(e_max.value) or e_max.value == 0: e_max = DEFAULT_E_MAX tag = "PowerLaw2SpectralModel" pars = { "amplitude": data["spec_pl2_flux"], "index": data["spec_pl2_index"], "emin": data["spec_pl2_e_min"], "emax": e_max, } errs = { "amplitude": data["spec_pl2_flux_err"], "index": data["spec_pl2_index_err"], } elif spec_type == "ecpl": tag = "ExpCutoffPowerLawSpectralModel" pars = { "amplitude": data["spec_ecpl_norm"], "index": data["spec_ecpl_index"], "lambda_": 1.0 / data["spec_ecpl_e_cut"], "reference": data["spec_ecpl_e_ref"], } errs = { "amplitude": data["spec_ecpl_norm_err"], "index": data["spec_ecpl_index_err"], "lambda_": data["spec_ecpl_e_cut_err"] / data["spec_ecpl_e_cut"] ** 2, } elif spec_type == "none": return None else: raise ValueError(f"Invalid spec_type: {spec_type}") model = Model.create(tag, "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def spatial_model(self): """Source spatial model (`~gammapy.modeling.models.SpatialModel`). TODO: add parameter errors! """ d = self.data morph_type = d["morph_type"] pars = {"lon_0": d["glon"], "lat_0": d["glat"], "frame": "galactic"} errs = { "lat_0": self.data["pos_err"], "lon_0": self.data["pos_err"] / np.cos(self.data["glat"]), } if morph_type == "point": tag = "PointSpatialModel" elif morph_type == "gauss": # TODO: add infos back once model support elongation # pars['x_stddev'] = d['morph_sigma'] # pars['y_stddev'] = d['morph_sigma'] # if not np.isnan(d['morph_sigma2']): # pars['y_stddev'] = d['morph_sigma2'] # if not np.isnan(d['morph_pa']): # # TODO: handle reference frame for rotation angle # pars['theta'] = Angle(d['morph_pa'], 'deg').rad tag = "GaussianSpatialModel" pars["sigma"] = d["morph_sigma"] elif morph_type == "shell": tag = "ShellSpatialModel" # TODO: probably we shouldn't guess a shell width here! 
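# Heuristic split (see TODO above): the shell outer edge stays at morph_sigma, with 80% of it taken as the inner radius and 20% as the shell width.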
pars["radius"] = 0.8 * d["morph_sigma"] pars["width"] = 0.2 * d["morph_sigma"] elif morph_type == "none": return None else: raise ValueError(f"Invalid morph_type: {morph_type!r}") model = Model.create(tag, "spatial", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def sky_model(self): """Source sky model (`~gammapy.modeling.models.SkyModel`).""" return SkyModel( spatial_model=self.spatial_model(), spectral_model=self.spectral_model(), name=self.name, ) def _add_source_meta(self, table): """Copy over some information to `table.meta`.""" d = self.data m = table.meta m["origin"] = "Data from gamma-cat" m["source_id"] = d["source_id"] m["common_name"] = d["common_name"] m["reference_id"] = d["reference_id"] @property def flux_points_table(self): """Differential flux points (`~gammapy.estimators.FluxPoints`).""" d = self.data table = Table() table.meta["SED_TYPE"] = "dnde" self._add_source_meta(table) valid = np.isfinite(d["sed_e_ref"].value) if valid.sum() == 0: return None table["e_ref"] = d["sed_e_ref"] table["e_min"] = d["sed_e_min"] table["e_max"] = d["sed_e_max"] table["dnde"] = d["sed_dnde"] table["dnde_err"] = d["sed_dnde_err"] table["dnde_errn"] = d["sed_dnde_errn"] table["dnde_errp"] = d["sed_dnde_errp"] table["dnde_ul"] = d["sed_dnde_ul"] # Only keep rows that actually contain information table = table[valid] for colname in table.colnames: if not np.isfinite(table[colname]).any(): table.remove_column(colname) return table class SourceCatalogGammaCat(SourceCatalog): """Gammacat open TeV source catalog. See: https://github.com/gammapy/gamma-cat One source is represented by `~gammapy.catalog.SourceCatalogObjectGammaCat`. Parameters ---------- filename : str Path to the gamma-cat fits file. Examples -------- Load the catalog data: >>> import matplotlib.pyplot as plt >>> import astropy.units as u >>> from gammapy.catalog import SourceCatalogGammaCat >>> cat = SourceCatalogGammaCat() Access a source by name: >>> source = cat['Vela Junior'] Access source spectral data and plot it: >>> ax = plt.subplot() >>> energy_range = [1, 10] * u.TeV >>> source.spectral_model().plot(energy_range, ax=ax) # doctest: +SKIP >>> source.spectral_model().plot_error(energy_range, ax=ax) # doctest: +SKIP >>> source.flux_points.plot(ax=ax) # doctest: +SKIP """ tag = "gamma-cat" description = "An open catalog of gamma-ray sources" source_object_class = SourceCatalogObjectGammaCat def __init__(self, filename="$GAMMAPY_DATA/catalogs/gammacat/gammacat.fits.gz"): filename = make_path(filename) table = Table.read(filename, hdu=1) source_name_key = "common_name" source_name_alias = ("other_names", "gamma_names") super().__init__( table=table, source_name_key=source_name_key, source_name_alias=source_name_alias, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/hawc.py0000644000175100001770000002456014721316200016761 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """HAWC catalogs (https://www.hawc-observatory.org).""" import abc import numpy as np from astropy.table import Table from gammapy.modeling.models import Model, SkyModel from gammapy.utils.scripts import make_path from .core import SourceCatalog, SourceCatalogObject __all__ = [ "SourceCatalog2HWC", "SourceCatalog3HWC", "SourceCatalogObject2HWC", "SourceCatalogObject3HWC", ] class SourceCatalogObjectHWCBase(SourceCatalogObject, abc.ABC): """Base class for the HAWC catalogs objects.""" _source_name_key = 
"source_name" def __str__(self): return self.info() def info(self, info="all"): """Summary information string. Parameters ---------- info : {'all', 'basic', 'position', 'spectrum'} Comma separated list of options """ if info == "all": info = "basic,position,spectrum" ss = "" ops = info.split(",") if "basic" in ops: ss += self._info_basic() if "position" in ops: ss += self._info_position() if "spectrum" in ops: ss += self._info_spectrum() return ss def _info_basic(self): """Print basic information.""" return ( f"\n*** Basic info ***\n\n" f"Catalog row index (zero-based) : {self.row_index}\n" f"Source name : {self.name}\n" ) def _info_position(self): """Print position information.""" return ( f"\n*** Position info ***\n\n" f"RA: {self.data.ra:.3f}\n" f"DEC: {self.data.dec:.3f}\n" f"GLON: {self.data.glon:.3f}\n" f"GLAT: {self.data.glat:.3f}\n" f"Position error: {self.data.pos_err:.3f}\n" ) def _info_spectrum(self): """Print spectral information.""" ss = "\n*** Spectral info ***\n\n" ss += self._info_spectrum_one(0) if self.n_models == 2: ss += self._info_spectrum_one(1) else: ss += "No second spectrum available" return ss class SourceCatalogObject2HWC(SourceCatalogObjectHWCBase): """One source from the HAWC 2HWC catalog. Catalog is represented by `~gammapy.catalog.SourceCatalog2HWC`. """ @property def n_models(self): """Number of models (1 or 2).""" if hasattr(self.data, "spec1_dnde"): return 1 if np.isnan(self.data.spec1_dnde) else 2 else: return 1 def _get_idx(self, which): if which == "point": return 0 elif which == "extended": if self.n_models == 2: return 1 else: raise ValueError(f"No extended source analysis available: {self.name}") else: raise ValueError(f"Invalid which: {which!r}") def _info_spectrum_one(self, idx): d = self.data ss = f"Spectrum {idx}:\n" val, err = d[f"spec{idx}_dnde"].value, d[f"spec{idx}_dnde_err"].value ss += f"Flux at 7 TeV: {val:.3} +- {err:.3} cm-2 s-1 TeV-1\n" val, err = d[f"spec{idx}_index"], d[f"spec{idx}_index_err"] ss += f"Spectral index: {val:.3f} +- {err:.3f}\n" radius = d[f"spec{idx}_radius"] ss += f"Test Radius: {radius:1}\n\n" return ss def spectral_model(self, which="point"): """Spectral model (`~gammapy.modeling.models.PowerLawSpectralModel`). * ``which="point"`` -- Spectral model under the point source assumption. * ``which="extended"`` -- Spectral model under the extended source assumption. Only available for some sources. Raise ValueError if not available. """ idx = self._get_idx(which) pars = { "reference": "7 TeV", "amplitude": self.data[f"spec{idx}_dnde"], "index": -self.data[f"spec{idx}_index"], } errs = { "amplitude": self.data[f"spec{idx}_dnde_err"], "index": self.data[f"spec{idx}_index_err"], } model = Model.create("PowerLawSpectralModel", "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def spatial_model(self, which="point"): """Spatial model (`~gammapy.modeling.models.SpatialModel`). * ``which="point"`` - `~gammapy.modeling.models.PointSpatialModel`. * ``which="extended"`` - `~gammapy.modeling.models.DiskSpatialModel`. Only available for some sources. Raise ValueError if not available. 
""" idx = self._get_idx(which) pars = {"lon_0": self.data.glon, "lat_0": self.data.glat, "frame": "galactic"} if idx == 0: tag = "PointSpatialModel" else: tag = "DiskSpatialModel" pars["r_0"] = self.data[f"spec{idx}_radius"] errs = { "lat_0": self.data.pos_err, "lon_0": self.data.pos_err / np.cos(self.data.glat), } model = Model.create(tag, "spatial", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def sky_model(self, which="point"): """Sky model (`~gammapy.modeling.models.SkyModel`). * ``which="point"`` - Sky model for point source analysis. * ``which="extended"`` - Sky model for extended source analysis. Only available for some sources. Raise ValueError if not available. According to the paper, the radius of the extended source model is only a rough estimate of the source size, based on the residual excess. """ return SkyModel( spatial_model=self.spatial_model(which), spectral_model=self.spectral_model(which), name=self.name, ) class SourceCatalog2HWC(SourceCatalog): """HAWC 2HWC catalog. One source is represented by `~gammapy.catalog.SourceCatalogObject2HWC`. The data is from tables 2 and 3 in the paper [1]_. The catalog table contains 40 rows / sources. The paper mentions 39 sources e.g. in the abstract. The difference is due to Geminga, which was detected as two "sources" by the algorithm used to make the catalog, but then in the discussion considered as one source. References ---------- .. [1] Abeysekara et al, "The 2HWC HAWC Observatory Gamma Ray Catalog", On ADS: `2017ApJ...843...40A `__ """ tag = "2hwc" """Catalog name""" description = "2HWC catalog from the HAWC observatory" """Catalog description""" source_object_class = SourceCatalogObject2HWC def __init__(self, filename="$GAMMAPY_DATA/catalogs/2HWC.ecsv"): table = Table.read(make_path(filename), format="ascii.ecsv") source_name_key = "source_name" super().__init__(table=table, source_name_key=source_name_key) class SourceCatalogObject3HWC(SourceCatalogObjectHWCBase): """One source from the HAWC 3HWC catalog. Catalog is represented by `~gammapy.catalog.SourceCatalog3HWC`. 
""" @property def n_models(self): """Number of models.""" return 1 def _info_spectrum_one(self, idx): d = self.data ss = f"Spectrum {idx}:\n" val, errn, errp = ( d[f"spec{idx}_dnde"].value, d[f"spec{idx}_dnde_errn"].value, d[f"spec{idx}_dnde_errp"].value, ) ss += f"Flux at 7 TeV: {val:.3} {errn:.3} + {errp:.3} cm-2 s-1 TeV-1\n" val, errn, errp = ( d[f"spec{idx}_index"], d[f"spec{idx}_index_errn"], d[f"spec{idx}_index_errp"], ) ss += f"Spectral index: {val:.3f} {errn:.3f} + {errp:.3f}\n" radius = d[f"spec{idx}_radius"] ss += f"Test Radius: {radius:1}\n\n" return ss @property def is_pointlike(self): return self.data["spec0_radius"] == 0.0 def spectral_model(self): """Spectral model as a `~gammapy.modeling.models.PowerLawSpectralModel` object.""" pars = { "reference": "7 TeV", "amplitude": self.data["spec0_dnde"], "index": -self.data["spec0_index"], } errs = { "index": 0.5 * (self.data["spec0_index_errp"] + np.abs(self.data["spec0_index_errn"])), "amplitude": 0.5 * (self.data["spec0_dnde_errp"] + np.abs(self.data["spec0_dnde_errn"])), } model = Model.create("PowerLawSpectralModel", "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def spatial_model(self): """Spatial model as a `~gammapy.modeling.models.SpatialModel` object.""" pars = {"lon_0": self.data.glon, "lat_0": self.data.glat, "frame": "galactic"} if self.is_pointlike: tag = "PointSpatialModel" else: tag = "DiskSpatialModel" pars["r_0"] = self.data["spec0_radius"] errs = { "lat_0": self.data.pos_err, "lon_0": self.data.pos_err / np.cos(self.data.glat), } model = Model.create(tag, "spatial", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def sky_model(self): """Sky model as a `~gammapy.modeling.models.SkyModel` object.""" return SkyModel( spatial_model=self.spatial_model(), spectral_model=self.spectral_model(), name=self.name, ) class SourceCatalog3HWC(SourceCatalog): """HAWC 3HWC catalog. One source is represented by `~gammapy.catalog.SourceCatalogObject3HWC`. The data is from tables 2 and 3 in the paper [1]_. The catalog table contains 65 rows / sources. References ---------- .. 
[1] 3HWC: The Third HAWC Catalog of Very-High-Energy Gamma-ray Sources", https://data.hawc-observatory.org/datasets/3hwc-survey/index.php """ tag = "3hwc" """Catalog name""" description = "3HWC catalog from the HAWC observatory" """Catalog description""" source_object_class = SourceCatalogObject3HWC def __init__(self, filename="$GAMMAPY_DATA/catalogs/3HWC.ecsv"): table = Table.read(make_path(filename), format="ascii.ecsv") source_name_key = "source_name" super().__init__(table=table, source_name_key=source_name_key) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/hess.py0000644000175100001770000011227314721316200017000 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """HESS Galactic plane survey (HGPS) catalog.""" import numpy as np import astropy.units as u from astropy.coordinates import Angle from astropy.modeling.models import Gaussian1D from astropy.table import Table from gammapy.estimators import FluxPoints from gammapy.maps import MapAxis, RegionGeom from gammapy.modeling.models import Model, Models, SkyModel from gammapy.utils.interpolation import ScaledRegularGridInterpolator from gammapy.utils.scripts import make_path from gammapy.utils.table import table_row_to_dict from .core import SourceCatalog, SourceCatalogObject, format_flux_points_table __all__ = [ "SourceCatalogHGPS", "SourceCatalogLargeScaleHGPS", "SourceCatalogObjectHGPS", "SourceCatalogObjectHGPSComponent", ] # Flux factor, used for printing FF = 1e-12 # Multiplicative factor to go from cm^-2 s^-1 to % Crab for integral flux > 1 TeV # Here we use the same Crab reference that's used in the HGPS paper # CRAB = crab_integral_flux(energy_min=1, reference='hess_ecpl') FLUX_TO_CRAB = 100 / 2.26e-11 FLUX_TO_CRAB_DIFF = 100 / 3.5060459323111307e-11 class SourceCatalogObjectHGPSComponent(SourceCatalogObject): """One Gaussian component from the HGPS catalog. See Also -------- SourceCatalogHGPS, SourceCatalogObjectHGPS """ _source_name_key = "Component_ID" def __init__(self, data): self.data = data def __repr__(self): return f"{self.__class__.__name__}({self.name!r})" def __str__(self): """Pretty-print source data.""" d = self.data ss = "Component {}:\n".format(d["Component_ID"]) fmt = "{:<20s} : {:8.3f} +/- {:.3f} deg\n" ss += fmt.format("GLON", d["GLON"].value, d["GLON_Err"].value) ss += fmt.format("GLAT", d["GLAT"].value, d["GLAT_Err"].value) fmt = "{:<20s} : {:.3f} +/- {:.3f} deg\n" ss += fmt.format("Size", d["Size"].value, d["Size_Err"].value) val, err = d["Flux_Map"].value, d["Flux_Map_Err"].value fmt = "{:<20s} : ({:.2f} +/- {:.2f}) x 10^-12 cm^-2 s^-1 = ({:.1f} +/- {:.1f}) % Crab" ss += fmt.format( "Flux (>1 TeV)", val / FF, err / FF, val * FLUX_TO_CRAB, err * FLUX_TO_CRAB ) return ss @property def name(self): """Source name as a string.""" return self.data[self._source_name_key] def spatial_model(self): """Component spatial model as a `~gammapy.modeling.models.GaussianSpatialModel` object.""" d = self.data tag = "GaussianSpatialModel" pars = { "lon_0": d["GLON"], "lat_0": d["GLAT"], "sigma": d["Size"], "frame": "galactic", } errs = {"lon_0": d["GLON_Err"], "lat_0": d["GLAT_Err"], "sigma": d["Size_Err"]} model = Model.create(tag, "spatial", **pars) for name, value in errs.items(): model.parameters[name].error = value return model class SourceCatalogObjectHGPS(SourceCatalogObject): """One object from the HGPS catalog. The catalog is represented by `SourceCatalogHGPS`. Examples are given there. 
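    In addition, a quick summary sketch (assumes the HGPS catalog file is
    available, see `SourceCatalogHGPS` for how to obtain it; the ``info``
    options are documented on the `info` method below):

    >>> from gammapy.catalog import SourceCatalogHGPS
    >>> source = SourceCatalogHGPS()["HESS J1843-033"]  # doctest: +SKIP
    >>> print(source.info("basic,map"))  # doctest: +SKIP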
""" def __repr__(self): return f"{self.__class__.__name__}({self.name!r})" def __str__(self): return self.info() @property def flux_points(self): """Flux points as a `~gammapy.estimators.FluxPoints` object.""" reference_model = SkyModel(spectral_model=self.spectral_model(), name=self.name) return FluxPoints.from_table( self.flux_points_table, reference_model=reference_model, ) def info(self, info="all"): """Information string. Parameters ---------- info : {'all', 'basic', 'map', 'spec', 'flux_points', 'components', 'associations', 'id'} Comma separated list of options. """ if info == "all": info = "basic,associations,id,map,spec,flux_points,components" ss = "" ops = info.split(",") if "basic" in ops: ss += self._info_basic() if "map" in ops: ss += self._info_map() if "spec" in ops: ss += self._info_spec() if "flux_points" in ops: ss += self._info_flux_points() if "components" in ops and hasattr(self, "components"): ss += self._info_components() if "associations" in ops: ss += self._info_associations() if "id" in ops and hasattr(self, "identification_info"): ss += self._info_id() return ss def _info_basic(self): """Print basic information.""" d = self.data ss = "\n*** Basic info ***\n\n" ss += "Catalog row index (zero-based) : {}\n".format(self.row_index) ss += "{:<20s} : {}\n".format("Source name", self.name) ss += "{:<20s} : {}\n".format("Analysis reference", d["Analysis_Reference"]) ss += "{:<20s} : {}\n".format("Source class", d["Source_Class"]) ss += "{:<20s} : {}\n".format("Identified object", d["Identified_Object"]) ss += "{:<20s} : {}\n".format("Gamma-Cat id", d["Gamma_Cat_Source_ID"]) ss += "\n" return ss def _info_associations(self): ss = "\n*** Source associations info ***\n\n" lines = self.associations.pformat(max_width=-1, max_lines=-1) ss += "\n".join(lines) return ss + "\n" def _info_id(self): ss = "\n*** Source identification info ***\n\n" ss += "\n".join(f"{k}: {v}" for k, v in self.identification_info.items()) return ss + "\n" def _info_map(self): """Print information from map analysis.""" d = self.data ss = "\n*** Info from map analysis ***\n\n" ra_str = Angle(d["RAJ2000"], "deg").to_string(unit="hour", precision=0) dec_str = Angle(d["DEJ2000"], "deg").to_string(unit="deg", precision=0) ss += "{:<20s} : {:8.3f} = {}\n".format("RA", d["RAJ2000"], ra_str) ss += "{:<20s} : {:8.3f} = {}\n".format("DEC", d["DEJ2000"], dec_str) ss += "{:<20s} : {:8.3f} +/- {:.3f} deg\n".format( "GLON", d["GLON"].value, d["GLON_Err"].value ) ss += "{:<20s} : {:8.3f} +/- {:.3f} deg\n".format( "GLAT", d["GLAT"].value, d["GLAT_Err"].value ) ss += "{:<20s} : {:.3f}\n".format("Position Error (68%)", d["Pos_Err_68"]) ss += "{:<20s} : {:.3f}\n".format("Position Error (95%)", d["Pos_Err_95"]) ss += "{:<20s} : {:.0f}\n".format("ROI number", d["ROI_Number"]) ss += "{:<20s} : {}\n".format("Spatial model", d["Spatial_Model"]) ss += "{:<20s} : {}\n".format("Spatial components", d["Components"]) ss += "{:<20s} : {:.1f}\n".format("TS", d["Sqrt_TS"] ** 2) ss += "{:<20s} : {:.1f}\n".format("sqrt(TS)", d["Sqrt_TS"]) ss += "{:<20s} : {:.3f} +/- {:.3f} (UL: {:.3f}) deg\n".format( "Size", d["Size"].value, d["Size_Err"].value, d["Size_UL"].value ) ss += "{:<20s} : {:.3f}\n".format("R70", d["R70"]) ss += "{:<20s} : {:.3f}\n".format("RSpec", d["RSpec"]) ss += "{:<20s} : {:.1f}\n".format("Total model excess", d["Excess_Model_Total"]) ss += "{:<20s} : {:.1f}\n".format("Excess in RSpec", d["Excess_RSpec"]) ss += "{:<20s} : {:.1f}\n".format( "Model Excess in RSpec", d["Excess_RSpec_Model"] ) ss += "{:<20s} : 
{:.1f}\n".format("Background in RSpec", d["Background_RSpec"]) ss += "{:<20s} : {:.1f} hours\n".format("Livetime", d["Livetime"].value) ss += "{:<20s} : {:.2f}\n".format("Energy threshold", d["Energy_Threshold"]) val, err = d["Flux_Map"].value, d["Flux_Map_Err"].value ss += "{:<20s} : ({:.3f} +/- {:.3f}) x 10^-12 cm^-2 s^-1 = ({:.2f} +/- {:.2f}) % Crab\n".format( # noqa: 501 "Source flux (>1 TeV)", val / FF, err / FF, val * FLUX_TO_CRAB, err * FLUX_TO_CRAB, ) ss += "\nFluxes in RSpec (> 1 TeV):\n" ss += "{:<30s} : {:.3f} x 10^-12 cm^-2 s^-1 = {:5.2f} % Crab\n".format( "Map measurement", d["Flux_Map_RSpec_Data"].value / FF, d["Flux_Map_RSpec_Data"].value * FLUX_TO_CRAB, ) ss += "{:<30s} : {:.3f} x 10^-12 cm^-2 s^-1 = {:5.2f} % Crab\n".format( "Source model", d["Flux_Map_RSpec_Source"].value / FF, d["Flux_Map_RSpec_Source"].value * FLUX_TO_CRAB, ) ss += "{:<30s} : {:.3f} x 10^-12 cm^-2 s^-1 = {:5.2f} % Crab\n".format( "Other component model", d["Flux_Map_RSpec_Other"].value / FF, d["Flux_Map_RSpec_Other"].value * FLUX_TO_CRAB, ) ss += "{:<30s} : {:.3f} x 10^-12 cm^-2 s^-1 = {:5.2f} % Crab\n".format( "Large scale component model", d["Flux_Map_RSpec_LS"].value / FF, d["Flux_Map_RSpec_LS"].value * FLUX_TO_CRAB, ) ss += "{:<30s} : {:.3f} x 10^-12 cm^-2 s^-1 = {:5.2f} % Crab\n".format( "Total model", d["Flux_Map_RSpec_Total"].value / FF, d["Flux_Map_RSpec_Total"].value * FLUX_TO_CRAB, ) ss += "{:<35s} : {:5.1f} %\n".format( "Containment in RSpec", 100 * d["Containment_RSpec"] ) ss += "{:<35s} : {:5.1f} %\n".format( "Contamination in RSpec", 100 * d["Contamination_RSpec"] ) label, val = ( "Flux correction (RSpec -> Total)", 100 * d["Flux_Correction_RSpec_To_Total"], ) ss += f"{label:<35s} : {val:5.1f} %\n" label, val = ( "Flux correction (Total -> RSpec)", 100 * (1 / d["Flux_Correction_RSpec_To_Total"]), ) ss += f"{label:<35s} : {val:5.1f} %\n" return ss def _info_spec(self): """Print information from spectral analysis.""" d = self.data ss = "\n*** Info from spectral analysis ***\n\n" ss += "{:<20s} : {:.1f} hours\n".format("Livetime", d["Livetime_Spec"].value) lo = d["Energy_Range_Spec_Min"].value hi = d["Energy_Range_Spec_Max"].value ss += "{:<20s} : {:.2f} to {:.2f} TeV\n".format("Energy range:", lo, hi) ss += "{:<20s} : {:.1f}\n".format("Background", d["Background_Spec"]) ss += "{:<20s} : {:.1f}\n".format("Excess", d["Excess_Spec"]) ss += "{:<20s} : {}\n".format("Spectral model", d["Spectral_Model"]) val = d["TS_ECPL_over_PL"] ss += "{:<20s} : {:.1f}\n".format("TS ECPL over PL", val) val = d["Flux_Spec_Int_1TeV"].value err = d["Flux_Spec_Int_1TeV_Err"].value ss += "{:<20s} : ({:.3f} +/- {:.3f}) x 10^-12 cm^-2 s^-1 = ({:.2f} +/- {:.2f}) % Crab\n".format( # noqa: E501 "Best-fit model flux(> 1 TeV)", val / FF, err / FF, val * FLUX_TO_CRAB, err * FLUX_TO_CRAB, ) val = d["Flux_Spec_Energy_1_10_TeV"].value err = d["Flux_Spec_Energy_1_10_TeV_Err"].value ss += "{:<20s} : ({:.3f} +/- {:.3f}) x 10^-12 erg cm^-2 s^-1\n".format( "Best-fit model energy flux(1 to 10 TeV)", val / FF, err / FF ) ss += self._info_spec_pl() ss += self._info_spec_ecpl() return ss def _info_spec_pl(self): d = self.data ss = "{:<20s} : {:.2f}\n".format("Pivot energy", d["Energy_Spec_PL_Pivot"]) val = d["Flux_Spec_PL_Diff_Pivot"].value err = d["Flux_Spec_PL_Diff_Pivot_Err"].value ss += "{:<20s} : ({:.3f} +/- {:.3f}) x 10^-12 cm^-2 s^-1 TeV^-1 = ({:.2f} +/- {:.2f}) % Crab\n".format( # noqa: E501 "Flux at pivot energy", val / FF, err / FF, val * FLUX_TO_CRAB, err * FLUX_TO_CRAB_DIFF, ) val = d["Flux_Spec_PL_Int_1TeV"].value err = 
d["Flux_Spec_PL_Int_1TeV_Err"].value ss += "{:<20s} : ({:.3f} +/- {:.3f}) x 10^-12 cm^-2 s^-1 = ({:.2f} +/- {:.2f}) % Crab\n".format( # noqa: E501 "PL Flux(> 1 TeV)", val / FF, err / FF, val * FLUX_TO_CRAB, err * FLUX_TO_CRAB, ) val = d["Flux_Spec_PL_Diff_1TeV"].value err = d["Flux_Spec_PL_Diff_1TeV_Err"].value ss += "{:<20s} : ({:.3f} +/- {:.3f}) x 10^-12 cm^-2 s^-1 TeV^-1 = ({:.2f} +/- {:.2f}) % Crab\n".format( # noqa: E501 "PL Flux(@ 1 TeV)", val / FF, err / FF, val * FLUX_TO_CRAB, err * FLUX_TO_CRAB_DIFF, ) val = d["Index_Spec_PL"] err = d["Index_Spec_PL_Err"] ss += "{:<20s} : {:.2f} +/- {:.2f}\n".format("PL Index", val, err) return ss def _info_spec_ecpl(self): d = self.data ss = "" # ss = '{:<20s} : {:.1f}\n'.format('Pivot energy', d['Energy_Spec_ECPL_Pivot']) val = d["Flux_Spec_ECPL_Diff_1TeV"].value err = d["Flux_Spec_ECPL_Diff_1TeV_Err"].value ss += "{:<20s} : ({:.3f} +/- {:.3f}) x 10^-12 cm^-2 s^-1 TeV^-1 = ({:.2f} +/- {:.2f}) % Crab\n".format( # noqa: E501 "ECPL Flux(@ 1 TeV)", val / FF, err / FF, val * FLUX_TO_CRAB, err * FLUX_TO_CRAB_DIFF, ) val = d["Flux_Spec_ECPL_Int_1TeV"].value err = d["Flux_Spec_ECPL_Int_1TeV_Err"].value ss += "{:<20s} : ({:.3f} +/- {:.3f}) x 10^-12 cm^-2 s^-1 = ({:.2f} +/- {:.2f}) % Crab\n".format( # noqa: E501 "ECPL Flux(> 1 TeV)", val / FF, err / FF, val * FLUX_TO_CRAB, err * FLUX_TO_CRAB, ) val = d["Index_Spec_ECPL"] err = d["Index_Spec_ECPL_Err"] ss += "{:<20s} : {:.2f} +/- {:.2f}\n".format("ECPL Index", val, err) val = d["Lambda_Spec_ECPL"].value err = d["Lambda_Spec_ECPL_Err"].value ss += "{:<20s} : {:.3f} +/- {:.3f} TeV^-1\n".format("ECPL Lambda", val, err) # Use Gaussian analytical error propagation, # tested against the uncertainties package err = err / val**2 val = 1.0 / val ss += "{:<20s} : {:.2f} +/- {:.2f} TeV\n".format("ECPL E_cut", val, err) return ss def _info_flux_points(self): """Print flux point results.""" d = self.data ss = "\n*** Flux points info ***\n\n" ss += "Number of flux points: {}\n".format(d["N_Flux_Points"]) ss += "Flux points table: \n\n" lines = format_flux_points_table(self.flux_points_table).pformat( max_width=-1, max_lines=-1 ) ss += "\n".join(lines) return ss + "\n" def _info_components(self): """Print information about the components.""" ss = "\n*** Gaussian component info ***\n\n" ss += "Number of components: {}\n".format(len(self.components)) ss += "{:<20s} : {}\n\n".format("Spatial components", self.data["Components"]) for component in self.components: ss += str(component) ss += "\n\n" return ss @property def energy_range(self): """Spectral model energy range as a `~astropy.units.Quantity` with length 2.""" energy_min, energy_max = ( self.data["Energy_Range_Spec_Min"], self.data["Energy_Range_Spec_Max"], ) if np.isnan(energy_min): energy_min = u.Quantity(0.2, "TeV") if np.isnan(energy_max): energy_max = u.Quantity(50, "TeV") return u.Quantity([energy_min, energy_max], "TeV") def spectral_model(self, which="best"): """Spectral model as a `~gammapy.modeling.models.SpectralModel` object. One of the following models (given by ``Spectral_Model`` in the catalog): - ``PL`` : `~gammapy.modeling.models.PowerLawSpectralModel` - ``ECPL`` : `~gammapy.modeling.models.ExpCutoffPowerLawSpectralModel` Parameters ---------- which : {'best', 'pl', 'ecpl'} Which spectral model. 
""" data = self.data if which == "best": spec_type = self.data["Spectral_Model"].strip().lower() elif which in {"pl", "ecpl"}: spec_type = which else: raise ValueError(f"Invalid selection: which = {which!r}") if spec_type == "pl": tag = "PowerLawSpectralModel" pars = { "index": data["Index_Spec_PL"], "amplitude": data["Flux_Spec_PL_Diff_Pivot"], "reference": data["Energy_Spec_PL_Pivot"], } errs = { "amplitude": data["Flux_Spec_PL_Diff_Pivot_Err"], "index": data["Index_Spec_PL_Err"], } elif spec_type == "ecpl": tag = "ExpCutoffPowerLawSpectralModel" pars = { "index": data["Index_Spec_ECPL"], "amplitude": data["Flux_Spec_ECPL_Diff_Pivot"], "reference": data["Energy_Spec_ECPL_Pivot"], "lambda_": data["Lambda_Spec_ECPL"], } errs = { "index": data["Index_Spec_ECPL_Err"], "amplitude": data["Flux_Spec_ECPL_Diff_Pivot_Err"], "lambda_": data["Lambda_Spec_ECPL_Err"], } else: raise ValueError(f"Invalid spec_type: {spec_type}") model = Model.create(tag, "spectral", **pars) errs["reference"] = 0 * u.TeV for name, value in errs.items(): model.parameters[name].error = value return model def spatial_model(self): """Spatial model as a `~gammapy.modeling.models.SpatialModel` object. One of the following models (given by ``Spatial_Model`` in the catalog): - ``Point-Like`` or has a size upper limit : `~gammapy.modeling.models.PointSpatialModel` - ``Gaussian``: `~gammapy.modeling.models.GaussianSpatialModel` - ``2-Gaussian`` or ``3-Gaussian``: composite model (using ``+`` with Gaussians) - ``Shell``: `~gammapy.modeling.models.ShellSpatialModel` """ d = self.data pars = {"lon_0": d["GLON"], "lat_0": d["GLAT"], "frame": "galactic"} errs = {"lon_0": d["GLON_Err"], "lat_0": d["GLAT_Err"]} spatial_type = self.data["Spatial_Model"] if spatial_type == "Point-Like": tag = "PointSpatialModel" elif spatial_type == "Gaussian": tag = "GaussianSpatialModel" pars["sigma"] = d["Size"] errs["sigma"] = d["Size_Err"] elif spatial_type in {"2-Gaussian", "3-Gaussian"}: raise ValueError("For Gaussian or Multi-Gaussian models, use sky_model()!") elif spatial_type == "Shell": # HGPS contains no information on shell width # Here we assume a 5% shell width for all shells. tag = "ShellSpatialModel" pars["radius"] = 0.95 * d["Size"] pars["width"] = d["Size"] - pars["radius"] errs["radius"] = d["Size_Err"] else: raise ValueError(f"Invalid spatial_type: {spatial_type}") model = Model.create(tag, "spatial", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def sky_model(self, which="best"): """Source sky model. Parameters ---------- which : {'best', 'pl', 'ecpl'} Which spectral model. Returns ------- sky_model : `~gammapy.modeling.models.Models` Models of the catalog object. """ models = self.components_models(which=which) if len(models) > 1: geom = self._get_components_geom(models) return models.to_template_sky_model(geom=geom, name=self.name) else: return models[0] def components_models(self, which="best", linked=False): """Models of the source components. Parameters ---------- which : {'best', 'pl', 'ecpl'} Which spectral model. linked : bool Each subcomponent of a source is given as a different `SkyModel`. If True, the spectral parameters except the normalisation are linked. Default is False. Returns ------- sky_model : `~gammapy.modeling.models.Models` Models of the catalog object. 
""" spatial_type = self.data["Spatial_Model"] missing_size = ( spatial_type == "Gaussian" and self.spatial_model().sigma.value == 0 ) if spatial_type in {"2-Gaussian", "3-Gaussian"} or missing_size: models = [] spectral_model = self.spectral_model(which=which) for component in self.components: spec_component = spectral_model.copy() weight = component.data["Flux_Map"] / self.data["Flux_Map"] spec_component.parameters["amplitude"].value *= weight if linked: for name in spec_component.parameters.names: if name not in ["norm", "amplitude"]: spec_component.__dict__[name] = spectral_model.parameters[ name ] model = SkyModel( spatial_model=component.spatial_model(), spectral_model=spec_component, name=component.name, ) models.append(model) else: models = [ SkyModel( spatial_model=self.spatial_model(), spectral_model=self.spectral_model(which=which), name=self.name, ) ] return Models(models) @staticmethod def _get_components_geom(models): energy_axis = MapAxis.from_energy_bounds( "100 GeV", "100 TeV", nbin=10, per_decade=True, name="energy_true" ) regions = [m.spatial_model.evaluation_region for m in models] geom = RegionGeom.from_regions( regions, binsz_wcs="0.02 deg", axes=[energy_axis] ) return geom.to_wcs_geom() @property def flux_points_table(self): """Flux points table as a `~astropy.table.Table`.""" table = Table() table.meta["sed_type_init"] = "dnde" table.meta["n_sigma_ul"] = 2 table.meta["n_sigma"] = 1 table.meta["sqrt_ts_threshold_ul"] = 1 mask = ~np.isnan(self.data["Flux_Points_Energy"]) table["e_ref"] = self.data["Flux_Points_Energy"][mask] table["e_min"] = self.data["Flux_Points_Energy_Min"][mask] table["e_max"] = self.data["Flux_Points_Energy_Max"][mask] table["dnde"] = self.data["Flux_Points_Flux"][mask] table["dnde_errn"] = self.data["Flux_Points_Flux_Err_Lo"][mask] table["dnde_errp"] = self.data["Flux_Points_Flux_Err_Hi"][mask] table["dnde_ul"] = self.data["Flux_Points_Flux_UL"][mask] table["is_ul"] = self.data["Flux_Points_Flux_Is_UL"][mask].astype("bool") return table class SourceCatalogHGPS(SourceCatalog): """HESS Galactic plane survey (HGPS) source catalog. Reference: https://www.mpi-hd.mpg.de/hfm/HESS/hgps/ One source is represented by `SourceCatalogObjectHGPS`. Examples -------- Let's assume you have downloaded the HGPS catalog FITS file, e.g. via: .. 
code-block:: bash curl -O https://www.mpi-hd.mpg.de/hfm/HESS/hgps/data/hgps_catalog_v1.fits.gz Then you can load it like this: >>> import matplotlib.pyplot as plt >>> from gammapy.catalog import SourceCatalogHGPS >>> filename = '$GAMMAPY_DATA/catalogs/hgps_catalog_v1.fits.gz' >>> cat = SourceCatalogHGPS(filename) Access a source by name: >>> source = cat['HESS J1843-033'] >>> print(source) *** Basic info *** Catalog row index (zero-based) : 64 Source name : HESS J1843-033 Analysis reference : HGPS Source class : Unid Identified object : -- Gamma-Cat id : 126 *** Info from map analysis *** RA : 280.952 deg = 18h43m48s DEC : -3.554 deg = -3d33m15s GLON : 28.899 +/- 0.072 deg GLAT : 0.075 +/- 0.036 deg Position Error (68%) : 0.122 deg Position Error (95%) : 0.197 deg ROI number : 3 Spatial model : 2-Gaussian Spatial components : HGPSC 083, HGPSC 084 TS : 256.6 sqrt(TS) : 16.0 Size : 0.239 +/- 0.063 (UL: 0.000) deg R70 : 0.376 deg RSpec : 0.376 deg Total model excess : 979.5 Excess in RSpec : 775.6 Model Excess in RSpec : 690.2 Background in RSpec : 1742.4 Livetime : 41.8 hours Energy threshold : 0.38 TeV Source flux (>1 TeV) : (2.882 +/- 0.305) x 10^-12 cm^-2 s^-1 = (12.75 +/- 1.35) % Crab Fluxes in RSpec (> 1 TeV): Map measurement : 2.267 x 10^-12 cm^-2 s^-1 = 10.03 % Crab Source model : 2.018 x 10^-12 cm^-2 s^-1 = 8.93 % Crab Other component model : 0.004 x 10^-12 cm^-2 s^-1 = 0.02 % Crab Large scale component model : 0.361 x 10^-12 cm^-2 s^-1 = 1.60 % Crab Total model : 2.383 x 10^-12 cm^-2 s^-1 = 10.54 % Crab Containment in RSpec : 70.0 % Contamination in RSpec : 15.3 % Flux correction (RSpec -> Total) : 121.0 % Flux correction (Total -> RSpec) : 82.7 % *** Info from spectral analysis *** Livetime : 16.8 hours Energy range: : 0.22 to 61.90 TeV Background : 5126.9 Excess : 980.1 Spectral model : PL TS ECPL over PL : -- Best-fit model flux(> 1 TeV) : (3.043 +/- 0.196) x 10^-12 cm^-2 s^-1 = (13.47 +/- 0.87) % Crab Best-fit model energy flux(1 to 10 TeV) : (10.918 +/- 0.733) x 10^-12 erg cm^-2 s^-1 Pivot energy : 1.87 TeV Flux at pivot energy : (0.914 +/- 0.058) x 10^-12 cm^-2 s^-1 TeV^-1 = (4.04 +/- 0.17) % Crab PL Flux(> 1 TeV) : (3.043 +/- 0.196) x 10^-12 cm^-2 s^-1 = (13.47 +/- 0.87) % Crab PL Flux(@ 1 TeV) : (3.505 +/- 0.247) x 10^-12 cm^-2 s^-1 TeV^-1 = (15.51 +/- 0.70) % Crab PL Index : 2.15 +/- 0.05 ECPL Flux(@ 1 TeV) : (0.000 +/- 0.000) x 10^-12 cm^-2 s^-1 TeV^-1 = (0.00 +/- 0.00) % Crab ECPL Flux(> 1 TeV) : (0.000 +/- 0.000) x 10^-12 cm^-2 s^-1 = (0.00 +/- 0.00) % Crab ECPL Index : -- +/- -- ECPL Lambda : 0.000 +/- 0.000 TeV^-1 ECPL E_cut : inf +/- nan TeV *** Flux points info *** Number of flux points: 6 Flux points table: e_ref e_min e_max dnde dnde_errn dnde_errp dnde_ul is_ul TeV TeV TeV 1 / (TeV s cm2) 1 / (TeV s cm2) 1 / (TeV s cm2) 1 / (TeV s cm2) ------ ------ ------ --------------- --------------- --------------- --------------- ----- 0.332 0.215 0.511 3.048e-11 6.890e-12 7.010e-12 4.455e-11 False 0.787 0.511 1.212 5.383e-12 6.655e-13 6.843e-13 6.739e-12 False 1.957 1.212 3.162 9.160e-13 9.732e-14 1.002e-13 1.120e-12 False 4.870 3.162 7.499 1.630e-13 2.001e-14 2.097e-14 2.054e-13 False 12.115 7.499 19.573 1.648e-14 3.124e-15 3.348e-15 2.344e-14 False 30.142 19.573 46.416 7.777e-16 4.468e-16 5.116e-16 1.883e-15 False *** Gaussian component info *** Number of components: 2 Spatial components : HGPSC 083, HGPSC 084 Component HGPSC 083: GLON : 29.047 +/- 0.026 deg GLAT : 0.244 +/- 0.027 deg Size : 0.125 +/- 0.021 deg Flux (>1 TeV) : (1.34 +/- 0.36) x 10^-12 cm^-2 s^-1 = (5.9 
+/- 1.6) % Crab Component HGPSC 084: GLON : 28.770 +/- 0.059 deg GLAT : -0.073 +/- 0.069 deg Size : 0.229 +/- 0.046 deg Flux (>1 TeV) : (1.54 +/- 0.47) x 10^-12 cm^-2 s^-1 = (6.8 +/- 2.1) % Crab *** Source associations info *** Source_Name Association_Catalog Association_Name Separation deg ---------------- ------------------- --------------------- ---------- HESS J1843-033 3FGL 3FGL J1843.7-0322 0.178442 HESS J1843-033 3FGL 3FGL J1844.3-0344 0.242835 HESS J1843-033 SNR G28.6-0.1 0.330376 Access source spectral data and plot it: >>> ax = plt.subplot() >>> source.spectral_model().plot(source.energy_range, ax=ax) # doctest: +SKIP >>> source.spectral_model().plot_error(source.energy_range, ax=ax) # doctest: +SKIP >>> source.flux_points.plot(ax=ax) # doctest: +SKIP Gaussian component information can be accessed as well, either via the source, or via the catalog: >>> source.components [SourceCatalogObjectHGPSComponent('HGPSC 083'), SourceCatalogObjectHGPSComponent('HGPSC 084')] >>> cat.gaussian_component(83) SourceCatalogObjectHGPSComponent('HGPSC 084') """ tag = "hgps" """Source catalog name (str).""" description = "H.E.S.S. Galactic plane survey (HGPS) source catalog" """Source catalog description (str).""" source_object_class = SourceCatalogObjectHGPS def __init__( self, filename="$GAMMAPY_DATA/catalogs/hgps_catalog_v1.fits.gz", hdu="HGPS_SOURCES", ): filename = make_path(filename) table = Table.read(filename, hdu=hdu) source_name_alias = ("Identified_Object",) super().__init__(table=table, source_name_alias=source_name_alias) self._table_components = Table.read(filename, hdu="HGPS_GAUSS_COMPONENTS") self._table_associations = Table.read(filename, hdu="HGPS_ASSOCIATIONS") self._table_associations["Separation"].format = ".6f" self._table_identifications = Table.read(filename, hdu="HGPS_IDENTIFICATIONS") self._table_large_scale_component = Table.read( filename, hdu="HGPS_LARGE_SCALE_COMPONENT" ) @property def table_components(self): """Gaussian component table as a `~astropy.table.Table`.""" return self._table_components @property def table_associations(self): """Source association table as a `~astropy.table.Table`.""" return self._table_associations @property def table_identifications(self): """Source identification table as a `~astropy.table.Table`.""" return self._table_identifications @property def table_large_scale_component(self): """Large scale component table as a `~astropy.table.Table`.""" return self._table_large_scale_component @property def large_scale_component(self): """Large scale component model as a `~gammapy.catalog.SourceCatalogLargeScaleHGPS` object.""" return SourceCatalogLargeScaleHGPS(self.table_large_scale_component) def _make_source_object(self, index): """Make `SourceCatalogObject` for given row index.""" source = super()._make_source_object(index) if source.data["Components"] != "": source.components = list(self._get_gaussian_components(source)) self._attach_association_info(source) if source.data["Source_Class"] != "Unid": self._attach_identification_info(source) return source def _get_gaussian_components(self, source): for name in source.data["Components"].split(", "): row_index = int(name.split()[-1]) - 1 yield self.gaussian_component(row_index) def _attach_association_info(self, source): t = self.table_associations mask = source.data["Source_Name"] == t["Source_Name"] source.associations = t[mask] def _attach_identification_info(self, source): t = self._table_identifications idx = np.nonzero(source.name == t["Source_Name"])[0][0] source.identification_info = 
table_row_to_dict(t[idx]) def gaussian_component(self, row_idx): """Gaussian component as a `SourceCatalogObjectHGPSComponent` object.""" data = table_row_to_dict(self.table_components[row_idx]) data[SourceCatalogObject._row_index_key] = row_idx return SourceCatalogObjectHGPSComponent(data=data) def to_models(self, which="best", components_status="independent"): """Create Models object from catalog. Parameters ---------- which : {'best', 'pl', 'ecpl'} Which spectral model. components_status : {'independent', 'linked', 'merged'} Relation between the sources components. Available options are: * 'independent': each subcomponent of a source is given as a different `SkyModel` (Default). * 'linked': each subcomponent of a source is given as a different `SkyModel` but the spectral parameters except the normalisation are linked. * 'merged': the subcomponents are merged into a single `SkyModel` given as a `~gammapy.modeling.models.TemplateSpatialModel` with a `~gammapy.modeling.models.PowerLawNormSpectralModel`. In that case the relative weights between the components cannot be adjusted. Returns ------- models : `~gammapy.modeling.models.Models` Models of the catalog. """ models = [] for source in self: if components_status == "merged": m = [source.sky_model(which=which)] else: m = source.components_models( which=which, linked=components_status == "linked" ) models.extend(m) return Models(models) class SourceCatalogLargeScaleHGPS: """Gaussian band model. This 2-dimensional model is Gaussian in ``y`` for a given ``x``, and the Gaussian parameters can vary in ``x``. One application of this model is the diffuse emission along the Galactic plane, i.e. ``x = GLON`` and ``y = GLAT``. Parameters ---------- table : `~astropy.table.Table` Table of Gaussian parameters. ``x``, ``amplitude``, ``mean``, ``stddev``. interp_kwargs : dict Keyword arguments passed to `ScaledRegularGridInterpolator`. """ def __init__(self, table, interp_kwargs=None): interp_kwargs = interp_kwargs or {} interp_kwargs.setdefault("values_scale", "lin") self.table = table glon = Angle(self.table["GLON"]).wrap_at("180d") interps = {} for column in table.colnames: values = self.table[column].quantity interp = ScaledRegularGridInterpolator((glon,), values, **interp_kwargs) interps[column] = interp self._interp = interps def _interpolate_parameter(self, parname, glon): glon = glon.wrap_at("180d") return self._interp[parname]((np.asanyarray(glon),), clip=False) def peak_brightness(self, glon): """Peak brightness at a given longitude. Parameters ---------- glon : `~astropy.coordinates.Angle` Galactic Longitude. """ return self._interpolate_parameter("Surface_Brightness", glon) def peak_brightness_error(self, glon): """Peak brightness error at a given longitude. Parameters ---------- glon : `~astropy.coordinates.Angle` Galactic Longitude. """ return self._interpolate_parameter("Surface_Brightness_Err", glon) def width(self, glon): """Width at a given longitude. Parameters ---------- glon : `~astropy.coordinates.Angle` Galactic Longitude. """ return self._interpolate_parameter("Width", glon) def width_error(self, glon): """Width error at a given longitude. Parameters ---------- glon : `~astropy.coordinates.Angle` Galactic Longitude. """ return self._interpolate_parameter("Width_Err", glon) def peak_latitude(self, glon): """Peak position at a given longitude. Parameters ---------- glon : `~astropy.coordinates.Angle` Galactic Longitude. 
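        Examples
        --------
        A sketch, where ``cat`` is a `SourceCatalogHGPS` instance (catalog
        file assumed available; the value depends on the table contents):

        >>> from astropy.coordinates import Angle
        >>> model = cat.large_scale_component  # doctest: +SKIP
        >>> glat = model.peak_latitude(Angle(30, "deg"))  # doctest: +SKIP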
""" return self._interpolate_parameter("GLAT", glon) def peak_latitude_error(self, glon): """Peak position error at a given longitude. Parameters ---------- glon : `~astropy.coordinates.Angle` Galactic Longitude. """ return self._interpolate_parameter("GLAT_Err", glon) def evaluate(self, position): """Evaluate model at a given position. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Position on the sky. """ glon, glat = position.galactic.l, position.galactic.b width = self.width(glon) amplitude = self.peak_brightness(glon) mean = self.peak_latitude(glon) return Gaussian1D.evaluate(glat, amplitude=amplitude, mean=mean, stddev=width) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/lhaaso.py0000644000175100001770000001436514721316200017310 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np import astropy.units as u from astropy.table import Table from gammapy.maps import MapAxis, RegionGeom from gammapy.modeling.models import Model, Models, SkyModel from gammapy.utils.gauss import Gauss2DPDF from gammapy.utils.scripts import make_path from .core import SourceCatalog, SourceCatalogObject __all__ = [ "SourceCatalogObject1LHAASO", "SourceCatalog1LHAASO", ] class SourceCatalogObject1LHAASO(SourceCatalogObject): """One source from the 1LHAASO catalog. Catalog is represented by `~gammapy.catalog.SourceCatalog1LHAASO`. """ _source_name_key = "Source_Name" def _parse(self, name, which): if which in self.data["Model_a"]: tag = "" elif which in self.data["Model_b"]: tag = "_b" else: raise ValueError("Invalid model component name") is_ul = False value = u.Quantity(self.data[f"{name}{tag}"]) if ( np.isnan(value) or value == 0 * value.unit ) and f"{name}_ul{tag}" in self.data: value = self.data[f"{name}_ul{tag}"] is_ul = True return value, is_ul def _get(self, name, which): value, _ = self._parse(name, which) return value def spectral_model(self, which): """Spectral model as a `~gammapy.modeling.models.PowerLawSpectralModel` object. * ``which="KM2A"`` - Sky model for KM2A analysis only. * ``which="WCDA"`` - Sky model for WCDA analysis only. """ pars = { "reference": self._get("E0", which), "amplitude": self._get("N0", which), "index": self._get("gamma", which), } errs = { "amplitude": self._get("N0_err", which), "index": self._get("gamma_err", which), } model = Model.create("PowerLawSpectralModel", "spectral", **pars) for name, value in errs.items(): model.parameters[name].error = value return model def spatial_model(self, which): """Spatial model as a `~gammapy.modeling.models.SpatialModel` object. * ``which="KM2A"`` - Sky model for KM2A analysis only. * ``which="WCDA"`` - Sky model for WCDA analysis only. 
""" lat_0 = self._get("DECJ2000", which) pars = {"lon_0": self._get("RAJ2000", which), "lat_0": lat_0, "frame": "fk5"} pos_err = self._get("pos_err", which) scale_r95 = Gauss2DPDF().containment_radius(0.95) errs = { "lat_0": pos_err / scale_r95, "lon_0": pos_err / scale_r95 / np.cos(lat_0), } r39, is_ul = self._parse("r39", which) if not is_ul: pars["sigma"] = r39 model = Model.create("GaussianSpatialModel", "spatial", **pars) else: model = Model.create("PointSpatialModel", "spatial", **pars) for name, value in errs.items(): model.parameters[name].error = value return model @staticmethod def _get_components_geom(models): energy_axis = MapAxis.from_energy_bounds( "0.1 TeV", "2000 TeV", nbin=10, per_decade=True, name="energy" ) regions = [m.spatial_model.evaluation_region for m in models] geom = RegionGeom.from_regions( regions, binsz_wcs="0.05 deg", axes=[energy_axis] ) return geom.to_wcs_geom() def sky_model(self, which="both"): """Sky model as a `~gammapy.modeling.models.SkyModel` object. * ``which="both"`` - Create a composite template if both models are available, or, use the available one if only one is present. * ``which="KM2A"`` - Sky model for KM2A analysis if available. * ``which="WCDA"`` - Sky model for WCDA analysis if available. """ if which == "both": wcda = self.sky_model(which="WCDA") km2a = self.sky_model(which="KM2A") models = [m for m in [wcda, km2a] if m is not None] if len(models) == 2: geom = self._get_components_geom(models) mask = geom.energy_mask(energy_max=25 * u.TeV) geom = geom.as_energy_true wcda_map = Models(wcda).to_template_sky_model(geom).spatial_model.map model = Models(km2a).to_template_sky_model(geom, name=km2a.name) model.spatial_model.map.data[mask] = wcda_map.data[mask] model.spatial_model.filename = f"{model.name}.fits" return model else: return models[0] else: _, is_ul = self._parse("N0", which) if is_ul: return None else: return SkyModel( spatial_model=self.spatial_model(which), spectral_model=self.spectral_model(which), name=self.name, ) class SourceCatalog1LHAASO(SourceCatalog): """First LHAASO catalog. One source is represented by `~gammapy.catalog.SourceCatalogObject1LHAASO`. The data is from table 1 in the paper [1]_. The catalog table contains 90 rows / sources. References ---------- .. [1] 1LHAASO: The First LHAASO Catalog of Gamma-Ray Sources, https://ui.adsabs.harvard.edu/abs/2023arXiv230517030C/abstract """ tag = "1LHAASO" """Catalog name""" description = "1LHAASO catalog from the LHAASO observatory" """Catalog description""" source_object_class = SourceCatalogObject1LHAASO def __init__(self, filename="$GAMMAPY_DATA/catalogs/1LHAASO_catalog.fits"): table = Table.read(make_path(filename)) source_name_key = "Source_Name" super().__init__(table=table, source_name_key=source_name_key) def to_models(self, which="both"): """Create Models object from catalog. * ``which="both"`` - Use first model or create a composite template if both models are available. * ``which="KM2A"`` - Sky model for KM2A analysis if available. * ``which="WCDA"`` - Sky model for WCDA analysis if available. 
""" models = Models() for _ in self: model = _.sky_model(which) if model: models.append(model) return models ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.1846418 gammapy-1.3/gammapy/catalog/tests/0000755000175100001770000000000014721316215016626 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/__init__.py0000644000175100001770000000010014721316200020720 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.188642 gammapy-1.3/gammapy/catalog/tests/data/0000755000175100001770000000000014721316215017537 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/2fhl_j0822.6-4250e.txt0000644000175100001770000000257214721316200022661 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 134 Source name : 2FHL J0822.6-4250e Associations : Puppis A, 3FGL J0822.6-4250e, 1FHL J0822.6-4250e, -- ASSOC_PROB_BAY : -- ASSOC_PROB_LR : -- Class : snr TeVCat flag : N/A *** Other info *** Test statistic (50 GeV - 2 TeV) : 63.870 *** Position info *** RA : 125.660 deg DEC : -42.840 deg GLON : 260.317 deg GLAT : -3.276 deg Error on position (68%) : 0.0000 deg ROI number : 33 *** Extended source information *** Model form : Disk Model semimajor : 0.3700 deg Model semiminor : 0.3700 deg Position angle : 90.0000 deg Spatial filename : PuppisA.fits *** Spectral fit info *** Power-law spectral index : 4.120 +- 0.840 Integral flux (50 GeV - 2 TeV) : 6.87e-11 +- 1.67e-11 cm-2 s-1 Energy flux (50 GeV - 2 TeV) : 8.09e-12 +- 2.27e-12 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp is_ul flux_ul GeV GeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) ------- -------- ----------- ----------- ----------- ----- ----------- 50.000 171.000 4.233e-11 9.625e-12 1.104e-11 False nan 171.000 585.000 1.330e-16 nan 5.867e-12 True 1.173e-11 585.000 2000.000 4.808e-18 nan 7.717e-12 True 1.543e-11 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/2fhl_j1445.1-0329.txt0000644000175100001770000000226214721316200022510 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 221 Source name : 2FHL J1445.1-0329 Associations : RBS 1424, 3FGL J1445.0-0328, --, -- ASSOC_PROB_BAY : 0.997 ASSOC_PROB_LR : 0.944 Class : bll TeVCat flag : N/A *** Other info *** Test statistic (50 GeV - 2 TeV) : 28.530 *** Position info *** RA : 221.282 deg DEC : -3.494 deg GLON : 349.199 deg GLAT : 48.891 deg Error on position (68%) : 0.0563 deg ROI number : 68 *** Spectral fit info *** Power-law spectral index : 2.770 +- 0.810 Integral flux (50 GeV - 2 TeV) : 2.05e-11 +- 9.3e-12 cm-2 s-1 Energy flux (50 GeV - 2 TeV) : 3.56e-12 +- 2.18e-12 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp is_ul flux_ul GeV GeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) ------- -------- ----------- ----------- ----------- ----- ----------- 50.000 171.000 1.831e-11 7.166e-12 9.314e-12 False nan 171.000 585.000 8.906e-16 nan 5.473e-12 True 1.095e-11 585.000 2000.000 1.017e-16 nan 6.759e-12 True 1.352e-11 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 
gammapy-1.3/gammapy/catalog/tests/data/2hwc_j0534+220.txt0000644000175100001770000000056214721316200022264 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 0 Source name : 2HWC J0534+220 *** Position info *** RA: 83.628 deg DEC: 22.024 deg GLON: 184.547 deg GLAT: -5.783 deg Position error: 0.057 deg *** Spectral info *** Spectrum 0: Flux at 7 TeV: 1.85e-13 +- 2.38e-15 cm-2 s-1 TeV-1 Spectral index: -2.580 +- 0.010 Test Radius: 0.0 deg No second spectrum available././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/2hwc_j0631+169.txt0000644000175100001770000000071214721316200022273 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 1 Source name : 2HWC J0631+169 *** Position info *** RA: 97.998 deg DEC: 16.997 deg GLON: 195.614 deg GLAT: 3.507 deg Position error: 0.114 deg *** Spectral info *** Spectrum 0: Flux at 7 TeV: 6.71e-15 +- 1.47e-15 cm-2 s-1 TeV-1 Spectral index: -2.570 +- 0.150 Test Radius: 0.0 deg Spectrum 1: Flux at 7 TeV: 4.87e-14 +- 6.85e-15 cm-2 s-1 TeV-1 Spectral index: -2.230 +- 0.080 Test Radius: 2.0 deg ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/2pc_j0034-0534.txt0000644000175100001770000000625114721316200022173 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 3 Source name : J0034-0534 *** Other info *** Binary : Y *** Pulsar info *** Period : 1.880 ms P_Dot : 4.980e-21 E_Dot : 1.720e+34 erg / s Type : MSP *** Position info *** RA : 8.591 deg DEC : -5.577 deg GLON : 111.492 deg GLAT : -68.069 deg *** Spectral info *** On peak : N TS DC : 563 TS cutoff : 45 TS b free : 0 * PLSuperExpCutoff b = 1 * Amplitude : 8.691e-12 1 / (MeV s cm2) +- 1.208e-12 1 / (MeV s cm2) Index 1 : 1.436 +- 0.169 Index 2 : 1.000 Reference : 755.700 MeV Ecut : 1831.000 MeV +- 416.700 MeV * PLSuperExpCutoff b free * Amplitude : 0.000e+00 1 / (MeV s cm2) +- 0.000e+00 1 / (MeV s cm2) Index 1 : -- +- -- Index 2 : -- +- -- Reference : 0.000 MeV Ecut : 0.000 MeV +- 0.000 MeV * PowerLaw * Amplitude : 2.435e-12 1 / (MeV s cm2) +- 1.520e-13 1 / (MeV s cm2) Index : 2.219 +- 0.049 Reference : 1000.000 MeV *** Spectral points *** e_min e_max e_ref flux flux_err e2dnde e2dnde_err is_ul flux_ul e2dnde_ul GeV GeV GeV ph / (GeV s cm2) ph / (GeV s cm2) erg / (s cm2) erg / (s cm2) ph / (GeV s cm2) erg / (s cm2) ------ ------- ------ ---------------- ---------------- ------------- ------------- ----- ---------------- ------------- 0.100 0.178 0.128 1.747e-07 0.000e+00 4.586e-12 0.000e+00 True 1.747e-07 4.586e-12 0.178 0.316 0.228 5.343e-08 1.041e-08 4.436e-12 8.638e-13 False nan nan 0.316 0.562 0.405 1.886e-08 2.588e-09 4.950e-12 6.794e-13 False nan nan 0.562 1.000 0.720 5.558e-09 6.914e-10 4.614e-12 5.740e-13 False nan nan 1.000 1.778 1.280 1.927e-09 2.320e-10 5.059e-12 6.090e-13 False nan nan 1.778 3.162 2.276 5.261e-10 7.922e-11 4.368e-12 6.576e-13 False nan nan 3.162 5.623 4.048 8.004e-11 2.215e-11 2.101e-12 5.816e-13 False nan nan 5.623 10.000 7.199 1.634e-11 0.000e+00 1.356e-12 0.000e+00 True 1.634e-11 1.356e-12 10.000 17.783 12.801 3.678e-12 0.000e+00 9.655e-13 0.000e+00 True 3.678e-12 9.655e-13 17.783 31.623 22.764 1.158e-12 0.000e+00 9.616e-13 0.000e+00 True 1.158e-12 9.616e-13 31.623 56.234 40.482 6.298e-13 0.000e+00 1.653e-12 0.000e+00 True 6.298e-13 1.653e-12 56.234 100.000 71.988 9.865e-13 0.000e+00 8.190e-12 0.000e+00 True 9.865e-13 8.190e-12 *** Phasogram info *** 
Number of peaks : 2 Peak separation : 0.285 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/2pc_j1112-6103.txt0000644000175100001770000000472714721316200022175 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 38 Source name : J1112-6103 *** Other info *** Binary : N *** Pulsar info *** Period : 64.970 ms P_Dot : 3.150e-14 E_Dot : 4.540e+36 erg / s Type : YRL *** Position info *** RA : 168.062 deg DEC : -61.059 deg GLON : 291.221 deg GLAT : -0.463 deg *** Spectral info *** On peak : N TS DC : 58 TS cutoff : 6 TS b free : 0 * PLSuperExpCutoff b = 1 * Amplitude : 3.555e-12 1 / (MeV s cm2) +- 1.157e-12 1 / (MeV s cm2) Index 1 : 1.557 +- 0.281 Index 2 : 1.000 Reference : 1000.000 MeV Ecut : 5889.000 MeV +- 3235.000 MeV * PLSuperExpCutoff b free * Amplitude : 0.000e+00 1 / (MeV s cm2) +- 0.000e+00 1 / (MeV s cm2) Index 1 : -- +- -- Index 2 : -- +- -- Reference : 0.000 MeV Ecut : 0.000 MeV +- 0.000 MeV * PowerLaw * Amplitude : 2.554e-12 1 / (MeV s cm2) +- 6.340e-13 1 / (MeV s cm2) Index : 2.070 +- 0.105 Reference : 1000.000 MeV *** Spectral points *** e_min e_max e_ref flux flux_err e2dnde e2dnde_err is_ul flux_ul e2dnde_ul GeV GeV GeV ph / (GeV s cm2) ph / (GeV s cm2) erg / (s cm2) erg / (s cm2) ph / (GeV s cm2) erg / (s cm2) ------ ------- ------ ---------------- ---------------- ------------- ------------- ----- ---------------- ------------- 0.100 0.316 0.152 2.641e-07 0.000e+00 9.769e-12 0.000e+00 True 2.641e-07 9.769e-12 0.316 1.000 0.481 2.015e-08 9.124e-09 7.452e-12 3.375e-12 False nan nan 1.000 3.162 1.520 1.999e-09 4.012e-10 7.396e-12 1.484e-12 False nan nan 3.162 10.000 4.805 8.503e-11 3.112e-11 3.145e-12 1.151e-12 False nan nan 10.000 31.623 15.195 1.433e-11 0.000e+00 5.302e-12 0.000e+00 True 1.433e-11 5.302e-12 31.623 100.000 48.052 1.453e-12 0.000e+00 5.373e-12 0.000e+00 True 1.453e-12 5.373e-12 *** Phasogram info *** Number of peaks : 2 Peak separation : 0.457 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/3fgl_J0000.1+6545.txt0000644000175100001770000000475114721316200022443 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 0 Source name : 3FGL J0000.1+6545 Extended name : Associations : Class1 : TeVCat flag : N *** Other info *** Other flags : 4 *** Position info *** RA : 0.038 deg DEC : 65.752 deg GLON : 117.694 deg GLAT : 3.403 deg Semimajor (68%) : 0.0628 deg Semiminor (68%) : 0.0481 deg Position angle (68%) : 41.03 deg Semimajor (95%) : 0.1019 deg Semiminor (95%) : 0.0780 deg Position angle (95%) : 41.03 deg ROI number : 185 *** Spectral info *** Spectrum type : PowerLaw Detection significance (100 MeV - 300 GeV) : 6.813 Significance curvature : 3.4 Pivot energy : 1159 MeV Power law spectral index : 2.411 Spectral index : 2.411 +- 0.082 Flux Density at pivot energy : 1.01e-12 +- 1.49e-13 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 1.02e-09 +- 1.58e-10 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 1.36e-11 +- 2.07e-12 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) --------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 100.000 300.000 1.808e-08 8.395e-09 8.236e-09 4.175e-12 1.938e-12 
1.902e-12 False nan nan 2.169 300.000 1000.000 6.941e-09 1.353e-09 1.367e-09 4.544e-12 8.856e-13 8.948e-13 False nan nan 5.270 1000.000 3000.000 1.237e-09 2.251e-10 2.343e-10 2.855e-12 5.198e-13 5.411e-13 False nan nan 6.022 3000.000 10000.000 5.781e-11 4.045e-11 4.853e-11 3.784e-13 2.648e-13 3.177e-13 False nan nan 1.509 10000.000 100000.000 2.840e-11 1.484e-11 1.915e-11 4.317e-13 2.257e-13 2.912e-13 False nan nan 2.421 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 40.754 No peak measured for this source. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/3fgl_J0001.4+2120.txt0000644000175100001770000000547514721316200022434 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 4 Source name : 3FGL J0001.4+2120 Extended name : Associations : TXS 2358+209, 3EG J2359+2041 Class1 : fsrq TeVCat flag : N *** Other info *** Other flags : 0 *** Position info *** RA : 0.361 deg DEC : 21.338 deg GLON : 107.665 deg GLAT : -40.047 deg Semimajor (68%) : 0.1298 deg Semiminor (68%) : 0.1162 deg Position angle (68%) : -32.55 deg Semimajor (95%) : 0.2105 deg Semiminor (95%) : 0.1884 deg Position angle (95%) : -32.55 deg ROI number : 326 *** Spectral info *** Spectrum type : LogParabola Detection significance (100 MeV - 300 GeV) : 11.350 Significance curvature : 4.4 beta : 0.5191451907157898 +- 0.15508420765399933 Pivot energy : 311 MeV Power law spectral index : 2.777 Spectral index : 2.305 +- 0.180 Flux Density at pivot energy : 2.52e-11 +- 2.8e-12 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 2.94e-10 +- 7.58e-11 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 8.07e-12 +- 8.38e-13 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) --------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 100.000 300.000 1.524e-08 2.378e-09 2.390e-09 3.773e-12 5.888e-13 5.918e-13 False nan nan 6.788 300.000 1000.000 3.640e-09 5.311e-10 5.450e-10 2.260e-12 3.297e-13 3.384e-13 False nan nan 7.591 1000.000 3000.000 3.565e-10 9.341e-11 1.018e-10 7.153e-13 1.874e-13 2.042e-13 False nan nan 4.517 3000.000 10000.000 8.414e-15 nan 1.924e-11 4.392e-17 nan 1.004e-13 True 3.849e-11 2.009e-13 0.000 10000.000 100000.000 1.036e-14 nan 1.126e-11 9.595e-17 nan 1.043e-13 True 2.252e-11 2.087e-13 0.000 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 130.336 Significance peak (100 MeV - 100 GeV) : 8.248 Integral flux peak (100 MeV - 100 GeV) : 1.16e-07 +- 1.86e-08 cm^-2 s^-1 Time peak : 3.01e+08 s (Mission elapsed time) Peak interval : 30.4 day ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/3fgl_J0023.4+0923.txt0000644000175100001770000000507214721316200022442 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 55 Source name : 3FGL J0023.4+0923 Extended name : Associations : PSR J0023+0923 Class1 : PSR TeVCat flag : N *** Other info *** Other flags : 0 *** Position info *** RA : 5.865 deg DEC : 9.389 deg GLON : 111.455 deg GLAT : -52.858 deg Semimajor (68%) : 0.0525 deg Semiminor (68%) : 0.0485 deg Position angle (68%) : -86.71 deg Semimajor (95%) : 0.0852 
deg Semiminor (95%) : 0.0786 deg Position angle (95%) : -86.71 deg ROI number : 576 *** Spectral info *** Spectrum type : PLExpCutoff Detection significance (100 MeV - 300 GeV) : 13.328 Significance curvature : 5.7 Cutoff energy : 984 +- 239 MeV Pivot energy : 905 MeV Power law spectral index : 2.283 Spectral index : 0.971 +- 0.303 Flux Density at pivot energy : 2.27e-12 +- 2.53e-13 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 1.12e-09 +- 1.25e-10 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 7.28e-12 +- 8.09e-13 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) --------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 100.000 300.000 3.158e-09 2.421e-09 2.484e-09 8.263e-13 6.335e-13 6.500e-13 False nan nan 1.316 300.000 1000.000 3.743e-09 4.966e-10 5.139e-10 2.719e-12 3.607e-13 3.733e-13 False nan nan 8.734 1000.000 3000.000 9.629e-10 1.320e-10 1.395e-10 2.157e-12 2.958e-13 3.125e-13 False nan nan 9.995 3000.000 10000.000 5.029e-11 2.401e-11 3.044e-11 2.625e-13 1.253e-13 1.589e-13 False nan nan 2.764 10000.000 100000.000 7.682e-12 nan 1.888e-11 7.117e-14 nan 1.749e-13 True 4.544e-11 4.210e-13 0.960 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 51.375 No peak measured for this source. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/3fgl_J0835.3-4510.txt0000644000175100001770000000517514721316200022456 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 960 Source name : 3FGL J0835.3-4510 Extended name : Associations : PSR J0835-4510, Vela, Vela Pulsar, 1AGL J0835-4509 Class1 : PSR TeVCat flag : P *** Other info *** Other flags : 0 *** Position info *** RA : 128.838 deg DEC : -45.178 deg GLON : 263.555 deg GLAT : -2.787 deg Semimajor (68%) : 0.0032 deg Semiminor (68%) : 0.0032 deg Position angle (68%) : 20.08 deg Semimajor (95%) : 0.0052 deg Semiminor (95%) : 0.0052 deg Position angle (95%) : 20.08 deg ROI number : 88 *** Spectral info *** Spectrum type : PLSuperExpCutoff Detection significance (100 MeV - 300 GeV) : 1048.959 Significance curvature : 54.0 Super-exponential cutoff index : 0.47586068511009216 +- 0.008638832718133926 Pivot energy : 833 MeV Power law spectral index : 1.952 Spectral index : 1.003 +- 0.018 Flux Density at pivot energy : 2.33e-09 +- 4.4e-12 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 1.3e-06 +- 2.89e-09 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 8.93e-09 +- 1.73e-11 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) --------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 100.000 300.000 5.376e-06 2.615e-08 2.615e-08 1.372e-09 6.673e-12 6.673e-12 False nan nan 160.971 300.000 1000.000 3.217e-06 7.406e-09 7.406e-09 2.292e-09 5.278e-12 5.278e-12 False nan nan 528.538 1000.000 3000.000 1.070e-06 3.075e-09 3.075e-09 2.525e-09 7.256e-12 7.256e-12 False nan nan 711.632 3000.000 10000.000 2.162e-07 1.255e-09 1.255e-09 1.321e-09 7.670e-12 7.670e-12 False nan nan 
465.899 10000.000 100000.000 1.138e-08 2.888e-10 2.888e-10 1.055e-10 2.676e-12 2.676e-12 False nan nan 111.192 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 20.009 No peak measured for this source. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/3fhl_j2301.9+5855e.txt0000644000175100001770000000477014721316200022673 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 1500 Source name : 3FHL J2301.9+5855e Extended name : FGES J2301.9+5855 Associations : CTB 109, 3FGL J2301.2+5853 ASSOC_PROB_BAY : -- ASSOC_PROB_LR : -- Class : SNR TeVCat flag : N *** Other info *** Significance (10 GeV - 2 TeV) : 7.974 Npred : 53.1 HEP Energy : 364.828 GeV HEP Probability : 0.849 Bayesian Blocks : 1 Redshift : -- NuPeak_obs : 0.0 Hz *** Position info *** RA : 345.494 deg DEC : 58.920 deg GLON : 109.203 deg GLAT : -1.003 deg Semimajor (95%) : 0.0000 deg Semiminor (95%) : 0.0000 deg Position angle (95%) : 0.00 deg ROI number : 479 *** Extended source information *** Model form : Disk Model semimajor : 0.2488 deg Model semiminor : 0.2488 deg Position angle : 0.0000 deg Spatial function : RadialDisk Spatial filename : *** Spectral fit info *** Spectrum type : PowerLaw Significance curvature : 1.0 Power-law spectral index : 2.006 +- 0.174 Pivot energy : 30.2 GeV Flux Density at pivot energy : 1.61e-12 +- 2.81e-13 cm-2 GeV-1 s-1 Integral flux (10 GeV - 1 TeV) : 1.46e-10 +- 2.57e-11 cm-2 s-1 Energy flux (10 GeV - TeV) : 1.08e-11 +- 2.9e-12 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts GeV GeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) ------- -------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 10.000 20.000 7.545e-11 1.849e-11 2.071e-11 2.417e-12 5.923e-13 6.632e-13 False nan nan 5.446 20.000 50.000 4.747e-11 1.296e-11 1.507e-11 2.534e-12 6.916e-13 8.042e-13 False nan nan 5.218 50.000 150.000 2.058e-11 7.593e-12 9.512e-12 2.471e-12 9.117e-13 1.142e-12 False nan nan 4.016 150.000 500.000 4.437e-12 3.076e-12 4.877e-12 1.522e-12 1.055e-12 1.673e-12 False nan nan 1.791 500.000 2000.000 6.598e-17 nan 4.626e-12 7.039e-17 nan 4.936e-12 True 9.252e-12 9.872e-12 0.000 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/3pc_j0834-4159.txt0000644000175100001770000000102514721316200022205 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 51 Source name : J0834-4159 *** Pulsar info *** Period : 0.121 P_Dot : 4.281e-15 E_Dot : 9.511e+34 *** Position info *** RA : 128.574 deg DEC : -41.993 deg GLON : 260.886 deg GLAT : -1.036 deg *** Spectral info *** No spectral info available. *** Spectral points *** No spectral points available. *** Phasogram info *** No phasogram info available. 
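
The J0834-4159 entry above carries no spectral fit; like every file in gammapy/catalog/tests/data/, it is simply the str() of a catalog source object written to disk. A minimal sketch of the same round trip for this file, assuming $GAMMAPY_DATA points at the catalog data and that the 3PC catalog is registered under the "3pc" tag (an assumption -- make.py further below shows the identical pattern for the other catalogs):

    from gammapy.catalog import CATALOG_REGISTRY

    # Assumption: "3pc" is the registry tag for SourceCatalog3PC; make.py below
    # demonstrates this exact pattern for "3fgl", "4fgl", "2fhl", "3fhl",
    # "2hwc", "hgps" and "gamma-cat".
    cat = CATALOG_REGISTRY["3pc"]()
    with open("3pc_j0834-4159.txt", "w") as fh:
        fh.write(str(cat["J0834-4159"]))
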
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/3pc_j0835-4510.txt0000644000175100001770000000554214721316200022205 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 52 Source name : J0835-4510 *** Pulsar info *** Period : 0.089 P_Dot : 1.223e-13 E_Dot : 6.764e+36 *** Position info *** RA : 128.836 deg DEC : -45.176 deg GLON : 263.552 deg GLAT : -2.787 deg *** Spectral info *** TS : 443007 Significance (DC) : 666 Spectrum Type : PLSuperExpCutoff4 * SuperExpCutoffPowerLaw4FGLDR3 b = 2/3 * Amplitude : 5.290e-10 1 / (MeV s cm2) +- 1.665e-12 1 / (MeV s cm2) Index 1 : 2.153 +- 0.002 Index 2 : 0.667 Reference : 1760.385 MeV Expfactor : 0.517 +- 0.003 * SuperExpCutoffPowerLaw4FGLDR3 b free * Amplitude : 5.327e-10 1 / (MeV s cm2) +- 1.649e-12 1 / (MeV s cm2) Index 1 : 2.196 +- 0.004 Index 2 : 0.493 +- 0.011 Reference : 1760.385 MeV Expfactor : 0.549 +- 0.004 *** Spectral points *** e_min e_max e_ref flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV MeV 1 / (s cm2) 1 / (s cm2) 1 / (s cm2) MeV / (s cm2) MeV / (s cm2) MeV / (s cm2) 1 / (s cm2) MeV / (s cm2) ---------- ----------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 50.000 100.000 70.711 5.392e-06 3.093e-07 3.093e-07 5.543e-04 3.179e-05 3.179e-05 False nan nan 6.801 100.000 300.000 173.205 5.596e-06 4.955e-07 4.955e-07 8.878e-04 7.861e-05 7.861e-05 False nan nan 35.488 300.000 1000.000 547.723 3.387e-06 2.368e-08 2.368e-08 1.503e-03 1.050e-05 1.050e-05 False nan nan 202.272 1000.000 3000.000 1732.051 1.087e-06 4.350e-09 4.350e-09 1.600e-03 6.407e-06 6.407e-06 False nan nan 405.607 3000.000 10000.000 5477.226 2.264e-07 1.099e-09 1.099e-09 8.652e-04 4.201e-06 4.201e-06 False nan nan 441.811 10000.000 30000.000 17320.508 1.171e-08 1.586e-10 1.586e-10 1.425e-04 1.930e-06 1.930e-06 False nan nan 188.212 30000.000 100000.000 54772.256 2.162e-10 2.102e-11 2.102e-11 7.046e-06 6.849e-07 6.849e-07 False nan nan 26.673 100000.000 1000000.000 316227.766 4.480e-16 4.480e-16 3.641e-12 2.591e-11 2.591e-11 2.106e-07 True 7.283e-12 4.212e-07 0.000 *** Phasogram info *** Number of peaks : 4.000 Ph1 (peak one) : 0.133 Ph2 (peak two) : 0.565 Peak separation : 0.432 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/3pc_j0940-5428.txt0000644000175100001770000000540614721316200022212 0ustar00runnerdocker*** Basic info *** Catalog row index (zero-based) : 56 Source name : J0940-5428 *** Pulsar info *** Period : 0.088 P_Dot : 3.280e-14 E_Dot : 1.929e+36 *** Position info *** RA : 145.243 deg DEC : -54.478 deg GLON : 277.510 deg GLAT : -1.292 deg *** Spectral info *** TS : 688 Significance (DC) : 26 Spectrum Type : PLSuperExpCutoff4 * SuperExpCutoffPowerLaw4FGLDR3 b = 2/3 * Amplitude : 9.921e-13 1 / (MeV s cm2) +- 5.807e-14 1 / (MeV s cm2) Index 1 : 2.308 +- 0.073 Index 2 : 0.667 Reference : 1865.195 MeV Expfactor : 0.850 +- 0.147 * SuperExpCutoffPowerLaw4FGLDR3 b free * Amplitude : 0.000e+00 1 / (MeV s cm2) +- 0.000e+00 1 / (MeV s cm2) Index 1 : -- +- -- Index 2 : -- +- -- Reference : 0.000 MeV Expfactor : -- +- -- *** Spectral points *** e_min e_max e_ref flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV MeV 1 / (s cm2) 1 / (s cm2) 1 / (s cm2) MeV / (s cm2) MeV / (s cm2) MeV / (s cm2) 1 / (s cm2) MeV / (s 
cm2) ---------- ----------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 50.000 100.000 70.711 8.958e-09 8.958e-09 3.888e-08 9.257e-07 9.257e-07 4.018e-06 True 8.672e-08 8.961e-06 0.243 100.000 300.000 173.205 2.116e-09 2.116e-09 5.924e-09 3.405e-07 3.405e-07 9.532e-07 True 1.396e-08 2.247e-06 0.285 300.000 1000.000 547.723 6.219e-09 6.998e-10 7.105e-10 2.797e-06 3.147e-07 3.195e-07 False nan nan 9.466 1000.000 3000.000 1732.051 2.344e-09 1.448e-10 1.448e-10 3.433e-06 2.120e-07 2.120e-07 False nan nan 20.187 3000.000 10000.000 5477.226 3.715e-10 3.881e-11 4.073e-11 1.338e-06 1.398e-07 1.467e-07 False nan nan 13.200 10000.000 30000.000 17320.508 8.594e-12 5.297e-12 6.847e-12 1.017e-07 6.267e-08 8.102e-08 True 2.229e-11 2.637e-07 1.860 30000.000 100000.000 54772.256 4.300e-16 4.300e-16 4.000e-12 1.401e-11 1.401e-11 1.304e-07 True 8.001e-12 2.607e-07 0.000 100000.000 1000000.000 316227.766 1.867e-12 1.867e-12 4.319e-12 1.080e-07 1.080e-07 2.498e-07 True 1.051e-11 6.076e-07 0.812 *** Phasogram info *** Number of peaks : 1.000 Ph1 (peak one) : 0.454 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/4fgl-dr4_J0534.5+2200.txt0000644000175100001770000000675714721316200023143 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 1391 Source name : 4FGL J0534.5+2200 Extended name : Associations : PSR J0534+2200, Crab IC field, Crab pulsar, 3FGL J0534.5+2201, 3FHL J0534.5+2201, 2AGL J0534+2205, EGR J0534+2159 ASSOC_PROB_BAY : 1.000 ASSOC_PROB_LR : 1.000 Class1 : PSR Class2 : TeVCat flag : P *** Other info *** Significance (100 MeV - 1 TeV) : 25.186 Npred : 98594.1 Other flags : 784 *** Position info *** RA : 83.637 deg DEC : 22.015 deg GLON : 184.559 deg GLAT : -5.781 deg Semimajor (68%) : 0.0045 deg Semiminor (68%) : 0.0044 deg Position angle (68%) : -71.52 deg Semimajor (95%) : 0.0073 deg Semiminor (95%) : 0.0072 deg Position angle (95%) : -71.52 deg ROI number : 270 *** Spectral info *** Spectrum type : PLSuperExpCutoff Detection significance (100 MeV - 1 TeV) : 25.186 Exponential factor : 0.2337 +- 0.0188 Super-exponential cutoff index : 0.5219 +- 0.0946 Significance curvature : 27.6 Pivot energy : 1343 MeV Spectral index : 2.274 +- 0.008 Flux Density at pivot energy : 1.1e-10 +- 6.99e-13 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 1.54e-07 +- 8.44e-10 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 1.46e-09 +- 2.16e-11 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (s cm2) 1 / (s cm2) 1 / (s cm2) erg / (s cm2) erg / (s cm2) erg / (s cm2) 1 / (s cm2) erg / (s cm2) ---------- ----------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 50.000 100.000 2.365e-06 2.901e-07 2.916e-07 3.800e-10 4.662e-11 4.686e-11 False nan nan 8.309 100.000 300.000 1.595e-06 3.088e-08 3.088e-08 3.840e-10 7.436e-12 7.436e-12 False nan nan 60.667 300.000 1000.000 5.546e-07 5.472e-09 5.472e-09 3.760e-10 3.710e-12 3.710e-12 False nan nan 138.346 1000.000 3000.000 1.255e-07 8.804e-10 8.804e-10 2.918e-10 2.047e-12 2.047e-12 False nan nan 155.573 3000.000 10000.000 2.464e-08 8.789e-10 8.789e-10 1.552e-10 5.536e-12 5.536e-12 False nan nan 9.059 10000.000 30000.000 2.149e-09 4.120e-10 2.330e-10 4.498e-11 8.622e-12 4.877e-12 False nan nan 3.116 30000.000 100000.000 1.542e-16 nan 
4.113e-12 8.085e-18 nan 2.157e-13 True 8.227e-12 4.314e-13 0.000 100000.000 1000000.000 6.125e-18 nan 3.875e-12 5.675e-19 nan 3.590e-13 True 7.750e-12 7.180e-13 0.000 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 156.729 Significance peak (100 MeV - 100 GeV) : 154.788 Integral flux peak (100 MeV - 100 GeV) : 2.54e-06 +- 2.55e-08 cm^-2 s^-1 Time peak : 3.18e+08 s (Mission elapsed time) Peak interval : 3.65e+02 day ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/4fgl_J0000.3-7355.txt0000644000175100001770000000547014721316200022447 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 0 Source name : 4FGL J0000.3-7355 Extended name : Associations : ASSOC_PROB_BAY : 0.000 ASSOC_PROB_LR : 0.000 Class1 : Class2 : TeVCat flag : N *** Other info *** Significance (100 MeV - 1 TeV) : 7.400 Npred : 230.9 Other flags : 0 *** Position info *** RA : 0.098 deg DEC : -73.922 deg GLON : 307.709 deg GLAT : -42.730 deg Semimajor (68%) : 0.0324 deg Semiminor (68%) : 0.0315 deg Position angle (68%) : -62.70 deg Semimajor (95%) : 0.0525 deg Semiminor (95%) : 0.0510 deg Position angle (95%) : -62.70 deg ROI number : 881 *** Spectral info *** Spectrum type : PowerLaw Detection significance (100 MeV - 1 TeV) : 7.400 Pivot energy : 2430 MeV Spectral index : 2.119 +- 0.146 Flux Density at pivot energy : 2.95e-14 +- 5.33e-15 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 1.72e-10 +- 3.11e-11 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 1.92e-12 +- 3.52e-13 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) --------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 50.000 100.000 7.588e-09 nan 1.688e-08 1.210e-12 nan 2.692e-12 True 4.135e-08 6.593e-12 0.408 100.000 300.000 1.506e-11 nan 1.895e-09 3.578e-15 nan 4.501e-13 True 3.805e-09 9.037e-13 0.000 300.000 1000.000 4.787e-10 1.873e-10 1.972e-10 3.241e-13 1.268e-13 1.335e-13 False nan nan 2.647 1000.000 3000.000 1.242e-10 3.637e-11 3.881e-11 2.951e-13 8.637e-14 9.217e-14 False nan nan 4.171 3000.000 10000.000 4.543e-11 1.304e-11 1.609e-11 3.075e-13 8.828e-14 1.089e-13 False nan nan 5.110 10000.000 30000.000 1.034e-11 4.797e-12 6.787e-12 2.455e-13 1.139e-13 1.612e-13 False nan nan 3.866 30000.000 300000.000 1.819e-15 nan 3.998e-12 9.260e-17 nan 2.035e-13 True 7.997e-12 4.071e-13 0.000 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 8.665 No peak measured for this source. 
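
The 4FGL J0000.3-7355 summary above quotes a PowerLaw spectrum with a pivot energy of 2430 MeV and a flux density there of 2.95e-14 cm-2 MeV-1 s-1; test_fermi.py further below verifies those numbers through SourceCatalog4FGL. A short sketch of the same evaluation, assuming $GAMMAPY_DATA provides the DR2 FITS file the tests load:

    import astropy.units as u
    from gammapy.catalog import SourceCatalog4FGL

    # Same catalog file TestFermi4FGLObject loads in test_fermi.py below.
    cat = SourceCatalog4FGL("$GAMMAPY_DATA/catalogs/fermi/gll_psc_v20.fit.gz")
    source = cat["4FGL J0000.3-7355"]
    model = source.spectral_model()  # a PowerLawSpectralModel for this source
    # Evaluating at the pivot energy should reproduce the quoted flux density,
    # ~2.95e-14 cm-2 MeV-1 s-1 (2.9476e-11 cm-2 GeV-1 s-1 in SOURCES_4FGL).
    print(model(2430 * u.MeV))
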
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/4fgl_J0001.5+2113.txt0000644000175100001770000000630014721316200022424 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 3 Source name : 4FGL J0001.5+2113 Extended name : Associations : TXS 2358+209, 3FGL J0001.4+2120, 3EG J2359+2041 ASSOC_PROB_BAY : 0.998 ASSOC_PROB_LR : 0.958 Class1 : fsrq Class2 : TeVCat flag : N *** Other info *** Significance (100 MeV - 1 TeV) : 30.666 Npred : 3110.9 Other flags : 0 *** Position info *** RA : 0.382 deg DEC : 21.218 deg GLON : 107.649 deg GLAT : -40.168 deg Semimajor (68%) : 0.0260 deg Semiminor (68%) : 0.0240 deg Position angle (68%) : -60.52 deg Semimajor (95%) : 0.0422 deg Semiminor (95%) : 0.0389 deg Position angle (95%) : -60.52 deg ROI number : 1052 *** Spectral info *** Spectrum type : LogParabola Detection significance (100 MeV - 1 TeV) : 30.666 beta : 0.1480 +- 0.03150 Significance curvature : 5.1 Pivot energy : 376 MeV Spectral index : 2.598 +- 0.045 Flux Density at pivot energy : 2.85e-11 +- 1.33e-12 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 9.68e-10 +- 6.66e-11 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 1.93e-11 +- 8.21e-13 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) --------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 50.000 100.000 6.076e-08 3.257e-08 2.360e-08 9.694e-12 5.197e-12 3.766e-12 False nan nan 1.869 100.000 300.000 3.663e-08 3.812e-09 3.848e-09 8.493e-12 8.838e-13 8.922e-13 False nan nan 10.183 300.000 1000.000 7.971e-09 4.437e-10 4.437e-10 5.048e-12 2.810e-13 2.810e-13 False nan nan 22.661 1000.000 3000.000 9.530e-10 8.496e-11 8.496e-11 2.075e-12 1.850e-13 1.850e-13 False nan nan 17.164 3000.000 10000.000 6.216e-11 1.722e-11 2.002e-11 3.673e-13 1.018e-13 1.183e-13 False nan nan 5.395 10000.000 30000.000 7.739e-15 nan 8.847e-12 1.594e-16 nan 1.822e-13 True 1.770e-11 3.646e-13 0.000 30000.000 300000.000 4.368e-16 nan 7.089e-12 1.312e-17 nan 2.130e-13 True 1.418e-11 4.260e-13 0.000 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 920.677 Significance peak (100 MeV - 100 GeV) : 36.358 Integral flux peak (100 MeV - 100 GeV) : 2.05e-07 +- 6.98e-09 cm^-2 s^-1 Time peak : 4.76e+08 s (Mission elapsed time) Peak interval : 3.65e+02 day ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/4fgl_J0002.8+6217.txt0000644000175100001770000000601114721316200022440 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 7 Source name : 4FGL J0002.8+6217 Extended name : Associations : PSR J0002+6216, 3FGL J0002.6+6218 ASSOC_PROB_BAY : 1.000 ASSOC_PROB_LR : 0.000 Class1 : PSR Class2 : TeVCat flag : N *** Other info *** Significance (100 MeV - 1 TeV) : 27.669 Npred : 2344.5 Other flags : 0 *** Position info *** RA : 0.720 deg DEC : 62.291 deg GLON : 117.320 deg GLAT : -0.051 deg Semimajor (68%) : 0.0165 deg Semiminor (68%) : 0.0158 deg Position angle (68%) : 3.84 deg Semimajor (95%) : 0.0268 deg Semiminor (95%) : 0.0256 deg Position angle (95%) : 3.84 deg ROI number : 1389 *** Spectral info *** Spectrum type : PLSuperExpCutoff Detection significance (100 MeV 
- 1 TeV) : 27.669 Exponential factor : 0.0130 +- 0.0019 Super-exponential cutoff index : 0.6667 +- -- Significance curvature : 9.7 Pivot energy : 1327 MeV Spectral index : 1.181 +- 0.191 Flux Density at pivot energy : 2.08e-12 +- 1.09e-13 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 2.57e-09 +- 1.26e-10 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 1.86e-11 +- 1.51e-12 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) --------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 50.000 100.000 2.846e-11 nan 4.019e-08 4.683e-15 nan 6.613e-12 True 8.041e-08 1.323e-11 0.000 100.000 300.000 7.734e-09 nan 8.182e-09 1.963e-12 nan 2.077e-12 True 2.410e-08 6.117e-12 0.808 300.000 1000.000 7.763e-09 7.266e-10 7.266e-10 5.483e-12 5.132e-13 5.132e-13 False nan nan 12.082 1000.000 3000.000 2.126e-09 1.359e-10 1.359e-10 4.900e-12 3.131e-13 3.131e-13 False nan nan 20.693 3000.000 10000.000 3.292e-10 3.758e-11 3.968e-11 1.862e-12 2.126e-13 2.245e-13 False nan nan 13.071 10000.000 30000.000 1.707e-12 nan 6.539e-12 3.236e-14 nan 1.239e-13 True 1.479e-11 2.803e-13 0.370 30000.000 300000.000 1.627e-15 nan 3.435e-12 4.523e-17 nan 9.547e-14 True 6.871e-12 1.910e-13 0.000 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 4.702 No peak measured for this source. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/4fgl_J1409.1-6121e.txt0000644000175100001770000000631114721316200022611 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 2718 Source name : 4FGL J1409.1-6121e Extended name : FGES J1409.1-6121 Associations : 3FGL J1405.4-6119, 3FHL J1409.1-6121e, J1407-6136c ASSOC_PROB_BAY : -- ASSOC_PROB_LR : -- Class1 : Class2 : TeVCat flag : N *** Other info *** Significance (100 MeV - 1 TeV) : 35.921 Npred : 11034.4 Other flags : 4 *** Position info *** RA : 212.294 deg DEC : -61.353 deg GLON : 312.111 deg GLAT : 0.126 deg Semimajor (68%) : 0.0000 deg Semiminor (68%) : 0.0000 deg Position angle (68%) : 0.00 deg Semimajor (95%) : 0.0000 deg Semiminor (95%) : 0.0000 deg Position angle (95%) : 0.00 deg ROI number : 280 *** Extended source information *** Model form : Disk Model semimajor : 0.7331 deg Model semiminor : 0.7331 deg Position angle : 0.0000 deg Spatial function : RadialDisk Spatial filename : *** Spectral info *** Spectrum type : LogParabola Detection significance (100 MeV - 1 TeV) : 35.921 beta : 0.0684 +- 0.01368 Significance curvature : 5.3 Pivot energy : 4441 MeV Spectral index : 2.136 +- 0.031 Flux Density at pivot energy : 1.32e-12 +- 4.51e-14 cm-2 MeV-1 s-1 Integral flux (1 - 100 GeV) : 2.61e-08 +- 9.95e-10 cm-2 s-1 Energy flux (100 MeV - 100 GeV) : 2.38e-10 +- 1.28e-11 erg cm-2 s-1 *** Spectral points *** e_min e_max flux flux_errn flux_errp e2dnde e2dnde_errn e2dnde_errp is_ul flux_ul e2dnde_ul sqrt_ts MeV MeV 1 / (cm2 s) 1 / (cm2 s) 1 / (cm2 s) erg / (cm2 s) erg / (cm2 s) erg / (cm2 s) 1 / (cm2 s) erg / (cm2 s) --------- ---------- ----------- ----------- ----------- ------------- ------------- ------------- ----- ----------- ------------- ------- 50.000 100.000 3.052e-09 nan 3.022e-07 4.973e-13 nan 4.925e-11 True 6.075e-07 9.900e-11 0.000 100.000 300.000 3.659e-10 nan 
1.792e-07 9.065e-14 nan 4.440e-11 True 3.588e-07 8.889e-11 0.000 300.000 1000.000 6.031e-08 1.124e-08 1.126e-08 4.215e-11 7.855e-12 7.869e-12 False nan nan 5.408 1000.000 3000.000 2.127e-08 1.064e-09 1.064e-09 5.107e-11 2.555e-12 2.555e-12 False nan nan 21.359 3000.000 10000.000 5.507e-09 2.791e-10 2.791e-10 3.709e-11 1.880e-12 1.880e-12 False nan nan 21.737 10000.000 30000.000 1.497e-09 1.006e-10 1.006e-10 3.487e-11 2.344e-12 2.344e-12 False nan nan 17.587 30000.000 300000.000 3.534e-10 4.687e-11 4.730e-11 1.533e-11 2.034e-12 2.053e-12 False nan nan 8.643 *** Lightcurve info *** Lightcurve measured in the energy band: 100 MeV - 100 GeV Variability index : 3.216 No peak measured for this source. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/gammacat_hess_j1813-178.txt0000644000175100001770000000457114721316200024240 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based): 118 Common name: HESS J1813-178 Gamma names: HESS J1813-178 Fermi names: -- Other names: G12.82-0.02,PSR J1813-1749,CXOU J181335.1-174957,IGR J18135-1751,W33 Location: gal Class: pwn TeVCat ID: 114 TeVCat 2 ID: Unhlxa TeVCat name: TeV J1813-178 TGeVCat ID: 116 TGeVCat name: TeV J1813-1750 Discoverer: hess Discovery date: 2005-03 Seen by: hess,magic Reference: 2006ApJ...636..777A *** Position info *** SIMBAD: RA: 273.363 deg DEC: -17.849 deg GLON: 12.787 deg GLAT: 0.000 deg Measurement: RA: 273.408 deg DEC: -17.842 deg GLON: 12.813 deg GLAT: -0.034 deg Position error: 0.005 deg *** Morphology info *** Morphology model type: gauss Sigma: 0.036 deg Sigma error: 0.006 deg Sigma2: 0.000 deg Sigma2 error: 0.000 deg Position angle: 0.000 deg Position angle error: 0.000 deg Position angle frame: -- *** Spectral info *** Significance: 13.500 Livetime: 9.700 h Spectrum type: pl2 flux: 1.42e-11 +- 1.1e-12 (stat) +- 3e-13 (sys) cm-2 s-1 index: 2.09 +- 0.08 (stat) +- 0.2 (sys) e_min: 0.2 TeV e_max: 0.0 TeV Energy range: (0.0, 0.0) TeV theta: 0.15 deg Derived fluxes: Spectral model norm (1 TeV): 2.68e-12 1 / (cm2 s TeV) +- 2.55e-13 1 / (cm2 s TeV) (stat) cm-2 s-1 TeV-1 Integrated flux (>1 TeV): 2.46e-12 +- 3.69e-13 (stat) cm-2 s-1 Integrated flux (>1 TeV): 11.844 +- 1.780 (% Crab) Integrated flux (1-10 TeV): 8.92e-12 +- 1.46e-12 (stat) erg cm-2 s-1 *** Spectral points *** SED reference ID: 2006ApJ...636..777A Number of spectral points: 13 Number of upper limits: 0 e_ref dnde dnde_errn dnde_errp TeV 1 / (cm2 s TeV) 1 / (cm2 s TeV) 1 / (cm2 s TeV) ------ --------------- --------------- --------------- 0.323 2.736e-11 5.690e-12 5.971e-12 0.427 1.554e-11 3.356e-12 3.559e-12 0.574 8.142e-12 1.603e-12 1.716e-12 0.777 4.567e-12 9.319e-13 1.007e-12 1.023 2.669e-12 5.586e-13 6.110e-13 1.373 1.518e-12 3.378e-13 3.721e-13 1.841 7.966e-13 2.166e-13 2.426e-13 2.476 3.570e-13 1.135e-13 1.295e-13 3.159 3.321e-13 8.757e-14 1.012e-13 4.414 1.934e-13 5.764e-14 6.857e-14 5.560 4.461e-14 2.130e-14 2.844e-14 10.765 1.318e-14 6.056e-15 1.085e-14 22.052 1.372e-14 6.128e-15 1.178e-14 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/gammacat_hess_j1848-018.txt0000644000175100001770000000437314721316200024241 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based): 134 Common name: HESS J1848-018 Gamma names: HESS J1848-018,1HWC J1849-017c Fermi names: -- Other names: WR121a,W43 Location: gal Class: unid TeVCat ID: 187 TeVCat 2 ID: hcE3Ou TeVCat name: TeV 
J1848-017 TGeVCat ID: 128 TGeVCat name: TeV J1848-0147 Discoverer: hess Discovery date: 2008-07 Seen by: hess Reference: 2008AIPC.1085..372C *** Position info *** SIMBAD: RA: 282.120 deg DEC: -1.792 deg GLON: 31.000 deg GLAT: -0.159 deg Measurement: RA: 282.121 deg DEC: -1.792 deg GLON: 31.000 deg GLAT: -0.160 deg Position error: 0.000 deg *** Morphology info *** Morphology model type: gauss Sigma: 0.320 deg Sigma error: 0.020 deg Sigma2: 0.000 deg Sigma2 error: 0.000 deg Position angle: 0.000 deg Position angle error: 0.000 deg Position angle frame: -- *** Spectral info *** Significance: 9.000 Livetime: 50.000 h Spectrum type: pl norm: 3.7e-12 1 / (cm2 s TeV) +- 4e-13 1 / (cm2 s TeV) (stat) +- 7e-13 1 / (cm2 s TeV) (sys) cm-2 s-1 TeV-1 index: 2.8 +- 0.2 (stat) +- 0.2 (sys) reference: 1.0 TeV Energy range: (0.9, 12.0) TeV theta: 0.2 deg Derived fluxes: Spectral model norm (1 TeV): 3.7e-12 1 / (cm2 s TeV) +- 4e-13 1 / (cm2 s TeV) (stat) cm-2 s-1 TeV-1 Integrated flux (>1 TeV): 2.06e-12 +- 3.19e-13 (stat) cm-2 s-1 Integrated flux (>1 TeV): 9.909 +- 1.536 (% Crab) Integrated flux (1-10 TeV): 6.24e-12 +- 1.22e-12 (stat) erg cm-2 s-1 *** Spectral points *** SED reference ID: 2008AIPC.1085..372C Number of spectral points: 11 Number of upper limits: 0 e_ref dnde dnde_errn dnde_errp TeV 1 / (cm2 s TeV) 1 / (cm2 s TeV) 1 / (cm2 s TeV) ------ --------------- --------------- --------------- 0.624 9.942e-12 3.301e-12 3.265e-12 0.878 6.815e-12 1.042e-12 1.029e-12 1.284 1.707e-12 3.889e-13 3.826e-13 1.881 5.027e-13 1.566e-13 1.533e-13 2.754 3.266e-13 7.526e-14 7.323e-14 4.033 8.183e-14 3.609e-14 3.503e-14 5.905 2.979e-14 1.981e-14 1.921e-14 8.648 4.022e-15 9.068e-15 8.729e-15 12.663 -6.647e-15 3.786e-15 3.675e-15 18.542 3.735e-15 2.009e-15 1.786e-15 27.173 -5.317e-16 9.236e-16 8.568e-16 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/gammacat_vela_x.txt0000644000175100001770000000566114721316200023412 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based): 36 Common name: Vela X Gamma names: HESS J0835-455 Fermi names: -- Other names: -- Location: gal Class: pwn TeVCat ID: 86 TeVCat 2 ID: yVoFOS TeVCat name: TeV J0835-456 TGeVCat ID: 37 TGeVCat name: TeV J0835-4536 Discoverer: hess Discovery date: 2006-03 Seen by: hess Reference: 2012A&A...548A..38A *** Position info *** SIMBAD: RA: 128.287 deg DEC: -45.190 deg GLON: 263.332 deg GLAT: -3.106 deg Measurement: RA: 128.750 deg DEC: -45.600 deg GLON: 263.856 deg GLAT: -3.089 deg Position error: 0.000 deg *** Morphology info *** Morphology model type: gauss Sigma: 0.480 deg Sigma error: 0.030 deg Sigma2: 0.360 deg Sigma2 error: 0.030 deg Position angle: 41.000 deg Position angle error: 7.000 deg Position angle frame: radec *** Spectral info *** Significance: 27.900 Livetime: 53.100 h Spectrum type: ecpl norm: 1.46e-11 +- 8e-13 (stat) +- 3e-12 (sys) cm-2 s-1 TeV-1 index: 1.32 +- 0.06 (stat) +- 0.12 (sys) e_cut: 14.0 +- 1.6 (stat) +- 2.6 (sys) TeV reference: 1.0 TeV Energy range: (0.75, 0.0) TeV theta: 1.2 deg Derived fluxes: Spectral model norm (1 TeV): 1.36e-11 1 / (cm2 s TeV) +- 7.53e-13 1 / (cm2 s TeV) (stat) cm-2 s-1 TeV-1 Integrated flux (>1 TeV): 2.1e-11 +- 1.97e-12 (stat) cm-2 s-1 Integrated flux (>1 TeV): 101.425 +- 9.511 (% Crab) Integrated flux (1-10 TeV): 9.27e-11 +- 9.59e-12 (stat) erg cm-2 s-1 *** Spectral points *** SED reference ID: 2012A&A...548A..38A Number of spectral points: 24 Number of upper limits: 0 e_ref dnde dnde_errn dnde_errp TeV 1 / 
(cm2 s TeV) 1 / (cm2 s TeV) 1 / (cm2 s TeV) ------ --------------- --------------- --------------- 0.719 1.055e-11 3.284e-12 3.280e-12 0.868 1.304e-11 2.130e-12 2.130e-12 1.051 9.211e-12 1.401e-12 1.399e-12 1.274 8.515e-12 9.580e-13 9.610e-13 1.546 5.378e-12 7.070e-13 7.090e-13 1.877 4.455e-12 5.050e-13 5.070e-13 2.275 3.754e-12 3.300e-13 3.340e-13 2.759 2.418e-12 2.680e-13 2.700e-13 3.352 1.605e-12 1.800e-13 1.830e-13 4.078 1.445e-12 1.260e-13 1.290e-13 4.956 9.240e-13 9.490e-14 9.700e-14 6.008 7.348e-13 6.470e-14 6.710e-14 7.271 3.863e-13 4.540e-14 4.700e-14 8.795 3.579e-13 3.570e-14 3.750e-14 10.650 1.696e-13 2.490e-14 2.590e-14 12.910 1.549e-13 2.060e-14 2.160e-14 15.650 6.695e-14 1.134e-14 1.230e-14 18.880 2.105e-14 1.390e-14 1.320e-14 22.620 3.279e-14 6.830e-15 7.510e-15 26.870 3.026e-14 5.910e-15 6.660e-15 31.610 1.861e-14 4.380e-15 5.120e-15 36.970 5.653e-15 2.169e-15 2.917e-15 43.080 3.479e-15 1.641e-15 2.410e-15 52.370 1.002e-15 8.327e-16 1.615e-15 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/hess_j1713-397.txt0000644000175100001770000001454014721316200022405 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 33 Source name : HESS J1713-397 Analysis reference : EXTERN Source class : SNR Identified object : RX J1713.7-3946 Gamma-Cat id : 96 *** Info from map analysis *** RA : 258.355 deg = 17h13m25s DEC : -39.771 deg = -39d46m16s GLON : 347.311 +/- 0.000 deg GLAT : -0.457 +/- 0.000 deg Position Error (68%) : 0.000 deg Position Error (95%) : 0.000 deg ROI number : -1 Spatial model : Shell Spatial components : -- TS : -- sqrt(TS) : -- Size : 0.500 +/- 0.000 (UL: 0.000) deg R70 : 0.000 deg RSpec : 0.600 deg Total model excess : -- Excess in RSpec : -- Model Excess in RSpec : -- Background in RSpec : -- Livetime : 164.0 hours Energy threshold : 0.00 TeV Source flux (>1 TeV) : (16.879 +/- 0.824) x 10^-12 cm^-2 s^-1 = (74.69 +/- 3.64) % Crab Fluxes in RSpec (> 1 TeV): Map measurement : 0.000 x 10^-12 cm^-2 s^-1 = 0.00 % Crab Source model : 0.000 x 10^-12 cm^-2 s^-1 = 0.00 % Crab Other component model : 0.000 x 10^-12 cm^-2 s^-1 = 0.00 % Crab Large scale component model : 0.000 x 10^-12 cm^-2 s^-1 = 0.00 % Crab Total model : 0.000 x 10^-12 cm^-2 s^-1 = 0.00 % Crab Containment in RSpec : -- % Contamination in RSpec : -- % Flux correction (RSpec -> Total) : -- % Flux correction (Total -> RSpec) : -- % *** Info from spectral analysis *** Livetime : 164.0 hours Energy range: : 0.20 to 40.00 TeV Background : -- Excess : -- Spectral model : ECPL TS ECPL over PL : -- Best-fit model flux(> 1 TeV) : (16.879 +/- 0.824) x 10^-12 cm^-2 s^-1 = (74.69 +/- 3.64) % Crab Best-fit model energy flux(1 to 10 TeV) : (59.996 +/- 3.173) x 10^-12 erg cm^-2 s^-1 Pivot energy : 0.00 TeV Flux at pivot energy : (0.000 +/- 0.000) x 10^-12 cm^-2 s^-1 TeV^-1 = (0.00 +/- 0.00) % Crab PL Flux(> 1 TeV) : (0.000 +/- 0.000) x 10^-12 cm^-2 s^-1 = (0.00 +/- 0.00) % Crab PL Flux(@ 1 TeV) : (0.000 +/- 0.000) x 10^-12 cm^-2 s^-1 TeV^-1 = (0.00 +/- 0.00) % Crab PL Index : -- +/- -- ECPL Flux(@ 1 TeV) : (21.284 +/- 0.936) x 10^-12 cm^-2 s^-1 TeV^-1 = (94.18 +/- 2.67) % Crab ECPL Flux(> 1 TeV) : (16.879 +/- 0.824) x 10^-12 cm^-2 s^-1 = (74.69 +/- 3.64) % Crab ECPL Index : 2.06 +/- 0.02 ECPL Lambda : 0.078 +/- 0.007 TeV^-1 ECPL E_cut : 12.90 +/- 1.10 TeV *** Flux points info *** Number of flux points: 30 Flux points table: e_ref e_min e_max dnde dnde_errn dnde_errp dnde_ul is_ul TeV TeV TeV 1 / (cm2 s TeV) 1 / (cm2 s TeV) 1 
/ (cm2 s TeV) 1 / (cm2 s TeV) ------ ------ ------ --------------- --------------- --------------- --------------- ----- 0.230 0.210 0.250 5.239e-10 9.852e-11 9.852e-11 nan False 0.270 0.250 0.290 3.416e-10 4.426e-11 4.426e-11 nan False 0.320 0.290 0.350 2.609e-10 2.146e-11 2.146e-11 nan False 0.380 0.350 0.410 1.694e-10 1.042e-11 1.042e-11 nan False 0.450 0.410 0.490 1.221e-10 6.288e-12 6.288e-12 nan False 0.530 0.490 0.580 7.510e-11 3.844e-12 3.844e-12 nan False 0.630 0.580 0.690 5.394e-11 2.217e-12 2.217e-12 nan False 0.750 0.690 0.810 3.784e-11 1.487e-12 1.487e-12 nan False 0.890 0.810 0.960 2.529e-11 9.613e-13 9.613e-13 nan False 1.050 0.960 1.140 1.664e-11 6.341e-13 6.341e-13 nan False 1.250 1.140 1.360 1.206e-11 4.354e-13 4.354e-13 nan False 1.480 1.360 1.610 9.261e-12 3.049e-13 3.049e-13 nan False 1.760 1.610 1.910 6.025e-12 2.035e-13 2.035e-13 nan False 2.080 1.910 2.260 4.458e-12 1.717e-13 1.717e-13 nan False 2.470 2.260 2.680 2.680e-12 1.269e-13 1.269e-13 nan False 2.930 2.680 3.180 1.847e-12 8.506e-14 8.506e-14 nan False 3.470 3.180 3.770 1.343e-12 6.117e-14 6.117e-14 nan False 4.120 3.770 4.470 8.788e-13 4.707e-14 4.707e-14 nan False 4.880 4.470 5.290 6.552e-13 3.224e-14 3.224e-14 nan False 5.790 5.290 6.280 4.357e-13 2.439e-14 2.439e-14 nan False 6.860 6.280 7.440 2.666e-13 1.830e-14 1.830e-14 nan False 8.140 7.440 8.830 1.639e-13 1.422e-14 1.422e-14 nan False 9.650 8.830 10.470 9.518e-14 1.025e-14 1.025e-14 nan False 11.440 10.470 12.410 7.535e-14 7.583e-15 7.583e-15 nan False 13.560 12.410 14.710 2.831e-14 5.771e-15 5.771e-15 nan False 16.080 14.710 17.450 2.334e-14 4.031e-15 4.031e-15 nan False 19.990 17.450 22.530 5.967e-15 2.202e-15 2.202e-15 nan False 24.620 22.530 26.710 5.849e-15 1.802e-15 1.802e-15 nan False 29.190 26.710 31.670 2.044e-15 1.150e-15 1.150e-15 nan False 34.610 31.670 37.550 2.647e-15 6.617e-16 6.617e-16 nan False *** Source associations info *** Source_Name Association_Catalog Association_Name Separation deg ---------------- ------------------- --------------------- ---------- HESS J1713-397 2FHL 2FHL J1713.5-3945e 0.029067 HESS J1713-397 2FHL 2FHL J1714.1-4012 0.457815 HESS J1713-397 3FGL 3FGL J1713.5-3945e 0.029067 HESS J1713-397 SNR G347.3-0.5 0.059676 *** Source identification info *** Source_Name: HESS J1713-397 Identified_Object: RX J1713.7-3946 Class: SNR Evidence: Morphology Reference: 2004Natur.432...75A Distance_Reference: SNRCat Distance: 3.5 kpc Distance_Min: 1.0 kpc Distance_Max: 6.0 kpc ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/hess_j1825-137.txt0000644000175100001770000001223014721316200022373 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 54 Source name : HESS J1825-137 Analysis reference : HGPS Source class : PWN Identified object : PSR J1826-1334 Gamma-Cat id : 118 *** Info from map analysis *** RA : 276.260 deg = 18h25m02s DEC : -13.966 deg = -13d57m57s GLON : 17.525 +/- 0.082 deg GLAT : -0.618 +/- 0.011 deg Position Error (68%) : 0.125 deg Position Error (95%) : 0.203 deg ROI number : 6 Spatial model : 3-Gaussian Spatial components : HGPSC 065, HGPSC 066, HGPSC 067 TS : 5846.8 sqrt(TS) : 76.5 Size : 0.461 +/- 0.032 (UL: 0.000) deg R70 : 0.709 deg RSpec : 0.500 deg Total model excess : 18291.5 Excess in RSpec : 9525.5 Model Excess in RSpec : 9408.8 Background in RSpec : 8608.5 Livetime : 109.1 hours Energy threshold : 0.40 TeV Source flux (>1 TeV) : (18.408 +/- 0.556) x 10^-12 cm^-2 s^-1 = (81.45 +/- 2.46) % Crab 
Fluxes in RSpec (> 1 TeV): Map measurement : 8.997 x 10^-12 cm^-2 s^-1 = 39.81 % Crab Source model : 8.724 x 10^-12 cm^-2 s^-1 = 38.60 % Crab Other component model : 0.121 x 10^-12 cm^-2 s^-1 = 0.54 % Crab Large scale component model : 0.195 x 10^-12 cm^-2 s^-1 = 0.86 % Crab Total model : 9.040 x 10^-12 cm^-2 s^-1 = 40.00 % Crab Containment in RSpec : 47.4 % Contamination in RSpec : 3.5 % Flux correction (RSpec -> Total) : 203.6 % Flux correction (Total -> RSpec) : 49.1 % *** Info from spectral analysis *** Livetime : 12.5 hours Energy range: : 0.24 to 61.90 TeV Background : 5843.4 Excess : 3129.6 Spectral model : ECPL TS ECPL over PL : 11.4 Best-fit model flux(> 1 TeV) : (19.150 +/- 1.847) x 10^-12 cm^-2 s^-1 = (84.74 +/- 8.17) % Crab Best-fit model energy flux(1 to 10 TeV) : (66.403 +/- 7.952) x 10^-12 erg cm^-2 s^-1 Pivot energy : 1.16 TeV Flux at pivot energy : (17.165 +/- 0.573) x 10^-12 cm^-2 s^-1 TeV^-1 = (75.95 +/- 1.63) % Crab PL Flux(> 1 TeV) : (17.596 +/- 0.666) x 10^-12 cm^-2 s^-1 = (77.86 +/- 2.95) % Crab PL Flux(@ 1 TeV) : (24.233 +/- 0.812) x 10^-12 cm^-2 s^-1 TeV^-1 = (107.23 +/- 2.32) % Crab PL Index : 2.38 +/- 0.03 ECPL Flux(@ 1 TeV) : (25.557 +/- 0.900) x 10^-12 cm^-2 s^-1 TeV^-1 = (113.09 +/- 2.57) % Crab ECPL Flux(> 1 TeV) : (19.150 +/- 1.847) x 10^-12 cm^-2 s^-1 = (84.74 +/- 8.17) % Crab ECPL Index : 2.15 +/- 0.06 ECPL Lambda : 0.074 +/- 0.021 TeV^-1 ECPL E_cut : 13.57 +/- 3.94 TeV *** Flux points info *** Number of flux points: 6 Flux points table: e_ref e_min e_max dnde dnde_errn dnde_errp dnde_ul is_ul TeV TeV TeV 1 / (cm2 s TeV) 1 / (cm2 s TeV) 1 / (cm2 s TeV) 1 / (cm2 s TeV) ------ ------ ------ --------------- --------------- --------------- --------------- ----- 0.365 0.237 0.562 2.382e-10 1.642e-11 1.654e-11 2.715e-10 False 0.866 0.562 1.334 3.503e-11 1.959e-12 1.954e-12 3.880e-11 False 2.054 1.334 3.162 5.060e-12 3.173e-13 3.169e-13 5.707e-12 False 5.109 3.162 8.254 5.446e-13 5.534e-14 5.658e-14 6.582e-13 False 12.711 8.254 19.573 4.615e-14 9.341e-15 9.701e-15 6.613e-14 False 30.142 19.573 46.416 2.724e-15 1.420e-15 1.646e-15 6.221e-15 False *** Gaussian component info *** Number of components: 3 Spatial components : HGPSC 065, HGPSC 066, HGPSC 067 Component HGPSC 065: GLON : 16.989 +/- 0.091 deg GLAT : -0.491 +/- 0.038 deg Size : 0.477 +/- 0.030 deg Flux (>1 TeV) : (5.02 +/- 1.13) x 10^-12 cm^-2 s^-1 = (22.2 +/- 5.0) % Crab Component HGPSC 066: GLON : 17.712 +/- 0.025 deg GLAT : -0.660 +/- 0.014 deg Size : 0.391 +/- 0.017 deg Flux (>1 TeV) : (11.83 +/- 1.08) x 10^-12 cm^-2 s^-1 = (52.3 +/- 4.8) % Crab Component HGPSC 067: GLON : 17.841 +/- 0.009 deg GLAT : -0.706 +/- 0.009 deg Size : 0.109 +/- 0.009 deg Flux (>1 TeV) : (1.56 +/- 0.20) x 10^-12 cm^-2 s^-1 = (6.9 +/- 0.9) % Crab *** Source associations info *** Source_Name Association_Catalog Association_Name Separation deg ---------------- ------------------- --------------------- ---------- HESS J1825-137 2FHL 2FHL J1824.5-1350e 0.170969 HESS J1825-137 3FGL 3FGL J1824.5-1351e 0.169707 HESS J1825-137 PWN G18.0-0.7 0.479912 HESS J1825-137 PSR B1823-13 0.481048 *** Source identification info *** Source_Name: HESS J1825-137 Identified_Object: PSR J1826-1334 Class: PWN Evidence: ED Morph. 
Reference: 2006A%26A...460..365A Distance_Reference: ATNF Distance: 3.930000066757202 kpc Distance_Min: 1.965000033378601 kpc Distance_Max: 7.860000133514404 kpc ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/hess_j1930+188.txt0000644000175100001770000001103214721316200022373 0ustar00runnerdocker *** Basic info *** Catalog row index (zero-based) : 76 Source name : HESS J1930+188 Analysis reference : HGPS Source class : Composite Identified object : G54.1+0.3 Gamma-Cat id : 136 *** Info from map analysis *** RA : 292.605 deg = 19h30m25s DEC : 18.836 deg = 18d50m11s GLON : 54.057 +/- 0.013 deg GLAT : 0.266 +/- 0.014 deg Position Error (68%) : 0.031 deg Position Error (95%) : 0.049 deg ROI number : 1 Spatial model : Gaussian Spatial components : HGPSC 097 TS : 33.6 sqrt(TS) : 5.8 Size : 0.000 +/- 0.000 (UL: 0.072) deg R70 : 0.083 deg RSpec : 0.150 deg Total model excess : 64.6 Excess in RSpec : 81.2 Model Excess in RSpec : 59.2 Background in RSpec : 204.8 Livetime : 31.8 hours Energy threshold : 0.89 TeV Source flux (>1 TeV) : (0.293 +/- 0.093) x 10^-12 cm^-2 s^-1 = (1.30 +/- 0.41) % Crab Fluxes in RSpec (> 1 TeV): Map measurement : 0.369 x 10^-12 cm^-2 s^-1 = 1.63 % Crab Source model : 0.268 x 10^-12 cm^-2 s^-1 = 1.19 % Crab Other component model : -0.000 x 10^-12 cm^-2 s^-1 = -0.00 % Crab Large scale component model : 0.024 x 10^-12 cm^-2 s^-1 = 0.10 % Crab Total model : 0.292 x 10^-12 cm^-2 s^-1 = 1.29 % Crab Containment in RSpec : 91.6 % Contamination in RSpec : 8.1 % Flux correction (RSpec -> Total) : 100.3 % Flux correction (Total -> RSpec) : 99.7 % *** Info from spectral analysis *** Livetime : 12.9 hours Energy range: : 0.46 to 90.85 TeV Background : 611.9 Excess : 155.1 Spectral model : PL TS ECPL over PL : -- Best-fit model flux(> 1 TeV) : (0.318 +/- 0.068) x 10^-12 cm^-2 s^-1 = (1.41 +/- 0.30) % Crab Best-fit model energy flux(1 to 10 TeV) : (1.019 +/- 0.236) x 10^-12 erg cm^-2 s^-1 Pivot energy : 1.70 TeV Flux at pivot energy : (0.128 +/- 0.027) x 10^-12 cm^-2 s^-1 TeV^-1 = (0.57 +/- 0.08) % Crab PL Flux(> 1 TeV) : (0.318 +/- 0.068) x 10^-12 cm^-2 s^-1 = (1.41 +/- 0.30) % Crab PL Flux(@ 1 TeV) : (0.506 +/- 0.124) x 10^-12 cm^-2 s^-1 TeV^-1 = (2.24 +/- 0.35) % Crab PL Index : 2.59 +/- 0.26 ECPL Flux(@ 1 TeV) : (0.000 +/- 0.000) x 10^-12 cm^-2 s^-1 TeV^-1 = (0.00 +/- 0.00) % Crab ECPL Flux(> 1 TeV) : (0.000 +/- 0.000) x 10^-12 cm^-2 s^-1 = (0.00 +/- 0.00) % Crab ECPL Index : -- +/- -- ECPL Lambda : 0.000 +/- 0.000 TeV^-1 ECPL E_cut : inf +/- nan TeV *** Flux points info *** Number of flux points: 6 Flux points table: e_ref e_min e_max dnde dnde_errn dnde_errp dnde_ul is_ul TeV TeV TeV 1 / (cm2 s TeV) 1 / (cm2 s TeV) 1 / (cm2 s TeV) 1 / (cm2 s TeV) ------ ------ ------ --------------- --------------- --------------- --------------- ----- 0.681 0.464 1.000 1.302e-12 5.114e-13 5.323e-13 2.379e-12 False 1.468 1.000 2.154 2.069e-13 6.154e-14 6.449e-14 3.373e-13 False 3.162 2.154 4.642 1.875e-14 1.236e-14 1.299e-14 4.552e-14 False 6.813 4.642 10.000 2.002e-15 2.535e-15 2.777e-15 7.801e-15 True 14.678 10.000 21.544 2.197e-15 7.413e-16 8.351e-16 3.962e-15 False 31.623 21.544 46.416 -1.866e-16 7.490e-17 9.066e-17 5.105e-17 True *** Gaussian component info *** Number of components: 1 Spatial components : HGPSC 097 Component HGPSC 097: GLON : 54.057 +/- 0.013 deg GLAT : 0.266 +/- 0.014 deg Size : 0.022 +/- 0.025 deg Flux (>1 TeV) : (0.29 +/- 0.09) x 10^-12 cm^-2 s^-1 = (1.3 +/- 0.4) % Crab *** Source 
associations info *** Source_Name Association_Catalog Association_Name Separation deg ---------------- ------------------- --------------------- ---------- HESS J1930+188 COMP G54.1+0.3 0.037915 HESS J1930+188 PSR J1930+1852 0.039313 HESS J1930+188 EXTRA VER J1930+188 0.039313 *** Source identification info *** Source_Name: HESS J1930+188 Identified_Object: G54.1+0.3 Class: Composite Evidence: Position Reference: 2010ApJ...719L..69A Distance_Reference: SNRCat Distance: 6.400000095367432 kpc Distance_Min: 5.599999904632568 kpc Distance_Max: 7.199999809265137 kpc ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/data/make.py0000644000175100001770000000330014721316200021014 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Make test reference data files.""" from gammapy.catalog import CATALOG_REGISTRY cat = CATALOG_REGISTRY["3fgl"]() open("3fgl_J0000.1+6545.txt", "w").write(str(cat["3FGL J0000.1+6545"])) open("3fgl_J0001.4+2120.txt", "w").write(str(cat["3FGL J0001.4+2120"])) open("3fgl_J0023.4+0923.txt", "w").write(str(cat["3FGL J0023.4+0923"])) open("3fgl_J0835.3-4510.txt", "w").write(str(cat["3FGL J0835.3-4510"])) cat = CATALOG_REGISTRY["4fgl"]() open("4fgl_J0000.3-7355.txt", "w").write(str(cat["4FGL J0000.3-7355"])) open("4fgl_J0001.5+2113.txt", "w").write(str(cat["4FGL J0001.5+2113"])) open("4fgl_J0002.8+6217.txt", "w").write(str(cat["4FGL J0002.8+6217"])) open("4fgl_J1409.1-6121e.txt", "w").write(str(cat["4FGL J1409.1-6121e"])) cat = CATALOG_REGISTRY["2fhl"]() open("2fhl_j1445.1-0329.txt", "w").write(str(cat["2FHL J1445.1-0329"])) open("2fhl_j0822.6-4250e.txt", "w").write(str(cat["2FHL J0822.6-4250e"])) cat = CATALOG_REGISTRY["3fhl"]() open("3fhl_j2301.9+5855e.txt", "w").write(str(cat["3FHL J2301.9+5855e"])) cat = CATALOG_REGISTRY["2hwc"]() open("2hwc_j0534+220.txt", "w").write(str(cat["2HWC J0534+220"])) open("2hwc_j0631+169.txt", "w").write(str(cat["2HWC J0631+169"])) cat = CATALOG_REGISTRY["hgps"]() open("hess_j1713-397.txt", "w").write(str(cat["HESS J1713-397"])) open("hess_j1825-137.txt", "w").write(str(cat["HESS J1825-137"])) open("hess_j1930+188.txt", "w").write(str(cat["HESS J1930+188"])) cat = CATALOG_REGISTRY["gamma-cat"]() open("gammacat_hess_j1813-178.txt", "w").write(str(cat["HESS J1813-178"])) open("gammacat_hess_j1848-018.txt", "w").write(str(cat["HESS J1848-018"])) open("gammacat_vela_x.txt", "w").write(str(cat["Vela X"])) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/test_core.py0000644000175100001770000000557514721316200021175 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy.table import Column, Table from astropy.units import Quantity from gammapy.catalog import SourceCatalog from gammapy.utils.testing import assert_quantity_allclose class SomeSourceCatalog(SourceCatalog): """Minimal test source catalog class for unit tests.""" name = "test123" tag = "test123" description = "Test source catalog" def make_test_catalog(): table = Table() table["Source_Name"] = ["a", "bb", "ccc"] table["RA"] = Column([42.2, 43.3, 44.4], unit="deg") table["DEC"] = Column([1, 2, 3], unit="deg") return SomeSourceCatalog(table) class TestSourceCatalog: def setup_method(self): self.cat = make_test_catalog() def test_str(self): assert "description" in str(self.cat) assert "name" 
in str(self.cat) def test_table(self): assert_allclose(self.cat.table["RA"][1], 43.3) def test_row_index(self): idx = self.cat.row_index(name="bb") assert idx == 1 with pytest.raises(KeyError): self.cat.row_index(name="invalid") def test_source_name(self): name = self.cat.source_name(index=1) assert name == "bb" with pytest.raises(IndexError): self.cat.source_name(index=99) with pytest.raises(IndexError): self.cat.source_name("invalid") def test_getitem(self): source = self.cat["a"] assert source.data["Source_Name"] == "a" source = self.cat[0] assert source.data["Source_Name"] == "a" source = self.cat[np.int32(0)] assert source.data["Source_Name"] == "a" with pytest.raises(KeyError): self.cat["invalid"] with pytest.raises(IndexError): self.cat[99] with pytest.raises(TypeError): self.cat[1.2] def test_positions(self): positions = self.cat.positions assert len(positions) == 3 def test_selection(self): new = self.cat[self.cat.table["Source_Name"] != "a"] assert len(new.table) == 2 class TestSourceCatalogObject: def setup_method(self): self.cat = make_test_catalog() self.source = self.cat["bb"] def test_name(self): assert self.source.name == "bb" def test_row_index(self): assert self.source.row_index == 1 def test_data(self): d = self.source.data assert isinstance(d, dict) assert isinstance(d["RA"], Quantity) assert_quantity_allclose(d["RA"], Quantity(43.3, "deg")) assert isinstance(d["DEC"], Quantity) assert_quantity_allclose(d["DEC"], Quantity(2, "deg")) def test_position(self): position = self.source.position assert_allclose(position.ra.deg, 43.3) assert_allclose(position.dec.deg, 2) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/test_fermi.py0000644000175100001770000010070614721316200021337 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.time import Time from astropy.utils.data import get_pkg_data_filename from gammapy.catalog import ( SourceCatalog2FHL, SourceCatalog2PC, SourceCatalog3FGL, SourceCatalog3FHL, SourceCatalog3PC, SourceCatalog4FGL, ) from gammapy.modeling.models import ( ExpCutoffPowerLaw3FGLSpectralModel, LogParabolaSpectralModel, PowerLaw2SpectralModel, PowerLawSpectralModel, SuperExpCutoffPowerLaw3FGLSpectralModel, SuperExpCutoffPowerLaw4FGLDR3SpectralModel, SuperExpCutoffPowerLaw4FGLSpectralModel, ) from gammapy.utils.gauss import Gauss2DPDF from gammapy.utils.testing import ( assert_quantity_allclose, assert_time_allclose, modify_unit_order_astropy_5_3, requires_data, ) SOURCES_2PC = [ dict( idx=3, name="J0034-0534", str_ref_file="data/2pc_j0034-0534.txt", spec_type=SuperExpCutoffPowerLaw3FGLSpectralModel, dnde=u.Quantity(8.69100018e-12, "cm-2 s-1 MeV-1"), dnde_err=u.Quantity(1.20799995e-12, "cm-2 s-1 MeV-1"), ), dict( idx=38, name="J1112-6103", str_ref_file="data/2pc_j1112-6103.txt", spec_type=PowerLawSpectralModel, dnde=u.Quantity(2.55399998e-12, "cm-2 s-1 MeV-1"), dnde_err=u.Quantity(6.34000014e-13, "cm-2 s-1 MeV-1"), ), ] SOURCES_3PC_NONE = [ dict( idx=51, name="J0834-4159", str_ref_file="data/3pc_j0834-4159.txt", spec_type=None, ), ] SOURCES_3PC = [ dict( idx=52, name="J0835-4510", str_ref_file="data/3pc_j0835-4510.txt", spec_type=SuperExpCutoffPowerLaw4FGLDR3SpectralModel, dnde=u.Quantity(5.32697275e-10, "cm-2 s-1 MeV-1"), dnde_err=u.Quantity(1.64905329e-12, "cm-2 s-1 MeV-1"), ), dict( idx=56, name="J0940-5428", 
str_ref_file="data/3pc_j0940-5428.txt", spec_type=SuperExpCutoffPowerLaw4FGLDR3SpectralModel, dnde=u.Quantity(9.92080658e-13, "cm-2 s-1 MeV-1"), dnde_err=u.Quantity(5.80666225e-14, "cm-2 s-1 MeV-1"), ), ] SOURCES_4FGL = [ dict( idx=0, name="4FGL J0000.3-7355", str_ref_file="data/4fgl_J0000.3-7355.txt", spec_type=PowerLawSpectralModel, dnde=u.Quantity(2.9476e-11, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(5.3318e-12, "cm-2 s-1 GeV-1"), ), dict( idx=3, name="4FGL J0001.5+2113", str_ref_file="data/4fgl_J0001.5+2113.txt", spec_type=LogParabolaSpectralModel, dnde=u.Quantity(2.8545e-8, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(1.3324e-9, "cm-2 s-1 GeV-1"), ), dict( idx=7, name="4FGL J0002.8+6217", str_ref_file="data/4fgl_J0002.8+6217.txt", spec_type=SuperExpCutoffPowerLaw4FGLSpectralModel, dnde=u.Quantity(2.084e-09, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(1.0885e-10, "cm-2 s-1 GeV-1"), ), dict( idx=2718, name="4FGL J1409.1-6121e", str_ref_file="data/4fgl_J1409.1-6121e.txt", spec_type=LogParabolaSpectralModel, dnde=u.Quantity(1.3237202133031811e-12, "cm-2 s-1 MeV-1"), dnde_err=u.Quantity(4.513233455580648e-14, "cm-2 s-1 MeV-1"), ), ] SOURCES_4FGL_DR4 = [ dict( idx=1391, name="4FGL J0534.5+2200", str_ref_file="data/4fgl-dr4_J0534.5+2200.txt", spec_type=SuperExpCutoffPowerLaw4FGLDR3SpectralModel, dnde=u.Quantity(1.1048e-07, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(6.9934e-10, "cm-2 s-1 GeV-1"), ) ] SOURCES_3FGL = [ dict( idx=0, name="3FGL J0000.1+6545", str_ref_file="data/3fgl_J0000.1+6545.txt", spec_type=PowerLawSpectralModel, dnde=u.Quantity(1.4351261e-9, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(2.1356270e-10, "cm-2 s-1 GeV-1"), ), dict( idx=4, name="3FGL J0001.4+2120", str_ref_file="data/3fgl_J0001.4+2120.txt", spec_type=LogParabolaSpectralModel, dnde=u.Quantity(8.3828599e-10, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(2.6713238e-10, "cm-2 s-1 GeV-1"), ), dict( idx=55, name="3FGL J0023.4+0923", str_ref_file="data/3fgl_J0023.4+0923.txt", spec_type=ExpCutoffPowerLaw3FGLSpectralModel, dnde=u.Quantity(1.8666925e-09, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(2.2068837e-10, "cm-2 s-1 GeV-1"), ), dict( idx=960, name="3FGL J0835.3-4510", str_ref_file="data/3fgl_J0835.3-4510.txt", spec_type=SuperExpCutoffPowerLaw3FGLSpectralModel, dnde=u.Quantity(1.6547128794756733e-06, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(1.6621504e-11, "cm-2 s-1 MeV-1"), ), ] SOURCES_2FHL = [ dict( idx=221, name="2FHL J1445.1-0329", str_ref_file="data/2fhl_j1445.1-0329.txt", spec_type=PowerLaw2SpectralModel, dnde=u.Quantity(1.065463448091757e-10, "cm-2 s-1 TeV-1"), dnde_err=u.Quantity(4.9691205387540815e-11, "cm-2 s-1 TeV-1"), ), dict( idx=134, name="2FHL J0822.6-4250e", str_ref_file="data/2fhl_j0822.6-4250e.txt", spec_type=LogParabolaSpectralModel, dnde=u.Quantity(2.46548351696472e-10, "cm-2 s-1 TeV-1"), dnde_err=u.Quantity(9.771755529198772e-11, "cm-2 s-1 TeV-1"), ), ] SOURCES_3FHL = [ dict( idx=352, name="3FHL J0534.5+2201", spec_type=PowerLawSpectralModel, dnde=u.Quantity(6.3848912826152664e-12, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(2.679593524691324e-13, "cm-2 s-1 GeV-1"), ), dict( idx=1442, name="3FHL J2158.8-3013", spec_type=LogParabolaSpectralModel, dnde=u.Quantity(2.056998292908196e-12, "cm-2 s-1 GeV-1"), dnde_err=u.Quantity(4.219030630302381e-13, "cm-2 s-1 GeV-1"), ), ] @requires_data() @pytest.mark.parametrize("ref", SOURCES_4FGL_DR4, ids=lambda _: _["name"]) def test_4FGL_DR4(ref): cat = SourceCatalog4FGL("$GAMMAPY_DATA/catalogs/fermi/gll_psc_v32.fit.gz") source = cat[ref["name"]] model = source.spectral_model() fp = source.flux_points 
not_ul = ~fp.is_ul.data.squeeze() fp_dnde = fp.dnde.quantity.squeeze()[not_ul] model_dnde = model(fp.energy_ref[not_ul]) assert_quantity_allclose(model_dnde, fp_dnde, rtol=0.07) models = cat.to_models() assert len(models) == len(cat.table) actual = str(cat[ref["idx"]]) with open(get_pkg_data_filename(ref["str_ref_file"])) as fh: expected = fh.read() assert actual == modify_unit_order_astropy_5_3(expected) @requires_data() class TestFermi4FGLObject: @classmethod def setup_class(cls): cls.cat = SourceCatalog4FGL("$GAMMAPY_DATA/catalogs/fermi/gll_psc_v20.fit.gz") cls.source_name = "4FGL J0534.5+2200" cls.source = cls.cat[cls.source_name] def test_name(self): assert self.source.name == self.source_name def test_row_index(self): assert self.source.row_index == 995 @pytest.mark.parametrize("ref", SOURCES_4FGL, ids=lambda _: _["name"]) def test_str(self, ref): actual = str(self.cat[ref["idx"]]) with open(get_pkg_data_filename(ref["str_ref_file"])) as fh: expected = fh.read() assert actual == modify_unit_order_astropy_5_3(expected) @pytest.mark.parametrize("ref", SOURCES_4FGL, ids=lambda _: _["name"]) def test_spectral_model(self, ref): model = self.cat[ref["idx"]].spectral_model() e_ref = model.reference.quantity dnde, dnde_err = model.evaluate_error(e_ref) assert isinstance(model, ref["spec_type"]) assert_quantity_allclose(dnde, ref["dnde"], rtol=1e-4) assert_quantity_allclose(dnde_err, ref["dnde_err"], rtol=1e-4) def test_spatial_model(self): model = self.cat["4FGL J0000.3-7355"].spatial_model() assert "PointSpatialModel" in model.tag assert model.frame == "icrs" p = model.parameters assert_allclose(p["lon_0"].value, 0.0983) assert_allclose(p["lat_0"].value, -73.921997) pos_err = model.position_error assert_allclose(pos_err.angle.value, -62.7) assert_allclose(0.5 * pos_err.height.value, 0.0525, rtol=1e-4) assert_allclose(0.5 * pos_err.width.value, 0.051, rtol=1e-4) assert_allclose(model.position.ra.value, pos_err.center.ra.value) assert_allclose(model.position.dec.value, pos_err.center.dec.value) model = self.cat["4FGL J1409.1-6121e"].spatial_model() assert "DiskSpatialModel" in model.tag assert model.frame == "icrs" p = model.parameters assert_allclose(p["lon_0"].value, 212.294006) assert_allclose(p["lat_0"].value, -61.353001) assert_allclose(p["r_0"].value, 0.7331369519233704) model = self.cat["4FGL J0617.2+2234e"].spatial_model() assert "GaussianSpatialModel" in model.tag assert model.frame == "icrs" p = model.parameters assert_allclose(p["lon_0"].value, 94.309998) assert_allclose(p["lat_0"].value, 22.58) assert_allclose(p["sigma"].value, 0.27) model = self.cat["4FGL J1443.0-6227e"].spatial_model() assert self.cat["4FGL J1443.0-6227e"].data_extended["version"] == 20 assert "TemplateSpatialModel" in model.tag assert model.frame == "fk5" assert model.normalize @pytest.mark.parametrize("ref", SOURCES_4FGL, ids=lambda _: _["name"]) def test_sky_model(self, ref): self.cat[ref["idx"]].sky_model def test_flux_points(self): flux_points = self.source.flux_points assert flux_points.norm.geom.axes["energy"].nbin == 7 assert flux_points.norm_ul desired = [ 2.2378458e-06, 1.4318283e-06, 5.4776939e-07, 1.2769708e-07, 2.5820052e-08, 2.3897000e-09, 7.1766204e-11, ] assert_allclose(flux_points.flux.data.flat, desired, rtol=1e-5) def test_flux_points_meta(self): source = self.cat["4FGL J0000.3-7355"] fp = source.flux_points assert_allclose(fp.sqrt_ts_threshold_ul, 1) assert_allclose(fp.n_sigma, 1) assert_allclose(fp.n_sigma_ul, 2) def test_flux_points_ul(self): source = self.cat["4FGL J0000.3-7355"] 
flux_points = source.flux_points desired = [ 4.13504750e-08, 3.80519616e-09, np.nan, np.nan, np.nan, np.nan, 7.99699456e-12, ] assert_allclose(flux_points.flux_ul.data.flat, desired, rtol=1e-5) def test_lightcurve_dr1(self): lc = self.source.lightcurve(interval="1-year") table = lc.to_table(format="lightcurve", sed_type="flux") assert len(table) == 8 assert table.colnames == [ "time_min", "time_max", "e_ref", "e_min", "e_max", "flux", "flux_errp", "flux_errn", "flux_ul", "ts", "sqrt_ts", "is_ul", ] axis = lc.geom.axes["time"] expected = Time(54682.6552835, format="mjd", scale="utc") assert_time_allclose(axis.time_min[0].utc, expected) expected = Time(55045.30090278, format="mjd", scale="utc") assert_time_allclose(axis.time_max[0].utc, expected) assert table["flux"].unit == "cm-2 s-1" assert_allclose(table["flux"][0], 2.2122326e-06, rtol=1e-3) assert table["flux_errp"].unit == "cm-2 s-1" assert_allclose(table["flux_errp"][0], 2.3099371e-08, rtol=1e-3) assert table["flux_errn"].unit == "cm-2 s-1" assert_allclose(table["flux_errn"][0], 2.3099371e-08, rtol=1e-3) table = self.source.lightcurve(interval="2-month").to_table( format="lightcurve", sed_type="flux" ) assert len(table) == 48 # (12 month/year / 2month) * 8 years assert table["flux"].unit == "cm-2 s-1" assert_allclose(table["flux"][0], 2.238483e-6, rtol=1e-3) assert table["flux_errp"].unit == "cm-2 s-1" assert_allclose(table["flux_errp"][0], 4.437058e-8, rtol=1e-3) assert table["flux_errn"].unit == "cm-2 s-1" assert_allclose(table["flux_errn"][0], 4.437058e-8, rtol=1e-3) def test_lightcurve_dr4(self): dr4 = SourceCatalog4FGL("$GAMMAPY_DATA/catalogs/fermi/gll_psc_v32.fit.gz") source_dr4 = dr4[self.source_name] table = source_dr4.lightcurve(interval="1-year").to_table( format="lightcurve", sed_type="flux" ) assert table["flux"].unit == "cm-2 s-1" assert_allclose(table["flux"][0], 2.307773e-06, rtol=1e-3) assert table["flux_errp"].unit == "cm-2 s-1" assert_allclose(table["flux_errp"][0], 2.298336e-08, rtol=1e-3) assert table["flux_errn"].unit == "cm-2 s-1" assert_allclose(table["flux_errn"][0], 2.298336e-08, rtol=1e-3) with pytest.raises(ValueError): source_dr4.lightcurve(interval="2-month") @requires_data() class TestFermi3FGLObject: @classmethod def setup_class(cls): cls.cat = SourceCatalog3FGL() # Use 3FGL J0534.5+2201 (Crab) as a test source cls.source_name = "3FGL J0534.5+2201" cls.source = cls.cat[cls.source_name] def test_name(self): assert self.source.name == self.source_name def test_row_index(self): assert self.source.row_index == 621 def test_data(self): assert_allclose(self.source.data["Signif_Avg"], 30.669872283935547) def test_position(self): position = self.source.position assert_allclose(position.ra.deg, 83.637199, atol=1e-3) assert_allclose(position.dec.deg, 22.024099, atol=1e-3) @pytest.mark.parametrize("ref", SOURCES_3FGL, ids=lambda _: _["name"]) def test_str(self, ref): actual = str(self.cat[ref["idx"]]) with open(get_pkg_data_filename(ref["str_ref_file"])) as fh: expected = fh.read() assert actual == modify_unit_order_astropy_5_3(expected) @pytest.mark.parametrize("ref", SOURCES_3FGL, ids=lambda _: _["name"]) def test_spectral_model(self, ref): model = self.cat[ref["idx"]].spectral_model() dnde, dnde_err = model.evaluate_error(1 * u.GeV) assert isinstance(model, ref["spec_type"]) assert_quantity_allclose(dnde, ref["dnde"]) assert_quantity_allclose(dnde_err, ref["dnde_err"], rtol=1e-3) def test_spatial_model(self): model = self.cat[0].spatial_model() assert "PointSpatialModel" in model.tag assert model.frame == "icrs" 
p = model.parameters assert_allclose(p["lon_0"].value, 0.0377) assert_allclose(p["lat_0"].value, 65.751701) model = self.cat[122].spatial_model() assert "GaussianSpatialModel" in model.tag assert model.frame == "icrs" p = model.parameters assert_allclose(p["lon_0"].value, 14.75) assert_allclose(p["lat_0"].value, -72.699997) assert_allclose(p["sigma"].value, 1.35) model = self.cat[955].spatial_model() assert "DiskSpatialModel" in model.tag assert model.frame == "icrs" p = model.parameters assert_allclose(p["lon_0"].value, 128.287201) assert_allclose(p["lat_0"].value, -45.190102) assert_allclose(p["r_0"].value, 0.91) model = self.cat[602].spatial_model() assert "TemplateSpatialModel" in model.tag assert model.frame == "fk5" assert model.normalize model = self.cat["3FGL J0000.2-3738"].spatial_model() pos_err = model.position_error assert_allclose(pos_err.angle.value, -88.55) assert_allclose(0.5 * pos_err.height.value, 0.0731, rtol=1e-4) assert_allclose(0.5 * pos_err.width.value, 0.0676, rtol=1e-4) assert_allclose(model.position.ra.value, pos_err.center.ra.value) assert_allclose(model.position.dec.value, pos_err.center.dec.value) @pytest.mark.parametrize("ref", SOURCES_3FGL, ids=lambda _: _["name"]) def test_sky_model(self, ref): self.cat[ref["idx"]].sky_model() def test_flux_points(self): flux_points = self.source.flux_points assert flux_points.energy_axis.nbin == 5 assert flux_points.norm_ul desired = [1.645888e-06, 5.445407e-07, 1.255338e-07, 2.545524e-08, 2.263189e-09] assert_allclose(flux_points.flux.data.flat, desired, rtol=1e-5) def test_flux_points_ul(self): source = self.cat["3FGL J0000.2-3738"] flux_points = source.flux_points desired = [4.096391e-09, 6.680059e-10, np.nan, np.nan, np.nan] assert_allclose(flux_points.flux_ul.data.flat, desired, rtol=1e-5) def test_lightcurve(self): lc = self.source.lightcurve() table = lc.to_table(format="lightcurve", sed_type="flux") assert len(table) == 48 assert table.colnames == [ "time_min", "time_max", "e_ref", "e_min", "e_max", "flux", "flux_errp", "flux_errn", "flux_ul", "is_ul", ] expected = Time(54680.02313657408, format="mjd", scale="utc") axis = lc.geom.axes["time"] assert_time_allclose(axis.time_min[0].utc, expected) expected = Time(54710.46295139, format="mjd", scale="utc") assert_time_allclose(axis.time_max[0].utc, expected) assert table["flux"].unit == "cm-2 s-1" assert_allclose(table["flux"][0], 2.384e-06, rtol=1e-3) assert table["flux_errp"].unit == "cm-2 s-1" assert_allclose(table["flux_errp"][0], 8.071e-08, rtol=1e-3) assert table["flux_errn"].unit == "cm-2 s-1" assert_allclose(table["flux_errn"][0], 8.071e-08, rtol=1e-3) def test_crab_alias(self): for name in [ "Crab", "3FGL J0534.5+2201", "1FHL J0534.5+2201", "PSR J0534+2200", ]: assert self.cat[name].row_index == 621 @requires_data() class TestFermi2FHLObject: @classmethod def setup_class(cls): cls.cat = SourceCatalog2FHL() # Use 2FHL J0534.5+2201 (Crab) as a test source cls.source_name = "2FHL J0534.5+2201" cls.source = cls.cat[cls.source_name] def test_name(self): assert self.source.name == self.source_name def test_position(self): position = self.source.position assert_allclose(position.ra.deg, 83.634102, atol=1e-3) assert_allclose(position.dec.deg, 22.0215, atol=1e-3) @pytest.mark.parametrize("ref", SOURCES_2FHL, ids=lambda _: _["name"]) def test_str(self, ref): actual = str(self.cat[ref["idx"]]) with open(get_pkg_data_filename(ref["str_ref_file"])) as fh: expected = fh.read() assert actual == modify_unit_order_astropy_5_3(expected) def test_spectral_model(self): model = 
self.source.spectral_model() energy = u.Quantity(100, "GeV") desired = u.Quantity(6.8700477298e-12, "cm-2 GeV-1 s-1") assert_quantity_allclose(model(energy), desired) def test_flux_points(self): # test flux point on PKS 2155-304 src = self.cat["PKS 2155-304"] flux_points = src.flux_points actual = flux_points.flux.quantity[:, 0, 0] desired = [2.866363e-10, 6.118736e-11, 3.257970e-16] * u.Unit("cm-2 s-1") assert_quantity_allclose(actual, desired) actual = flux_points.flux_ul.quantity[:, 0, 0] desired = [np.nan, np.nan, 1.294092e-11] * u.Unit("cm-2 s-1") assert_quantity_allclose(actual, desired, rtol=1e-3) def test_spatial_model(self): model = self.cat[221].spatial_model() assert "PointSpatialModel" in model.tag assert model.frame == "icrs" p = model.parameters assert_allclose(p["lon_0"].value, 221.281998, rtol=1e-5) assert_allclose(p["lat_0"].value, -3.4943, rtol=1e-5) model = self.cat["2FHL J1304.5-4353"].spatial_model() pos_err = model.position_error scale = Gauss2DPDF().containment_radius(0.95) / Gauss2DPDF().containment_radius( 0.68 ) assert_allclose(pos_err.height.value, 2 * 0.041987 * scale, rtol=1e-4) assert_allclose(pos_err.width.value, 2 * 0.041987 * scale, rtol=1e-4) assert_allclose(model.position.ra.value, pos_err.center.ra.value) assert_allclose(model.position.dec.value, pos_err.center.dec.value) model = self.cat[97].spatial_model() assert "GaussianSpatialModel" in model.tag assert model.frame == "icrs" p = model.parameters assert_allclose(p["lon_0"].value, 94.309998, rtol=1e-5) assert_allclose(p["lat_0"].value, 22.58, rtol=1e-5) assert_allclose(p["sigma"].value, 0.27) model = self.cat[134].spatial_model() assert "DiskSpatialModel" in model.tag assert model.frame == "icrs" p = model.parameters assert_allclose(p["lon_0"].value, 125.660004, rtol=1e-5) assert_allclose(p["lat_0"].value, -42.84, rtol=1e-5) assert_allclose(p["r_0"].value, 0.37) model = self.cat[256].spatial_model() assert "TemplateSpatialModel" in model.tag assert model.frame == "fk5" assert model.normalize @requires_data() class TestFermi3FHLObject: @classmethod def setup_class(cls): cls.cat = SourceCatalog3FHL() # Use 3FHL J0534.5+2201 (Crab) as a test source cls.source_name = "3FHL J0534.5+2201" cls.source = cls.cat[cls.source_name] def test_name(self): assert self.source.name == self.source_name def test_row_index(self): assert self.source.row_index == 352 def test_data(self): assert_allclose(self.source.data["Signif_Avg"], 168.64082) def test_str(self): actual = str(self.cat["3FHL J2301.9+5855e"]) # an extended source with open(get_pkg_data_filename("data/3fhl_j2301.9+5855e.txt")) as fh: expected = fh.read() assert actual == modify_unit_order_astropy_5_3(expected) def test_position(self): position = self.source.position assert_allclose(position.ra.deg, 83.634834, atol=1e-3) assert_allclose(position.dec.deg, 22.019203, atol=1e-3) @pytest.mark.parametrize("ref", SOURCES_3FHL, ids=lambda _: _["name"]) def test_spectral_model(self, ref): model = self.cat[ref["idx"]].spectral_model() dnde, dnde_err = model.evaluate_error(100 * u.GeV) assert isinstance(model, ref["spec_type"]) assert_quantity_allclose(dnde, ref["dnde"]) assert_quantity_allclose(dnde_err, ref["dnde_err"], rtol=1e-3) @pytest.mark.parametrize("ref", SOURCES_3FHL, ids=lambda _: _["name"]) def test_spatial_model(self, ref): model = self.cat[ref["idx"]].spatial_model() assert model.frame == "icrs" model = self.cat["3FHL J0002.1-6728"].spatial_model() pos_err = model.position_error assert_allclose(0.5 * pos_err.height.value, 0.035713, rtol=1e-4) 
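# --- Illustrative note (not part of the test suite): the position-error checks
# in these catalog tests convert a 1-sigma (68%) error radius into the 95%
# containment ellipse, which is where the factor of two and the containment
# scaling in the assertions come from:
#
#     from gammapy.utils.gauss import Gauss2DPDF
#
#     scale = Gauss2DPDF().containment_radius(0.95) / Gauss2DPDF().containment_radius(0.68)
#     r95 = r68 * scale  # r68 is a hypothetical 68% error radius in degrees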
        assert_allclose(0.5 * pos_err.width.value, 0.035713, rtol=1e-4)
        assert_allclose(model.position.ra.value, pos_err.center.ra.value)
        assert_allclose(model.position.dec.value, pos_err.center.dec.value)

    @pytest.mark.parametrize("ref", SOURCES_3FHL, ids=lambda _: _["name"])
    def test_sky_model(self, ref):
        self.cat[ref["idx"]].sky_model()

    def test_flux_points(self):
        flux_points = self.source.flux_points
        assert flux_points.energy_axis.nbin == 5
        assert flux_points.norm_ul
        desired = [5.169889e-09, 2.245024e-09, 9.243175e-10, 2.758956e-10, 6.684021e-11]
        assert_allclose(flux_points.flux.data[:, 0, 0], desired, rtol=1e-3)

    def test_crab_alias(self):
        for name in ["Crab Nebula", "3FHL J0534.5+2201", "3FGL J0534.5+2201i"]:
            assert self.cat[name].row_index == 352


@requires_data()
class TestFermi2PCObject:
    @classmethod
    def setup_class(cls):
        cls.cat = SourceCatalog2PC()
        # Use J0835-4510 (Vela pulsar) as a test source
        cls.source_name = "J0835-4510"
        cls.source = cls.cat[cls.source_name]

    def test_name(self):
        assert self.source.name == self.source_name

    def test_row_index(self):
        assert self.source.row_index == 26

    @pytest.mark.parametrize("ref", SOURCES_2PC, ids=lambda _: _["name"])
    def test_str(self, ref):
        actual = str(self.cat[ref["idx"]])
        with open(get_pkg_data_filename(ref["str_ref_file"])) as fh:
            expected = fh.read()
        assert actual == modify_unit_order_astropy_5_3(expected)

    def test_position(self):
        position = self.source.position
        assert_allclose(position.ra.deg, 128.83580017, atol=1e-3)
        assert_allclose(position.dec.deg, -45.17630005, atol=1e-3)

    def test_data(self):
        assert_allclose(self.source.data["Period"], 89.36 * u.ms)

    @pytest.mark.parametrize("ref", SOURCES_2PC, ids=lambda _: _["name"])
    def test_spectral_model(self, ref):
        model = self.cat[ref["idx"]].spectral_model()
        e_ref = model.reference.quantity
        dnde, dnde_err = model.evaluate_error(e_ref)
        assert isinstance(model, ref["spec_type"])
        assert_quantity_allclose(dnde, ref["dnde"], rtol=1e-4)
        assert_quantity_allclose(dnde_err, ref["dnde_err"], rtol=1e-4)

    def test_spatial_model(self):
        model = self.source.spatial_model()
        assert "PointSpatialModel" in model.tag
        assert model.frame == "icrs"
        p = model.parameters
        assert_allclose(p["lon_0"].value, 128.83580017)
        assert_allclose(p["lat_0"].value, -45.17630005)

    @pytest.mark.parametrize("ref", SOURCES_2PC, ids=lambda _: _["name"])
    def test_sky_model(self, ref):
        self.cat[ref["idx"]].sky_model()

    def test_flux_points(self):
        flux_points = self.source.flux_points
        assert flux_points.norm.geom.axes["energy"].nbin == 12
        assert flux_points.norm_ul
        desired = [
            3.72509152e-09,
            2.66778077e-09,
            1.89817530e-09,
            1.23380980e-09,
            7.13620134e-10,
            3.64504041e-10,
            1.47285787e-10,
            4.66089223e-11,
            9.73255716e-12,
            1.48225895e-12,
            1.20599140e-13,
            3.89632873e-14,
        ]
        assert_allclose(flux_points.flux.data.flat, desired, rtol=1e-5)

    def test_flux_points_ul(self):
        flux_points = self.source.flux_points
        desired = [
            np.nan,
            np.nan,
            np.nan,
            np.nan,
            np.nan,
            np.nan,
            np.nan,
            np.nan,
            np.nan,
            np.nan,
            np.nan,
            3.896328e-14,
        ]
        assert_allclose(flux_points.flux_ul.data.flat, desired, rtol=1e-5)

    @pytest.mark.xfail
    def test_lightcurve(self, ref=SOURCES_3PC_NONE[0]):
        # TODO: add a lightcurve test once lightcurves are introduced to the class.
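        # --- Illustrative sketch (comment only, not part of the original test):
        # the upper-limit convention checked in test_flux_points_ul above is
        # that finite flux_ul entries mark upper-limit bins, while NaN entries
        # mark bins with a measured flux:
        #
        #     fp = SourceCatalog2PC()["J0835-4510"].flux_points
        #     ul_mask = np.isfinite(fp.flux_ul.data)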
        lightcurve = self.cat[ref["idx"]].lightcurve
        assert lightcurve is not None


@requires_data()
class TestFermi3PCObject:
    @classmethod
    def setup_class(cls):
        cls.cat = SourceCatalog3PC()
        # Use J0835-4510 (Vela pulsar) as a test source
        cls.source_name = "J0835-4510"
        cls.source = cls.cat[cls.source_name]

    def test_name(self):
        assert self.source.name == self.source_name

    def test_row_index(self):
        assert self.source.row_index == 52

    @pytest.mark.parametrize(
        "ref", [SOURCES_3PC[0], SOURCES_3PC_NONE[0]], ids=lambda _: _["name"]
    )
    def test_str(self, ref):
        actual = str(self.cat[ref["idx"]])
        with open(get_pkg_data_filename(ref["str_ref_file"])) as fh:
            expected = fh.read()
        assert actual == modify_unit_order_astropy_5_3(expected)

    def test_position(self):
        position = self.source.position
        assert_allclose(position.ra.deg, 128.83580017, atol=1e-3)
        assert_allclose(position.dec.deg, -45.17630005, atol=1e-3)

    def test_data(self):
        assert_allclose(self.source.data["P0"], 0.08937108859019084)

    def test_spectral_model(self, ref_list=SOURCES_3PC):
        for ref in ref_list:
            model = self.cat[ref["idx"]].spectral_model()
            e_ref = model.reference.quantity
            dnde, dnde_err = model.evaluate_error(e_ref)
            assert isinstance(model, ref["spec_type"])
            assert_quantity_allclose(dnde, ref["dnde"], rtol=1e-4)
            assert_quantity_allclose(dnde_err, ref["dnde_err"], rtol=1e-4)
            if ref["name"] == "J0940-5428":
                assert model.index_2.error == 0.0

    def test_spectral_model_none(self, ref=SOURCES_3PC_NONE[0]):
        model = self.cat[ref["idx"]].spectral_model()
        assert model is None

    def test_spatial_model(self):
        model = self.source.spatial_model()
        assert "PointSpatialModel" in model.tag
        assert model.frame == "icrs"
        p = model.parameters
        assert_allclose(p["lon_0"].value, 128.83588121)
        assert_allclose(p["lat_0"].value, -45.17635419)

    def test_sky_model(self, ref=SOURCES_3PC[0]):
        self.cat[ref["idx"]].sky_model()

    def test_flux_points(self, ref=SOURCES_3PC[0]):
        flux_points = self.cat[ref["idx"]].flux_points
        assert flux_points.norm.geom.axes["energy"].nbin == 8
        assert flux_points.norm_ul
        desired = [
            5.431500e-06,
            5.635604e-06,
            3.341596e-06,
            1.058516e-06,
            2.264139e-07,
            1.421599e-08,
            6.306778e-10,
            3.165719e-12,
        ]
        assert_allclose(flux_points.flux.data.flat, desired, rtol=1e-5)

    def test_flux_points_none(self, ref=SOURCES_3PC_NONE[0]):
        flux_points = self.cat[ref["idx"]].flux_points
        assert flux_points is None

    def test_flux_points_ul(self):
        flux_points = self.source.flux_points
        desired = [np.nan, np.nan, np.nan, np.nan, np.nan, np.nan, np.nan, 5.146455e-08]
        assert_allclose(flux_points.flux_ul.data.flat, desired, rtol=1e-5)

    @pytest.mark.xfail
    def test_lightcurve(self, ref=SOURCES_3PC_NONE[0]):
        # TODO: add a lightcurve test once lightcurves are introduced to the class.
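        # --- Illustrative sketch (comment only, not part of the original test):
        # some 3PC entries carry no spectral fit, so spectral_model() and
        # flux_points return None, as test_spectral_model_none and
        # test_flux_points_none assert above; a defensive access pattern:
        #
        #     src = SourceCatalog3PC()["J0835-4510"]
        #     model = src.spectral_model()
        #     if model is not None:
        #         dnde, dnde_err = model.evaluate_error(model.reference.quantity)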
lightcurve = self.cat[ref["idx"]].lightcurve assert lightcurve is not None @requires_data() class TestSourceCatalog3FGL: @classmethod def setup_class(cls): cls.cat = SourceCatalog3FGL() def test_main_table(self): assert len(self.cat.table) == 3034 def test_extended_sources(self): table = self.cat.extended_sources_table assert len(table) == 25 @requires_data() class TestSourceCatalog2FHL: @classmethod def setup_class(cls): cls.cat = SourceCatalog2FHL() def test_main_table(self): assert len(self.cat.table) == 360 def test_extended_sources(self): table = self.cat.extended_sources_table assert len(table) == 25 def test_crab_alias(self): for name in ["Crab", "3FGL J0534.5+2201i", "1FHL J0534.5+2201"]: assert self.cat[name].row_index == 85 @requires_data() class TestSourceCatalog3FHL: @classmethod def setup_class(cls): cls.cat = SourceCatalog3FHL() def test_main_table(self): assert len(self.cat.table) == 1556 def test_extended_sources(self): table = self.cat.extended_sources_table assert len(table) == 55 def test_to_models(self): mask = self.cat.table["GLAT"].quantity > 80 * u.deg subcat = self.cat[mask] models = subcat.to_models() assert len(models) == 17 @requires_data() class TestSourceCatalog2PC: @classmethod def setup_class(cls): cls.cat = SourceCatalog2PC() def test_main_table(self): assert len(self.cat.table) == 117 def test_spectral_table(self): table = self.cat.spectral_table assert len(table) == 117 def test_to_models(self): mask = self.cat.table["GLAT"].quantity > 30 * u.deg subcat = self.cat[mask] models = subcat.to_models() assert len(models) == 2 @requires_data() class TestSourceCatalog3PC: @classmethod def setup_class(cls): cls.cat = SourceCatalog3PC() def test_main_table(self): assert len(self.cat.table) == 294 def test_spectral_table(self): table = self.cat.spectral_table assert len(table) == 305 def test_to_models(self): mask = self.cat.table["Gb"] > 30 subcat = self.cat[mask] models = subcat.to_models() assert len(models) == 17 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/test_gammacat.py0000644000175100001770000001411014721316200022000 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose from astropy import units as u from astropy.utils.data import get_pkg_data_filename from gammapy.catalog import SourceCatalogGammaCat from gammapy.utils.gauss import Gauss2DPDF from gammapy.utils.testing import ( assert_quantity_allclose, modify_unit_order_astropy_5_3, requires_data, ) SOURCES = [ { "name": "Vela X", "str_ref_file": "data/gammacat_vela_x.txt", "spec_type": "ecpl", "dnde_1TeV": 1.36e-11 * u.Unit("cm-2 s-1 TeV-1"), "dnde_1TeV_err": 7.531e-13 * u.Unit("cm-2 s-1 TeV-1"), "flux_1TeV": 2.104e-11 * u.Unit("cm-2 s-1"), "eflux_1_10TeV": 9.265778680255336e-11 * u.Unit("erg cm-2 s-1"), "n_flux_points": 24, "spatial_model": "GaussianSpatialModel", "ra": 128.287003, "dec": -45.189999, }, { "name": "HESS J1848-018", "str_ref_file": "data/gammacat_hess_j1848-018.txt", "spec_type": "pl", "dnde_1TeV": 3.7e-12 * u.Unit("cm-2 s-1 TeV-1"), "dnde_1TeV_err": 4e-13 * u.Unit("cm-2 s-1 TeV-1"), "flux_1TeV": 2.056e-12 * u.Unit("cm-2 s-1"), "eflux_1_10TeV": 6.235650344765057e-12 * u.Unit("erg cm-2 s-1"), "n_flux_points": 11, "spatial_model": "GaussianSpatialModel", "ra": 282.119995, "dec": -1.792, }, { "name": "HESS J1813-178", "str_ref_file": "data/gammacat_hess_j1813-178.txt", "spec_type": "pl2", "dnde_1TeV": 2.678e-12 * u.Unit("cm-2 
s-1 TeV-1"), "dnde_1TeV_err": 2.55e-13 * u.Unit("cm-2 s-1 TeV-1"), "flux_1TeV": 2.457e-12 * u.Unit("cm-2 s-1"), "eflux_1_10TeV": 8.923614018939419e-12 * u.Unit("erg cm-2 s-1"), "n_flux_points": 13, "spatial_model": "GaussianSpatialModel", "ra": 273.362915, "dec": -17.84889, }, ] @pytest.fixture(scope="session") def gammacat(): filename = "$GAMMAPY_DATA/catalogs/gammacat/gammacat.fits.gz" return SourceCatalogGammaCat(filename=filename) @requires_data() class TestSourceCatalogGammaCat: def test_source_table(self, gammacat): assert gammacat.tag == "gamma-cat" assert len(gammacat.table) == 162 def test_positions(self, gammacat): assert len(gammacat.positions) == 162 def test_w28_alias_names(self, gammacat): for name in [ "W28", "HESS J1801-233", "W 28", "SNR G6.4-0.1", "SNR G006.4-00.1", "GRO J1801-2320", ]: assert gammacat[name].row_index == 112 @requires_data() class TestSourceCatalogObjectGammaCat: def test_data(self, gammacat): source = gammacat[0] assert isinstance(source.data, dict) assert source.data["common_name"] == "CTA 1" assert_quantity_allclose(source.data["dec"], 72.782997 * u.deg) @pytest.mark.parametrize("ref", SOURCES, ids=lambda _: _["name"]) def test_str(self, gammacat, ref): actual = str(gammacat[ref["name"]]) with open(get_pkg_data_filename(ref["str_ref_file"])) as fh: expected = fh.read() assert actual == modify_unit_order_astropy_5_3(expected) @pytest.mark.parametrize("ref", SOURCES, ids=lambda _: _["name"]) def test_spectral_model(self, gammacat, ref): source = gammacat[ref["name"]] spectral_model = source.spectral_model() assert source.data["spec_type"] == ref["spec_type"] e_min, e_max, e_inf = [1, 10, 1e10] * u.TeV dne = spectral_model(e_min) flux = spectral_model.integral(energy_min=e_min, energy_max=e_inf) eflux = spectral_model.energy_flux(energy_min=e_min, energy_max=e_max).to( "erg cm-2 s-1" ) print(spectral_model) assert_quantity_allclose(dne, ref["dnde_1TeV"], rtol=1e-3) assert_quantity_allclose(flux, ref["flux_1TeV"], rtol=1e-3) assert_quantity_allclose(eflux, ref["eflux_1_10TeV"], rtol=1e-3) @pytest.mark.parametrize("ref", SOURCES, ids=lambda _: _["name"]) def test_spectral_model_err(self, gammacat, ref): source = gammacat[ref["name"]] spectral_model = source.spectral_model() e_min, e_max, e_inf = [1, 10, 1e10] * u.TeV dnde, dnde_err = spectral_model.evaluate_error(e_min) assert_quantity_allclose(dnde, ref["dnde_1TeV"], rtol=1e-3) assert_quantity_allclose(dnde_err, ref["dnde_1TeV_err"], rtol=1e-3) @pytest.mark.parametrize("ref", SOURCES, ids=lambda _: _["name"]) def test_flux_points(self, gammacat, ref): source = gammacat[ref["name"]] flux_points = source.flux_points assert flux_points.energy_axis.nbin == ref["n_flux_points"] @pytest.mark.parametrize("ref", SOURCES, ids=lambda _: _["name"]) def test_position(self, gammacat, ref): source = gammacat[ref["name"]] position = source.position assert_allclose(position.ra.deg, ref["ra"], atol=1e-3) assert_allclose(position.dec.deg, ref["dec"], atol=1e-3) @pytest.mark.parametrize("ref", SOURCES, ids=lambda _: _["name"]) def test_spatial_model(self, gammacat, ref): source = gammacat[ref["name"]] spatial_model = source.spatial_model() assert spatial_model.frame == "galactic" # TODO: add a point and shell source -> separate list of sources for # morphology test parametrization? 
assert spatial_model.__class__.__name__ == ref["spatial_model"] model = gammacat["HESS J1634-472"].spatial_model() pos_err = model.position_error scale_r95 = Gauss2DPDF().containment_radius(0.95) assert_allclose(pos_err.height.value, 2 * 0.044721 * scale_r95, rtol=1e-4) assert_allclose(pos_err.width.value, 2 * 0.044721 * scale_r95, rtol=1e-4) assert_allclose(model.position.l.value, pos_err.center.l.value) assert_allclose(model.position.b.value, pos_err.center.b.value) @pytest.mark.parametrize("ref", SOURCES, ids=lambda _: _["name"]) def test_sky_model(self, gammacat, ref): gammacat[ref["name"]].sky_model() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/test_hawc.py0000644000175100001770000001506014721316200021155 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose import astropy.units as u from astropy.utils.data import get_pkg_data_filename from gammapy.catalog import SourceCatalog2HWC from gammapy.catalog.hawc import SourceCatalog3HWC from gammapy.modeling.models import ( DiskSpatialModel, PointSpatialModel, PowerLawSpectralModel, ) from gammapy.utils.gauss import Gauss2DPDF from gammapy.utils.testing import requires_data @pytest.fixture(scope="session") def cat(): return SourceCatalog2HWC() @requires_data() class TestSourceCatalog2HWC: @staticmethod def test_source_table(cat): assert cat.tag == "2hwc" assert len(cat.table) == 40 @staticmethod def test_positions(cat): assert len(cat.positions) == 40 @staticmethod def test_to_models(cat): models = cat.to_models(which="point") assert len(models) == 40 @requires_data() class TestSourceCatalogObject2HWC: @staticmethod def test_data(cat): assert cat[0].data["source_name"] == "2HWC J0534+220" assert cat[0].n_models == 1 assert cat[1].data["source_name"] == "2HWC J0631+169" assert cat[1].n_models == 2 @staticmethod def test_str(cat): expected = open(get_pkg_data_filename("data/2hwc_j0534+220.txt")).read() assert str(cat[0]) == expected expected = open(get_pkg_data_filename("data/2hwc_j0631+169.txt")).read() assert str(cat[1]) == expected @staticmethod def test_position(cat): position = cat[0].position assert_allclose(position.ra.deg, 83.628, atol=1e-3) assert_allclose(position.dec.deg, 22.024, atol=1e-3) @staticmethod def test_sky_model(cat): model = cat[1].sky_model("extended") assert model.name == "2HWC J0631+169" assert isinstance(model.spectral_model, PowerLawSpectralModel) assert isinstance(model.spatial_model, DiskSpatialModel) with pytest.raises(ValueError): cat[0].sky_model("extended") @staticmethod def test_spectral_model(cat): m = cat[0].spectral_model() dnde, dnde_err = m.evaluate_error(1 * u.TeV) assert dnde.unit == "cm-2 s-1 TeV-1" assert_allclose(dnde.value, 2.802365e-11, rtol=1e-3) assert_allclose(dnde_err.value, 6.537506e-13, rtol=1e-3) @staticmethod def test_spatial_model(cat): m = cat[1].spatial_model() assert isinstance(m, PointSpatialModel) assert m.lon_0.unit == "deg" assert_allclose(m.lon_0.value, 195.614, atol=1e-2) assert_allclose(m.lon_0.error, 0.114, atol=1e-2) assert m.lat_0.unit == "deg" assert_allclose(m.lat_0.value, 3.507, atol=1e-2) assert m.frame == "galactic" m = cat[1].spatial_model("extended") assert isinstance(m, DiskSpatialModel) assert m.lon_0.unit == "deg" assert_allclose(m.lon_0.value, 195.614, atol=1e-10) assert m.lat_0.unit == "deg" assert_allclose(m.lat_0.value, 3.507, atol=1e-10) assert m.frame == "galactic" assert m.r_0.unit == "deg" 
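# --- Illustrative sketch (not part of the test suite; requires $GAMMAPY_DATA).
# 2HWC entries can provide both a point-like and an extended hypothesis;
# requesting a missing "extended" model raises ValueError (see test_sky_model above):
#
#     from gammapy.catalog import SourceCatalog2HWC
#
#     cat2hwc = SourceCatalog2HWC()
#     if cat2hwc[1].n_models == 2:  # 2HWC J0631+169 carries two models
#         extended = cat2hwc[1].sky_model("extended")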
assert_allclose(m.r_0.value, 2.0, atol=1e-3) model = cat["2HWC J0534+220"].spatial_model() pos_err = model.position_error scale_r95 = Gauss2DPDF().containment_radius(0.95) assert_allclose(pos_err.height.value, 2 * 0.057 * scale_r95, rtol=1e-4) assert_allclose(pos_err.width.value, 2 * 0.057 * scale_r95, rtol=1e-4) assert_allclose(model.position.l.value, pos_err.center.l.value) assert_allclose(model.position.b.value, pos_err.center.b.value) @pytest.fixture(scope="session") def ca_3hwc(): return SourceCatalog3HWC() @requires_data() class TestSourceCatalog3HWC: @staticmethod def test_source_table(ca_3hwc): assert ca_3hwc.tag == "3hwc" assert len(ca_3hwc.table) == 65 @staticmethod def test_positions(ca_3hwc): assert len(ca_3hwc.positions) == 65 @staticmethod def test_to_models(ca_3hwc): models = ca_3hwc.to_models() assert len(models) == 65 @requires_data() class TestSourceCatalogObject3HWC: @staticmethod def test_data(ca_3hwc): assert ca_3hwc[0].data["source_name"] == "3HWC J0534+220" assert ca_3hwc[0].n_models == 1 ca_3hwc[0].info() assert ca_3hwc[1].data["source_name"] == "3HWC J0540+228" assert ca_3hwc[1].n_models == 1 @staticmethod def test_sky_model(ca_3hwc): model = ca_3hwc[4].sky_model() assert model.name == "3HWC J0621+382" assert isinstance(model.spectral_model, PowerLawSpectralModel) assert isinstance(model.spatial_model, DiskSpatialModel) @staticmethod def test_spectral_model(ca_3hwc): m = ca_3hwc[1].spectral_model() assert_allclose(m.amplitude.value, 4.7558e-15, atol=1e-3) assert_allclose(m.amplitude.error, 7.411645e-16, atol=1e-3) assert_allclose(m.index.value, 2.8396, atol=1e-3) assert_allclose(m.index.error, 0.1425, atol=1e-3) m = ca_3hwc[0].spectral_model() dnde, dnde_err = m.evaluate_error(7 * u.TeV) assert dnde.unit == "cm-2 s-1 TeV-1" assert_allclose(dnde.value, 2.34e-13, rtol=1e-2) assert_allclose(dnde_err.value, 1.4e-15, rtol=1e-1) @staticmethod def test_spatial_model(ca_3hwc): m = ca_3hwc[1].spatial_model() assert isinstance(m, PointSpatialModel) assert m.lon_0.unit == "deg" assert_allclose(m.lon_0.value, 184.583, atol=1e-2) assert_allclose(m.lon_0.error, 0.112, atol=1e-2) assert m.lat_0.unit == "deg" assert_allclose(m.lat_0.value, -4.129, atol=1e-2) assert m.frame == "galactic" m = ca_3hwc["3HWC J0621+382"].spatial_model() assert isinstance(m, DiskSpatialModel) assert m.lon_0.unit == "deg" assert_allclose(m.lon_0.value, 175.444254, atol=1e-10) assert m.lat_0.unit == "deg" assert_allclose(m.lat_0.value, 10.966142, atol=1e-10) assert m.frame == "galactic" assert m.r_0.unit == "deg" assert_allclose(m.r_0.value, 0.5, atol=1e-3) model = ca_3hwc["3HWC J0621+382"].spatial_model() pos_err = model.position_error scale_r95 = Gauss2DPDF().containment_radius(0.95) assert_allclose(pos_err.height.value, 2 * 0.3005 * scale_r95, rtol=1e-3) assert_allclose(pos_err.width.value, 2 * 0.3005 * scale_r95, rtol=1e-3) assert_allclose(model.position.l.value, pos_err.center.l.value) assert_allclose(model.position.b.value, pos_err.center.b.value) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/test_hess.py0000644000175100001770000003030114721316200021170 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import Angle, SkyCoord from astropy.table import Table from astropy.utils.data import get_pkg_data_filename from gammapy.catalog import SourceCatalogHGPS, 
SourceCatalogLargeScaleHGPS from gammapy.modeling.models import ( ExpCutoffPowerLawSpectralModel, PowerLawSpectralModel, ) from gammapy.utils.gauss import Gauss2DPDF from gammapy.utils.testing import ( assert_quantity_allclose, modify_unit_order_astropy_5_3, requires_data, ) SOURCES = [ {"idx": 33, "name": "HESS J1713-397", "str_ref_file": "data/hess_j1713-397.txt"}, {"idx": 54, "name": "HESS J1825-137", "str_ref_file": "data/hess_j1825-137.txt"}, {"idx": 76, "name": "HESS J1930+188", "str_ref_file": "data/hess_j1930+188.txt"}, ] @pytest.fixture(scope="session") def cat(): return SourceCatalogHGPS("$GAMMAPY_DATA/catalogs/hgps_catalog_v1.fits.gz") @requires_data() class TestSourceCatalogHGPS: @staticmethod def test_source_table(cat): assert cat.tag == "hgps" assert len(cat.table) == 78 @staticmethod def test_str(cat): assert "name: hgps" in str(cat.__str__()) @staticmethod def test_positions(cat): assert len(cat.positions) == 78 @staticmethod def test_table_components(cat): assert len(cat.table_components) == 98 @staticmethod def test_table_associations(cat): assert len(cat.table_associations) == 223 @staticmethod def test_table_identifications(cat): assert len(cat.table_identifications) == 31 @staticmethod def test_gaussian_component(cat): # Row index starts at 0, component numbers at 1 # Thus we expect `HGPSC 084` at row 83 c = cat.gaussian_component(83) assert c.name == "HGPSC 084" @staticmethod def test_large_scale_component(cat): assert isinstance(cat.large_scale_component, SourceCatalogLargeScaleHGPS) @staticmethod def test_to_models(cat): models = cat.to_models(components_status="independent") assert len(models) == 96 models = cat.to_models(components_status="linked") assert len(models) == 96 models = cat.to_models(components_status="merged") assert len(models) == 78 gaussians = models.select(tag="gauss", model_type="spatial") assert np.all([m.spatial_model.sigma.value > 0.0 for m in gaussians]) @requires_data() class TestSourceCatalogObjectHGPS: @pytest.fixture(scope="class") def source(self, cat): return cat["HESS J1843-033"] @staticmethod def test_basics(source): assert source.name == "HESS J1843-033" assert source.row_index == 64 data = source.data assert data["Source_Class"] == "Unid" assert "SourceCatalogObjectHGPS" in repr(source) ss = str(source) assert "Source name : HESS J1843-033" in ss assert "Component HGPSC 083:" in ss @staticmethod @pytest.mark.parametrize("ref", SOURCES) def test_str(cat, ref): actual = str(cat[ref["idx"]]) with open(get_pkg_data_filename(ref["str_ref_file"])) as fh: expected = fh.read() assert actual == modify_unit_order_astropy_5_3(expected) @staticmethod def test_position(source): position = source.position assert_allclose(position.ra.deg, 280.95162964) assert_allclose(position.dec.deg, -3.55410194) @staticmethod def test_components(source): components = source.components assert len(components) == 2 c = components[1] assert c.name == "HGPSC 084" @staticmethod def test_energy_range(source): energy_range = source.energy_range assert energy_range.unit == "TeV" assert_allclose(energy_range.value, [0.21544346, 61.89658356]) @staticmethod def test_spectral_model_pl(cat): source = cat["HESS J1843-033"] model = source.spectral_model() assert isinstance(model, PowerLawSpectralModel) pars = model.parameters assert_allclose(pars["amplitude"].value, 9.140179932365378e-13) assert_allclose(pars["index"].value, 2.1513476371765137) assert_allclose(pars["reference"].value, 1.867810606956482) @staticmethod def test_spectral_model_ecpl(cat): source = cat["HESS 
J0835-455"] model = source.spectral_model() assert isinstance(model, ExpCutoffPowerLawSpectralModel) pars = model.parameters assert_allclose(pars["amplitude"].value, 6.408420542586617e-12) assert_allclose(pars["index"].value, 1.3543991614920847) assert_allclose(pars["reference"].value, 1.696938754239) assert_allclose(pars["lambda_"].value, 0.081517637) assert_allclose(pars["amplitude"].error, 3.260472e-13, rtol=1e-3) assert_allclose(pars["index"].error, 0.077331, atol=0.001) assert_allclose(pars["reference"].error, 0) assert_allclose(pars["lambda_"].error, 0.011535, atol=0.001) model = source.spectral_model("pl") assert isinstance(model, PowerLawSpectralModel) pars = model.parameters assert_allclose(pars["amplitude"].value, 1.833056926733856e-12) assert_allclose(pars["index"].value, 1.8913707) assert_allclose(pars["reference"].value, 3.0176312923431396) assert_allclose(pars["amplitude"].error, 6.992061e-14, rtol=1e-3) assert_allclose(pars["index"].error, 0.028383, atol=0.001) assert_allclose(pars["reference"].error, 0) @staticmethod def test_position_error(cat): scale_r95 = Gauss2DPDF().containment_radius(0.95) model = cat["HESS J1729-345"].spatial_model() pos_err = model.position_error assert_allclose(pos_err.angle.value, 0.0) assert_allclose(pos_err.height.value, 2 * 0.0414315 * scale_r95, rtol=1e-4) assert_allclose(pos_err.width.value, 2 * 0.0344351 * scale_r95, rtol=1e-4) assert_allclose(model.position.l.value, pos_err.center.l.value) assert_allclose(model.position.b.value, pos_err.center.b.value) model = cat["HESS J1858+020"].spatial_model() pos_err = model.position_error assert_allclose(pos_err.angle.value, 90.0) assert_allclose(pos_err.height.value, 2 * 0.0222614 * scale_r95, rtol=1e-4) assert_allclose(pos_err.width.value, 2 * 0.0145084 * scale_r95, rtol=1e-4) assert_allclose(model.position.l.value, pos_err.center.l.value) assert_allclose(model.position.b.value, pos_err.center.b.value) @staticmethod def test_sky_model_point(cat): model = cat["HESS J1826-148"].sky_model() p = model.parameters assert_allclose(p["amplitude"].value, 9.815771242691063e-13) assert_allclose(p["lon_0"].value, 16.882482528686523) assert_allclose(p["lat_0"].value, -1.2889292240142822) @staticmethod def test_sky_model_gaussian(cat): model = cat["HESS J1119-614"].sky_model() p = model.parameters assert_allclose(p["amplitude"].value, 7.959899015960725e-13) assert_allclose(p["lon_0"].value, 292.128082) assert_allclose(p["lat_0"].value, -0.5332353711128235) assert_allclose(p["sigma"].value, 0.09785966575145721) @staticmethod def test_sky_model_gaussian2(cat): models = cat["HESS J1843-033"].components_models() p = models[0].parameters assert_allclose(p["amplitude"].value, 4.259815e-13, rtol=1e-5) assert_allclose(p["lon_0"].value, 29.047216415405273) assert_allclose(p["lat_0"].value, 0.24389676749706268) assert_allclose(p["sigma"].value, 0.12499100714921951) p = models[1].parameters assert_allclose(p["amplitude"].value, 4.880365e-13, rtol=1e-5) assert_allclose(p["lon_0"].value, 28.77037811279297) assert_allclose(p["lat_0"].value, -0.0727819949388504) assert_allclose(p["sigma"].value, 0.2294706553220749) @staticmethod def test_sky_model_gaussian3(cat): models = cat["HESS J1825-137"].components_models() p = models[0].parameters assert_allclose(p["amplitude"].value, 1.8952104218765842e-11) assert_allclose(p["lon_0"].value, 16.988601684570312) assert_allclose(p["lat_0"].value, -0.4913068115711212) assert_allclose(p["sigma"].value, 0.47650089859962463) p = models[1].parameters assert_allclose(p["amplitude"].value, 
4.4639763971527836e-11) assert_allclose(p["lon_0"].value, 17.71169090270996) assert_allclose(p["lat_0"].value, -0.6598004102706909) assert_allclose(p["sigma"].value, 0.3910967707633972) p = models[2].parameters assert_allclose(p["amplitude"].value, 5.870712920658374e-12) assert_allclose(p["lon_0"].value, 17.840524673461914) assert_allclose(p["lat_0"].value, -0.7057178020477295) assert_allclose(p["sigma"].value, 0.10932201147079468) @staticmethod def test_sky_model_gaussian_extern(cat): # special test for the only extern source with a gaussian morphology model = cat["HESS J1801-233"].components_models() p = model.parameters assert_allclose(p["amplitude"].value, 7.499999970031479e-13) assert_allclose(p["lon_0"].value, 6.656888961791992) assert_allclose(p["lat_0"].value, -0.267688125371933) assert_allclose(p["sigma"].value, 0.17) @staticmethod def test_sky_model_shell(cat): model = cat["Vela Junior"].sky_model() p = model.parameters assert_allclose(p["amplitude"].value, 3.2163001428830995e-11) assert_allclose(p["lon_0"].value, 266.287384) assert_allclose(p["lat_0"].value, -1.243260383605957) assert_allclose(p["radius"].value, 0.95) assert_allclose(p["width"].value, 0.05, rtol=1e-6) @staticmethod def test_flux_points_meta(cat): source = cat["HESS J1843-033"] fp = source.flux_points assert fp.sqrt_ts_threshold_ul == 1 assert fp.n_sigma == 1 assert fp.n_sigma_ul == 2 @requires_data() class TestSourceCatalogObjectHGPSComponent: @pytest.fixture(scope="class") def component(self, cat): return cat.gaussian_component(83) @staticmethod def test_repr(component): assert "SourceCatalogObjectHGPSComponent" in repr(component) @staticmethod def test_str(component): assert "Component HGPSC 084" in str(component) @staticmethod def test_name(component): assert component.name == "HGPSC 084" @staticmethod def test_index(component): assert component.row_index == 83 @staticmethod def test_spatial_model(component): model = component.spatial_model() assert model.frame == "galactic" p = model.parameters assert_allclose(p["lon_0"].value, 28.77037811279297) assert_allclose(p["lon_0"].error, 0.058748625218868256) assert_allclose(p["lat_0"].value, -0.0727819949388504) assert_allclose(p["lat_0"].error, 0.06880396604537964) assert_allclose(p["sigma"].value, 0.2294706553220749) assert_allclose(p["sigma"].error, 0.04618723690509796) class TestSourceCatalogLargeScaleHGPS: @pytest.fixture(scope="class") def model(self): table = Table() table["GLON"] = [-30, -10, 10, 20] * u.deg table["Surface_Brightness"] = [0, 1, 10, 0] * u.Unit("cm-2 s-1 sr-1") table["GLAT"] = [-1, 0, 1, 0] * u.deg table["Width"] = [0.4, 0.5, 0.3, 1.0] * u.deg return SourceCatalogLargeScaleHGPS(table) def test_evaluate(self, model): x = np.linspace(-100, 20, 5) y = np.linspace(-2, 2, 7) x, y = np.meshgrid(x, y) coords = SkyCoord(x, y, unit="deg", frame="galactic") image = model.evaluate(coords) desired = 1.223962643740966 * u.Unit("cm-2 s-1 sr-1") assert_quantity_allclose(image.sum(), desired) def test_parvals(self, model): glon = Angle(10, unit="deg") assert_quantity_allclose( model.peak_brightness(glon), 10 * u.Unit("cm-2 s-1 sr-1") ) assert_quantity_allclose(model.peak_latitude(glon), 1 * u.deg) assert_quantity_allclose(model.width(glon), 0.3 * u.deg) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/catalog/tests/test_lhaaso.py0000644000175100001770000001457114721316200021510 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np 
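# --- Illustrative sketch (not part of test_lhaaso.py; requires $GAMMAPY_DATA).
# 1LHAASO sources can carry separate WCDA and KM2A measurements; the tests below
# select between them with the `which` argument:
from gammapy.catalog import SourceCatalog1LHAASO

lhaaso = SourceCatalog1LHAASO()
km2a_models = lhaaso.to_models(which="KM2A")  # KM2A fits use a 50 TeV reference energy
model = lhaaso[0].sky_model("both")  # sky_model("WCDA") can be None for KM2A-only sources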
from numpy.testing import assert_allclose import astropy.units as u from gammapy.catalog import SourceCatalog1LHAASO from gammapy.modeling.models import ( GaussianSpatialModel, PointSpatialModel, PowerLawNormSpectralModel, PowerLawSpectralModel, TemplateSpatialModel, ) from gammapy.utils.testing import requires_data @pytest.fixture(scope="session") def lhaaso1(): return SourceCatalog1LHAASO() @requires_data() class TestSourceCatalog1LHAASO: @staticmethod def test_source_table(lhaaso1): assert lhaaso1.tag == "1LHAASO" assert len(lhaaso1.table) == 90 @staticmethod def test_positions(lhaaso1): assert len(lhaaso1.positions) == 90 @staticmethod def test_to_models(lhaaso1): models = lhaaso1.to_models(which="both") assert len(models) == 90 models = lhaaso1.to_models(which="KM2A") assert np.all( [m.spectral_model.reference.quantity == 50 * u.TeV for m in models] ) assert len(models) == 75 models = lhaaso1.to_models(which="WCDA") assert np.all( [m.spectral_model.reference.quantity == 3 * u.TeV for m in models] ) assert len(models) == 69 @requires_data() class TestSourceCatalogObject1LHAASO: @staticmethod def test_data(lhaaso1): assert lhaaso1[0].data["Source_Name"] == "1LHAASO J0007+5659u" assert "KM2A" in lhaaso1[0].data["Model_a"] assert_allclose(lhaaso1[0].data["r39_ul"].value, 0.18) assert lhaaso1[0].data["r39_ul"].unit == u.deg assert_allclose(lhaaso1[0].data["N0"].value, 0.33e-16) assert lhaaso1[0].data["N0"].unit == u.Unit("cm−2 s−1 TeV−1") assert_allclose(lhaaso1[0].data["N0_err"].value, 0.05e-16) assert lhaaso1[0].data["N0_err"].unit == u.Unit("cm−2 s−1 TeV−1") assert_allclose(lhaaso1[0].data["N0_ul_b"].value, 0.27e-13) assert lhaaso1[0].data["N0_ul_b"].unit == u.Unit("cm−2 s−1 TeV−1") assert lhaaso1[1].data["ASSO_Name"] == "CTA 1" assert_allclose(lhaaso1[1].data["ASSO_Sep"].value, 0.12) assert lhaaso1[0].data["ASSO_Sep"].unit == u.deg assert lhaaso1[10].data["Source_Name"] == "1LHAASO J0428+5531*" assert "WCDA" in lhaaso1[10].data["Model_a"] assert_allclose(lhaaso1[10].data["RAJ2000"].value, 67.23) assert_allclose(lhaaso1[10].data["DECJ2000"].value, 55.53) assert_allclose(lhaaso1[10].data["pos_err"].value, 0.36) assert lhaaso1[10].data["RAJ2000"].unit == u.deg assert lhaaso1[10].data["DECJ2000"].unit == u.deg assert lhaaso1[10].data["pos_err"].unit == u.deg assert_allclose(lhaaso1[10].data["r39"].value, 1.18) assert_allclose(lhaaso1[10].data["r39_b"].value, 0.32) assert lhaaso1[10].data["r39_b"].unit == u.deg assert_allclose(lhaaso1[10].data["r39_err"].value, 0.12) assert_allclose(lhaaso1[10].data["r39_err_b"].value, 0.06) assert lhaaso1[10].data["r39_err_b"].unit == u.deg @staticmethod def test_position(lhaaso1): position = lhaaso1[0].position assert_allclose(position.ra.deg, 1.86, atol=1e-3) assert_allclose(position.dec.deg, 57.00, atol=1e-3) @staticmethod def test_sky_model(lhaaso1): model = lhaaso1[0].sky_model("both") assert model.name == "1LHAASO J0007+5659u" assert isinstance(model.spectral_model, PowerLawSpectralModel) assert isinstance(model.spatial_model, PointSpatialModel) assert lhaaso1[0].sky_model("WCDA") is None model = lhaaso1[1].sky_model("both") assert model.name == "1LHAASO J0007+7303u" assert isinstance(model.spectral_model, PowerLawNormSpectralModel) assert isinstance(model.spatial_model, TemplateSpatialModel) model = lhaaso1[1].sky_model("KM2A") assert model.name == "1LHAASO J0007+7303u" assert isinstance(model.spectral_model, PowerLawSpectralModel) assert isinstance(model.spatial_model, GaussianSpatialModel) model = lhaaso1[1].sky_model("WCDA") assert model.name == 
"1LHAASO J0007+7303u" assert isinstance(model.spectral_model, PowerLawSpectralModel) assert isinstance(model.spatial_model, PointSpatialModel) model = lhaaso1[11].sky_model("both") assert model.name == "1LHAASO J0500+4454" assert isinstance(model.spectral_model, PowerLawSpectralModel) assert isinstance(model.spatial_model, GaussianSpatialModel) @staticmethod def test_spectral_model(lhaaso1): m = lhaaso1[0].spectral_model("KM2A") dnde, dnde_err = m.evaluate_error(50 * u.TeV) assert dnde.unit == "cm-2 s-1 TeV-1" assert_allclose(dnde.value, 0.33e-16, rtol=1e-3) assert_allclose(dnde_err.value, 0.05e-16, rtol=1e-3) m = lhaaso1[11].spectral_model("WCDA") dnde, dnde_err = m.evaluate_error(3 * u.TeV) assert dnde.unit == "cm-2 s-1 TeV-1" assert_allclose(dnde.value, 0.69e-13, rtol=1e-3) assert_allclose(dnde_err.value, 0.16e-13, rtol=1e-3) @staticmethod def test_spatial_model(lhaaso1): m = lhaaso1[0].spatial_model("KM2A") assert isinstance(m, PointSpatialModel) assert m.lon_0.unit == "deg" assert_allclose(m.lon_0.value, 1.86, atol=1e-2) assert_allclose(m.lon_0.error, 0.09, atol=1e-2) assert m.lat_0.unit == "deg" assert_allclose(m.lat_0.value, 57.00, atol=1e-2) assert_allclose(m.lat_0.error, 0.049, atol=1e-2) assert m.frame == "fk5" m = lhaaso1[11].spatial_model("WCDA") assert isinstance(m, GaussianSpatialModel) assert m.lon_0.unit == "deg" assert_allclose(m.lon_0.value, 75.01, atol=1e-10) assert m.lat_0.unit == "deg" assert_allclose(m.lat_0.value, 44.92, atol=1e-10) assert m.frame == "fk5" assert m.sigma.unit == "deg" assert_allclose(m.sigma.value, 0.41, atol=1e-3) model = lhaaso1["1LHAASO J0007+5659u"].spatial_model("KM2A") pos_err = model.position_error assert_allclose(pos_err.height.value, 2 * 0.12, rtol=1e-4) assert_allclose(pos_err.width.value, 2 * 0.12, rtol=1e-4) assert_allclose(model.position.ra.value, pos_err.center.ra.value) assert_allclose(model.position.dec.value, pos_err.center.dec.value) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/conftest.py0000644000175100001770000000653714721316200016256 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst # this contains imports plugins that configure py.test for astropy tests. # by importing them here in conftest.py they are discoverable by py.test # no matter how it is invoked within the source tree. 
import os import pytest import numpy as np import astropy.units as u from astropy.time import Time import matplotlib from pytest_astropy_header.display import PYTEST_HEADER_MODULES, TESTED_VERSIONS from gammapy.data import GTI from gammapy.datasets import SpectrumDataset from gammapy.maps import MapAxis, RegionNDMap from gammapy.modeling.models import ( ConstantTemporalModel, Models, PowerLawSpectralModel, SkyModel, ) # TODO: activate this again and handle deprecations in the code # enable_deprecations_as_exceptions(warnings_to_ignore_entire_module=["iminuit", "naima"]) def pytest_configure(config): """Print some info ...""" from gammapy.utils.testing import has_data config.option.astropy_header = True # Declare for which packages version numbers should be displayed # when running the tests PYTEST_HEADER_MODULES["cython"] = "cython" PYTEST_HEADER_MODULES["iminuit"] = "iminuit" PYTEST_HEADER_MODULES["matplotlib"] = "matplotlib" PYTEST_HEADER_MODULES["astropy"] = "astropy" PYTEST_HEADER_MODULES["regions"] = "regions" PYTEST_HEADER_MODULES["healpy"] = "healpy" PYTEST_HEADER_MODULES["sherpa"] = "sherpa" PYTEST_HEADER_MODULES["gammapy"] = "gammapy" PYTEST_HEADER_MODULES["naima"] = "naima" PYTEST_HEADER_MODULES["ray"] = "ray" print("") print("Gammapy test data availability:") has_it = "yes" if has_data("gammapy-data") else "no" print(f"gammapy-data ... {has_it}") print("Gammapy environment variables:") var = os.environ.get("GAMMAPY_DATA", "not set") print(f"GAMMAPY_DATA = {var}") matplotlib.use("agg") print('Setting matplotlib backend to "agg" for the tests.') from . import __version__ packagename = os.path.basename(os.path.dirname(__file__)) TESTED_VERSIONS[packagename] = __version__ @pytest.fixture() def spectrum_dataset(): # TODO: change the fixture scope to "session". 
This currently crashes fitting tests name = "test" energy = np.logspace(-1, 1, 31) * u.TeV livetime = 100 * u.s pwl = PowerLawSpectralModel( index=2.1, amplitude="1e5 cm-2 s-1 TeV-1", reference="0.1 TeV", ) temp_mod = ConstantTemporalModel() model = SkyModel(spectral_model=pwl, temporal_model=temp_mod, name="test-source") axis = MapAxis.from_edges(energy, interp="log", name="energy") axis_true = MapAxis.from_edges(energy, interp="log", name="energy_true") background = RegionNDMap.create(region="icrs;circle(0, 0, 0.1)", axes=[axis]) models = Models([model]) exposure = RegionNDMap.create(region="icrs;circle(0, 0, 0.1)", axes=[axis_true]) exposure.quantity = u.Quantity("1 cm2") * livetime bkg_rate = np.ones(30) / u.s background.quantity = bkg_rate * livetime start = [1, 3, 5] * u.day stop = [2, 3.5, 6] * u.day t_ref = Time(55555, format="mjd") gti = GTI.create(start, stop, reference_time=t_ref) dataset = SpectrumDataset( models=models, exposure=exposure, background=background, name=name, gti=gti, ) dataset.fake(random_state=23) return dataset ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.192642 gammapy-1.3/gammapy/data/0000755000175100001770000000000014721316215014763 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/__init__.py0000644000175100001770000000176114721316200017073 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Data and observation handling.""" from gammapy.utils.observers import observatory_locations from .data_store import DataStore from .event_list import EventList from .filters import ObservationFilter from .gti import GTI from .hdu_index_table import HDUIndexTable from .metadata import EventListMetaData, ObservationMetaData from .obs_table import ObservationTable from .observations import Observation, Observations from .pointing import FixedPointingInfo, PointingInfo, PointingMode from .simulate import ObservationsEventsSampler from .utils import get_irfs_features __all__ = [ "DataStore", "EventList", "EventListMetaData", "ObservationMetaData", "FixedPointingInfo", "GTI", "HDUIndexTable", "Observation", "ObservationFilter", "Observations", "ObservationsEventsSampler", "ObservationTable", "observatory_locations", "PointingInfo", "PointingMode", "get_irfs_features", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/data_store.py0000644000175100001770000006474214721316200017471 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import logging import subprocess from copy import copy from pathlib import Path import numpy as np from astropy import units as u from astropy.coordinates import SkyCoord from astropy.io import fits import gammapy.utils.time as tu from gammapy.utils.pbar import progress_bar from gammapy.utils.scripts import make_path from gammapy.utils.testing import Checker from .hdu_index_table import HDUIndexTable from .obs_table import ObservationTable, ObservationTableChecker from .observations import Observation, ObservationChecker, Observations __all__ = ["DataStore"] ALL_IRFS = ["aeff", "edisp", "psf", "bkg", "rad_max"] ALL_HDUS = ["events", "gti", "pointing"] + ALL_IRFS REQUIRED_IRFS = { "full-enclosure": {"aeff", "edisp", "psf", "bkg"}, "point-like": {"aeff", "edisp"}, "all-optional": {}, } class MissingRequiredHDU(IOError): pass log = logging.getLogger(__name__) 
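# --- Illustrative sketch (kept as a comment to avoid a circular import of
# gammapy.data inside this module; requires $GAMMAPY_DATA). The REQUIRED_IRFS
# shortcuts above translate a string such as "point-like" into the set of IRF
# HDUs an observation must provide:
#
#     from gammapy.data import DataStore
#
#     data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
#     # obs_id 23523 is illustrative; "point-like" requires only aeff and edisp
#     obs = data_store.obs(23523, required_irf="point-like")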
log.setLevel(logging.INFO) class DataStore: """IACT data store. The data selection and access happens using an observation and an HDU index file as described at :ref:`gadf:iact-storage`. Parameters ---------- hdu_table : `~gammapy.data.HDUIndexTable` HDU index table. obs_table : `~gammapy.data.ObservationTable` Observation index table. Examples -------- Here's an example how to create a `DataStore` to access H.E.S.S. data: >>> from gammapy.data import DataStore >>> data_store = DataStore.from_dir('$GAMMAPY_DATA/hess-dl3-dr1') >>> data_store.info() #doctest: +SKIP Data store: HDU index table: BASE_DIR: /Users/ASinha/Gammapy-dev/gammapy-data/hess-dl3-dr1 Rows: 630 OBS_ID: 20136 -- 47829 HDU_TYPE: ['aeff', 'bkg', 'edisp', 'events', 'gti', 'psf'] HDU_CLASS: ['aeff_2d', 'bkg_3d', 'edisp_2d', 'events', 'gti', 'psf_table'] Observation table: Observatory name: 'N/A' Number of observations: 105 For further usage example see :doc:`/tutorials/data/cta` tutorial. """ DEFAULT_HDU_TABLE = "hdu-index.fits.gz" """Default HDU table filename.""" DEFAULT_OBS_TABLE = "obs-index.fits.gz" """Default observation table filename.""" def __init__(self, hdu_table=None, obs_table=None): self.hdu_table = hdu_table self.obs_table = obs_table def __str__(self): return self.info(show=False) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def obs_ids(self): """Return the sorted obs_ids contained in the datastore.""" return np.unique(self.hdu_table["OBS_ID"].data) @classmethod def from_file(cls, filename, hdu_hdu="HDU_INDEX", hdu_obs="OBS_INDEX"): """Create a Datastore from a FITS file. The FITS file must contain both index files. Parameters ---------- filename : str or `~pathlib.Path` FITS filename. hdu_hdu : str or int, optional FITS HDU name or number for the HDU index table. Default is "HDU_INDEX". hdu_obs : str or int, optional FITS HDU name or number for the observation index table. Default is "OBS_INDEX". Returns ------- data_store : `DataStore` Data store. """ filename = make_path(filename) hdu_table = HDUIndexTable.read(filename, hdu=hdu_hdu, format="fits") obs_table = None if hdu_obs: obs_table = ObservationTable.read(filename, hdu=hdu_obs, format="fits") return cls(hdu_table=hdu_table, obs_table=obs_table) @classmethod def from_dir(cls, base_dir, hdu_table_filename=None, obs_table_filename=None): """Create from a directory. Parameters ---------- base_dir : str or `~pathlib.Path` Base directory of the data files. hdu_table_filename : str or `~pathlib.Path`, optional Filename of the HDU index file. May be specified either relative to `base_dir` or as an absolute path. If None, default is "hdu-index.fits.gz". obs_table_filename : str or `~pathlib.Path`, optional Filename of the observation index file. May be specified either relative to `base_dir` or as an absolute path. If None, default is obs-index.fits.gz. Returns ------- data_store : `DataStore` Data store. Examples -------- >>> from gammapy.data import DataStore >>> data_store = DataStore.from_dir('$GAMMAPY_DATA/hess-dl3-dr1') """ base_dir = make_path(base_dir) if hdu_table_filename: hdu_table_filename = make_path(hdu_table_filename) if (base_dir / hdu_table_filename).exists(): hdu_table_filename = base_dir / hdu_table_filename else: hdu_table_filename = base_dir / cls.DEFAULT_HDU_TABLE if obs_table_filename: obs_table_filename = make_path(obs_table_filename) if (base_dir / obs_table_filename).exists(): obs_table_filename = base_dir / obs_table_filename elif not obs_table_filename.exists(): raise IOError(f"File not found : {obs_table_filename}") else: obs_table_filename = base_dir / cls.DEFAULT_OBS_TABLE if not hdu_table_filename.exists(): raise OSError(f"File not found: {hdu_table_filename}") log.debug(f"Reading {hdu_table_filename}") hdu_table = HDUIndexTable.read(hdu_table_filename, format="fits") hdu_table.meta["BASE_DIR"] = str(base_dir) if not obs_table_filename.exists(): log.info("Cannot find default obs-index table.") obs_table = None else: log.debug(f"Reading {obs_table_filename}") obs_table = ObservationTable.read(obs_table_filename, format="fits") return cls(hdu_table=hdu_table, obs_table=obs_table) @classmethod def from_events_files(cls, events_paths, irfs_paths=None): """Create from a list of event filenames. HDU and observation index tables will be created from the EVENTS header. IRFs are found only if you have a ``CALDB`` environment variable set, and if the EVENTS files contain the following keys: - ``TELESCOP`` (example: ``TELESCOP = CTA``) - ``CALDB`` (example: ``CALDB = 1dc``) - ``IRF`` (example: ``IRF = South_z20_50h``) Parameters ---------- events_paths : list of str or `~pathlib.Path` List of paths to the events files. irfs_paths : str or `~pathlib.Path`, or list of str or list of `~pathlib.Path`, optional Path to the IRFs file. If a list is provided it must be the same length as `events_paths`. 
If None the events files have to contain CALDB and IRF header keywords to locate the IRF files, otherwise the IRFs are assumed to be contained in the events files. Returns ------- data_store : `DataStore` Data store. Examples -------- This is how you can access a single event list:: >>> from gammapy.data import DataStore >>> import os >>> os.environ["CALDB"] = os.environ["GAMMAPY_DATA"] + "/cta-1dc/caldb" >>> path = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits" >>> data_store = DataStore.from_events_files([path]) >>> observations = data_store.get_observations() You can now analyse this data as usual (see any Gammapy tutorial). If you have multiple event files, you have to make the list. Here's an example using ``Path.glob`` to get a list of all events files in a given folder:: >>> import os >>> from pathlib import Path >>> path = Path(os.environ["GAMMAPY_DATA"]) / "cta-1dc/data" >>> paths = list(path.rglob("*.fits")) >>> data_store = DataStore.from_events_files(paths) >>> observations = data_store.get_observations() >>> #Note that you have a lot of flexibility to select the observations you want, >>> # by having a few lines of custom code to prepare ``paths``, or to select a >>> # subset via a method on the ``data_store`` or the ``observations`` objects. >>> # If you want to generate HDU and observation index files, write the tables to disk:: >>> data_store.hdu_table.write("hdu-index.fits.gz") # doctest: +SKIP >>> data_store.obs_table.write("obs-index.fits.gz") # doctest: +SKIP """ return DataStoreMaker(events_paths, irfs_paths).run() def info(self, show=True): """Print some info.""" s = "Data store:\n" s += self.hdu_table.summary() s += "\n\n" if self.obs_table: s += self.obs_table.summary() else: s += "No observation index table." if show: print(s) else: return s def obs(self, obs_id, required_irf="full-enclosure", require_events=True): """Access a given `~gammapy.data.Observation`. Parameters ---------- obs_id : int Observation ID. required_irf : list of str or str, optional The list can include the following options: * `"events"` : Events * `"gti"` : Good time intervals * `"aeff"` : Effective area * `"bkg"` : Background * `"edisp"` : Energy dispersion * `"psf"` : Point Spread Function * `"rad_max"` : Maximal radius Alternatively single string can be used as shortcut: * `"full-enclosure"` : includes `["events", "gti", "aeff", "edisp", "psf", "bkg"]` * `"point-like"` : includes `["events", "gti", "aeff", "edisp"]` Default is `"full-enclosure"`. require_events : bool, optional Require events and gti table or not. Default is True. Returns ------- observation : `~gammapy.data.Observation` Observation container. """ if obs_id not in self.hdu_table["OBS_ID"]: raise ValueError(f"OBS_ID = {obs_id} not in HDU index table.") kwargs = {"obs_id": int(obs_id)} # check for the "short forms" if isinstance(required_irf, str): required_irf = REQUIRED_IRFS[required_irf] if not set(required_irf).issubset(ALL_IRFS): difference = set(required_irf).difference(ALL_IRFS) raise ValueError( f"{difference} is not a valid hdu key. 
Choose from: {ALL_IRFS}" ) if require_events: required_hdus = {"events", "gti"}.union(required_irf) else: required_hdus = required_irf missing_hdus = [] for hdu in ALL_HDUS: hdu_location = self.hdu_table.hdu_location( obs_id=obs_id, hdu_type=hdu, warn_missing=False, ) if hdu_location is not None: kwargs[hdu] = hdu_location elif hdu in required_hdus: missing_hdus.append(hdu) if len(missing_hdus) > 0: raise MissingRequiredHDU( f"Required HDUs {missing_hdus} not found in observation {obs_id}" ) # TODO: right now, gammapy doesn't support using the pointing table of GADF # so we always pass the events location here to be read into a FixedPointingInfo if "events" in kwargs: pointing_location = copy(kwargs["events"]) pointing_location.hdu_class = "pointing" kwargs["pointing"] = pointing_location return Observation(**kwargs) def get_observations( self, obs_id=None, skip_missing=False, required_irf="full-enclosure", require_events=True, ): """Generate a `~gammapy.data.Observations`. Notes ----- The progress bar can be displayed for this function. Parameters ---------- obs_id : list, optional Observation IDs. If None, all observations ordered by OBS_ID are returned. This is not necessarily the order in the ``obs_table``. skip_missing : bool, optional Skip missing observations. Default is False. required_irf : list of str or str, optional Runs will be added to the list of observations only if the required HDUs are present. Otherwise, the given run will be skipped. The list can include the following options: * `"events"` : Events * `"gti"` : Good time intervals * `"aeff"` : Effective area * `"bkg"` : Background * `"edisp"` : Energy dispersion * `"psf"` : Point Spread Function * `"rad_max"` : Maximal radius Alternatively, a single string can be used as a shortcut: * `"full-enclosure"` : includes `["events", "gti", "aeff", "edisp", "psf", "bkg"]` * `"point-like"` : includes `["events", "gti", "aeff", "edisp"]` * `"all-optional"` : no HDUs are required, only warnings will be emitted for missing HDUs among all possibilities. Default is `"full-enclosure"`. require_events : bool, optional Whether to require the events and GTI tables. Default is True. Returns ------- observations : `~gammapy.data.Observations` Container holding a list of `~gammapy.data.Observation`. """ if obs_id is None: obs_id = self.obs_ids obs_list = [] for _ in progress_bar(obs_id, desc="Obs Id"): try: obs = self.obs(_, required_irf, require_events) except ValueError as err: if skip_missing: log.warning(f"Skipping missing obs_id: {_!r}") continue else: raise err except MissingRequiredHDU as e: log.warning(f"Skipping run with missing HDUs; {e}") continue obs_list.append(obs) log.info(f"Observations selected: {len(obs_list)} out of {len(obs_id)}.") return Observations(obs_list) def copy_obs(self, obs_id, outdir, hdu_class=None, verbose=False, overwrite=False): """Create a new `~gammapy.data.DataStore` containing a subset of observations. Parameters ---------- obs_id : array-like, `~gammapy.data.ObservationTable` List of observations to copy. outdir : str or `~pathlib.Path` Directory for the new store. hdu_class : list of str, optional See :attr:`gammapy.data.HDUIndexTable.VALID_HDU_CLASS`. verbose : bool, optional Print copied files. Default is False. overwrite : bool, optional Overwrite existing files. Default is False.
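Examples
--------
A minimal sketch of copying a run subset to a new directory; the
observation IDs and the output directory name here are illustrative
values for the H.E.S.S. DL3-DR1 dataset::

    >>> from gammapy.data import DataStore
    >>> data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")  # doctest: +SKIP
    >>> data_store.copy_obs([23523, 23526], outdir="./hess-dl3-subset")  # doctest: +SKIP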
""" outdir = make_path(outdir) if not outdir.is_dir(): raise OSError(f"Not a directory: outdir={outdir}") if isinstance(obs_id, ObservationTable): obs_id = obs_id["OBS_ID"].data hdutable = self.hdu_table hdutable.add_index("OBS_ID") with hdutable.index_mode("discard_on_copy"): subhdutable = hdutable.loc[obs_id] if hdu_class is not None: subhdutable.add_index("HDU_CLASS") with subhdutable.index_mode("discard_on_copy"): subhdutable = subhdutable.loc[hdu_class] if self.obs_table: subobstable = self.obs_table.select_obs_id(obs_id) for idx in range(len(subhdutable)): # Changes to the file structure could be made here loc = subhdutable.location_info(idx) targetdir = outdir / loc.file_dir targetdir.mkdir(exist_ok=True, parents=True) cmd = ["cp"] if verbose: cmd += ["-v"] if not overwrite: cmd += ["-n"] cmd += [str(loc.path()), str(targetdir)] subprocess.run(cmd) filename = outdir / self.DEFAULT_HDU_TABLE subhdutable.write(filename, format="fits", overwrite=overwrite) if self.obs_table: filename = outdir / self.DEFAULT_OBS_TABLE subobstable.write(str(filename), format="fits", overwrite=overwrite) def check(self, checks="all"): """Check index tables and data files. This is a generator that yields a list of dicts. """ checker = DataStoreChecker(self) return checker.run(checks=checks) class DataStoreChecker(Checker): """Check data store. Checks data format and a bit about the content. """ CHECKS = { "obs_table": "check_obs_table", "hdu_table": "check_hdu_table", "observations": "check_observations", "consistency": "check_consistency", } def __init__(self, data_store): self.data_store = data_store def check_obs_table(self): """Check for the observation index table.""" yield from ObservationTableChecker(self.data_store.obs_table).run() def check_hdu_table(self): """Check for the HDU index table.""" t = self.data_store.hdu_table m = t.meta if m.get("HDUCLAS1", "") != "INDEX": yield { "level": "error", "hdu": "hdu-index", "msg": "Invalid header key. Must have HDUCLAS1=INDEX", } if m.get("HDUCLAS2", "") != "HDU": yield { "level": "error", "hdu": "hdu-index", "msg": "Invalid header key. Must have HDUCLAS2=HDU", } # Check that all HDU in the data files exist for idx in range(len(t)): location_info = t.location_info(idx) try: location_info.get_hdu() except KeyError: yield { "level": "error", "msg": f"HDU not found: {location_info.__dict__!r}", } def check_consistency(self): """Check consistency between multiple HDUs.""" # obs and HDU index should have the same OBS_ID obs_table_obs_id = set(self.data_store.obs_table["OBS_ID"]) hdu_table_obs_id = set(self.data_store.hdu_table["OBS_ID"]) if not obs_table_obs_id == hdu_table_obs_id: yield { "level": "error", "msg": "Inconsistent OBS_ID in obs and HDU index tables", } # TODO: obs table and events header should have the same times def check_observations(self): """Perform some sanity checks for all observations.""" for obs_id in self.data_store.obs_table["OBS_ID"]: obs = self.data_store.obs(obs_id) yield from ObservationChecker(obs).run() class DataStoreMaker: """Create data store index tables. This is a multistep process coded as a class. Users will usually call this via `DataStore.from_events_files`. 
""" def __init__(self, events_paths, irfs_paths=None): if isinstance(events_paths, (str, Path)): raise TypeError("Need list of paths, not a single string or Path object.") self.events_paths = [make_path(path) for path in events_paths] if irfs_paths is None or isinstance(irfs_paths, (str, Path)): self.irfs_paths = [make_path(irfs_paths)] * len(events_paths) else: self.irfs_paths = [make_path(path) for path in irfs_paths] # Cache for EVENTS file header information, to avoid multiple reads self._events_info = {} def run(self): """Run all steps.""" hdu_table = self.make_hdu_table() obs_table = self.make_obs_table() return DataStore(hdu_table=hdu_table, obs_table=obs_table) def get_events_info(self, events_path, irf_path=None): """Read events header information.""" if events_path not in self._events_info: self._events_info[events_path] = self.read_events_info( events_path, irf_path ) return self._events_info[events_path] def get_obs_info(self, events_path, irf_path=None): """Read events header information and add some extra information.""" # We could add or remove info here depending on what we want in the obs table return self.get_events_info(events_path, irf_path) @staticmethod def read_events_info(events_path, irf_path=None): """Read mandatory events header information.""" log.debug(f"Reading {events_path}") with fits.open(events_path, memmap=False) as hdu_list: header = hdu_list["EVENTS"].header na_int, na_str = -1, "NOT AVAILABLE" info = {} # Note: for some reason `header["OBS_ID"]` is sometimes `str`, maybe trailing whitespace # mandatory header info: info["OBS_ID"] = int(header["OBS_ID"]) info["TSTART"] = header["TSTART"] * u.s info["TSTOP"] = header["TSTOP"] * u.s info["ONTIME"] = header["ONTIME"] * u.s info["LIVETIME"] = header["LIVETIME"] * u.s info["DEADC"] = header["DEADC"] info["TELESCOP"] = header.get("TELESCOP", na_str) obs_mode = header.get("OBS_MODE", "POINTING") if obs_mode == "DRIFT": info["ALT_PNT"] = header["ALT_PNT"] * u.deg info["AZ_PNT"] = header["AZ_PNT"] * u.deg info["ZEN_PNT"] = 90 * u.deg - info["ALT_PNT"] else: info["RA_PNT"] = header["RA_PNT"] * u.deg info["DEC_PNT"] = header["DEC_PNT"] * u.deg # optional header info pos = SkyCoord(info["RA_PNT"], info["DEC_PNT"], unit="deg").galactic info["GLON_PNT"] = pos.l info["GLAT_PNT"] = pos.b info["DATE-OBS"] = header.get("DATE_OBS", na_str) info["TIME-OBS"] = header.get("TIME_OBS", na_str) info["DATE-END"] = header.get("DATE_END", na_str) info["TIME-END"] = header.get("TIME_END", na_str) info["N_TELS"] = header.get("N_TELS", na_int) info["OBJECT"] = header.get("OBJECT", na_str) # Not part of the spec, but good to know from which file the info comes info["EVENTS_FILENAME"] = str(events_path) info["EVENT_COUNT"] = header["NAXIS2"] # This is the info needed to link from EVENTS to IRFs info["CALDB"] = header.get("CALDB", na_str) info["IRF"] = header.get("IRF", na_str) if irf_path is not None: info["IRF_FILENAME"] = str(irf_path) elif info["CALDB"] != na_str and info["IRF"] != na_str: caldb_irf = CalDBIRF.from_meta(info) info["IRF_FILENAME"] = str(caldb_irf.file_path) else: info["IRF_FILENAME"] = info["EVENTS_FILENAME"] # Mandatory fields defining the time data for name in tu.TIME_KEYWORDS: info[name] = header.get(name, None) return info def make_obs_table(self): """Make observation index table.""" rows = [] time_rows = [] for events_path, irf_path in zip(self.events_paths, self.irfs_paths): row = self.get_obs_info(events_path, irf_path) rows.append(row) time_row = tu.extract_time_info(row) time_rows.append(time_row) names = 
list(rows[0].keys()) table = ObservationTable(rows=rows, names=names) m = table.meta if not tu.unique_time_info(time_rows): raise RuntimeError( "The time information in the EVENTS headers is not consistent between observations" ) for name in tu.TIME_KEYWORDS: m[name] = time_rows[0][name] m["HDUCLASS"] = "GADF" m["HDUDOC"] = "https://github.com/open-gamma-ray-astro/gamma-astro-data-formats" m["HDUVERS"] = "0.2" m["HDUCLAS1"] = "INDEX" m["HDUCLAS2"] = "OBS" return table def make_hdu_table(self): """Make HDU index table.""" rows = [] for events_path, irf_path in zip(self.events_paths, self.irfs_paths): rows.extend(self.get_hdu_table_rows(events_path, irf_path)) names = list(rows[0].keys()) # names = ['OBS_ID', 'HDU_TYPE', 'HDU_CLASS', 'FILE_DIR', 'FILE_NAME', 'HDU_NAME'] table = HDUIndexTable(rows=rows, names=names) m = table.meta m["HDUCLASS"] = "GADF" m["HDUDOC"] = "https://github.com/open-gamma-ray-astro/gamma-astro-data-formats" m["HDUVERS"] = "0.2" m["HDUCLAS1"] = "INDEX" m["HDUCLAS2"] = "HDU" return table def get_hdu_table_rows(self, events_path, irf_path=None): """Make HDU index table rows for one observation.""" events_info = self.get_obs_info(events_path, irf_path) info = dict( OBS_ID=events_info["OBS_ID"], FILE_DIR=events_path.parent.as_posix(), FILE_NAME=events_path.name, ) yield dict(HDU_TYPE="events", HDU_CLASS="events", HDU_NAME="EVENTS", **info) yield dict(HDU_TYPE="gti", HDU_CLASS="gti", HDU_NAME="GTI", **info) irf_path = Path(events_info["IRF_FILENAME"]) info = dict( OBS_ID=events_info["OBS_ID"], FILE_DIR=irf_path.parent.as_posix(), FILE_NAME=irf_path.name, ) yield dict( HDU_TYPE="aeff", HDU_CLASS="aeff_2d", HDU_NAME="EFFECTIVE AREA", **info ) yield dict( HDU_TYPE="edisp", HDU_CLASS="edisp_2d", HDU_NAME="ENERGY DISPERSION", **info ) yield dict( HDU_TYPE="psf", HDU_CLASS="psf_3gauss", HDU_NAME="POINT SPREAD FUNCTION", **info, ) yield dict(HDU_TYPE="bkg", HDU_CLASS="bkg_3d", HDU_NAME="BACKGROUND", **info) class CalDBIRF: """Helper class to work with IRFs in CALDB format.""" def __init__(self, telescop, caldb, irf): self.telescop = telescop self.caldb = caldb self.irf = irf @classmethod def from_meta(cls, meta): return cls(telescop=meta["TELESCOP"], caldb=meta["CALDB"], irf=meta["IRF"]) @property def file_dir(self): # In CTA 1DC the header key is "CTA", but the directory is lower-case "cta" telescop = self.telescop.lower() return f"$CALDB/data/{telescop}/{self.caldb}/bcf/{self.irf}" @property def file_path(self): return Path(f"{self.file_dir}/{self.file_name}") @property def file_name(self): path = make_path(self.file_dir) return list(path.iterdir())[0].name ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/event_list.py0000644000175100001770000010330414721316200017504 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import collections import copy import html import logging import warnings import numpy as np from astropy import units as u from astropy.coordinates import AltAz, Angle, SkyCoord, angular_separation from astropy.io import fits from astropy.table import Table from astropy.table import vstack as vstack_tables from astropy.visualization import quantity_support import matplotlib.pyplot as plt from gammapy.maps import MapAxis, MapCoord, RegionGeom, WcsNDMap from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.utils.fits import earth_location_from_dict from gammapy.utils.scripts import make_path from gammapy.utils.testing import Checker from gammapy.utils.time
import time_ref_from_dict from .metadata import EventListMetaData __all__ = ["EventList"] log = logging.getLogger(__name__) class EventList: """Event list. Event list data is stored in the ``table`` (`~astropy.table.Table`) data member. The most important reconstructed event parameters are available as the following columns: - ``TIME`` - Mission elapsed time (sec) - ``RA``, ``DEC`` - ICRS system position (deg) - ``ENERGY`` - Energy (usually MeV for Fermi and TeV for IACTs) Note that ``TIME`` is usually sorted, but sometimes it is not, e.g. when simulating data or processing it in certain ways. So generally any analysis code should assume ``TIME`` is not sorted. Other optional columns that are sometimes useful for high-level analysis: - ``GLON``, ``GLAT`` - Galactic coordinates (deg) - ``DETX``, ``DETY`` - Field of view coordinates (deg) Note that when reading data for analysis you shouldn't use those values directly, but access them via properties which create objects of the appropriate class: - `time` for ``TIME`` - `radec` for ``RA``, ``DEC`` - `energy` for ``ENERGY`` - `galactic` for ``GLON``, ``GLAT`` Parameters ---------- table : `~astropy.table.Table` Event list table. meta : `~gammapy.data.EventListMetaData`, optional The metadata. Default is None. Examples -------- >>> from gammapy.data import EventList >>> events = EventList.read("$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits") >>> print(events) EventList --------- Instrument : None Telescope : CTA Obs. ID : 110380 Number of events : 106217 Event rate : 59.273 1 / s Time start : 59235.5 Time stop : 59235.52074074074 Min. energy : 3.00e-02 TeV Max. energy : 1.46e+02 TeV Median energy : 1.02e-01 TeV Max. offset : 5.0 deg """ def __init__(self, table, meta=None): self.table = table self.meta = meta def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @classmethod def read(cls, filename, hdu="EVENTS", checksum=False, **kwargs): """Read from FITS file. Format specification: :ref:`gadf:iact-events` Parameters ---------- filename : `pathlib.Path`, str Filename hdu : str Name of events HDU. Default is "EVENTS". checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. """ filename = make_path(filename) with fits.open(filename) as hdulist: events_hdu = hdulist[hdu] if checksum: if events_hdu.verify_checksum() != 1: warnings.warn( f"Checksum verification failed for HDU {hdu} of {filename}.", UserWarning, ) table = Table.read(events_hdu) meta = EventListMetaData.from_header(table.meta) return cls(table=table, meta=meta) def to_table_hdu(self, format="gadf"): """ Convert event list to a `~astropy.io.fits.BinTableHDU`. Parameters ---------- format : str, optional Output format, currently only "gadf" is supported. Default is "gadf". Returns ------- hdu : `astropy.io.fits.BinTableHDU` EventList converted to FITS representation. """ if format != "gadf": raise ValueError(f"Only the 'gadf' format supported, got {format}") return fits.BinTableHDU(self.table, name="EVENTS") # TODO: Pass metadata here. Also check that specific meta contents are consistent @classmethod def from_stack(cls, event_lists, **kwargs): """Stack (concatenate) list of event lists. Calls `~astropy.table.vstack`. Parameters ---------- event_lists : list List of `~gammapy.data.EventList` to stack. **kwargs : dict, optional Keyword arguments passed to `~astropy.table.vstack`. """ tables = [_.table for _ in event_lists] stacked_table = vstack_tables(tables, **kwargs) log.warning("The meta information will be empty here.") return cls(stacked_table) def stack(self, other): """Stack with another EventList in place. Calls `~astropy.table.vstack`. Parameters ---------- other : `~gammapy.data.EventList` Event list to stack to self. """ self.table = vstack_tables([self.table, other.table]) def __str__(self): info = self.__class__.__name__ + "\n" info += "-" * len(self.__class__.__name__) + "\n\n" instrument = self.table.meta.get("INSTRUME") info += f"\tInstrument : {instrument}\n" telescope = self.table.meta.get("TELESCOP") info += f"\tTelescope : {telescope}\n" obs_id = self.table.meta.get("OBS_ID", "") info += f"\tObs. ID : {obs_id}\n\n" info += f"\tNumber of events : {len(self.table)}\n" rate = len(self.table) / self.observation_time_duration info += f"\tEvent rate : {rate:.3f}\n\n" info += f"\tTime start : {self.observation_time_start}\n" info += f"\tTime stop : {self.observation_time_stop}\n\n" info += f"\tMin. energy : {np.min(self.energy):.2e}\n" info += f"\tMax. energy : {np.max(self.energy):.2e}\n" info += f"\tMedian energy : {np.median(self.energy):.2e}\n\n" if self.is_pointed_observation: offset_max = np.max(self.offset) info += f"\tMax. offset : {offset_max:.1f}\n" return info.expandtabs(tabsize=2) @property def time_ref(self): """Time reference as a `~astropy.time.Time` object.""" return time_ref_from_dict(self.table.meta) @property def time(self): """Event times as a `~astropy.time.Time` object. Notes ----- Times are automatically converted to 64-bit floats. With 32-bit floats times will be incorrect by a few seconds when e.g. adding them to the reference time. 
""" met = u.Quantity(self.table["TIME"].astype("float64"), "second") return self.time_ref + met @property def observation_time_start(self): """Observation start time as a `~astropy.time.Time` object.""" return self.time_ref + u.Quantity(self.table.meta["TSTART"], "second") @property def observation_time_stop(self): """Observation stop time as a `~astropy.time.Time` object.""" return self.time_ref + u.Quantity(self.table.meta["TSTOP"], "second") @property def radec(self): """Event RA / DEC sky coordinates as a `~astropy.coordinates.SkyCoord` object.""" lon, lat = self.table["RA"], self.table["DEC"] return SkyCoord(lon, lat, unit="deg", frame="icrs") @property def galactic(self): """Event Galactic sky coordinates as a `~astropy.coordinates.SkyCoord` object. Always computed from RA / DEC using Astropy. """ return self.radec.galactic @property def energy(self): """Event energies as a `~astropy.units.Quantity`.""" return self.table["ENERGY"].quantity @property def galactic_median(self): """Median position as a `~astropy.coordinates.SkyCoord` object.""" galactic = self.galactic median_lon = np.median(galactic.l.wrap_at("180d")) median_lat = np.median(galactic.b) return SkyCoord(median_lon, median_lat, frame="galactic") def select_row_subset(self, row_specifier): """Select table row subset. Parameters ---------- row_specifier : slice or int or array of int Specification for rows to select, passed to ``self.table[row_specifier]``. Returns ------- event_list : `EventList` New event list with table row subset selected. Examples -------- >>> from gammapy.data import EventList >>> import numpy as np >>> filename = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits" >>> events = EventList.read(filename) >>> #Use a boolean mask as ``row_specifier``: >>> mask = events.table['MC_ID'] == 1 >>> events2 = events.select_row_subset(mask) >>> print(len(events2.table)) 97978 >>> #Use row index array as ``row_specifier``: >>> idx = np.where(events.table['MC_ID'] == 1)[0] >>> events2 = events.select_row_subset(idx) >>> print(len(events2.table)) 97978 """ table = self.table[row_specifier] return self.__class__(table=table) def select_energy(self, energy_range): """Select events in energy band. Parameters ---------- energy_range : `~astropy.units.Quantity` Energy range ``[energy_min, energy_max)``. Returns ------- event_list : `EventList` Copy of event list with selection applied. Examples -------- >>> from astropy import units as u >>> from gammapy.data import EventList >>> filename = "$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_events_selected.fits.gz" >>> event_list = EventList.read(filename) >>> energy_range =[1, 20] * u.TeV >>> event_list = event_list.select_energy(energy_range=energy_range) """ energy = self.energy mask = energy_range[0] <= energy mask &= energy < energy_range[1] return self.select_row_subset(mask) def select_time(self, time_interval): """Select events in time interval. Parameters ---------- time_interval : `astropy.time.Time` Start time (inclusive) and stop time (exclusive) for the selection. Returns ------- events : `EventList` Copy of event list with selection applied. """ time = self.time mask = time_interval[0] <= time mask &= time < time_interval[1] return self.select_row_subset(mask) def select_region(self, regions, wcs=None): """Select events in given region. Parameters ---------- regions : str or `~regions.Region` or list of `~regions.Region` Region or list of regions (pixel or sky regions accepted). A region can be defined as a string in the DS9 format as well. 
See http://ds9.si.edu/doc/ref/region.html for details. wcs : `~astropy.wcs.WCS`, optional World coordinate system transformation. Default is None. Returns ------- event_list : `EventList` Copy of event list with selection applied. """ geom = RegionGeom.from_regions(regions, wcs=wcs) mask = geom.contains(self.radec) return self.select_row_subset(mask) def select_parameter(self, parameter, band): """Select events with respect to a specified parameter. Parameters ---------- parameter : str Parameter used for the selection. Must be present in `self.table`. band : tuple or `astropy.units.Quantity` Minimum and maximum value for the parameter to be selected (minimum <= parameter < maximum). If parameter is not dimensionless, a Quantity is required. Returns ------- event_list : `EventList` Copy of event list with selection applied. Examples -------- >>> from astropy import units as u >>> from gammapy.data import EventList >>> filename = "$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_events_selected.fits.gz" >>> event_list = EventList.read(filename) >>> zd = (0, 30) * u.deg >>> event_list = event_list.select_parameter(parameter='ZENITH_ANGLE', band=zd) >>> print(len(event_list.table)) 123944 """ mask = band[0] <= self.table[parameter].quantity mask &= self.table[parameter].quantity < band[1] return self.select_row_subset(mask) @property def _default_plot_energy_axis(self): energy = self.energy return MapAxis.from_energy_bounds( energy_min=energy.min(), energy_max=energy.max(), nbin=50 ) def plot_energy(self, ax=None, **kwargs): """Plot counts as a function of energy. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None **kwargs : dict, optional Keyword arguments passed to `~matplotlib.pyplot.hist`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax energy_axis = self._default_plot_energy_axis kwargs.setdefault("log", True) kwargs.setdefault("histtype", "step") kwargs.setdefault("bins", energy_axis.edges) with quantity_support(): ax.hist(self.energy, **kwargs) energy_axis.format_plot_xaxis(ax=ax) ax.set_ylabel("Counts") ax.set_yscale("log") return ax def plot_time(self, ax=None, **kwargs): """Plot an event rate time curve. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. **kwargs : dict, optional Keyword arguments passed to `~matplotlib.pyplot.errorbar`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax # Note the events are not necessarily in time order time = self.table["TIME"] time = time - np.min(time) ax.set_xlabel(f"Time [{u.s.to_string(UNIT_STRING_FORMAT)}]") ax.set_ylabel("Counts") y, x_edges = np.histogram(time, bins=20) xerr = np.diff(x_edges) / 2 x = x_edges[:-1] + xerr yerr = np.sqrt(y) kwargs.setdefault("fmt", "none") ax.errorbar(x=x, y=y, xerr=xerr, yerr=yerr, **kwargs) return ax def plot_offset2_distribution( self, ax=None, center=None, max_percentile=98, **kwargs ): """Plot offset^2 distribution of the events. The distribution shown in this plot is for this quantity:: offset = center.separation(events.radec).deg offset2 = offset ** 2 Note that this method is just for a quicklook plot. If you want to do computations with the offset or offset^2 values, you can use the line above. As an example, here's how to compute the 68% event containment radius using `numpy.percentile`:: import numpy as np r68 = np.percentile(offset, q=68) Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. 
Default is None. center : `astropy.coordinates.SkyCoord`, optional Center position for the offset^2 distribution. Default is the observation pointing position. max_percentile : float, optional Define the percentile of the offset^2 distribution used to define the maximum offset^2 value. Default is 98. **kwargs : dict, optional Extra keyword arguments are passed to `~matplotlib.pyplot.hist`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. Examples -------- Load an example event list: >>> from gammapy.data import EventList >>> from astropy import units as u >>> filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" >>> events = EventList.read(filename) >>> #Plot the offset^2 distribution wrt. the observation pointing position >>> #(this is a commonly used plot to check the background spatial distribution): >>> events.plot_offset2_distribution() # doctest: +SKIP Plot the offset^2 distribution wrt. the Crab pulsar position (this is commonly used to check both the gamma-ray signal and the background spatial distribution): >>> import numpy as np >>> from astropy.coordinates import SkyCoord >>> center = SkyCoord(83.63307, 22.01449, unit='deg') >>> bins = np.linspace(start=0, stop=0.3 ** 2, num=30) * u.deg ** 2 >>> events.plot_offset2_distribution(center=center, bins=bins) # doctest: +SKIP Note how we passed the ``bins`` option of `matplotlib.pyplot.hist` to control the histogram binning, in this case 30 bins ranging from 0 to (0.3 deg)^2. """ ax = plt.gca() if ax is None else ax if center is None: center = self._plot_center offset2 = center.separation(self.radec) ** 2 max2 = np.percentile(offset2, q=max_percentile) kwargs.setdefault("histtype", "step") kwargs.setdefault("bins", 30) kwargs.setdefault("range", (0.0, max2.value)) with quantity_support(): ax.hist(offset2, **kwargs) ax.set_xlabel(rf"Offset$^2$ [{ax.xaxis.units.to_string(UNIT_STRING_FORMAT)}]") ax.set_ylabel("Counts") return ax def plot_energy_offset(self, ax=None, center=None, **kwargs): """Plot counts histogram with energy and offset axes. Parameters ---------- ax : `~matplotlib.pyplot.Axis`, optional Plot axis. Default is None. center : `~astropy.coordinates.SkyCoord`, optional Sky coord from which offset is computed. Default is None. **kwargs : dict, optional Keyword arguments forwarded to `~matplotlib.pyplot.pcolormesh`. Returns ------- ax : `~matplotlib.pyplot.Axis` Plot axis. """ from matplotlib.colors import LogNorm ax = plt.gca() if ax is None else ax if center is None: center = self._plot_center energy_axis = self._default_plot_energy_axis offset = center.separation(self.radec) offset_axis = MapAxis.from_bounds( 0 * u.deg, offset.max(), nbin=30, name="offset" ) counts = np.histogram2d( x=self.energy, y=offset, bins=(energy_axis.edges, offset_axis.edges), )[0] kwargs.setdefault("norm", LogNorm()) with quantity_support(): ax.pcolormesh(energy_axis.edges, offset_axis.edges, counts.T, **kwargs) energy_axis.format_plot_xaxis(ax=ax) offset_axis.format_plot_yaxis(ax=ax) return ax def check(self, checks="all"): """Run checks. This is a generator that yields a list of dicts. """ checker = EventListChecker(self) return checker.run(checks=checks) def map_coord(self, geom): """Event map coordinates for a given geometry. Parameters ---------- geom : `~gammapy.maps.Geom` Geometry. Returns ------- coord : `~gammapy.maps.MapCoord` Coordinates. 
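Examples
--------
A minimal sketch, reusing the 2D geometry and event file from the
`select_mask` example below::

    >>> from gammapy.data import EventList
    >>> from gammapy.maps import WcsGeom
    >>> geom = WcsGeom.create(skydir=(0, 0), width=(4, 4), frame="galactic")
    >>> filename = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits"
    >>> events = EventList.read(filename)  # doctest: +SKIP
    >>> coord = events.map_coord(geom)  # doctest: +SKIP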
""" coord = {"skycoord": self.radec} cols = {k.upper(): v for k, v in self.table.columns.items()} for axis in geom.axes: try: col = cols[axis.name.upper()] coord[axis.name] = u.Quantity(col).to(axis.unit) except KeyError: raise KeyError(f"Column not found in event list: {axis.name!r}") return MapCoord.create(coord) def select_mask(self, mask): """Select events inside a mask (`EventList`). Parameters ---------- mask : `~gammapy.maps.Map` Mask. Returns ------- event_list : `EventList` Copy of event list with selection applied. Examples -------- >>> from gammapy.data import EventList >>> from gammapy.maps import WcsGeom, Map >>> geom = WcsGeom.create(skydir=(0,0), width=(4, 4), frame="galactic") >>> mask = geom.region_mask("galactic;circle(0, 0, 0.5)") >>> filename = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits" >>> events = EventList.read(filename) >>> masked_event = events.select_mask(mask) >>> len(masked_event.table) 5594 """ coord = self.map_coord(mask.geom) values = mask.get_by_coord(coord) valid = values > 0 return self.select_row_subset(valid) @property def observatory_earth_location(self): """Observatory location as an `~astropy.coordinates.EarthLocation` object.""" return earth_location_from_dict(self.table.meta) @property def observation_time_duration(self): """Observation time duration in seconds as a `~astropy.units.Quantity`. This is a keyword related to IACTs. The wall time, including dead-time. """ time_delta = (self.observation_time_stop - self.observation_time_start).sec return u.Quantity(time_delta, "s") @property def observation_live_time_duration(self): """Live-time duration in seconds as a `~astropy.units.Quantity`. The dead-time-corrected observation time. - In Fermi-LAT it is automatically provided in the header of the event list. - In IACTs is computed as ``t_live = t_observation * (1 - f_dead)`` where ``f_dead`` is the dead-time fraction. """ return u.Quantity(self.table.meta["LIVETIME"], "second") @property def observation_dead_time_fraction(self): """Dead-time fraction as a float. This is a keyword related to IACTs. Defined as dead-time over observation time. Dead-time is defined as the time during the observation where the detector didn't record events: http://en.wikipedia.org/wiki/Dead_time https://ui.adsabs.harvard.edu/abs/2004APh....22..285F The dead-time fraction is used in the live-time computation, which in turn is used in the exposure and flux computation. 
""" return 1 - self.table.meta["DEADC"] @property def altaz_frame(self): """ALT / AZ frame as an `~astropy.coordinates.AltAz` object.""" return AltAz(obstime=self.time, location=self.observatory_earth_location) @property def altaz(self): """ALT / AZ position computed from RA / DEC as a `~astropy.coordinates.SkyCoord` object.""" return self.radec.transform_to(self.altaz_frame) @property def altaz_from_table(self): """ALT / AZ position from table as a `~astropy.coordinates.SkyCoord` object.""" lon = self.table["AZ"] lat = self.table["ALT"] return SkyCoord(lon, lat, unit="deg", frame=self.altaz_frame) @property def pointing_radec(self): """Pointing RA / DEC sky coordinates as a `~astropy.coordinates.SkyCoord` object.""" info = self.table.meta lon, lat = info["RA_PNT"], info["DEC_PNT"] return SkyCoord(lon, lat, unit="deg", frame="icrs") @property def offset(self): """Event offset from the array pointing position as an `~astropy.coordinates.Angle`.""" position = self.radec center = self.pointing_radec offset = center.separation(position) return Angle(offset, unit="deg") @property def offset_from_median(self): """Event offset from the median position as an `~astropy.coordinates.Angle`.""" position = self.radec center = self.galactic_median offset = center.separation(position) return Angle(offset, unit="deg") def select_offset(self, offset_band): """Select events in offset band. Parameters ---------- offset_band : `~astropy.coordinates.Angle` offset band ``[offset_min, offset_max)``. Returns ------- event_list : `EventList` Copy of event list with selection applied. Examples -------- >>> from gammapy.data import EventList >>> import astropy.units as u >>> filename = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits" >>> events = EventList.read(filename) >>> selected_events = events.select_offset([0.3, 0.9]*u.deg) >>> len(selected_events.table) 12688 """ offset = self.offset mask = offset_band[0] <= offset mask &= offset < offset_band[1] return self.select_row_subset(mask) def select_rad_max(self, rad_max, position=None): """Select energy dependent offset. Parameters ---------- rad_max : `~gamapy.irf.RadMax2D` Rad max definition. position : `~astropy.coordinates.SkyCoord`, optional Center position. Default is the pointing position. Returns ------- event_list : `EventList` Copy of event list with selection applied. """ if position is None: position = self.pointing_radec offset = position.separation(self.pointing_radec) separation = position.separation(self.radec) rad_max_for_events = rad_max.evaluate( method="nearest", energy=self.energy, offset=offset ) selected = separation <= rad_max_for_events return self.select_row_subset(selected) @property def is_pointed_observation(self): """Whether observation is pointed.""" return "RA_PNT" in self.table.meta def peek(self, allsky=False): """Quick look plots. Parameters ---------- allsky : bool, optional Whether to look at the events all-sky. Default is False. 
""" import matplotlib.gridspec as gridspec if allsky: gs = gridspec.GridSpec(nrows=2, ncols=2) fig = plt.figure(figsize=(8, 8)) else: gs = gridspec.GridSpec(nrows=2, ncols=3) fig = plt.figure(figsize=(12, 8)) # energy plot ax_energy = fig.add_subplot(gs[1, 0]) self.plot_energy(ax=ax_energy) # offset plots if not allsky: ax_offset = fig.add_subplot(gs[0, 1]) self.plot_offset2_distribution(ax=ax_offset) ax_energy_offset = fig.add_subplot(gs[0, 2]) self.plot_energy_offset(ax=ax_energy_offset) # time plot ax_time = fig.add_subplot(gs[1, 1]) self.plot_time(ax=ax_time) # image plot m = self._counts_image(allsky=allsky) if allsky: ax_image = fig.add_subplot(gs[0, :], projection=m.geom.wcs) else: ax_image = fig.add_subplot(gs[0, 0], projection=m.geom.wcs) m.plot(ax=ax_image, stretch="sqrt", vmin=0) plt.subplots_adjust(wspace=0.3) @property def _plot_center(self): if self.is_pointed_observation: return self.pointing_radec else: return self.galactic_median @property def _plot_width(self): if self.is_pointed_observation: offset = self.offset else: offset = self.offset_from_median return 2 * offset.max() def _counts_image(self, allsky): if allsky: opts = { "npix": (360, 180), "binsz": 1.0, "proj": "AIT", "frame": "galactic", } else: opts = { "width": self._plot_width, "binsz": 0.05, "proj": "TAN", "frame": "galactic", "skydir": self._plot_center, } m = WcsNDMap.create(**opts) m.fill_by_coord(self.radec) m = m.smooth(width=0.5) return m def plot_image(self, ax=None, allsky=False): """Quick look counts map sky plot. Parameters ---------- ax : `~matplotlib.pyplot.Axes`, optional Matplotlib axes. allsky : bool, optional Whether to plot on an all sky geom. Default is False. """ if ax is None: ax = plt.gca() m = self._counts_image(allsky=allsky) m.plot(ax=ax, stretch="sqrt") def copy(self): """Copy event list (`EventList`).""" return copy.deepcopy(self) class EventListChecker(Checker): """Event list checker. Data format specification: ref:`gadf:iact-events`. Parameters ---------- event_list : `~gammapy.data.EventList` Event list. """ CHECKS = { "meta": "check_meta", "columns": "check_columns", "times": "check_times", "coordinates_galactic": "check_coordinates_galactic", "coordinates_altaz": "check_coordinates_altaz", } accuracy = {"angle": Angle("1 arcsec"), "time": u.Quantity(1, "microsecond")} # https://gamma-astro-data-formats.readthedocs.io/en/latest/events/events.html#mandatory-header-keywords # noqa: E501 meta_required = [ "HDUCLASS", "HDUDOC", "HDUVERS", "HDUCLAS1", "OBS_ID", "TSTART", "TSTOP", "ONTIME", "LIVETIME", "DEADC", "RA_PNT", "DEC_PNT", # TODO: what to do about these? # They are currently listed as required in the spec, # but I think we should just require ICRS and those # are irrelevant, should not be used. 
# 'RADECSYS', # 'EQUINOX', "ORIGIN", "TELESCOP", "INSTRUME", "CREATOR", # https://gamma-astro-data-formats.readthedocs.io/en/latest/general/time.html#time-formats # noqa: E501 "MJDREFI", "MJDREFF", "TIMEUNIT", "TIMESYS", "TIMEREF", # https://gamma-astro-data-formats.readthedocs.io/en/latest/general/coordinates.html#coords-location # noqa: E501 "GEOLON", "GEOLAT", "ALTITUDE", ] _col = collections.namedtuple("col", ["name", "unit"]) columns_required = [ _col(name="EVENT_ID", unit=""), _col(name="TIME", unit="s"), _col(name="RA", unit="deg"), _col(name="DEC", unit="deg"), _col(name="ENERGY", unit="TeV"), ] def __init__(self, event_list): self.event_list = event_list def _record(self, level="info", msg=None): obs_id = self.event_list.table.meta["OBS_ID"] return {"level": level, "obs_id": obs_id, "msg": msg} def check_meta(self): meta_missing = sorted(set(self.meta_required) - set(self.event_list.table.meta)) if meta_missing: yield self._record( level="error", msg=f"Missing meta keys: {meta_missing!r}" ) def check_columns(self): t = self.event_list.table if len(t) == 0: yield self._record(level="error", msg="Events table has zero rows") for name, unit in self.columns_required: if name not in t.colnames: yield self._record(level="error", msg=f"Missing table column: {name!r}") else: if u.Unit(unit) != (t[name].unit or ""): yield self._record( level="error", msg=f"Invalid unit for column: {name!r}" ) def check_times(self): dt = (self.event_list.time - self.event_list.observation_time_start).sec if dt.min() < self.accuracy["time"].to_value("s"): yield self._record(level="error", msg="Event times before obs start time") dt = (self.event_list.time - self.event_list.observation_time_stop).sec if dt.max() > self.accuracy["time"].to_value("s"): yield self._record(level="error", msg="Event times after the obs end time") if np.min(np.diff(dt)) <= 0: yield self._record(level="error", msg="Events are not time-ordered.") def check_coordinates_galactic(self): """Check if RA / DEC matches GLON / GLAT.""" t = self.event_list.table if "GLON" not in t.colnames: return galactic = SkyCoord(t["GLON"], t["GLAT"], unit="deg", frame="galactic") separation = self.event_list.radec.separation(galactic).to("arcsec") if separation.max() > self.accuracy["angle"]: yield self._record( level="error", msg="GLON / GLAT not consistent with RA / DEC" ) def check_coordinates_altaz(self): """Check if ALT / AZ matches RA / DEC.""" t = self.event_list.table if "AZ" not in t.colnames: return altaz_astropy = self.event_list.altaz separation = angular_separation( altaz_astropy.data.lon, altaz_astropy.data.lat, t["AZ"].quantity, t["ALT"].quantity, ) if separation.max() > self.accuracy["angle"]: yield self._record( level="error", msg="ALT / AZ not consistent with RA / DEC" ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/filters.py0000644000175100001770000000756214721316200017011 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy import html import logging __all__ = ["ObservationFilter"] log = logging.getLogger(__name__) class ObservationFilter: """Holds and applies filters to observation data. Parameters ---------- time_filter : `astropy.time.Time`, optional Start and stop time of the selected time interval. Currently, we only support a single time interval. Default is None. 
event_filters : list of dict, optional An event filter dictionary needs two keys: - **type** : str, one of the keys in `~gammapy.data.ObservationFilter.EVENT_FILTER_TYPES` - **opts** : dict, it is passed on to the method of the `~gammapy.data.EventListBase` class that corresponds to the filter type (see `~gammapy.data.ObservationFilter.EVENT_FILTER_TYPES`) The filtered event list will be an intersection of all filters. A union of filters is not supported yet. Default is None. Examples -------- >>> from gammapy.data import ObservationFilter, DataStore, Observation >>> from astropy.time import Time >>> from astropy.coordinates import Angle >>> >>> time_filter = Time(['2021-03-27T20:10:00', '2021-03-27T20:20:00']) >>> phase_filter = {'type': 'custom', 'opts': dict(parameter='PHASE', band=(0.2, 0.8))} >>> >>> my_obs_filter = ObservationFilter(time_filter=time_filter, event_filters=[phase_filter]) >>> >>> ds = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps") >>> my_obs = ds.obs(obs_id=111630) >>> my_obs.obs_filter = my_obs_filter """ EVENT_FILTER_TYPES = dict(sky_region="select_region", custom="select_parameter") def __init__(self, time_filter=None, event_filters=None): self.time_filter = time_filter self.event_filters = event_filters or [] def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def livetime_fraction(self): """Fraction of the livetime kept when applying the event_filters.""" return self._check_filter_phase(self.event_filters) def filter_events(self, events): """Apply filters to an event list. Parameters ---------- events : `~gammapy.data.EventListBase` Event list to which the filters will be applied. Returns ------- filtered_events : `~gammapy.data.EventListBase` The filtered event list. """ filtered_events = self._filter_by_time(events) for f in self.event_filters: method_str = self.EVENT_FILTER_TYPES[f["type"]] filtered_events = getattr(filtered_events, method_str)(**f["opts"]) return filtered_events def filter_gti(self, gti): """Apply filters to a GTI table. Parameters ---------- gti : `~gammapy.data.GTI` GTI table to which the filters will be applied. Returns ------- filtered_gti : `~gammapy.data.GTI` The filtered GTI table. """ return self._filter_by_time(gti) def _filter_by_time(self, data): """Return a new time filtered data object. Calls the `select_time` method of the data object. """ if self.time_filter: return data.select_time(self.time_filter) else: return data def copy(self): """Copy the `ObservationFilter` object.""" return copy.deepcopy(self) @staticmethod def _check_filter_phase(event_filter): if not event_filter: return 1 for f in event_filter: if f.get("opts").get("parameter") == "PHASE": band = f.get("opts").get("band") return band[1] - band[0] else: return 1 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/gti.py0000644000175100001770000003774314721316200016130 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy import html import warnings from operator import le, lt import numpy as np import astropy.units as u from astropy.io import fits from astropy.table import Table, vstack from astropy.time import Time from gammapy.utils.scripts import make_path from gammapy.utils.time import TIME_REF_DEFAULT, time_ref_from_dict, time_ref_to_dict from .metadata import GTIMetaData __all__ = ["GTI"] class GTI: """Good time intervals (GTI) `~astropy.table.Table`. Data format specification: :ref:`gadf:iact-gti`. Note: at the moment dead-time and live-time is in the EVENTS header ... the GTI header just deals with observation times. Parameters ---------- table : `~astropy.table.Table` GTI table. reference_time : `~astropy.time.Time`, optional The reference time. Default is None. If None, use TIME_REF_DEFAULT. Examples -------- Load GTIs for a H.E.S.S. 
event list: >>> from gammapy.data import GTI >>> gti = GTI.read('$GAMMAPY_DATA/hess-dl3-dr1//data/hess_dl3_dr1_obs_id_023523.fits.gz') >>> print(gti) GTI info: - Number of GTIs: 1 - Duration: 1687.0000000000016 s - Start: 123890826.00000001 s MET - Start: 2004-12-04T22:08:10.184 (time standard: TT) - Stop: 123892513.00000001 s MET - Stop: 2004-12-04T22:36:17.184 (time standard: TT) Load GTIs for a Fermi-LAT event list: >>> gti = GTI.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-events.fits.gz") >>> print(gti) GTI info: - Number of GTIs: 39042 - Duration: 183139597.9032163 s - Start: 239557417.49417615 s MET - Start: 2008-08-04T15:44:41.678 (time standard: TT) - Stop: 460249999.99999994 s MET - Stop: 2015-08-02T23:14:24.184 (time standard: TT) """ def __init__(self, table, reference_time=None): self.table = self._validate_table(table) if reference_time is None: reference_time = TIME_REF_DEFAULT meta = GTIMetaData() meta.reference_time = reference_time self._meta = meta def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @staticmethod def _validate_table(table): """Check that the input GTI fits the gammapy internal model.""" if not isinstance(table, Table): raise TypeError("GTI table is not an astropy Table.") colnames = ["START", "STOP"] if not set(colnames).issubset(table.colnames): raise ValueError("GTI table not correctly defined.") if len(table) == 0: return table for name in colnames: if not isinstance(table[name], Time): raise TypeError(f"Column {name} is not a Time object.") return table def copy(self): """Deep copy of the `~gammapy.data.GIT` object.""" return copy.deepcopy(self) @classmethod def create(cls, start, stop, reference_time=None): """Create a GTI table from start and stop times. Parameters ---------- start : `~astropy.time.Time` or `~astropy.units.Quantity` Start times, if a quantity then w.r.t. reference time. stop : `~astropy.time.Time` or `~astropy.units.Quantity` Stop times, if a quantity then w.r.t. reference time. reference_time : `~astropy.time.Time`, optional The reference time to use in GTI definition. Default is None. If None, use TIME_REF_DEFAULT. """ if reference_time is None: reference_time = TIME_REF_DEFAULT reference_time = Time(reference_time) reference_time.format = "mjd" if not isinstance(start, Time): start = reference_time + u.Quantity(start) if not isinstance(stop, Time): stop = reference_time + u.Quantity(stop) table = Table({"START": np.atleast_1d(start), "STOP": np.atleast_1d(stop)}) return cls(table, reference_time=reference_time) @classmethod def read(cls, filename, hdu="GTI", format="gadf", checksum=False): """Read from FITS file. Parameters ---------- filename : `pathlib.Path` or str Filename hdu : str hdu name. Default is "GTI". format: str Input format, currently only "gadf" is supported. Default is "gadf". checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. """ filename = make_path(filename) with fits.open(str(make_path(filename)), memmap=False) as hdulist: gti_hdu = hdulist[hdu] if checksum: if gti_hdu.verify_checksum() != 1: warnings.warn( f"Checksum verification failed for HDU {hdu} of {filename}.", UserWarning, ) return cls.from_table_hdu(gti_hdu, format=format) @classmethod def from_table_hdu(cls, table_hdu, format="gadf"): """Read from table HDU. Parameters ---------- table_hdu : `~astropy.io.fits.BinTableHDU` table hdu. format : {"gadf"} Input format, currently only "gadf" is supported. Default is "gadf". """ if format != "gadf": raise ValueError(f'Only the "gadf" format supported, got {format}') table = Table.read(table_hdu) time_ref = time_ref_from_dict(table.meta, format="mjd", scale="tt") # Check if TIMEUNIT keyword is present, otherwise assume seconds unit = table.meta.pop("TIMEUNIT", "s") start = u.Quantity(table["START"], unit) stop = u.Quantity(table["STOP"], unit) return cls.create(start, stop, time_ref) def to_table_hdu(self, format="gadf"): """ Convert this GTI instance to a `astropy.io.fits.BinTableHDU`. Parameters ---------- format : str, optional Output format, currently only "gadf" is supported. Default is "gadf". Returns ------- hdu : `astropy.io.fits.BinTableHDU` GTI table converted to FITS representation. """ if format != "gadf": raise ValueError(f'Only the "gadf" format supported, got {format}') # Don't impose the scale. 
GADF does not require it to be TT. meta = time_ref_to_dict(self.time_ref, scale=self.time_ref.scale) start = self.time_start - self.time_ref stop = self.time_stop - self.time_ref table = Table({"START": start.to("s"), "STOP": stop.to("s")}, meta=meta) return fits.BinTableHDU(table, name="GTI") def write(self, filename, **kwargs): """Write to file. Parameters ---------- filename : str or `~pathlib.Path` Filename. **kwargs : dict, optional Keyword arguments passed to `~astropy.io.fits.HDUList.writeto`. """ hdu = self.to_table_hdu() hdulist = fits.HDUList([fits.PrimaryHDU(), hdu]) hdulist.writeto(make_path(filename), **kwargs) def __str__(self): t_start_met = self.met_start[0] t_stop_met = self.met_stop[-1] t_start = self.time_start[0].fits t_stop = self.time_stop[-1].fits return ( "GTI info:\n" f"- Number of GTIs: {len(self.table)}\n" f"- Duration: {self.time_sum}\n" f"- Start: {t_start_met} MET\n" f"- Start: {t_start} (time standard: {self.time_start[-1].scale.upper()})\n" f"- Stop: {t_stop_met} MET\n" f"- Stop: {t_stop} (time standard: {self.time_stop[-1].scale.upper()})\n" ) @property def time_delta(self): """GTI durations in seconds as a `~astropy.units.Quantity`.""" delta = self.time_stop - self.time_start return delta.to("s") @property def time_ref(self): """Time reference as a `~astropy.time.Time` object.""" return self._meta.reference_time @property def time_sum(self): """Sum of GTIs in seconds as a `~astropy.units.Quantity`.""" return self.time_delta.sum() @property def time_start(self): """GTI start times as a `~astropy.time.Time` object.""" return self.table["START"] @property def time_stop(self): """GTI end times as a `~astropy.time.Time` object.""" return self.table["STOP"] @property def met_start(self): """GTI start time difference with reference time in seconds, MET as a `~astropy.units.Quantity`.""" return (self.time_start - self.time_ref).to("s") @property def met_stop(self): """GTI stop time difference with reference time in seconds, MET as a `~astropy.units.Quantity`.""" return (self.time_stop - self.time_ref).to("s") @property def time_intervals(self): """List of time intervals.""" return [ (t_start, t_stop) for t_start, t_stop in zip(self.time_start, self.time_stop) ] @classmethod def from_time_intervals(cls, time_intervals, reference_time=None): """From list of time intervals. Parameters ---------- time_intervals : list of `~astropy.time.Time` objects Time intervals. reference_time : `~astropy.time.Time`, optional Reference time to use in GTI definition. Default is None. If None, use TIME_REF_DEFAULT. Returns ------- gti : `GTI` GTI table. """ start = Time([_[0] for _ in time_intervals]) stop = Time([_[1] for _ in time_intervals]) if reference_time is None: reference_time = TIME_REF_DEFAULT return cls.create(start, stop, reference_time) def select_time(self, time_interval): """Select and crop GTIs in time interval. Parameters ---------- time_interval : `astropy.time.Time` Start and stop time for the selection. Returns ------- gti : `GTI` Copy of the GTI table with selection applied.
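Examples
--------
A minimal sketch, cropping the GTIs of the H.E.S.S. run from the class
example to a one-minute interval; the interval bounds are illustrative
values inside that run::

    >>> from astropy.time import Time
    >>> from gammapy.data import GTI
    >>> gti = GTI.read("$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz")  # doctest: +SKIP
    >>> interval = Time(["2004-12-04T22:10:00", "2004-12-04T22:11:00"])  # doctest: +SKIP
    >>> gti_cropped = gti.select_time(interval)  # doctest: +SKIP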
""" interval_start, interval_stop = time_interval interval_start.format = self.time_start.format interval_stop.format = self.time_stop.format # get GTIs that fall within the time_interval mask = self.time_start < interval_stop mask &= self.time_stop > interval_start gti_within = self.table[mask] # crop the GTIs gti_within["START"] = np.clip( gti_within["START"], interval_start, interval_stop ) gti_within["STOP"] = np.clip(gti_within["STOP"], interval_start, interval_stop) return self.__class__(gti_within) def delete_interval(self, time_interval): """Select and crop GTIs in time interval. Parameters ---------- time_interval : [`astropy.time.Time`, `astropy.time.Time`] Start and stop time for the selection. Returns ------- gti : `GTI` Copy of the GTI table with the bad time interval deleted. """ interval_start, interval_stop = time_interval interval_start.format = self.time_start.format interval_stop.format = self.time_stop.format trim_table = self.table.copy() trim_table["STOP"][ (self.time_start < interval_start) & (self.time_stop > interval_start) ] = interval_start trim_table["START"][ (self.time_start < interval_stop) & (self.time_stop > interval_stop) ] = interval_stop mask = (self.time_stop > interval_stop) | (self.time_start < interval_start) return self.__class__(trim_table[mask]) def stack(self, other): """Stack with another GTI in place. This simply changes the time reference of the second GTI table and stack the two tables. No logic is applied to the intervals. Parameters ---------- other : `~gammapy.data.GTI` GTI to stack to self. """ self.table = self._validate_table(vstack([self.table, other.table])) @classmethod def from_stack(cls, gtis, **kwargs): """Stack (concatenate) list of GTIs. Calls `~astropy.table.vstack`. Parameters ---------- gtis : list of `GTI` List of good time intervals to stack. **kwargs : dict, optional Keywords passed on to `~astropy.table.vstack`. Returns ------- gti : `GTI` Stacked good time intervals. """ tables = [_.table for _ in gtis] stacked_table = vstack(tables, **kwargs) return cls(stacked_table) def union(self, overlap_ok=True, merge_equal=True): """Union of overlapping time intervals. Returns a new `~gammapy.data.GTI` object. Parameters ---------- overlap_ok : bool, optional Whether to raise an error when overlapping time bins are found. Default is True. merge_equal : bool; optional Whether to merge touching time bins e.g. ``(1, 2)`` and ``(2, 3)`` will result in ``(1, 3)``. Default is True. """ # Algorithm to merge overlapping intervals is well-known, # see e.g. https://stackoverflow.com/a/43600953/498873 table = self.table.copy() table.sort("START") compare = lt if merge_equal else le # We use Python dict instead of astropy.table.Row objects, # because on some versions modifying Row entries doesn't behave as expected merged = [{"START": table[0]["START"], "STOP": table[0]["STOP"]}] for row in table[1:]: interval = {"START": row["START"], "STOP": row["STOP"]} if compare(merged[-1]["STOP"], interval["START"]): merged.append(interval) else: if not overlap_ok: raise ValueError("Overlapping time bins") merged[-1]["STOP"] = max(interval["STOP"], merged[-1]["STOP"]) merged = Table(rows=merged, names=["START", "STOP"], meta=self.table.meta) return self.__class__(merged, reference_time=self.time_ref) def group_table(self, time_intervals, atol="1e-6 s"): """Compute the table with the info on the group to which belong each time interval. The t_start and t_stop are stored in MJD from a scale in "utc". 
Parameters ---------- time_intervals : list of `astropy.time.Time` Start and stop time for each interval to compute the LC. atol : `~astropy.units.Quantity` Tolerance value for time comparison with different scale. Default is "1e-6 s". Returns ------- group_table : `~astropy.table.Table` Contains the grouping info. """ atol = u.Quantity(atol) group_table = Table( names=("group_idx", "time_min", "time_max", "bin_type"), dtype=("i8", "f8", "f8", "S10"), ) time_intervals_lowedges = Time( [time_interval[0] for time_interval in time_intervals] ) time_intervals_upedges = Time( [time_interval[1] for time_interval in time_intervals] ) for t_start, t_stop in zip(self.time_start, self.time_stop): mask1 = t_start >= time_intervals_lowedges - atol mask2 = t_stop <= time_intervals_upedges + atol mask = mask1 & mask2 if np.any(mask): group_index = np.where(mask)[0] bin_type = "" else: group_index = -1 if np.any(mask1): bin_type = "overflow" elif np.any(mask2): bin_type = "underflow" else: bin_type = "outflow" group_table.add_row( [group_index, t_start.utc.mjd, t_stop.utc.mjd, bin_type] ) return group_table ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/hdu_index_table.py0000644000175100001770000001473014721316200020452 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np from astropy.table import Table from astropy.utils import lazyproperty from gammapy.utils.fits import HDULocation from gammapy.utils.scripts import make_path __all__ = ["HDUIndexTable"] log = logging.getLogger(__name__) class HDUIndexTable(Table): """HDU index table. See :ref:`gadf:hdu-index`. """ VALID_HDU_TYPE = [ "events", "gti", "aeff", "edisp", "psf", "bkg", "rad_max", "pointing", ] """Valid values for `HDU_TYPE`.""" VALID_HDU_CLASS = [ "events", "gti", "aeff_2d", "edisp_2d", "psf_table", "psf_3gauss", "psf_king", "bkg_2d", "bkg_3d", "rad_max_2d", "pointing", ] """Valid values for `HDU_CLASS`.""" @classmethod def read(cls, filename, **kwargs): """Read :ref:`gadf:hdu-index`. Parameters ---------- filename : `pathlib.Path` or str Filename. **kwargs : dict, optional Keyword arguments passed to `~astropy.table.Table.read`. """ filename = make_path(filename) table = super().read(filename, **kwargs) table.meta["BASE_DIR"] = filename.parent.as_posix() # TODO: this is a workaround for the joint-crab validation with astropy>4.0. # TODO: Remove when handling of empty columns is clarified table["FILE_DIR"].fill_value = "" return table.filled() @property def base_dir(self): """Base directory.""" return make_path(self.meta.get("BASE_DIR", "")) def hdu_location(self, obs_id, hdu_type=None, hdu_class=None, warn_missing=True): """Create `HDULocation` for a given selection. Parameters ---------- obs_id : int Observation ID. hdu_type : str, optional HDU type (see `~gammapy.data.HDUIndexTable.VALID_HDU_TYPE`). Default is None. hdu_class : str, optional HDU class (see `~gammapy.data.HDUIndexTable.VALID_HDU_CLASS`). Default is None. warn_missing : bool, optional Warn if no HDU is found matching the selection. Default is True. Returns ------- location : `~gammapy.data.HDULocation` HDU location. 
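Examples
--------
A minimal sketch; the index file path follows the `from_dir` defaults
of `~gammapy.data.DataStore`, and observation 23523 is part of the
H.E.S.S. DL3-DR1 dataset::

    >>> from gammapy.data import HDUIndexTable
    >>> hdu_table = HDUIndexTable.read("$GAMMAPY_DATA/hess-dl3-dr1/hdu-index.fits.gz")  # doctest: +SKIP
    >>> location = hdu_table.hdu_location(obs_id=23523, hdu_type="events")  # doctest: +SKIP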
""" self._validate_selection(obs_id=obs_id, hdu_type=hdu_type, hdu_class=hdu_class) idx = self.row_idx(obs_id=obs_id, hdu_type=hdu_type, hdu_class=hdu_class) if len(idx) == 1: idx = idx[0] elif len(idx) == 0: if warn_missing: log.warning( f"No HDU found matching: OBS_ID = {obs_id}, HDU_TYPE = {hdu_type}," " HDU_CLASS = {hdu_class}" ) return None else: idx = idx[0] log.warning( f"Found multiple HDU matching: OBS_ID = {obs_id}, HDU_TYPE = {hdu_type}," " HDU_CLASS = {hdu_class}." f" Returning the first entry, which has " f"HDU_TYPE = {self[idx]['HDU_TYPE']} and HDU_CLASS = {self[idx]['HDU_CLASS']}" ) return self.location_info(idx) def _validate_selection(self, obs_id, hdu_type, hdu_class): """Validate HDU selection. The goal is to give helpful error messages to the user. """ if hdu_type is None and hdu_class is None: raise ValueError("You have to specify `hdu_type` or `hdu_class`.") if hdu_type and hdu_type not in self.VALID_HDU_TYPE: valid = [str(_) for _ in self.VALID_HDU_TYPE] raise ValueError(f"Invalid hdu_type: {hdu_type}. Valid values are: {valid}") if hdu_class and hdu_class not in self.VALID_HDU_CLASS: valid = [str(_) for _ in self.VALID_HDU_CLASS] raise ValueError( f"Invalid hdu_class: {hdu_class}. Valid values are: {valid}" ) if obs_id not in self["OBS_ID"]: raise IndexError(f"No entry available with OBS_ID = {obs_id}") def row_idx(self, obs_id, hdu_type=None, hdu_class=None): """Table row indices for a given selection. Parameters ---------- obs_id : int Observation ID. hdu_type : str, optional HDU type (see `~gammapy.data.HDUIndexTable.VALID_HDU_TYPE`). Default is None. hdu_class : str, optional HDU class (see `~gammapy.data.HDUIndexTable.VALID_HDU_CLASS`). Default is None. Returns ------- idx : list of int List of row indices matching the selection. 
""" selection = self["OBS_ID"] == obs_id if hdu_class: is_hdu_class = self._hdu_class_stripped == hdu_class selection &= is_hdu_class if hdu_type: is_hdu_type = self._hdu_type_stripped == hdu_type selection &= is_hdu_type idx = np.where(selection)[0] return list(idx) def location_info(self, idx): """Create `HDULocation` for a given row index.""" row = self[idx] return HDULocation( hdu_class=row["HDU_CLASS"].strip(), base_dir=self.base_dir.as_posix(), file_dir=row["FILE_DIR"].strip(), file_name=row["FILE_NAME"].strip(), hdu_name=row["HDU_NAME"].strip(), ) @lazyproperty def _hdu_class_stripped(self): return np.array([_.strip() for _ in self["HDU_CLASS"]]) @lazyproperty def _hdu_type_stripped(self): return np.array([_.strip() for _ in self["HDU_TYPE"]]) @lazyproperty def obs_id_unique(self): """Observation IDs (unique).""" return np.unique(np.sort(self["OBS_ID"])) @lazyproperty def hdu_type_unique(self): """HDU types (unique).""" return list(np.unique(np.sort([_.strip() for _ in self["HDU_TYPE"]]))) @lazyproperty def hdu_class_unique(self): """HDU classes (unique).""" return list(np.unique(np.sort([_.strip() for _ in self["HDU_CLASS"]]))) def summary(self): """Summary report as a string.""" obs_id = self.obs_id_unique return ( "HDU index table:\n" f"BASE_DIR: {self.base_dir}\n" f"Rows: {len(self)}\n" f"OBS_ID: {obs_id[0]} -- {obs_id[-1]}\n" f"HDU_TYPE: {self.hdu_type_unique}\n" f"HDU_CLASS: {self.hdu_class_unique}\n" ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/ivoa.py0000644000175100001770000003050514721316200016270 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from astropy.table import Column, Table from gammapy.data import DataStore __all__ = ["to_obscore_table"] log = logging.getLogger(__name__) log.setLevel(logging.INFO) DEFAULT_OBSCORE_TEMPLATE = { "dataproduct_type": "event", "calib_level": 2, "obs_collection": "DL3", "access_format": "application/fits", "s_fov": 10.0, } def empty_obscore_table(): """Generate the Obscore default table. 
In case the obscore standard changes, this function should be changed according to https://www.ivoa.net/documents/ObsCore Returns ------- table : `~astropy.table.Table` the empty table """ obscore_table = [None] * 29 obscore_table[0] = Column( name="dataproduct_type", unit="", description="Data product (file content) primary type", dtype="U10", meta={"Utype": "ObsDataset.dataProductType", "UCD": "meta.id"}, ) obscore_table[1] = Column( name="calib_level", unit="", description="Calibration level of the observation: in {0, 1, 2, 3, 4}", dtype="i4", meta={"Utype": "ObsDataset.calibLevel", "UCD": "meta.code;obs.calib"}, ) obscore_table[2] = Column( name="target_name", unit="", description="Object of interest", dtype="U25", meta={"Utype": "Target.name", "UCD": "meta.id;src"}, ) obscore_table[3] = Column( name="obs_id", unit="", description="Internal ID given by the ObsTAP service", dtype="U10", meta={"Utype": "DataID.observationID", "UCD": "meta.id"}, ) obscore_table[4] = Column( name="obs_collection", unit="", description="Name of the data collection", dtype="U10", meta={"Utype": "DataID.collection", "UCD": "meta.id"}, ) obscore_table[5] = Column( name="obs_publisher_did", unit="", description="ID for the Dataset given by the publisher", dtype="U30", meta={"Utype": "Curation.publisherDID", "UCD": "meta.ref.uri;meta.curation"}, ) obscore_table[6] = Column( name="access_url", unit="", description="URL used to access dataset", dtype="U30", meta={"Utype": "Access.reference", "UCD": "meta.ref.url"}, ) obscore_table[7] = Column( name="access_format", unit="", description="Content format of the dataset", dtype="U30", meta={"Utype": "Access.format", "UCD": "meta.code.mime"}, ) obscore_table[8] = Column( name="access_estsize", unit="kbyte", description="Estimated size of dataset: in kilobytes", dtype="i4", meta={"Utype": "Access.size", "UCD": "phys.size;meta.file"}, ) obscore_table[9] = Column( name="s_ra", unit="deg", description="Central Spatial Position in ICRS Right ascension", dtype="f8", meta={ "Utype": "Char.SpatialAxis.Coverage.Location.Coord.Position2D.Value2.C1", "UCD": "pos.eq.ra", }, ) obscore_table[10] = Column( name="s_dec", unit="deg", description="Central Spatial Position in ICRS Declination", dtype="f8", meta={ "Utype": "Char.SpatialAxis.Coverage.Location.Coord.Position2D.Value2.C2", "UCD": "pos.eq.dec", }, ) obscore_table[11] = Column( name="s_fov", unit="deg", description="Estimated size of the covered region as the diameter of a containing circle", dtype="f8", meta={ "Utype": "Char.SpatialAxis.Coverage.Bounds.Extent.diameter", "UCD": "phys.angSize;instr.fov", }, ) obscore_table[12] = Column( name="s_region", unit="", description="Sky region covered by the data product (expressed in ICRS frame)", dtype="U30", meta={ "Utype": "Char.SpatialAxis.Coverage.Support.Area", "UCD": "pos.outline;obs.field", }, ) obscore_table[13] = Column( name="s_resolution", unit="arcsec", description="Spatial resolution of data as FWHM of PSF", dtype="f8", meta={ "Utype": "Char.SpatialAxis.Resolution.refval.value", "UCD": "pos.angResolution", }, ) obscore_table[14] = Column( name="s_xel1", unit="", description="Number of elements along the first coordinate of the spatial axis", dtype="i4", meta={"Utype": "Char.SpatialAxis.numBins1", "UCD": "meta.number"}, ) obscore_table[15] = Column( name="s_xel2", unit="", description="Number of elements along the second coordinate of the spatial axis", dtype="i4", meta={"Utype": "Char.SpatialAxis.numBins2", "UCD": "meta.number"}, ) obscore_table[16] = Column( 
name="t_xel", unit="", description="Number of elements along the time axis", dtype="i4", meta={"Utype": "Char.TimeAxis.numBins", "UCD": "meta.number"}, ) obscore_table[17] = Column( name="t_min", unit="d", description="Start time in MJD", dtype="f8", meta={ "Utype": "Char.TimeAxis.Coverage.Bounds.Limits.StartTime", "UCD": "time.start;obs.exposure", }, ) obscore_table[18] = Column( name="t_max", unit="d", description="Stop time in MJD", dtype="f8", meta={ "Utype": "Char.TimeAxis.Coverage.Bounds.Limits.StopTime", "UCD": "time.end;obs.exposure", }, ) obscore_table[19] = Column( name="t_exptime", unit="s", description="Total exposure time", dtype="f8", meta={ "Utype": "Char.TimeAxis.Coverage.Support.Extent", "UCD": "time.duration;obs.exposure", }, ) obscore_table[20] = Column( name="t_resolution", unit="s", description="Temporal resolution FWHM", dtype="f8", meta={ "Utype": "Char.TimeAxis.Resolution.Refval.valueResolution.Refval.value", "UCD": "time.resolution", }, ) obscore_table[21] = Column( name="em_xel", unit="", description="Number of elements along the spectral axis", dtype="i4", meta={"Utype": "Char.SpectralAxis. numBins", "UCD": "meta.number"}, ) obscore_table[22] = Column( name="em_min", unit="TeV", description="start in spectral coordinates", dtype="f8", meta={ "Utype": "Char.SpectralAxis.Coverage.Bounds.Limits.LoLimit", "UCD": "em.wl;stat.min", }, ) obscore_table[23] = Column( name="em_max", unit="TeV", description="stop in spectral coordinates", dtype="f8", meta={ "Utype": "Char.SpectralAxis.Coverage.Bounds.Limits.HiLimit", "UCD": "em.wl;stat.max", }, ) obscore_table[24] = Column( name="em_res_power", unit="", description="Value of the resolving power along the spectral axis(R)", dtype="f8", meta={ "Utype": "Char.SpectralAxis.Resolution.ResolPower.refVal", "UCD": "spect.resolution", }, ) obscore_table[25] = Column( name="o_ucd", unit="", description="Nature of the observable axis", dtype="U30", meta={"Utype": "Char.ObservableAxis.ucd", "UCD": "meta.ucd"}, ) obscore_table[26] = Column( name="pol_xel", unit="", description="Number of elements along the polarization axis", dtype="i4", meta={"Utype": "Char.PolarizationAxis.numBins", "UCD": "meta.number"}, ) obscore_table[27] = Column( name="facility_name", unit="", description="The name of the facility, telescope space craft used for the observation", dtype="U10", meta={ "Utype": "Provenance.ObsConfig.Facility.name", "UCD": "meta.id;instr.tel", }, ) obscore_table[28] = Column( name="instrument_name", unit="", description="The name of the instrument used for the observation", dtype="U25", meta={"Utype": "Provenance.ObsConfig.Instrument.name", "UCD": "meta.id;instr"}, ) return Table(obscore_table) def observation_obscore_dict(observation): """Generates an obscore dict from an observation. 
    Parameters
    ----------
    observation : `~gammapy.data.Observation`
        The observation.

    Returns
    -------
    result : dict
        Obscore key/value pairs extracted from the observation.
    """
    return {
        "target_name": observation.meta.target.name,
        "obs_id": str(observation.obs_id),
        "s_ra": observation.get_pointing_icrs(observation.tmid).ra.to_value("deg"),
        "s_dec": observation.get_pointing_icrs(observation.tmid).dec.to_value("deg"),
        "t_min": observation.tstart.to_value("mjd"),
        "t_max": observation.tstop.to_value("mjd"),
        "t_exptime": observation.observation_live_time_duration.to_value("s"),
        "em_min": observation.events.energy.min().value,
        "em_max": observation.events.energy.max().value,
        "facility_name": observation.meta.obs_info.telescope,
        "instrument_name": observation.meta.obs_info.instrument,
    }


def to_obscore_table(
    base_dir,
    selected_obs=None,
    obs_publisher_did=None,
    access_url=None,
    obscore_template=None,
):
    """Generate the complete obscore table, adding one row per observation
    built with `observation_obscore_dict`.

    Parameters
    ----------
    base_dir : str or `~pathlib.Path`
        Base directory of the data files.
    selected_obs : list or array of int, optional
        Observation IDs to include. Default is None.
        If not given, the full obscore table (for all the obs_ids in the
        DataStore) is returned.
    obs_publisher_did : str, optional
        ID for the dataset given by the publisher (check IVOA recommendations).
        Default is None.
        Giving a value for this argument is highly recommended. If not given,
        the corresponding obscore field contains only the observation ID.
    access_url : str, optional
        URL used to access (download) the dataset (check IVOA recommendations).
        Default is None.
        Giving a value for this argument is highly recommended. If not given,
        the corresponding obscore field contains only the event file name.
    obscore_template : dict, optional
        Template for fixed values in the obscore table.
        Default is DEFAULT_OBSCORE_TEMPLATE.

    Returns
    -------
    obscore_tab : `~astropy.table.Table`
        Obscore table with one row per selected observation.

    References
    ----------
    * `IVOA ObsCore recommendations <https://www.ivoa.net/documents/ObsCore>`_
    * `IVOA TAP recommendations <https://www.ivoa.net/documents/TAP/>`_
    * IVOA identifiers
    """
    if obs_publisher_did is None:
        log.warning(
            "Insufficient publisher information: 'obs_publisher_did'. Giving this value is highly recommended."
        )
        obs_publisher_did = ""
    if access_url is None:
        log.warning(
            "Insufficient publisher information: 'access_url'. Giving this value is highly recommended."
) access_url = "" if obscore_template is None: log.info("No template provided, using DEFAULT_OBSCORE_TEMPLATE") result = DEFAULT_OBSCORE_TEMPLATE.copy() if obscore_template is not None: result.update(obscore_template) data_store = DataStore.from_dir(base_dir) if selected_obs is None: selected_obs = data_store.obs_ids obscore_table = empty_obscore_table() for obs_id in selected_obs: obscore_row = result.copy() hdu_loc = data_store.hdu_table.hdu_location(obs_id, "events") obscore_row["obs_publisher_did"] = (f"{obs_publisher_did}#{obs_id}",) obscore_row["access_estsize"] = hdu_loc.path().stat().st_size / 1024.0 obscore_row["access_url"] = f"{access_url}{hdu_loc.file_name}" observation = data_store.obs(obs_id) obscore_row.update(observation_obscore_dict(observation)) obscore_table.add_row(obscore_row) return obscore_table ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/metadata.py0000644000175100001770000001140614721316200017111 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from typing import ClassVar, Literal, Optional from pydantic import Field from gammapy.utils.fits import earth_location_from_dict from gammapy.utils.metadata import ( METADATA_FITS_KEYS, CreatorMetaData, MetaData, ObsInfoMetaData, PointingInfoMetaData, TargetMetaData, TimeInfoMetaData, ) from gammapy.utils.types import EarthLocationType, TimeType __all__ = ["ObservationMetaData", "GTIMetaData", "EventListMetaData"] OBSERVATION_METADATA_FITS_KEYS = { "location": { "input": lambda v: earth_location_from_dict(v), "output": lambda v: { "GEOLON": v.lon.deg, "GEOLAT": v.lat.deg, "ALTITUDE": v.height.to_value("m"), }, }, "deadtime_fraction": { "input": lambda v: 1 - v["DEADC"], "output": lambda v: {"DEADC": 1 - v}, }, } METADATA_FITS_KEYS["observation"] = OBSERVATION_METADATA_FITS_KEYS EVENTLIST_METADATA_FITS_KEYS = { "event_class": "EV_CLASS", } METADATA_FITS_KEYS["eventlist"] = EVENTLIST_METADATA_FITS_KEYS class ObservationMetaData(MetaData): """Metadata containing information about the Observation. Parameters ---------- obs_info : `~gammapy.utils.ObsInfoMetaData` The general observation information. pointing : `~gammapy.utils.PointingInfoMetaData` The pointing metadata. target : `~gammapy.utils.TargetMetaData` The target metadata. creation : `~gammapy.utils.CreatorMetaData` The creation metadata. location : `~astropy.coordinates.EarthLocation` or str, optional The observatory location. deadtime_fraction : float The observation deadtime fraction. Default is 0. optional : dict, optional Additional optional metadata. """ _tag: ClassVar[Literal["observation"]] = "observation" obs_info: Optional[ObsInfoMetaData] = None pointing: Optional[PointingInfoMetaData] = None target: Optional[TargetMetaData] = None location: Optional[EarthLocationType] = None deadtime_fraction: float = Field(0.0, ge=0, le=1.0) time_info: Optional[TimeInfoMetaData] = None creation: Optional[CreatorMetaData] = None optional: Optional[dict] = None @classmethod def from_header(cls, header, format="gadf"): """Create and fill the observation metadata from the event list metadata. Parameters ---------- header : dict Input FITS header. format : str The header data format. Default is gadf. 
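        Returns
        -------
        meta : `~gammapy.data.ObservationMetaData`
            Observation metadata filled from the header.

        Examples
        --------
        A minimal sketch, assuming a GADF event file (the path below is
        illustrative):

        >>> from astropy.io import fits
        >>> from gammapy.data import ObservationMetaData
        >>> from gammapy.utils.scripts import make_path
        >>> path = make_path("$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz")  # doctest: +SKIP
        >>> meta = ObservationMetaData.from_header(fits.getheader(path, "EVENTS"))  # doctest: +SKIP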
""" meta = super().from_header(header, format) meta.creation = CreatorMetaData() # Include additional gadf keywords not specified as ObservationMetaData attributes optional_keywords = [ "OBSERVER", "EV_CLASS", "TELAPSE", "TELLIST", "N_TELS", "TASSIGN", "DST_VER", "ANA_VER", "CAL_VER", "CONV_DEP", "CONV_RA", "CONV_DEC", "TRGRATE", "ZTRGRATE", "MUONEFF", "BROKPIX", "AIRTEMP", "PRESSURE", "RELHUM", "NSBLEVEL", "CREATOR", "HDUVERS", ] optional = dict() for key in optional_keywords: if key in header.keys(): optional[key] = header[key] meta.optional = optional return meta class GTIMetaData(MetaData): """Metadata containing information about the GTI. Parameters ---------- reference_time : Time, str The GTI reference time. """ _tag: ClassVar[Literal["GTI"]] = "GTI" reference_time: Optional[TimeType] = None def from_header(cls, header, format="gadf"): meta = super().from_header(header, format) return meta class EventListMetaData(MetaData): """ Metadata containing information about the EventList. Parameters ---------- event_class : str The event class metadata. creation : `~gammapy.utils.metadata.CreatorMetaData` The creation metadata. """ _tag: ClassVar[Literal["EventList"]] = "eventlist" event_class: Optional[str] = None creation: Optional[CreatorMetaData] = None optional: Optional[dict] = None @classmethod def from_header(cls, header, format="gadf"): meta = super().from_header(header, format) # Include additional gadf keywords optional_keywords = [ "DST_VER", "ANA_VER", "CAL_VER", ] optional = dict() for key in optional_keywords: if key in header.keys(): optional[key] = header[key] meta.optional = optional return meta ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/obs_table.py0000644000175100001770000003441714721316200017272 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from collections import namedtuple import numpy as np from astropy.coordinates import Angle, SkyCoord from astropy.table import Table from astropy.units import Quantity, Unit from gammapy.utils.regions import SphericalCircleSkyRegion from gammapy.utils.scripts import make_path from gammapy.utils.testing import Checker from gammapy.utils.time import time_ref_from_dict __all__ = ["ObservationTable"] class ObservationTable(Table): """Observation table. Data format specification: :ref:`gadf:obs-index`. """ @classmethod def read(cls, filename, **kwargs): """Read an observation table from file. Parameters ---------- filename : `pathlib.Path` or str Filename. **kwargs : dict, optional Keyword arguments passed to `~astropy.table.Table.read`. 
""" return super().read(make_path(filename), **kwargs) @property def pointing_radec(self): """Pointing positions in ICRS as a `~astropy.coordinates.SkyCoord` object.""" return SkyCoord(self["RA_PNT"], self["DEC_PNT"], unit="deg", frame="icrs") @property def pointing_galactic(self): """Pointing positions in Galactic coordinates as a `~astropy.coordinates.SkyCoord` object.""" return SkyCoord( self["GLON_PNT"], self["GLAT_PNT"], unit="deg", frame="galactic" ) @property def time_ref(self): """Time reference as a `~astropy.time.Time` object.""" return time_ref_from_dict(self.meta) @property def time_start(self): """Observation start time as a `~astropy.time.Time` object.""" return self.time_ref + Quantity(self["TSTART"], "second") @property def time_stop(self): """Observation stop time as a `~astropy.time.Time` object.""" return self.time_ref + Quantity(self["TSTOP"], "second") def select_obs_id(self, obs_id): """Get `~gammapy.data.ObservationTable` containing only ``obs_id``. Raises KeyError if observation is not available. Parameters ---------- obs_id : int or list of int Observation ids. """ try: self.indices["OBS_ID"] except IndexError: self.add_index("OBS_ID") return self.__class__(self.loc["OBS_ID", obs_id]) def summary(self): """Summary information string.""" obs_name = self.meta.get( "OBSERVATORY_NAME", "N/A" ) # This is not GADF compliant if "N/A" in obs_name: obs_name = self.meta.get( "OBSERVATORY_NAME", self.meta.get("OBSERVER", "N/A") ) return ( f"Observation table:\n" f"Observatory name: {obs_name!r}\n" f"Number of observations: {len(self)}\n" ) def select_range(self, selection_variable, value_range, inverted=False): """Make an observation table, applying some selection. Generic function to apply a 1D box selection (min, max) to a table on any variable that is in the observation table and can be cast into a `~astropy.units.Quantity` object. If the range length is 0 (min = max), the selection is applied to the exact value indicated by the min value. This is useful for selection of exact values, for instance in discrete variables like the number of telescopes. If the inverted flag is activated, the selection is applied to keep all elements outside the selected range. Parameters ---------- selection_variable : str Name of variable to apply a cut (it should exist on the table). value_range : `~astropy.units.Quantity`-like Allowed range of values (min, max). The type should be consistent with the selection_variable. inverted : bool, optional Invert selection: keep all entries outside the (min, max) range. Default is False. Returns ------- obs_table : `~gammapy.data.ObservationTable` Observation table after selection. """ value_range = Quantity(value_range) # read values into a quantity in case units have to be taken into account value = Quantity(self[selection_variable]) mask = (value_range[0] <= value) & (value < value_range[1]) if np.allclose(value_range[0].value, value_range[1].value): mask = value_range[0] == value if inverted: mask = np.invert(mask) return self[mask] def select_time_range(self, time_range, partial_overlap=False, inverted=False): """Make an observation table, applying a time selection. Apply a 1D box selection (min, max) to a table on any time variable that is in the observation table. It supports absolute times in `~astropy.time.Time` format. If the inverted flag is activated, the selection is applied to keep all elements outside the selected range. Parameters ---------- time_range : `~astropy.time.Time` Allowed time range (min, max). 
partial_overlap : bool, optional Include partially overlapping observations. Default is False. inverted : bool, optional Invert selection: keep all entries outside the (min, max) range. Default is False. Returns ------- obs_table : `~gammapy.data.ObservationTable` Observation table after selection. """ tstart = self.time_start tstop = self.time_stop if not partial_overlap: mask1 = time_range[0] <= tstart mask2 = time_range[1] >= tstop else: mask1 = time_range[0] <= tstop mask2 = time_range[1] >= tstart mask = mask1 & mask2 if inverted: mask = np.invert(mask) return self[mask] def select_sky_circle(self, center, radius, inverted=False): """Make an observation table, applying a cone selection. Apply a selection based on the separation between the cone center and the observation pointing stored in the table. If the inverted flag is activated, the selection is applied to keep all elements outside the selected range. Parameters ---------- center : `~astropy.coordinates.SkyCoord` Cone center coordinate. radius : `~astropy.coordinates.Angle` Cone opening angle. The maximal separation allowed between the center and the observation pointing direction. inverted : bool, optional Invert selection: keep all entries outside the cone. Default is False. Returns ------- obs_table : `~gammapy.data.ObservationTable` Observation table after selection. """ region = SphericalCircleSkyRegion(center=center, radius=radius) mask = region.contains(self.pointing_radec) if inverted: mask = np.invert(mask) return self[mask] def select_observations(self, selections=None): """Select subset of observations from a list of selection criteria. Returns a new observation table representing the subset. There are 3 main kinds of selection criteria, according to the value of the **type** keyword in the **selection** dictionary: - circular region - time intervals (min, max) - intervals (min, max) on any other parameter present in the observation table, that can be cast into an `~astropy.units.Quantity` object Allowed selection criteria are interpreted using the following keywords in the **selection** dictionary under the **type** key. - ``sky_circle`` is a circular region centered in the coordinate marked by the **lon** and **lat** keywords, and radius **radius** - ``time_box`` is a 1D selection criterion acting on the observation start time (**TSTART**); the interval is set via the **time_range** keyword; uses `~gammapy.data.ObservationTable.select_time_range` - ``par_box`` is a 1D selection criterion acting on any parameter defined in the observation table that can be casted into an `~astropy.units.Quantity` object; the parameter name and interval can be specified using the keywords **variable** and **value_range** respectively; min = max selects exact values of the parameter; uses `~gammapy.data.ObservationTable.select_range` In all cases, the selection can be inverted by activating the **inverted** flag, in which case, the selection is applied to keep all elements outside the selected range. A few examples of selection criteria are given below. Parameters ---------- selections : list of dict, optional Dictionary of selection criteria. Default is None. Returns ------- obs_table : `~gammapy.data.ObservationTable` Observation table after selection. Examples -------- >>> from gammapy.data import ObservationTable >>> obs_table = ObservationTable.read('$GAMMAPY_DATA/hess-dl3-dr1/obs-index.fits.gz') >>> from astropy.coordinates import Angle >>> selection = dict(type='sky_circle', frame='galactic', ... lon=Angle(0, 'deg'), ... 
lat=Angle(0, 'deg'),
        ...                  radius=Angle(5, 'deg'),
        ...                  border=Angle(2, 'deg'))
        >>> selected_obs_table = obs_table.select_observations(selection)

        >>> from astropy.time import Time
        >>> time_range = Time(['2012-01-01T01:00:00', '2012-01-01T02:00:00'])
        >>> selection = dict(type='time_box', time_range=time_range)
        >>> selected_obs_table = obs_table.select_observations(selection)

        >>> value_range = Angle([60., 70.], 'deg')
        >>> selection = dict(type='par_box', variable='ALT_PNT', value_range=value_range)
        >>> selected_obs_table = obs_table.select_observations(selection)

        >>> selection = dict(type='par_box', variable='OBS_ID', value_range=[2, 5])
        >>> selected_obs_table = obs_table.select_observations(selection)

        >>> selection = dict(type='par_box', variable='N_TELS', value_range=[4, 4])
        >>> selected_obs_table = obs_table.select_observations(selection)
        """
        if isinstance(selections, dict):
            selections = [selections]

        obs_table = self
        for selection in selections:
            obs_table = obs_table._apply_simple_selection(selection)

        return obs_table

    def _apply_simple_selection(self, selection):
        """Select subset of observations from a single selection criterion."""
        selection = selection.copy()
        type = selection.pop("type")
        if type == "sky_circle":
            lon = Angle(selection.pop("lon"), "deg")
            lat = Angle(selection.pop("lat"), "deg")
            radius = Angle(selection.pop("radius"), "deg")
            radius += Angle(selection.pop("border", 0), "deg")
            center = SkyCoord(lon, lat, frame=selection.pop("frame"))
            return self.select_sky_circle(center, radius, **selection)
        elif type == "time_box":
            time_range = selection.pop("time_range")
            return self.select_time_range(time_range, **selection)
        elif type == "par_box":
            variable = selection.pop("variable")
            return self.select_range(variable, **selection)
        else:
            raise ValueError(f"Invalid selection type: {type}")


class ObservationTableChecker(Checker):
    """Observation table checker.

    Data format specification: :ref:`gadf:obs-index`.

    Parameters
    ----------
    obs_table : `~gammapy.data.ObservationTable`
        Observation table.
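    Examples
    --------
    A short sketch; `run` is inherited from `~gammapy.utils.testing.Checker`
    and yields one record per finding (index file path as in the
    `ObservationTable` examples):

    >>> from gammapy.data import ObservationTable
    >>> obs_table = ObservationTable.read("$GAMMAPY_DATA/hess-dl3-dr1/obs-index.fits.gz")  # doctest: +SKIP
    >>> records = list(ObservationTableChecker(obs_table).run())  # doctest: +SKIP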
""" CHECKS = { "meta": "check_meta", "columns": "check_columns", # "times": "check_times", # "coordinates_galactic": "check_coordinates_galactic", # "coordinates_altaz": "check_coordinates_altaz", } # accuracy = {"angle": Angle("1 arcsec"), "time": Quantity(1, "microsecond")} # https://gamma-astro-data-formats.readthedocs.io/en/latest/events/events.html#mandatory-header-keywords meta_required = [ "HDUCLASS", "HDUDOC", "HDUVERS", "HDUCLAS1", "HDUCLAS2", # https://gamma-astro-data-formats.readthedocs.io/en/latest/general/time.html#time-formats "MJDREFI", "MJDREFF", "TIMEUNIT", "TIMESYS", "TIMEREF", # https://gamma-astro-data-formats.readthedocs.io/en/latest/general/coordinates.html#coords-location "GEOLON", "GEOLAT", "ALTITUDE", ] _col = namedtuple("col", ["name", "unit"]) columns_required = [ _col(name="OBS_ID", unit=""), _col(name="RA_PNT", unit="deg"), _col(name="DEC_PNT", unit="deg"), _col(name="TSTART", unit="s"), _col(name="TSTOP", unit="s"), ] def __init__(self, obs_table): self.obs_table = obs_table @staticmethod def _record(level="info", msg=None): return {"level": level, "hdu": "obs-index", "msg": msg} def check_meta(self): m = self.obs_table.meta meta_missing = sorted(set(self.meta_required) - set(m)) if meta_missing: yield self._record( level="error", msg=f"Missing meta keys: {meta_missing!r}" ) if m.get("HDUCLAS1", "") != "INDEX": yield self._record(level="error", msg="HDUCLAS1 must be INDEX") if m.get("HDUCLAS2", "") != "OBS": yield self._record(level="error", msg="HDUCLAS2 must be OBS") def check_columns(self): t = self.obs_table if len(t) == 0: yield self._record(level="error", msg="Observation table has zero rows") for name, unit in self.columns_required: if name not in t.colnames: yield self._record(level="error", msg=f"Missing table column: {name!r}") else: if Unit(unit) != (t[name].unit or ""): yield self._record( level="error", msg=f"Invalid unit for column: {name!r}" ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/observations.py0000644000175100001770000007500214721316200020051 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import collections.abc import copy import html import inspect import itertools import logging import warnings from itertools import zip_longest import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord from astropy.io import fits from astropy.time import Time from astropy.units import Quantity from astropy.utils import lazyproperty import matplotlib.pyplot as plt from gammapy.utils.deprecation import GammapyDeprecationWarning from gammapy.utils.fits import LazyFitsData, earth_location_to_dict from gammapy.utils.metadata import CreatorMetaData, TargetMetaData, TimeInfoMetaData from gammapy.utils.scripts import make_path from gammapy.utils.testing import Checker from gammapy.utils.time import time_ref_to_dict, time_relative_to_ref from .event_list import EventList, EventListChecker from .filters import ObservationFilter from .gti import GTI from .metadata import ObservationMetaData from .pointing import FixedPointingInfo __all__ = ["Observation", "Observations"] log = logging.getLogger(__name__) class Observation: """In-memory observation. Parameters ---------- obs_id : int, optional Observation id. Default is None aeff : `~gammapy.irf.EffectiveAreaTable2D`, optional Effective area. Default is None. edisp : `~gammapy.irf.EnergyDispersion2D`, optional Energy dispersion. Default is None. 
psf : `~gammapy.irf.PSF3D`, optional Point spread function. Default is None. bkg : `~gammapy.irf.Background3D`, optional Background rate model. Default is None. rad_max : `~gammapy.irf.RadMax2D`, optional Only for point-like IRFs: RAD_MAX table (energy dependent RAD_MAX) For a fixed RAD_MAX, create a RadMax2D with a single bin. Default is None. gti : `~gammapy.data.GTI`, optional Table with GTI start and stop time. Default is None. events : `~gammapy.data.EventList`, optional Event list. Default is None. obs_filter : `ObservationFilter`, optional Observation filter. Default is None. pointing : `~gammapy.data.FixedPointingInfo`, optional Pointing information. Default is None. location : `~astropy.coordinates.EarthLocation`, optional Earth location of the observatory. Default is None. """ aeff = LazyFitsData(cache=False) edisp = LazyFitsData(cache=False) psf = LazyFitsData(cache=False) _bkg = LazyFitsData(cache=False) _rad_max = LazyFitsData(cache=True) _events = LazyFitsData(cache=False) _gti = LazyFitsData(cache=True) _pointing = LazyFitsData(cache=True) def __init__( self, obs_id=None, meta=None, gti=None, aeff=None, edisp=None, psf=None, bkg=None, rad_max=None, events=None, obs_filter=None, pointing=None, location=None, ): self.obs_id = obs_id self.aeff = aeff self.edisp = edisp self.psf = psf self._bkg = bkg self._rad_max = rad_max self._gti = gti self._events = events self._pointing = pointing self._location = location # this is part of the meta or is it data? self.obs_filter = obs_filter or ObservationFilter() self._meta = meta def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def bkg(self): """Background of the observation.""" from gammapy.irf import FoVAlignment bkg = self._bkg # used for backward compatibility of old HESS data try: if ( bkg and self._meta and self._meta.optional and self._meta.optional["CREATOR"] == "SASH FITS::EventListWriter" and self._meta.optional["HDUVERS"] == "0.2" ): bkg._fov_alignment = FoVAlignment.REVERSE_LON_RADEC except KeyError: pass return bkg @bkg.setter def bkg(self, value): self._bkg = value @property def meta(self): """Return metadata container.""" if self._meta is None and self.events: self._meta = ObservationMetaData.from_header(self.events.table.meta) return self._meta @property def rad_max(self): """Rad max IRF. None if not available.""" # prevent circular import from gammapy.irf import RadMax2D if self._rad_max is not None: return self._rad_max # load once to avoid trigger lazy loading it three times aeff = self.aeff if aeff is not None and aeff.is_pointlike: self._rad_max = RadMax2D.from_irf(aeff) return self._rad_max edisp = self.edisp if edisp is not None and edisp.is_pointlike: self._rad_max = RadMax2D.from_irf(self.edisp) return self._rad_max @property def available_hdus(self): """Which HDUs are available.""" available_hdus = [] keys = ["_events", "_gti", "aeff", "edisp", "psf", "_bkg", "_rad_max"] hdus = ["events", "gti", "aeff", "edisp", "psf", "bkg", "rad_max"] for key, hdu in zip(keys, hdus): available = self.__dict__.get(key, False) available_hdu = self.__dict__.get(f"_{hdu}_hdu", False) available_hdu_ = self.__dict__.get(f"_{key}_hdu", False) if available or available_hdu or available_hdu_: available_hdus.append(hdu) return available_hdus @property def available_irfs(self): """Which IRFs are available.""" return [_ for _ in self.available_hdus if _ not in ["events", "gti"]] @property def events(self): """Event list of the observation as an `~gammapy.data.EventList`.""" events = self.obs_filter.filter_events(self._events) return events @events.setter def events(self, value): if not isinstance(value, EventList): raise TypeError(f"events must be an EventList instance, got: {type(value)}") self._events = value @property def gti(self): """GTI of the observation as a `~gammapy.data.GTI`.""" gti = self.obs_filter.filter_gti(self._gti) return gti @staticmethod def _get_obs_info( pointing, deadtime_fraction, time_start, time_stop, reference_time, location ): """Create observation information dictionary from in memory data.""" obs_info = { "DEADC": 1 - deadtime_fraction, } if isinstance(pointing, SkyCoord): obs_info["RA_PNT"] = pointing.icrs.ra.deg obs_info["DEC_PNT"] = pointing.icrs.dec.deg obs_info.update(time_ref_to_dict(reference_time)) obs_info["TSTART"] = time_relative_to_ref(time_start, obs_info).to_value(u.s) obs_info["TSTOP"] = time_relative_to_ref(time_stop, obs_info).to_value(u.s) if location is not None: obs_info.update(earth_location_to_dict(location)) return obs_info @classmethod def create( cls, pointing, location=None, obs_id=0, livetime=None, tstart=None, tstop=None, irfs=None, deadtime_fraction=0.0, reference_time=Time("2000-01-01 00:00:00"), ): """Create an observation. User must either provide the livetime, or the start and stop times. Parameters ---------- pointing : `~gammapy.data.FixedPointingInfo` or `~astropy.coordinates.SkyCoord` Pointing information. location : `~astropy.coordinates.EarthLocation`, optional Earth location of the observatory. Default is None. obs_id : int, optional Observation ID as identifier. Default is 0. 
        livetime : `~astropy.units.Quantity`, optional
            Livetime exposure of the simulated observation. Default is None.
        tstart : `~astropy.time.Time` or `~astropy.units.Quantity`, optional
            Start time of observation as `~astropy.time.Time` or duration
            relative to `reference_time`. Default is None.
        tstop : `~astropy.time.Time` or `~astropy.units.Quantity`, optional
            Stop time of observation as `~astropy.time.Time` or duration
            relative to `reference_time`. Default is None.
        irfs : dict, optional
            IRFs used for simulating the observation: `bkg`, `aeff`, `psf`, `edisp`,
            `rad_max`. Default is None.
        deadtime_fraction : float, optional
            Deadtime fraction. Default is 0.
        reference_time : `~astropy.time.Time`, optional
            The reference time to use in the GTI definition.
            Default is `~astropy.time.Time("2000-01-01 00:00:00")`.

        Returns
        -------
        obs : `~gammapy.data.Observation`
            Observation.
        """
        if tstart is None:
            tstart = reference_time.copy()

        if tstop is None:
            tstop = tstart + Quantity(livetime)

        gti = GTI.create(tstart, tstop, reference_time=reference_time)

        obs_info = cls._get_obs_info(
            pointing=pointing,
            deadtime_fraction=deadtime_fraction,
            time_start=gti.time_start[0],
            time_stop=gti.time_stop[0],
            reference_time=reference_time,
            location=location,
        )

        time_info = TimeInfoMetaData(
            time_start=gti.time_start[0],
            time_stop=gti.time_stop[-1],
            reference_time=reference_time,
        )

        meta = ObservationMetaData(
            deadtime_fraction=deadtime_fraction,
            location=location,
            time_info=time_info,
            creation=CreatorMetaData(),
            target=TargetMetaData(),
        )

        if not isinstance(pointing, FixedPointingInfo):
            warnings.warn(
                "Pointing will be required to be provided as FixedPointingInfo",
                GammapyDeprecationWarning,
            )
            pointing = FixedPointingInfo.from_fits_header(obs_info)

        return cls(
            obs_id=obs_id,
            meta=meta,
            gti=gti,
            aeff=irfs.get("aeff"),
            bkg=irfs.get("bkg"),
            edisp=irfs.get("edisp"),
            psf=irfs.get("psf"),
            rad_max=irfs.get("rad_max"),
            pointing=pointing,
            location=location,
        )

    @property
    def tstart(self):
        """Observation start time as a `~astropy.time.Time` object."""
        return self.gti.time_start[0]

    @property
    def tstop(self):
        """Observation stop time as a `~astropy.time.Time` object."""
        return self.gti.time_stop[0]

    @property
    def tmid(self):
        """Midpoint between start and stop time as a `~astropy.time.Time` object."""
        return self.tstart + 0.5 * (self.tstop - self.tstart)

    @property
    def observation_time_duration(self):
        """Observation time duration in seconds as a `~astropy.units.Quantity`.

        The wall time, including dead-time.
        """
        return self.gti.time_sum

    @property
    def observation_live_time_duration(self):
        """Live-time duration in seconds as a `~astropy.units.Quantity`.

        The dead-time-corrected observation time. Computed as
        ``t_live = t_observation * (1 - f_dead)``
        where ``f_dead`` is the dead-time fraction.
        """
        return (
            self.observation_time_duration
            * (1 - self.observation_dead_time_fraction)
            * self.obs_filter.livetime_fraction
        )

    @property
    def observation_dead_time_fraction(self):
        """Dead-time fraction (float).

        Defined as dead-time over observation time.

        Dead-time is defined as the time during the observation
        where the detector didn't record events:
        https://en.wikipedia.org/wiki/Dead_time
        https://ui.adsabs.harvard.edu/abs/2004APh....22..285F

        The dead-time fraction is used in the live-time computation,
        which in turn is used in the exposure and flux computation.
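        Examples
        --------
        Illustrative numbers, not from a real observation: for a 1800 s
        observation with a dead-time fraction of 0.05, the live time is

        >>> 1800 * (1 - 0.05)
        1710.0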
""" return self.meta.deadtime_fraction @property def pointing(self): """Get the pointing for the observation as a `~gammapy.data.FixedPointingInfo` object.""" if self._pointing is None: self._pointing = FixedPointingInfo.from_fits_header(self.events.table.meta) return self._pointing def get_pointing_altaz(self, time): """Get the pointing in alt-az for given time.""" return self.pointing.get_altaz(time, self.observatory_earth_location) def get_pointing_icrs(self, time): """Get the pointing in ICRS for given time.""" return self.pointing.get_icrs(time, self.observatory_earth_location) @property def observatory_earth_location(self): """Observatory location as an `~astropy.coordinates.EarthLocation` object.""" if self._location is None: return self.meta.location return self._location @lazyproperty def target_radec(self): """Target RA / DEC sky coordinates as a `~astropy.coordinates.SkyCoord` object.""" return self.meta.target.position def __str__(self): pointing = self.get_pointing_icrs(self.tmid) ra = pointing.ra.deg dec = pointing.dec.deg pointing = f"{ra:.1f} deg, {dec:.1f} deg\n" # TODO: Which target was observed? # TODO: print info about available HDUs for this observation ... return ( f"{self.__class__.__name__}\n\n" f"\tobs id : {self.obs_id} \n " f"\ttstart : {self.tstart.mjd:.2f}\n" f"\ttstop : {self.tstop.mjd:.2f}\n" f"\tduration : {self.observation_time_duration:.2f}\n" f"\tpointing (icrs) : {pointing}\n" f"\tdeadtime fraction : {self.observation_dead_time_fraction:.1%}\n" ) def check(self, checks="all"): """Run checks. This is a generator that yields a list of dictionary. """ checker = ObservationChecker(self) return checker.run(checks=checks) def peek(self, figsize=(15, 10)): """Quick-look plots in a few panels. Parameters ---------- figsize : tuple, optional Figure size. Default is (15, 10). """ plottable_hds = ["events", "aeff", "psf", "edisp", "bkg", "rad_max"] plot_hdus = list(set(plottable_hds) & set(self.available_hdus)) plot_hdus.sort() n_irfs = len(plot_hdus) nrows = n_irfs // 2 ncols = 2 + n_irfs % 2 fig, axes = plt.subplots( nrows=nrows, ncols=ncols, figsize=figsize, gridspec_kw={"wspace": 0.3, "hspace": 0.3}, ) for idx, (ax, name) in enumerate(zip_longest(axes.flat, plot_hdus)): if name == "aeff": self.aeff.plot(ax=ax) ax.set_title("Effective area") if name == "bkg": bkg = self.bkg if not bkg.has_offset_axis: bkg = bkg.to_2d() bkg.plot(ax=ax) ax.set_title("Background rate") if name == "psf": self.psf.plot_containment_radius_vs_energy(ax=ax) ax.set_title("Point spread function") if name == "edisp": self.edisp.plot_bias(ax=ax, add_cbar=True) ax.set_title("Energy dispersion") if name == "rad_max": self.rad_max.plot_rad_max_vs_energy(ax=ax) ax.set_title("Rad max") if name == "events": m = self.events._counts_image(allsky=False) ax.remove() ax = fig.add_subplot(nrows, ncols, idx + 1, projection=m.geom.wcs) m.plot(ax=ax, stretch="sqrt", vmin=0, add_cbar=True) ax.set_title("Events") if name is None: ax.set_visible(False) def select_time(self, time_interval): """Select a time interval of the observation. Parameters ---------- time_interval : `astropy.time.Time` Start and stop time of the selected time interval. For now, we only support a single time interval. Returns ------- new_obs : `~gammapy.data.Observation` A new observation instance of the specified time interval. 
""" new_obs_filter = self.obs_filter.copy() new_obs_filter.time_filter = time_interval obs = copy.deepcopy(self) obs.obs_filter = new_obs_filter return obs @classmethod def read(cls, event_file, irf_file=None, checksum=False): """Create an Observation from a Event List and an (optional) IRF file. Parameters ---------- event_file : str or `~pathlib.Path` Path to the FITS file containing the event list and the GTI. irf_file : str or `~pathlib.Path`, optional Path to the FITS file containing the IRF components. Default is None. If None, the IRFs will be read from the event file. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- observation : `~gammapy.data.Observation` Observation with the events and the IRF read from the file. """ from gammapy.irf.io import load_irf_dict_from_file events = EventList.read(event_file, checksum=checksum) gti = GTI.read(event_file, checksum=checksum) irf_file = irf_file if irf_file is not None else event_file irf_dict = load_irf_dict_from_file(irf_file) obs_info = events.table.meta meta = ObservationMetaData.from_header(obs_info) return cls( events=events, gti=gti, obs_id=meta.obs_info.obs_id, pointing=FixedPointingInfo.from_fits_header(obs_info), meta=meta, location=meta.location, **irf_dict, ) def write( self, path, overwrite=False, format="gadf", include_irfs=True, checksum=False ): """ Write this observation into `~pathlib.Path` using the specified format. Parameters ---------- path : str or `~pathlib.Path` Path for the output file. overwrite : bool, optional Overwrite existing file. Default is False. format : {"gadf"} Output format, currently only "gadf" is supported. Default is "gadf". include_irfs : bool, optional Whether to include irf components in the output file. Default is True. checksum : bool, optional When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. """ if format != "gadf": raise ValueError(f'Only the "gadf" format is supported, got {format}') path = make_path(path) primary = fits.PrimaryHDU() primary.header.update(self.meta.creation.to_header(format)) hdul = fits.HDUList([primary]) events = self.events if events is not None: events_hdu = events.to_table_hdu(format=format) events_hdu.header.update( self.pointing.to_fits_header(time_ref=events.time_ref) ) hdul.append(events_hdu) gti = self.gti if gti is not None: hdul.append(gti.to_table_hdu(format=format)) if include_irfs: for irf_name in self.available_irfs: irf = getattr(self, irf_name) if irf is not None: hdul.append(irf.to_table_hdu(format="gadf-dl3")) hdul.writeto(path, overwrite=overwrite, checksum=checksum) def copy(self, in_memory=False, **kwargs): """Copy observation. Overwriting `Observation` arguments requires the ``in_memory`` argument to be true. Parameters ---------- in_memory : bool, optional Copy observation in memory. Default is False. **kwargs : dict, optional Keyword arguments passed to `Observation`. Examples -------- >>> from gammapy.data import Observation >>> obs = Observation.read("$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz") >>> obs_copy = obs.copy(in_memory=True, obs_id=1234) >>> print(obs_copy) # doctest: +SKIP Returns ------- obs : `Observation` Copied observation. 
""" if in_memory: argnames = inspect.getfullargspec(self.__init__).args # TODO: remove once obs_info is removed from the list of arguments in __init__ argnames.remove("self") for name in argnames: if name == "location": attr = "observatory_earth_location" else: attr = name value = getattr(self, attr) kwargs.setdefault(name, copy.deepcopy(value)) return self.__class__(**kwargs) if kwargs: raise ValueError("Overwriting arguments requires to set 'in_memory=True'") return copy.deepcopy(self) class Observations(collections.abc.MutableSequence): """Container class that holds a list of observations. Parameters ---------- observations : list A list of `~gammapy.data.Observation`. """ def __init__(self, observations=None): self._observations = [] if observations is None: observations = [] for obs in observations: self.append(obs) def __getitem__(self, item): if isinstance(item, (list, np.ndarray)) and all( isinstance(x, str) for x in item ): return self.__class__([self._observations[self.index(_)] for _ in item]) elif isinstance(item, (slice, list, np.ndarray)): return self.__class__(list(np.array(self._observations)[item])) else: return self._observations[self.index(item)] def __delitem__(self, key): del self._observations[self.index(key)] def __setitem__(self, key, obs): if isinstance(obs, Observation): if obs in self: log.warning( f"Observation with obs_id {obs.obs_id} already belongs to Observations." ) self._observations[self.index(key)] = obs else: raise TypeError(f"Invalid type: {type(obs)!r}") def insert(self, idx, obs): if isinstance(obs, Observation): if obs in self: log.warning( f"Observation with obs_id {obs.obs_id} already belongs to Observations." ) self._observations.insert(idx, obs) else: raise TypeError(f"Invalid type: {type(obs)!r}") def __len__(self): return len(self._observations) def __str__(self): s = self.__class__.__name__ + "\n" s += "Number of observations: {}\n".format(len(self)) for obs in self: s += str(obs) return s def index(self, key): if isinstance(key, (int, slice)): return key elif isinstance(key, str): return self.ids.index(key) elif isinstance(key, Observation): return self._observations.index(key) else: raise TypeError(f"Invalid type: {type(key)!r}") @property def ids(self): """List of observation IDs (`list`).""" return [str(obs.obs_id) for obs in self] def select_time(self, time_intervals): """Select a time interval of the observations. Parameters ---------- time_intervals : `astropy.time.Time` or list of `astropy.time.Time` List of start and stop time of the time intervals or one time interval. Returns ------- new_observations : `~gammapy.data.Observations` A new Observations instance of the specified time intervals. """ new_obs_list = [] if isinstance(time_intervals, Time): time_intervals = [time_intervals] for time_interval in time_intervals: for obs in self: if (obs.tstart < time_interval[1]) & (obs.tstop > time_interval[0]): new_obs = obs.select_time(time_interval) new_obs_list.append(new_obs) return self.__class__(new_obs_list) def _ipython_key_completions_(self): return self.ids def group_by_label(self, labels): """Split observations in multiple groups of observations. Parameters ---------- labels : list or `numpy.ndarray` Array of group labels. Returns ------- obs_clusters : dict of `~gammapy.data.Observations` Dictionary of Observations instance, one instance for each group. 
""" obs_groups = {} for label in np.unique(labels): observations = self.__class__( [obs for k, obs in enumerate(self) if labels[k] == label] ) obs_groups[f"group_{label}"] = observations return obs_groups @classmethod def from_stack(cls, observations_list): # TODO : Do more check when stacking observations when we have metadata. """Create a new `Observations` instance by concatenating a list of `Observations` objects. Parameters ---------- observations_list : list of `~gammapy.data.Observations` The list of `Observations` to stack. Returns ------- observations : `~gammapy.data.Observations` The `Observations` object resulting from stacking all the `Observations` in `observation_list`. """ obs = itertools.chain(*observations_list) return cls(list(obs)) def in_memory_generator(self): """A generator that iterates over observation. Yield an in memory copy of the observation.""" for obs in self: obs_copy = obs.copy(in_memory=True) yield obs_copy class ObservationChecker(Checker): """Check an observation. Checks data format and a bit about the content. """ CHECKS = { "events": "check_events", "gti": "check_gti", "aeff": "check_aeff", "edisp": "check_edisp", "psf": "check_psf", } def __init__(self, observation): self.observation = observation def _record(self, level="info", msg=None): return {"level": level, "obs_id": self.observation.obs_id, "msg": msg} def check_events(self): yield self._record(level="debug", msg="Starting events check") try: events = self.observation.events except Exception: yield self._record(level="warning", msg="Loading events failed") return yield from EventListChecker(events).run() # TODO: split this out into a GTIChecker def check_gti(self): yield self._record(level="debug", msg="Starting gti check") try: gti = self.observation.gti except Exception: yield self._record(level="warning", msg="Loading GTI failed") return if len(gti.table) == 0: yield self._record(level="error", msg="GTI table has zero rows") columns_required = ["START", "STOP"] for name in columns_required: if name not in gti.table.colnames: yield self._record(level="error", msg=f"Missing table column: {name!r}") # TODO: Check that header keywords agree with table entries # TSTART, TSTOP, MJDREFI, MJDREFF # Check that START and STOP times are consecutive # times = np.ravel(self.table['START'], self.table['STOP']) # # TODO: not sure this is correct ... add test with a multi-gti table from Fermi. # if not np.all(np.diff(times) >= 0): # yield 'GTIs are not consecutive or sorted.' # TODO: add reference times for all instruments and check for this # Use TELESCOP header key to check which instrument it is. def _check_times(self): """Check if various times are consistent. The headers and tables of the FITS EVENTS and GTI extension contain various observation and event time information. 
""" # http://fermi.gsfc.nasa.gov/ssc/data/analysis/documentation/Cicerone/Cicerone_Data/Time_in_ScienceTools.html # https://hess-confluence.desy.de/confluence/display/HESS/HESS+FITS+data+-+References+and+checks#HESSFITSdata-Referencesandchecks-Time telescope_met_refs = { "FERMI": Time("2001-01-01T00:00:00"), "HESS": Time("2001-01-01T00:00:00"), } meta = self.dset.event_list.table.meta telescope = meta["TELESCOP"] if telescope in telescope_met_refs.keys(): dt = self.time_ref - telescope_met_refs[telescope] if dt > self.accuracy["time"]: yield self._record( level="error", msg="Reference time incorrect for telescope" ) def check_aeff(self): yield self._record(level="debug", msg="Starting aeff check") try: aeff = self.observation.aeff except Exception: yield self._record(level="warning", msg="Loading aeff failed") return # Check that thresholds are meaningful for aeff if ( "LO_THRES" in aeff.meta and "HI_THRES" in aeff.meta and aeff.meta["LO_THRES"] >= aeff.meta["HI_THRES"] ): yield self._record( level="error", msg="LO_THRES >= HI_THRES in effective area meta data" ) # Check that data isn't all null if np.max(aeff.data.data) <= 0: yield self._record( level="error", msg="maximum entry of effective area is <= 0" ) def check_edisp(self): yield self._record(level="debug", msg="Starting edisp check") try: edisp = self.observation.edisp except Exception: yield self._record(level="warning", msg="Loading edisp failed") return # Check that data isn't all null if np.max(edisp.data.data) <= 0: yield self._record(level="error", msg="maximum entry of edisp is <= 0") def check_psf(self): yield self._record(level="debug", msg="Starting psf check") try: self.observation.psf except Exception: yield self._record(level="warning", msg="Loading psf failed") return ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/pointing.py0000644000175100001770000005763114721316200017172 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import logging import warnings from enum import Enum, auto import numpy as np import scipy.interpolate import astropy.units as u from astropy.coordinates import ( ICRS, AltAz, BaseCoordinateFrame, CartesianRepresentation, SkyCoord, UnitSphericalRepresentation, ) from astropy.io import fits from astropy.table import Table from astropy.units import Quantity from astropy.utils import lazyproperty from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.deprecation import GammapyDeprecationWarning from gammapy.utils.fits import earth_location_from_dict, earth_location_to_dict from gammapy.utils.scripts import make_path from gammapy.utils.time import time_ref_from_dict, time_ref_to_dict, time_to_fits_header log = logging.getLogger(__name__) __all__ = ["FixedPointingInfo", "PointingInfo", "PointingMode"] def _check_coord_frame(coord_or_frame, expected_frame, name): """Check if a skycoord or frame is given in expected_frame.""" is_coord = isinstance(coord_or_frame, SkyCoord) is_frame = isinstance(coord_or_frame, BaseCoordinateFrame) if not (is_frame or is_coord): raise TypeError( f"{name} must be a 'astropy.coordinates.SkyCoord'" "or {expected_frame} instance" ) if is_coord: frame = coord_or_frame.frame else: frame = coord_or_frame if not isinstance(frame, expected_frame): raise ValueError( f"{name} is in wrong frame, expected {expected_frame}, got {frame}" ) class PointingMode(Enum): """ Describes the behavior of the pointing during the observation. 
See :ref:`gadf:iact-events-obs-mode`. For ground-based instruments, the most common options will be: * POINTING: The telescope observes a fixed position in the ICRS frame * DRIFT: The telescope observes a fixed position in the alt-az frame Gammapy only supports fixed pointing positions over the whole observation (either in equatorial or horizontal coordinates). OGIP also defines RASTER, SLEW and SCAN. These cannot be treated using a fixed pointing position in either frame, so they would require the pointing table, which is at the moment not supported by gammapy. Data releases based on gadf v0.2 do not have consistent OBS_MODE keyword e.g. the H.E.S.S. data releases uses the not-defined value "WOBBLE". For all gadf data, we assume OBS_MODE to be the same as "POINTING", unless it is set to "DRIFT", making the assumption that one observation only contains a single fixed position. """ POINTING = auto() DRIFT = auto() @staticmethod def from_gadf_string(val): """Parse a string from the GADF header into a PointingMode.""" # OBS_MODE is not well-defined and not mandatory in GADF 0.2 # We always assume that the observations are pointing observations # unless the OBS_MODE is set to DRIFT if val.upper() == "DRIFT": return PointingMode.DRIFT else: return PointingMode.POINTING class FixedPointingInfo: """IACT array pointing info. Data format specification: :ref:`gadf:iact-pnt` Parameters ---------- meta : `~astropy.table.Table.meta` Meta header info from Table on pointing. Passing this is deprecated, provide ``mode`` and ``fixed_icrs`` or ``fixed_altaz`` instead or use `FixedPointingInfo.from_fits_header` instead. mode : `PointingMode` How the telescope was pointing during the observation. fixed_icrs : `~astropy.coordinates.SkyCoord`, optional The coordinates of the observation in ICRS as a `~astropy.coordinates.SkyCoord` object. Default is None. Required if mode is `PointingMode.POINTING`. fixed_altaz : `~astropy.coordinates.SkyCoord`, optional The coordinates of the observation in alt-az as a `~astropy.coordinates.SkyCoord` object. Default is None. Required if mode is `PointingMode.DRIFT`. Examples -------- >>> from gammapy.data import FixedPointingInfo, PointingMode >>> from astropy.coordinates import SkyCoord >>> import astropy.units as u >>> fixed_icrs = SkyCoord(83.633 * u.deg, 22.014 * u.deg, frame="icrs") >>> pointing_info = FixedPointingInfo(fixed_icrs=fixed_icrs) >>> print(pointing_info) FixedPointingInfo: mode: PointingMode.POINTING coordinates: """ def __init__( self, meta=None, *, mode=None, fixed_icrs=None, fixed_altaz=None, # these have nothing really to do with pointing_info # needed for backwards compatibility but should be removed and accessed # from the observation, not the pointing info. location=None, time_start=None, time_stop=None, time_ref=None, legacy_altaz=None, # store altaz given for mode=POINTING separately so it can be removed easily ): self._meta = None # TODO: for backwards compatibility, remove in 2.0 # and make other keywards required if meta is not None: warnings.warn( "Initializing a FixedPointingInfo using a `meta` dict is deprecated", GammapyDeprecationWarning, ) self._meta = meta self.__dict__.update(self.from_fits_header(meta).__dict__) return if mode is not None: warnings.warn( "Passing mode is deprecated and the argument will be removed in Gammapy 1.3." 
" pointing mode is deduced from whether fixed_icrs or fixed_altaz is given", GammapyDeprecationWarning, ) self._location = location self._time_start = time_start self._time_stop = time_stop self._time_ref = time_ref self._legacy_altaz = legacy_altaz or AltAz(np.nan * u.deg, np.nan * u.deg) if fixed_icrs is not None and fixed_altaz is not None: raise ValueError("fixed_icrs and fixed_altaz are mutually exclusive") if fixed_icrs is not None: _check_coord_frame(fixed_icrs, ICRS, "fixed_icrs") if np.isnan(fixed_icrs.ra.value) or np.isnan(fixed_icrs.dec.value): warnings.warn( "In future, fixed_icrs must have non-nan values", GammapyDeprecationWarning, ) self._mode = PointingMode.POINTING self._fixed_icrs = fixed_icrs self._fixed_altaz = None else: _check_coord_frame(fixed_altaz, AltAz, "fixed_altaz") self._mode = PointingMode.DRIFT self._fixed_icrs = None self._fixed_altaz = fixed_altaz def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @classmethod def from_fits_header(cls, header): """ Parse `~gammapy.data.FixedPointingInfo` from the given FITS header. Parameters ---------- header : `astropy.fits.Header` Header to parse, e.g. from a GADF EVENTS HDU. Currently, only the GADF format is supported. Returns ------- pointing : `~gammapy.data.FixedPointingInfo` The FixedPointingInfo instance filled from the given header. """ obs_mode = header.get("OBS_MODE", "POINTING") mode = PointingMode.from_gadf_string(obs_mode) try: location = earth_location_from_dict(header) except KeyError: location = None # we allow missing RA_PNT / DEC_PNT in POINTING for some reason... # FIXME: actually enforce this to be present instead of using nan ra = u.Quantity(header.get("RA_PNT", np.nan), u.deg) dec = u.Quantity(header.get("DEC_PNT", np.nan), u.deg) alt = header.get("ALT_PNT") az = header.get("AZ_PNT") legacy_altaz = None fixed_icrs = None pointing_altaz = None # we can be more strict with DRIFT, as support was only added recently if mode is PointingMode.DRIFT: if alt is None or az is None: raise IOError( "Keywords ALT_PNT and AZ_PNT are required for OBSMODE=DRIFT" ) pointing_altaz = AltAz(alt=alt * u.deg, az=az * u.deg) else: fixed_icrs = SkyCoord(ra, dec) if np.isnan(ra.value) or np.isnan(dec.value): warnings.warn( "RA_PNT / DEC_PNT will be required in a future version of" " gammapy for pointing-mode POINTING", GammapyDeprecationWarning, ) # store given altaz also for POINTING for backwards compatibility, # FIXME: remove in 2.0 if alt is not None and az is not None: legacy_altaz = AltAz(alt=alt * u.deg, az=az * u.deg) time_start = header.get("TSTART") time_stop = header.get("TSTOP") time_ref = None if time_start is not None or time_stop is not None: time_ref = time_ref_from_dict(header) time_unit = u.Unit(header.get("TIMEUNIT", "s"), format="fits") if time_start is not None: time_start = time_ref + u.Quantity(time_start, time_unit) if time_stop is not None: time_stop = time_ref + u.Quantity(time_stop, time_unit) return cls( location=location, fixed_icrs=fixed_icrs, fixed_altaz=pointing_altaz, time_start=time_start, time_stop=time_stop, time_ref=time_ref, legacy_altaz=legacy_altaz, ) def to_fits_header(self, format="gadf", version="0.3", time_ref=None): """ Convert this FixedPointingInfo object into a fits header for the given format. Parameters ---------- format : {"gadf"} Format, currently only "gadf" is supported. Default is "gadf". version : str, optional Version of the ``format``, this function currently supports gadf versions 0.2 and 0.3. Default is "0.3". time_ref : `astropy.time.Time`, optional Reference time for storing the time related information in fits format. Default is None. Returns ------- header : `astropy.fits.Header` Header with fixed pointing information filled for the requested format. 
""" if format != "gadf": raise ValueError(f'Only the "gadf" format supported, got {format}') if version not in {"0.2", "0.3"}: raise ValueError(f"Unsupported version {version} for format {format}") if self.mode == PointingMode.DRIFT and version == "0.2": raise ValueError("mode=DRIFT is only supported by GADF 0.3") header = fits.Header() if self.mode is PointingMode.POINTING: header["OBS_MODE"] = "POINTING" header["RA_PNT"] = self.fixed_icrs.ra.deg, u.deg.to_string("fits") header["DEC_PNT"] = self.fixed_icrs.dec.deg, u.deg.to_string("fits") elif self.mode is PointingMode.DRIFT: header["OBS_MODE"] = "DRIFT" header["AZ_PNT"] = self.fixed_altaz.az.deg, u.deg.to_string("fits") header["ALT_PNT"] = self.fixed_altaz.alt.deg, u.deg.to_string("fits") # FIXME: remove in 2.0 if self._legacy_altaz is not None and not np.isnan( self._legacy_altaz.alt.value ): header["AZ_PNT"] = self._legacy_altaz.az.deg, u.deg.to_string("fits") header["ALT_PNT"] = self._legacy_altaz.alt.deg, u.deg.to_string("fits") if self._time_start is not None: header["TSTART"] = time_to_fits_header(self._time_start, epoch=time_ref) if self._time_stop is not None: header["TSTOP"] = time_to_fits_header(self._time_stop, epoch=time_ref) if self._time_start is not None or self._time_stop is not None: header.update(time_ref_to_dict(time_ref)) if self._location is not None: header.update(earth_location_to_dict(self._location)) return header @classmethod def read(cls, filename, hdu="EVENTS"): """Read pointing information table from file to obtain the metadata. Parameters ---------- filename : str Filename. hdu : int or str, optional HDU number or name. Default is "EVENTS". Returns ------- pointing_info : `PointingInfo` Pointing information. """ filename = make_path(filename) header = fits.getheader(filename, extname=hdu) return cls.from_fits_header(header) @property def mode(self): """See `PointingMode`, if not present, assume POINTING.""" return self._mode @property def fixed_altaz(self): """The fixed coordinates of the observation in alt-az as a `~astropy.coordinates.SkyCoord` object. None if not a DRIFT observation. """ return self._fixed_altaz @property def fixed_icrs(self): """ The fixed coordinates of the observation in ICRS as a `~astropy.coordinates.SkyCoord` object. None if not a POINTING observation. """ return self._fixed_icrs def get_icrs(self, obstime=None, location=None) -> SkyCoord: """ Get the pointing position in ICRS frame for a given time. If the observation was performed tracking a fixed position in ICRS, the icrs pointing is returned with the given obstime attached. If the observation was performed in drift mode, the fixed alt-az coordinates are transformed to ICRS using the observation location and the given time. Parameters ---------- obstime : `astropy.time.Time`, optional Time for which to get the pointing position in ICRS frame. Default is None. location : `astropy.coordinates.EarthLocation`, optional Observatory location, only needed for drift observations to transform from horizontal coordinates to ICRS. Default is None. Returns ------- icrs : `astropy.coordinates.SkyCoord` Pointing position in ICRS frame. 
""" if self.mode == PointingMode.POINTING: return SkyCoord(self._fixed_icrs.data, location=location, obstime=obstime) if self.mode == PointingMode.DRIFT: if obstime is None: obstime = self.obstime return self.get_altaz(obstime, location=location).icrs raise ValueError(f"Unsupported pointing mode: {self.mode}.") def get_altaz(self, obstime=None, location=None) -> SkyCoord: """ Get the pointing position in alt-az frame for a given time. If the observation was performed tracking a fixed position in ICRS, the icrs pointing is transformed at the given time using the location of the observation. If the observation was performed in drift mode, the fixed alt-az coordinate is returned with `obstime` attached. Parameters ---------- obstime : `astropy.time.Time`, optional Time for which to get the pointing position in alt-az frame. Default is None. location : `astropy.coordinates.EarthLocation`, optional Observatory location, only needed for pointing observations to transform from ICRS to horizontal coordinates. Default is None. Returns ------- altaz : `astropy.coordinates.SkyCoord` Pointing position in alt-az frame. """ location = location if location is not None else self._location frame = AltAz(location=location, obstime=obstime) if self.mode == PointingMode.POINTING: return self.fixed_icrs.transform_to(frame) if self.mode == PointingMode.DRIFT: # see https://github.com/astropy/astropy/issues/12965 alt = self.fixed_altaz.alt az = self.fixed_altaz.az return SkyCoord( alt=u.Quantity( np.full(obstime.shape, alt.deg), u.deg, copy=COPY_IF_NEEDED ), az=u.Quantity( np.full(obstime.shape, az.deg), u.deg, copy=COPY_IF_NEEDED ), frame=frame, ) raise ValueError(f"Unsupported pointing mode: {self.mode}.") def __str__(self): coordinates = ( self.fixed_icrs if self.mode == PointingMode.POINTING else self.fixed_altaz ) return ( "FixedPointingInfo:\n\n" f"mode: {self.mode}\n" f"coordinates: {coordinates}" ) class PointingInfo: """IACT array pointing info. Data format specification: :ref:`gadf:iact-pnt`. Parameters ---------- table : `~astropy.table.Table` Table (with meta header information) on pointing. Examples -------- >>> from gammapy.data import PointingInfo >>> pointing_info = PointingInfo.read('$GAMMAPY_DATA/tests/pointing_table.fits.gz') >>> print(pointing_info) Pointing info: Location: GeodeticLocation(lon=, lat=, height=) MJDREFI, MJDREFF, TIMESYS = (51910, 0.000742870370370241, 'TT') Time ref: 2001-01-01T00:01:04.184 Time ref: 51910.00074287037 MJD (TT) Duration: 1586.0000000000018 sec = 0.44055555555555603 hours Table length: 100 START: Time: 2004-01-21T19:50:02.184 Time: 53025.826414166666 MJD (TT) RADEC: 83.6333 24.5144 deg ALTAZ: 11.4575 41.3409 deg END: Time: 2004-01-21T20:16:28.184 Time: 53025.844770648146 MJD (TT) RADEC: 83.6333 24.5144 deg ALTAZ: 3.44573 42.1319 deg Note: In order to reproduce the example you need the tests datasets folder. You may download it with the command ``gammapy download datasets --tests --out $GAMMAPY_DATA`` """ def __init__(self, table): self.table = table @classmethod def read(cls, filename, hdu="POINTING"): """Read `PointingInfo` table from file. Parameters ---------- filename : str Filename. hdu : int or str, optional HDU number or name. Default is "POINTING". Returns ------- pointing_info : `PointingInfo` Pointing information. 
""" filename = make_path(filename) table = Table.read(filename, hdu=hdu) return cls(table=table) def __str__(self): ss = "Pointing info:\n\n" ss += f"Location: {self.location.geodetic}\n" m = self.table.meta ss += "MJDREFI, MJDREFF, TIMESYS = {}\n".format( (m["MJDREFI"], m["MJDREFF"], m["TIMESYS"]) ) ss += f"Time ref: {self.time_ref.fits}\n" ss += f"Time ref: {self.time_ref.mjd} MJD (TT)\n" sec = self.duration.to("second").value hour = self.duration.to("hour").value ss += f"Duration: {sec} sec = {hour} hours\n" ss += "Table length: {}\n".format(len(self.table)) ss += "\nSTART:\n" + self._str_for_index(0) + "\n" ss += "\nEND:\n" + self._str_for_index(-1) + "\n" return ss def _str_for_index(self, idx): """Information for one point in the pointing table.""" ss = "Time: {}\n".format(self.time[idx].fits) ss += "Time: {} MJD (TT)\n".format(self.time[idx].mjd) ss += "RADEC: {} deg\n".format(self.radec[idx].to_string()) ss += "ALTAZ: {} deg\n".format(self.altaz[idx].to_string()) return ss @lazyproperty def location(self): """Observatory location as an `~astropy.coordinates.EarthLocation` object.""" return earth_location_from_dict(self.table.meta) @lazyproperty def time_ref(self): """Time reference as a `~astropy.time.Time` object.""" return time_ref_from_dict(self.table.meta) @lazyproperty def duration(self): """Pointing table duration as a `~astropy.time.TimeDelta` object. The time difference between the first and last entry. """ return self.time[-1] - self.time[0] @lazyproperty def time(self): """Time array as a `~astropy.time.Time` object.""" met = Quantity(self.table["TIME"].astype("float64"), "second") time = self.time_ref + met return time.tt @lazyproperty def radec(self): """RA / DEC position from table as a `~astropy.coordinates.SkyCoord`.""" lon = self.table["RA_PNT"] lat = self.table["DEC_PNT"] return SkyCoord(lon, lat, unit="deg", frame="icrs") @lazyproperty def altaz_frame(self): """ALT / AZ frame as a `~astropy.coordinates.AltAz` object.""" return AltAz(obstime=self.time, location=self.location) @lazyproperty def altaz(self): """ALT / AZ position computed from RA / DEC as a`~astropy.coordinates.SkyCoord`.""" return self.radec.transform_to(self.altaz_frame) @lazyproperty def altaz_from_table(self): """ALT / AZ position from table as a `~astropy.coordinates.SkyCoord`.""" lon = self.table["AZ_PNT"] lat = self.table["ALT_PNT"] return SkyCoord(lon, lat, unit="deg", frame=self.altaz_frame) @staticmethod def _interpolate_cartesian(mjd_support, coord_support, mjd): xyz = coord_support.cartesian x_new = scipy.interpolate.interp1d(mjd_support, xyz.x)(mjd) y_new = scipy.interpolate.interp1d(mjd_support, xyz.y)(mjd) z_new = scipy.interpolate.interp1d(mjd_support, xyz.z)(mjd) return CartesianRepresentation(x_new, y_new, z_new).represent_as( UnitSphericalRepresentation ) def altaz_interpolate(self, time): """Interpolate pointing for a given time.""" altaz_frame = AltAz(obstime=time, location=self.location) return SkyCoord( self._interpolate_cartesian(self.time.mjd, self.altaz, time.mjd), frame=altaz_frame, ) def get_icrs(self, obstime): """ Get the pointing position in ICRS frame for a given time. Parameters ---------- obstime : `astropy.time.Time` Time for which to get the pointing position in ICRS frame. Returns ------- icrs : `astropy.coordinates.SkyCoord` Pointing position in ICRS frame. 
""" return SkyCoord( self._interpolate_cartesian(self.time.mjd, self.radec, obstime.mjd), obstime=obstime, frame="icrs", ) def get_altaz(self, obstime): """ Get the pointing position in alt-az frame for a given time. If the observation was performed tracking a fixed position in ICRS, the icrs pointing is transformed at the given time using the location of the observation. If the observation was performed in drift mode, the fixed alt-az coordinate is returned with `obstime` attached. Parameters ---------- obstime : `astropy.time.Time` Time for which to get the pointing position in alt-az frame. Returns ------- altaz : `astropy.coordinates.SkyCoord` Pointing position in alt-az frame. """ # give precedence to ALT_PNT / AZ_PNT if present if "ALT_PNT" in self.table and "AZ_PNT" in self.table: altaz = self.altaz_from_table frame = AltAz(obstime=obstime, location=self.location) return SkyCoord( self._interpolate_cartesian(self.time.mjd, altaz, obstime.mjd), frame=frame, ) # fallback to transformation from required ICRS if not return self.altaz_interpolate(time=obstime) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/simulate.py0000644000175100001770000000666714721316200017171 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Simulate observations""" from itertools import repeat from gammapy.utils import parallel as parallel from gammapy.utils.scripts import make_path class ObservationsEventsSampler(parallel.ParallelMixin): """Run event sampling for an emsemble of observations Parameters ---------- sampler_kwargs : dict, optional Arguments passed to `~gammapy.datasets.MapDatasetEventSampler`. dataset_kwargs : dict, optional Arguments passed to `~gammapy.datasets.create_map_dataset_from_observation()`. outdir : str, Path path of the output files created. Default is "./simulated_data/". If None a list of `~gammapy.data.Observation` is returned. overwrite : bool Overwrite the output files or not n_jobs : int, optional Number of processes to run in parallel. Default is one, unless `~gammapy.utils.parallel.N_JOBS_DEFAULT` was modified. parallel_backend : {'multiprocessing', 'ray'}, optional Which backend to use for multiprocessing. Default is None. """ def __init__( self, sampler_kwargs=None, dataset_kwargs=None, outdir="./simulated_data/", overwrite=True, n_jobs=None, parallel_backend=None, ): if outdir is not None: outdir = make_path(outdir) outdir.mkdir(exist_ok=True, parents=True) self.outdir = outdir self.n_jobs = n_jobs self.parallel_backend = parallel_backend self.overwrite = overwrite if sampler_kwargs is None: sampler_kwargs = {} self.sampler_kwargs = sampler_kwargs self.dataset_kwargs = dataset_kwargs def simulate_observation(self, observation, models=None): """Simulate a single observation. Parameters ---------- observation : `~gammapy.data.Observation` Observation to be simulated. models : `~gammapy.modeling.Models`, optional Models to simulate. Can be None to only sample background events. Default is None. 
""" from gammapy.datasets import ObservationEventSampler sampler = ObservationEventSampler( **self.sampler_kwargs, dataset_kwargs=self.dataset_kwargs ) observation = sampler.run(observation, models=models) if self.outdir is not None: observation.write( self.outdir / f"obs_{observation.obs_id}.fits", overwrite=self.overwrite, ) else: return observation def run(self, observations, models=None): """Run event sampling for an ensemble of onservations Parameters ---------- observation : `~gammapy.data.Observation` Observation to be simulated. models : `~gammapy.modeling.Models`, optional Models to simulate. Can be None to only sample background events. Default is None. """ n_jobs = min(self.n_jobs, len(observations)) observations = parallel.run_multiprocessing( self.simulate_observation, zip( observations, repeat(models), ), backend=self.parallel_backend, pool_kwargs=dict(processes=n_jobs), task_name="Simulate observations", ) if self.outdir is None: return observations ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.192642 gammapy-1.3/gammapy/data/tests/0000755000175100001770000000000014721316215016125 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/__init__.py0000644000175100001770000000010014721316200020217 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_data_store.py0000644000175100001770000002610414721316200021660 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import os from pathlib import Path import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.io import fits from gammapy.data import DataStore from gammapy.irf import ( Background3D, EffectiveAreaTable2D, EnergyDependentMultiGaussPSF, EnergyDispersion2D, ) from gammapy.utils.scripts import make_path from gammapy.utils.testing import requires_data @pytest.fixture() def data_store(): return DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") @requires_data() def test_data_store_hd_hap(data_store): """Test HESS HAP-HD data access.""" obs = data_store.obs(obs_id=23523) assert obs.events.__class__.__name__ == "EventList" assert obs.gti.__class__.__name__ == "GTI" assert obs.aeff.__class__.__name__ == "EffectiveAreaTable2D" assert obs.edisp.__class__.__name__ == "EnergyDispersion2D" assert obs.psf.__class__.__name__ == "PSF3D" @requires_data() def test_data_store_from_dir(): """Test the `from_dir` method.""" data_store_rel_path = DataStore.from_dir( "$GAMMAPY_DATA/hess-dl3-dr1/", "hdu-index.fits.gz", "obs-index.fits.gz" ) data_store_abs_path = DataStore.from_dir( "$GAMMAPY_DATA/hess-dl3-dr1/", "$GAMMAPY_DATA/hess-dl3-dr1/hdu-index.fits.gz", "$GAMMAPY_DATA/hess-dl3-dr1/obs-index.fits.gz", ) assert "Data store" in data_store_rel_path.info(show=False) assert "Data store" in data_store_abs_path.info(show=False) @requires_data() def test_data_store_from_file(tmpdir): filename = "$GAMMAPY_DATA/hess-dl3-dr1/hdu-index.fits.gz" index_hdu = fits.open(make_path(filename))["HDU_INDEX"] filename = "$GAMMAPY_DATA/hess-dl3-dr1/obs-index.fits.gz" obs_hdu = fits.open(make_path(filename))["OBS_INDEX"] hdulist = fits.HDUList() hdulist.append(index_hdu) hdulist.append(obs_hdu) filename = tmpdir / "test-index.fits" hdulist.writeto(str(filename)) data_store 
= DataStore.from_file(filename) assert data_store.obs_table["OBS_ID"][0] == 20136 @requires_data() def test_data_store_get_observations(data_store, caplog): """Test loading data and IRF files via the DataStore""" observations = data_store.get_observations([23523, 23592]) assert observations[0].obs_id == 23523 observations = data_store.get_observations() assert len(observations) == 105 with pytest.raises(ValueError): data_store.get_observations([11111, 23592]) with caplog.at_level(logging.WARNING): observations = data_store.get_observations([11111, 23523], skip_missing=True) assert observations[0].obs_id == 23523 assert "Skipping missing obs_id: 11111" in [_.message for _ in caplog.records] with caplog.at_level(logging.INFO): observations = data_store.get_observations([11111, 23523], skip_missing=True) assert "Observations selected: 1 out of 2." in [ _.message for _ in caplog.records ] @requires_data() def test_broken_links_data_store(data_store): # Test that data_store without complete IRFs are properly loaded hdu_table = data_store.hdu_table index = np.where(hdu_table["OBS_ID"] == 23526)[0][0] hdu_table.remove_row(index) hdu_table._hdu_type_stripped = np.array([_.strip() for _ in hdu_table["HDU_TYPE"]]) observations = data_store.get_observations( [23523, 23526], required_irf=["aeff", "edisp"] ) assert len(observations) == 1 with pytest.raises(ValueError): _ = data_store.get_observations([23523], required_irf=["xyz"]) @requires_data() def test_data_store_copy_obs(tmp_path, data_store): data_store.copy_obs([23523, 23592], tmp_path, overwrite=True) substore = DataStore.from_dir(tmp_path) assert str(substore.hdu_table.base_dir) == str(tmp_path) assert len(substore.obs_table) == 2 desired = data_store.obs(23523) actual = substore.obs(23523) assert str(actual.events.table) == str(desired.events.table) @requires_data() def test_data_store_copy_obs_subset(tmp_path, data_store): # Copy only certain HDU classes data_store.copy_obs([23523, 23592], tmp_path, hdu_class=["events"]) substore = DataStore.from_dir(tmp_path) assert len(substore.hdu_table) == 2 @requires_data() class TestDataStoreChecker: def setup_method(self): self.data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps") def test_check_all(self): records = list(self.data_store.check()) assert len(records) == 32 @pytest.fixture() def data_store_dc1(monkeypatch): paths = [ f"$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_{obs_id:06d}.fits" for obs_id in [110380, 111140, 111630, 111159] ] caldb_path = Path(os.environ["GAMMAPY_DATA"]) / Path("cta-1dc/caldb") monkeypatch.setenv("CALDB", str(caldb_path)) return DataStore.from_events_files(paths) @requires_data() def test_data_store_from_events(data_store_dc1): # data_store_dc1 fixture is needed to set CALDB # Test that `DataStore.from_events_files` works. # The real tests for `DataStoreMaker` are below. 
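# Building a DataStore directly from EVENTS files: the obs and HDU index
# tables are generated in memory from the FITS headers (shapes asserted below).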
path = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits" data_store = DataStore.from_events_files([path]) assert len(data_store.obs_table) == 1 assert len(data_store.hdu_table) == 6 assert data_store.obs_table.meta["MJDREFI"] == 51544 path2 = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_025511.fits.gz" with pytest.raises(RuntimeError): _ = DataStore.from_events_files([path, path2]) @requires_data() def test_data_store_maker_obs_table(data_store_dc1): table = data_store_dc1.obs_table assert table.__class__.__name__ == "ObservationTable" assert len(table) == 4 assert len(table.colnames) == 27 assert table["CALDB"][0] == "1dc" assert table["IRF"][0] == "South_z20_50h" path = Path(table["IRF_FILENAME"][0]) # checking str(path) would convert to windows path conventions on windows assert "$CALDB/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" in repr(path) @requires_data() def test_data_store_maker_hdu_table(data_store_dc1): table = data_store_dc1.hdu_table assert table.__class__.__name__ == "HDUIndexTable" assert len(table) == 24 hdu_class = ["events", "gti", "aeff_2d", "edisp_2d", "psf_3gauss", "bkg_3d"] assert list(data_store_dc1.hdu_table["HDU_CLASS"]) == 4 * hdu_class path = Path(table["FILE_DIR"][2]) # checking str(path) would convert to windows path conventions on windows assert "$CALDB/data/cta/1dc/bcf/South_z20_50h" in repr(path) @requires_data() def test_data_store_maker_observation(data_store_dc1): """Check that one observation can be accessed OK""" obs = data_store_dc1.obs(110380) assert obs.obs_id == 110380 assert obs.events.time[0].iso == "2021-01-21 12:00:03.045" assert obs.gti.time_start[0].iso == "2021-01-21 12:00:00.000" assert isinstance(obs.aeff, EffectiveAreaTable2D) assert isinstance(obs.bkg, Background3D) assert isinstance(obs.edisp, EnergyDispersion2D) assert isinstance(obs.psf, EnergyDependentMultiGaussPSF) @requires_data("gammapy-data") def test_data_store_fixed_rad_max(): data_store = DataStore.from_dir("$GAMMAPY_DATA/joint-crab/dl3/magic") observations = data_store.get_observations( [5029748], required_irf=["aeff", "edisp"] ) assert len(observations) == 1 obs = observations[0] assert obs.rad_max is not None assert obs.rad_max.quantity.shape == (1, 1) assert u.allclose(obs.rad_max.quantity, np.sqrt(0.02) * u.deg) # test it also works with edisp (removing aeff) obs = data_store.get_observations([5029748], required_irf=["aeff", "edisp"])[0] obs.aeff = None assert obs.rad_max is not None assert obs.rad_max.quantity.shape == (1, 1) assert u.allclose(obs.rad_max.quantity, 0.1414213 * u.deg) # removing the last irf means we have no rad_max info obs = data_store.get_observations([5029748], required_irf=["aeff", "edisp"])[0] obs.aeff = None obs.edisp = None assert obs.rad_max is None @requires_data() def test_data_store_header_info_in_meta(data_store): """Test information from the obs index header is propagated into meta""" obs = data_store.obs(obs_id=23523) assert obs.meta.obs_info is not None # TODO: restructure test once all elements are in place @requires_data() def test_data_store_bad_name(): with pytest.raises(IOError): DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/", "hdu-index.fits.gz", "bad") @requires_data() def test_data_store_from_dir_no_obs_index(caplog, tmpdir): """Test the `from_dir` method.""" # Create small data_store and remove obs-index table DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/").copy_obs([23523, 23592], tmpdir) os.remove(tmpdir / "obs-index.fits.gz") data_store = DataStore.from_dir(tmpdir) obs = data_store.obs(23523) 
observations = data_store.get_observations() assert data_store.obs_table is None assert "No observation index table." in data_store.info(show=False) assert_allclose(obs.meta.location.lon.deg, 16.500222) assert len(observations) == 2 test_dir = tmpdir / "test" os.mkdir(test_dir) data_store.copy_obs([23523], test_dir) data_store_copy = DataStore.from_dir(test_dir) assert len(data_store_copy.obs_ids) == 1 assert data_store_copy.obs_table is None @requires_data() def test_data_store_required_irf_pointlike_fixed_rad_max(): """Check behavior of the "point-like" option for data_store""" store = DataStore.from_dir("$GAMMAPY_DATA/joint-crab/dl3/magic") obs = store.get_observations([5029748], required_irf="point-like") assert len(obs) == 1 assert u.allclose(obs[0].rad_max.quantity, np.sqrt(0.02) * u.deg) obs = store.get_observations([5029748], required_irf=["aeff", "edisp"]) assert len(obs) == 1 assert u.allclose(obs[0].rad_max.quantity, np.sqrt(0.02) * u.deg) @requires_data() def test_data_store_required_irf_pointlike_variable_rad_max(): """Check behavior of the "point-like" option for data_store""" store = DataStore.from_dir("$GAMMAPY_DATA/magic/rad_max/data/") obs = store.get_observations( [5029747, 5029748], required_irf=["aeff", "edisp", "rad_max"] ) assert len(obs) == 2 assert obs[0].rad_max is not None obs = store.get_observations([5029747, 5029748], required_irf="point-like") assert len(obs) == 2 assert obs[0].rad_max.quantity is not None @requires_data() def test_data_store_no_events(): """Check behavior of the "point-like" option for data_store""" data_path = "$GAMMAPY_DATA/hawc/crab_events_pass4/" hdu_filename = "hdu-index-table-GP-no-events.fits.gz" obs_filename = "obs-index-table-GP-no-events.fits.gz" data_store = DataStore.from_dir( data_path, hdu_table_filename=hdu_filename, obs_table_filename=obs_filename ) observations = data_store.get_observations( required_irf=["aeff", "psf", "edisp"], require_events=False ) assert len(observations) == 3 for obs in observations: assert not obs.events assert not obs.gti ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_event_list.py0000644000175100001770000002052014721316200021703 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import SkyCoord from astropy.table import Table from regions import CircleSkyRegion, RectangleSkyRegion from gammapy.data import GTI, EventList, Observation from gammapy.maps import MapAxis, WcsGeom from gammapy.utils.testing import mpl_plot_check, requires_data @requires_data() class TestEventListBase: def setup_class(self): self.events = EventList.read( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" ) def test_select_parameter(self): events = self.events.select_parameter("ENERGY", (0.8 * u.TeV, 5.0 * u.TeV)) assert len(events.table) == 2716 def test_meta(self): assert self.events.meta.event_class == "std" assert self.events.meta.creation.creator == "SASH FITS::EventListWriter" assert self.events.meta.creation.date is None assert self.events.meta.creation.origin == "H.E.S.S. 
Collaboration" assert self.events.table["EVENT_ID"][0] == 1808181231761 def test_write(self): # Without GTI obs = Observation(events=self.events) # Write function is through obs obs.write("test.fits.gz", include_irfs=False, overwrite=True) read_again = EventList.read("test.fits.gz") assert (self.events.table == read_again.table).all() assert read_again.table.meta["EXTNAME"] == "EVENTS" assert read_again.table.meta["HDUCLASS"] == "GADF" assert read_again.table.meta["HDUCLAS1"] == "EVENTS" # With GTI gti = GTI.read( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" ) obs = Observation(events=self.events, gti=gti) obs.write("test.fits", overwrite=True) read_again_ev = EventList.read("test.fits") read_again_gti = GTI.read("test.fits") assert (self.events.table == read_again_ev.table).all() assert gti.table.meta == read_again_gti.table.meta assert_allclose(gti.table["START"].mjd, read_again_gti.table["START"].mjd) assert_allclose(gti.table["STOP"].mjd, read_again_gti.table["STOP"].mjd) # test that it won't work if gti is not a GTI with pytest.raises(AttributeError): obs = Observation(events=self.events, gti=gti.table) obs.write("test.fits", overwrite=True) @requires_data() class TestEventListHESS: def setup_class(self): self.events = EventList.read( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" ) def test_basics(self): assert "EventList" in str(self.events) assert self.events.is_pointed_observation assert len(self.events.table) == 11243 assert self.events.time[0].iso == "2004-03-26 02:57:47.004" assert self.events.radec[0].to_string() == "229.239 -58.3417" assert self.events.galactic[0].to_string(precision=2) == "321.07 -0.69" assert self.events.altaz[0].to_string() == "193.338 53.258" assert_allclose(self.events.offset[0].value, 0.54000974, rtol=1e-5) energy = self.events.energy[0] assert energy.unit == "TeV" assert_allclose(energy.value, 0.55890286) lon, lat, height = self.events.observatory_earth_location.to_geodetic() assert lon.unit == "deg" assert_allclose(lon.value, 16.5002222222222) assert lat.unit == "deg" assert_allclose(lat.value, -23.2717777777778) assert height.unit == "m" assert_allclose(height.value, 1835) def test_observation_time_duration(self): dt = self.events.observation_time_duration assert dt.unit == "s" assert_allclose(dt.value, 1682) def test_observation_live_time_duration(self): dt = self.events.observation_live_time_duration assert dt.unit == "s" assert_allclose(dt.value, 1521.026855) def test_observation_dead_time_fraction(self): deadc = self.events.observation_dead_time_fraction assert_allclose(deadc, 0.095703, rtol=1e-3) def test_altaz(self): altaz = self.events.altaz assert_allclose(altaz[0].az.deg, 193.337965, atol=1e-3) assert_allclose(altaz[0].alt.deg, 53.258024, atol=1e-3) def test_median_position(self): coord = self.events.galactic_median assert_allclose(coord.l.deg, 320.539346, atol=1e-3) assert_allclose(coord.b.deg, -0.882515, atol=1e-3) def test_median_offset(self): offset_max = self.events.offset_from_median.max() assert_allclose(offset_max.to_value("deg"), 36.346379, atol=1e-3) def test_from_stack(self): event_lists = [self.events] * 2 stacked_list = EventList.from_stack(event_lists) assert len(stacked_list.table) == 11243 * 2 def test_stack(self): events, other = self.events.copy(), self.events.copy() events.stack(other) assert len(events.table) == 11243 * 2 def test_offset_selection(self): offset_range = u.Quantity([0.5, 1.0] * u.deg) new_list = self.events.select_offset(offset_range) assert len(new_list.table) 
== 1820 def test_plot_time(self): with mpl_plot_check(): self.events.plot_time() def test_plot_energy(self): with mpl_plot_check(): self.events.plot_energy() def test_plot_offset2_distribution(self): with mpl_plot_check(): self.events.plot_offset2_distribution() def test_plot_energy_offset(self): with mpl_plot_check(): self.events.plot_energy_offset() def test_plot_image(self): with mpl_plot_check(): self.events.plot_image() def test_peek(self): with mpl_plot_check(): self.events.peek() @requires_data() class TestEventListFermi: def setup_class(self): self.events = EventList.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-events.fits.gz" ) def test_basics(self): assert "EventList" in str(self.events) assert len(self.events.table) == 32843 assert not self.events.is_pointed_observation def test_peek(self): with mpl_plot_check(): self.events.peek(allsky=True) @requires_data() class TestEventListChecker: def setup_class(self): self.event_list = EventList.read( "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_111140.fits" ) def test_check_all(self): records = list(self.event_list.check()) assert len(records) == 3 class TestEventSelection: def setup_class(self): table = Table() table["RA"] = [0.0, 0.0, 0.0, 10.0] * u.deg table["DEC"] = [0.0, 0.9, 10.0, 10.0] * u.deg table["ENERGY"] = [1.0, 1.5, 1.5, 10.0] * u.TeV table["OFFSET"] = [0.1, 0.5, 1.0, 1.5] * u.deg self.events = EventList(table) center1 = SkyCoord(0.0, 0.0, frame="icrs", unit="deg") on_region1 = CircleSkyRegion(center1, radius=1.0 * u.deg) center2 = SkyCoord(0.0, 10.0, frame="icrs", unit="deg") on_region2 = RectangleSkyRegion(center2, width=0.5 * u.deg, height=0.3 * u.deg) self.on_regions = [on_region1, on_region2] def test_region_select(self): geom = WcsGeom.create(skydir=(0, 0), binsz=0.2, width=4.0 * u.deg, proj="TAN") new_list = self.events.select_region(self.on_regions[0], geom.wcs) assert len(new_list.table) == 2 union_region = self.on_regions[0].union(self.on_regions[1]) new_list = self.events.select_region(union_region, geom.wcs) assert len(new_list.table) == 3 region_string = "fk5;box(0,10, 0.25, 0.15)" new_list = self.events.select_region(region_string, geom.wcs) assert len(new_list.table) == 1 def test_map_select(self): axis = MapAxis.from_edges((0.5, 2.0), unit="TeV", name="ENERGY") geom = WcsGeom.create( skydir=(0, 0), binsz=0.2, width=4.0 * u.deg, proj="TAN", axes=[axis] ) mask = geom.region_mask(regions=[self.on_regions[0]]) new_list = self.events.select_mask(mask) assert len(new_list.table) == 2 def test_select_energy(self): energy_range = u.Quantity([1, 10], "TeV") new_list = self.events.select_energy(energy_range) assert len(new_list.table) == 3 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_filters.py0000644000175100001770000000627214721316200021207 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from astropy import units as u from astropy.coordinates import Angle, SkyCoord from astropy.time import Time from astropy.units import Quantity from gammapy.data import GTI, DataStore, EventList, ObservationFilter from gammapy.utils.regions import SphericalCircleSkyRegion from gammapy.utils.testing import assert_allclose, assert_time_allclose, requires_data def test_event_filter_types(): for method_str in ObservationFilter.EVENT_FILTER_TYPES.values(): assert hasattr(EventList, method_str) @pytest.fixture(scope="session") def observation(): ds = 
DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") return ds.obs(20136) @requires_data() def test_empty_observation_filter(observation): empty_obs_filter = ObservationFilter() events = observation.events filtered_events = empty_obs_filter.filter_events(events) assert filtered_events == events gti = observation.gti filtered_gti = empty_obs_filter.filter_gti(gti) assert filtered_gti == gti @requires_data() def test_filter_events(observation): custom_filter = { "type": "custom", "opts": {"parameter": "ENERGY", "band": Quantity([0.8 * u.TeV, 10.0 * u.TeV])}, } target_position = SkyCoord(ra=229.2, dec=-58.3, unit="deg", frame="icrs") region_radius = Angle("0.2 deg") region = SphericalCircleSkyRegion(center=target_position, radius=region_radius) region_filter = {"type": "sky_region", "opts": {"regions": region}} time_filter = Time([53090.12, 53090.13], format="mjd", scale="tt") obs_filter = ObservationFilter( event_filters=[custom_filter, region_filter], time_filter=time_filter ) events = observation.events filtered_events = obs_filter.filter_events(events) assert np.all( (filtered_events.energy >= 0.8 * u.TeV) & (filtered_events.energy < 10.0 * u.TeV) ) assert np.all( (filtered_events.time >= time_filter[0]) & (filtered_events.time < time_filter[1]) ) assert np.all(region.center.separation(filtered_events.radec) < region_radius) @requires_data() def test_filter_gti(observation): time_filter = Time([53090.125, 53090.130], format="mjd", scale="tt") obs_filter = ObservationFilter(time_filter=time_filter) gti = observation.gti filtered_gti = obs_filter.filter_gti(gti) assert isinstance(filtered_gti, GTI) assert_time_allclose(filtered_gti.time_start, time_filter[0]) assert_time_allclose(filtered_gti.time_stop, time_filter[1]) @pytest.mark.parametrize( "pars", [ { "p_in": [ {"type": "custom", "opts": dict(parameter="PHASE", band=(0.2, 0.8))} ], "p_out": 0.6, }, { "p_in": [ { "type": "custom", "opts": dict(parameter="ENERGY", band=(0.1, 1) * u.TeV), } ], "p_out": 1, }, { "p_in": [], "p_out": 1, }, ], ) def test_check_filter_phase(pars): assert_allclose(ObservationFilter._check_filter_phase(pars["p_in"]), pars["p_out"]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_gti.py0000644000175100001770000002003514721316200020313 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose import astropy.units as u from astropy.table import QTable, Table from astropy.time import Time from gammapy.data import GTI from gammapy.utils.testing import assert_time_allclose, requires_data def make_gti(mets, time_ref="2010-01-01"): """Create GTI from a dict of MET (assumed to be in seconds) and a reference time.""" time_ref = Time(time_ref) times = {name: time_ref + met for name, met in mets.items()} table = Table(times) return GTI(table, reference_time=time_ref) def test_gti_table_validation(): start = Time([53090.123451203704], format="mjd", scale="tt") stop = Time([53090.14291879629], format="mjd", scale="tt") table = QTable([start, stop], names=["START", "STOP"]) validated = GTI._validate_table(table) assert validated == table bad_table = QTable([start, stop], names=["bad", "STOP"]) with pytest.raises(ValueError): GTI._validate_table(bad_table) with pytest.raises(TypeError): GTI._validate_table([start, stop]) bad_table = QTable([[1], [2]], names=["START", "STOP"]) with pytest.raises(TypeError): GTI._validate_table(bad_table) def test_simple_gti(): time_ref = 
Time("2010-01-01") gti = make_gti({"START": [0, 2] * u.s, "STOP": [1, 3] * u.s}, time_ref=time_ref) assert_allclose(gti.time_start.mjd - time_ref.mjd, [0, 2.3148146e-5]) assert_allclose( (gti.time_stop - time_ref).to_value("d"), [1.15740741e-05, 3.4722222e-05] ) assert_allclose(gti.time_delta.to_value("s"), [1, 1]) assert_allclose(gti.time_sum.to_value("s"), 2.0) @requires_data() def test_gti_hess(): gti = GTI.read("$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz") str_gti = str(gti) assert "Start: 101962602.0 s MET" in str_gti assert "Stop: 2004-03-26T03:25:48.184 (time standard: TT)" in str_gti assert "GTI" in str_gti assert len(gti.table) == 1 assert gti.time_delta[0].unit == "s" assert_allclose(gti.time_delta[0].value, 1682) assert_allclose(gti.time_sum.value, 1682) expected = Time(53090.123451203704, format="mjd", scale="tt") assert_time_allclose(gti.time_start[0], expected) expected = Time(53090.14291879629, format="mjd", scale="tt") assert_time_allclose(gti.time_stop[0], expected) @requires_data() def test_gti_fermi(): gti = GTI.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-events.fits.gz") assert "GTI" in str(gti) assert len(gti.table) == 39042 assert gti.time_delta[0].unit == "s" assert_allclose(gti.time_delta[0].value, 651.598893) assert_allclose(gti.time_sum.value, 1.831396e08) expected = Time(54682.65603794185, format="mjd", scale="tt") assert_time_allclose(gti.time_start[0], expected) expected = Time(54682.66357959571, format="mjd", scale="tt") assert_time_allclose(gti.time_stop[0], expected) @requires_data() @pytest.mark.parametrize( "time_interval, expected_length, expected_times", [ ( Time( ["2008-08-04T16:21:00", "2008-08-04T19:10:00"], format="isot", scale="tt", ), 2, Time( ["2008-08-04T16:21:00", "2008-08-04T19:10:00"], format="isot", scale="tt", ), ), ( Time([54682.68125, 54682.79861111], format="mjd", scale="tt"), 2, Time([54682.68125, 54682.79861111], format="mjd", scale="tt"), ), ( Time([10.0, 100000.0], format="mjd", scale="tt"), 39042, Time([54682.65603794185, 57236.96833546296], format="mjd", scale="tt"), ), (Time([10.0, 20.0], format="mjd", scale="tt"), 0, None), ], ) def test_select_time(time_interval, expected_length, expected_times): gti = GTI.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-events.fits.gz") gti_selected = gti.select_time(time_interval) assert len(gti_selected.table) == expected_length if expected_length != 0: expected_times.format = "mjd" assert_time_allclose(gti_selected.time_start[0], expected_times[0]) assert_time_allclose(gti_selected.time_stop[-1], expected_times[1]) def test_gti_delete_intervals(): gti = GTI.create( Time( [ "2020-01-01 01:00:00.000", "2020-01-01 02:00:00.000", "2020-01-01 03:00:00.000", "2020-01-01 05:00:00.000", ] ), Time( [ "2020-01-01 01:45:00.000", "2020-01-01 02:45:00.000", "2020-01-01 03:45:00.000", "2020-01-01 05:45:00.000", ] ), ) interval = Time(["2020-01-01 02:20:00.000", "2020-01-01 05:20:00.000"]) interval2 = Time(["2020-01-01 05:30:00.000", "2020-01-01 08:20:00.000"]) interval3 = Time(["2020-01-01 05:50:00.000", "2020-01-01 09:20:00.000"]) gti_trim = gti.delete_interval(interval) assert len(gti_trim.table) == 3 assert_time_allclose( gti_trim.table["START"], Time( [ "2020-01-01 01:00:00.000", "2020-01-01 02:00:00.000", "2020-01-01 05:20:00.000", ] ), ) assert_time_allclose( gti_trim.table["STOP"], Time( [ "2020-01-01 01:45:00.000", "2020-01-01 02:20:00.000", "2020-01-01 05:45:00.000", ] ), ) gti_trim2 = gti_trim.delete_interval(interval2) assert_time_allclose( gti_trim2.table["STOP"], 
Time( [ "2020-01-01 01:45:00.000", "2020-01-01 02:20:00.000", "2020-01-01 05:30:00.000", ] ), ) gti_trim3 = gti_trim2.delete_interval(interval3) assert_time_allclose( gti_trim3.table["STOP"], Time( [ "2020-01-01 01:45:00.000", "2020-01-01 02:20:00.000", "2020-01-01 05:30:00.000", ] ), ) def test_gti_stack(): time_ref = Time("2010-01-01") gti1 = make_gti({"START": [0, 2] * u.s, "STOP": [1, 3] * u.s}, time_ref=time_ref) gt1_pre_stack = gti1.copy() gti2 = make_gti( {"START": [4] * u.s, "STOP": [5] * u.s}, time_ref=time_ref + 10 * u.s ) gti1.stack(gti2) assert len(gti1.table) == 3 assert_time_allclose(gt1_pre_stack.time_ref, gti1.time_ref) assert_allclose(gti1.met_start.value, [0, 2, 14]) assert_allclose(gti1.met_stop.value, [1, 3, 15]) def test_gti_union(): gti = make_gti({"START": [5, 6, 1, 2] * u.s, "STOP": [8, 7, 3, 4] * u.s}) gti = gti.union() assert_allclose(gti.met_start.value, [1, 5]) assert_allclose(gti.met_stop.value, [4, 8]) def test_gti_create(): start = u.Quantity([1, 2], "min") stop = u.Quantity([1.5, 2.5], "min") time_ref = Time("2010-01-01 00:00:00.0") gti = GTI.create(start, stop, time_ref) assert len(gti.table) == 2 assert_allclose(gti.time_ref.mjd, time_ref.tt.mjd) start_met = (gti.time_start - gti.time_ref).to_value("s") assert_allclose(start_met, start.to_value("s")) def test_gti_write(tmp_path): time_ref = Time("2010-01-01", scale="tt") time_ref.format = "mjd" gti = make_gti({"START": [5, 6, 1, 2] * u.s, "STOP": [8, 7, 3, 4] * u.s}, time_ref) gti.write(tmp_path / "tmp.fits") new_gti = GTI.read(tmp_path / "tmp.fits") assert_time_allclose(new_gti.time_start, gti.time_start) assert_time_allclose(new_gti.time_stop, gti.time_stop) assert_time_allclose(new_gti.time_ref, gti.time_ref) def test_gti_from_time(): """Test astropy time is supported as input for GTI.create""" start = Time("2020-01-01T20:00:00") stop = Time("2020-01-01T20:15:00") ref = Time("2020-01-01T00:00:00") gti = GTI.create(start, stop, ref) assert_time_allclose(gti.table["START"], start) assert_time_allclose(gti.table["STOP"], stop) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_hdu_index_table.py0000644000175100001770000001057014721316200022651 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from pathlib import Path import pytest from gammapy.data import HDUIndexTable from gammapy.utils.scripts import make_path from gammapy.utils.testing import requires_data @pytest.fixture(scope="session") def hdu_index_table(): table = HDUIndexTable( rows=[ { "OBS_ID": 42, "HDU_TYPE": "events", "HDU_CLASS": "spam42", "FILE_DIR": "a", "FILE_NAME": "b", "HDU_NAME": "c", } ] ) table.meta["BASE_DIR"] = "spam" return table def test_hdu_index_table(hdu_index_table): """ This test ensures that the HDUIndexTable works in a case-insensitive way concerning the values in the ``HDU_CLASS`` and ``HDU_TYPE`` columns. We ended up with this, because initially the HDU index table spec used lower-case, but then the ``HDUCLAS`` header keys to all the HDUs (e.g. EVENTS, IRFs, ...) were defined in upper-case. So for consistency we changed to all upper-case in the spec also for the HDU index table, just with a mention that values should be treated in a case-insensitive manner for backward compatibility with existing index tables. 
See https://github.com/open-gamma-ray-astro/gamma-astro-data-formats/issues/118 """ location = hdu_index_table.hdu_location(obs_id=42, hdu_type="events") assert location.path().as_posix() == "a/b" location = hdu_index_table.hdu_location(obs_id=42, hdu_type="bkg") assert location is None assert hdu_index_table.summary().startswith("HDU index table") @requires_data() def test_hdu_index_table_hd_hap(capfd): """Test HESS HAP-HD data access.""" hdu_index = HDUIndexTable.read("$GAMMAPY_DATA/hess-dl3-dr1/hdu-index.fits.gz") assert "BASE_DIR" in hdu_index.meta assert hdu_index.base_dir == make_path("$GAMMAPY_DATA/hess-dl3-dr1") # A few valid queries location = hdu_index.hdu_location(obs_id=23523, hdu_type="events") # We test here the std out location.info() out, err = capfd.readouterr() lines = out.split("\n") assert len(lines) == 6 hdu = location.get_hdu() assert hdu.name == "EVENTS" # The next line is to check if the HDU is still accessible # See https://github.com/gammapy/gammapy/issues/1775 assert hdu.filebytes() == 224640 assert location.path(abs_path=False) == Path( "data/hess_dl3_dr1_obs_id_023523.fits.gz" ) path1 = str(location.path(abs_path=True)) path2 = str(location.path(abs_path=False)) assert path1.endswith(path2) location = hdu_index.hdu_location(obs_id=23523, hdu_class="psf_table") assert location.path(abs_path=False) == Path( "data/hess_dl3_dr1_obs_id_023523.fits.gz" ) location = hdu_index.hdu_location(obs_id=23523, hdu_type="psf") assert location.path(abs_path=False) == Path( "data/hess_dl3_dr1_obs_id_023523.fits.gz" ) location = hdu_index.hdu_location(obs_id=23523, hdu_type="bkg") assert location.path(abs_path=False) == Path( "data/hess_dl3_dr1_obs_id_023523.fits.gz" ) # A few invalid queries with pytest.raises(IndexError) as excinfo: hdu_index.hdu_location(obs_id=42, hdu_class="psf_3gauss") msg = "No entry available with OBS_ID = 42" assert str(excinfo.value) == msg with pytest.raises(ValueError) as excinfo: hdu_index.hdu_location(obs_id=23523) msg = "You have to specify `hdu_type` or `hdu_class`." assert str(excinfo.value) == msg with pytest.raises(ValueError) as excinfo: hdu_index.hdu_location(obs_id=23523, hdu_type="invalid") msg = ( "Invalid hdu_type: invalid. Valid values are: ['events', 'gti', 'aeff', " "'edisp', 'psf', 'bkg', 'rad_max', 'pointing']" ) assert str(excinfo.value) == msg with pytest.raises(ValueError) as excinfo: hdu_index.hdu_location(obs_id=23523, hdu_class="invalid") msg = ( "Invalid hdu_class: invalid. 
Valid values are: ['events', 'gti', 'aeff_2d', 'edisp_2d', " "'psf_table', 'psf_3gauss', 'psf_king', 'bkg_2d', 'bkg_3d', 'rad_max_2d', 'pointing']" ) assert str(excinfo.value) == msg location = hdu_index.hdu_location(obs_id=23523, hdu_class="psf_king") assert location is None location = hdu_index.hdu_location(obs_id=23523, hdu_class="bkg_2d") assert location is None ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_ivoa.py0000644000175100001770000000347514721316200020477 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from gammapy.data.ivoa import empty_obscore_table, to_obscore_table from gammapy.utils.testing import requires_data def test_obscore_structure(): obscore_default_tab = empty_obscore_table() assert len(obscore_default_tab.columns) == 29 assert len(obscore_default_tab["dataproduct_type"]) == 0 assert obscore_default_tab.colnames[0] == "dataproduct_type" assert obscore_default_tab.columns[0].dtype == " 1.0: # keep only the integer part (i.e. the day, not the fraction of the day) time_start_f, time_start_i = np.modf(time_start.mjd) time_start = Time(time_start_i, format="mjd", scale="utc") # random generation of night hours: 6 h (from 22 h to 4 h), leaving 1/2 h # time for the last run to finish night_start = Quantity(22.0, "hour") night_duration = Quantity(5.5, "hour") hour_start = random_state.uniform( night_start.value, night_start.value + night_duration.value, len(obs_id) ) hour_start = Quantity(hour_start, "hour") # add night hour to integer part of MJD time_start += hour_start if use_abs_time: # show the observation times in UTC time_start = Time(time_start.isot) else: # show the observation times in seconds after the reference time_start = time_relative_to_ref(time_start, header) # converting to quantity (better treatment of units) time_start = Quantity(time_start.sec, "second") obs_table["TSTART"] = time_start # stop time # calculated as TSTART + ONTIME if use_abs_time: time_stop = Time(obs_table["TSTART"]) time_stop += TimeDelta(obs_table["ONTIME"]) else: time_stop = TimeDelta(obs_table["TSTART"]) time_stop += TimeDelta(obs_table["ONTIME"]) # converting to quantity (better treatment of units) time_stop = Quantity(time_stop.sec, "second") obs_table["TSTOP"] = time_stop # az, alt # random points in a portion of sphere; default: above 45 deg altitude az, alt = sample_sphere( size=len(obs_id), lon_range=az_range, lat_range=alt_range, random_state=random_state, ) az = Angle(az, "deg") alt = Angle(alt, "deg") obs_table["AZ"] = az obs_table["ALT"] = alt # RA, dec # derive from az, alt taking into account that alt, az represent the values # at the middle of the observation, i.e. 
at time_ref + (TIME_START + TIME_STOP)/2 # (or better: time_ref + TIME_START + (TIME_OBSERVATION/2)) # in use_abs_time mode, the time_ref should not be added, since it's already included # in TIME_START and TIME_STOP az = Angle(obs_table["AZ"]) alt = Angle(obs_table["ALT"]) if use_abs_time: obstime = Time(obs_table["TSTART"]) obstime += TimeDelta(obs_table["ONTIME"]) / 2.0 else: obstime = time_ref_from_dict(obs_table.meta) obstime += TimeDelta(obs_table["TSTART"]) obstime += TimeDelta(obs_table["ONTIME"]) / 2.0 location = observatory_locations[observatory_name] altaz_frame = AltAz(obstime=obstime, location=location) alt_az_coord = SkyCoord(az, alt, frame=altaz_frame) sky_coord = alt_az_coord.transform_to("icrs") obs_table["RA_PNT"] = sky_coord.ra obs_table["DEC_PNT"] = sky_coord.dec # positions # number of telescopes # random integers in a specified range; default: between 3 and 4 n_tels = random_state.randint(n_tels_range[0], n_tels_range[1] + 1, len(obs_id)) obs_table["N_TELS"] = n_tels # muon efficiency # random between 0.6 and 1.0 muoneff = random_state.uniform(low=0.6, high=1.0, size=len(obs_id)) obs_table["MUONEFF"] = muoneff return obs_table def test_basics(): random_state = np.random.RandomState(seed=0) obs_table = make_test_observation_table(n_obs=10, random_state=random_state) assert obs_table.summary().startswith("Observation table") filtered = obs_table.select_obs_id(1) assert isinstance(filtered, ObservationTable) assert len(filtered) == 1 with pytest.raises(KeyError): obs_table.select_obs_id(21) def test_select_parameter_box(): # create random observation table random_state = np.random.RandomState(seed=0) obs_table = make_test_observation_table(n_obs=10, random_state=random_state) # select some pars and check the corresponding values in the columns # test box selection in obs_id variable = "OBS_ID" value_range = [2, 5] selection_id = dict(type="par_box", variable=variable, value_range=value_range) selected_obs_table = obs_table.select_observations(selection_id) assert len(selected_obs_table) == 3 assert (value_range[0] <= selected_obs_table[variable]).all() assert (selected_obs_table[variable] < value_range[1]).all() # test box selection in obs_id inverted selection_id_inverted = dict( type="par_box", variable=variable, value_range=value_range, inverted=True ) selected_obs_table = obs_table.select_observations(selection_id_inverted) assert len(selected_obs_table) == 7 assert ( (value_range[0] > selected_obs_table[variable]) | (selected_obs_table[variable] >= value_range[1]) ).all() # test box selection in alt variable = "ALT" value_range = Angle([60.0, 70.0], "deg") selection_alt = dict(type="par_box", variable=variable, value_range=value_range) selected_obs_table = obs_table.select_observations(selection_alt) assert (value_range[0] < Angle(selected_obs_table[variable])).all() assert (Angle(selected_obs_table[variable]) < value_range[1]).all() # test multiple selections selections = [selection_alt, selection_id] selected_obs_table = obs_table.select_observations(selections) assert len(selected_obs_table) == 2 def test_select_time_box(): # create random observation table with very close (in time) # observations (and times in absolute times) datestart = Time("2012-01-01T00:30:00") dateend = Time("2012-01-01T02:30:00") random_state = np.random.RandomState(seed=0) obs_table_time = make_test_observation_table( n_obs=10, date_range=(datestart, dateend), use_abs_time=False, random_state=random_state, ) # test box selection in time: (time_start, time_stop) within value_range 
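# Only observations fully contained in value_range are kept here;
# partial_overlap=True (tested below) also keeps observations that
# merely overlap the interval.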
value_range = Time(["2012-01-01T01:00:00", "2012-01-01T02:00:00"]) selection = dict(type="time_box", time_range=value_range) selected_obs_table = obs_table_time.select_observations(selection) time_start = selected_obs_table.time_start time_stop = selected_obs_table.time_stop assert (value_range[0] < time_start).all() & (time_stop < value_range[1]).all() # Test selection with partial overlap of observations and value_range selection_partial = dict( type="time_box", time_range=value_range, partial_overlap=True ) selected_obs_table = obs_table_time.select_observations(selection_partial) time_start = selected_obs_table.time_start time_stop = selected_obs_table.time_stop assert (value_range[1] > time_start).all() & (time_stop > value_range[0]).all() def test_select_sky_regions(): random_state = np.random.RandomState(seed=0) obs_table = make_test_observation_table(n_obs=100, random_state=random_state) selection = dict( type="sky_circle", frame="galactic", lon="0 deg", lat="0 deg", radius="50 deg", border="2 deg", ) obs_table = obs_table.select_observations(selection) assert len(obs_table) == 32 selection = dict( type="sky_circle", frame="galactic", lon="0 deg", lat="0 deg", radius="50 deg" ) obs_table = obs_table.select_observations(selection) assert len(obs_table) == 30 @requires_data() def test_observation_table_checker(): path = "$GAMMAPY_DATA/cta-1dc/index/gps/obs-index.fits.gz" obs_table = ObservationTable.read(path) checker = ObservationTableChecker(obs_table) records = list(checker.run()) assert len(records) == 1 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_observations.py0000644000175100001770000005041214721316200022250 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import EarthLocation, SkyCoord from astropy.io import fits from astropy.table import Table from astropy.time import Time from astropy.units import Quantity from gammapy.data import ( DataStore, EventList, Observation, ObservationFilter, Observations, ) from gammapy.data.metadata import ObservationMetaData from gammapy.data.pointing import FixedPointingInfo from gammapy.data.utils import get_irfs_features from gammapy.irf import PSF3D, load_irf_dict_from_file from gammapy.utils.cluster import hierarchical_clustering from gammapy.utils.fits import HDULocation from gammapy.utils.testing import ( assert_skycoord_allclose, assert_time_allclose, mpl_plot_check, requires_data, ) @pytest.fixture(scope="session") def data_store(): return DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") @requires_data() def test_observation(data_store): """Test Observation class""" obs = data_store.obs(23523) assert_time_allclose(obs.tstart, Time(53343.92234009259, scale="tt", format="mjd")) assert_time_allclose(obs.tstop, Time(53343.94186555556, scale="tt", format="mjd")) c = SkyCoord(83.63333129882812, 21.51444435119629, unit="deg") assert_skycoord_allclose(obs.get_pointing_icrs(obs.tmid), c) c = SkyCoord(22.558341, 41.950807, unit="deg") assert_skycoord_allclose(obs.get_pointing_altaz(obs.tmid), c) c = SkyCoord(83.63333129882812, 22.01444435119629, unit="deg") assert_skycoord_allclose(obs.target_radec, c) @requires_data() def test_observation_peek(data_store): obs = Observation.read( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" ) with mpl_plot_check(): obs.peek() obs_with_radmax = 
Observation.read( "$GAMMAPY_DATA/magic/rad_max/data/20131004_05029747_DL3_CrabNebula-W0.40+035.fits" ) with mpl_plot_check(): obs_with_radmax.peek() @requires_data() @pytest.mark.parametrize( "time_interval, expected_times", [ ( Time( ["2004-12-04T22:10:00", "2004-12-04T22:30:00"], format="isot", scale="tt", ), Time( ["2004-12-04T22:10:00", "2004-12-04T22:30:00"], format="isot", scale="tt", ), ), ( Time([53343.930, 53343.940], format="mjd", scale="tt"), Time([53343.930, 53343.940], format="mjd", scale="tt"), ), ( Time([10.0, 100000.0], format="mjd", scale="tt"), Time([53343.92234009, 53343.94186563], format="mjd", scale="tt"), ), (Time([10.0, 20.0], format="mjd", scale="tt"), None), ], ) def test_observation_select_time(data_store, time_interval, expected_times): obs = data_store.obs(23523) new_obs = obs.select_time(time_interval) if expected_times: expected_times.format = "mjd" assert np.all( (new_obs.events.time >= expected_times[0]) & (new_obs.events.time < expected_times[1]) ) assert_time_allclose(new_obs.gti.time_start[0], expected_times[0], atol=0.01) assert_time_allclose(new_obs.gti.time_stop[-1], expected_times[1], atol=0.01) else: assert len(new_obs.events.table) == 0 assert len(new_obs.gti.table) == 0 @requires_data() @pytest.mark.parametrize( "time_interval, expected_times, expected_nr_of_obs", [ ( Time([53090.130, 53090.140], format="mjd", scale="tt"), Time([53090.130, 53090.140], format="mjd", scale="tt"), 1, ), ( Time([53090.130, 53091.110], format="mjd", scale="tt"), Time([53090.130, 53091.110], format="mjd", scale="tt"), 3, ), ( Time([10.0, 53111.0230], format="mjd", scale="tt"), Time([53090.1234512, 53111.0230], format="mjd", scale="tt"), 8, ), (Time([10.0, 20.0], format="mjd", scale="tt"), None, 0), ], ) def test_observations_select_time( data_store, time_interval, expected_times, expected_nr_of_obs ): obs_ids = data_store.obs_table["OBS_ID"][:8] obss = data_store.get_observations(obs_ids) new_obss = obss.select_time(time_interval) assert len(new_obss) == expected_nr_of_obs if expected_nr_of_obs > 0: assert new_obss[0].events.time[0] >= expected_times[0] assert new_obss[-1].events.time[-1] < expected_times[1] assert_time_allclose( new_obss[0].gti.time_start[0], expected_times[0], atol=0.01 ) assert_time_allclose( new_obss[-1].gti.time_stop[-1], expected_times[1], atol=0.01 ) @requires_data() def test_observations_mutation(data_store): obs_ids = data_store.obs_table["OBS_ID"][:4] obss = data_store.get_observations(obs_ids) assert obss.ids == ["20136", "20137", "20151", "20275"] obs_id = data_store.obs_table["OBS_ID"][4] obs = data_store.get_observations([obs_id])[0] obss.append(obs) assert obss.ids == ["20136", "20137", "20151", "20275", "20282"] obss.insert(0, obs) assert obss.ids == ["20282", "20136", "20137", "20151", "20275", "20282"] obss.pop(0) assert obss.ids == ["20136", "20137", "20151", "20275", "20282"] obs3 = obss[3] obss.pop(obss.ids[3]) assert obss.ids == ["20136", "20137", "20151", "20282"] obss.insert(3, obs3) assert obss.ids == ["20136", "20137", "20151", "20275", "20282"] obss.extend([obs]) assert obss.ids == ["20136", "20137", "20151", "20275", "20282", "20282"] obss.remove(obs) assert obss.ids == ["20136", "20137", "20151", "20275", "20282"] obss[0] = obs assert obss.ids == ["20282", "20137", "20151", "20275", "20282"] with pytest.raises(TypeError): obss.insert(5, "bad") with pytest.raises(TypeError): obss[5] = "bad" def test_empty_observations(): observations = Observations() assert len(observations) == 0 @requires_data() def 
test_observations_str(data_store): obs_ids = data_store.obs_table["OBS_ID"][:4] obss = data_store.get_observations(obs_ids) actual = obss.__str__() assert actual.split("\n")[1] == "Number of observations: 4" @requires_data() def test_observations_select_time_time_intervals_list(data_store): obs_ids = data_store.obs_table["OBS_ID"][:8] obss = data_store.get_observations(obs_ids) # third time interval is out of the observations time range time_intervals = [ Time([53090.130, 53090.140], format="mjd", scale="tt"), Time([53110.011, 53110.019], format="mjd", scale="tt"), Time([53112.345, 53112.42], format="mjd", scale="tt"), ] new_obss = obss.select_time(time_intervals) assert len(new_obss) == 2 assert new_obss[0].events.time[0] >= time_intervals[0][0] assert new_obss[0].events.time[-1] < time_intervals[0][1] assert new_obss[1].events.time[0] >= time_intervals[1][0] assert new_obss[1].events.time[-1] < time_intervals[1][1] assert_time_allclose(new_obss[0].gti.time_start[0], time_intervals[0][0]) assert_time_allclose(new_obss[0].gti.time_stop[-1], time_intervals[0][1]) assert_time_allclose(new_obss[1].gti.time_start[0], time_intervals[1][0]) assert_time_allclose(new_obss[1].gti.time_stop[-1], time_intervals[1][1]) @requires_data() def test_observation_cta_1dc(): ontime = 5.0 * u.hr pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) t_ref = Time("2020-01-01T00:00:00") tstart = 20 * u.hour location = EarthLocation(lon="-70d18m58.84s", lat="-24d41m0.34s", height="2000m") obs = Observation.create( pointing, irfs=irfs, deadtime_fraction=0.1, tstart=tstart, tstop=tstart + ontime, reference_time=t_ref, location=location, ) assert_skycoord_allclose(obs.get_pointing_icrs(obs.tmid), pointing.fixed_icrs) assert_allclose(obs.observation_live_time_duration, 0.9 * ontime) assert obs.target_radec is None assert isinstance(obs.meta, ObservationMetaData) assert obs.meta.deadtime_fraction == 0.1 assert_allclose(obs.meta.location.height.to_value("m"), 2000) assert "Gammapy" in obs.meta.creation.creator @requires_data() def test_observation_create_radmax(): pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) obs = Observation.read("$GAMMAPY_DATA/joint-crab/dl3/magic/run_05029748_DL3.fits") livetime = 5.0 * u.hr deadtime = 0.5 irfs = {} for _ in obs.available_irfs: irfs[_] = getattr(obs, _) obs1 = Observation.create( pointing, irfs=irfs, deadtime_fraction=deadtime, livetime=livetime, ) assert_skycoord_allclose(obs1.get_pointing_icrs(obs1.tmid), pointing.fixed_icrs) assert_allclose(obs1.observation_live_time_duration, 0.5 * livetime) assert obs1.rad_max is not None assert obs1.psf is None @requires_data() def test_observation_read(): """read event list and irf components from different DL3 files""" obs = Observation.read( event_file="$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz", irf_file="$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020137.fits.gz", ) energy = Quantity(1, "TeV") offset = Quantity(0.5, "deg") val = obs.aeff.evaluate(energy_true=energy, offset=offset) assert obs.obs_id == 20136 assert len(obs.events.energy) == 11243 assert obs.available_hdus == ["events", "gti", "aeff", "edisp", "psf", "bkg"] assert_allclose(val.value, 278000.54120855, rtol=1e-5) assert val.unit == "m2" assert isinstance(obs.meta, ObservationMetaData) assert "Gammapy" in obs.meta.creation.creator assert 
obs.meta.obs_info.telescope == "HESS" assert obs.meta.obs_info.instrument == "H.E.S.S. Phase I" assert obs.meta.target.name == "MSH15-52" assert obs.meta.optional["N_TELS"] == 4 with pytest.raises(KeyError): obs.meta.optional["BROKPIX"] @requires_data() def test_observation_read_single_file(): """read event list and irf components from the same DL3 files""" obs = Observation.read( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" ) energy = Quantity(1, "TeV") offset = Quantity(0.5, "deg") val = obs.aeff.evaluate(energy_true=energy, offset=offset) assert obs.obs_id == 20136 assert len(obs.events.energy) == 11243 assert obs.available_hdus == ["events", "gti", "aeff", "edisp", "psf", "bkg"] assert_allclose(val.value, 273372.44851054, rtol=1e-5) assert val.unit == "m2" @requires_data() def test_observation_read_single_file_fixed_rad_max(): """check that for a point-like observation without the RAD_MAX_2D table a RadMax2D object is generated from the RAD_MAX keyword""" obs = Observation.read("$GAMMAPY_DATA/joint-crab/dl3/magic/run_05029748_DL3.fits") assert obs.rad_max is not None assert obs.rad_max.quantity.shape == (1, 1) assert u.allclose(obs.rad_max.quantity, 0.1414213 * u.deg) @requires_data() class TestObservationChecker: def setup_method(self): self.data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps") def test_check_all(self): observation = self.data_store.obs(111140) records = list(observation.check()) assert len(records) == 8 def test_checker_bad(self): for index in range(5): self.data_store.hdu_table[index]["FILE_NAME"] = "bad" observation = self.data_store.obs(110380) records = list(observation.check()) assert len(records) == 10 assert records[1]["msg"] == "Loading events failed" assert records[3]["msg"] == "Loading GTI failed" assert records[5]["msg"] == "Loading aeff failed" assert records[7]["msg"] == "Loading edisp failed" assert records[9]["msg"] == "Loading psf failed" @requires_data() def test_observation_write(tmp_path): obs = Observation.read( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" ) mjdreff = obs.events.table.meta["MJDREFF"] mjdrefi = obs.events.table.meta["MJDREFI"] path = tmp_path / "obs.fits.gz" obs.meta.creation.origin = "test" obs.write(path) obs_read = obs.read(path) assert obs_read.events is not None assert obs_read.gti is not None assert obs_read.aeff is not None assert obs_read.edisp is not None assert obs_read.bkg is not None assert obs_read.rad_max is None assert obs_read.obs_id == 23523 assert_allclose(obs_read.observatory_earth_location.lat.deg, -23.271778) assert_allclose(obs_read.events.table.meta["MJDREFF"], mjdreff) assert_allclose(obs_read.events.table.meta["MJDREFI"], mjdrefi) # unsupported format with pytest.raises(ValueError): obs.write(tmp_path / "foo.fits.gz", format="cool-new-format") # no irfs path = tmp_path / "obs_no_irfs.fits.gz" obs.write(path, include_irfs=False) obs_read = obs.read(path) assert obs_read.events is not None assert obs_read.gti is not None assert obs_read.aeff is None assert obs_read.edisp is None assert obs_read.bkg is None assert obs_read.rad_max is None @requires_data() def test_observation_read_write_checksum(tmp_path): obs = Observation.read( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" ) path = tmp_path / "obs.fits" obs.write(path, checksum=True) with fits.open(path) as hdul: for hdu in hdul: assert "CHECKSUM" in hdu.header assert "DATASUM" in hdu.header with open(path, "r+b") as file: chunk = file.read(10000) index = 
chunk.find("EV_CLASS".encode("ascii")) file.seek(index) file.write("BAD__KEY".encode("ascii")) with pytest.warns(UserWarning): Observation.read(path, checksum=True) @requires_data() def test_obervation_copy(data_store): obs = data_store.obs(23523) obs_copy = obs.copy() assert obs_copy.obs_id == 23523 assert isinstance(obs_copy.__dict__["_psf_hdu"], HDULocation) with pytest.raises(ValueError): _ = obs.copy(obs_id=1234) obs_copy = obs.copy(obs_id=1234, in_memory=True) assert isinstance(obs_copy.__dict__["psf"], PSF3D) assert obs_copy.obs_id == 1234 def test_observation_tmid(): from gammapy.data import GTI start = Time("2020-01-01T20:00:00") stop = Time("2020-01-01T20:10:00") expected = Time("2020-01-01T20:05:00") epoch = Time("2020-01-01T00:00:00") gti = GTI.create([(start - epoch).to(u.s)], [(stop - epoch).to(u.s)], epoch) obs = Observation(gti=gti) assert abs(obs.tmid - expected).to(u.ns) < 1 * u.us @requires_data() def test_observations_clustering(data_store): selection = dict( type="sky_circle", frame="icrs", lon="83.633 deg", lat="22.014 deg", radius="2 deg", ) obs_table = data_store.obs_table.select_observations(selection) observations = data_store.get_observations(obs_table["OBS_ID"]) coord = SkyCoord(83.63308, 22.01450, unit="deg", frame="icrs") names = ["edisp-bias", "edisp-res", "psf-radius"] features = get_irfs_features( observations, energy_true="1 TeV", position=coord, names=names ) n_features = len(names) features_array = np.array( [ features[col].data for col in features.columns if col not in ["obs_id", "dataset_name"] ] ).T assert features_array.shape == (len(observations), n_features) features = hierarchical_clustering( features, linkage_kwargs={"method": "complete"}, fcluster_kwargs={"t": 2} ) assert np.all( features["labels"].data == np.array( [ 1, 1, 2, 2, ] ) ) features = get_irfs_features( observations, energy_true="1 TeV", position=coord, names=names, apply_standard_scaler=True, ) features = hierarchical_clustering(features) features_array = np.array( [ features[col].data for col in features.columns if col not in ["obs_id", "dataset_name"] ] ).T obs_clusters = observations.group_by_label(features["labels"]) for k in range(n_features): assert_allclose(features_array[:, k].mean(), 0, atol=1e-7) assert_allclose(features_array[:, k].std(), 1, atol=1e-7) assert np.all(features["labels"].data == np.array([2, 1, 1, 1])) assert len(obs_clusters["group_1"]) == 3 assert len(obs_clusters["group_2"]) == 1 assert obs_clusters["group_2"][0].obs_id == 23523 @requires_data() def test_filter_live_time_phase(data_store): observation = data_store.obs(20136) phase_filter = {"type": "custom", "opts": dict(parameter="PHASE", band=(0.2, 0.8))} default_obs_live_time = observation.observation_live_time_duration obs_filter = ObservationFilter(event_filters=[phase_filter]) observation.obs_filter = obs_filter live_time_filter = observation.observation_live_time_duration assert_allclose(live_time_filter, default_obs_live_time * (0.8 - 0.2)) @requires_data() def test_stack_observations(data_store, caplog): obs_1 = data_store.get_observations([20136, 20137, 20151]) obs_2 = data_store.get_observations([20275, 20282]) obs12 = Observations.from_stack([obs_1, obs_2]) assert len(obs12) == 5 assert isinstance(obs12[0], Observation) obs122 = Observations.from_stack([obs12, obs_2]) assert len(obs122) == 7 assert "WARNING" in [_.levelname for _ in caplog.records] assert "Observation with obs_id 20275 already belongs to Observations." 
in [ _.message for _ in caplog.records ] caplog.clear() obs_1[2] = obs_1[0] assert "WARNING" in [_.levelname for _ in caplog.records] assert "Observation with obs_id 20136 already belongs to Observations." in [ _.message for _ in caplog.records ] with pytest.raises(TypeError): Observations.from_stack([obs_1, ["a"]]) @requires_data() def test_observations_generator(data_store): """Test Observations.generator()""" obs_1 = data_store.get_observations([20136, 20137, 20151]) for idx, obs in enumerate(obs_1.in_memory_generator()): assert isinstance(obs, Observation) assert obs.obs_id == obs_1[idx].obs_id assert isinstance(obs.events, EventList) assert isinstance(obs.psf, PSF3D) @requires_data() def test_event_setter(): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0 * u.deg, 0 * u.deg), ) location = EarthLocation(lon="-70d18m58.84s", lat="-24d41m0.34s", height="2000m") obs = Observation.create( obs_id=1, pointing=pointing, livetime=20 * u.min, irfs=irfs, location=location, ) assert obs.events is None for invalid in (5, Table(), "foo"): with pytest.raises(TypeError): obs.events = invalid events = EventList(Table()) obs.events = events assert obs.events is events @requires_data() def test_observations_getitem(data_store): """Test mask indexing on Observations""" obs_1 = data_store.get_observations([20136, 20137, 20151]) assert isinstance(obs_1[0], Observation) assert isinstance(obs_1[1:], Observations) assert len(obs_1[1:]) == 2 mask = [True, False, True] obs_2 = obs_1[mask] assert len(obs_2) == 2 assert obs_2.ids == ["20136", "20151"] ind = [0, 2] obs_2 = obs_1[ind] assert len(obs_2) == 2 assert obs_2.ids == ["20136", "20151"] obs_2 = obs_1[np.array(mask)] assert len(obs_2) == 2 assert obs_2.ids == ["20136", "20151"] assert obs_1[["20136", "20151"]].ids == ["20136", "20151"] obs_2 = obs_1[[]] assert len(obs_2) == 0 assert isinstance(obs_2, Observations) obs_2 = obs_1[np.array([])] assert len(obs_2) == 0 assert isinstance(obs_2, Observations) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_observers.py0000644000175100001770000000075614721316200021552 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from numpy.testing import assert_allclose from astropy.coordinates import Angle from gammapy.data import observatory_locations def test_observatory_locations(): location = observatory_locations["hess"] assert_allclose(location.lon.deg, Angle("16d30m00.8s").deg) assert_allclose(location.lat.deg, Angle("-23d16m18.4s").deg) assert_allclose(location.height.value, 1835) assert str(location.height.unit) == "m" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/tests/test_pointing.py0000644000175100001770000002771314721316200021371 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import ICRS, AltAz, SkyCoord from astropy.time import Time from gammapy.data import FixedPointingInfo, PointingInfo, observatory_locations from gammapy.data.pointing import PointingMode from gammapy.utils.deprecation import GammapyDeprecationWarning from gammapy.utils.fits import earth_location_to_dict from gammapy.utils.testing import assert_time_allclose, requires_data from 
gammapy.utils.time import time_ref_to_dict, time_relative_to_ref def test_fixed_pointing_icrs(): """Test new api of FixedPointingInfo in ICRS (POINTING)""" location = observatory_locations["cta_south"] fixed_icrs = SkyCoord(ra=83.28 * u.deg, dec=21.78 * u.deg) pointing = FixedPointingInfo( fixed_icrs=fixed_icrs, location=location, ) assert pointing.mode == PointingMode.POINTING assert pointing.fixed_icrs == fixed_icrs assert pointing.fixed_altaz is None obstime = Time("2020-11-01T03:00:00") altaz = pointing.get_altaz(obstime, location) icrs = pointing.get_icrs(obstime) back_trafo = altaz.transform_to("icrs") assert altaz.obstime == obstime assert isinstance(altaz.frame, AltAz) assert np.all(u.isclose(fixed_icrs.ra, icrs.ra)) assert u.isclose(back_trafo.ra, fixed_icrs.ra) assert u.isclose(back_trafo.dec, fixed_icrs.dec) obstimes = obstime + np.linspace(0, 0.25, 50) * u.hour altaz = pointing.get_altaz(obstimes, location) icrs = pointing.get_icrs(obstimes) back_trafo = altaz.transform_to("icrs") assert len(altaz) == len(obstimes) assert np.all(altaz.obstime == obstimes) assert isinstance(altaz.frame, AltAz) assert u.isclose(back_trafo.ra, fixed_icrs.ra).all() assert u.isclose(back_trafo.dec, fixed_icrs.dec).all() assert np.all(u.isclose(fixed_icrs.ra, icrs.ra)) header = pointing.to_fits_header() assert header["OBS_MODE"] == "POINTING" assert header["RA_PNT"] == fixed_icrs.ra.deg assert header["DEC_PNT"] == fixed_icrs.dec.deg def test_fixed_pointing_info_altaz(): """Test new api of FixedPointingInfo in AltAz (DRIFT)""" location = observatory_locations["cta_south"] fixed_altaz = SkyCoord(alt=70 * u.deg, az=0 * u.deg, frame=AltAz()) pointing = FixedPointingInfo( fixed_altaz=fixed_altaz, ) assert pointing.mode == PointingMode.DRIFT assert pointing.fixed_icrs is None assert pointing.fixed_altaz == fixed_altaz obstime = Time("2020-10-10T03:00:00") altaz = pointing.get_altaz(obstime=obstime, location=location) icrs = pointing.get_icrs(obstime=obstime, location=location) back_trafo = icrs.transform_to(AltAz(location=icrs.location, obstime=icrs.obstime)) assert isinstance(altaz.frame, AltAz) assert altaz.obstime == obstime assert altaz.location == location assert u.isclose(fixed_altaz.alt, altaz.alt) assert u.isclose(fixed_altaz.az, altaz.az, atol=1e-10 * u.deg) assert isinstance(icrs.frame, ICRS) assert icrs.obstime == obstime assert icrs.location == location assert u.isclose(back_trafo.alt, fixed_altaz.alt) assert u.isclose(back_trafo.az, fixed_altaz.az, atol=1e-10 * u.deg) # test multiple times at once obstimes = obstime + np.linspace(0, 0.25, 50) * u.hour altaz = pointing.get_altaz(obstime=obstimes, location=location) icrs = pointing.get_icrs(obstime=obstimes, location=location) back_trafo = icrs.transform_to(AltAz(location=icrs.location, obstime=icrs.obstime)) assert isinstance(altaz.frame, AltAz) assert np.all(altaz.obstime == obstimes) assert u.isclose(altaz.alt, fixed_altaz.alt).all() assert u.isclose(altaz.az, fixed_altaz.az).all() assert isinstance(icrs.frame, ICRS) assert u.isclose(back_trafo.alt, fixed_altaz.alt).all() assert u.isclose(back_trafo.az, fixed_altaz.az, atol=1e-10 * u.deg).all() header = pointing.to_fits_header() assert header["OBS_MODE"] == "DRIFT" assert header["AZ_PNT"] == fixed_altaz.az.deg assert header["ALT_PNT"] == fixed_altaz.alt.deg @requires_data() def test_read_gadf_drift(): """Test for reading FixedPointingInfo from GADF drift eventlist""" pointing = FixedPointingInfo.read( "$GAMMAPY_DATA/hawc/crab_events_pass4/events/EventList_Crab_fHitbin5GP.fits.gz" ) assert 
pointing.mode is PointingMode.DRIFT assert pointing.fixed_icrs is None assert isinstance(pointing.fixed_altaz, AltAz) assert pointing.fixed_altaz.alt == 0 * u.deg assert pointing.fixed_altaz.az == 0 * u.deg @requires_data() def test_read_gadf_pointing(): """Test for reading FixedPointingInfo from GADF pointing eventlist""" pointing = FixedPointingInfo.read( "$GAMMAPY_DATA/magic/rad_max/data/20131004_05029747_DL3_CrabNebula-W0.40+035.fits" ) assert pointing.mode is PointingMode.POINTING assert pointing.fixed_altaz is None assert isinstance(pointing.fixed_icrs.frame, ICRS) assert u.isclose(pointing.fixed_icrs.ra, 83.98333 * u.deg) assert u.isclose(pointing.fixed_icrs.dec, 22.24389 * u.deg) @requires_data() class TestFixedPointingInfo: @classmethod def setup_class(cls): filename = "$GAMMAPY_DATA/tests/pointing_table.fits.gz" cls.fpi = FixedPointingInfo.read(filename) def test_location(self): lon, lat, height = self.fpi._location.geodetic assert_allclose(lon.deg, 16.5002222222222) assert_allclose(lat.deg, -23.2717777777778) assert_allclose(height.value, 1834.999999999783) def test_time_ref(self): expected = Time(51910.00074287037, format="mjd", scale="tt") assert_time_allclose(self.fpi._time_ref, expected) def test_time_start(self): expected = Time(53025.826414166666, format="mjd", scale="tt") assert_time_allclose(self.fpi._time_start, expected) def test_time_stop(self): expected = Time(53025.844770648146, format="mjd", scale="tt") assert_time_allclose(self.fpi._time_stop, expected) def test_fixed_altaz(self): assert self.fpi._fixed_altaz is None def test_fixed_icrs(self): fixed_icrs = self.fpi._fixed_icrs assert_allclose(fixed_icrs.ra.deg, 83.633333333333) assert_allclose(fixed_icrs.dec.deg, 24.514444444444) @requires_data() class TestPointingInfo: @classmethod def setup_class(cls): filename = "$GAMMAPY_DATA/tests/pointing_table.fits.gz" cls.pointing_info = PointingInfo.read(filename) def test_str(self): ss = str(self.pointing_info) assert "Pointing info" in ss def test_location(self): lon, lat, height = self.pointing_info.location.geodetic assert_allclose(lon.deg, 16.5002222222222) assert_allclose(lat.deg, -23.2717777777778) assert_allclose(height.value, 1834.999999999783) def test_time_ref(self): expected = Time(51910.00074287037, format="mjd", scale="tt") assert_time_allclose(self.pointing_info.time_ref, expected) def test_table(self): assert len(self.pointing_info.table) == 100 def test_time(self): time = self.pointing_info.time assert len(time) == 100 expected = Time(53025.826414166666, format="mjd", scale="tt") assert_time_allclose(time[0], expected) def test_duration(self): duration = self.pointing_info.duration assert_allclose(duration.sec, 1586.0000000044238) def test_radec(self): pos = self.pointing_info.radec[0] assert_allclose(pos.ra.deg, 83.633333333333) assert_allclose(pos.dec.deg, 24.51444444) assert pos.name == "icrs" def test_altaz(self): pos = self.pointing_info.altaz[0] assert_allclose(pos.az.deg, 11.45751357) assert_allclose(pos.alt.deg, 41.34088901) assert pos.name == "altaz" def test_altaz_from_table(self): pos = self.pointing_info.altaz_from_table[0] assert_allclose(pos.az.deg, 11.20432353385406) assert_allclose(pos.alt.deg, 41.37921408774436) assert pos.name == "altaz" def test_altaz_interpolate(self): time = self.pointing_info.time[0] pos = self.pointing_info.altaz_interpolate(time) assert_allclose(pos.az.deg, 11.45751357) assert_allclose(pos.alt.deg, 41.34088901) assert pos.name == "altaz" @pytest.mark.parametrize( ("obs_mode"), [ ("POINTING"), ("WOBBLE"), ("SCAN"), 
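# WOBBLE and SCAN also describe a fixed ICRS pointing, so the test body
# below expects FixedPointingInfo to fall back to PointingMode.POINTING
# for all three OBS_MODE values (the mode is assumed when not given explicitly).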
], ) def test_fixed_pointing_info_fixed_icrs_from_meta(obs_mode): location = observatory_locations["cta_south"] start = Time("2020-11-01T03:00:00") stop = Time("2020-11-01T03:15:00") ref = Time("2020-11-01T00:00:00") fixed_icrs = SkyCoord(ra=83.28 * u.deg, dec=21.78 * u.deg) meta = time_ref_to_dict(ref) meta["TSTART"] = time_relative_to_ref(start, meta).to_value(u.s) meta["TSTOP"] = time_relative_to_ref(stop, meta).to_value(u.s) meta.update(earth_location_to_dict(location)) meta["OBS_MODE"] = obs_mode meta["RA_PNT"] = fixed_icrs.ra.deg meta["DEC_PNT"] = fixed_icrs.dec.deg with pytest.warns(GammapyDeprecationWarning): pointing = FixedPointingInfo(meta=meta) # not given, but assumed if missing assert pointing.mode == PointingMode.POINTING assert pointing.fixed_icrs == fixed_icrs assert pointing.fixed_altaz is None altaz = pointing.get_altaz(start) icrs = pointing.get_icrs(start) assert altaz.obstime == start assert isinstance(altaz.frame, AltAz) assert np.all(u.isclose(fixed_icrs.ra, icrs.ra)) back_trafo = altaz.transform_to("icrs") assert u.isclose(back_trafo.ra, fixed_icrs.ra) assert u.isclose(back_trafo.dec, fixed_icrs.dec) times = start + np.linspace(0, 1, 50) * (stop - start) altaz = pointing.get_altaz(times) icrs = pointing.get_icrs(times) assert len(altaz) == len(times) assert np.all(altaz.obstime == times) assert isinstance(altaz.frame, AltAz) back_trafo = altaz.transform_to("icrs") assert u.isclose(back_trafo.ra, fixed_icrs.ra).all() assert u.isclose(back_trafo.dec, fixed_icrs.dec).all() assert np.all(u.isclose(fixed_icrs.ra, icrs.ra)) def test_fixed_pointing_info_fixed_altaz_from_meta(): location = observatory_locations["cta_south"] start = Time("2020-11-01T03:00:00") stop = Time("2020-11-01T03:15:00") ref = Time("2020-11-01T00:00:00") pointing_icrs = SkyCoord(ra=83.28 * u.deg, dec=21.78 * u.deg) fixed_altaz = pointing_icrs.transform_to(AltAz(obstime=start, location=location)) meta = time_ref_to_dict(ref) meta["TSTART"] = time_relative_to_ref(start, meta).to_value(u.s) meta["TSTOP"] = time_relative_to_ref(stop, meta).to_value(u.s) meta.update(earth_location_to_dict(location)) meta["OBS_MODE"] = "DRIFT" meta["ALT_PNT"] = fixed_altaz.alt.deg meta["AZ_PNT"] = fixed_altaz.az.deg with pytest.warns(GammapyDeprecationWarning): pointing = FixedPointingInfo(meta=meta) # not given, but assumed if missing assert pointing.mode == PointingMode.DRIFT assert pointing.fixed_icrs is None assert u.isclose(pointing.fixed_altaz.alt, fixed_altaz.alt) assert u.isclose(pointing.fixed_altaz.az, fixed_altaz.az) icrs = pointing.get_icrs(start) assert icrs.obstime == start assert isinstance(icrs.frame, ICRS) back_trafo = icrs.transform_to(fixed_altaz.frame) assert u.isclose(back_trafo.alt, fixed_altaz.alt) assert u.isclose(back_trafo.az, fixed_altaz.az) times = start + np.linspace(0, 1, 50) * (stop - start) icrs = pointing.get_icrs(times) altaz = pointing.get_altaz(times) assert len(icrs) == len(times) assert np.all(icrs.obstime == times) assert isinstance(icrs.frame, ICRS) back_trafo = icrs.transform_to(AltAz(location=location, obstime=times)) assert u.isclose(back_trafo.alt, fixed_altaz.alt).all() assert u.isclose(back_trafo.az, fixed_altaz.az).all() assert np.all(u.isclose(fixed_altaz.alt, altaz.alt)) assert np.all(u.isclose(fixed_altaz.az, altaz.az)) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/data/utils.py0000644000175100001770000001250614721316200016473 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see 
LICENSE.rst import numpy as np import astropy.units as u from astropy.table import Table from gammapy.utils.cluster import standard_scaler __all__ = ["get_irfs_features"] def get_irfs_features( observations, energy_true, position=None, fixed_offset=None, names=None, containment_fraction=0.68, apply_standard_scaler=False, ): """Get features from IRF properties at a given position. Used for observation clustering. Parameters ---------- observations : `~gammapy.data.Observations` Container holding a list of `~gammapy.data.Observation`. energy_true : `~astropy.units.Quantity` True energy at which to compute the containment radius. position : `~astropy.coordinates.SkyCoord`, optional Sky position. Default is None. fixed_offset : `~astropy.coordinates.Angle`, optional Offset calculated from the pointing position. Default is None. If neither `position` nor `fixed_offset` is specified, the position of the center of the map is used by default. names : {"edisp-bias", "edisp-res", "psf-radius"} IRF properties to be considered. Default is None. If None, all the features are computed. containment_fraction : float, optional Containment fraction used to compute the `psf-radius`. Default is 0.68. apply_standard_scaler : bool, optional Whether to standardize features by removing the mean and scaling to unit variance. Default is False. Returns ------- features : `~astropy.table.Table` Features table. Examples -------- Compute the IRF features for "edisp-bias", "edisp-res" and "psf-radius" at 1 TeV:: >>> from gammapy.data.utils import get_irfs_features >>> from gammapy.data import DataStore >>> from gammapy.utils.cluster import standard_scaler >>> from astropy.coordinates import SkyCoord >>> import astropy.units as u >>> selection = dict( ... type="sky_circle", ... frame="icrs", ... lon="329.716 deg", ... lat="-30.225 deg", ... radius="2 deg", ... ) >>> data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") >>> obs_table = data_store.obs_table.select_observations(selection) >>> obs = data_store.get_observations(obs_table["OBS_ID"][:3]) >>> position = SkyCoord(329.716 * u.deg, -30.225 * u.deg, frame="icrs") >>> names = ["edisp-bias", "edisp-res", "psf-radius"] >>> features_irfs = get_irfs_features( ... obs, ... energy_true="1 TeV", ... position=position, ... names=names, ...
) >>> print(features_irfs) edisp-bias obs_id edisp-res psf-radius deg ------------------- ------ ------------------- ------------------- 0.11587179071752986 33787 0.368346217294295 0.14149953611195087 0.04897634344908595 33788 0.33983991887701287 0.11553325504064559 0.033176650892097 33789 0.32377509405904137 0.10262943822890519 """ from gammapy.irf import EDispKernelMap, PSFMap if names is None: names = ["edisp-bias", "edisp-res", "psf-radius"] if position and fixed_offset: raise ValueError( "`position` and `fixed_offset` arguments are mutually exclusive" ) rows = [] for obs in observations: psf_kwargs = dict(fraction=containment_fraction, energy_true=energy_true) if isinstance(obs.psf, PSFMap) and isinstance(obs.edisp, EDispKernelMap): if position is None: position = obs.psf.psf_map.geom.center_skydir edisp_kernel = obs.edisp.get_edisp_kernel(position=position) psf_kwargs["position"] = position else: if fixed_offset is None: if position is None: offset = 0 * u.deg else: offset_max = np.minimum( obs.psf.axes["offset"].center[-1], obs.edisp.axes["offset"].center[-1], ) offset = np.minimum( position.separation(obs.get_pointing_icrs(obs.tmid)), offset_max ) else: offset = fixed_offset edisp_kernel = obs.edisp.to_edisp_kernel(offset) psf_kwargs["offset"] = offset data = {} for name in names: if name == "edisp-bias": data[name] = edisp_kernel.get_bias(energy_true)[0] if name == "edisp-res": data[name] = edisp_kernel.get_resolution(energy_true)[0] if name == "psf-radius": data[name] = obs.psf.containment_radius(**psf_kwargs).to("deg") data["obs_id"] = obs.obs_id rows.append(data) features = Table(rows) if apply_standard_scaler: features = standard_scaler(features) features.meta = dict( energy_true=energy_true, fixed_offset=fixed_offset, containment_fraction=containment_fraction, apply_standard_scaler=apply_standard_scaler, ) if position: features.meta["lon"] = position.galactic.l features.meta["lat"] = position.galactic.b features.meta["frame"] = "galactic" return features ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.196642 gammapy-1.3/gammapy/datasets/0000755000175100001770000000000014721316215015662 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/__init__.py0000644000175100001770000000245514721316200017773 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from gammapy.utils.registry import Registry from .core import Dataset, Datasets from .flux_points import FluxPointsDataset from .io import OGIPDatasetReader, OGIPDatasetWriter from .map import ( MapDataset, MapDatasetOnOff, create_empty_map_dataset_from_irfs, create_map_dataset_from_observation, create_map_dataset_geoms, ) from .metadata import MapDatasetMetaData from .simulate import MapDatasetEventSampler, ObservationEventSampler from .spectrum import SpectrumDataset, SpectrumDatasetOnOff from .utils import apply_edisp, split_dataset DATASET_REGISTRY = Registry( [ MapDataset, MapDatasetOnOff, SpectrumDataset, SpectrumDatasetOnOff, FluxPointsDataset, ] ) """Registry of dataset classes in Gammapy.""" __all__ = [ "create_empty_map_dataset_from_irfs", "create_map_dataset_from_observation", "create_map_dataset_geoms", "Dataset", "DATASET_REGISTRY", "Datasets", "FluxPointsDataset", "MapDataset", "MapDatasetEventSampler", "MapDatasetOnOff", "ObservationEventSampler", "OGIPDatasetWriter", "OGIPDatasetReader", "SpectrumDataset", "SpectrumDatasetOnOff", "MapDatasetMetaData", 
"apply_edisp", "split_dataset", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/actors.py0000644000175100001770000001662214721316200017530 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy import inspect import logging import numpy as np from gammapy.modeling.models import DatasetModels from .core import Dataset, Datasets from .map import MapDataset log = logging.getLogger(__name__) __all__ = ["DatasetsActor"] class DatasetsActor(Datasets): """A modified Dataset collection for parallel evaluation using ray actors. Support only datasets composed of MapDataset. Parameters ---------- datasets : `Datasets` Datasets """ def __init__(self, datasets=None): from ray import get log.warning( "Gammapy support for parallelisation with ray is still a prototype and is not fully functional." ) if datasets is not None: datasets_list = [] while datasets: d0 = datasets[0] if d0.tag != "MapDataset": raise TypeError( f"For now datasets parallel evaluation is only supported for MapDataset, got {d0.tag} instead" ) if isinstance(d0, MapDatasetActor): datasets_list.append(d0) else: datasets_list.append(MapDatasetActor(d0)) datasets.remove(d0) # moved to remote so removed from main process self._datasets = datasets_list self._ray_get = get self._covariance = None # trigger actors auto_init_wrapper (so overhead so appears on init) self.name def insert(self, idx, dataset): if isinstance(dataset, Dataset): if dataset.name in self.names: raise (ValueError("Dataset names must be unique")) self._datasets.insert(idx, MapDatasetActor(dataset)) else: raise TypeError(f"Invalid type: {type(dataset)!r}") def __getattr__(self, name): """Get attribute from remote each dataset.""" def wrapper(*args, **kwargs): results = self._ray_get( [ d._get_remote(name, *args, **kwargs, from_actors=True) for d in self._datasets ] ) if "plot" in name: results = [res(**kwargs) for res in results] for d in self._datasets: d._to_update = {} return results if inspect.ismethod(getattr(self._datasets[0], name)): return wrapper else: return wrapper() def stat_sum(self): """Compute joint likelihood.""" results = self._ray_get([d._update_stat_sum_remote() for d in self._datasets]) return np.sum(results) class RayFrontendMixin(object): """Ray mixin for a local class that interact with a remote instance.""" # TODO: abstract class ? @property def _remote_attr(self): return [ key for key in self._public_attr if key not in self._mutable_attr + self._local_attr ] def __getattr__(self, name): """Get attribute from remote.""" if name in self._remote_attr: results = self._ray_get(self._get_remote(name, from_actors=False)) return results else: return super().__getattribute__(name) def __setattr__(self, name, value): if name in self._remote_attr: raise AttributeError("can't set attribute") else: super().__setattr__(name, value) if name in self._mutable_attr: self._to_update[name] = value self._cache[name] = copy.deepcopy(value) def _get_remote(self, attr, *args, from_actors=False, **kwargs): results = self._actor._get.remote( attr, *args, **kwargs, to_update=self._to_update, from_actors=from_actors ) self._to_update = {} return results class MapDatasetActor(RayFrontendMixin): """MapDataset for parallel evaluation as a ray actor. 
Parameters ---------- dataset : `MapDataset` MapDataset """ _mutable_attr = ["models", "mask_fit"] _local_attr = ["name"] # immutable and small enough to keep and access locally _public_attr = [key for key in dir(MapDataset) if not key.startswith("__")] def __init__(self, dataset): from ray import get, remote self._ray_get = get self._actor = remote(_MapDatasetActorBackend).remote() self._actor._from_dataset.remote(dataset) self._name = dataset.name self._to_update = {} self._cache = {} if dataset.models is None: self.models = DatasetModels() else: self.models = dataset.models self.mask_fit = dataset.mask_fit self._to_update = {} # models and mask_fit are already OK from actor init @property def name(self): return self._name def _update_stat_sum_remote(self): self._check_parameters() values = self.models.parameters.free_parameters.value output = self._actor._update_stat_sum.remote(values, to_update=self._to_update) self._to_update = {} return output def __setattr__(self, name, value): if name == "models": if value is None: value = DatasetModels() value = value.select(datasets_names=self.name) super().__setattr__(name, value) def _get_remote(self, attr, *args, from_actors=False, **kwargs): self._check_models() results = super()._get_remote(attr, *args, from_actors=from_actors, **kwargs) return results def _check_models(self): if ~np.all( self.models.parameters.value == self._cache["models"].parameters.value ) or len(self.models.parameters.free_parameters) != len( self._cache["models"].parameters.free_parameters ): self._to_update["models"] = self.models self._cache["models"] = self.models.copy() def _check_parameters(self): if self.models.parameters.names != self._cache[ "models" ].parameters.names or len(self.models.parameters.free_parameters) != len( self._cache["models"].parameters.free_parameters ): self._to_update["models"] = self.models self._cache["models"] = self.models.copy() class RayBackendMixin: """Ray mixin for the remote class.""" def _get(self, name, *args, to_update={}, from_actors=False, **kwargs): for key, value in to_update.items(): setattr(self, key, value) res = getattr(self, name) if isinstance(res, property): res = res() elif inspect.ismethod(res) and from_actors and "plot" not in name: try: res = res(*args, **kwargs) except TypeError: return res return res class _MapDatasetActorBackend(MapDataset, RayBackendMixin): """MapDataset backend for parallel evaluation as a ray actor. Parameters ---------- dataset : `MapDataset` MapDataset """ def _from_dataset(self, dataset): self.__dict__.update(dataset.__dict__) if self.models is None: self.models = DatasetModels() def _update_stat_sum(self, values, to_update={}): for key, value in to_update.items(): setattr(self, key, value) self.models.parameters.free_parameters.value = values return self.stat_sum() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/core.py0000644000175100001770000004447214721316200017171 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import collections.abc import copy import html import logging import numpy as np from astropy import units as u from astropy.table import Table, vstack from gammapy.data import GTI from gammapy.modeling.models import DatasetModels, Models from gammapy.utils.scripts import make_name, make_path, read_yaml, to_yaml, write_yaml log = logging.getLogger(__name__) __all__ = ["Dataset", "Datasets"] class Dataset(abc.ABC): """Dataset abstract base class.
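Concrete subclasses must define a ``tag`` and implement ``stat_array``;
``stat_sum`` then sums the (optionally masked) statistic values and adds
any parameter priors. A minimal sketch of a custom dataset (illustrative
only, not a complete implementation; the class name is hypothetical)::

    import numpy as np
    from gammapy.datasets import Dataset

    class ChiSquaredDataset(Dataset):
        tag = "ChiSquaredDataset"

        def __init__(self, data, model_values, name="chi2-dataset"):
            self._name = name
            self.data = data
            self.model_values = model_values
            self.models = None
            self.mask_fit = None
            self.mask_safe = None

        def stat_array(self):
            # chi-squared per data point, assuming unit measurement errors
            return (self.data - self.model_values) ** 2

    dataset = ChiSquaredDataset(np.ones(3), np.zeros(3))
    print(dataset.stat_sum())  # 3.0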
For now, see existing examples of types of datasets: - `gammapy.datasets.MapDataset` - `gammapy.datasets.SpectrumDataset` - `gammapy.datasets.FluxPointsDataset` For more information see :ref:`datasets`. TODO: add a tutorial on how to create your own dataset types. """ _residuals_labels = { "diff": "data - model", "diff/model": "(data - model) / model", "diff/sqrt(model)": "(data - model) / sqrt(model)", } def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property @abc.abstractmethod def tag(self): pass @property def name(self): return self._name def to_dict(self): """Convert to dict for YAML serialization.""" name = self.name.replace(" ", "_") filename = f"{name}.fits" return {"name": self.name, "type": self.tag, "filename": filename} @property def mask(self): """Combined fit and safe mask.""" if self.mask_safe is not None and self.mask_fit is not None: return self.mask_safe & self.mask_fit elif self.mask_fit is not None: return self.mask_fit elif self.mask_safe is not None: return self.mask_safe def stat_sum(self): """Total statistic given the current model parameters and priors.""" stat = self.stat_array() if self.mask is not None: stat = stat[self.mask.data] prior_stat_sum = 0.0 if self.models is not None: prior_stat_sum = self.models.parameters.prior_stat_sum() return np.sum(stat, dtype=np.float64) + prior_stat_sum @abc.abstractmethod def stat_array(self): """Statistic array, one value per data point.""" def copy(self, name=None): """A deep copy. Parameters ---------- name : str, optional Name of the copied dataset. Default is None. Returns ------- dataset : `Dataset` Copied datasets. """ new = copy.deepcopy(self) name = make_name(name) new._name = name # TODO: check the model behaviour? new.models = None return new @staticmethod def _compute_residuals(data, model, method="diff"): with np.errstate(invalid="ignore"): if method == "diff": residuals = data - model elif method == "diff/model": residuals = (data - model) / model elif method == "diff/sqrt(model)": residuals = (data - model) / np.sqrt(model) else: raise AttributeError( f"Invalid method: {method!r} for computing residuals" ) return residuals class Datasets(collections.abc.MutableSequence): """Container class that holds a list of datasets. Parameters ---------- datasets : `Dataset` or list of `Dataset` Datasets. """ def __init__(self, datasets=None): if datasets is None: datasets = [] if isinstance(datasets, Datasets): datasets = datasets._datasets elif isinstance(datasets, Dataset): datasets = [datasets] elif not isinstance(datasets, list): raise TypeError(f"Invalid type: {datasets!r}") unique_names = [] for dataset in datasets: if dataset.name in unique_names: raise (ValueError("Dataset names must be unique")) unique_names.append(dataset.name) self._datasets = datasets self._covariance = None @property def parameters(self): """Unique parameters (`~gammapy.modeling.Parameters`). Duplicate parameter objects have been removed. The order of the unique parameters remains. """ return self.models.parameters.unique_parameters @property def models(self): """Unique models (`~gammapy.modeling.Models`). Duplicate model objects have been removed. The order of the unique models remains. """ models = {} for dataset in self: if dataset.models is not None: for model in dataset.models: models[model] = model models = DatasetModels(list(models.keys())) if self._covariance and self._covariance.parameters == models.parameters: return DatasetModels(models, covariance_data=self._covariance.data) else: return models @models.setter def models(self, models): """Unique models (`~gammapy.modeling.Models`). Duplicate model objects have been removed. The order of the unique models remains. 
""" if models: self._covariance = DatasetModels(models).covariance for dataset in self: dataset.models = models @property def names(self): return [d.name for d in self._datasets] @property def is_all_same_type(self): """Whether all contained datasets are of the same type.""" return len(set(_.__class__ for _ in self)) == 1 @property def is_all_same_shape(self): """Whether all contained datasets have the same data shape.""" return len(set(_.data_shape for _ in self)) == 1 @property def is_all_same_energy_shape(self): """Whether all contained datasets have the same data shape.""" return len(set(_.data_shape[0] for _ in self)) == 1 @property def energy_axes_are_aligned(self): """Whether all contained datasets have aligned energy axis.""" axes = [d.counts.geom.axes["energy"] for d in self] return np.all([axes[0].is_aligned(ax) for ax in axes]) @property def contributes_to_stat(self): """Stat contributions. Returns ------- contributions : `~numpy.array` Array indicating which dataset contributes to the likelihood. """ contributions = [] for dataset in self: if dataset.mask is not None: value = np.any(dataset.mask) else: value = True contributions.append(value) return np.array(contributions) def stat_sum(self): """Compute joint statistic function value.""" stat_sum = 0 # TODO: add parallel evaluation of likelihoods for dataset in self: stat_sum += dataset.stat_sum() return stat_sum def select_time(self, time_min, time_max, atol="1e-6 s"): """Select datasets in a given time interval. Parameters ---------- time_min, time_max : `~astropy.time.Time` Time interval. atol : `~astropy.units.Quantity` Tolerance value for time comparison with different scale. Default 1e-6 sec. Returns ------- datasets : `Datasets` Datasets in the given time interval. """ atol = u.Quantity(atol) datasets = [] for dataset in self: t_start = dataset.gti.time_start[0] t_stop = dataset.gti.time_stop[-1] if t_start >= (time_min - atol) and t_stop <= (time_max + atol): datasets.append(dataset) return self.__class__(datasets) def slice_by_energy(self, energy_min, energy_max): """Select and slice datasets in energy range. The method keeps the current dataset names. Datasets that do not contribute to the selected energy range are dismissed. Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Energy bounds to compute the flux point for. Returns ------- datasets : Datasets Datasets. """ datasets = [] for dataset in self: try: dataset_sliced = dataset.slice_by_energy( energy_min=energy_min, energy_max=energy_max, name=dataset.name, ) except ValueError: log.info( f"Dataset {dataset.name} does not contribute in the energy range" ) continue datasets.append(dataset_sliced) return self.__class__(datasets=datasets) def to_spectrum_datasets(self, region): """Extract spectrum datasets for the given region. To get more detailed information, see the corresponding function associated to each dataset type: `~gammapy.datasets.MapDataset.to_spectrum_dataset` or `~gammapy.datasets.MapDatasetOnOff.to_spectrum_dataset`. Parameters ---------- region : `~regions.SkyRegion` Region definition. Returns ------- datasets : `Datasets` List of `~gammapy.datasets.SpectrumDataset`. """ datasets = Datasets() for dataset in self: spectrum_dataset = dataset.to_spectrum_dataset( on_region=region, name=dataset.name ) datasets.append(spectrum_dataset) return datasets @property # TODO: make this a method to support different methods? def energy_ranges(self): """Get global energy range of datasets. 
The energy range is derived as the minimum / maximum of the energy ranges of all datasets. Returns ------- energy_min, energy_max : `~astropy.units.Quantity` Energy range. """ energy_mins, energy_maxs = [], [] for dataset in self: energy_axis = dataset.counts.geom.axes["energy"] energy_mins.append(energy_axis.edges[0]) energy_maxs.append(energy_axis.edges[-1]) return u.Quantity(energy_mins), u.Quantity(energy_maxs) def __str__(self): str_ = self.__class__.__name__ + "\n" str_ += "--------\n\n" for idx, dataset in enumerate(self): str_ += f"Dataset {idx}: \n\n" str_ += f"\tType : {dataset.tag}\n" str_ += f"\tName : {dataset.name}\n" try: instrument = set(dataset.meta_table["TELESCOP"]).pop() except (KeyError, TypeError): instrument = "" str_ += f"\tInstrument : {instrument}\n" if dataset.models: names = dataset.models.names else: names = "" str_ += f"\tModels : {names}\n\n" return str_.expandtabs(tabsize=2) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def copy(self): """A deep copy.""" return copy.deepcopy(self) @classmethod def read(cls, filename, filename_models=None, lazy=True, cache=True, checksum=True): """De-serialize datasets from YAML and FITS files. Parameters ---------- filename : str or `~pathlib.Path` File path or name of datasets yaml file. filename_models : str or `~pathlib.Path`, optional File path or name of models yaml file. Default is None. lazy : bool Whether to lazy load data into memory. Default is True. cache : bool Whether to cache the data after loading. Default is True. checksum : bool Whether to perform checksum verification. Default is False. Returns ------- dataset : `gammapy.datasets.Datasets` Datasets. """ from . import DATASET_REGISTRY filename = make_path(filename) data_list = read_yaml(filename, checksum=checksum) datasets = [] for data in data_list["datasets"]: path = filename.parent if (path / data["filename"]).exists(): data["filename"] = str(make_path(path / data["filename"])) dataset_cls = DATASET_REGISTRY.get_cls(data["type"]) dataset = dataset_cls.from_dict(data, lazy=lazy, cache=cache) datasets.append(dataset) datasets = cls(datasets) if filename_models: datasets.models = Models.read(filename_models, checksum=checksum) return datasets def write( self, filename, filename_models=None, overwrite=False, write_covariance=True, checksum=False, ): """Serialize datasets to YAML and FITS files. Parameters ---------- filename : str or `~pathlib.Path` File path or name of datasets yaml file. filename_models : str or `~pathlib.Path`, optional File path or name of models yaml file. Default is None. overwrite : bool, optional Overwrite existing file. Default is False. write_covariance : bool save covariance or not. Default is False. checksum : bool When True adds both DATASUM and CHECKSUM cards to the headers written to the FITS files. Default is False. """ path = make_path(filename) data = {"datasets": []} for dataset in self._datasets: d = dataset.to_dict() filename = d["filename"] dataset.write( path.parent / filename, overwrite=overwrite, checksum=checksum ) data["datasets"].append(d) if path.exists() and not overwrite: raise IOError(f"File exists already: {path}") yaml_str = to_yaml(data) write_yaml(yaml_str, path, checksum=checksum, overwrite=overwrite) if filename_models: self.models.write( filename_models, overwrite=overwrite, write_covariance=write_covariance, checksum=checksum, ) def stack_reduce(self, name=None, nan_to_num=True): """Reduce the Datasets to a unique Dataset by stacking them together. This works only if all datasets are of the same type and with aligned geometries, and if a proper in-place stack method exists for the Dataset type. For details, see :ref:`stack`. Parameters ---------- name : str, optional Name of the stacked dataset. Default is None. nan_to_num : bool Non-finite values are replaced by zero if True. Default is True. Returns ------- dataset : `~gammapy.datasets.Dataset` The stacked dataset. """ if not self.is_all_same_type: raise ValueError( "Stacking impossible: all Datasets contained are not of a unique type." ) stacked = self[0].to_masked(name=name, nan_to_num=nan_to_num) for dataset in self[1:]: stacked.stack(dataset, nan_to_num=nan_to_num) return stacked def info_table(self, cumulative=False): """Get info table for datasets. Parameters ---------- cumulative : bool Cumulate information across all datasets. If True, all model-dependent information will be lost. Default is False. Returns ------- info_table : `~astropy.table.Table` Info table. 
""" if not self.is_all_same_type: raise ValueError("Info table not supported for mixed dataset type.") rows = [] if cumulative: name = "stacked" stacked = self[0].to_masked(name=name) rows.append(stacked.info_dict()) for dataset in self[1:]: stacked.stack(dataset) rows.append(stacked.info_dict()) else: for dataset in self: rows.append(dataset.info_dict()) return Table(rows) # TODO: merge with meta table? @property def gti(self): """GTI table.""" time_intervals = [] for dataset in self: if dataset.gti is not None and len(dataset.gti.table) > 0: interval = (dataset.gti.time_start[0], dataset.gti.time_stop[-1]) time_intervals.append(interval) if len(time_intervals) == 0: return None return GTI.from_time_intervals(time_intervals) @property def meta_table(self): """Meta table.""" tables = [d.meta_table for d in self] if np.all([table is None for table in tables]): meta_table = Table() else: meta_table = vstack(tables).copy() meta_table.add_column([d.tag for d in self], index=0, name="TYPE") meta_table.add_column(self.names, index=0, name="NAME") return meta_table def __getitem__(self, key): return self._datasets[self.index(key)] def __delitem__(self, key): del self._datasets[self.index(key)] def __setitem__(self, key, dataset): if isinstance(dataset, Dataset): if dataset.name in self.names: raise (ValueError("Dataset names must be unique")) self._datasets[self.index(key)] = dataset else: raise TypeError(f"Invalid type: {type(dataset)!r}") def insert(self, idx, dataset): if isinstance(dataset, Dataset): if dataset.name in self.names: raise (ValueError("Dataset names must be unique")) self._datasets.insert(idx, dataset) else: raise TypeError(f"Invalid type: {type(dataset)!r}") def index(self, key): if isinstance(key, (int, slice)): return key elif isinstance(key, str): return self.names.index(key) elif isinstance(key, Dataset): return self._datasets.index(key) else: raise TypeError(f"Invalid type: {type(key)!r}") def __len__(self): return len(self._datasets) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/evaluator.py0000644000175100001770000004625114721316200020240 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import logging import numpy as np import astropy.units as u from astropy.coordinates import angular_separation from astropy.utils import lazyproperty from regions import CircleSkyRegion import matplotlib.pyplot as plt from gammapy.irf import EDispKernel, PSFKernel from gammapy.maps import HpxNDMap, Map, RegionNDMap, WcsNDMap from gammapy.modeling.models import PointSpatialModel, TemplateNPredModel from .utils import apply_edisp PSF_MAX_RADIUS = None PSF_CONTAINMENT = 0.999 CUTOUT_MARGIN = 0.1 * u.deg log = logging.getLogger(__name__) class MapEvaluator: """Sky model evaluation on maps. Evaluates a sky model on a 3D map and returns a map of the predicted counts. The convolution with IRFs will be performed as defined in the sky_model. To do so, IRF kernels are extracted at the position closest to the position of the model. Parameters ---------- model : `~gammapy.modeling.models.SkyModel` Sky model. exposure : `~gammapy.maps.Map` Exposure map. psf : `~gammapy.irf.PSFKernel` PSF kernel. edisp : `~gammapy.irf.EDispKernel` Energy dispersion. mask : `~gammapy.maps.Map` Mask to apply to the likelihood for fitting. gti : `~gammapy.data.GTI` GTI of the observation or union of GTI if it is a stacked observation. evaluation_mode : {"local", "global"} Model evaluation mode. 
The "local" mode evaluates the model components on smaller grids to save computation time. This mode is recommended for local optimization algorithms. The "global" evaluation mode evaluates the model components on the full map. This mode is recommended for global optimization algorithms. use_cache : bool Use npred caching. """ def __init__( self, model, exposure=None, psf=None, edisp=None, gti=None, mask=None, evaluation_mode="local", use_cache=True, ): self.model = model self.exposure = exposure self.psf = psf self.edisp = edisp self.mask = mask self.gti = gti self.use_cache = use_cache self.contributes = True self.psf_containment = None self._geom_reco_axis = None if evaluation_mode not in {"local", "global"}: raise ValueError(f"Invalid evaluation_mode: {evaluation_mode!r}") self.evaluation_mode = evaluation_mode # TODO: this is preliminary solution until we have further unified the model handling if ( isinstance(self.model, TemplateNPredModel) or self.model.spatial_model is None or self.model.evaluation_radius is None ): self.evaluation_mode = "global" # define cached computations self._cached_parameter_values = None self._cached_parameter_values_previous = None self._cached_parameter_values_spatial = None self._cached_position = (0, 0) self._computation_cache = None def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def reset_cache_properties(self): """Reset cached properties.""" del self._compute_npred del self._compute_flux_spatial @property def geom(self): """True energy map geometry (`~gammapy.maps.Geom`).""" if self.exposure is not None: return self.exposure.geom else: return None @property def _geom_reco(self): if self.edisp is not None: energy_axis = self.edisp.axes["energy"] elif self._geom_reco_axis is not None: energy_axis = self._geom_reco_axis else: energy_axis = self.geom.axes["energy_true"] geom = self.geom.to_image().to_cube(axes=[energy_axis.copy(name="energy")]) return geom @property def needs_update(self): """Check whether the model component has drifted away from its support.""" if isinstance(self.model, TemplateNPredModel): return False elif not self.contributes: return False elif self.exposure is None: return True elif self.geom.is_region: return False elif self.evaluation_mode == "global" or self.model.evaluation_radius is None: return False elif not self.parameters_spatial_changed(reset=False): return False else: return self.irf_position_changed @property def psf_width(self): """Width of the PSF.""" if self.psf is not None: psf_width = np.max(self.psf.psf_kernel_map.geom.width) else: psf_width = 0 * u.deg return psf_width def use_psf_containment(self, geom): """Use PSF containment for point sources and circular regions.""" if not geom.is_region: return False is_point_model = isinstance(self.model.spatial_model, PointSpatialModel) is_circle_region = isinstance(geom.region, CircleSkyRegion) return is_point_model & is_circle_region @lazyproperty def position(self): """Latest evaluation position.""" return self.model.position @lazyproperty def cutout_width(self): """Cutout width for the model component.""" return self.psf_width + 2 * (self.model.evaluation_radius + CUTOUT_MARGIN) def update(self, exposure, psf, edisp, geom, mask): """Update MapEvaluator, based on the current position of the model component. Parameters ---------- exposure : `~gammapy.maps.Map` Exposure map. psf : `gammapy.irf.PSFMap` PSF map. edisp : `gammapy.irf.EDispMap` Edisp map. geom : `WcsGeom` Counts geom. mask : `~gammapy.maps.Map` Mask to apply to the likelihood for fitting. """ # TODO: simplify and clean up log.debug("Updating model evaluator") del self.position del self.cutout_width self._geom_reco_axis = geom.axes["energy"] # lookup edisp del self._edisp_diagonal if edisp: energy_axis = geom.axes["energy"] self.edisp = edisp.get_edisp_kernel( position=self.position, energy_axis=energy_axis ) del self._edisp_diagonal # lookup psf if ( psf and self.model.spatial_model and not (isinstance(self.psf, PSFKernel) and psf.has_single_spatial_bin) ): energy_name = psf.energy_name geom_psf = geom if energy_name == "energy" else exposure.geom if self.use_psf_containment(geom=geom_psf): energy_values = geom_psf.axes[energy_name].center.reshape((-1, 1, 1)) kwargs = {energy_name: energy_values, "rad": geom.region.radius} self.psf_containment = psf.containment(**kwargs) else: self.psf = psf.get_psf_kernel( position=self.position, geom=geom_psf, containment=PSF_CONTAINMENT, max_radius=PSF_MAX_RADIUS, ) self.exposure = exposure if self.evaluation_mode == "local": self.contributes = self.model.contributes(mask=mask, margin=self.psf_width) if self.contributes and not self.model.contributes(mask=mask): log.warning( f"Model {self.model.name} is outside the target geom but contributes inside through the psf." "This contribution cannot be estimated precisely." 
"Consider extending the dataset geom and/or the masked margin in the mask_fit." ) if self.contributes and not self.geom.is_region: self.exposure = exposure._cutout_view( position=self.position, width=self.cutout_width, odd_npix=True ) self.reset_cache_properties() self._computation_cache = None self._cached_parameter_previous = None @lazyproperty def _edisp_diagonal(self): return EDispKernel.from_diagonal_response( energy_axis_true=self.geom.axes["energy_true"], energy_axis=self._geom_reco.axes["energy"], ) def compute_dnde(self): """Compute model differential flux at map pixel centers. Returns ------- model_map : `~gammapy.maps.Map` Sky cube with data filled with evaluated model values. Units: ``cm-2 s-1 TeV-1 deg-2``. """ return self.model.evaluate_geom(self.geom, self.gti) def compute_flux(self, *arg): """Compute flux.""" return self.model.integrate_geom(self.geom, self.gti) def compute_flux_psf_convolved(self, *arg): """Compute PSF convolved and temporal model corrected flux.""" value = self.compute_flux_spectral() if self.model.spatial_model: if self.psf_containment is not None: value = value * self.psf_containment else: value = value * self.compute_flux_spatial() if self.model.temporal_model: value *= self.compute_temporal_norm() return Map.from_geom(geom=self.geom, data=value.value, unit=value.unit) def compute_flux_spatial(self): """Compute spatial flux using caching.""" if self.parameters_spatial_changed() or not self.use_cache: del self._compute_flux_spatial return self._compute_flux_spatial @lazyproperty def _compute_flux_spatial(self): """Compute spatial flux. Returns ---------- value: `~astropy.units.Quantity` PSF-corrected, integrated flux over a given region. """ if self.geom.is_region: # We don't estimate spatial contributions if no psf are defined if self.geom.region is None or self.psf is None: return 1 wcs_geom = self.geom.to_wcs_geom(width_min=self.cutout_width) values = self._compute_flux_spatial_geom(wcs_geom) if not values.geom.has_energy_axis: axes = [self.geom.axes["energy_true"].squash()] values = values.to_cube(axes=axes) weights = wcs_geom.region_weights(regions=[self.geom.region]) value = (values.quantity * weights).sum(axis=(1, 2), keepdims=True) else: value = self._compute_flux_spatial_geom(self.geom) return value def _compute_flux_spatial_geom(self, geom): """Compute spatial flux oversampling geom if necessary.""" if not self.model.spatial_model.is_energy_dependent: geom = geom.to_image() value = self.model.spatial_model.integrate_geom(geom) if self.psf and self.model.apply_irf["psf"]: value = self.apply_psf(value) return value def compute_flux_spectral(self): """Compute spectral flux.""" energy = self.geom.axes["energy_true"].edges value = self.model.spectral_model.integral( energy[:-1], energy[1:], ) if self.geom.is_hpx: return value.reshape((-1, 1)) else: return value.reshape((-1, 1, 1)) def compute_temporal_norm(self): """Compute temporal norm.""" integral = self.model.temporal_model.integral( self.gti.time_start, self.gti.time_stop ) return np.sum(integral) def apply_exposure(self, flux): """Compute npred cube. For now just divide flux cube by exposure. """ npred = (flux.quantity * self.exposure.quantity).to_value("") return Map.from_geom(self.geom, data=npred, unit="") def apply_psf(self, npred): """Convolve npred cube with PSF.""" return npred.convolve(self.psf) def apply_edisp(self, npred): """Convolve map data with energy dispersion. Parameters ---------- npred : `~gammapy.maps.Map` Predicted counts in true energy bins. 
Returns ------- npred_reco : `~gammapy.maps.Map` Predicted counts in reconstructed energy bins. """ if self.model.apply_irf["edisp"] and self.edisp: return apply_edisp(npred, self.edisp) else: if "energy_true" in npred.geom.axes.names: return apply_edisp(npred, self._edisp_diagonal) else: return npred @lazyproperty def _compute_npred(self): """Compute npred.""" if isinstance(self.model, TemplateNPredModel): npred = self.model.evaluate() else: if ( self._norm_idx is not None and self.model.parameters.value[self._norm_idx] == 0 ): npred = Map.from_geom(self._geom_reco, data=0) elif not self.parameter_norm_only_changed or not self.use_cache: for method in self.methods_sequence: values = method(self._computation_cache) self._computation_cache = values npred = self._computation_cache else: npred = self._computation_cache * self.renorm() return npred @property def apply_psf_after_edisp(self): return ( self.psf is not None and "energy" in self.psf.psf_kernel_map.geom.axes.names ) def compute_npred(self): """Evaluate model predicted counts. Returns ------- npred : `~gammapy.maps.Map` Predicted counts on the map (in reconstructed energy bins). """ if self.parameters_changed or not self.use_cache: del self._compute_npred return self._compute_npred @property def parameters_changed(self): """Parameters changed.""" values = self.model.parameters.value changed = ~np.all(self._cached_parameter_values == values) if changed: self._cached_parameter_values = values return changed @property def parameter_norm_only_changed(self): """Only norm parameter changed.""" norm_only_changed = False idx = self._norm_idx values = self.model.parameters.value if idx is not None and self._computation_cache is not None: changed = self._cached_parameter_values_previous != values norm_only_changed = np.count_nonzero(changed) == 1 and changed[idx] if not norm_only_changed: self._cached_parameter_values_previous = values return norm_only_changed def parameters_spatial_changed(self, reset=True): """Parameters changed. Parameters ---------- reset : bool Reset cached values. Default is True. Returns ------- changed : bool Whether spatial parameters changed. 
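Examples
--------
A minimal sketch of the change detection; the model used here is
arbitrary and only illustrates the caching behaviour (outputs are
skipped since the exact boolean type depends on the numpy version):

>>> from gammapy.datasets.evaluator import MapEvaluator
>>> from gammapy.modeling.models import (
...     PointSpatialModel,
...     PowerLawSpectralModel,
...     SkyModel,
... )
>>> model = SkyModel(
...     spectral_model=PowerLawSpectralModel(),
...     spatial_model=PointSpatialModel(),
... )
>>> evaluator = MapEvaluator(model=model)
>>> evaluator.parameters_spatial_changed()  # doctest: +SKIP
True
>>> evaluator.parameters_spatial_changed()  # doctest: +SKIP
False
>>> model.spatial_model.lon_0.value = 0.1
>>> evaluator.parameters_spatial_changed()  # doctest: +SKIP
True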
""" values = self.model.spatial_model.parameters.value changed = ~np.all(self._cached_parameter_values_spatial == values) if changed and reset: self._cached_parameter_values_spatial = values return changed @property def irf_position_changed(self): """Position for IRF changed.""" # Here we do not use SkyCoord.separation to improve performance # (it avoids equivalence comparisons for frame and units) lon_cached, lat_cached = self._cached_position lon, lat = self.model.position_lonlat separation = angular_separation(lon, lat, lon_cached, lat_cached) changed = separation > (self.model.evaluation_radius + CUTOUT_MARGIN).to_value( u.rad ) if changed: self._cached_position = lon, lat return changed @lazyproperty def _norm_idx(self): """Norm index.""" names = self.model.parameters.names ind = [idx for idx, name in enumerate(names) if name in ["norm", "amplitude"]] if len(ind) == 1: return ind[0] else: return None def renorm(self): value = self.model.parameters.value[self._norm_idx] value_cached = self._cached_parameter_values_previous[self._norm_idx] return value / value_cached @lazyproperty def methods_sequence(self): """Order to apply the IRFs.""" if self.apply_psf_after_edisp: methods = [ self.compute_flux, self.apply_exposure, self.apply_edisp, self.apply_psf, ] if not self.psf or not self.model.apply_irf["psf"]: methods.remove(self.apply_psf) else: methods = [ self.compute_flux_psf_convolved, self.apply_exposure, self.apply_edisp, ] if not self.model.apply_irf["exposure"]: methods.remove(self.apply_exposure) return methods def peek(self, figsize=(12, 15)): """Quick-look summary plots. Parameters ---------- figsize : tuple Size of the figure. Default is (12, 15). """ if self.needs_update: raise AttributeError( "The evaluator needs to be updated first. Execute " "`MapDataset.npred_signal(model_names=...)` before calling this method." 
) nrows = 1 if self.psf: nrows += 1 if self.edisp: nrows += 1 fig, axes = plt.subplots( ncols=2, nrows=nrows, subplot_kw={"projection": self.exposure.geom.wcs}, figsize=figsize, gridspec_kw={"hspace": 0.2, "wspace": 0.3}, ) axes = axes.flat exposure = self.exposure if isinstance(exposure, WcsNDMap) or isinstance(exposure, HpxNDMap): axes[0].set_title("Predicted counts") self.compute_npred().sum_over_axes().plot(ax=axes[0], add_cbar=True) axes[1].set_title("Exposure") self.exposure.sum_over_axes().plot(ax=axes[1], add_cbar=True) elif isinstance(exposure, RegionNDMap): axes[0].remove() ax0 = fig.add_subplot(nrows, 2, 1) ax0.set_title("Predicted counts") self.compute_npred().plot(ax=ax0) axes[1].remove() ax1 = fig.add_subplot(nrows, 2, 2) ax1.set_title("Exposure") self.exposure.plot(ax=ax1) idx = 3 if self.psf: axes[2].set_title("Energy-integrated PSF kernel") self.psf.plot_kernel(ax=axes[2], add_cbar=True) axes[3].set_title("PSF kernel at 1 TeV") self.psf.plot_kernel(ax=axes[3], add_cbar=True, energy=1 * u.TeV) idx += 2 if self.edisp: axes[idx - 1].remove() ax = fig.add_subplot(nrows, 2, idx) ax.set_title("Energy bias") self.edisp.plot_bias(ax=ax) axes[idx].remove() ax = fig.add_subplot(nrows, 2, idx + 1) ax.set_title("Energy dispersion matrix") self.edisp.plot_matrix(ax=ax) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/flux_points.py0000644000175100001770000006115314721316200020606 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np from scipy.special import erfc from astropy import units as u from astropy.io import fits from astropy.table import Table from astropy.visualization import quantity_support import matplotlib.pyplot as plt from matplotlib.gridspec import GridSpec from gammapy.maps.axes import UNIT_STRING_FORMAT, MapAxis from gammapy.modeling.models import ( DatasetModels, Models, SkyModel, TemplateSpatialModel, ) from gammapy.utils.interpolation import interpolate_profile from gammapy.utils.scripts import make_name, make_path from .core import Dataset log = logging.getLogger(__name__) __all__ = ["FluxPointsDataset"] def _get_reference_model(model, energy_bounds, margin_percent=70): if isinstance(model.spatial_model, TemplateSpatialModel): geom = model.spatial_model.map.geom emin = energy_bounds[0] * (1 - margin_percent / 100) emax = energy_bounds[-1] * (1 + margin_percent / 100) energy_axis = MapAxis.from_energy_bounds( emin, emax, nbin=20, per_decade=True, name="energy_true" ) geom = geom.to_image().to_cube([energy_axis]) return Models([model]).to_template_spectral_model(geom) else: return model.spectral_model class FluxPointsDataset(Dataset): """Bundle a set of flux points with a parametric model, to compute fit statistic function using different statistics (see ``stat_type``). For more information see :ref:`datasets`. Parameters ---------- models : `~gammapy.modeling.models.Models` Models (only spectral part needs to be set). data : `~gammapy.estimators.FluxPoints` Flux points. Must be sorted along the energy axis. mask_fit : `numpy.ndarray` Mask to apply for fitting. mask_safe : `numpy.ndarray` Mask defining the safe data range. By default, upper limit values are excluded. meta_table : `~astropy.table.Table` Table listing information on observations used to create the dataset. One line per observation for stacked datasets. stat_type : str Method used to compute the statistics: * chi2 : estimate from chi2 statistics. 
* profile : estimate from interpolation of the likelihood profile. * distrib : Assuming gaussian errors the likelihood is given by the probability density function of the normal distribution. For the upper limit case it is necessary to marginalize over the unknown measurement, so we integrate the normal distribution up to the upper limit value which gives the complementary error function. See eq. C7 of `Mohanty et al (2013) `__ Default is `chi2`, in that case upper limits are ignored and the mean of asymetrics error is used. However, it is recommended to use `profile` if `stat_scan` is available on flux points. The `distrib` case provides an approximation if the profile is not available. stat_kwargs : dict Extra arguments specifying the interpolation scheme of the likelihood profile. Used only if `stat_type=="profile"`. In that case the default is : `stat_kwargs={"interp_scale":"sqrt", "extrapolate":True}` Examples -------- Load flux points from file and fit with a power-law model:: >>> from gammapy.modeling import Fit >>> from gammapy.modeling.models import PowerLawSpectralModel, SkyModel >>> from gammapy.estimators import FluxPoints >>> from gammapy.datasets import FluxPointsDataset >>> filename = "$GAMMAPY_DATA/tests/spectrum/flux_points/diff_flux_points.fits" >>> dataset = FluxPointsDataset.read(filename) >>> model = SkyModel(spectral_model=PowerLawSpectralModel()) >>> dataset.models = model Make the fit: >>> fit = Fit() >>> result = fit.run([dataset]) >>> print(result) OptimizeResult backend : minuit method : migrad success : True message : Optimization terminated successfully. nfev : 135 total stat : 25.21 CovarianceResult backend : minuit method : hesse success : True message : Hesse terminated successfully. >>> print(result.parameters.to_table()) type name value unit ... frozen link prior ---- --------- ---------- -------------- ... ------ ---- ----- index 2.2159e+00 ... False amplitude 2.1619e-13 TeV-1 s-1 cm-2 ... False reference 1.0000e+00 TeV ... True Note: In order to reproduce the example, you need the tests datasets folder. 
You may download it with the command: ``gammapy download datasets --tests --out $GAMMAPY_DATA`` """ tag = "FluxPointsDataset" def __init__( self, models=None, data=None, mask_fit=None, mask_safe=None, name=None, meta_table=None, stat_type="chi2", stat_kwargs=None, ): if not data.geom.has_energy_axis: raise ValueError("FluxPointsDataset needs an energy axis") self.data = data self.mask_fit = mask_fit self._name = make_name(name) self.models = models self.meta_table = meta_table self._available_stat_type = dict( chi2=self._stat_array_chi2, profile=self._stat_array_profile, distrib=self._stat_array_distrib, ) if stat_kwargs is None: stat_kwargs = dict() self.stat_kwargs = stat_kwargs self.stat_type = stat_type if mask_safe is None: mask_safe = np.ones(self.data.dnde.data.shape, dtype=bool) self.mask_safe = mask_safe @property def available_stat_type(self): return list(self._available_stat_type.keys()) @property def stat_type(self): return self._stat_type @stat_type.setter def stat_type(self, stat_type): if stat_type not in self.available_stat_type: raise ValueError( f"Invalid stat_type: possible options are {self.available_stat_type}" ) if stat_type == "chi2": self._mask_valid = (~self.data.is_ul).data & np.isfinite(self.data.dnde) elif stat_type == "distrib": self._mask_valid = ( self.data.is_ul.data & np.isfinite(self.data.dnde_ul) ) | np.isfinite(self.data.dnde) elif stat_type == "profile": self.stat_kwargs.setdefault("interp_scale", "sqrt") self.stat_kwargs.setdefault("extrapolate", True) self._profile_interpolators = self._get_valid_profile_interpolators() self._stat_type = stat_type @property def mask_valid(self): return self._mask_valid @property def mask_safe(self): return self._mask_safe & self.mask_valid @mask_safe.setter def mask_safe(self, mask_safe): self._mask_safe = mask_safe @property def name(self): return self._name @property def gti(self): """Good time interval info (`GTI`).""" return self.data.gti @property def models(self): return self._models @models.setter def models(self, models): if models is None: self._models = None else: models = DatasetModels(models) self._models = models.select(datasets_names=self.name) def write(self, filename, overwrite=False, checksum=False, **kwargs): """Write flux point dataset to file. Parameters ---------- filename : str Filename to write to. overwrite : bool, optional Overwrite existing file. Default is False. checksum : bool When True adds both DATASUM and CHECKSUM cards to the headers written to the FITS file. Applies only if filename has .fits suffix. Default is False. **kwargs : dict, optional Keyword arguments passed to `~astropy.table.Table.write`. """ table = self.data.to_table() if self.mask_fit is None: mask_fit = self.mask_safe else: mask_fit = self.mask_fit table["mask_fit"] = mask_fit table["mask_safe"] = self.mask_safe filename = make_path(filename) if "fits" in filename.suffixes: primary_hdu = fits.PrimaryHDU() hdu_table = fits.BinTableHDU(table, name="TABLE") fits.HDUList([primary_hdu, hdu_table]).writeto( filename, overwrite=overwrite, checksum=checksum ) else: table.write(make_path(filename), overwrite=overwrite, **kwargs) @classmethod def read(cls, filename, name=None): """Read pre-computed flux points and create a dataset. Parameters ---------- filename : str Filename to read from. name : str, optional Name of the new dataset. Default is None. Returns ------- dataset : `FluxPointsDataset` FluxPointsDataset. 
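Examples
--------
A short sketch, assuming the Gammapy test datasets are available under
``$GAMMAPY_DATA``:

>>> from gammapy.datasets import FluxPointsDataset
>>> filename = "$GAMMAPY_DATA/tests/spectrum/flux_points/diff_flux_points.fits"
>>> dataset = FluxPointsDataset.read(filename, name="crab-fp")  # doctest: +SKIP
>>> print(dataset.name)  # doctest: +SKIP
crab-fp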
""" from gammapy.estimators import FluxPoints filename = make_path(filename) table = Table.read(filename) mask_fit = None mask_safe = None if "mask_safe" in table.colnames: mask_safe = table["mask_safe"].data.astype("bool") if "mask_fit" in table.colnames: mask_fit = table["mask_fit"].data.astype("bool") return cls( name=make_name(name), data=FluxPoints.from_table(table), mask_fit=mask_fit, mask_safe=mask_safe, ) @classmethod def from_dict(cls, data, **kwargs): """Create flux point dataset from dict. Parameters ---------- data : dict Dictionary containing data to create dataset from. Returns ------- dataset : `FluxPointsDataset` Flux point datasets. """ from gammapy.estimators import FluxPoints filename = make_path(data["filename"]) table = Table.read(filename) mask_fit = table["mask_fit"].data.astype("bool") mask_safe = table["mask_safe"].data.astype("bool") table.remove_columns(["mask_fit", "mask_safe"]) return cls( name=data["name"], data=FluxPoints.from_table(table), mask_fit=mask_fit, mask_safe=mask_safe, ) def __str__(self): str_ = f"{self.__class__.__name__}\n" str_ += "-" * len(self.__class__.__name__) + "\n" str_ += "\n" str_ += "\t{:32}: {} \n\n".format("Name", self.name) # data section n_bins = 0 if self.data is not None: n_bins = np.prod(self.data.geom.data_shape) str_ += "\t{:32}: {} \n".format("Number of total flux points", n_bins) n_fit_bins = 0 if self.mask is not None: n_fit_bins = np.sum(self.mask.data) str_ += "\t{:32}: {} \n\n".format("Number of fit bins", n_fit_bins) # likelihood section str_ += "\t{:32}: {}\n".format("Fit statistic type", self.stat_type) stat = np.nan if self.data is not None and self.models is not None: stat = self.stat_sum() str_ += "\t{:32}: {:.2f}\n\n".format("Fit statistic value (-2 log(L))", stat) # model section n_models = 0 if self.models is not None: n_models = len(self.models) str_ += "\t{:32}: {} \n".format("Number of models", n_models) if self.models is not None: str_ += "\t{:32}: {}\n".format( "Number of parameters", len(self.models.parameters) ) str_ += "\t{:32}: {}\n\n".format( "Number of free parameters", len(self.models.parameters.free_parameters) ) str_ += "\t" + "\n\t".join(str(self.models).split("\n")[2:]) return str_.expandtabs(tabsize=2) def data_shape(self): """Shape of the flux points data (tuple).""" return self.data.energy_ref.shape def flux_pred(self): """Compute predicted flux.""" flux = 0.0 for model in self.models: reference_model = _get_reference_model(model, self._energy_bounds) sky_model = SkyModel( spectral_model=reference_model, temporal_model=model.temporal_model ) flux_model = sky_model.evaluate_geom( self.data.geom.as_energy_true, self.gti ) flux += flux_model return flux def stat_array(self): """Fit statistic array.""" return self._available_stat_type[self.stat_type]() def _stat_array_chi2(self): """Chi2 statistics.""" model = self.flux_pred() data = self.data.dnde.quantity try: sigma = self.data.dnde_err.quantity except AttributeError: sigma = (self.data.dnde_errn + self.data.dnde_errp).quantity / 2 return ((data - model) / sigma).to_value("") ** 2 def _stat_array_profile(self): """Estimate statitistic from interpolation of the likelihood profile.""" model = np.zeros(self.data.dnde.data.shape) + ( self.flux_pred() / self.data.dnde_ref ).to_value("") stat = np.zeros(model.shape) for idx in np.ndindex(self._profile_interpolators.shape): stat[idx] = self._profile_interpolators[idx](model[idx]) return stat def _get_valid_profile_interpolators(self): value_scan = self.data.stat_scan.geom.axes["norm"].center 
shape_axes = self.data.stat_scan.geom._shape[slice(3, None)][::-1] interpolators = np.empty(shape_axes, dtype=object) self._mask_valid = np.ones(self.data.dnde.data.shape, dtype=bool) for idx in np.ndindex(shape_axes): stat_scan = np.abs( self.data.stat_scan.data[idx].squeeze() - self.data.stat.data[idx].squeeze() ) self._mask_valid[idx] = np.all(np.isfinite(stat_scan)) interpolators[idx] = interpolate_profile( value_scan, stat_scan, interp_scale=self.stat_kwargs["interp_scale"], extrapolate=self.stat_kwargs["extrapolate"], ) return interpolators def _stat_array_distrib(self): """Estimate statistic from probability distributions, assumes that flux points correspond to asymmetric gaussians and upper limits complementary error functions. """ model = np.zeros(self.data.dnde.data.shape) + self.flux_pred().to_value( self.data.dnde.unit ) stat = np.zeros(model.shape) mask_valid = ~np.isnan(self.data.dnde.data) loc = self.data.dnde.data[mask_valid] value = model[mask_valid] try: mask_p = (model >= self.data.dnde.data)[mask_valid] scale = np.zeros(mask_p.shape) scale[mask_p] = self.data.dnde_errp.data[mask_valid][mask_p] scale[~mask_p] = self.data.dnde_errn.data[mask_valid][~mask_p] mask_invalid = np.isnan(scale) scale[mask_invalid] = self.data.dnde_err.data[mask_valid][mask_invalid] except AttributeError: scale = self.data.dnde_err.data[mask_valid] stat[mask_valid] = ((value - loc) / scale) ** 2 mask_ul = self.data.is_ul.data value = model[mask_ul] loc_ul = self.data.dnde_ul.data[mask_ul] scale_ul = self.data.dnde_ul.data[mask_ul] stat[mask_ul] = 2 * np.log( (erfc((loc_ul - value) / scale_ul) / 2) / (erfc((loc_ul - 0) / scale_ul) / 2) ) stat[np.isnan(stat.data)] = 0 return stat def residuals(self, method="diff"): """Compute flux point residuals. Parameters ---------- method: {"diff", "diff/model"} Method used to compute the residuals. Available options are: - ``"diff"`` (default): data - model. - ``"diff/model"``: (data - model) / model. Returns ------- residuals : `~numpy.ndarray` Residuals array. """ fp = self.data model = self.flux_pred() residuals = self._compute_residuals(fp.dnde.quantity, model, method) # Remove residuals for upper_limits residuals[fp.is_ul.data] = np.nan return residuals def plot_fit( self, ax_spectrum=None, ax_residuals=None, kwargs_spectrum=None, kwargs_residuals=None, ): """Plot flux points, best fit model and residuals in two panels. Calls `~FluxPointsDataset.plot_spectrum` and `~FluxPointsDataset.plot_residuals`. Parameters ---------- ax_spectrum : `~matplotlib.axes.Axes`, optional Axes to plot flux points and best fit model on. Default is None. ax_residuals : `~matplotlib.axes.Axes`, optional Axes to plot residuals on. Default is None. kwargs_spectrum : dict, optional Keyword arguments passed to `~FluxPointsDataset.plot_spectrum`. Default is None. kwargs_residuals : dict, optional Keyword arguments passed to `~FluxPointsDataset.plot_residuals`. Default is None. Returns ------- ax_spectrum, ax_residuals : `~matplotlib.axes.Axes` Flux points, best fit model and residuals plots. 
Examples -------- >>> from gammapy.modeling.models import PowerLawSpectralModel, SkyModel >>> from gammapy.estimators import FluxPoints >>> from gammapy.datasets import FluxPointsDataset >>> #load precomputed flux points >>> filename = "$GAMMAPY_DATA/tests/spectrum/flux_points/diff_flux_points.fits" >>> flux_points = FluxPoints.read(filename) >>> model = SkyModel(spectral_model=PowerLawSpectralModel()) >>> dataset = FluxPointsDataset(model, flux_points) >>> #configuring optional parameters >>> kwargs_spectrum = {"kwargs_model": {"color":"red", "ls":"--"}, "kwargs_fp":{"color":"green", "marker":"o"}} >>> kwargs_residuals = {"color": "blue", "markersize":4, "marker":'s', } >>> dataset.plot_fit(kwargs_residuals=kwargs_residuals, kwargs_spectrum=kwargs_spectrum) # doctest: +SKIP """ if self.data.geom.ndim > 3: raise ValueError("Plot fit works with only one energy axis") fig = plt.figure(figsize=(9, 7)) gs = GridSpec(7, 1) if ax_spectrum is None: ax_spectrum = fig.add_subplot(gs[:5, :]) if ax_residuals is None: plt.setp(ax_spectrum.get_xticklabels(), visible=False) if ax_residuals is None: ax_residuals = fig.add_subplot(gs[5:, :], sharex=ax_spectrum) kwargs_spectrum = kwargs_spectrum or {} kwargs_residuals = kwargs_residuals or {} kwargs_residuals.setdefault("method", "diff/model") self.plot_spectrum(ax=ax_spectrum, **kwargs_spectrum) self.plot_residuals(ax=ax_residuals, **kwargs_residuals) return ax_spectrum, ax_residuals @property def _energy_bounds(self): try: return u.Quantity([self.data.energy_min.min(), self.data.energy_max.max()]) except KeyError: return u.Quantity([self.data.energy_ref.min(), self.data.energy_ref.max()]) @property def _energy_unit(self): return self.data.energy_ref.unit def plot_residuals(self, ax=None, method="diff", **kwargs): """Plot flux point residuals. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axes to plot on. Default is None. method : {"diff", "diff/model"} Normalization used to compute the residuals, see `FluxPointsDataset.residuals`. Default is "diff". **kwargs : dict Keyword arguments passed to `~matplotlib.axes.Axes.errorbar`. Returns ------- ax : `~matplotlib.axes.Axes` Axes object. """ if self.data.geom.ndim > 3: raise ValueError("Plot residuals works with only one energy axis") ax = ax or plt.gca() fp = self.data residuals = self.residuals(method) xerr = self.data.energy_axis.as_plot_xerr yerr = fp._plot_get_flux_err(sed_type="dnde") if method == "diff/model": model = self.flux_pred() yerr = ( (yerr[0].quantity / model).squeeze(), (yerr[1].quantity / model).squeeze(), ) elif method == "diff": yerr = yerr[0].quantity.squeeze(), yerr[1].quantity.squeeze() else: raise ValueError('Invalid method, choose between "diff" and "diff/model"') kwargs.setdefault("color", kwargs.pop("c", "black")) kwargs.setdefault("marker", "+") kwargs.setdefault("linestyle", kwargs.pop("ls", "none")) with quantity_support(): ax.errorbar( fp.energy_ref, residuals.squeeze(), xerr=xerr, yerr=yerr, **kwargs ) ax.axhline(0, color=kwargs["color"], lw=0.5) # format axes ax.set_xlabel(f"Energy [{self._energy_unit.to_string(UNIT_STRING_FORMAT)}]") ax.set_xscale("log") label = self._residuals_labels[method] ax.set_ylabel(f"Residuals\n {label}") ymin = np.nanmin(residuals - yerr[0]) ymax = np.nanmax(residuals + yerr[1]) ymax = max(abs(ymin), ymax) ax.set_ylim(-1.05 * ymax, 1.05 * ymax) return ax def plot_spectrum( self, ax=None, kwargs_fp=None, kwargs_model=None, axis_name="energy" ): """Plot flux points and model. 
Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axes to plot on. Default is None. kwargs_fp : dict, optional Keyword arguments passed to `gammapy.estimators.FluxPoints.plot` to configure the plot style. Default is None. kwargs_model : dict, optional Keyword arguments passed to `gammapy.modeling.models.SpectralModel.plot` and `gammapy.modeling.models.SpectralModel.plot_error` to configure the plot style. Default is None. axis_name : str Axis along which to plot the flux points for multiple axes. Default is energy. Returns ------- ax : `~matplotlib.axes.Axes` Axes object. Examples -------- >>> from gammapy.modeling.models import PowerLawSpectralModel, SkyModel >>> from gammapy.estimators import FluxPoints >>> from gammapy.datasets import FluxPointsDataset >>> #load precomputed flux points >>> filename = "$GAMMAPY_DATA/tests/spectrum/flux_points/diff_flux_points.fits" >>> flux_points = FluxPoints.read(filename) >>> model = SkyModel(spectral_model=PowerLawSpectralModel()) >>> dataset = FluxPointsDataset(model, flux_points) >>> #configuring optional parameters >>> kwargs_model = {"color":"red", "ls":"--"} >>> kwargs_fp = {"color":"green", "marker":"o"} >>> dataset.plot_spectrum(kwargs_fp=kwargs_fp, kwargs_model=kwargs_model) # doctest: +SKIP """ kwargs_fp = (kwargs_fp or {}).copy() kwargs_model = (kwargs_model or {}).copy() # plot flux points kwargs_fp.setdefault("sed_type", "e2dnde") if axis_name == "time": kwargs_fp["sed_type"] = "norm" ax = self.data.plot(ax=ax, **kwargs_fp, axis_name=axis_name) kwargs_model.setdefault("label", "Best fit model") kwargs_model.setdefault("zorder", 10) for model in self.models: if model.datasets_names is None or self.name in model.datasets_names: if axis_name == "energy": kwargs_model.setdefault("sed_type", "e2dnde") kwargs_model.setdefault("energy_bounds", self._energy_bounds) model.spectral_model.plot(ax=ax, **kwargs_model) if axis_name == "time": kwargs_model.setdefault( "time_range", self.data.geom.axes["time"].time_bounds ) model.temporal_model.plot(ax=ax, **kwargs_model) if axis_name == "energy": kwargs_model["color"] = ax.lines[-1].get_color() kwargs_model.pop("label") for model in self.models: if model.datasets_names is None or self.name in model.datasets_names: model.spectral_model.plot_error(ax=ax, **kwargs_model) return ax ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/io.py0000644000175100001770000003237414721316200016646 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import numpy as np from astropy import units as u from astropy.io import fits from astropy.table import Table from gammapy.data import GTI from gammapy.irf import EDispKernel, EDispKernelMap from gammapy.maps import RegionNDMap from gammapy.utils.scripts import make_path from .spectrum import SpectrumDatasetOnOff __all__ = [ "DatasetReader", "DatasetWriter", "OGIPDatasetReader", "OGIPDatasetWriter", ] class DatasetReader(abc.ABC): """Dataset reader base class.""" @property @abc.abstractmethod def tag(self): pass @abc.abstractmethod def read(self): pass class DatasetWriter(abc.ABC): """Dataset writer base class.""" @property @abc.abstractmethod def tag(self): pass @abc.abstractmethod def write(self, dataset): pass class OGIPDatasetWriter(DatasetWriter): """Write OGIP files. If you want to use the written files with Sherpa, you have to use the ``ogip-sherpa`` format. Then all files will be written in units of 'keV' and 'cm2'. 
The naming scheme is fixed as following: * PHA file is named filename.fits * BKG file is named filename_bkg.fits * ARF file is named filename_arf.fits * RMF file is named filename_rmf.fits Parameters ---------- filename : `~pathlib.Path` or str Filename. format : {"ogip", "ogip-sherpa"} Which format to use. Default is 'ogip'. overwrite : bool, optional Overwrite existing files. Default is False. checksum : bool When True adds both DATASUM and CHECKSUM cards to the headers written to the files. Default is False. """ tag = ["ogip", "ogip-sherpa"] def __init__(self, filename, format="ogip", overwrite=False, checksum=False): filename = make_path(filename) filename.parent.mkdir(exist_ok=True, parents=True) self.filename = filename self.format = format self.overwrite = overwrite self.checksum = checksum @staticmethod def get_filenames(filename): """Get filenames. Parameters ---------- filename : `~pathlib.Path` Filename. Returns ------- filenames : dict Dictionary of filenames. """ suffix = "".join(filename.suffixes) name = filename.name.replace(suffix, "") name = f"{name}{{}}{suffix}" return { "respfile": name.format("_rmf"), "backfile": name.format("_bkg"), "ancrfile": name.format("_arf"), } def get_ogip_meta(self, dataset, is_bkg=False): """Meta info for the OGIP data format.""" try: livetime = dataset.exposure.meta["livetime"] except KeyError: raise ValueError( "Storing in ogip format require the livetime " "to be defined in the exposure meta data" ) hdu_class = "BKG" if is_bkg else "TOTAL" meta = { "HDUCLAS2": hdu_class, "HDUCLAS3": "COUNT", "HDUCLAS4": "TYPE:1", "EXPOSURE": livetime.to_value("s"), "OBS_ID": dataset.name, } filenames = OGIPDatasetWriter.get_filenames(self.filename) meta["ANCRFILE"] = filenames["ancrfile"] if dataset.counts_off: meta["BACKFILE"] = filenames["backfile"] if dataset.edisp: meta["RESPFILE"] = filenames["respfile"] return meta def write(self, dataset): """Write dataset to file. Parameters ---------- dataset : `SpectrumDatasetOnOff` Dataset to write. """ filenames = self.get_filenames(self.filename) self.write_pha(dataset, filename=self.filename) path = self.filename.parent self.write_arf(dataset, filename=path / filenames["ancrfile"]) if dataset.counts_off: self.write_bkg(dataset, filename=path / filenames["backfile"]) if dataset.edisp: self.write_rmf(dataset, filename=path / filenames["respfile"]) def write_rmf(self, dataset, filename): """Write energy dispersion. Parameters ---------- dataset : `SpectrumDatasetOnOff` Dataset to write. filename : str or `~pathlib.Path` Filename to use. """ kernel = dataset.edisp.get_edisp_kernel() kernel.write( filename=filename, format=self.format, checksum=self.checksum, overwrite=self.overwrite, ) def write_arf(self, dataset, filename): """Write effective area. Parameters ---------- dataset : `SpectrumDatasetOnOff` Dataset to write. filename : str or `~pathlib.Path` Filename to use. """ aeff = dataset.exposure / dataset.exposure.meta["livetime"] aeff.write( filename=filename, overwrite=self.overwrite, format=self.format.replace("ogip", "ogip-arf"), checksum=self.checksum, ) def to_counts_hdulist(self, dataset, is_bkg=False): """Convert counts region map to hdulist. Parameters ---------- dataset : `SpectrumDatasetOnOff` Dataset to write. is_bkg : bool Whether to use counts off. Default is False. 
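Returns
-------
hdulist : `~astropy.io.fits.HDUList`
    HDU list with the counts table in the "SPECTRUM" extension, including
    the "QUALITY" and "BACKSCAL" columns.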
""" counts = dataset.counts_off if is_bkg else dataset.counts acceptance = dataset.acceptance_off if is_bkg else dataset.acceptance hdulist = counts.to_hdulist(format=self.format) table = Table.read(hdulist["SPECTRUM"]) meta = self.get_ogip_meta(dataset, is_bkg=is_bkg) if dataset.mask_safe is not None: mask_array = dataset.mask_safe.data[:, 0, 0] else: mask_array = np.ones(acceptance.data.size) table["QUALITY"] = np.logical_not(mask_array) del table.meta["QUALITY"] table["BACKSCAL"] = acceptance.data[:, 0, 0] del table.meta["BACKSCAL"] # adapt meta data table.meta.update(meta) hdulist["SPECTRUM"] = fits.BinTableHDU(table) return hdulist def write_pha(self, dataset, filename): """Write counts file. Parameters ---------- dataset : `SpectrumDatasetOnOff` Dataset to write. filename : str or `~pathlib.Path` Filename to use. """ hdulist = self.to_counts_hdulist(dataset) if dataset.gti: hdu = dataset.gti.to_table_hdu() hdulist.append(hdu) hdulist.writeto(filename, overwrite=self.overwrite, checksum=self.checksum) def write_bkg(self, dataset, filename): """Write off counts file. Parameters ---------- dataset : `SpectrumDatasetOnOff` Dataset to write. filename : str or `~pathlib.Path` Filename to use. """ hdulist = self.to_counts_hdulist(dataset, is_bkg=True) hdulist.writeto(filename, overwrite=self.overwrite, checksum=self.checksum) class OGIPDatasetReader(DatasetReader): """Read `~gammapy.datasets.SpectrumDatasetOnOff` from OGIP files. BKG file, ARF, and RMF must be set in the PHA header and be present in the same folder. The naming scheme is fixed to the following scheme: * PHA file is named ``pha_obs{name}.fits`` * BKG file is named ``bkg_obs{name}.fits`` * ARF file is named ``arf_obs{name}.fits`` * RMF file is named ``rmf_obs{name}.fits`` with ``{name}`` the dataset name. Parameters ---------- filename : str or `~pathlib.Path` OGIP PHA file to read. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. """ tag = "ogip" def __init__(self, filename, checksum=False): self.filename = make_path(filename) self.checksum = checksum def get_valid_path(self, filename): """Get absolute or relative path. The relative path is with respect to the name of the reference file. Parameters ---------- filename : str or `~pathlib.Path` Filename. Returns ------- filename : `~pathlib.Path` Valid path. """ filename = make_path(filename) if not filename.exists(): return self.filename.parent / filename else: return filename def get_filenames(self, pha_meta): """Get filenames. Parameters ---------- pha_meta : dict Metadata from the PHA file. Returns ------- filenames : dict Dictionary with filenames of "arffile", "rmffile" (optional) and "bkgfile" (optional). """ filenames = {"arffile": self.get_valid_path(pha_meta["ANCRFILE"])} if "BACKFILE" in pha_meta: filenames["bkgfile"] = self.get_valid_path(pha_meta["BACKFILE"]) if "RESPFILE" in pha_meta: filenames["rmffile"] = self.get_valid_path(pha_meta["RESPFILE"]) return filenames @staticmethod def read_pha(filename, checksum=False): """Read PHA file. Parameters ---------- filename : str or `~pathlib.Path` PHA file name. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- data : dict Dictionary with counts, acceptance and mask_safe. 
""" data = {} with fits.open(filename, memmap=False, checksum=checksum) as hdulist: data["counts"] = RegionNDMap.from_hdulist(hdulist, format="ogip") data["acceptance"] = RegionNDMap.from_hdulist( hdulist, format="ogip", ogip_column="BACKSCAL" ) if "GTI" in hdulist: data["gti"] = GTI.from_table_hdu(hdulist["GTI"]) data["mask_safe"] = RegionNDMap.from_hdulist( hdulist, format="ogip", ogip_column="QUALITY" ) return data @staticmethod def read_bkg(filename, checksum=False): """Read PHA background file. Parameters ---------- filename : str or `~pathlib.Path` PHA file name. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- data : dict Dictionary with counts_off and acceptance_off. """ with fits.open(filename, memmap=False, checksum=checksum) as hdulist: counts_off = RegionNDMap.from_hdulist(hdulist, format="ogip") acceptance_off = RegionNDMap.from_hdulist( hdulist, ogip_column="BACKSCAL", format="ogip" ) return {"counts_off": counts_off, "acceptance_off": acceptance_off} @staticmethod def read_rmf(filename, exposure, checksum=False): """Read RMF file. Parameters ---------- filename : str or `~pathlib.Path` PHA file name. exposure : `RegionNDMap` Exposure map. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- data : `EDispKernelMap` Dictionary with edisp. """ kernel = EDispKernel.read(filename, checksum=checksum) edisp = EDispKernelMap.from_edisp_kernel(kernel, geom=exposure.geom) # TODO: resolve this separate handling of exposure for edisp edisp.exposure_map.data = exposure.data[:, :, np.newaxis, :] return edisp @staticmethod def read_arf(filename, livetime, checksum=False): """Read ARF file. Parameters ---------- filename : str or `~pathlib.Path` PHA file name. livetime : `Quantity` Livetime. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- data : `RegionNDMap` Exposure map. """ aeff = RegionNDMap.read(filename, format="ogip-arf", checksum=checksum) exposure = aeff * livetime exposure.meta["livetime"] = livetime return exposure def read(self): """Read dataset. Returns ------- dataset : SpectrumDatasetOnOff Spectrum dataset. 
""" kwargs = self.read_pha(self.filename, checksum=self.checksum) pha_meta = kwargs["counts"].meta name = str(pha_meta["OBS_ID"]) livetime = pha_meta["EXPOSURE"] * u.s filenames = self.get_filenames(pha_meta=pha_meta) exposure = self.read_arf( filenames["arffile"], livetime=livetime, checksum=self.checksum ) if "bkgfile" in filenames: bkg = self.read_bkg(filenames["bkgfile"], checksum=self.checksum) kwargs.update(bkg) if "rmffile" in filenames: kwargs["edisp"] = self.read_rmf( filenames["rmffile"], exposure=exposure, checksum=self.checksum ) return SpectrumDatasetOnOff(name=name, exposure=exposure, **kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/map.py0000644000175100001770000033234414721316200017014 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np from scipy.stats import median_abs_deviation as mad import astropy.units as u from astropy.io import fits from astropy.table import Table from regions import CircleSkyRegion import matplotlib.pyplot as plt from gammapy.data import GTI, PointingMode from gammapy.irf import EDispKernelMap, EDispMap, PSFKernel, PSFMap, RecoPSFMap from gammapy.maps import LabelMapAxis, Map, MapAxes, MapAxis, WcsGeom from gammapy.modeling.models import DatasetModels, FoVBackgroundModel, Models from gammapy.stats import ( CashCountsStatistic, WStatCountsStatistic, cash, cash_sum_cython, get_wstat_mu_bkg, wstat, ) from gammapy.utils.fits import HDULocation, LazyFitsData from gammapy.utils.random import get_random_state from gammapy.utils.scripts import make_name, make_path from gammapy.utils.table import hstack_columns from .core import Dataset from .evaluator import MapEvaluator from .metadata import MapDatasetMetaData from .utils import get_axes __all__ = [ "MapDataset", "MapDatasetOnOff", "create_empty_map_dataset_from_irfs", "create_map_dataset_geoms", "create_map_dataset_from_observation", ] log = logging.getLogger(__name__) RAD_MAX = 0.66 RAD_AXIS_DEFAULT = MapAxis.from_bounds( 0, RAD_MAX, nbin=66, node_type="edges", name="rad", unit="deg" ) MIGRA_AXIS_DEFAULT = MapAxis.from_bounds( 0.2, 5, nbin=48, node_type="edges", name="migra" ) BINSZ_IRF_DEFAULT = 0.2 * u.deg EVALUATION_MODE = "local" USE_NPRED_CACHE = True def create_map_dataset_geoms( geom, energy_axis_true=None, migra_axis=None, rad_axis=None, binsz_irf=BINSZ_IRF_DEFAULT, reco_psf=False, ): """Create map geometries for a `MapDataset`. Parameters ---------- geom : `~gammapy.maps.WcsGeom` Reference target geometry with a reconstructed energy axis. It is used for counts and background maps. Additional external data axes can be added to support e.g. event types. energy_axis_true : `~gammapy.maps.MapAxis` True energy axis used for IRF maps. migra_axis : `~gammapy.maps.MapAxis` If set, this provides the migration axis for the energy dispersion map. If not set, an EDispKernelMap is produced instead. Default is None. rad_axis : `~gammapy.maps.MapAxis` Rad axis for the PSF map. binsz_irf : float IRF Map pixel size in degrees. reco_psf : bool Use reconstructed energy for the PSF geometry. Default is False. Returns ------- geoms : dict Dictionary with map geometries. 
""" rad_axis = rad_axis or RAD_AXIS_DEFAULT if energy_axis_true is not None: energy_axis_true.assert_name("energy_true") else: energy_axis_true = geom.axes["energy"].copy(name="energy_true") external_axes = geom.axes.drop("energy") geom_image = geom.to_image() geom_exposure = geom_image.to_cube(MapAxes([energy_axis_true]) + external_axes) geom_irf = geom_image.to_binsz(binsz=binsz_irf) if reco_psf: geom_psf = geom_irf.to_cube( MapAxes([rad_axis, geom.axes["energy"]]) + external_axes ) else: geom_psf = geom_irf.to_cube( MapAxes([rad_axis, energy_axis_true]) + external_axes ) if migra_axis: geom_edisp = geom_irf.to_cube( MapAxes([migra_axis, energy_axis_true]) + external_axes ) else: geom_edisp = geom_irf.to_cube( MapAxes([geom.axes["energy"], energy_axis_true]) + external_axes ) return { "geom": geom, "geom_exposure": geom_exposure, "geom_psf": geom_psf, "geom_edisp": geom_edisp, } def _default_energy_axis(observation, energy_bin_per_decade_max=30, position=None): # number of bins per decade estimated from the energy resolution # such as diff(ereco.edges)/ereco.center ~ min(eres) if isinstance(observation.psf, PSFMap): etrue = observation.psf.psf_map.geom.axes[observation.psf.energy_name] if isinstance(observation.edisp, EDispKernelMap): ekern = observation.edisp.get_edisp_kernel( energy_axis=None, position=position ) if isinstance(observation.edisp, EDispMap): ekern = observation.edisp.get_edisp_kernel( energy_axis=etrue.rename("energy"), position=position ) eres = ekern.get_resolution(etrue.center) elif hasattr(observation.psf, "axes"): etrue = observation.psf.axes[0] # only where psf is defined if position: offset = observation.pointing.fixed_icrs.separation(position) else: offset = 0 * u.deg ekern = observation.edisp.to_edisp_kernel(offset) eres = ekern.get_resolution(etrue.center) eres = eres[np.isfinite(eres) & (eres > 0.0)] if eres.size > 0: # remove outliers beyond_mad = np.median(eres) - mad(eres) * eres.unit eres[eres < beyond_mad] = np.nan nbin_per_decade = np.nan_to_num( int(np.rint(2.0 / np.nanmin(eres.value))), nan=np.inf ) nbin_per_decade = np.minimum(nbin_per_decade, energy_bin_per_decade_max) else: nbin_per_decade = energy_bin_per_decade_max energy_axis_true = MapAxis.from_energy_bounds( etrue.edges[0], etrue.edges[-1], nbin=nbin_per_decade, per_decade=True, name="energy_true", ) if hasattr(observation, "bkg") and observation.bkg: ereco = observation.bkg.axes["energy"] energy_axis = MapAxis.from_energy_bounds( ereco.edges[0], ereco.edges[-1], nbin=nbin_per_decade, per_decade=True, name="energy", ) else: energy_axis = energy_axis_true.rename("energy") return energy_axis, energy_axis_true def _default_binsz(observation, spatial_bin_size_min=0.01 * u.deg): # bin size estimated from the minimal r68 of the psf if isinstance(observation.psf, PSFMap): energy_axis = observation.psf.psf_map.geom.axes[observation.psf.energy_name] psf_r68 = observation.psf.containment_radius(0.68, energy_axis.edges) elif hasattr(observation.psf, "axes"): etrue = observation.psf.axes[0] # only where psf is defined psf_r68 = observation.psf.containment_radius( 0.68, energy_true=etrue.edges, offset=0.0 * u.deg ) psf_r68 = psf_r68[np.isfinite(psf_r68)] if psf_r68.size > 0: # remove outliers beyond_mad = np.median(psf_r68) - mad(psf_r68) * psf_r68.unit psf_r68[psf_r68 < beyond_mad] = np.nan binsz = np.nan_to_num(np.nanmin(psf_r68), nan=-np.inf) binsz = np.maximum(binsz, spatial_bin_size_min) else: binsz = spatial_bin_size_min return binsz def _default_width(observation, spatial_width_max=12 * u.deg): # 
width estimated from the rad_max or the offset_max if isinstance(observation.psf, PSFMap): width = 2.0 * np.max(observation.psf.psf_map.geom.width) elif hasattr(observation.psf, "axes"): width = 2.0 * observation.psf.axes["offset"].edges[-1] else: width = spatial_width_max return np.minimum(width, spatial_width_max) def create_empty_map_dataset_from_irfs( observation, dataset_name=None, energy_axis_true=None, energy_axis=None, energy_bin_per_decade_max=30, spatial_width=None, spatial_width_max=12 * u.deg, spatial_bin_size=None, spatial_bin_size_min=0.01 * u.deg, position=None, frame="icrs", ): """Create a MapDataset, if energy axes, spatial width or bin size are not given they are determined automatically from the IRFs, but the estimated value cannot exceed the given limits. Parameters ---------- observation : `~gammapy.data.Observation` Observation containing the IRFs. dataset_name : str, optional Default is None. If None it is determined from the observation ID. energy_axis_true : `~gammapy.maps.MapAxis`, optional True energy axis. Default is None. If None it is determined from the observation IRFs. energy_axis : `~gammapy.maps.MapAxis`, optional Reconstructed energy axis. Default is None. If None it is determined from the observation IRFs. energy_bin_per_decade_max : int, optional Maximal number of bin per decade in energy for the reference dataset spatial_width : `~astropy.units.Quantity`, optional Spatial window size. Default is None. If None it is determined from the observation offset max or rad max. spatial_width_max : `~astropy.quantity.Quantity`, optional Maximal spatial width. Default is 12 degree. spatial_bin_size : `~astropy.units.Quantity`, optional Pixel size. Default is None. If None it is determined from the observation PSF R68. spatial_bin_size_min : `~astropy.quantity.Quantity`, optional Minimal spatial bin size. Default is 0.01 degree. position : `~astropy.coordinates.SkyCoord`, optional Center of the geometry. Default is the observation pointing at mid-observation time. frame: str, optional frame of the coordinate system. Default is "icrs". 
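Returns
-------
dataset : `~gammapy.datasets.MapDataset`
    Empty map dataset whose geometries are adapted to the observation IRFs.

Examples
--------
A sketch assuming the H.E.S.S. DL3 test data are available under
``$GAMMAPY_DATA``:

>>> from gammapy.data import DataStore
>>> from gammapy.datasets import create_empty_map_dataset_from_irfs
>>> data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
>>> observation = data_store.obs(23523)
>>> dataset = create_empty_map_dataset_from_irfs(observation)  # doctest: +SKIP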
""" if position is None: if hasattr(observation, "pointing"): if observation.pointing.mode is not PointingMode.POINTING: raise NotImplementedError( "Only datas with fixed pointing in ICRS are supported" ) position = observation.pointing.fixed_icrs if spatial_width is None: spatial_width = _default_width(observation, spatial_width_max) if spatial_bin_size is None: spatial_bin_size = _default_binsz(observation, spatial_bin_size_min) if energy_axis is None or energy_axis_true is None: energy_axis_, energy_axis_true_ = _default_energy_axis( observation, energy_bin_per_decade_max, position ) if energy_axis is None: energy_axis = energy_axis_ if energy_axis_true is None: energy_axis_true = energy_axis_true_ if dataset_name is None: dataset_name = f"obs_{observation.obs_id}" geom = WcsGeom.create( skydir=position.transform_to(frame), width=spatial_width, binsz=spatial_bin_size.to_value(u.deg), frame=frame, axes=[energy_axis], ) axes = dict( energy_axis_true=energy_axis_true, ) if observation.edisp is not None: if isinstance(observation.edisp, EDispMap): axes["migra_axis"] = observation.edisp.edisp_map.geom.axes["migra"] elif hasattr(observation.edisp, "axes"): axes["migra_axis"] = observation.edisp.axes["migra"] dataset = MapDataset.create( geom, name=dataset_name, **axes, ) return dataset def create_map_dataset_from_observation( observation, models=None, dataset_name=None, energy_axis_true=None, energy_axis=None, energy_bin_per_decade_max=30, spatial_width=None, spatial_width_max=12 * u.deg, spatial_bin_size=None, spatial_bin_size_min=0.01 * u.deg, position=None, frame="icrs", ): """Create a MapDataset, if energy axes, spatial width or bin size are not given they are determined automatically from the observation IRFs, but the estimated value cannot exceed the given limits. Parameters ---------- observation : `~gammapy.data.Observation` Observation to be simulated. models : `~gammapy.modeling.Models`, optional Models. Default is None. dataset_name : str, optional If `models` contains one or multiple `FoVBackgroundModel` it should match the `dataset_name` of the background model to use. Default is None. If None it is determined from the observation ID. energy_axis_true : `~gammapy.maps.MapAxis`, optional True energy axis. Default is None. If None it is determined from the observation IRFs. energy_axis : `~gammapy.maps.MapAxis`, optional Reconstructed energy axis. Default is None. If None it is determined from the observation IRFs. energy_bin_per_decade_max : int, optional Maximal number of bin per decade in energy for the reference dataset spatial_width : `~astropy.units.Quantity`, optional Spatial window size. Default is None. If None it is determined from the observation offset max or rad max. spatial_width_max : `~astropy.quantity.Quantity`, optional Maximal spatial width. Default is 12 degree. spatial_bin_size : `~astropy.units.Quantity`, optional Pixel size. Default is None. If None it is determined from the observation PSF R68. spatial_bin_size_min : `~astropy.quantity.Quantity`, optional Minimal spatial bin size. Default is 0.01 degree. position : `~astropy.coordinates.SkyCoord`, optional Center of the geometry. Defalut is the observation pointing. frame: str, optional frame of the coordinate system. Default is "icrs". 
""" from gammapy.makers import MapDatasetMaker dataset = create_empty_map_dataset_from_irfs( observation, dataset_name=dataset_name, energy_axis_true=energy_axis_true, energy_axis=energy_axis, energy_bin_per_decade_max=energy_bin_per_decade_max, spatial_width=spatial_width, spatial_width_max=spatial_width_max, spatial_bin_size=spatial_bin_size, spatial_bin_size_min=spatial_bin_size_min, position=position, frame=frame, ) if models is None: models = Models() if not np.any( [ isinstance(m, FoVBackgroundModel) and m.datasets_names[0] == dataset.name for m in models ] ): models.append(FoVBackgroundModel(dataset_name=dataset.name)) components = ["exposure"] if observation.edisp is not None: components.append("edisp") if observation.bkg is not None: components.append("background") if observation.psf is not None: components.append("psf") maker = MapDatasetMaker(selection=components) dataset = maker.run(dataset, observation) dataset.models = models return dataset class MapDataset(Dataset): """Main map dataset for likelihood fitting. It bundles together binned counts, background, IRFs in the form of `~gammapy.maps.Map`. A safe mask and a fit mask can be added to exclude bins during the analysis. If models are assigned to it, it can compute predicted counts in each bin of the counts `Map` and compute the associated statistic function, here the Cash statistic (see `~gammapy.stats.cash`). For more information see :ref:`datasets`. Parameters ---------- models : `~gammapy.modeling.models.Models` Source sky models. counts : `~gammapy.maps.WcsNDMap` or `~gammapy.utils.fits.HDULocation` Counts cube. exposure : `~gammapy.maps.WcsNDMap` or `~gammapy.utils.fits.HDULocation` Exposure cube. background : `~gammapy.maps.WcsNDMap` or `~gammapy.utils.fits.HDULocation` Background cube. mask_fit : `~gammapy.maps.WcsNDMap` or `~gammapy.utils.fits.HDULocation` Mask to apply to the likelihood for fitting. psf : `~gammapy.irf.PSFMap` or `~gammapy.utils.fits.HDULocation` PSF kernel. edisp : `~gammapy.irf.EDispMap` or `~gammapy.utils.fits.HDULocation` Energy dispersion kernel mask_safe : `~gammapy.maps.WcsNDMap` or `~gammapy.utils.fits.HDULocation` Mask defining the safe data range. gti : `~gammapy.data.GTI` GTI of the observation or union of GTI if it is a stacked observation. meta_table : `~astropy.table.Table` Table listing information on observations used to create the dataset. One line per observation for stacked datasets. meta : `~gammapy.datasets.MapDatasetMetaData` Associated meta data container Notes ----- If an `HDULocation` is passed the map is loaded lazily. This means the map data is only loaded in memory as the corresponding data attribute on the MapDataset is accessed. If it was accessed once it is cached for the next time. Examples -------- >>> from gammapy.datasets import MapDataset >>> filename = "$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz" >>> dataset = MapDataset.read(filename, name="cta-dataset") >>> print(dataset) MapDataset ---------- Name : cta-dataset Total counts : 104317 Total background counts : 91507.70 Total excess counts : 12809.30 Predicted counts : 91507.69 Predicted background counts : 91507.70 Predicted excess counts : nan Exposure min : 6.28e+07 m2 s Exposure max : 1.90e+10 m2 s Number of total bins : 768000 Number of fit bins : 691680 Fit statistic type : cash Fit statistic value (-2 log(L)) : nan Number of models : 0 Number of parameters : 0 Number of free parameters : 0 See Also -------- MapDatasetOnOff, SpectrumDataset, FluxPointsDataset. 
""" stat_type = "cash" tag = "MapDataset" counts = LazyFitsData(cache=True) exposure = LazyFitsData(cache=True) edisp = LazyFitsData(cache=True) background = LazyFitsData(cache=True) psf = LazyFitsData(cache=True) mask_fit = LazyFitsData(cache=True) mask_safe = LazyFitsData(cache=True) _lazy_data_members = [ "counts", "exposure", "edisp", "psf", "mask_fit", "mask_safe", "background", ] # TODO: shoule be part of the LazyFitsData no ? gti = None meta_table = None def __init__( self, models=None, counts=None, exposure=None, background=None, psf=None, edisp=None, mask_safe=None, mask_fit=None, gti=None, meta_table=None, name=None, meta=None, ): self._name = make_name(name) self._evaluators = {} self.counts = counts self.exposure = exposure self.background = background self._background_cached = None self._background_parameters_cached = None self.mask_fit = mask_fit if psf and not isinstance(psf, (PSFMap, HDULocation)): raise ValueError( f"'psf' must be a 'PSFMap' or `HDULocation` object, got {type(psf)}" ) self.psf = psf if edisp and not isinstance(edisp, (EDispMap, EDispKernelMap, HDULocation)): raise ValueError( "'edisp' must be a 'EDispMap', `EDispKernelMap` or 'HDULocation' " f"object, got `{type(edisp)}` instead." ) self.edisp = edisp self.mask_safe = mask_safe self.gti = gti self.models = models self.meta_table = meta_table if meta is None: self._meta = MapDatasetMetaData() else: self._meta = meta @property def _psf_kernel(self): """Precompute PSFkernel if there is only one spatial bin in the PSFmap""" if self.psf and self.psf.has_single_spatial_bin: if self.psf.energy_name == "energy_true": map_ref = self.exposure else: map_ref = self.counts if map_ref and not map_ref.geom.is_region: return self.psf.get_psf_kernel(map_ref.geom) @property def meta(self): return self._meta @meta.setter def meta(self, value): self._meta = value # TODO: keep or remove? 
@property def background_model(self): try: return self.models[f"{self.name}-bkg"] except (ValueError, TypeError): pass def __str__(self): str_ = f"{self.__class__.__name__}\n" str_ += "-" * len(self.__class__.__name__) + "\n" str_ += "\n" str_ += "\t{:32}: {{name}} \n\n".format("Name") str_ += "\t{:32}: {{counts:.0f}} \n".format("Total counts") str_ += "\t{:32}: {{background:.2f}}\n".format("Total background counts") str_ += "\t{:32}: {{excess:.2f}}\n\n".format("Total excess counts") str_ += "\t{:32}: {{npred:.2f}}\n".format("Predicted counts") str_ += "\t{:32}: {{npred_background:.2f}}\n".format( "Predicted background counts" ) str_ += "\t{:32}: {{npred_signal:.2f}}\n\n".format("Predicted excess counts") str_ += "\t{:32}: {{exposure_min:.2e}}\n".format("Exposure min") str_ += "\t{:32}: {{exposure_max:.2e}}\n\n".format("Exposure max") str_ += "\t{:32}: {{n_bins}} \n".format("Number of total bins") str_ += "\t{:32}: {{n_fit_bins}} \n\n".format("Number of fit bins") # likelihood section str_ += "\t{:32}: {{stat_type}}\n".format("Fit statistic type") str_ += "\t{:32}: {{stat_sum:.2f}}\n\n".format( "Fit statistic value (-2 log(L))" ) info = self.info_dict() str_ = str_.format(**info) # model section n_models, n_pars, n_free_pars = 0, 0, 0 if self.models is not None: n_models = len(self.models) n_pars = len(self.models.parameters) n_free_pars = len(self.models.parameters.free_parameters) str_ += "\t{:32}: {} \n".format("Number of models", n_models) str_ += "\t{:32}: {}\n".format("Number of parameters", n_pars) str_ += "\t{:32}: {}\n\n".format("Number of free parameters", n_free_pars) if self.models is not None: str_ += "\t" + "\n\t".join(str(self.models).split("\n")[2:]) return str_.expandtabs(tabsize=2) @property def geoms(self): """Map geometries. Returns ------- geoms : dict Dictionary of map geometries involved in the dataset. """ geoms = {} geoms["geom"] = self._geom if self.exposure: geoms["geom_exposure"] = self.exposure.geom if self.psf: geoms["geom_psf"] = self.psf.psf_map.geom if self.edisp: geoms["geom_edisp"] = self.edisp.edisp_map.geom return geoms @property def models(self): """Models set on the dataset (`~gammapy.modeling.models.Models`).""" return self._models @property def excess(self): """Observed excess: counts-background.""" return self.counts - self.background @models.setter def models(self, models): """Models setter.""" self._evaluators = {} if models is not None: models = DatasetModels(models) models = models.select(datasets_names=self.name) if models: psf = self._psf_kernel for model in models: if not isinstance(model, FoVBackgroundModel): evaluator = MapEvaluator( model=model, psf=psf, evaluation_mode=EVALUATION_MODE, gti=self.gti, use_cache=USE_NPRED_CACHE, ) self._evaluators[model.name] = evaluator self._models = models @property def evaluators(self): """Model evaluators.""" return self._evaluators @property def _geom(self): """Main analysis geometry.""" if self.counts is not None: return self.counts.geom elif self.background is not None: return self.background.geom elif self.mask_safe is not None: return self.mask_safe.geom elif self.mask_fit is not None: return self.mask_fit.geom else: raise ValueError( "Either 'counts', 'background', 'mask_fit'" " or 'mask_safe' must be defined." 
            )

    @property
    def data_shape(self):
        """Shape of the counts or background data (tuple)."""
        return self._geom.data_shape

    def _energy_range(self, mask_map=None):
        """Compute the energy range maps with or without the fit mask."""
        geom = self._geom
        energy = geom.axes["energy"].edges
        e_i = geom.axes.index_data("energy")
        geom = geom.drop("energy")

        if mask_map is not None:
            mask = mask_map.data
            if mask.any():
                idx = mask.argmax(e_i)
                energy_min = energy.value[idx]
                mask_nan = ~mask.any(e_i)
                energy_min[mask_nan] = np.nan

                mask = np.flip(mask, e_i)
                idx = mask.argmax(e_i)
                energy_max = energy.value[::-1][idx]
                energy_max[mask_nan] = np.nan
            else:
                energy_min = np.full(geom.data_shape, np.nan)
                energy_max = energy_min.copy()
        else:
            data_shape = geom.data_shape
            energy_min = np.full(data_shape, energy.value[0])
            energy_max = np.full(data_shape, energy.value[-1])

        map_min = Map.from_geom(geom, data=energy_min, unit=energy.unit)
        map_max = Map.from_geom(geom, data=energy_max, unit=energy.unit)
        return map_min, map_max

    @property
    def energy_range(self):
        """Energy range maps defined by the mask_safe and mask_fit."""
        return self._energy_range(self.mask)

    @property
    def energy_range_safe(self):
        """Energy range maps defined by the mask_safe only."""
        return self._energy_range(self.mask_safe)

    @property
    def energy_range_fit(self):
        """Energy range maps defined by the mask_fit only."""
        return self._energy_range(self.mask_fit)

    @property
    def energy_range_total(self):
        """Largest energy range among all pixels, defined by mask_safe and mask_fit."""
        energy_min_map, energy_max_map = self.energy_range
        return np.nanmin(energy_min_map.quantity), np.nanmax(energy_max_map.quantity)

    def npred(self):
        """Total predicted source and background counts.

        Returns
        -------
        npred : `Map`
            Total predicted counts.
        """
        npred_total = self.npred_signal()

        if self.background:
            npred_total += self.npred_background()
        npred_total.data[npred_total.data < 0.0] = 0
        return npred_total

    def npred_background(self):
        """Predicted background counts.

        The predicted background counts depend on the parameters
        of the `FoVBackgroundModel` defined in the dataset.

        Returns
        -------
        npred_background : `Map`
            Predicted counts from the background.
        """
        background = self.background
        if self.background_model and background:
            if self._background_parameters_changed:
                values = self.background_model.evaluate_geom(geom=self.background.geom)
                if self._background_cached is None:
                    self._background_cached = background * values
                else:
                    self._background_cached.quantity = (
                        background.quantity * values.value
                    )
            return self._background_cached
        else:
            return background

    @property
    def _background_parameters_changed(self):
        # Note: this must be a property, since "npred_background" accesses it
        # as an attribute; a bare method would always evaluate as True there.
        values = self.background_model.parameters.value
        changed = ~np.all(self._background_parameters_cached == values)
        if changed:
            self._background_parameters_cached = values
        return changed

    def npred_signal(self, model_names=None, stack=True):
        """Model predicted signal counts.

        If a list of model names is passed, predicted counts from these
        components are returned. If stack is set to True, a map of the sum of
        all the predicted counts is returned. If stack is set to False, a map
        with an additional axis representing the models is returned.

        Parameters
        ----------
        model_names : list of str
            List of names of the SkyModel components for which to compute the
            npred. If None, the predicted counts from all SkyModel components
            are computed.
        stack : bool
            Whether to stack the npred maps upon each other.

        Returns
        -------
        npred_sig : `gammapy.maps.Map`
            Map of the predicted signal counts.
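
        Examples
        --------
        A minimal sketch using the bundled test dataset (requires
        ``$GAMMAPY_DATA``); with no models attached the returned map is
        simply filled with zeros.

        >>> from gammapy.datasets import MapDataset
        >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz")
        >>> npred_signal = dataset.npred_signal()  # doctest: +SKIP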
""" npred_total = Map.from_geom(self._geom, dtype=float) evaluators = self.evaluators if model_names is not None: if isinstance(model_names, str): model_names = [model_names] evaluators = {name: self.evaluators[name] for name in model_names} npred_list = [] labels = [] for evaluator_name, evaluator in evaluators.items(): if evaluator.needs_update: evaluator.update( self.exposure, self.psf, self.edisp, self._geom, self.mask_image, ) if evaluator.contributes: npred = evaluator.compute_npred() if stack: npred_total.stack(npred) else: npred_geom = Map.from_geom(self._geom, dtype=float) npred_geom.stack(npred) labels.append(evaluator_name) npred_list.append(npred_geom) if npred_list != []: label_axis = LabelMapAxis(labels=labels, name="models") npred_total = Map.from_stack(npred_list, axis=label_axis) return npred_total @classmethod def from_geoms( cls, geom, geom_exposure=None, geom_psf=None, geom_edisp=None, reference_time="2000-01-01", name=None, **kwargs, ): """ Create a MapDataset object with zero filled maps according to the specified geometries. Parameters ---------- geom : `Geom` Geometry for the counts and background maps. geom_exposure : `Geom` Geometry for the exposure map. Default is None. geom_psf : `Geom` Geometry for the PSF map. Default is None. geom_edisp : `Geom` Geometry for the energy dispersion kernel map. If geom_edisp has a migra axis, this will create an EDispMap instead. Default is None. reference_time : `~astropy.time.Time` The reference time to use in GTI definition. Default is "2000-01-01". name : str Name of the returned dataset. Default is None. kwargs : dict, optional Keyword arguments to be passed. Returns ------- dataset : `MapDataset` or `SpectrumDataset` A dataset containing zero filled maps. """ name = make_name(name) kwargs = kwargs.copy() kwargs["name"] = name kwargs["counts"] = Map.from_geom(geom, unit="") kwargs["background"] = Map.from_geom(geom, unit="") if geom_exposure: kwargs["exposure"] = Map.from_geom(geom_exposure, unit="m2 s") if geom_edisp: if "energy" in geom_edisp.axes.names: kwargs["edisp"] = EDispKernelMap.from_geom(geom_edisp) else: kwargs["edisp"] = EDispMap.from_geom(geom_edisp) if geom_psf: if "energy_true" in geom_psf.axes.names: kwargs["psf"] = PSFMap.from_geom(geom_psf) elif "energy" in geom_psf.axes.names: kwargs["psf"] = RecoPSFMap.from_geom(geom_psf) kwargs.setdefault( "gti", GTI.create([] * u.s, [] * u.s, reference_time=reference_time) ) kwargs["mask_safe"] = Map.from_geom(geom, unit="", dtype=bool) return cls(**kwargs) @classmethod def create( cls, geom, energy_axis_true=None, migra_axis=None, rad_axis=None, binsz_irf=BINSZ_IRF_DEFAULT, reference_time="2000-01-01", name=None, meta_table=None, reco_psf=False, **kwargs, ): """Create a MapDataset object with zero filled maps. Parameters ---------- geom : `~gammapy.maps.WcsGeom` Reference target geometry in reco energy, used for counts and background maps. energy_axis_true : `~gammapy.maps.MapAxis`, optional True energy axis used for IRF maps. Default is None. migra_axis : `~gammapy.maps.MapAxis`, optional If set, this provides the migration axis for the energy dispersion map. If not set, an EDispKernelMap is produced instead. Default is None. rad_axis : `~gammapy.maps.MapAxis`, optional Rad axis for the PSF map. Default is None. binsz_irf : float IRF Map pixel size in degrees. Default is BINSZ_IRF_DEFAULT. reference_time : `~astropy.time.Time` The reference time to use in GTI definition. Default is "2000-01-01". name : str, optional Name of the returned dataset. Default is None. 
        meta_table : `~astropy.table.Table`, optional
            Table listing information on observations used to create the dataset.
            One line per observation for stacked datasets. Default is None.
        reco_psf : bool
            Use reconstructed energy for the PSF geometry. Default is False.

        Returns
        -------
        empty_maps : `MapDataset`
            A MapDataset containing zero filled maps.

        Examples
        --------
        >>> from gammapy.datasets import MapDataset
        >>> from gammapy.maps import WcsGeom, MapAxis

        >>> energy_axis = MapAxis.from_energy_bounds(1.0, 10.0, 4, unit="TeV")
        >>> energy_axis_true = MapAxis.from_energy_bounds(
        ...     0.5, 20, 10, unit="TeV", name="energy_true"
        ... )
        >>> geom = WcsGeom.create(
        ...     skydir=(83.633, 22.014),
        ...     binsz=0.02, width=(2, 2),
        ...     frame="icrs",
        ...     proj="CAR",
        ...     axes=[energy_axis]
        ... )
        >>> empty = MapDataset.create(geom=geom, energy_axis_true=energy_axis_true, name="empty")
        """
        geoms = create_map_dataset_geoms(
            geom=geom,
            energy_axis_true=energy_axis_true,
            rad_axis=rad_axis,
            migra_axis=migra_axis,
            binsz_irf=binsz_irf,
            reco_psf=reco_psf,
        )

        kwargs.update(geoms)

        return cls.from_geoms(
            reference_time=reference_time, name=name, meta_table=meta_table, **kwargs
        )

    @property
    def mask_safe_image(self):
        """Reduced safe mask."""
        if self.mask_safe is None:
            return None
        return self.mask_safe.reduce_over_axes(func=np.logical_or)

    @property
    def mask_fit_image(self):
        """Reduced fit mask."""
        if self.mask_fit is None:
            return None
        return self.mask_fit.reduce_over_axes(func=np.logical_or)

    @property
    def mask_image(self):
        """Reduced mask."""
        if self.mask is None:
            mask = Map.from_geom(self._geom.to_image(), dtype=bool)
            mask.data |= True
            return mask

        return self.mask.reduce_over_axes(func=np.logical_or)

    @property
    def mask_safe_psf(self):
        """Safe mask for PSF maps."""
        if self.mask_safe is None or self.psf is None:
            return None

        geom = self.psf.psf_map.geom.squash("energy_true").squash("rad")
        mask_safe_psf = self.mask_safe_image.interp_to_geom(geom.to_image())
        return mask_safe_psf.to_cube(geom.axes)

    @property
    def mask_safe_edisp(self):
        """Safe mask for edisp maps."""
        if self.mask_safe is None or self.edisp is None:
            return None

        if self.mask_safe.geom.is_region:
            return self.mask_safe

        geom = self.edisp.edisp_map.geom.squash("energy_true")

        if "migra" in geom.axes.names:
            geom = geom.squash("migra")
            mask_safe_edisp = self.mask_safe_image.interp_to_geom(
                geom.to_image(), fill_value=None
            )
            return mask_safe_edisp.to_cube(geom.axes)

        # allow extrapolation only along spatial dimension
        # to support case where mask_safe geom and irfs geom are different
        geom_same_axes = geom.to_image().to_cube(self.mask_safe.geom.axes)
        mask_safe_edisp = self.mask_safe.interp_to_geom(geom_same_axes, fill_value=None)
        mask_safe_edisp = mask_safe_edisp.interp_to_geom(geom)
        return mask_safe_edisp

    def to_masked(self, name=None, nan_to_num=True):
        """Return masked dataset.

        Parameters
        ----------
        name : str, optional
            Name of the masked dataset. Default is None.
        nan_to_num : bool
            Non-finite values are replaced by zero if True. Default is True.

        Returns
        -------
        dataset : `MapDataset` or `SpectrumDataset`
            Masked dataset.
        """
        dataset = self.__class__.from_geoms(**self.geoms, name=name)
        dataset.stack(self, nan_to_num=nan_to_num)
        return dataset

    def stack(self, other, nan_to_num=True):
        r"""Stack another dataset in place.

        The original dataset is modified.

        Safe mask is applied to the other dataset to compute the stacked counts data.
        Counts outside the safe mask are lost.

        Note that the masking is not applied to the current dataset.
If masking needs to be applied to it, use `~gammapy.MapDataset.to_masked()` first. The stacking of 2 datasets is implemented as follows. Here, :math:`k` denotes a bin in reconstructed energy and :math:`j = {1,2}` is the dataset number. The ``mask_safe`` of each dataset is defined as: .. math:: \epsilon_{jk} =\left\{\begin{array}{cl} 1, & \mbox{if bin k is inside the thresholds}\\ 0, & \mbox{otherwise} \end{array}\right. Then the total ``counts`` and model background ``bkg`` are computed according to: .. math:: \overline{\mathrm{n_{on}}}_k = \mathrm{n_{on}}_{1k} \cdot \epsilon_{1k} + \mathrm{n_{on}}_{2k} \cdot \epsilon_{2k}. \overline{bkg}_k = bkg_{1k} \cdot \epsilon_{1k} + bkg_{2k} \cdot \epsilon_{2k}. The stacked ``safe_mask`` is then: .. math:: \overline{\epsilon_k} = \epsilon_{1k} OR \epsilon_{2k}. For details, see :ref:`stack`. Parameters ---------- other : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.MapDatasetOnOff` Map dataset to be stacked with this one. If other is an on-off dataset alpha * counts_off is used as a background model. nan_to_num : bool Non-finite values are replaced by zero if True. Default is True. """ if self.counts and other.counts: self.counts.stack( other.counts, weights=other.mask_safe, nan_to_num=nan_to_num ) if self.exposure and other.exposure: self.exposure.stack( other.exposure, weights=other.mask_safe_image, nan_to_num=nan_to_num ) # TODO: check whether this can be improved e.g. handling this in GTI if "livetime" in other.exposure.meta and np.any(other.mask_safe_image): if "livetime" in self.exposure.meta: self.exposure.meta["livetime"] += other.exposure.meta["livetime"] else: self.exposure.meta["livetime"] = other.exposure.meta[ "livetime" ].copy() if self.stat_type == "cash": if self.background and other.background: background = self.npred_background() background.stack( other.npred_background(), weights=other.mask_safe, nan_to_num=nan_to_num, ) self.background = background if self.psf and other.psf: self.psf.stack(other.psf, weights=other.mask_safe_psf) if self.edisp and other.edisp: self.edisp.stack(other.edisp, weights=other.mask_safe_edisp) if self.mask_safe and other.mask_safe: self.mask_safe.stack(other.mask_safe) if self.mask_fit and other.mask_fit: self.mask_fit.stack(other.mask_fit) elif other.mask_fit: self.mask_fit = other.mask_fit.copy() if self.gti and other.gti: self.gti.stack(other.gti) self.gti = self.gti.union() if self.meta_table and other.meta_table: self.meta_table = hstack_columns(self.meta_table, other.meta_table) elif other.meta_table: self.meta_table = other.meta_table.copy() if self.meta and other.meta: self.meta.stack(other.meta) def stat_array(self): """Statistic function value per bin given the current model parameters.""" return cash(n_on=self.counts.data, mu_on=self.npred().data) def residuals(self, method="diff", **kwargs): """Compute residuals map. Parameters ---------- method : {"diff", "diff/model", "diff/sqrt(model)"} Method used to compute the residuals. Available options are: - "diff" (default): data - model. - "diff/model": (data - model) / model. - "diff/sqrt(model)": (data - model) / sqrt(model). Default is "diff". **kwargs : dict, optional Keyword arguments forwarded to `Map.smooth()`. Returns ------- residuals : `gammapy.maps.Map` Residual map. 
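
        Examples
        --------
        A minimal sketch using the bundled test dataset (requires
        ``$GAMMAPY_DATA``).

        >>> from gammapy.datasets import MapDataset
        >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz")
        >>> residuals = dataset.residuals(method="diff/sqrt(model)")  # doctest: +SKIP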
""" npred, counts = self.npred(), self.counts.copy() if self.mask: npred = npred * self.mask counts = counts * self.mask if kwargs: kwargs.setdefault("mode", "constant") kwargs.setdefault("width", "0.1 deg") kwargs.setdefault("kernel", "gauss") with np.errstate(invalid="ignore", divide="ignore"): npred = npred.smooth(**kwargs) counts = counts.smooth(**kwargs) if self.mask: mask = self.mask.smooth(**kwargs) npred /= mask counts /= mask residuals = self._compute_residuals(counts, npred, method=method) if self.mask: residuals.data[~self.mask.data] = np.nan return residuals def plot_residuals_spatial( self, ax=None, method="diff", smooth_kernel="gauss", smooth_radius="0.1 deg", **kwargs, ): """Plot spatial residuals. The normalization used for the residuals computation can be controlled using the method parameter. Parameters ---------- ax : `~astropy.visualization.wcsaxes.WCSAxes`, optional Axes to plot on. Default is None. method : {"diff", "diff/model", "diff/sqrt(model)"} Normalization used to compute the residuals, see `MapDataset.residuals`. Default is "diff". smooth_kernel : {"gauss", "box"} Kernel shape. Default is "gauss". smooth_radius: `~astropy.units.Quantity`, str or float Smoothing width given as quantity or float. If a float is given, it is interpreted as smoothing width in pixels. Default is "0.1 deg". **kwargs : dict, optional Keyword arguments passed to `~matplotlib.axes.Axes.imshow`. Returns ------- ax : `~astropy.visualization.wcsaxes.WCSAxes` WCSAxes object. Examples -------- >>> from gammapy.datasets import MapDataset >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> kwargs = {"cmap": "RdBu_r", "vmin":-5, "vmax":5, "add_cbar": True} >>> dataset.plot_residuals_spatial(method="diff/sqrt(model)", **kwargs) # doctest: +SKIP """ counts, npred = self.counts.copy(), self.npred() if counts.geom.is_region: raise ValueError("Cannot plot spatial residuals for RegionNDMap") if self.mask is not None: counts *= self.mask npred *= self.mask counts_spatial = counts.sum_over_axes().smooth( width=smooth_radius, kernel=smooth_kernel ) npred_spatial = npred.sum_over_axes().smooth( width=smooth_radius, kernel=smooth_kernel ) residuals = self._compute_residuals(counts_spatial, npred_spatial, method) if self.mask_safe is not None: mask = self.mask_safe.reduce_over_axes(func=np.logical_or, keepdims=True) residuals.data[~mask.data] = np.nan kwargs.setdefault("add_cbar", True) kwargs.setdefault("cmap", "coolwarm") kwargs.setdefault("vmin", -5) kwargs.setdefault("vmax", 5) ax = residuals.plot(ax, **kwargs) return ax def plot_residuals_spectral(self, ax=None, method="diff", region=None, **kwargs): """Plot spectral residuals. The residuals are extracted from the provided region, and the normalization used for its computation can be controlled using the method parameter. The error bars are computed using the uncertainty on the excess with a symmetric assumption. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axes to plot on. Default is None. method : {"diff", "diff/sqrt(model)"} Normalization used to compute the residuals, see `SpectrumDataset.residuals`. Default is "diff". region : `~regions.SkyRegion` (required) Target sky region. Default is None. **kwargs : dict, optional Keyword arguments passed to `~matplotlib.axes.Axes.errorbar`. Returns ------- ax : `~matplotlib.axes.Axes` Axes object. 
Examples -------- >>> from gammapy.datasets import MapDataset >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> kwargs = {"markerfacecolor": "blue", "markersize":8, "marker":'s'} >>> dataset.plot_residuals_spectral(method="diff/sqrt(model)", **kwargs) # doctest: +SKIP """ counts, npred = self.counts.copy(), self.npred() counts_spec = counts.get_spectrum(region) npred_spec = npred.get_spectrum(region) residuals = self._compute_residuals(counts_spec, npred_spec, method) if self.stat_type == "wstat": counts_off = (self.counts_off).get_spectrum(region) with np.errstate(invalid="ignore"): alpha = self.background.get_spectrum(region) / counts_off mu_sig = self.npred_signal().get_spectrum(region) stat = WStatCountsStatistic( n_on=counts_spec, n_off=counts_off, alpha=alpha, mu_sig=mu_sig, ) elif self.stat_type == "cash": stat = CashCountsStatistic(counts_spec.data, npred_spec.data) excess_error = stat.error if method == "diff": yerr = excess_error elif method == "diff/sqrt(model)": yerr = excess_error / np.sqrt(npred_spec.data) else: raise ValueError( 'Invalid method, choose between "diff" and "diff/sqrt(model)"' ) kwargs.setdefault("color", kwargs.pop("c", "black")) ax = residuals.plot(ax, yerr=yerr, **kwargs) ax.axhline(0, color=kwargs["color"], lw=0.5) label = self._residuals_labels[method] ax.set_ylabel(f"Residuals ({label})") ax.set_yscale("linear") ymin = 1.05 * np.nanmin(residuals.data - yerr) ymax = 1.05 * np.nanmax(residuals.data + yerr) ax.set_ylim(ymin, ymax) return ax def plot_residuals( self, ax_spatial=None, ax_spectral=None, kwargs_spatial=None, kwargs_spectral=None, ): """Plot spatial and spectral residuals in two panels. Calls `~MapDataset.plot_residuals_spatial` and `~MapDataset.plot_residuals_spectral`. The spectral residuals are extracted from the provided region, and the normalization used for its computation can be controlled using the method parameter. The region outline is overlaid on the residuals map. If no region is passed, the residuals are computed for the entire map. Parameters ---------- ax_spatial : `~astropy.visualization.wcsaxes.WCSAxes`, optional Axes to plot spatial residuals on. Default is None. ax_spectral : `~matplotlib.axes.Axes`, optional Axes to plot spectral residuals on. Default is None. kwargs_spatial : dict, optional Keyword arguments passed to `~MapDataset.plot_residuals_spatial`. Default is None. kwargs_spectral : dict, optional Keyword arguments passed to `~MapDataset.plot_residuals_spectral`. The region should be passed as a dictionary key. Default is None. Returns ------- ax_spatial, ax_spectral : `~astropy.visualization.wcsaxes.WCSAxes`, `~matplotlib.axes.Axes` Spatial and spectral residuals plots. 
Examples -------- >>> from regions import CircleSkyRegion >>> from astropy.coordinates import SkyCoord >>> import astropy.units as u >>> from gammapy.datasets import MapDataset >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> reg = CircleSkyRegion(SkyCoord(0,0, unit="deg", frame="galactic"), radius=1.0 * u.deg) >>> kwargs_spatial = {"cmap": "RdBu_r", "vmin":-5, "vmax":5, "add_cbar": True} >>> kwargs_spectral = {"region":reg, "markerfacecolor": "blue", "markersize": 8, "marker": "s"} >>> dataset.plot_residuals(kwargs_spatial=kwargs_spatial, kwargs_spectral=kwargs_spectral) # doctest: +SKIP """ ax_spatial, ax_spectral = get_axes( ax_spatial, ax_spectral, 12, 4, [1, 2, 1], [1, 2, 2], {"projection": self._geom.to_image().wcs}, ) kwargs_spatial = kwargs_spatial or {} kwargs_spectral = kwargs_spectral or {} self.plot_residuals_spatial(ax_spatial, **kwargs_spatial) self.plot_residuals_spectral(ax_spectral, **kwargs_spectral) # Overlay spectral extraction region on the spatial residuals region = kwargs_spectral.get("region") if region is not None: pix_region = region.to_pixel(self._geom.to_image().wcs) pix_region.plot(ax=ax_spatial) return ax_spatial, ax_spectral def stat_sum(self): """Total statistic function value given the current model parameters and priors.""" prior_stat_sum = 0.0 if self.models is not None: prior_stat_sum = self.models.parameters.prior_stat_sum() counts, npred = self.counts.data.astype(float), self.npred().data if self.mask is not None: return ( cash_sum_cython(counts[self.mask.data], npred[self.mask.data]) + prior_stat_sum ) else: return cash_sum_cython(counts.ravel(), npred.ravel()) + prior_stat_sum def fake(self, random_state="random-seed"): """Simulate fake counts for the current model and reduced IRFs. This method overwrites the counts defined on the dataset object. Parameters ---------- random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is "random-seed". """ random_state = get_random_state(random_state) npred = self.npred() data = np.nan_to_num(npred.data, copy=True, nan=0.0, posinf=0.0, neginf=0.0) npred.data = random_state.poisson(data) npred.data = npred.data.astype("float") self.counts = npred def to_hdulist(self): """Convert map dataset to list of HDUs. Returns ------- hdulist : `~astropy.io.fits.HDUList` Map dataset list of HDUs. """ # TODO: what todo about the model and background model parameters? 
exclude_primary = slice(1, None) hdu_primary = fits.PrimaryHDU() header = hdu_primary.header header["NAME"] = self.name header.update(self.meta.to_header()) hdulist = fits.HDUList([hdu_primary]) if self.counts is not None: hdulist += self.counts.to_hdulist(hdu="counts")[exclude_primary] if self.exposure is not None: hdulist += self.exposure.to_hdulist(hdu="exposure")[exclude_primary] if self.background is not None: hdulist += self.background.to_hdulist(hdu="background")[exclude_primary] if self.edisp is not None: hdulist += self.edisp.to_hdulist()[exclude_primary] if self.psf is not None: hdulist += self.psf.to_hdulist()[exclude_primary] if self.mask_safe is not None: hdulist += self.mask_safe.to_hdulist(hdu="mask_safe")[exclude_primary] if self.mask_fit is not None: hdulist += self.mask_fit.to_hdulist(hdu="mask_fit")[exclude_primary] if self.gti is not None: hdulist.append(self.gti.to_table_hdu()) if self.meta_table is not None: hdulist.append(fits.BinTableHDU(self.meta_table, name="META_TABLE")) return hdulist @classmethod def from_hdulist(cls, hdulist, name=None, lazy=False, format="gadf"): """Create map dataset from list of HDUs. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` List of HDUs. name : str, optional Name of the new dataset. Default is None. lazy : bool Whether to lazy load data into memory. Default is False. format : {"gadf"} Format the hdulist is given in. Default is "gadf". Returns ------- dataset : `MapDataset` Map dataset. """ name = make_name(name) kwargs = {"name": name} kwargs["meta"] = MapDatasetMetaData.from_header(hdulist["PRIMARY"].header) if "COUNTS" in hdulist: kwargs["counts"] = Map.from_hdulist(hdulist, hdu="counts", format=format) if "EXPOSURE" in hdulist: exposure = Map.from_hdulist(hdulist, hdu="exposure", format=format) if exposure.geom.axes[0].name == "energy": exposure.geom.axes[0].name = "energy_true" kwargs["exposure"] = exposure if "BACKGROUND" in hdulist: kwargs["background"] = Map.from_hdulist( hdulist, hdu="background", format=format ) if "EDISP" in hdulist: kwargs["edisp"] = EDispMap.from_hdulist( hdulist, hdu="edisp", exposure_hdu="edisp_exposure", format=format ) if "PSF" in hdulist: kwargs["psf"] = PSFMap.from_hdulist( hdulist, hdu="psf", exposure_hdu="psf_exposure", format=format ) if "MASK_SAFE" in hdulist: mask_safe = Map.from_hdulist(hdulist, hdu="mask_safe", format=format) mask_safe.data = mask_safe.data.astype(bool) kwargs["mask_safe"] = mask_safe if "MASK_FIT" in hdulist: mask_fit = Map.from_hdulist(hdulist, hdu="mask_fit", format=format) mask_fit.data = mask_fit.data.astype(bool) kwargs["mask_fit"] = mask_fit if "GTI" in hdulist: gti = GTI.from_table_hdu(hdulist["GTI"]) kwargs["gti"] = gti if "META_TABLE" in hdulist: meta_table = Table.read(hdulist, hdu="META_TABLE") kwargs["meta_table"] = meta_table return cls(**kwargs) def write(self, filename, overwrite=False, checksum=False): """Write Dataset to file. A MapDataset is serialised using the GADF format with a WCS geometry. A SpectrumDataset uses the same format, with a RegionGeom. Parameters ---------- filename : str Filename to write to. overwrite : bool, optional Overwrite existing file. Default is False. checksum : bool When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. 
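
        Examples
        --------
        A minimal sketch; the output filename is arbitrary and the example
        requires ``$GAMMAPY_DATA``.

        >>> from gammapy.datasets import MapDataset
        >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz")
        >>> dataset.write("my-dataset.fits.gz", overwrite=True)  # doctest: +SKIP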
""" self.to_hdulist().writeto( str(make_path(filename)), overwrite=overwrite, checksum=checksum ) @classmethod def _read_lazy(cls, name, filename, cache, format=format): name = make_name(name) kwargs = {"name": name} try: kwargs["gti"] = GTI.read(filename) except KeyError: pass path = make_path(filename) for hdu_name in ["counts", "exposure", "mask_fit", "mask_safe", "background"]: kwargs[hdu_name] = HDULocation( hdu_class="map", file_dir=path.parent, file_name=path.name, hdu_name=hdu_name.upper(), cache=cache, format=format, ) kwargs["edisp"] = HDULocation( hdu_class="edisp_map", file_dir=path.parent, file_name=path.name, hdu_name="EDISP", cache=cache, format=format, ) kwargs["psf"] = HDULocation( hdu_class="psf_map", file_dir=path.parent, file_name=path.name, hdu_name="PSF", cache=cache, format=format, ) return cls(**kwargs) @classmethod def read( cls, filename, name=None, lazy=False, cache=True, format="gadf", checksum=False ): """Read a dataset from file. Parameters ---------- filename : str Filename to read from. name : str, optional Name of the new dataset. Default is None. lazy : bool Whether to lazy load data into memory. Default is False. cache : bool Whether to cache the data after loading. Default is True. format : {"gadf"} Format of the dataset file. Default is "gadf". checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- dataset : `MapDataset` Map dataset. """ if name is None: header = fits.getheader(str(make_path(filename))) name = header.get("NAME", name) ds_name = make_name(name) if lazy: return cls._read_lazy( name=ds_name, filename=filename, cache=cache, format=format ) else: with fits.open( str(make_path(filename)), memmap=False, checksum=checksum ) as hdulist: return cls.from_hdulist(hdulist, name=ds_name, format=format) @classmethod def from_dict(cls, data, lazy=False, cache=True): """Create from dicts and models list generated from YAML serialization.""" filename = make_path(data["filename"]) dataset = cls.read(filename, name=data["name"], lazy=lazy, cache=cache) return dataset @property def _counts_statistic(self): """Counts statistics of the dataset.""" return CashCountsStatistic(self.counts, self.npred_background()) def info_dict(self, in_safe_data_range=True): """Info dict with summary statistics, summed over energy. Parameters ---------- in_safe_data_range : bool Whether to sum only in the safe energy range. Default is True. Returns ------- info_dict : dict Dictionary with summary info. 
""" info = {} info["name"] = self.name if self.mask_safe and in_safe_data_range: mask = self.mask_safe.data.astype(bool) else: mask = slice(None) counts = 0 background, excess, sqrt_ts = np.nan, np.nan, np.nan if self.counts: counts = self.counts.data[mask].sum() if self.background: summed_stat = self._counts_statistic[mask].sum() background = self.background.data[mask].sum() excess = summed_stat.n_sig sqrt_ts = summed_stat.sqrt_ts info["counts"] = int(counts) info["excess"] = float(excess) info["sqrt_ts"] = sqrt_ts info["background"] = float(background) npred = np.nan if self.models or not np.isnan(background): npred = self.npred().data[mask].sum() info["npred"] = float(npred) npred_background = np.nan if self.background: npred_background = self.npred_background().data[mask].sum() info["npred_background"] = float(npred_background) npred_signal = np.nan if self.models and ( len(self.models) > 1 or not isinstance(self.models[0], FoVBackgroundModel) ): npred_signal = self.npred_signal().data[mask].sum() info["npred_signal"] = float(npred_signal) exposure_min = np.nan * u.Unit("cm s") exposure_max = np.nan * u.Unit("cm s") livetime = np.nan * u.s if self.exposure is not None: mask_exposure = self.exposure.data > 0 if self.mask_safe is not None: mask_spatial = self.mask_safe.reduce_over_axes(func=np.logical_or).data mask_exposure = mask_exposure & mask_spatial[np.newaxis, :, :] if not mask_exposure.any(): mask_exposure = slice(None) exposure_min = np.min(self.exposure.quantity[mask_exposure]) exposure_max = np.max(self.exposure.quantity[mask_exposure]) livetime = self.exposure.meta.get("livetime", np.nan * u.s).copy() info["exposure_min"] = exposure_min.item() info["exposure_max"] = exposure_max.item() info["livetime"] = livetime ontime = u.Quantity(np.nan, "s") if self.gti: ontime = self.gti.time_sum info["ontime"] = ontime info["counts_rate"] = info["counts"] / info["livetime"] info["background_rate"] = info["background"] / info["livetime"] info["excess_rate"] = info["excess"] / info["livetime"] # data section n_bins = 0 if self.counts is not None: n_bins = self.counts.data.size info["n_bins"] = int(n_bins) n_fit_bins = 0 if self.mask is not None: n_fit_bins = np.sum(self.mask.data) info["n_fit_bins"] = int(n_fit_bins) info["stat_type"] = self.stat_type stat_sum = np.nan if self.counts is not None and self.models is not None: stat_sum = self.stat_sum() info["stat_sum"] = float(stat_sum) return info def to_spectrum_dataset(self, on_region, containment_correction=False, name=None): """Return a ~gammapy.datasets.SpectrumDataset from on_region. Counts and background are summed in the on_region. Exposure is taken from the average exposure. The energy dispersion kernel is obtained at the on_region center. Only regions with centers are supported. The model is not exported to the ~gammapy.datasets.SpectrumDataset. It must be set after the dataset extraction. Parameters ---------- on_region : `~regions.SkyRegion` The input ON region on which to extract the spectrum. containment_correction : bool Apply containment correction for point sources and circular on regions. Default is False. name : str, optional Name of the new dataset. Default is None. Returns ------- dataset : `~gammapy.datasets.SpectrumDataset` The resulting reduced dataset. """ from .spectrum import SpectrumDataset dataset = self.to_region_map_dataset(region=on_region, name=name) if containment_correction: if not isinstance(on_region, CircleSkyRegion): raise TypeError( "Containment correction is only supported for" " `CircleSkyRegion`." 
) elif self.psf is None or isinstance(self.psf, PSFKernel): raise ValueError("No PSFMap set. Containment correction impossible") else: geom = dataset.exposure.geom energy_true = geom.axes["energy_true"].center containment = self.psf.containment( position=on_region.center, energy_true=energy_true, rad=on_region.radius, ) dataset.exposure.quantity *= containment.reshape(geom.data_shape) kwargs = {"name": name} for key in [ "counts", "edisp", "mask_safe", "mask_fit", "exposure", "gti", "meta_table", ]: kwargs[key] = getattr(dataset, key) if self.stat_type == "cash": kwargs["background"] = dataset.background return SpectrumDataset(**kwargs) def to_region_map_dataset(self, region, name=None): """Integrate the map dataset in a given region. Counts and background of the dataset are integrated in the given region, taking the safe mask into account. The exposure is averaged in the region again taking the safe mask into account. The PSF and energy dispersion kernel are taken at the center of the region. Parameters ---------- region : `~regions.SkyRegion` Region from which to extract the spectrum. name : str, optional Name of the new dataset. Default is None. Returns ------- dataset : `~gammapy.datasets.MapDataset` The resulting reduced dataset. """ name = make_name(name) kwargs = {"gti": self.gti, "name": name, "meta_table": self.meta_table} if self.mask_safe: kwargs["mask_safe"] = self.mask_safe.to_region_nd_map(region, func=np.any) if self.mask_fit: kwargs["mask_fit"] = self.mask_fit.to_region_nd_map(region, func=np.any) if self.counts: kwargs["counts"] = self.counts.to_region_nd_map( region, np.sum, weights=self.mask_safe ) if self.stat_type == "cash" and self.background: kwargs["background"] = self.background.to_region_nd_map( region, func=np.sum, weights=self.mask_safe ) if self.exposure: kwargs["exposure"] = self.exposure.to_region_nd_map(region, func=np.mean) region = region.center if region else None # TODO: Compute average psf in region if self.psf: kwargs["psf"] = self.psf.to_region_nd_map(region) # TODO: Compute average edisp in region if self.edisp is not None: kwargs["edisp"] = self.edisp.to_region_nd_map(region) return self.__class__(**kwargs) def cutout(self, position, width, mode="trim", name=None): """Cutout map dataset. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. width : tuple of `~astropy.coordinates.Angle` Angular sizes of the region in (lon, lat) in that specific order. If only one value is passed, a square region is extracted. mode : {'trim', 'partial', 'strict'} Mode option for Cutout2D, for details see `~astropy.nddata.utils.Cutout2D`. Default is "trim". name : str, optional Name of the new dataset. Default is None. Returns ------- cutout : `MapDataset` Cutout map dataset. 
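
        Examples
        --------
        A minimal sketch cutting out a 3 degree wide box around the Galactic
        center (requires ``$GAMMAPY_DATA``).

        >>> from astropy.coordinates import SkyCoord
        >>> from gammapy.datasets import MapDataset
        >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz")
        >>> position = SkyCoord(0, 0, unit="deg", frame="galactic")
        >>> cutout = dataset.cutout(position=position, width="3 deg")  # doctest: +SKIP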
""" name = make_name(name) kwargs = {"gti": self.gti, "name": name, "meta_table": self.meta_table} cutout_kwargs = {"position": position, "width": width, "mode": mode} if self.counts is not None: kwargs["counts"] = self.counts.cutout(**cutout_kwargs) if self.exposure is not None: kwargs["exposure"] = self.exposure.cutout(**cutout_kwargs) if self.background is not None and self.stat_type == "cash": kwargs["background"] = self.background.cutout(**cutout_kwargs) if self.edisp is not None: kwargs["edisp"] = self.edisp.cutout(**cutout_kwargs) if self.psf is not None: kwargs["psf"] = self.psf.cutout(**cutout_kwargs) if self.mask_safe is not None: kwargs["mask_safe"] = self.mask_safe.cutout(**cutout_kwargs) if self.mask_fit is not None: kwargs["mask_fit"] = self.mask_fit.cutout(**cutout_kwargs) return self.__class__(**kwargs) def downsample(self, factor, axis_name=None, name=None): """Downsample map dataset. The PSFMap and EDispKernelMap are not downsampled, except if a corresponding axis is given. Parameters ---------- factor : int Downsampling factor. axis_name : str, optional Which non-spatial axis to downsample. By default only spatial axes are downsampled. Default is None. name : str, optional Name of the downsampled dataset. Default is None. Returns ------- dataset : `MapDataset` or `SpectrumDataset` Downsampled map dataset. """ name = make_name(name) kwargs = {"gti": self.gti, "name": name, "meta_table": self.meta_table} if self.counts is not None: kwargs["counts"] = self.counts.downsample( factor=factor, preserve_counts=True, axis_name=axis_name, weights=self.mask_safe, ) if self.exposure is not None: if axis_name is None: kwargs["exposure"] = self.exposure.downsample( factor=factor, preserve_counts=False, axis_name=None ) else: kwargs["exposure"] = self.exposure.copy() if self.background is not None and self.stat_type == "cash": kwargs["background"] = self.background.downsample( factor=factor, axis_name=axis_name, weights=self.mask_safe ) if self.edisp is not None: if axis_name is not None: kwargs["edisp"] = self.edisp.downsample( factor=factor, axis_name=axis_name, weights=self.mask_safe_edisp ) else: kwargs["edisp"] = self.edisp.copy() if self.psf is not None: kwargs["psf"] = self.psf.copy() if self.mask_safe is not None: kwargs["mask_safe"] = self.mask_safe.downsample( factor=factor, preserve_counts=False, axis_name=axis_name ) if self.mask_fit is not None: kwargs["mask_fit"] = self.mask_fit.downsample( factor=factor, preserve_counts=False, axis_name=axis_name ) return self.__class__(**kwargs) def pad(self, pad_width, mode="constant", name=None): """Pad the spatial dimensions of the dataset. The padding only applies to counts, masks, background and exposure. Counts, background and masks are padded with zeros, exposure is padded with edge value. Parameters ---------- pad_width : {sequence, array_like, int} Number of pixels padded to the edges of each axis. mode : str Pad mode. Default is "constant". name : str, optional Name of the padded dataset. Default is None. Returns ------- dataset : `MapDataset` Padded map dataset. 
""" name = make_name(name) kwargs = {"gti": self.gti, "name": name, "meta_table": self.meta_table} if self.counts is not None: kwargs["counts"] = self.counts.pad(pad_width=pad_width, mode=mode) if self.exposure is not None: kwargs["exposure"] = self.exposure.pad(pad_width=pad_width, mode=mode) if self.background is not None: kwargs["background"] = self.background.pad(pad_width=pad_width, mode=mode) if self.edisp is not None: kwargs["edisp"] = self.edisp.copy() if self.psf is not None: kwargs["psf"] = self.psf.copy() if self.mask_safe is not None: kwargs["mask_safe"] = self.mask_safe.pad(pad_width=pad_width, mode=mode) if self.mask_fit is not None: kwargs["mask_fit"] = self.mask_fit.pad(pad_width=pad_width, mode=mode) return self.__class__(**kwargs) def slice_by_idx(self, slices, name=None): """Slice sub dataset. The slicing only applies to the maps that define the corresponding axes. Parameters ---------- slices : dict Dictionary of axes names and integers or `slice` object pairs. Contains one element for each non-spatial dimension. For integer indexing the corresponding axes is dropped from the map. Axes not specified in the dict are kept unchanged. name : str, optional Name of the sliced dataset. Default is None. Returns ------- dataset : `MapDataset` or `SpectrumDataset` Sliced dataset. Examples -------- >>> from gammapy.datasets import MapDataset >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> slices = {"energy": slice(0, 3)} #to get the first 3 energy slices >>> sliced = dataset.slice_by_idx(slices) >>> print(sliced.geoms["geom"]) WcsGeom axes : ['lon', 'lat', 'energy'] shape : (np.int64(320), np.int64(240), 3) ndim : 3 frame : galactic projection : CAR center : 0.0 deg, 0.0 deg width : 8.0 deg x 6.0 deg wcs ref : 0.0 deg, 0.0 deg """ name = make_name(name) kwargs = {"gti": self.gti, "name": name, "meta_table": self.meta_table} if self.counts is not None: kwargs["counts"] = self.counts.slice_by_idx(slices=slices) if self.exposure is not None: kwargs["exposure"] = self.exposure.slice_by_idx(slices=slices) if self.background is not None and self.stat_type == "cash": kwargs["background"] = self.background.slice_by_idx(slices=slices) if self.edisp is not None: kwargs["edisp"] = self.edisp.slice_by_idx(slices=slices) if self.psf is not None: kwargs["psf"] = self.psf.slice_by_idx(slices=slices) if self.mask_safe is not None: kwargs["mask_safe"] = self.mask_safe.slice_by_idx(slices=slices) if self.mask_fit is not None: kwargs["mask_fit"] = self.mask_fit.slice_by_idx(slices=slices) return self.__class__(**kwargs) def slice_by_energy(self, energy_min=None, energy_max=None, name=None): """Select and slice datasets in energy range. Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity`, optional Energy bounds to compute the flux point for. Default is None. name : str, optional Name of the sliced dataset. Default is None. Returns ------- dataset : `MapDataset` Sliced Dataset. 
        Examples
        --------
        >>> from gammapy.datasets import MapDataset
        >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz")
        >>> sliced = dataset.slice_by_energy(energy_min="1 TeV", energy_max="5 TeV")
        >>> sliced.data_shape
        (3, np.int64(240), np.int64(320))
        """
        name = make_name(name)
        energy_axis = self._geom.axes["energy"]

        if energy_min is None:
            energy_min = energy_axis.bounds[0]

        if energy_max is None:
            energy_max = energy_axis.bounds[1]

        energy_min, energy_max = u.Quantity(energy_min), u.Quantity(energy_max)

        group = energy_axis.group_table(edges=[energy_min, energy_max])

        is_normal = group["bin_type"] == "normal "
        group = group[is_normal]

        slices = {
            "energy": slice(int(group["idx_min"][0]), int(group["idx_max"][0]) + 1)
        }

        return self.slice_by_idx(slices, name=name)

    def reset_data_cache(self):
        """Reset data cache to free memory space."""
        for name in self._lazy_data_members:
            if self.__dict__.pop(name, False):
                log.info(f"Clearing {name} cache for dataset {self.name}")

    def resample_energy_axis(self, energy_axis, name=None):
        """Resample MapDataset over new reco energy axis.

        Counts are summed taking into account safe mask.

        Parameters
        ----------
        energy_axis : `~gammapy.maps.MapAxis`
            New reconstructed energy axis.
        name : str, optional
            Name of the new dataset. Default is None.

        Returns
        -------
        dataset : `MapDataset` or `SpectrumDataset`
            Resampled dataset.
        """
        name = make_name(name)
        kwargs = {"gti": self.gti, "name": name, "meta_table": self.meta_table}

        if self.exposure:
            kwargs["exposure"] = self.exposure

        if self.psf:
            kwargs["psf"] = self.psf

        if self.mask_safe is not None:
            kwargs["mask_safe"] = self.mask_safe.resample_axis(
                axis=energy_axis, ufunc=np.logical_or
            )

        if self.mask_fit is not None:
            kwargs["mask_fit"] = self.mask_fit.resample_axis(
                axis=energy_axis, ufunc=np.logical_or
            )

        if self.counts is not None:
            kwargs["counts"] = self.counts.resample_axis(
                axis=energy_axis, weights=self.mask_safe
            )

        if self.background is not None and self.stat_type == "cash":
            kwargs["background"] = self.background.resample_axis(
                axis=energy_axis, weights=self.mask_safe
            )

        # TODO: should the weights be mask_safe or mask_irf?
        if isinstance(self.edisp, EDispKernelMap):
            kwargs["edisp"] = self.edisp.resample_energy_axis(
                energy_axis=energy_axis, weights=self.mask_safe_edisp
            )
        else:  # None or EDispMap
            kwargs["edisp"] = self.edisp

        return self.__class__(**kwargs)

    def to_image(self, name=None):
        """Create images by summing over the reconstructed energy axis.

        Parameters
        ----------
        name : str, optional
            Name of the new dataset. Default is None.

        Returns
        -------
        dataset : `MapDataset` or `SpectrumDataset`
            Dataset integrated over non-spatial axes.
        """
        energy_axis = self._geom.axes["energy"].squash()
        return self.resample_energy_axis(energy_axis=energy_axis, name=name)

    def peek(self, figsize=(12, 8)):
        """Quick-look summary plots.

        Parameters
        ----------
        figsize : tuple
            Size of the figure. Default is (12, 8).
""" def plot_mask(ax, mask, **kwargs): if mask is not None: mask.plot_mask(ax=ax, **kwargs) fig, axes = plt.subplots( ncols=2, nrows=2, subplot_kw={"projection": self._geom.wcs}, figsize=figsize, gridspec_kw={"hspace": 0.25, "wspace": 0.1}, ) axes = axes.flat axes[0].set_title("Counts") self.counts.sum_over_axes().plot(ax=axes[0], add_cbar=True) plot_mask(ax=axes[0], mask=self.mask_fit_image, alpha=0.2) plot_mask(ax=axes[0], mask=self.mask_safe_image, hatches=["///"], colors="w") axes[1].set_title("Excess counts") self.excess.sum_over_axes().plot(ax=axes[1], add_cbar=True) plot_mask(ax=axes[1], mask=self.mask_fit_image, alpha=0.2) plot_mask(ax=axes[1], mask=self.mask_safe_image, hatches=["///"], colors="w") axes[2].set_title("Exposure") self.exposure.sum_over_axes().plot(ax=axes[2], add_cbar=True) plot_mask(ax=axes[2], mask=self.mask_safe_image, hatches=["///"], colors="w") axes[3].set_title("Background") self.background.sum_over_axes().plot(ax=axes[3], add_cbar=True) plot_mask(ax=axes[3], mask=self.mask_fit_image, alpha=0.2) plot_mask(ax=axes[3], mask=self.mask_safe_image, hatches=["///"], colors="w") class MapDatasetOnOff(MapDataset): """Map dataset for on-off likelihood fitting. It bundles together the binned on and off counts, the binned IRFs as well as the on and off acceptances. A safe mask and a fit mask can be added to exclude bins during the analysis. It uses the Wstat statistic (see `~gammapy.stats.wstat`), therefore no background model is needed. For more information see :ref:`datasets`. Parameters ---------- models : `~gammapy.modeling.models.Models` Source sky models. counts : `~gammapy.maps.WcsNDMap` Counts cube. counts_off : `~gammapy.maps.WcsNDMap` Ring-convolved counts cube. acceptance : `~gammapy.maps.WcsNDMap` Acceptance from the IRFs. acceptance_off : `~gammapy.maps.WcsNDMap` Acceptance off. exposure : `~gammapy.maps.WcsNDMap` Exposure cube. mask_fit : `~gammapy.maps.WcsNDMap` Mask to apply to the likelihood for fitting. psf : `~gammapy.irf.PSFKernel` PSF kernel. edisp : `~gammapy.irf.EDispKernel` Energy dispersion. mask_safe : `~gammapy.maps.WcsNDMap` Mask defining the safe data range. gti : `~gammapy.data.GTI` GTI of the observation or union of GTI if it is a stacked observation. meta_table : `~astropy.table.Table` Table listing information on observations used to create the dataset. One line per observation for stacked datasets. name : str Name of the dataset. meta : `~gammapy.datasets.MapDatasetMetaData` Associated meta data container See Also -------- MapDataset, SpectrumDataset, FluxPointsDataset. 
""" stat_type = "wstat" tag = "MapDatasetOnOff" def __init__( self, models=None, counts=None, counts_off=None, acceptance=None, acceptance_off=None, exposure=None, mask_fit=None, psf=None, edisp=None, name=None, mask_safe=None, gti=None, meta_table=None, meta=None, ): self._name = make_name(name) self._evaluators = {} self.counts = counts self.counts_off = counts_off self.exposure = exposure self.acceptance = acceptance self.acceptance_off = acceptance_off self.gti = gti self.mask_fit = mask_fit self.psf = psf self.edisp = edisp self.models = models self.mask_safe = mask_safe self.meta_table = meta_table if meta is None: self._meta = MapDatasetMetaData() else: self._meta = meta def __str__(self): str_ = super().__str__() if self.mask_safe: mask = self.mask_safe.data.astype(bool) else: mask = slice(None) counts_off = np.nan if self.counts_off is not None: counts_off = np.sum(self.counts_off.data[mask]) str_ += "\t{:32}: {:.0f} \n".format("Total counts_off", counts_off) acceptance = np.nan if self.acceptance is not None: acceptance = np.sum(self.acceptance.data[mask]) str_ += "\t{:32}: {:.0f} \n".format("Acceptance", acceptance) acceptance_off = np.nan if self.acceptance_off is not None: acceptance_off = np.sum(self.acceptance_off.data[mask]) str_ += "\t{:32}: {:.0f} \n".format("Acceptance off", acceptance_off) return str_.expandtabs(tabsize=2) @property def _geom(self): """Main analysis geometry.""" if self.counts is not None: return self.counts.geom elif self.counts_off is not None: return self.counts_off.geom elif self.acceptance is not None: return self.acceptance.geom elif self.acceptance_off is not None: return self.acceptance_off.geom else: raise ValueError( "Either 'counts', 'counts_off', 'acceptance' or 'acceptance_of' must be defined." ) @property def alpha(self): """Exposure ratio between signal and background regions. See :ref:`wstat`. Returns ------- alpha : `Map` Alpha map. """ with np.errstate(invalid="ignore", divide="ignore"): data = self.acceptance.quantity / self.acceptance_off.quantity data = np.nan_to_num(data) return Map.from_geom(self._geom, data=data.to_value(""), unit="") def npred_background(self): """Predicted background counts estimated from the marginalized likelihood estimate. See :ref:`wstat`. Returns ------- npred_background : `Map` Predicted background counts. """ mu_bkg = self.alpha.data * get_wstat_mu_bkg( n_on=self.counts.data, n_off=self.counts_off.data, alpha=self.alpha.data, mu_sig=self.npred_signal().data, ) mu_bkg = np.nan_to_num(mu_bkg) return Map.from_geom(geom=self._geom, data=mu_bkg) def npred_off(self): """Predicted counts in the off region; mu_bkg/alpha. See :ref:`wstat`. Returns ------- npred_off : `Map` Predicted off counts. """ return self.npred_background() / self.alpha @property def background(self): """Computed as alpha * n_off. See :ref:`wstat`. Returns ------- background : `Map` Background map. 
""" if self.counts_off is None: return None return self.alpha * self.counts_off def stat_array(self): """Statistic function value per bin given the current model parameters.""" mu_sig = self.npred_signal().data on_stat_ = wstat( n_on=self.counts.data, n_off=self.counts_off.data, alpha=list(self.alpha.data), mu_sig=mu_sig, ) return np.nan_to_num(on_stat_) @property def _counts_statistic(self): """Counts statistics of the dataset.""" return WStatCountsStatistic(self.counts, self.counts_off, self.alpha) @classmethod def from_geoms( cls, geom, geom_exposure=None, geom_psf=None, geom_edisp=None, reference_time="2000-01-01", name=None, **kwargs, ): """Create an empty `MapDatasetOnOff` object according to the specified geometries. Parameters ---------- geom : `gammapy.maps.WcsGeom` Geometry for the counts, counts_off, acceptance and acceptance_off maps. geom_exposure : `gammapy.maps.WcsGeom`, optional Geometry for the exposure map. Default is None. geom_psf : `gammapy.maps.WcsGeom`, optional Geometry for the PSF map. Default is None. geom_edisp : `gammapy.maps.WcsGeom`, optional Geometry for the energy dispersion kernel map. If geom_edisp has a migra axis, this will create an EDispMap instead. Default is None. reference_time : `~astropy.time.Time` The reference time to use in GTI definition. Default is "2000-01-01". name : str, optional Name of the returned dataset. Default is None. **kwargs : dict, optional Keyword arguments to be passed. Returns ------- empty_maps : `MapDatasetOnOff` A MapDatasetOnOff containing zero filled maps. """ # TODO: it seems the super() pattern does not work here? dataset = MapDataset.from_geoms( geom=geom, geom_exposure=geom_exposure, geom_psf=geom_psf, geom_edisp=geom_edisp, name=name, reference_time=reference_time, **kwargs, ) off_maps = {} for key in ["counts_off", "acceptance", "acceptance_off"]: off_maps[key] = Map.from_geom(geom, unit="") return cls.from_map_dataset(dataset, name=name, **off_maps) @classmethod def from_map_dataset( cls, dataset, acceptance, acceptance_off, counts_off=None, name=None ): """Create on off dataset from a map dataset. Parameters ---------- dataset : `MapDataset` Spectrum dataset defining counts, edisp, aeff, livetime etc. acceptance : `Map` Relative background efficiency in the on region. acceptance_off : `Map` Relative background efficiency in the off region. counts_off : `Map`, optional Off counts map . If the dataset provides a background model, and no off counts are defined. The off counts are deferred from counts_off / alpha. Default is None. name : str, optional Name of the returned dataset. Default is None. Returns ------- dataset : `MapDatasetOnOff` Map dataset on off. """ if counts_off is None and dataset.background is not None: alpha = acceptance / acceptance_off counts_off = dataset.npred_background() / alpha if np.isscalar(acceptance): acceptance = Map.from_geom(dataset._geom, data=acceptance) if np.isscalar(acceptance_off): acceptance_off = Map.from_geom(dataset._geom, data=acceptance_off) return cls( models=dataset.models, counts=dataset.counts, exposure=dataset.exposure, counts_off=counts_off, edisp=dataset.edisp, psf=dataset.psf, mask_safe=dataset.mask_safe, mask_fit=dataset.mask_fit, acceptance=acceptance, acceptance_off=acceptance_off, gti=dataset.gti, name=name, meta_table=dataset.meta_table, ) def to_map_dataset(self, name=None): """Convert a MapDatasetOnOff to a MapDataset. The background model template is taken as alpha * counts_off. Parameters ---------- name : str, optional Name of the new dataset. 
Default is None. Returns ------- dataset : `MapDataset` Map dataset with Cash statistics. """ name = make_name(name) background = self.counts_off * self.alpha if self.counts_off else None return MapDataset( counts=self.counts, exposure=self.exposure, psf=self.psf, edisp=self.edisp, name=name, gti=self.gti, mask_fit=self.mask_fit, mask_safe=self.mask_safe, background=background, meta_table=self.meta_table, ) @property def _is_stackable(self): """Check if the Dataset contains enough information to be stacked.""" incomplete = ( self.acceptance_off is None or self.acceptance is None or self.counts_off is None ) unmasked = np.any(self.mask_safe.data) if incomplete and unmasked: return False else: return True def stack(self, other, nan_to_num=True): r"""Stack another dataset in place. Safe mask is applied to the other dataset to compute the stacked counts data; counts outside the safe mask are lost (as for `~gammapy.datasets.MapDataset.stack`). The ``acceptance`` of the stacked dataset is obtained by stacking the acceptance weighted by the other mask_safe onto the current unweighted acceptance. Note that the masking is not applied to the current dataset. If masking needs to be applied to it, use `~gammapy.datasets.MapDataset.to_masked()` first. The stacked ``acceptance_off`` is scaled so that: .. math:: \alpha_\text{stacked} = \frac{1}{a_\text{off}} = \frac{\alpha_1\text{OFF}_1 + \alpha_2\text{OFF}_2}{\text{OFF}_1 + \text{OFF}_2}. For details, see :ref:`stack`. Parameters ---------- other : `MapDatasetOnOff` Other dataset. nan_to_num : bool Non-finite values are replaced by zero if True. Default is True. """ if not isinstance(other, MapDatasetOnOff): raise TypeError("Incompatible types for MapDatasetOnOff stacking") if not self._is_stackable or not other._is_stackable: raise ValueError("Cannot stack incomplete MapDatasetOnOff.") geom = self.counts.geom total_off = Map.from_geom(geom) total_alpha = Map.from_geom(geom) total_acceptance = Map.from_geom(geom) total_acceptance.stack(self.acceptance, nan_to_num=nan_to_num) total_acceptance.stack( other.acceptance, weights=other.mask_safe, nan_to_num=nan_to_num ) if self.counts_off: total_off.stack(self.counts_off, nan_to_num=nan_to_num) total_alpha.stack(self.alpha * self.counts_off, nan_to_num=nan_to_num) if other.counts_off: total_off.stack( other.counts_off, weights=other.mask_safe, nan_to_num=nan_to_num ) total_alpha.stack( other.alpha * other.counts_off, weights=other.mask_safe, nan_to_num=nan_to_num, ) with np.errstate(divide="ignore", invalid="ignore"): acceptance_off = total_acceptance * total_off / total_alpha average_alpha = total_alpha.data.sum() / total_off.data.sum() # For the bins where the stacked OFF counts equal 0, the alpha value is # computed by weighting with the total OFF counts of each run is_zero = total_off.data == 0 acceptance_off.data[is_zero] = total_acceptance.data[is_zero] / average_alpha self.acceptance.data[...] = total_acceptance.data self.acceptance_off = acceptance_off self.counts_off = total_off super().stack(other, nan_to_num=nan_to_num) def stat_sum(self): """Total statistic function value given the current model parameters. If the off counts are None and no element of the safe mask is True, zero is returned. Otherwise, the statistic sum is computed and returned. """ if self.counts_off is None and not np.any(self.mask_safe.data): return 0 else: return Dataset.stat_sum(self) def fake(self, npred_background, random_state="random-seed"): """Simulate fake counts (on and off) for the current model and reduced IRFs.
This method overwrites the counts defined on the dataset object. Parameters ---------- npred_background : `~gammapy.maps.Map` Expected number of background counts in the on region. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is "random-seed". """ random_state = get_random_state(random_state) npred = self.npred_signal() data = np.nan_to_num(npred.data, copy=True, nan=0.0, posinf=0.0, neginf=0.0) npred.data = random_state.poisson(data) npred_bkg = random_state.poisson(npred_background.data) self.counts = npred + npred_bkg npred_off = npred_background / self.alpha data_off = np.nan_to_num( npred_off.data, copy=True, nan=0.0, posinf=0.0, neginf=0.0 ) npred_off.data = random_state.poisson(data_off) self.counts_off = npred_off def to_hdulist(self): """Convert map dataset to list of HDUs. Returns ------- hdulist : `~astropy.io.fits.HDUList` Map dataset list of HDUs. """ hdulist = super().to_hdulist() exclude_primary = slice(1, None) del hdulist["BACKGROUND"] del hdulist["BACKGROUND_BANDS"] if self.counts_off is not None: hdulist += self.counts_off.to_hdulist(hdu="counts_off")[exclude_primary] if self.acceptance is not None: hdulist += self.acceptance.to_hdulist(hdu="acceptance")[exclude_primary] if self.acceptance_off is not None: hdulist += self.acceptance_off.to_hdulist(hdu="acceptance_off")[ exclude_primary ] return hdulist @classmethod def _read_lazy(cls, filename, name=None, cache=True, format="gadf"): raise NotImplementedError( f"Lazy loading is not implemented for {cls}, please use option lazy=False." ) @classmethod def from_hdulist(cls, hdulist, name=None, format="gadf"): """Create map dataset from list of HDUs. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` List of HDUs. name : str, optional Name of the new dataset. Default is None. format : {"gadf"} Format the hdulist is given in. Default is "gadf". Returns ------- dataset : `MapDatasetOnOff` Map dataset. 
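
Examples
--------
Hypothetical round trip through FITS HDUs; the file name here is
illustrative only:

>>> from astropy.io import fits
>>> hdulist = fits.open("dataset-onoff.fits")  # doctest: +SKIP
>>> dataset = MapDatasetOnOff.from_hdulist(hdulist)  # doctest: +SKIP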
""" kwargs = {} kwargs["name"] = name if "COUNTS" in hdulist: kwargs["counts"] = Map.from_hdulist(hdulist, hdu="counts", format=format) if "COUNTS_OFF" in hdulist: kwargs["counts_off"] = Map.from_hdulist( hdulist, hdu="counts_off", format=format ) if "ACCEPTANCE" in hdulist: kwargs["acceptance"] = Map.from_hdulist( hdulist, hdu="acceptance", format=format ) if "ACCEPTANCE_OFF" in hdulist: kwargs["acceptance_off"] = Map.from_hdulist( hdulist, hdu="acceptance_off", format=format ) if "EXPOSURE" in hdulist: kwargs["exposure"] = Map.from_hdulist( hdulist, hdu="exposure", format=format ) if "EDISP" in hdulist: edisp_map = Map.from_hdulist(hdulist, hdu="edisp", format=format) try: exposure_map = Map.from_hdulist( hdulist, hdu="edisp_exposure", format=format ) except KeyError: exposure_map = None if edisp_map.geom.axes[0].name == "energy": kwargs["edisp"] = EDispKernelMap(edisp_map, exposure_map) else: kwargs["edisp"] = EDispMap(edisp_map, exposure_map) if "PSF" in hdulist: psf_map = Map.from_hdulist(hdulist, hdu="psf", format=format) try: exposure_map = Map.from_hdulist( hdulist, hdu="psf_exposure", format=format ) except KeyError: exposure_map = None kwargs["psf"] = PSFMap(psf_map, exposure_map) if "MASK_SAFE" in hdulist: mask_safe = Map.from_hdulist(hdulist, hdu="mask_safe", format=format) kwargs["mask_safe"] = mask_safe if "MASK_FIT" in hdulist: mask_fit = Map.from_hdulist(hdulist, hdu="mask_fit", format=format) kwargs["mask_fit"] = mask_fit if "GTI" in hdulist: gti = GTI.from_table_hdu(hdulist["GTI"]) kwargs["gti"] = gti if "META_TABLE" in hdulist: meta_table = Table.read(hdulist, hdu="META_TABLE") kwargs["meta_table"] = meta_table return cls(**kwargs) def info_dict(self, in_safe_data_range=True): """Basic info dict with summary statistics. If a region is passed, then a spectrum dataset is extracted, and the corresponding info returned. Parameters ---------- in_safe_data_range : bool Whether to sum only in the safe energy range. Default is True. Returns ------- info_dict : dict Dictionary with summary info. """ # TODO: remove code duplication with SpectrumDatasetOnOff info = super().info_dict(in_safe_data_range) if self.mask_safe and in_safe_data_range: mask = self.mask_safe.data.astype(bool) else: mask = slice(None) summed_stat = self._counts_statistic[mask].sum() counts_off = 0 if self.counts_off is not None: counts_off = summed_stat.n_off info["counts_off"] = int(counts_off) acceptance = 1 if self.acceptance: acceptance = self.acceptance.data[mask].sum() info["acceptance"] = float(acceptance) acceptance_off = np.nan alpha = np.nan if self.acceptance_off: alpha = summed_stat.alpha acceptance_off = acceptance / alpha info["acceptance_off"] = float(acceptance_off) info["alpha"] = float(alpha) info["stat_sum"] = self.stat_sum() return info def to_spectrum_dataset(self, on_region, containment_correction=False, name=None): """Return a ~gammapy.datasets.SpectrumDatasetOnOff from on_region. Counts and OFF counts are summed in the on_region. Acceptance is the average of all acceptances while acceptance OFF is taken such that number of excess is preserved in the on_region. Effective area is taken from the average exposure. The energy dispersion kernel is obtained at the on_region center. Only regions with centers are supported. The models are not exported to the ~gammapy.dataset.SpectrumDatasetOnOff. It must be set after the dataset extraction. Parameters ---------- on_region : `~regions.SkyRegion` The input ON region on which to extract the spectrum. 
containment_correction : bool Apply containment correction for point sources and circular on regions. Default is False. name : str, optional Name of the new dataset. Default is None. Returns ------- dataset : `~gammapy.datasets.SpectrumDatasetOnOff` The resulting reduced dataset. """ from .spectrum import SpectrumDatasetOnOff dataset = super().to_spectrum_dataset( on_region=on_region, containment_correction=containment_correction, name=name, ) kwargs = {"name": name} if self.counts_off is not None: kwargs["counts_off"] = self.counts_off.get_spectrum( on_region, np.sum, weights=self.mask_safe ) if self.acceptance is not None: kwargs["acceptance"] = self.acceptance.get_spectrum( on_region, np.mean, weights=self.mask_safe ) norm = self.background.get_spectrum( on_region, np.sum, weights=self.mask_safe ) acceptance_off = kwargs["acceptance"] * kwargs["counts_off"] / norm np.nan_to_num(acceptance_off.data, copy=False) kwargs["acceptance_off"] = acceptance_off return SpectrumDatasetOnOff.from_spectrum_dataset(dataset=dataset, **kwargs) def cutout(self, position, width, mode="trim", name=None): """Cutout map dataset. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. width : tuple of `~astropy.coordinates.Angle` Angular sizes of the region in (lon, lat) in that specific order. If only one value is passed, a square region is extracted. mode : {'trim', 'partial', 'strict'} Mode option for Cutout2D, for details see `~astropy.nddata.utils.Cutout2D`. Default is "trim". name : str, optional Name of the new dataset. Default is None. Returns ------- cutout : `MapDatasetOnOff` Cutout map dataset. """ cutout_kwargs = { "position": position, "width": width, "mode": mode, "name": name, } cutout_dataset = super().cutout(**cutout_kwargs) del cutout_kwargs["name"] if self.counts_off is not None: cutout_dataset.counts_off = self.counts_off.cutout(**cutout_kwargs) if self.acceptance is not None: cutout_dataset.acceptance = self.acceptance.cutout(**cutout_kwargs) if self.acceptance_off is not None: cutout_dataset.acceptance_off = self.acceptance_off.cutout(**cutout_kwargs) return cutout_dataset def downsample(self, factor, axis_name=None, name=None): """Downsample map dataset. The PSFMap and EDispKernelMap are not downsampled, except if a corresponding axis is given. Parameters ---------- factor : int Downsampling factor. axis_name : str, optional Which non-spatial axis to downsample. By default, only spatial axes are downsampled. Default is None. name : str, optional Name of the downsampled dataset. Default is None. Returns ------- dataset : `MapDatasetOnOff` Downsampled map dataset. """ dataset = super().downsample(factor, axis_name, name) counts_off = None if self.counts_off is not None: counts_off = self.counts_off.downsample( factor=factor, preserve_counts=True, axis_name=axis_name, weights=self.mask_safe, ) acceptance, acceptance_off = None, None if self.acceptance_off is not None: acceptance = self.acceptance.downsample( factor=factor, preserve_counts=False, axis_name=axis_name ) norm_factor = self.background.downsample( factor=factor, preserve_counts=True, axis_name=axis_name, weights=self.mask_safe, ) acceptance_off = acceptance * counts_off / norm_factor return self.__class__.from_map_dataset( dataset, acceptance=acceptance, acceptance_off=acceptance_off, counts_off=counts_off, ) def pad(self): """Not implemented for MapDatasetOnOff.""" raise NotImplementedError def slice_by_idx(self, slices, name=None): """Slice sub dataset.
The slicing only applies to the maps that define the corresponding axes. Parameters ---------- slices : dict Dictionary of axes names and integers or `slice` object pairs. Contains one element for each non-spatial dimension. For integer indexing the corresponding axes is dropped from the map. Axes not specified in the dict are kept unchanged. name : str, optional Name of the sliced dataset. Default is None. Returns ------- dataset : `MapDatasetOnOff` Sliced dataset. """ kwargs = {"name": name} dataset = super().slice_by_idx(slices, name) if self.counts_off is not None: kwargs["counts_off"] = self.counts_off.slice_by_idx(slices=slices) if self.acceptance is not None: kwargs["acceptance"] = self.acceptance.slice_by_idx(slices=slices) if self.acceptance_off is not None: kwargs["acceptance_off"] = self.acceptance_off.slice_by_idx(slices=slices) return self.from_map_dataset(dataset, **kwargs) def resample_energy_axis(self, energy_axis, name=None): """Resample MapDatasetOnOff over reconstructed energy edges. Counts are summed taking into account safe mask. Parameters ---------- energy_axis : `~gammapy.maps.MapAxis` New reco energy axis. name : str, optional Name of the new dataset. Default is None. Returns ------- dataset : `MapDatasetOnOff` Resampled dataset. """ dataset = super().resample_energy_axis(energy_axis, name) counts_off = None if self.counts_off is not None: counts_off = self.counts_off counts_off = counts_off.resample_axis( axis=energy_axis, weights=self.mask_safe ) acceptance = 1 acceptance_off = None if self.acceptance is not None: acceptance = self.acceptance acceptance = acceptance.resample_axis( axis=energy_axis, weights=self.mask_safe ) norm_factor = self.background.resample_axis( axis=energy_axis, weights=self.mask_safe ) acceptance_off = acceptance * counts_off / norm_factor return self.__class__.from_map_dataset( dataset, acceptance=acceptance, acceptance_off=acceptance_off, counts_off=counts_off, name=name, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/metadata.py0000644000175100001770000000625114721316200020012 0ustar00runnerdockerfrom typing import ClassVar, Literal, Optional, Union import numpy as np from astropy.coordinates import SkyCoord from pydantic import ConfigDict from gammapy.utils.metadata import ( METADATA_FITS_KEYS, CreatorMetaData, MetaData, ObsInfoMetaData, PointingInfoMetaData, ) __all__ = ["MapDatasetMetaData"] MapDataset_METADATA_FITS_KEYS = { "MapDataset": { "event_types": "EVT_TYPE", "optional": "OPTIONAL", }, } METADATA_FITS_KEYS.update(MapDataset_METADATA_FITS_KEYS) class MapDatasetMetaData(MetaData): """Metadata containing information about the Dataset. Parameters ---------- creation : `~gammapy.utils.CreatorMetaData`, optional The creation metadata. obs_info : list of `~gammapy.utils.ObsInfoMetaData` Info about the observation. event_type : str or list of str Event types used in analysis. pointing : list of `~gammapy.utils.PointingInfoMetaData` Telescope pointing directions. optional : dict Additional optional metadata.
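
Examples
--------
Minimal construction sketch; the field values are illustrative:

>>> from gammapy.datasets.metadata import MapDatasetMetaData
>>> meta = MapDatasetMetaData(event_type="standard", optional={"note": "simulated"})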
""" model_config = ConfigDict(coerce_numbers_to_str=True) _tag: ClassVar[Literal["MapDataset"]] = "MapDataset" creation: Optional[CreatorMetaData] = CreatorMetaData() obs_info: Optional[Union[ObsInfoMetaData, list[ObsInfoMetaData]]] = None pointing: Optional[Union[PointingInfoMetaData, list[PointingInfoMetaData]]] = None event_type: Optional[Union[str, list[str]]] = None optional: Optional[dict] = None def stack(self, other): kwargs = {} kwargs["creation"] = self.creation return self.__class__(**kwargs) @classmethod def _from_meta_table(cls, table): """Create MapDatasetMetaData from MapDataset.meta_table Parameters ---------- table: `~astropy.table.Table` """ kwargs = {} kwargs["creation"] = CreatorMetaData() telescope = np.atleast_1d(table["TELESCOP"].data[0]) obs_id = np.atleast_1d(table["OBS_ID"].data[0].astype(str)) observation_mode = np.atleast_1d(table["OBS_MODE"].data[0]) obs_info = [] for i in range(len(obs_id)): obs_meta = ObsInfoMetaData( **{ "telescope": telescope[i], "obs_id": obs_id[i], "observation_mode": observation_mode[i], } ) obs_info.append(obs_meta) kwargs["obs_info"] = obs_info pointing_radec, pointing_altaz = None, None if "RA_PNT" in table.colnames: pointing_radec = SkyCoord( ra=table["RA_PNT"].data[0], dec=table["DEC_PNT"].data[0], unit="deg" ) if "ALT_PNT" in table.colnames: pointing_altaz = SkyCoord( alt=table["ALT_PNT"].data[0], az=table["AZ_PNT"].data[0], unit="deg", frame="altaz", ) pointings = [] for pra, paz in zip( np.atleast_1d(pointing_radec), np.atleast_1d(pointing_altaz) ): pointings.append(PointingInfoMetaData(radec_mean=pra, altaz_mean=paz)) kwargs["pointing"] = pointings return cls(**kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/simulate.py0000644000175100001770000005641714721316200020066 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Simulate observations.""" import html import logging from copy import deepcopy import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord, SkyOffsetFrame from astropy.table import Table from astropy.time import Time from gammapy import __version__ from gammapy.data import EventList, observatory_locations from gammapy.maps import MapAxis, MapCoord, RegionNDMap, TimeMapAxis from gammapy.modeling.models import ( ConstantSpectralModel, ConstantTemporalModel, PointSpatialModel, ) from gammapy.utils.fits import earth_location_to_dict from gammapy.utils.random import get_random_state from .map import create_map_dataset_from_observation __all__ = ["MapDatasetEventSampler", "ObservationEventSampler"] log = logging.getLogger(__name__) class MapDatasetEventSampler: """Sample events from a map dataset. Parameters ---------- random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation via the `~gammapy.utils.random.get_random_state` function. oversample_energy_factor : int, optional Defines an oversampling factor for the energies; it is used only when sampling an energy-dependent time-varying source. t_delta : `~astropy.units.Quantity`, optional Time interval used to sample the time-dependent source. keep_mc_id : bool, optional Flag to tag sampled events from a given model with a Montecarlo identifier. Default is True. If set to False, no identifier will be assigned. n_event_bunch : int Size of events bunches to sample. If None, sample all events in memory. Default is 10000. 
""" def __init__( self, random_state="random-seed", oversample_energy_factor=10, t_delta=0.5 * u.s, keep_mc_id=True, n_event_bunch=10000, ): self.random_state = get_random_state(random_state) self.oversample_energy_factor = oversample_energy_factor self.t_delta = t_delta self.keep_mc_id = keep_mc_id self.n_event_bunch = n_event_bunch def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def _make_table(self, coords, time_ref): """Create a table for sampled events. Parameters ---------- coords : `~gammapy.maps.MapCoord` Coordinates of the sampled events. time_ref : `~astropy.time.Time` Reference time of the event list. Returns ------- table : `~astropy.table.Table` Table of the sampled events. """ table = Table() try: energy = coords["energy_true"] except KeyError: energy = coords["energy"] table["TIME"] = (coords["time"] - time_ref).to("s") table["ENERGY_TRUE"] = energy table["RA_TRUE"] = coords.skycoord.icrs.ra.to("deg") table["DEC_TRUE"] = coords.skycoord.icrs.dec.to("deg") return table def _evaluate_timevar_source( self, dataset, model, ): """Calculate Npred for a given `dataset.model` by evaluating it on a region geometry. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Map dataset. model : `~gammapy.modeling.models.SkyModel` Sky model instance. Returns ------- npred : `~gammapy.maps.RegionNDMap` Npred map. """ energy_true = dataset.edisp.edisp_map.geom.axes["energy_true"] energy_new = energy_true.upsample(self.oversample_energy_factor) position = model.spatial_model.position region_exposure = dataset.exposure.to_region_nd_map(position) time_axis_eval = TimeMapAxis.from_gti_bounds( gti=dataset.gti, t_delta=self.t_delta ) time_min, time_max = time_axis_eval.time_bounds time_axis = MapAxis.from_bounds( time_min.mjd * u.d, time_max.mjd * u.d, nbin=time_axis_eval.nbin, name="time", ) temp_eval = model.temporal_model.evaluate( time_axis_eval.time_mid, energy=energy_new.center ) norm = model.spectral_model(energy=energy_new.center) if temp_eval.unit.is_equivalent(norm.unit): flux_diff = temp_eval.to(norm.unit) * norm.value[:, None] else: flux_diff = temp_eval * norm[:, None] flux_inte = flux_diff * energy_new.bin_width[:, None] flux_pred = RegionNDMap.create( region=position, axes=[time_axis, energy_new], data=np.array(flux_inte), unit=flux_inte.unit, ) mapcoord = flux_pred.geom.get_coord(sparse=True) mapcoord["energy_true"] = energy_true.center[:, None, None, None] flux_values = flux_pred.interp_by_coord(mapcoord) * flux_pred.unit data = flux_values * region_exposure.quantity[:, None, :, :] data /= time_axis.nbin / self.oversample_energy_factor npred = RegionNDMap.create( region=position, axes=[time_axis, energy_true], data=data.to_value(""), ) return npred def _sample_coord_time_energy(self, dataset, model): """Sample model components of a source with time-dependent spectrum. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Map dataset. model : `~gammapy.modeling.models.SkyModel` Sky model instance. Returns ------- table : `~astropy.table.Table` Table of sampled events. """ if not isinstance(model.spatial_model, PointSpatialModel): raise TypeError( f"Event sampler expects PointSpatialModel for a time varying source. Got {model.spatial_model} instead." ) if not isinstance(model.spectral_model, ConstantSpectralModel): raise TypeError( f"Event sampler expects ConstantSpectralModel for a time varying source. Got {model.spectral_model} instead." ) npred = self._evaluate_timevar_source(dataset, model=model) data = npred.data[np.isfinite(npred.data)] data = np.clip(data, 0, None) try: n_events = self.random_state.poisson(np.sum(data)) except ValueError: raise ValueError( f"The number of predicted events for the model {model.name} is too large. No event sampling will be performed for this model!" 
) coords = npred.sample_coord(n_events=n_events, random_state=self.random_state) coords["time"] = Time( coords["time"], format="mjd", scale=dataset.gti.time_ref.scale ) table = self._make_table(coords, dataset.gti.time_ref) return table def _sample_coord_time(self, npred, temporal_model, gti): """Sample model components of a time-varying source. Parameters ---------- npred : `~gammapy.maps.Map` Npred map. temporal_model : `~gammapy.modeling.models.TemporalModel` Temporal model of the source. gti : `~gammapy.data.GTI` GTI of the dataset. Returns ------- table : `~astropy.table.Table` Table of sampled events. """ data = npred.data[np.isfinite(npred.data)] data = np.clip(data, 0, None) n_events = self.random_state.poisson(np.sum(data)) coords = npred.sample_coord(n_events=n_events, random_state=self.random_state) time_start, time_stop, time_ref = (gti.time_start, gti.time_stop, gti.time_ref) coords["time"] = temporal_model.sample_time( n_events=n_events, t_min=time_start, t_max=time_stop, random_state=self.random_state, t_delta=self.t_delta, ) table = self._make_table(coords, time_ref) return table def sample_sources(self, dataset, psf_update=False): """Sample source model components. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Map dataset. psf_update : bool Whether to update the PSF in the dataset. Default is False. Returns ------- events : `~gammapy.data.EventList` Event list. """ if psf_update is True: psf_update = dataset.psf else: psf_update = None events_all = EventList(Table()) for idx, evaluator in enumerate(dataset.evaluators.values()): log.info(f"Evaluating model: {evaluator.model.name}") if evaluator.needs_update: evaluator.update( dataset.exposure, psf_update, dataset.edisp, dataset._geom, dataset.mask, ) if not evaluator.contributes: continue if evaluator.model.temporal_model is None: temporal_model = ConstantTemporalModel() else: temporal_model = evaluator.model.temporal_model if temporal_model.is_energy_dependent: table = self._sample_coord_time_energy(dataset, evaluator.model) else: flux = evaluator.compute_flux() npred = evaluator.apply_exposure(flux) table = self._sample_coord_time(npred, temporal_model, dataset.gti) if self.keep_mc_id: if len(table) == 0: mcid = table.Column(name="MC_ID", length=0, dtype=int) table.add_column(mcid) table["MC_ID"] = idx + 1 table.meta["MID{:05d}".format(idx + 1)] = idx + 1 table.meta["MMN{:05d}".format(idx + 1)] = evaluator.model.name events_all.stack(EventList(table)) return events_all def sample_background(self, dataset): """Sample background. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Map dataset. Returns ------- events : `~gammapy.data.EventList` Background events. """ table = Table() if dataset.background: log.info("Evaluating background...") background = dataset.npred_background() temporal_model = ConstantTemporalModel() table = self._sample_coord_time(background, temporal_model, dataset.gti) table["ENERGY"] = table["ENERGY_TRUE"] table["RA"] = table["RA_TRUE"] table["DEC"] = table["DEC_TRUE"] if self.keep_mc_id: table["MC_ID"] = 0 table.meta["MID{:05d}".format(0)] = 0 table.meta["MMN{:05d}".format(0)] = dataset.background_model.name return EventList(table) def sample_edisp(self, edisp_map, events): """Sample energy dispersion map. Parameters ---------- edisp_map : `~gammapy.irf.EDispMap` Energy dispersion map. events : `~gammapy.data.EventList` Event list with the true energies. Returns ------- events : `~gammapy.data.EventList` Event list with reconstructed energy column.
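
Examples
--------
Sketch; ``dataset`` and ``events`` are assumed to already exist (see `run`
for the full chain):

>>> sampler = MapDatasetEventSampler(random_state=0)
>>> events = sampler.sample_edisp(dataset.edisp, events)  # doctest: +SKIP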
""" coord = MapCoord( { "lon": events.table["RA_TRUE"].quantity, "lat": events.table["DEC_TRUE"].quantity, "energy_true": events.table["ENERGY_TRUE"].quantity, }, frame="icrs", ) coords_reco = edisp_map.sample_coord( coord, self.random_state, self.n_event_bunch ) events.table["ENERGY"] = coords_reco["energy"] return events def sample_psf(self, psf_map, events): """Sample PSF map. Parameters ---------- psf_map : `~gammapy.irf.PSFMap` PSF map. events : `~gammapy.data.EventList` Event list. Returns ------- events : `~gammapy.data.EventList` Event list with reconstructed position columns. """ coord = MapCoord( { "lon": events.table["RA_TRUE"].quantity, "lat": events.table["DEC_TRUE"].quantity, "energy_true": events.table["ENERGY_TRUE"].quantity, }, frame="icrs", ) coords_reco = psf_map.sample_coord(coord, self.random_state, self.n_event_bunch) events.table["RA"] = coords_reco["lon"] * u.deg events.table["DEC"] = coords_reco["lat"] * u.deg return events @staticmethod def event_det_coords(observation, events): """Add columns of detector coordinates (DETX-DETY) to the event list. Parameters ---------- observation : `~gammapy.data.Observation` In memory observation. events : `~gammapy.data.EventList` Event list. Returns ------- events : `~gammapy.data.EventList` Event list with columns of event detector coordinates. """ sky_coord = SkyCoord(events.table["RA"], events.table["DEC"], frame="icrs") frame = SkyOffsetFrame(origin=observation.get_pointing_icrs(observation.tmid)) pseudo_fov_coord = sky_coord.transform_to(frame) events.table["DETX"] = pseudo_fov_coord.lon events.table["DETY"] = pseudo_fov_coord.lat return events @staticmethod def event_list_meta(dataset, observation, keep_mc_id=True): """Event list meta info. Please, note that this function will be updated in the future. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Map dataset. observation : `~gammapy.data.Observation` In memory observation. keep_mc_id : bool Flag to tag sampled events from a given model with a Montecarlo identifier. Default is True. If set to False, no identifier will be assigned. Returns ------- meta : dict Meta dictionary. 
""" # See: https://gamma-astro-data-formats.readthedocs.io/en/latest/events/events.html#mandatory-header-keywords # noqa: E501 meta = {} meta["HDUCLAS1"] = "EVENTS" meta["EXTNAME"] = "EVENTS" meta["HDUDOC"] = ( "https://github.com/open-gamma-ray-astro/gamma-astro-data-formats" ) meta["HDUVERS"] = "0.2" meta["HDUCLASS"] = "GADF" meta["OBS_ID"] = observation.obs_id meta["TSTART"] = (observation.tstart - dataset.gti.time_ref).to_value("s") meta["TSTOP"] = (observation.tstop - dataset.gti.time_ref).to_value("s") meta["ONTIME"] = observation.observation_time_duration.to("s").value meta["LIVETIME"] = observation.observation_live_time_duration.to("s").value meta["DEADC"] = 1 - observation.observation_dead_time_fraction fixed_icrs = observation.pointing.fixed_icrs meta["RA_PNT"] = fixed_icrs.ra.deg meta["DEC_PNT"] = fixed_icrs.dec.deg meta["EQUINOX"] = "J2000" meta["RADECSYS"] = "icrs" meta["CREATOR"] = "Gammapy {}".format(__version__) meta["EUNIT"] = "TeV" meta["EVTVER"] = "" meta["OBSERVER"] = "Gammapy user" meta["DSTYP1"] = "TIME" meta["DSUNI1"] = "s" meta["DSVAL1"] = "TABLE" meta["DSREF1"] = ":GTI" meta["DSTYP2"] = "ENERGY" meta["DSUNI2"] = "TeV" meta["DSVAL2"] = ( f'{dataset._geom.axes["energy"].edges.min().value}:{dataset._geom.axes["energy"].edges.max().value}' # noqa: E501 ) meta["DSTYP3"] = "POS(RA,DEC) " offset_max = np.max(dataset._geom.width).to_value("deg") meta["DSVAL3"] = ( f"CIRCLE({fixed_icrs.ra.deg},{fixed_icrs.dec.deg},{offset_max})" # noqa: E501 ) meta["DSUNI3"] = "deg " meta["NDSKEYS"] = " 3 " # get first non background model component for model in dataset.models: if model is not dataset.background_model: break else: model = None if model: meta["OBJECT"] = model.name meta["RA_OBJ"] = model.position.icrs.ra.deg meta["DEC_OBJ"] = model.position.icrs.dec.deg meta["TELAPSE"] = dataset.gti.time_sum.to("s").value meta["MJDREFI"] = int(dataset.gti.time_ref.mjd) meta["MJDREFF"] = dataset.gti.time_ref.mjd % 1 meta["TIMEUNIT"] = "s" meta["TIMESYS"] = dataset.gti.time_ref.scale meta["TIMEREF"] = "LOCAL" meta["DATE-OBS"] = dataset.gti.time_start.isot[0][0:10] meta["DATE-END"] = dataset.gti.time_stop.isot[0][0:10] meta["CONV_DEP"] = 0 meta["CONV_RA"] = 0 meta["CONV_DEC"] = 0 if keep_mc_id: meta["NMCIDS"] = len(dataset.models) # Necessary for DataStore, but they should be ALT and AZ instead! telescope = observation.aeff.meta["TELESCOP"] instrument = observation.aeff.meta["INSTRUME"] loc = observation.observatory_earth_location if loc is None: if telescope == "CTA": if instrument == "Southern Array": loc = observatory_locations["cta_south"] elif instrument == "Northern Array": loc = observatory_locations["cta_north"] else: loc = observatory_locations["cta_south"] else: loc = observatory_locations[telescope.lower()] meta.update(earth_location_to_dict(loc)) # this is not really correct but maybe OK for now coord_altaz = observation.pointing.get_altaz(dataset.gti.time_start, loc) meta["ALT_PNT"] = coord_altaz.alt.deg[0] meta["AZ_PNT"] = coord_altaz.az.deg[0] # TO DO: these keywords should be taken from the IRF of the dataset meta["ORIGIN"] = "Gammapy" meta["TELESCOP"] = observation.aeff.meta["TELESCOP"] meta["INSTRUME"] = observation.aeff.meta["INSTRUME"] meta["N_TELS"] = "" meta["TELLIST"] = "" meta["CREATED"] = Time.now().iso meta["OBS_MODE"] = "POINTING" meta["EV_CLASS"] = "" return meta def run(self, dataset, observation=None): """Run the event sampler, applying IRF corrections. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Map dataset. 
observation : `~gammapy.data.Observation`, optional In memory observation. Default is None. Returns ------- events : `~gammapy.data.EventList` Event list. """ events_src = self.sample_sources(dataset) if len(events_src.table) > 0: if dataset.psf: events_src = self.sample_psf(dataset.psf, events_src) else: events_src.table["RA"] = events_src.table["RA_TRUE"] events_src.table["DEC"] = events_src.table["DEC_TRUE"] if dataset.edisp: events_src = self.sample_edisp(dataset.edisp, events_src) else: events_src.table["ENERGY"] = events_src.table["ENERGY_TRUE"] events_bkg = self.sample_background(dataset) events = EventList.from_stack([events_bkg, events_src]) events.table["EVENT_ID"] = np.arange(len(events.table)) if observation is not None: events = self.event_det_coords(observation, events) events.table.meta.update( self.event_list_meta(dataset, observation, self.keep_mc_id) ) sort_by_time = np.argsort(events.table["TIME"]) events.table = events.table[sort_by_time] geom = dataset._geom selection = geom.contains(events.map_coord(geom)) log.info("Event sampling completed.") return events.select_row_subset(selection) class ObservationEventSampler(MapDatasetEventSampler): """ Sample event lists for a given observation and signal models. Signal events are sampled from the predicted counts distribution given by the product of the sky models and the expected exposure. They are then folded with the instrument response functions. To improve performance, IRFs are evaluated on a pre-defined binning, not at each individual event energy / coordinate. Parameters ---------- random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation via the `~gammapy.utils.random.get_random_state` function. oversample_energy_factor : int, optional Defines an oversampling factor for the energies; it is used only when sampling an energy-dependent time-varying source. Default is 10. t_delta : `~astropy.units.Quantity`, optional Time interval used to sample the time-dependent source. Default is 0.5 s. keep_mc_id : bool, optional Flag to tag sampled events from a given model with a Montecarlo identifier. Default is True. If set to False, no identifier will be assigned. n_event_bunch : int Size of events bunches to sample. If None, sample all events in memory. Default is 10000. dataset_kwargs : dict, optional Arguments passed to `~gammapy.datasets.create_map_dataset_from_observation()` """ def __init__( self, random_state="random-seed", oversample_energy_factor=10, t_delta=0.5 * u.s, keep_mc_id=True, n_event_bunch=10000, dataset_kwargs=None, ): self.dataset_kwargs = dataset_kwargs or {} self.random_state = get_random_state(random_state) self.oversample_energy_factor = oversample_energy_factor self.t_delta = t_delta self.n_event_bunch = n_event_bunch self.keep_mc_id = keep_mc_id def run(self, observation, models=None, dataset_name=None): """Sample events for given observation and signal models. The signal distribution is sampled from the given models in true coordinates and energy. The true quantities are then folded with the IRFs to obtain the observable quantities. Parameters ---------- observation : `~gammapy.data.Observation` Observation to be simulated. models : `~gammapy.modeling.Models`, optional Models to simulate. Can be None to only sample background events. Default is None. dataset_name : str, optional If `models` contains one or multiple `FoVBackgroundModel` it should match the `dataset_name` of the background model to use. Default is None. 
Returns ------- observation : `~gammapy.data.Observation` A copy of the input observation with event list filled. """ dataset = create_map_dataset_from_observation( observation, models, dataset_name, **self.dataset_kwargs ) events = super().run(dataset, observation) observation = deepcopy(observation) observation._events = events return observation ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/spectrum.py0000644000175100001770000003716714721316200020106 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import matplotlib.pyplot as plt from matplotlib.gridspec import GridSpec from gammapy.utils.scripts import make_path from .map import MapDataset, MapDatasetOnOff from .utils import get_axes __all__ = ["SpectrumDatasetOnOff", "SpectrumDataset"] log = logging.getLogger(__name__) class PlotMixin: """Plot mixin for the spectral datasets.""" def plot_fit( self, ax_spectrum=None, ax_residuals=None, kwargs_spectrum=None, kwargs_residuals=None, ): """Plot spectrum and residuals in two panels. Calls `~SpectrumDataset.plot_excess` and `~SpectrumDataset.plot_residuals_spectral`. Parameters ---------- ax_spectrum : `~matplotlib.axes.Axes`, optional Axes to plot spectrum on. Default is None. ax_residuals : `~matplotlib.axes.Axes`, optional Axes to plot residuals on. Default is None. kwargs_spectrum : dict, optional Keyword arguments passed to `~SpectrumDataset.plot_excess`. Default is None. kwargs_residuals : dict, optional Keyword arguments passed to `~SpectrumDataset.plot_residuals_spectral`. Default is None. Returns ------- ax_spectrum, ax_residuals : `~matplotlib.axes.Axes` Spectrum and residuals plots. Examples -------- >>> #Creating a spectral dataset >>> from gammapy.datasets import SpectrumDatasetOnOff >>> from gammapy.modeling.models import PowerLawSpectralModel, SkyModel >>> filename = "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits" >>> dataset = SpectrumDatasetOnOff.read(filename) >>> p = PowerLawSpectralModel() >>> dataset.models = SkyModel(spectral_model=p) >>> # optional configurations >>> kwargs_excess = {"color": "blue", "markersize":8, "marker":'s', } >>> kwargs_npred_signal = {"color": "black", "ls":"--"} >>> kwargs_spectrum = {"kwargs_excess":kwargs_excess, "kwargs_npred_signal":kwargs_npred_signal} >>> kwargs_residuals = {"color": "black", "markersize":4, "marker":'s', } >>> dataset.plot_fit(kwargs_residuals=kwargs_residuals, kwargs_spectrum=kwargs_spectrum) # doctest: +SKIP """ gs = GridSpec(7, 1) bool_visible_xticklabel = not (ax_spectrum is None and ax_residuals is None) ax_spectrum, ax_residuals = get_axes( ax_spectrum, ax_residuals, 8, 7, [gs[:5, :]], [gs[5:, :]], kwargs2={"sharex": ax_spectrum}, ) kwargs_spectrum = kwargs_spectrum or {} kwargs_residuals = kwargs_residuals or {} self.plot_excess(ax_spectrum, **kwargs_spectrum) self.plot_residuals_spectral(ax_residuals, **kwargs_residuals) method = kwargs_residuals.get("method", "diff") label = self._residuals_labels[method] ax_residuals.set_ylabel(f"Residuals\n{label}") plt.setp(ax_spectrum.get_xticklabels(), visible=bool_visible_xticklabel) self.plot_masks(ax=ax_spectrum) self.plot_masks(ax=ax_residuals) return ax_spectrum, ax_residuals def plot_counts( self, ax=None, kwargs_counts=None, kwargs_background=None, **kwargs ): """Plot counts and background. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axes to plot on. Default is None. 
kwargs_counts : dict, optional Keyword arguments passed to `~matplotlib.axes.Axes.hist` for the counts. Default is None. kwargs_background : dict, optional Keyword arguments passed to `~matplotlib.axes.Axes.hist` for the background. Default is None. **kwargs : dict, optional Keyword arguments passed to both `~matplotlib.axes.Axes.hist` calls. Returns ------- ax : `~matplotlib.axes.Axes` Axes object. """ kwargs_counts = kwargs_counts or {} kwargs_background = kwargs_background or {} plot_kwargs = kwargs.copy() plot_kwargs.update(kwargs_counts) plot_kwargs.setdefault("label", "Counts") ax = self.counts.plot_hist(ax=ax, **plot_kwargs) plot_kwargs = kwargs.copy() plot_kwargs.update(kwargs_background) plot_kwargs.setdefault("label", "Background") self.background.plot_hist(ax=ax, **plot_kwargs) ax.legend(numpoints=1) return ax def plot_masks(self, ax=None, kwargs_fit=None, kwargs_safe=None): """Plot safe mask and fit mask. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axes to plot on. Default is None. kwargs_fit : dict, optional Keyword arguments passed to `~RegionNDMap.plot_mask()` for mask fit. Default is None. kwargs_safe : dict, optional Keyword arguments passed to `~RegionNDMap.plot_mask()` for mask safe. Default is None. Returns ------- ax : `~matplotlib.axes.Axes` Axes object. Examples -------- >>> # Reading a spectral dataset >>> from gammapy.datasets import SpectrumDatasetOnOff >>> filename = "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits" >>> dataset = SpectrumDatasetOnOff.read(filename) >>> dataset.mask_fit = dataset.mask_safe.copy() >>> dataset.mask_fit.data[40:46] = False # setting dummy mask_fit for illustration >>> # Plot the masks on top of the counts histogram >>> kwargs_safe = {"color":"green", "alpha":0.2} # optional arguments to configure >>> kwargs_fit = {"color":"pink", "alpha":0.2} >>> ax = dataset.plot_counts() # doctest: +SKIP >>> dataset.plot_masks(ax=ax, kwargs_fit=kwargs_fit, kwargs_safe=kwargs_safe) # doctest: +SKIP """ kwargs_fit = kwargs_fit or {} kwargs_safe = kwargs_safe or {} kwargs_fit.setdefault("label", "Mask fit") kwargs_fit.setdefault("color", "tab:green") kwargs_safe.setdefault("label", "Mask safe") kwargs_safe.setdefault("color", "black") if self.mask_fit: self.mask_fit.plot_mask(ax=ax, **kwargs_fit) if self.mask_safe: self.mask_safe.plot_mask(ax=ax, **kwargs_safe) ax.legend() return ax def plot_excess( self, ax=None, kwargs_excess=None, kwargs_npred_signal=None, **kwargs ): """Plot excess and predicted signal. The error bars are computed with a symmetric assumption on the excess. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axes to plot on. Default is None. kwargs_excess : dict, optional Keyword arguments passed to `~matplotlib.axes.Axes.errorbar` for the excess. Default is None. kwargs_npred_signal : dict, optional Keyword arguments passed to `~matplotlib.axes.Axes.hist` for the predicted signal. Default is None. **kwargs : dict, optional Keyword arguments passed to both plot methods. Returns ------- ax : `~matplotlib.axes.Axes` Axes object.
Examples -------- >>> #Creating a spectral dataset >>> from gammapy.datasets import SpectrumDatasetOnOff >>> from gammapy.modeling.models import PowerLawSpectralModel, SkyModel >>> filename = "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits" >>> dataset = SpectrumDatasetOnOff.read(filename) >>> p = PowerLawSpectralModel() >>> dataset.models = SkyModel(spectral_model=p) >>> #Plot the excess in blue and the npred in black dotted lines >>> kwargs_excess = {"color": "blue", "markersize":8, "marker":'s', } >>> kwargs_npred_signal = {"color": "black", "ls":"--"} >>> dataset.plot_excess(kwargs_excess=kwargs_excess, kwargs_npred_signal=kwargs_npred_signal) # doctest: +SKIP """ kwargs_excess = kwargs_excess or {} kwargs_npred_signal = kwargs_npred_signal or {} # Determine the uncertainty on the excess yerr = self._counts_statistic.error plot_kwargs = kwargs.copy() plot_kwargs.update(kwargs_excess) plot_kwargs.setdefault("label", "Excess counts") ax = self.excess.plot(ax, yerr=yerr, **plot_kwargs) plot_kwargs = kwargs.copy() plot_kwargs.update(kwargs_npred_signal) plot_kwargs.setdefault("label", "Predicted signal counts") self.npred_signal().plot_hist(ax, **plot_kwargs) ax.legend(numpoints=1) return ax def peek(self, figsize=(16, 4)): """Quick-look summary plots. Parameters ---------- figsize : tuple Size of the figure. Default is (16, 4). """ fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=figsize) ax1.set_title("Counts") self.plot_counts(ax1) self.plot_masks(ax=ax1) ax1.legend() ax2.set_title("Exposure") self.exposure.plot(ax2, ls="-", markersize=0, xerr=None) ax3.set_title("Energy Dispersion") if self.edisp is not None: kernel = self.edisp.get_edisp_kernel() kernel.plot_matrix(ax=ax3, add_cbar=True) class SpectrumDataset(PlotMixin, MapDataset): """Main dataset for spectrum fitting (1D analysis). It bundles together binned counts, background, IRFs into `~gammapy.maps.RegionNDMap` (a Map with only one spatial bin). A safe mask and a fit mask can be added to exclude bins during the analysis. If models are assigned to it, it can compute the predicted number of counts and the statistic function, here the Cash statistic (see `~gammapy.stats.cash`). For more information see :ref:`datasets`. """ stat_type = "cash" tag = "SpectrumDataset" def cutout(self, *args, **kwargs): """Not supported for `SpectrumDataset`""" raise NotImplementedError("Method not supported on a spectrum dataset") def plot_residuals_spatial(self, *args, **kwargs): """Not supported for `SpectrumDataset`""" raise NotImplementedError("Method not supported on a spectrum dataset") def to_spectrum_dataset(self, *args, **kwargs): """Not supported for `SpectrumDataset`""" raise NotImplementedError("Already a Spectrum Dataset. Method not supported") class SpectrumDatasetOnOff(PlotMixin, MapDatasetOnOff): """Spectrum dataset for 1D on-off likelihood fitting. It bundles together the binned on and off counts, the binned IRFs as well as the on and off acceptances. A fit mask can be added to exclude bins during the analysis. It uses the Wstat statistic (see `~gammapy.stats.wstat`). For more information see :ref:`datasets`. 
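
Examples
--------
Reading a dataset in the OGIP format, using the same example file as the
plotting methods above:

>>> from gammapy.datasets import SpectrumDatasetOnOff
>>> filename = "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits"
>>> dataset = SpectrumDatasetOnOff.read(filename)  # doctest: +SKIP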
""" stat_type = "wstat" tag = "SpectrumDatasetOnOff" def cutout(self, *args, **kwargs): """Not supported for `SpectrumDatasetOnOff`.""" raise NotImplementedError("Method not supported on a spectrum dataset") def plot_residuals_spatial(self, *args, **kwargs): """Not supported for `SpectrumDatasetOnOff`.""" raise NotImplementedError("Method not supported on a spectrum dataset") @classmethod def read(cls, filename, format="ogip", checksum=False, **kwargs): """Read from file. For OGIP formats, filename is the name of a PHA file. The BKG, ARF, and RMF file names must be set in the PHA header and the files must be present in the same folder. For details, see `OGIPDatasetReader.read`. For the GADF format, a MapDataset serialisation is used. Parameters ---------- filename : `~pathlib.Path` or str OGIP PHA file to read. format : {"ogip", "ogip-sherpa", "gadf"} Format to use. Default is "ogip". checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. kwargs : dict, optional Keyword arguments passed to `MapDataset.read`. """ from .io import OGIPDatasetReader if format == "gadf": return super().read(filename, format="gadf", checksum=checksum, **kwargs) reader = OGIPDatasetReader(filename=filename, checksum=checksum) return reader.read() def write(self, filename, overwrite=False, format="ogip", checksum=False): """Write spectrum dataset on off to file. Can be serialised either as a `MapDataset` with a `RegionGeom` following the GADF specifications, or as per the OGIP format. For OGIP formats specs, see `OGIPDatasetWriter`. Parameters ---------- filename : `~pathlib.Path` or str Filename to write to. overwrite : bool, optional Overwrite existing file. Default is False. format : {"ogip", "ogip-sherpa", "gadf"} Format to use. Default is "ogip". checksum : bool When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. """ from .io import OGIPDatasetWriter if format == "gadf": super().write(filename=filename, overwrite=overwrite, checksum=checksum) elif format in ["ogip", "ogip-sherpa"]: writer = OGIPDatasetWriter( filename=filename, format=format, overwrite=overwrite, checksum=checksum ) writer.write(self) else: raise ValueError(f"{format} is not a valid serialisation format") @classmethod def from_dict(cls, data, **kwargs): """Create spectrum dataset from dict. Reads file from the disk as specified in the dict. Parameters ---------- data : dict Dictionary containing data to create dataset from. Returns ------- dataset : `SpectrumDatasetOnOff` Spectrum dataset on off. """ filename = make_path(data["filename"]) dataset = cls.read(filename=filename) dataset.mask_fit = None return dataset def to_dict(self): """Convert to dict for YAML serialization.""" filename = f"pha_obs{self.name}.fits" return {"name": self.name, "type": self.tag, "filename": filename} @classmethod def from_spectrum_dataset(cls, **kwargs): """Create a SpectrumDatasetOnOff from a `SpectrumDataset` dataset. Parameters ---------- dataset : `SpectrumDataset` Spectrum dataset defining counts, edisp, exposure etc. acceptance : `~numpy.array` or float Relative background efficiency in the on region. acceptance_off : `~numpy.array` or float Relative background efficiency in the off region. counts_off : `~gammapy.maps.RegionNDMap` Off counts spectrum. If the dataset provides a background model, and no off counts are defined. The off counts are deferred from counts_off / alpha. Returns ------- dataset : `SpectrumDatasetOnOff` Spectrum dataset on off. 
""" return cls.from_map_dataset(**kwargs) def to_spectrum_dataset(self, name=None): """Convert a SpectrumDatasetOnOff to a SpectrumDataset. The background model template is taken as alpha*counts_off. Parameters ---------- name : str, optional Name of the new dataset. Default is None. Returns ------- dataset : `SpectrumDataset` SpectrumDataset with Cash statistic. """ return self.to_map_dataset(name=name).to_spectrum_dataset(on_region=None) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.196642 gammapy-1.3/gammapy/datasets/tests/0000755000175100001770000000000014721316215017024 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/tests/__init__.py0000644000175100001770000000010014721316200021116 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/tests/test_datasets.py0000644000175100001770000001506214721316200022243 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import os import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal from astropy.coordinates import SkyCoord from gammapy.datasets import Datasets, SpectrumDatasetOnOff from gammapy.datasets.tests.test_map import get_map_dataset from gammapy.maps import MapAxis, WcsGeom from gammapy.modeling import Fit from gammapy.modeling.models import FoVBackgroundModel, Models, SkyModel from gammapy.modeling.tests.test_fit import MyDataset from gammapy.utils.testing import requires_data @pytest.fixture(scope="session") def datasets(): return Datasets([MyDataset(name="test-1"), MyDataset(name="test-2")]) def test_datasets_init(datasets): # Passing a Python list of `Dataset` objects should work Datasets(list(datasets)) # Passing an existing `Datasets` object should work Datasets(datasets) def test_datasets_types(datasets): assert datasets.is_all_same_type def test_datasets_likelihood(datasets): likelihood = datasets.stat_sum() assert_allclose(likelihood, 14472200.0002) def test_datasets_str(datasets): assert "Datasets" in str(datasets) def test_datasets_getitem(datasets): assert datasets["test-1"].name == "test-1" assert datasets["test-2"].name == "test-2" def test_names(datasets): assert datasets.names == ["test-1", "test-2"] def test_datasets_mutation(): dat = MyDataset(name="test-1") dats = Datasets([MyDataset(name="test-2"), MyDataset(name="test-3")]) dats2 = Datasets([MyDataset(name="test-4"), MyDataset(name="test-5")]) dats.insert(0, dat) assert dats.names == ["test-1", "test-2", "test-3"] dats.extend(dats2) assert dats.names == ["test-1", "test-2", "test-3", "test-4", "test-5"] dat3 = dats[3] dats.remove(dats[3]) assert dats.names == ["test-1", "test-2", "test-3", "test-5"] dats.append(dat3) assert dats.names == ["test-1", "test-2", "test-3", "test-5", "test-4"] dats.pop(3) assert dats.names == ["test-1", "test-2", "test-3", "test-4"] with pytest.raises(ValueError, match="Dataset names must be unique"): dats.append(dat) with pytest.raises(ValueError, match="Dataset names must be unique"): dats.insert(0, dat) with pytest.raises(ValueError, match="Dataset names must be unique"): dats.extend(dats2) @requires_data() def test_datasets_info_table(): datasets_hess = Datasets() for obs_id in [23523, 23526]: dataset = SpectrumDatasetOnOff.read( 
f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits" ) datasets_hess.append(dataset) table = datasets_hess.info_table() assert table["name"][0] == "23523" assert table["name"][1] == "23526" assert np.isnan(table["npred_signal"][0]) assert_equal(table["counts"], [124, 126]) assert_allclose(table["background"], [7.666, 8.583], rtol=1e-3) table_cumul = datasets_hess.info_table(cumulative=True) assert table_cumul["name"][0] == "stacked" assert table_cumul["name"][1] == "stacked" assert np.isnan(table_cumul["npred_signal"][0]) assert table_cumul["alpha"][1] == table_cumul["alpha"][0] assert_equal(table_cumul["counts"], [124, 250]) assert_allclose(table_cumul["background"], [7.666, 16.25], rtol=1e-3) assert table["excess"].sum() == table_cumul["excess"][1] assert table["counts"].sum() == table_cumul["counts"][1] assert table["background"].sum() == table_cumul["background"][1] datasets_hess[0].mask_safe.data = ~datasets_hess[0].mask_safe.data assert datasets_hess.info_table()["counts"][0] == 65 datasets_hess[0].mask_safe.data = np.ones_like( datasets_hess[0].mask_safe.data, dtype=bool ) datasets_hess[1].mask_safe.data = np.ones_like( datasets_hess[1].mask_safe.data, dtype=bool ) assert_equal(datasets_hess.info_table()["counts"], [189, 199]) assert_equal(datasets_hess.info_table(cumulative=True)["counts"], [189, 388]) @requires_data() def test_datasets_write(tmp_path): axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=2) geom = WcsGeom.create( skydir=(266.40498829, -28.93617776), binsz=0.02, width=(2, 2), frame="icrs", axes=[axis], ) axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=3, name="energy_true") geom_etrue = WcsGeom.create( skydir=(266.40498829, -28.93617776), binsz=0.02, width=(2, 2), frame="icrs", axes=[axis], ) dataset_1 = get_map_dataset(geom, geom_etrue, name="test-1") datasets = Datasets([dataset_1]) model = SkyModel.create("pl", "point", name="src") model.spatial_model.position = SkyCoord("266d", "-28.93d", frame="icrs") dataset_1.models = [model] datasets.write( filename=tmp_path / "test", filename_models=tmp_path / "test_model", overwrite=False, ) os.remove(tmp_path / "test-1.fits") with pytest.raises(OSError): datasets.write( filename=tmp_path / "test", filename_models=tmp_path / "test_model", overwrite=False, ) @requires_data() def test_datasets_fit(): axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=2) geom = WcsGeom.create( skydir=(266.40498829, -28.93617776), binsz=0.05, width=(20, 20), frame="icrs", axes=[axis], ) axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=3, name="energy_true") geom_etrue = WcsGeom.create( skydir=(266.40498829, -28.93617776), binsz=0.05, width=(20, 20), frame="icrs", axes=[axis], ) dataset_1 = get_map_dataset(geom, geom_etrue, name="test-1") dataset_1.mask_fit = None dataset_1.background /= 400 dataset_2 = get_map_dataset(geom, geom_etrue, name="test-2") dataset_2.mask_fit = None dataset_2.background /= 400 datasets = Datasets([dataset_1, dataset_2]) model = SkyModel.create("pl", "point", name="src") model.spatial_model.position = dataset_1.exposure.geom.center_skydir model2 = model.copy() model2.spatial_model.lon_0.value += 0.1 model2.spatial_model.lat_0.value += 0.1 models = Models( [ model, model2, FoVBackgroundModel(dataset_name=dataset_1.name), FoVBackgroundModel(dataset_name=dataset_2.name), ] ) datasets.models = models dataset_1.fake() dataset_2.fake() fit = Fit() results = fit.run(datasets) assert_allclose(results.models.covariance.data, datasets.models.covariance.data) 
gammapy-1.3/gammapy/datasets/tests/test_evaluator.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
import numpy as np
from copy import deepcopy
from numpy.testing import assert_allclose
import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import CircleSkyRegion
from gammapy.datasets.evaluator import MapEvaluator
from gammapy.irf import PSFKernel, RecoPSFMap
from gammapy.maps import Map, MapAxis, RegionGeom, RegionNDMap, WcsGeom
from gammapy.modeling.models import (
    ConstantSpectralModel,
    GaussianSpatialModel,
    Models,
    PointSpatialModel,
    PowerLawSpectralModel,
    SkyModel,
)
from gammapy.utils.gauss import Gauss2DPDF
from gammapy.utils.testing import mpl_plot_check


@pytest.fixture
def evaluator():
    center = SkyCoord("0 deg", "0 deg", frame="galactic")
    region = CircleSkyRegion(center=center, radius=0.1 * u.deg)
    nbin = 2
    energy_axis_true = MapAxis.from_energy_bounds(
        ".1 TeV", "10 TeV", nbin=nbin, name="energy_true"
    )
    spectral_model = ConstantSpectralModel()
    spatial_model = PointSpatialModel(
        lon_0=0 * u.deg, lat_0=0 * u.deg, frame="galactic"
    )
    models = SkyModel(spectral_model=spectral_model, spatial_model=spatial_model)
    model = Models(models)
    exposure_region = RegionNDMap.create(
        region, axes=[energy_axis_true], binsz_wcs="0.01deg", unit="m2 s", data=1.0
    )
    geom = RegionGeom(region, axes=[energy_axis_true], binsz_wcs="0.01deg")
    psf = PSFKernel.from_gauss(geom.to_wcs_geom(), sigma="0.1 deg")
    return MapEvaluator(model=model[0], exposure=exposure_region, psf=psf)


def test_compute_flux_spatial(evaluator):
    flux = evaluator.compute_flux_spatial()
    g = Gauss2DPDF(0.1)
    reference = g.containment_fraction(0.1)
    assert_allclose(flux.value, reference, rtol=0.003)


def test_compute_flux_spatial_no_psf(evaluator):
    # check that spatial integration is not performed in the absence of a psf
    evaluator_gaussian = deepcopy(evaluator)
    model = evaluator.model.copy()
    model.spatial_model = GaussianSpatialModel(
        lon_0=0 * u.deg, lat_0=0 * u.deg, sigma=0.001 * u.deg, frame="galactic"
    )
    evaluator_gaussian.model = model
    evaluator_gaussian.model.apply_irf["psf"] = False
    flux = evaluator_gaussian.compute_flux_spatial()
    evaluator_gaussian.model.apply_irf["psf"] = True
    assert_allclose(flux, 1.0)

    evaluator_gaussian.psf = None
    flux = evaluator_gaussian.compute_flux_spatial()
    assert_allclose(flux, 1.0)

    evaluator.model.apply_irf["psf"] = False
    flux = evaluator.compute_flux_spatial()
    evaluator.model.apply_irf["psf"] = True
    assert_allclose(flux, 1.0)

    evaluator.psf = None
    flux = evaluator.compute_flux_spatial()
    assert_allclose(flux, 1.0)


def test_large_oversampling():
    nbin = 2
    energy_axis_true = MapAxis.from_energy_bounds(
        ".1 TeV", "10 TeV", nbin=nbin, name="energy_true"
    )
    geom = WcsGeom.create(width=1, binsz=0.02, axes=[energy_axis_true])
    spectral_model = ConstantSpectralModel()
    spatial_model = GaussianSpatialModel(
        lon_0=0 * u.deg, lat_0=0 * u.deg, sigma=1e-4 * u.deg, frame="icrs"
    )
    models = SkyModel(spectral_model=spectral_model, spatial_model=spatial_model)
    model = Models(models)
    exposure = Map.from_geom(geom, unit="m2 s")
    exposure.data += 1.0
    psf = PSFKernel.from_gauss(geom, sigma="0.1 deg")
    evaluator = MapEvaluator(model=model[0], exposure=exposure, psf=psf)

    flux_1 = evaluator.compute_flux_spatial()
    spatial_model.sigma.value = 0.001
    flux_2 = evaluator.compute_flux_spatial()
    spatial_model.sigma.value = 0.01
    flux_3 = evaluator.compute_flux_spatial()
    spatial_model.sigma.value = 0.03
    flux_4 = evaluator.compute_flux_spatial()

    assert_allclose(flux_1.data.sum(), nbin, rtol=1e-4)
    assert_allclose(flux_2.data.sum(), nbin, rtol=1e-4)
    assert_allclose(flux_3.data.sum(), nbin, rtol=1e-4)
    assert_allclose(flux_4.data.sum(), nbin, rtol=1e-4)


def test_compute_npred_sign():
    center = SkyCoord("0 deg", "0 deg", frame="galactic")
    energy_axis_true = MapAxis.from_energy_bounds(
        ".1 TeV", "10 TeV", nbin=2, name="energy_true"
    )
    geom = WcsGeom.create(
        skydir=center,
        width=1 * u.deg,
        axes=[energy_axis_true],
        frame="galactic",
        binsz=0.2 * u.deg,
    )
    spectral_model_pos = PowerLawSpectralModel(index=2, amplitude="1e-11 TeV-1 s-1 m-2")
    spectral_model_neg = PowerLawSpectralModel(
        index=2, amplitude="-1e-11 TeV-1 s-1 m-2"
    )
    spatial_model = PointSpatialModel(
        lon_0=0 * u.deg, lat_0=0 * u.deg, frame="galactic"
    )
    model_pos = SkyModel(spectral_model=spectral_model_pos, spatial_model=spatial_model)
    model_neg = SkyModel(spectral_model=spectral_model_neg, spatial_model=spatial_model)
    exposure = Map.from_geom(geom, unit="m2 s")
    exposure.data += 1.0
    psf = PSFKernel.from_gauss(geom, sigma="0.1 deg")

    evaluator_pos = MapEvaluator(model=model_pos, exposure=exposure, psf=psf)
    evaluator_neg = MapEvaluator(model=model_neg, exposure=exposure, psf=psf)
    npred_pos = evaluator_pos.compute_npred()
    npred_neg = evaluator_neg.compute_npred()

    assert (npred_pos.data == -npred_neg.data).all()
    assert np.all(npred_pos.data >= 0)
    assert np.all(npred_neg.data <= 0)


def test_psf_reco():
    center = SkyCoord("0 deg", "0 deg", frame="galactic")
    energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3, name="energy")
    geom = WcsGeom.create(
        skydir=center,
        width=1 * u.deg,
        axes=[energy_axis],
        frame="galactic",
        binsz=0.2 * u.deg,
    )
    spectral_model = PowerLawSpectralModel(index=2, amplitude="1e-11 TeV-1 s-1 m-2")
    spatial_model = PointSpatialModel(
        lon_0=0 * u.deg, lat_0=0 * u.deg, frame="galactic"
    )
    model_pos = SkyModel(spectral_model=spectral_model, spatial_model=spatial_model)
    exposure = Map.from_geom(geom.as_energy_true, unit="m2 s")
    exposure.data += 1.0
    psf = PSFKernel.from_gauss(geom, sigma=0.1 * u.deg)

    evaluator = MapEvaluator(model=model_pos, exposure=exposure, psf=psf)
    assert evaluator.apply_psf_after_edisp
    assert evaluator.methods_sequence[-1] == evaluator.apply_psf
    npred = evaluator.compute_npred()
    assert_allclose(npred.data.sum(), 9e-12)

    psf_map = RecoPSFMap.from_gauss(
        energy_axis=energy_axis, sigma=0.1 * u.deg, geom=geom.to_image()
    )
    mask = Map.from_geom(geom, data=True)
    evaluator.psf = None
    evaluator.update(exposure, psf_map, None, geom, mask)
    assert evaluator.apply_psf_after_edisp
    assert evaluator.methods_sequence[-1] == evaluator.apply_psf
    assert_allclose(evaluator.compute_npred().data.sum(), 9e-12)


def test_peek_region_geom(evaluator):
    with mpl_plot_check():
        evaluator.peek()


def test_peek_wcs_geom():
    center = SkyCoord("0 deg", "0 deg", frame="galactic")
    energy_axis_true = MapAxis.from_energy_bounds(
        ".1 TeV", "10 TeV", nbin=2, name="energy_true"
    )
    geom = WcsGeom.create(
        skydir=center,
        width=1 * u.deg,
        axes=[energy_axis_true],
        frame="galactic",
        binsz=0.2 * u.deg,
    )
    spectral_model = PowerLawSpectralModel()
    spatial_model = PointSpatialModel(
        lon_0=0 * u.deg, lat_0=0 * u.deg, frame="galactic"
    )
    model = SkyModel(spectral_model=spectral_model, spatial_model=spatial_model)
    exposure = Map.from_geom(geom, unit="m2 s")
    exposure.data += 1.0
    evaluator = MapEvaluator(model=model, exposure=exposure)
    with mpl_plot_check():
        evaluator.peek()


def test_norm_only_changed():
    center = SkyCoord("0 deg", "0 deg", frame="galactic")
    energy_axis_true = MapAxis.from_energy_bounds(
        ".1 TeV", "10 TeV", nbin=2, name="energy_true"
    )
    geom = WcsGeom.create(
        skydir=center,
        width=1 * u.deg,
        axes=[energy_axis_true],
        frame="galactic",
        binsz=0.2 * u.deg,
    )
    spectral_model = PowerLawSpectralModel(index=2, amplitude="1e-11 TeV-1 s-1 m-2")
    spatial_model = PointSpatialModel(
        lon_0=0 * u.deg, lat_0=0 * u.deg, frame="galactic"
    )
    model = SkyModel(spectral_model=spectral_model, spatial_model=spatial_model)
    exposure = Map.from_geom(geom, unit="m2 s")
    exposure.data += 1.0
    psf = PSFKernel.from_gauss(geom, sigma="0.1 deg")
    evaluator = MapEvaluator(model=model, exposure=exposure, psf=psf)
    _ = evaluator.compute_npred()

    spectral_model.amplitude.value *= 2
    assert evaluator.parameter_norm_only_changed

    spectral_model.index.value *= 2
    assert not evaluator.parameter_norm_only_changed

    spectral_model.amplitude.value *= 2
    spectral_model.index.value *= 2
    assert not evaluator.parameter_norm_only_changed
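
# Illustration (not part of the upstream test suite): the analytic reference
# used in test_compute_flux_spatial above. For a 2D Gaussian PSF of width
# sigma, the flux fraction contained in a circle of radius r is
# 1 - exp(-r**2 / (2 * sigma**2)); Gauss2DPDF evaluates exactly that.
def _sketch_gauss_containment():
    import numpy as np

    from gammapy.utils.gauss import Gauss2DPDF

    g = Gauss2DPDF(sigma=0.1)  # sigma and radius in the same angular units
    assert np.isclose(g.containment_fraction(0.1), 1 - np.exp(-0.5))  # ~0.3935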
"tmp_datasets.yaml", filename_models=tmp_path / "tmp_models.yaml", ) datasets = Datasets.read( filename=tmp_path / "tmp_datasets.yaml", filename_models=tmp_path / "tmp_models.yaml", ) new_dataset = datasets[0] assert_allclose(new_dataset.data.dnde, dataset.data.dnde, 1e-4) if dataset.mask_fit is None: assert np.all(new_dataset.mask_fit == dataset.mask_safe) assert np.all(new_dataset.mask_safe == dataset.mask_safe) assert new_dataset.name == "test_dataset" @requires_data() def test_flux_point_dataset_str(dataset): assert "FluxPointsDataset" in str(dataset) # check print if no models present dataset.models = None assert "FluxPointsDataset" in str(dataset) @requires_data() def test_flux_point_dataset_flux_pred(dataset): assert_allclose(dataset.flux_pred()[0].value, 0.00022766, rtol=1e-2) dataset.models[0].temporal_model = ExpDecayTemporalModel( t0=5.0 * u.hr, t_ref=51543.5 * u.d ) assert_allclose(dataset.flux_pred()[0].value, 0.000472, rtol=1e-3) @requires_data() def test_flux_point_dataset_stat(dataset): dataset.stat_type = "chi2" fit = Fit() fit.run([dataset]) assert_allclose(dataset.stat_sum(), 25.205933, rtol=1e-3) dataset.stat_type = "distrib" fit = Fit() fit.run([dataset]) assert_allclose(dataset.stat_sum(), 36.153428, rtol=1e-3) def test_flux_point_dataset_with_time_axis(tmp_path): meta = dict(TIMESYS="utc", SED_TYPE="flux") table = Table( meta=meta, data=[ Column(Time(["2010-01-01", "2010-01-03"]).mjd, "time_min"), Column(Time(["2010-01-03", "2010-01-10"]).mjd, "time_max"), Column([[1.0, 2.0], [1.0, 2.0]], "e_min", unit="TeV"), Column([[2.0, 4.0], [2.0, 4.0]], "e_max", unit="TeV"), Column([[1e-11, 1e-12], [3e-11, 3e-12]], "flux", unit="cm-2 s-1"), Column([[0.1e-11, 1e-13], [0.3e-11, 3e-13]], "flux_err", unit="cm-2 s-1"), Column([[np.nan, np.nan], [3.6e-11, 3.6e-12]], "flux_ul", unit="cm-2 s-1"), Column([[False, False], [True, True]], "is_ul"), ], ) flux_points = FluxPoints.from_table(table=table) flux_points_dataset = FluxPointsDataset(data=flux_points) temporal_model = ExpDecayTemporalModel() temporal_model.t_ref.value = Time(["2010-01-01"]).mjd temporal_model.t0.quantity = 5.0 * u.hr model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=temporal_model ) flux_points_dataset.models = model assert flux_points_dataset.flux_pred().shape == (2, 2, 1, 1) assert flux_points_dataset.mask.shape == (2, 2, 1, 1) assert_allclose( flux_points_dataset.flux_pred()[0].value, [[[5.21782717e-14]], [[1.30445679e-14]]], rtol=1e-3, ) assert_allclose(flux_points_dataset.stat_sum(), 193.8093, rtol=1e-3) assert_allclose( flux_points_dataset.residuals()[0][0].value, 9.94782173e-12, rtol=1e-5 ) Datasets([flux_points_dataset]).write( filename=tmp_path / "tmp_datasets.yaml", filename_models=tmp_path / "tmp_models.yaml", ) datasets = Datasets.read( filename=tmp_path / "tmp_datasets.yaml", filename_models=tmp_path / "tmp_models.yaml", ) new_dataset = datasets[0] assert_allclose(new_dataset.data.dnde, flux_points_dataset.data.dnde, 1e-4) with mpl_plot_check(): flux_points_dataset.plot_spectrum(axis_name="time") with pytest.raises(ValueError): flux_points_dataset.plot_fit() with pytest.raises(ValueError): flux_points_dataset.plot_residuals() flux_points_dataset.stat_type = "distrib" assert_allclose(flux_points_dataset.stat_sum(), 193.8093, rtol=1e-3) @requires_data() class TestFluxPointFit: def test_fit_pwl_minuit(self, dataset): fit = Fit() result = fit.run(dataset) self.assert_result(result, dataset.models) @requires_dependency("sherpa") def test_fit_pwl_sherpa(self, dataset): fit = 
Fit(backend="sherpa", optimize_opts={"method": "simplex"}) result = fit.optimize(datasets=[dataset]) self.assert_result(result, dataset.models) @staticmethod def assert_result(result, models): assert result.success assert_allclose(result.total_stat, 25.2059, rtol=1e-3) index = models.parameters["index"] assert_allclose(index.value, 2.216, rtol=1e-3) amplitude = models.parameters["amplitude"] assert_allclose(amplitude.value, 2.1616e-13, rtol=1e-3) reference = models.parameters["reference"] assert_allclose(reference.value, 1, rtol=1e-8) @staticmethod def test_stat_profile(dataset): fit = Fit() result = fit.run(datasets=dataset) model = dataset.models[0].spectral_model assert_allclose(model.amplitude.error, 1.9e-14, rtol=1e-2) model.amplitude.scan_n_values = 3 model.amplitude.scan_n_sigma = 1 model.amplitude.interp = "lin" profile = fit.stat_profile( datasets=dataset, parameter="amplitude", ) ts_diff = profile["stat_scan"] - result.total_stat assert_allclose( model.amplitude.scan_values, [1.97e-13, 2.16e-13, 2.35e-13], rtol=1e-2 ) assert_allclose(ts_diff, [110.244116, 0.0, 110.292074], rtol=1e-2, atol=1e-7) value = model.parameters["amplitude"].value err = model.parameters["amplitude"].error model.amplitude.scan_values = np.array([value - err, value, value + err]) profile = fit.stat_profile( datasets=dataset, parameter="amplitude", ) ts_diff = profile["stat_scan"] - result.total_stat assert_allclose( model.amplitude.scan_values, [1.97e-13, 2.16e-13, 2.35e-13], rtol=1e-2 ) assert_allclose(ts_diff, [110.244116, 0.0, 110.292074], rtol=1e-2, atol=1e-7) @staticmethod def test_fp_dataset_plot_fit(dataset): with mpl_plot_check(): dataset.plot_fit(kwargs_residuals=dict(method="diff/model")) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/tests/test_io.py0000644000175100001770000001321614721316200021041 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import warnings import pytest from numpy.testing import assert_allclose import astropy.units as u from regions import CircleSkyRegion from gammapy.datasets import Datasets, SpectrumDataset, SpectrumDatasetOnOff from gammapy.modeling.models import DatasetModels from gammapy.utils.scripts import make_path from gammapy.utils.testing import requires_data @requires_data() def test_datasets_to_io(tmp_path): filedata = "$GAMMAPY_DATA/tests/models/gc_example_datasets.yaml" filemodel = "$GAMMAPY_DATA/tests/models/gc_example_models.yaml" datasets = Datasets.read( filename=filedata, filename_models=filemodel, ) assert len(datasets) == 2 assert len(datasets.models) == 5 dataset0 = datasets[0] assert dataset0.name == "gc" assert dataset0.counts.data.sum() == 22258 assert_allclose(dataset0.exposure.data.sum(), 8.057342e12, atol=0.1) assert dataset0.psf is not None assert dataset0.edisp is not None assert_allclose(dataset0.npred_background().data.sum(), 15726.8, atol=0.1) assert dataset0.background_model.name == "gc-bkg" dataset1 = datasets[1] assert dataset1.name == "g09" assert dataset1.background_model.name == "g09-bkg" assert ( dataset0.models["gll_iem_v06_cutout"] == dataset1.models["gll_iem_v06_cutout"] ) assert isinstance(dataset0.models, DatasetModels) assert len(dataset0.models) == 4 assert dataset0.models[0].name == "gc" assert dataset0.models[1].name == "gll_iem_v06_cutout" assert dataset0.models[2].name == "gc-bkg" assert ( dataset0.models["gc"].parameters["reference"] is dataset1.models["g09"].parameters["reference"] ) 
gammapy-1.3/gammapy/datasets/tests/test_io.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
import warnings
import pytest
from numpy.testing import assert_allclose
import astropy.units as u
from regions import CircleSkyRegion
from gammapy.datasets import Datasets, SpectrumDataset, SpectrumDatasetOnOff
from gammapy.modeling.models import DatasetModels
from gammapy.utils.scripts import make_path
from gammapy.utils.testing import requires_data


@requires_data()
def test_datasets_to_io(tmp_path):
    filedata = "$GAMMAPY_DATA/tests/models/gc_example_datasets.yaml"
    filemodel = "$GAMMAPY_DATA/tests/models/gc_example_models.yaml"

    datasets = Datasets.read(
        filename=filedata,
        filename_models=filemodel,
    )
    assert len(datasets) == 2
    assert len(datasets.models) == 5

    dataset0 = datasets[0]
    assert dataset0.name == "gc"
    assert dataset0.counts.data.sum() == 22258
    assert_allclose(dataset0.exposure.data.sum(), 8.057342e12, atol=0.1)
    assert dataset0.psf is not None
    assert dataset0.edisp is not None
    assert_allclose(dataset0.npred_background().data.sum(), 15726.8, atol=0.1)
    assert dataset0.background_model.name == "gc-bkg"

    dataset1 = datasets[1]
    assert dataset1.name == "g09"
    assert dataset1.background_model.name == "g09-bkg"

    assert (
        dataset0.models["gll_iem_v06_cutout"] == dataset1.models["gll_iem_v06_cutout"]
    )
    assert isinstance(dataset0.models, DatasetModels)
    assert len(dataset0.models) == 4
    assert dataset0.models[0].name == "gc"
    assert dataset0.models[1].name == "gll_iem_v06_cutout"
    assert dataset0.models[2].name == "gc-bkg"
    assert (
        dataset0.models["gc"].parameters["reference"]
        is dataset1.models["g09"].parameters["reference"]
    )
    assert_allclose(dataset1.models["g09"].parameters["lon_0"].value, 0.9, atol=0.1)

    datasets.write(
        filename=tmp_path / "written_datasets.yaml",
        filename_models=tmp_path / "written_models.yaml",
    )
    datasets.write(
        filename=tmp_path / "written_datasets.yaml",
        filename_models=tmp_path / "written_models.yaml",
        overwrite=True,
    )
    datasets_read = Datasets.read(
        filename=tmp_path / "written_datasets.yaml",
        filename_models=tmp_path / "written_models.yaml",
    )
    assert len(datasets.parameters) == 24
    assert len(datasets_read) == 2

    dataset0 = datasets_read[0]
    assert dataset0.name == "gc"
    assert dataset0.counts.data.sum() == 22258
    assert_allclose(dataset0.exposure.data.sum(), 8.057342e12, atol=0.1)
    assert dataset0.psf is not None
    assert dataset0.edisp is not None
    assert_allclose(dataset0.npred_background().data.sum(), 15726.8, atol=0.1)
    assert datasets[1].name == "g09"

    dataset_copy = dataset0.copy(name="dataset0-copy")
    assert dataset_copy.models is None


@requires_data()
def test_spectrum_datasets_to_io(tmp_path):
    filedata = "$GAMMAPY_DATA/tests/models/gc_example_datasets.yaml"
    filemodel = "$GAMMAPY_DATA/tests/models/gc_example_models.yaml"

    datasets = Datasets.read(
        filename=filedata,
        filename_models=filemodel,
    )
    reg = CircleSkyRegion(center=datasets[0]._geom.center_skydir, radius=1.0 * u.deg)
    dataset0 = datasets[0].to_spectrum_dataset(reg)
    datasets1 = Datasets([dataset0, datasets[1]])
    datasets1.write(
        filename=tmp_path / "written_datasets.yaml",
        filename_models=tmp_path / "written_models.yaml",
    )
    datasets_read = Datasets.read(
        filename=tmp_path / "written_datasets.yaml",
        filename_models=tmp_path / "written_models.yaml",
    )
    assert len(datasets_read.parameters) == 21
    assert len(datasets_read) == 2
    assert datasets_read[0].counts.data.sum() == 18429
    assert_allclose(datasets_read[0].exposure.data.sum(), 2.034089e10, atol=0.1)
    assert isinstance(datasets_read[0], SpectrumDataset)


@requires_data()
def test_ogip_writer(tmp_path):
    dataset = SpectrumDatasetOnOff.read(
        "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits"
    )
    dataset.counts_off = None
    datasets = Datasets(dataset)
    datasets.write(tmp_path / "written_datasets.yaml")
    new_datasets = datasets.read(tmp_path / "written_datasets.yaml")
    assert new_datasets[0].counts_off is None


@requires_data()
def test_datasets_write_checksum(tmp_path):
    filedata = "$GAMMAPY_DATA/tests/models/gc_example_datasets.yaml"
    filemodel = "$GAMMAPY_DATA/tests/models/gc_example_models.yaml"

    datasets = Datasets.read(
        filename=filedata,
        filename_models=filemodel,
    )
    filename = tmp_path / "written_datasets.yaml"
    filename_models = tmp_path / "written_models.yaml"
    datasets.write(
        filename=filename,
        filename_models=filename_models,
        checksum=True,
    )
    assert "checksum" in filename.read_text()
    assert "checksum" in filename_models.read_text()

    with warnings.catch_warnings():
        warnings.simplefilter("error")
        Datasets.read(filename=filename, filename_models=filename_models, checksum=True)

    # Remove checksum from datasets.yaml file
    yaml_content = filename.read_text()
    index = yaml_content.find("checksum")
    bad = make_path(tmp_path) / "bad_checksum.yaml"
    bad.write_text(yaml_content[:index])
    with pytest.warns(UserWarning):
        Datasets.read(filename=bad, filename_models=filename_models, checksum=True)

    # Modify models yaml file
    yaml_content = filename_models.read_text()
    yaml_content = yaml_content.replace("name: gc", "name: bad")
    bad = make_path(tmp_path) / "bad_models.yaml"
    bad.write_text(yaml_content)
    with pytest.warns(UserWarning):
        Datasets.read(filename=filename, filename_models=bad, checksum=True)
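
# Self-contained sketch (not upstream code) of the round trip the I/O tests
# above rely on: a Datasets container is serialised to a YAML index (plus one
# FITS file per dataset) and read back. File names are arbitrary examples.
def _sketch_datasets_roundtrip(tmp_path):
    from gammapy.datasets import Datasets, MapDataset
    from gammapy.maps import MapAxis, WcsGeom

    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=2)
    geom = WcsGeom.create(npix=(5, 5), axes=[axis])
    datasets = Datasets([MapDataset.create(geom, name="demo")])
    datasets.write(filename=tmp_path / "demo_datasets.yaml")
    return Datasets.read(filename=tmp_path / "demo_datasets.yaml")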
gammapy-1.3/gammapy/datasets/tests/test_map.py

# Licensed under a 3-clause BSD style license - see LICENSE.rst
import json
import warnings
import pytest
import numpy as np
from numpy.testing import assert_allclose, assert_equal
import astropy.units as u
from astropy.coordinates import SkyCoord
from astropy.io import fits
from astropy.table import Table
from astropy.utils.exceptions import AstropyUserWarning
from regions import CircleSkyRegion
import gammapy.irf.psf.map as psf_map_module
from gammapy.catalog import SourceCatalog3FHL
from gammapy.data import GTI, DataStore
from gammapy.datasets import (
    Datasets,
    MapDataset,
    MapDatasetOnOff,
    create_empty_map_dataset_from_irfs,
    create_map_dataset_from_observation,
)
from gammapy.datasets.map import RAD_AXIS_DEFAULT
from gammapy.irf import (
    EDispKernelMap,
    EDispMap,
    EffectiveAreaTable2D,
    EnergyDependentMultiGaussPSF,
    EnergyDispersion2D,
    PSFKernel,
    PSFMap,
    RecoPSFMap,
)
from gammapy.makers.utils import make_map_exposure_true_energy, make_psf_map
from gammapy.maps import (
    HpxGeom,
    LabelMapAxis,
    Map,
    MapAxis,
    RegionGeom,
    WcsGeom,
    WcsNDMap,
)
from gammapy.modeling import Fit
from gammapy.modeling.models import (
    DiskSpatialModel,
    FoVBackgroundModel,
    GaussianSpatialModel,
    Models,
    PointSpatialModel,
    PowerLawSpectralModel,
    SkyModel,
    UniformPrior,
)
from gammapy.utils.testing import mpl_plot_check, requires_data, requires_dependency
from gammapy.utils.types import JsonQuantityEncoder


@pytest.fixture
def geom_hpx():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
    energy_axis_true = MapAxis.from_energy_bounds(
        "1 TeV", "10 TeV", nbin=4, name="energy_true"
    )
    geom = HpxGeom.create(nside=32, axes=[axis], frame="galactic")
    return {"geom": geom, "energy_axis_true": energy_axis_true}


@pytest.fixture
def geom_hpx_partial():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
    energy_axis_true = MapAxis.from_energy_bounds(
        "1 TeV", "10 TeV", nbin=4, name="energy_true"
    )
    geom = HpxGeom.create(
        nside=32, axes=[axis], frame="galactic", region="DISK(110.,75.,10.)"
    )
    return {"geom": geom, "energy_axis_true": energy_axis_true}


@pytest.fixture
def geom():
    axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=2)
    return WcsGeom.create(
        skydir=(266.40498829, -28.93617776),
        binsz=0.02,
        width=(2, 2),
        frame="icrs",
        axes=[axis],
    )


@pytest.fixture
def geom1():
    e_axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=20)
    t_axis = MapAxis.from_bounds(0, 10, 2, name="time", unit="s")
    return WcsGeom.create(
        skydir=(266.40498829, -28.93617776),
        binsz=0.02,
        width=(3, 2),
        frame="icrs",
        axes=[e_axis, t_axis],
    )


@pytest.fixture
def geom_etrue():
    axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=3, name="energy_true")
    return WcsGeom.create(
        skydir=(266.40498829, -28.93617776),
        binsz=0.02,
        width=(2, 2),
        frame="icrs",
        axes=[axis],
    )


@pytest.fixture
def geom_image():
    energy = np.logspace(-1.0, 1.0, 2)
    axis = MapAxis.from_edges(energy, name="energy", unit=u.TeV, interp="log")
    return WcsGeom.create(
        skydir=(0, 0), binsz=0.02, width=(2, 2), frame="galactic", axes=[axis]
    )


def get_exposure(geom_etrue):
    filename = (
        "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits"
    )
    aeff = EffectiveAreaTable2D.read(filename, hdu="EFFECTIVE AREA")
    exposure_map = make_map_exposure_true_energy(
        pointing=SkyCoord(1, 0.5, unit="deg", frame="galactic"),
        livetime=1 * u.hr,
        aeff=aeff,
        geom=geom_etrue,
    )
    return exposure_map


def get_psf():
    filename = (
        "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits"
    )
    psf = EnergyDependentMultiGaussPSF.read(filename, hdu="POINT SPREAD FUNCTION")
    geom = WcsGeom.create(
        skydir=(0, 0),
        frame="galactic",
        binsz=2,
        width=(2, 2),
        axes=[RAD_AXIS_DEFAULT, psf.axes["energy_true"]],
    )
    return make_psf_map(
        psf=psf,
        pointing=SkyCoord(0, 0.5, unit="deg", frame="galactic"),
        geom=geom,
        exposure_map=Map.from_geom(geom.squash("rad"), unit="cm2 s"),
    )


@requires_data()
def get_edisp(geom, geom_etrue):
    filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz"
    edisp2d = EnergyDispersion2D.read(filename, hdu="EDISP")
    energy_axis = geom.axes["energy"]
    energy_axis_true = geom_etrue.axes["energy_true"]
    edisp_kernel = edisp2d.to_edisp_kernel(
        offset="1.2 deg", energy_axis=energy_axis, energy_axis_true=energy_axis_true
    )
    edisp = EDispKernelMap.from_edisp_kernel(edisp_kernel)
    return edisp


@pytest.fixture
def sky_model():
    spatial_model = GaussianSpatialModel(
        lon_0="0.2 deg", lat_0="0.1 deg", sigma="0.2 deg", frame="galactic"
    )
    spectral_model = PowerLawSpectralModel(
        index=3, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV"
    )
    return SkyModel(
        spatial_model=spatial_model, spectral_model=spectral_model, name="test-model"
    )


def get_map_dataset(geom, geom_etrue, edisp="edispmap", name="test", **kwargs):
    """Returns a MapDataset"""
    # define background model
    background = Map.from_geom(geom)
    background.data += 0.2

    psf = get_psf()
    exposure = get_exposure(geom_etrue)

    e_reco = geom.axes["energy"]
    e_true = geom_etrue.axes["energy_true"]

    if edisp == "edispmap":
        edisp = EDispMap.from_diagonal_response(energy_axis_true=e_true)
        data = exposure.get_spectrum(geom.center_skydir).data
        edisp.exposure_map.data = np.repeat(np.expand_dims(data, -1), 2, axis=-1)
    elif edisp == "edispkernelmap":
        edisp = EDispKernelMap.from_diagonal_response(
            energy_axis=e_reco, energy_axis_true=e_true
        )
        data = exposure.get_spectrum(geom.center_skydir).data
        edisp.exposure_map.data = np.repeat(np.expand_dims(data, -1), 2, axis=-1)
    else:
        edisp = None

    # define fit mask
    center = SkyCoord("0.2 deg", "0.1 deg", frame="galactic")
    circle = CircleSkyRegion(center=center, radius=1 * u.deg)
    mask_fit = geom.region_mask([circle])

    models = FoVBackgroundModel(dataset_name=name)

    return MapDataset(
        models=models,
        exposure=exposure,
        background=background,
        psf=psf,
        edisp=edisp,
        mask_fit=mask_fit,
        name=name,
        **kwargs,
    )


def test_map_dataset_name():
    with pytest.raises(ValueError, match="of type '"):
        _ = MapDataset(name=6353)
    with pytest.raises(ValueError, match="of type '"):
        _ = MapDataset(name=[1233, "234"])


@requires_data()
def test_map_dataset_str(sky_model, geom, geom_etrue):
    dataset = get_map_dataset(geom, geom_etrue)
    bkg_model = FoVBackgroundModel(dataset_name=dataset.name)
    dataset.models = [sky_model, bkg_model]
    dataset.counts = dataset.npred()
    dataset.mask_safe = dataset.mask_fit
    assert "MapDataset" in str(dataset)
    assert "(frozen)" in str(dataset)
    assert "background" in str(dataset)

    dataset.mask_safe = None
    assert "MapDataset" in str(dataset)


def test_map_dataset_str_empty():
    dataset = MapDataset()
    assert "MapDataset" in str(dataset)


@requires_data()
def test_fake(sky_model, geom, geom_etrue):
    """Test the fake dataset"""
    dataset = get_map_dataset(geom, geom_etrue)
    bkg_model = FoVBackgroundModel(dataset_name=dataset.name)
    dataset.models = [sky_model, bkg_model]
    npred = dataset.npred()
    assert np.all(npred.data >= 0)  # npred must be positive
    dataset.counts = npred
    real_dataset = dataset.copy()

    dataset.fake(314)
    assert real_dataset.counts.data.shape == dataset.counts.data.shape
    assert_allclose(real_dataset.counts.data.sum(), 9525.299054, rtol=1e-5)
    assert_allclose(dataset.counts.data.sum(), 9709)
    assert dataset.counts.data.dtype == float

    stacked = get_map_dataset(geom, geom_etrue)
    bkg_model = FoVBackgroundModel(dataset_name=stacked.name)
    stacked.models = [sky_model, bkg_model]
    stacked.counts = stacked.npred()
    dataset.stack(stacked)
    assert_allclose(dataset.counts.data.sum(), 19234.3407, 1e-2)
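
# Illustration only: the core of MapDataset.fake(random_state), exercised by
# test_fake above, is a Poisson fluctuation of the predicted counts map.
def _sketch_fake_counts():
    import numpy as np

    from gammapy.utils.random import get_random_state

    random_state = get_random_state(314)
    npred = np.full((2, 3, 3), 4.2)  # toy predicted-counts cube
    return random_state.poisson(npred)  # integer counts with mean npred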
@requires_data()
def test_different_exposure_unit(sky_model, geom):
    energy_range_true = np.logspace(2, 4, 3)
    axis = MapAxis.from_edges(
        energy_range_true, name="energy_true", unit="GeV", interp="log"
    )
    geom_gev = geom.to_image().to_cube([axis])
    dataset = get_map_dataset(geom, geom_gev, edisp="None")
    bkg_model = FoVBackgroundModel(dataset_name=dataset.name)
    dataset.models = [sky_model, bkg_model]
    npred = dataset.npred()
    assert_allclose(npred.data[0, 50, 50], 6.086019, rtol=1e-2)


@pytest.mark.parametrize(("edisp_mode"), ["edispmap", "edispkernelmap"])
@requires_data()
def test_to_spectrum_dataset(sky_model, geom, geom_etrue, edisp_mode):
    dataset_ref = get_map_dataset(geom, geom_etrue, edisp=edisp_mode)
    bkg_model = FoVBackgroundModel(dataset_name=dataset_ref.name)
    dataset_ref.models = [sky_model, bkg_model]
    dataset_ref.counts = dataset_ref.npred_background() * 0.0
    dataset_ref.counts.data[1, 50, 50] = 1
    dataset_ref.counts.data[1, 60, 50] = 1

    gti = GTI.create([0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00")
    dataset_ref.gti = gti
    on_region = CircleSkyRegion(center=geom.center_skydir, radius=0.05 * u.deg)
    spectrum_dataset = dataset_ref.to_spectrum_dataset(on_region)
    spectrum_dataset_corrected = dataset_ref.to_spectrum_dataset(
        on_region, containment_correction=True
    )
    mask = np.ones_like(dataset_ref.counts, dtype="bool")
    mask[1, 40:60, 40:60] = False
    dataset_ref.mask_safe = Map.from_geom(dataset_ref.counts.geom, data=mask)
    spectrum_dataset_mask = dataset_ref.to_spectrum_dataset(on_region)

    assert np.sum(spectrum_dataset.counts.data) == 1
    assert spectrum_dataset.data_shape == (2, 1, 1)
    assert spectrum_dataset.background.geom.axes[0].nbin == 2
    assert spectrum_dataset.exposure.geom.axes[0].nbin == 3
    assert spectrum_dataset.exposure.unit == "m2s"

    energy_axis = geom.axes["energy"]
    assert (
        spectrum_dataset.edisp.get_edisp_kernel(energy_axis=energy_axis)
        .axes["energy"]
        .nbin
        == 2
    )
    assert (
        spectrum_dataset.edisp.get_edisp_kernel(energy_axis=energy_axis)
        .axes["energy_true"]
        .nbin
        == 3
    )
    assert_allclose(spectrum_dataset.edisp.exposure_map.data[1], 3.069227e09, rtol=1e-5)
    assert np.sum(spectrum_dataset_mask.counts.data) == 0
    assert spectrum_dataset_mask.data_shape == (2, 1, 1)
    assert spectrum_dataset_corrected.exposure.unit == "m2s"
    assert_allclose(spectrum_dataset.exposure.data[1], 3.070884e09, rtol=1e-5)
    assert_allclose(spectrum_dataset_corrected.exposure.data[1], 2.05201e09, rtol=1e-5)


@requires_data()
def test_energy_range(sky_model, geom1, geom_etrue):
    sky_coord1 = SkyCoord(266.5, -29.3, unit="deg")
    region1 = CircleSkyRegion(sky_coord1, 0.5 * u.deg)
    mask1 = geom1.region_mask([region1]) & geom1.energy_mask(1 * u.TeV, 7 * u.TeV)
    sky_coord2 = SkyCoord(266.5, -28.7, unit="deg")
    region2 = CircleSkyRegion(sky_coord2, 0.5 * u.deg)
    mask2 = geom1.region_mask([region2]) & geom1.energy_mask(2 * u.TeV, 8 * u.TeV)
    mask3 = geom1.energy_mask(3 * u.TeV, 6 * u.TeV)
    mask_safe = Map.from_geom(geom1, data=(mask1 | mask2 | mask3).data)

    dataset = get_map_dataset(geom1, geom_etrue, edisp=None, mask_safe=mask_safe)
    energy = geom1.axes["energy"].edges.value

    e_min, e_max = dataset.energy_range_safe
    assert_allclose(e_min.get_by_coord((265, -28, 0)), energy[15])
    assert_allclose(e_max.get_by_coord((265, -28, 5)), energy[17])
    assert_allclose(e_min.get_by_coord((sky_coord1.ra, sky_coord1.dec, 6)), energy[10])
    assert_allclose(e_max.get_by_coord((sky_coord1.ra, sky_coord1.dec, 1)), energy[18])
    assert_allclose(e_min.get_by_coord((sky_coord2.ra, sky_coord2.dec, 2)), energy[14])
    assert_allclose(e_max.get_by_coord((sky_coord2.ra, sky_coord2.dec, 7)), energy[19])
    assert_allclose(e_min.get_by_coord((266.5, -29, 8)), energy[10])
    assert_allclose(e_max.get_by_coord((266.5, -29, 3)), energy[19])

    e_min, e_max = dataset.energy_range_fit
    assert_allclose(e_min.get_by_coord((265, -28, 0)), np.nan)
    assert_allclose(e_max.get_by_coord((265, -28, 5)), np.nan)
    assert_allclose(e_min.get_by_coord((266, -29, 4)), energy[0])
    assert_allclose(e_max.get_by_coord((266, -29, 9)), energy[20])

    e_min, e_max = dataset.energy_range
    assert_allclose(e_min.get_by_coord((266, -29, 4)), energy[15])
    assert_allclose(e_max.get_by_coord((266, -29, 9)), energy[17])

    mask_zeros = Map.from_geom(geom1, data=np.zeros_like(mask_safe))
    e_min, e_max = dataset._energy_range(mask_zeros)
    assert_allclose(e_min.get_by_coord((266.5, -29, 8)), np.nan)
    assert_allclose(e_max.get_by_coord((266.5, -29, 3)), np.nan)

    e_min, e_max = dataset._energy_range()
    assert_allclose(e_min.get_by_coord((265, -28, 0)), energy[0])
    assert_allclose(e_max.get_by_coord((265, -28, 5)), energy[20])


@requires_data()
def test_info_dict(sky_model, geom, geom_etrue):
    dataset = get_map_dataset(geom, geom_etrue)
    bkg_model = FoVBackgroundModel(dataset_name=dataset.name)
    dataset.models = [sky_model, bkg_model]
    dataset.counts = dataset.npred()
    info_dict = dataset.info_dict()

    assert_allclose(info_dict["counts"], 9526, rtol=1e-3)
    assert_allclose(info_dict["background"], 4000.0005, rtol=1e-3)
    assert_allclose(info_dict["npred_background"], 4000.0, rtol=1e-3)
    assert_allclose(info_dict["excess"], 5525.756, rtol=1e-3)
    assert_allclose(info_dict["exposure_min"].value, 8.32e8, rtol=1e-3)
    assert_allclose(info_dict["exposure_max"].value, 1.105e10, rtol=1e-3)
    assert info_dict["exposure_max"].unit == "m2 s"
    assert info_dict["name"] == "test"

    gti = GTI.create([0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00")
    dataset.gti = gti
    info_dict = dataset.info_dict()
    assert_allclose(info_dict["counts"], 9526, rtol=1e-3)
    assert_allclose(info_dict["background"], 4000.0005, rtol=1e-3)
    assert_allclose(info_dict["npred_background"], 4000.0, rtol=1e-3)
    assert_allclose(info_dict["sqrt_ts"], 74.024180, rtol=1e-3)
    assert_allclose(info_dict["excess"], 5525.756, rtol=1e-3)
    assert_allclose(info_dict["ontime"].value, 3600)
    assert info_dict["name"] == "test"

    # try to dump as json
    result = json.dumps(info_dict, cls=JsonQuantityEncoder)
    assert "counts" in result
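
# Sketch (not upstream code) of two quantities reported by info_dict() and
# checked in test_info_dict above: excess = counts - npred_background, and
# sqrt_ts from the Cash counts statistic. With the numbers asserted above
# this reproduces a sqrt_ts of about 74.
def _sketch_info_dict_stats():
    from gammapy.stats import CashCountsStatistic

    counts, background = 9526, 4000.0
    excess = counts - background  # cf. info_dict["excess"]
    sqrt_ts = CashCountsStatistic(n_on=counts, mu_bkg=background).sqrt_ts
    return excess, sqrt_ts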
def get_fermi_3fhl_gc_dataset():
    counts = Map.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts-cube.fits.gz")
    background = Map.read(
        "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-background-cube.fits.gz"
    )
    bkg_model = FoVBackgroundModel(dataset_name="fermi-3fhl-gc")
    exposure = Map.read(
        "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-exposure-cube.fits.gz"
    )
    return MapDataset(
        counts=counts,
        background=background,
        models=[bkg_model],
        exposure=exposure,
        name="fermi-3fhl-gc",
    )


@requires_data()
def test_resample_energy_3fhl():
    dataset = get_fermi_3fhl_gc_dataset()
    new_axis = MapAxis.from_edges([10, 35, 100] * u.GeV, interp="log", name="energy")
    grouped = dataset.resample_energy_axis(energy_axis=new_axis)

    assert grouped.counts.data.shape == (2, 200, 400)
    assert grouped.counts.data[0].sum() == 28581
    assert_allclose(
        grouped.npred_background().data.sum(axis=(1, 2)),
        [25074.366386, 2194.298612],
        rtol=1e-5,
    )
    assert_allclose(grouped.exposure.data, dataset.exposure.data, rtol=1e-5)

    axis = grouped.counts.geom.axes[0]
    npred = dataset.npred()
    npred_grouped = grouped.npred()
    assert_allclose(npred.resample_axis(axis=axis).data.sum(), npred_grouped.data.sum())


@requires_data()
def test_to_image_3fhl():
    dataset = get_fermi_3fhl_gc_dataset()
    dataset_im = dataset.to_image()

    assert dataset_im.counts.data.sum() == dataset.counts.data.sum()
    assert_allclose(dataset_im.npred_background().data.sum(), 28548.625, rtol=1e-5)
    assert_allclose(dataset_im.exposure.data, dataset.exposure.data, rtol=1e-5)

    npred = dataset.npred()
    npred_im = dataset_im.npred()
    assert_allclose(npred.data.sum(), npred_im.data.sum())


def test_to_image_mask_safe():
    axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=2)
    geom = WcsGeom.create(
        skydir=(0, 0), binsz=0.5, width=(1, 1), frame="icrs", axes=[axis]
    )
    dataset = MapDataset.create(geom)

    # Check map_safe handling
    data = np.array([[[False, True], [True, True]], [[False, False], [True, True]]])
    dataset.mask_safe = WcsNDMap.from_geom(geom=geom, data=data)

    dataset_im = dataset.to_image()
    assert dataset_im.mask_safe.data.dtype == bool
    desired = np.array([[False, True], [True, True]])
    assert (dataset_im.mask_safe.data == desired).all()

    # Check that missing entries in the dataset do not break
    dataset_copy = dataset.copy()
    dataset_copy.exposure = None
    dataset_im = dataset_copy.to_image()
    assert dataset_im.exposure is None

    dataset_copy = dataset.copy()
    dataset_copy.counts = None
    dataset_im = dataset_copy.to_image()
    assert dataset_im.counts is None


@requires_data()
def test_downsample():
    dataset = get_fermi_3fhl_gc_dataset()
    downsampled = dataset.downsample(2)

    assert downsampled.counts.data.shape == (11, 100, 200)
    assert downsampled.counts.data.sum() == dataset.counts.data.sum()
    assert_allclose(
        downsampled.npred_background().data.sum(axis=(1, 2)),
        dataset.npred_background().data.sum(axis=(1, 2)),
        rtol=1e-5,
    )
    assert_allclose(downsampled.exposure.data[5, 50, 100], 3.318082e11, rtol=1e-5)

    with pytest.raises(ValueError):
        dataset.downsample(2, axis_name="energy")


def test_downsample_energy(geom, geom_etrue):
    # This checks that downsample and resample_energy_axis give identical results
    counts = Map.from_geom(geom, dtype="int")
    counts += 1
    mask = Map.from_geom(geom, dtype="bool")
    mask.data[1:] = True
    counts += 1
    exposure = Map.from_geom(geom_etrue, unit="m2s")
    edisp = EDispKernelMap.from_gauss(geom.axes[0], geom_etrue.axes[0], 0.1, 0.0)
    dataset = MapDataset(
        counts=counts,
        exposure=exposure,
        mask_safe=mask,
        edisp=edisp,
    )
    dataset_downsampled = dataset.downsample(2, axis_name="energy")
    dataset_resampled = dataset.resample_energy_axis(geom.axes[0].downsample(2))

    assert dataset_downsampled.edisp.edisp_map.data.shape == (3, 1, 1, 2)
    assert_allclose(
        dataset_downsampled.edisp.edisp_map.data[:, :, 0, 0],
        dataset_resampled.edisp.edisp_map.data[:, :, 0, 0],
    )
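
# Illustration only of what downsample(2) does along the spatial axes, as
# checked in test_downsample above: 2x2 pixel blocks are summed, so the total
# counts are preserved while the pixel size doubles.
def _sketch_block_downsample():
    import numpy as np

    data = np.arange(16.0).reshape(4, 4)
    downsampled = data.reshape(2, 2, 2, 2).sum(axis=(1, 3))  # block-sum to 2x2
    assert downsampled.sum() == data.sum()
    return downsampled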
@requires_data()
def test_map_dataset_fits_io(tmp_path, sky_model, geom, geom_etrue):
    dataset = get_map_dataset(geom, geom_etrue)
    bkg_model = FoVBackgroundModel(dataset_name=dataset.name)
    dataset.models = [sky_model, bkg_model]
    dataset.counts = dataset.npred()
    dataset.mask_safe = dataset.mask_fit
    gti = GTI.create([0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00")
    dataset.gti = gti

    hdulist = dataset.to_hdulist()
    actual = [hdu.name for hdu in hdulist]
    desired = [
        "PRIMARY",
        "COUNTS",
        "COUNTS_BANDS",
        "EXPOSURE",
        "EXPOSURE_BANDS",
        "BACKGROUND",
        "BACKGROUND_BANDS",
        "EDISP",
        "EDISP_BANDS",
        "EDISP_EXPOSURE",
        "EDISP_EXPOSURE_BANDS",
        "PSF",
        "PSF_BANDS",
        "PSF_EXPOSURE",
        "PSF_EXPOSURE_BANDS",
        "MASK_SAFE",
        "MASK_SAFE_BANDS",
        "MASK_FIT",
        "MASK_FIT_BANDS",
        "GTI",
    ]
    assert actual == desired

    dataset.write(tmp_path / "test.fits")
    dataset_new = MapDataset.read(tmp_path / "test.fits")
    assert dataset_new.name == "test"
    assert_allclose(dataset.meta.creation.date.mjd, dataset_new.meta.creation.date.mjd)
    assert dataset_new.mask.data.dtype == bool

    assert_allclose(dataset.counts.data, dataset_new.counts.data)
    assert_allclose(
        dataset.npred_background().data, dataset_new.npred_background().data
    )
    assert_allclose(dataset.edisp.edisp_map.data, dataset_new.edisp.edisp_map.data)
    assert_allclose(dataset.psf.psf_map.data, dataset_new.psf.psf_map.data)
    assert_allclose(dataset.exposure.data, dataset_new.exposure.data)
    assert_allclose(dataset.mask_fit.data, dataset_new.mask_fit.data)
    assert_allclose(dataset.mask_safe.data, dataset_new.mask_safe.data)

    assert dataset.counts.geom == dataset_new.counts.geom
    assert dataset.exposure.geom == dataset_new.exposure.geom
    assert_allclose(dataset.exposure.meta["livetime"], 1 * u.h)
    assert dataset.npred_background().geom == dataset_new.npred_background().geom
    assert dataset.edisp.edisp_map.geom == dataset_new.edisp.edisp_map.geom
    assert_allclose(
        dataset.gti.time_sum.to_value("s"), dataset_new.gti.time_sum.to_value("s")
    )

    # To test io of psf and edisp map
    stacked = MapDataset.create(geom)
    stacked.write(tmp_path / "test-2.fits", overwrite=True)
    stacked1 = MapDataset.read(tmp_path / "test-2.fits")
    assert stacked1.psf.psf_map is not None
    assert stacked1.psf.exposure_map is not None
    assert stacked1.edisp.edisp_map is not None
    assert stacked1.edisp.exposure_map is not None
    assert stacked.mask.data.dtype == bool
    assert_allclose(stacked1.psf.psf_map, stacked.psf.psf_map)
    assert_allclose(stacked1.edisp.edisp_map, stacked.edisp.edisp_map)


@requires_data()
def test_map_auto_psf_upsampling(sky_model, geom, geom_etrue):
    dataset_2 = get_map_dataset(geom, geom_etrue, name="test-2")
    datasets = Datasets([dataset_2])
    models = Models(datasets.models)
    models.insert(0, sky_model)
    datasets.models = models

    psf_map_module.PSF_UPSAMPLING_FACTOR = None
    npred = dataset_2.npred().data.sum()
    assert_allclose(npred, 9525.340707, rtol=1e-3)
    psf_map_module.PSF_UPSAMPLING_FACTOR = 4


@requires_data()
def test_map_fit(sky_model, geom, geom_etrue):
    dataset_1 = get_map_dataset(geom, geom_etrue, name="test-1")
    dataset_2 = get_map_dataset(geom, geom_etrue, name="test-2")
    datasets = Datasets([dataset_1, dataset_2])

    models = Models(datasets.models)
    models.insert(0, sky_model)
    models["test-1-bkg"].spectral_model.norm.value = 0.5
    models["test-model"].spatial_model.sigma.frozen = True

    datasets.models = models
    dataset_2.counts = dataset_2.npred()
    dataset_1.counts = dataset_1.npred()

    models["test-1-bkg"].spectral_model.norm.value = 0.49
    models["test-2-bkg"].spectral_model.norm.value = 0.99

    fit = Fit()
    result = fit.run(datasets=datasets)
    assert result.success
    assert "minuit" in str(result)

    npred = dataset_1.npred().data.sum()
    assert_allclose(npred, 7525.790688, rtol=1e-3)
    assert_allclose(result.total_stat, 21625.845714, rtol=1e-3)

    pars = models.parameters
    assert_allclose(pars["lon_0"].value, 0.2, rtol=1e-2)
    assert_allclose(pars["lon_0"].error, 0.002244, rtol=1e-2)
    assert_allclose(pars["index"].value, 3, rtol=1e-2)
    assert_allclose(pars["index"].error, 0.0242, rtol=1e-2)
    assert_allclose(pars["amplitude"].value, 1e-11, rtol=1e-2)
    assert_allclose(pars["amplitude"].error, 4.216e-13, rtol=1e-2)

    # background norm 1
    assert_allclose(pars[8].value, 0.5, rtol=1e-2)
    assert_allclose(pars[8].error, 0.015811, rtol=1e-2)

    # background norm 2
    assert_allclose(pars[11].value, 1, rtol=1e-2)
    assert_allclose(pars[11].error, 0.02147, rtol=1e-2)

    # test mask_safe evaluation
    dataset_1.mask_safe = geom.energy_mask(energy_min=1 * u.TeV)
    dataset_2.mask_safe = geom.energy_mask(energy_min=1 * u.TeV)
    stat = datasets.stat_sum()
    assert_allclose(stat, 14823.772744, rtol=1e-5)

    region = sky_model.spatial_model.to_region()
    initial_counts = dataset_1.counts.copy()
    with mpl_plot_check():
        dataset_1.plot_residuals(kwargs_spectral=dict(region=region))

    # check dataset has not changed
    assert initial_counts == dataset_1.counts

    # test model evaluation outside image
    dataset_1.models[0].spatial_model.lon_0.value = 150
    dataset_1.npred()
    assert not dataset_1.evaluators["test-model"].contributes


@requires_data()
def test_prior_stat_sum(sky_model, geom, geom_etrue):
    dataset = get_map_dataset(geom, geom_etrue, name="test")
    datasets = Datasets([dataset])
    models = Models(datasets.models)
    models.insert(0, sky_model)
    datasets.models = models
    dataset.counts = dataset.npred()

    uniformprior = UniformPrior(min=0, max=np.inf, weight=1)
    datasets.models.parameters["amplitude"].prior = uniformprior
    assert_allclose(datasets.stat_sum(), 12825.9370, rtol=1e-3)

    datasets.models.parameters["amplitude"].value = -1e-12
    stat_sum_neg = datasets.stat_sum()
    assert_allclose(stat_sum_neg, 470298.864993, rtol=1e-3)

    datasets.models.parameters["amplitude"].prior.weight = 100
    assert_allclose(datasets.stat_sum() - stat_sum_neg, 99, rtol=1e-3)


@requires_data()
@requires_dependency("ray")
def test_map_fit_ray(sky_model, geom, geom_etrue):
    from gammapy.datasets.actors import DatasetsActor

    dataset_1 = get_map_dataset(geom, geom_etrue, name="test-1")
    dataset_2 = get_map_dataset(geom, geom_etrue, name="test-2")
    datasets = Datasets([dataset_1, dataset_2])

    models = Models(datasets.models)
    models.insert(0, sky_model)
    models["test-1-bkg"].spectral_model.norm.value = 0.5
    models["test-model"].spatial_model.sigma.frozen = True

    datasets.models = models
    dataset_2.counts = dataset_2.npred()
    dataset_1.counts = dataset_1.npred()

    models["test-1-bkg"].spectral_model.norm.value = 0.49
    models["test-2-bkg"].spectral_model.norm.value = 0.99

    datasets.models = None
    actors = DatasetsActor(datasets)
    assert len(actors.models) == 0

    actors.models = models
    assert len(actors.models) == len(models)
    assert len(actors.parameters) == len(models.parameters.unique_parameters)
    assert len(actors[0].models) == len(models) - 1

    fit = Fit()
    result = fit.run(datasets=actors)
    assert_allclose(result.models.covariance.data, actors.models.covariance.data)
    assert result.success
    assert result.optimize_result.backend == "minuit"

    npred = actors[0].npred().data.sum()
    assert_allclose(npred, 7525.790688, rtol=1e-3)
    assert_allclose(result.total_stat, 21625.845714, rtol=1e-3)

    pars = models.parameters
    assert_allclose(pars["lon_0"].value, 0.2, rtol=1e-2)
    assert_allclose(pars["lon_0"].error, 0.002244, rtol=1e-2)
    assert_allclose(pars["index"].value, 3, rtol=1e-2)
    assert_allclose(pars["index"].error, 0.0242, rtol=1e-2)
    assert_allclose(pars["amplitude"].value, 1e-11, rtol=1e-2)
    assert_allclose(pars["amplitude"].error, 4.216e-13, rtol=1e-2)

    # background norm 1
    assert_allclose(pars[8].value, 0.5, rtol=1e-2)
    assert_allclose(pars[8].error, 0.015811, rtol=1e-2)

    # background norm 2
    assert_allclose(pars[11].value, 1, rtol=1e-2)
    assert_allclose(pars[11].error, 0.02147, rtol=1e-2)

    with mpl_plot_check():
        actors.plot_residuals()


@requires_data()
def test_map_fit_linked(sky_model, geom, geom_etrue):
    dataset_1 = get_map_dataset(geom, geom_etrue, name="test-1")
    dataset_2 = get_map_dataset(geom, geom_etrue, name="test-2")
    datasets = Datasets([dataset_1, dataset_2])

    models = Models(datasets.models)
    models.insert(0, sky_model)
    sky_model2 = sky_model.copy(name="test-model-2")
    sky_model2.spectral_model.index = sky_model.spectral_model.index
    sky_model2.spectral_model.reference = sky_model.spectral_model.reference
    models.insert(0, sky_model2)
    models["test-1-bkg"].spectral_model.norm.value = 0.5
    models["test-model"].spatial_model.sigma.frozen = True

    datasets.models = models
    dataset_2.counts = dataset_2.npred()
    dataset_1.counts = dataset_1.npred()

    models["test-1-bkg"].spectral_model.norm.value = 0.49
    models["test-2-bkg"].spectral_model.norm.value = 0.99

    fit = Fit()
    result = fit.run(datasets=datasets)
    assert result.success
    assert "minuit" in str(result)

    assert sky_model2.parameters["index"] is sky_model.parameters["index"]
    assert sky_model2.parameters["reference"] is sky_model.parameters["reference"]

    assert len(datasets.models.parameters.unique_parameters) == 20
    assert datasets.models.covariance.shape == (22, 22)


@requires_data()
def test_map_fit_one_energy_bin(sky_model, geom_image):
    energy_axis = geom_image.axes["energy"]
    geom_etrue = geom_image.to_image().to_cube([energy_axis.copy(name="energy_true")])
    dataset = get_map_dataset(geom_image, geom_etrue)
    bkg_model = FoVBackgroundModel(dataset_name=dataset.name)
    dataset.models = [sky_model, bkg_model]

    sky_model.spectral_model.index.value = 3.0
    sky_model.spectral_model.index.frozen = True
    dataset.models[f"{dataset.name}-bkg"].spectral_model.norm.value = 0.5

    dataset.counts = dataset.npred()

    # Move a bit away from the best-fit point, to make sure the optimiser runs
    sky_model.parameters["sigma"].value = 0.21
    dataset.models[f"{dataset.name}-bkg"].parameters["norm"].frozen = True

    fit = Fit()
    result = fit.run(datasets=[dataset])
    assert result.success

    npred = dataset.npred().data.sum()
    assert_allclose(npred, 16538.124036, rtol=1e-3)
    assert_allclose(result.total_stat, -34844.125047, rtol=1e-3)

    pars = sky_model.parameters
    assert_allclose(pars["lon_0"].value, 0.2, rtol=1e-2)
    assert_allclose(pars["lon_0"].error, 0.001689, rtol=1e-2)
    assert_allclose(pars["sigma"].value, 0.2, rtol=1e-2)
    assert_allclose(pars["sigma"].error, 0.00092, rtol=1e-2)
    assert_allclose(pars["amplitude"].value, 1e-11, rtol=1e-2)
    assert_allclose(pars["amplitude"].error, 8.127593e-14, rtol=1e-2)


def test_create():
    # tests empty datasets created
    rad_axis = MapAxis(nodes=np.linspace(0.0, 1.0, 51), unit="deg", name="rad")
    e_reco = MapAxis.from_edges(
        np.logspace(-1.0, 1.0, 3), name="energy", unit=u.TeV, interp="log"
    )
    e_true = MapAxis.from_edges(
        np.logspace(-1.0, 1.0, 4), name="energy_true", unit=u.TeV, interp="log"
    )
    geom = WcsGeom.create(binsz=0.02, width=(2, 2), axes=[e_reco])
    empty_dataset = MapDataset.create(
        geom=geom, energy_axis_true=e_true, rad_axis=rad_axis
    )

    assert empty_dataset.counts.data.shape == (2, 100, 100)
    assert empty_dataset.exposure.data.shape == (3, 100, 100)
    assert empty_dataset.psf.psf_map.data.shape == (3, 50, 10, 10)
    assert empty_dataset.psf.exposure_map.data.shape == (3, 1, 10, 10)
    assert isinstance(empty_dataset.edisp, EDispKernelMap)
    assert empty_dataset.edisp.edisp_map.data.shape == (3, 2, 10, 10)
    assert empty_dataset.edisp.exposure_map.data.shape == (3, 1, 10, 10)
    assert_allclose(empty_dataset.edisp.edisp_map.data.sum(), 300)
    assert_allclose(empty_dataset.gti.time_delta, 0.0 * u.s)


def test_create_with_migra(tmp_path):
    # tests empty datasets created
    migra_axis = MapAxis(nodes=np.linspace(0.0, 3.0, 51), unit="", name="migra")
    rad_axis = MapAxis(nodes=np.linspace(0.0, 1.0, 51), unit="deg", name="rad")
    e_reco = MapAxis.from_edges(
        np.logspace(-1.0, 1.0, 3), name="energy", unit=u.TeV, interp="log"
    )
    e_true = MapAxis.from_edges(
        np.logspace(-1.0, 1.0, 4), name="energy_true", unit=u.TeV, interp="log"
    )
    geom = WcsGeom.create(binsz=0.02, width=(2, 2), axes=[e_reco])
    empty_dataset = MapDataset.create(
        geom=geom, energy_axis_true=e_true, migra_axis=migra_axis, rad_axis=rad_axis
    )
    empty_dataset.write(tmp_path / "test.fits")
    dataset_new = MapDataset.read(tmp_path / "test.fits")

    assert isinstance(empty_dataset.edisp, EDispMap)
    assert empty_dataset.edisp.edisp_map.data.shape == (3, 50, 10, 10)
    assert empty_dataset.edisp.exposure_map.data.shape == (3, 1, 10, 10)
    assert_allclose(empty_dataset.edisp.edisp_map.data.sum(), 5000)
    assert_allclose(empty_dataset.gti.time_delta, 0.0 * u.s)
    assert isinstance(dataset_new.edisp, EDispMap)
    assert dataset_new.edisp.edisp_map.data.shape == (3, 50, 10, 10)


def test_create_high_dimension():
    # tests empty datasets created with additional axes
    label_axis = LabelMapAxis(["a", "b"], name="type")
    rad_axis = MapAxis(nodes=np.linspace(0.0, 1.0, 51), unit="deg", name="rad")
    migra_axis = MapAxis(nodes=np.linspace(0.0, 3.0, 51), unit="", name="migra")
    e_reco = MapAxis.from_edges(
        np.logspace(-1.0, 1.0, 3), name="energy", unit=u.TeV, interp="log"
    )
    e_true = MapAxis.from_edges(
        np.logspace(-1.0, 1.0, 4), name="energy_true", unit=u.TeV, interp="log"
    )
    geom = WcsGeom.create(binsz=0.02, width=(2, 2), axes=[label_axis, e_reco])
    empty_dataset = MapDataset.create(
        geom=geom, energy_axis_true=e_true, rad_axis=rad_axis
    )

    assert empty_dataset.counts.data.shape == (2, 2, 100, 100)
    assert empty_dataset.exposure.data.shape == (2, 3, 100, 100)
    assert empty_dataset.psf.psf_map.data.shape == (2, 3, 50, 10, 10)
    assert empty_dataset.psf.exposure_map.data.shape == (2, 3, 1, 10, 10)
    assert empty_dataset.edisp.edisp_map.data.shape == (2, 3, 2, 10, 10)
    assert empty_dataset.edisp.exposure_map.data.shape == (2, 3, 1, 10, 10)
    assert_allclose(empty_dataset.edisp.edisp_map.data.sum(), 600)

    empty_dataset2 = MapDataset.create(
        geom=geom, energy_axis_true=e_true, rad_axis=rad_axis, migra_axis=migra_axis
    )
    assert empty_dataset2.edisp.edisp_map.data.shape == (2, 3, 50, 10, 10)
    assert empty_dataset2.edisp.exposure_map.data.shape == (2, 3, 1, 10, 10)


def test_stack(sky_model):
    axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=3)
    geom = WcsGeom.create(
        skydir=(266.40498829, -28.93617776),
        binsz=0.05,
        width=(2, 2),
        frame="icrs",
        axes=[axis],
    )
    axis_etrue = MapAxis.from_energy_bounds(
        "0.1 TeV", "10 TeV", nbin=5, name="energy_true"
    )
    geom_etrue = WcsGeom.create(
        skydir=(266.40498829, -28.93617776),
        binsz=0.05,
        width=(2, 2),
        frame="icrs",
        axes=[axis_etrue],
    )

    edisp = EDispKernelMap.from_diagonal_response(
        energy_axis=axis, energy_axis_true=axis_etrue, geom=geom
    )
    edisp.exposure_map.quantity = (
        1e0 * u.m**2 * u.s * np.ones(edisp.exposure_map.data.shape)
    )

    bkg1 = Map.from_geom(geom)
    bkg1.data += 0.2
    cnt1 = Map.from_geom(geom)
    cnt1.data = 1.0 * np.ones(cnt1.data.shape)
    exp1 = Map.from_geom(geom_etrue)
    exp1.quantity = 1e7 * u.m**2 * u.s * np.ones(exp1.data.shape)
    mask1 = Map.from_geom(geom)
    mask1.data = np.ones(mask1.data.shape, dtype=bool)
    mask1.data[0][:][5:10] = False
    dataset1 = MapDataset(
        counts=cnt1,
        background=bkg1,
        exposure=exp1,
        mask_safe=mask1,
        mask_fit=mask1,
        name="dataset-1",
        edisp=edisp,
        meta_table=Table({"OBS_ID": [0]}),
    )

    bkg2 = Map.from_geom(geom)
    bkg2.data = 0.1 * np.ones(bkg2.data.shape)
    cnt2 = Map.from_geom(geom)
    cnt2.data = 1.0 * np.ones(cnt2.data.shape)
    exp2 = Map.from_geom(geom_etrue)
    exp2.quantity = 1e7 * u.m**2 * u.s * np.ones(exp2.data.shape)
    mask2 = Map.from_geom(geom)
    mask2.data = np.ones(mask2.data.shape, dtype=bool)
    mask2.data[0][:][5:10] = False
    mask2.data[1][:][10:15] = False
    dataset2 = MapDataset(
        counts=cnt2,
        background=bkg2,
        exposure=exp2,
        mask_safe=mask2,
        mask_fit=mask2,
        name="dataset-2",
        edisp=edisp,
        meta_table=Table({"OBS_ID": [1]}),
    )

    background_model2 = FoVBackgroundModel(dataset_name="dataset-2")
    background_model1 = FoVBackgroundModel(dataset_name="dataset-1")
    dataset1.models = [background_model1, sky_model]
    dataset2.models = [background_model2, sky_model]

    stacked = MapDataset.from_geoms(**dataset1.geoms)
    stacked.stack(dataset1)
    stacked.stack(dataset2)

    stacked.models = [sky_model]
    npred_b = stacked.npred()

    assert_allclose(npred_b.data.sum(), 1459.985035, 1e-5)
    assert_allclose(stacked.npred_background().data.sum(), 1360.00, 1e-5)
    assert_allclose(stacked.background.data.sum(), 1360, 1e-5)
    assert_allclose(stacked.counts.data.sum(), 9000, 1e-5)
    assert_allclose(stacked.mask_safe.data.sum(), 4600)
    assert_allclose(stacked.mask_fit.data.sum(), 4600)
    assert_allclose(stacked.exposure.data.sum(), 1.6e11)
    assert_allclose(stacked.meta_table["OBS_ID"][0], [0, 1])

    # stacking when no safe masks are defined
    dataset1 = MapDataset(counts=cnt1, background=bkg1)
    stacked = MapDataset.from_geoms(**dataset1.geoms)
    for i in range(3):
        stacked.stack(dataset1)
    assert_allclose(stacked.background.data.sum(), 2880.0, 1e-5)
    assert_allclose(stacked.counts.data.sum(), 14400.0, 1e-5)
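
# Schematic of MapDataset.stack for additive quantities, as exercised by
# test_stack above (a simplified illustration, not the upstream
# implementation): counts, background and exposure are added on the common
# geometry, each dataset contributing only where its safe mask is True.
def _sketch_masked_stacking():
    import numpy as np

    mask_a = np.array([True, True, False])
    mask_b = np.array([True, False, True])
    counts_a = np.array([1.0, 2.0, 3.0])
    counts_b = np.array([4.0, 5.0, 6.0])
    return np.where(mask_a, counts_a, 0) + np.where(mask_b, counts_b, 0)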
@requires_data()
def test_npred(sky_model, geom, geom_etrue):
    dataset = get_map_dataset(geom, geom_etrue)

    pwl = PowerLawSpectralModel()
    gauss = GaussianSpatialModel(
        lon_0="0.0 deg", lat_0="0.0 deg", sigma="0.5 deg", frame="galactic"
    )
    model1 = SkyModel(pwl, gauss, name="m1")

    bkg = FoVBackgroundModel(dataset_name=dataset.name)
    dataset.models = [bkg, sky_model, model1]

    assert_allclose(
        dataset.npred_signal(model_names=[model1.name]).data.sum(), 150.7487, rtol=1e-3
    )
    npred_model1_not_stack = dataset.npred_signal(
        model_names=[model1.name], stack=False
    )
    assert isinstance(npred_model1_not_stack.geom.axes[-1], LabelMapAxis)
    assert npred_model1_not_stack.geom.axes[-1].name == "models"
    assert_equal(npred_model1_not_stack.geom.axes[-1].center, [model1.name])

    assert dataset._background_cached is None
    assert_allclose(dataset.npred_background().data.sum(), 4000.0, rtol=1e-3)
    assert_allclose(dataset._background_cached.data.sum(), 4000.0, rtol=1e-3)

    assert_allclose(dataset.npred().data.sum(), 9676.047906, rtol=1e-3)
    assert_allclose(dataset.npred_signal().data.sum(), 5676.04790, rtol=1e-3)
    assert_allclose(
        dataset.npred_signal(model_names=[model1.name, sky_model.name]).data.sum(),
        5676.04790,
        rtol=1e-3,
    )

    npred_all_models_not_stack = dataset.npred_signal(
        model_names=[model1.name, sky_model.name], stack=False
    )
    assert_allclose(npred_all_models_not_stack.geom.data_shape, (2, 2, 100, 100))
    assert_allclose(
        npred_all_models_not_stack.sum_over_axes(["models"]).data.sum(),
        5676.04790,
        rtol=1e-3,
    )

    bkg.spectral_model.norm.value = 1.1
    assert_allclose(dataset.npred_background().data.sum(), 4400.0, rtol=1e-3)
    assert_allclose(dataset._background_cached.data.sum(), 4400.0, rtol=1e-3)

    with pytest.raises(
        KeyError,
        match="m2",
    ):
        dataset.npred_signal(model_names=["m2"])


def test_stack_npred():
    pwl = PowerLawSpectralModel()
    gauss = GaussianSpatialModel(sigma="0.2 deg")
    model = SkyModel(pwl, gauss)

    axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=5)
    axis_etrue = MapAxis.from_energy_bounds(
        "0.1 TeV", "10 TeV", nbin=11, name="energy_true"
    )
    geom = WcsGeom.create(
        skydir=(0, 0),
        binsz=0.05,
        width=(2, 2),
        frame="icrs",
        axes=[axis],
    )

    dataset_1 = MapDataset.create(
        geom,
        energy_axis_true=axis_etrue,
        name="dataset-1",
        gti=GTI.create("0 min", "30 min"),
    )
    dataset_1.psf = None
    dataset_1.exposure.data += 1
    dataset_1.mask_safe = geom.energy_mask(energy_min=1 * u.TeV)
    dataset_1.background.data += 1

    bkg_model_1 = FoVBackgroundModel(dataset_name=dataset_1.name)
    dataset_1.models = [model, bkg_model_1]

    dataset_2 = MapDataset.create(
        geom,
        energy_axis_true=axis_etrue,
        name="dataset-2",
        gti=GTI.create("30 min", "60 min"),
    )
    dataset_2.psf = None
    dataset_2.exposure.data += 1
    dataset_2.mask_safe = geom.energy_mask(energy_min=0.2 * u.TeV)
    dataset_2.background.data += 1

    bkg_model_2 = FoVBackgroundModel(dataset_name=dataset_2.name)
    dataset_2.models = [model, bkg_model_2]

    npred_1 = dataset_1.npred()
    npred_1.data[~dataset_1.mask_safe.data] = 0
    npred_2 = dataset_2.npred()
    npred_2.data[~dataset_2.mask_safe.data] = 0

    stacked_npred = Map.from_geom(geom)
    stacked_npred.stack(npred_1)
    stacked_npred.stack(npred_2)

    stacked = MapDataset.create(geom, energy_axis_true=axis_etrue, name="stacked")
    stacked.stack(dataset_1)
    stacked.stack(dataset_2)

    npred_stacked = stacked.npred()
    assert_allclose(npred_stacked.data, stacked_npred.data)


def to_cube(image):
    # introduce a fake energy axis for now
    axis = MapAxis.from_edges([1, 10] * u.TeV, name="energy")
    geom = image.geom.to_cube([axis])
    return WcsNDMap.from_geom(geom=geom, data=image.data, unit=image.unit)


@pytest.fixture
def images():
    """Load some `counts`, `counts_off`, `acceptance_on`, `acceptance_off` images"""
    filename = "$GAMMAPY_DATA/tests/unbundled/hess/survey/hess_survey_snippet.fits.gz"
    exposure_image = WcsNDMap.read(filename, hdu="EXPGAMMAMAP").copy(unit="m2s")
    return {
        "counts": to_cube(WcsNDMap.read(filename, hdu="ON")),
        "counts_off": to_cube(WcsNDMap.read(filename, hdu="OFF")),
        "acceptance": to_cube(WcsNDMap.read(filename, hdu="ONEXPOSURE")),
        "acceptance_off": to_cube(WcsNDMap.read(filename, hdu="OFFEXPOSURE")),
        "exposure": to_cube(exposure_image),
        "background": to_cube(WcsNDMap.read(filename, hdu="BACKGROUND")),
    }


def test_npred_psf_after_edisp():
    energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
    energy_axis_true = MapAxis.from_energy_bounds(
        "0.8 TeV", "15 TeV", nbin=6, name="energy_true"
    )
    geom = WcsGeom.create(width=4 * u.deg, binsz=0.02, axes=[energy_axis])
    dataset = MapDataset.create(geom=geom, energy_axis_true=energy_axis_true)
    dataset.background.data += 1
    dataset.exposure.data += 1e12
    dataset.mask_safe.data += True
    dataset.psf = PSFMap.from_gauss(
        energy_axis_true=energy_axis_true, sigma=0.2 * u.deg
    )

    model = SkyModel(
        spectral_model=PowerLawSpectralModel(),
        spatial_model=PointSpatialModel(),
        name="test-model",
    )
    model.apply_irf["psf_after_edisp"] = True

    bkg_model = FoVBackgroundModel(dataset_name=dataset.name)
    dataset.models = [bkg_model, model]

    npred = dataset.npred()
    assert_allclose(npred.data.sum(), 129553.858658)


def get_map_dataset_onoff(images, **kwargs):
    """Returns a MapDatasetOnOff"""
    mask_geom = images["counts"].geom
    mask_data = np.ones(images["counts"].data.shape, dtype=bool)
    mask_safe = Map.from_geom(mask_geom, data=mask_data)
    gti = GTI.create([0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00")
    energy_axis = mask_geom.axes["energy"]
    energy_axis_true = energy_axis.copy(name="energy_true")

    psf = PSFMap.from_gauss(
        energy_axis_true=energy_axis_true, sigma=0.2 * u.deg, geom=mask_geom.to_image()
    )

    edisp = EDispKernelMap.from_diagonal_response(
        energy_axis=energy_axis, energy_axis_true=energy_axis_true, geom=mask_geom
    )

    return MapDatasetOnOff(
        counts=images["counts"],
        counts_off=images["counts_off"],
        acceptance=images["acceptance"],
        acceptance_off=images["acceptance_off"],
        exposure=images["exposure"],
        mask_safe=mask_safe,
        psf=psf,
        edisp=edisp,
        gti=gti,
        name="MapDatasetOnOff-test",
        **kwargs,
    )
assert_allclose(dataset.acceptance.data, dataset_new.acceptance.data) assert_allclose(dataset.acceptance_off.data, dataset_new.acceptance_off.data) assert_allclose(dataset.exposure.data, dataset_new.exposure.data) assert_allclose(dataset.mask_safe, dataset_new.mask_safe) @requires_data() def test_map_datasets_on_off_checksum(images, tmp_path): dataset = get_map_dataset_onoff(images) Datasets([dataset]).write(tmp_path / "test.yaml", checksum=True) hdul = fits.open(tmp_path / "MapDatasetOnOff-test.fits") for hdu in hdul: assert "CHECKSUM" in hdu.header assert "DATASUM" in hdu.header with warnings.catch_warnings(): warnings.simplefilter("error") Datasets.read(tmp_path / "test.yaml", lazy=False) path = tmp_path / "MapDatasetOnOff-test.fits" # Modify counts map header to replace interpolation scheme with open(path, "r+b") as file: chunk = file.read(10000) index = chunk.find("lin".encode("ascii")) file.seek(index) file.write("log".encode("ascii")) with pytest.warns(AstropyUserWarning): MapDatasetOnOff.read(path, checksum=True) def test_create_onoff(geom): # tests empty datasets created migra_axis = MapAxis(nodes=np.linspace(0.0, 3.0, 51), unit="", name="migra") rad_axis = MapAxis(nodes=np.linspace(0.0, 1.0, 51), unit="deg", name="rad") energy_axis = geom.axes["energy"].copy(name="energy_true") empty_dataset = MapDatasetOnOff.create(geom, energy_axis, migra_axis, rad_axis) assert_allclose(empty_dataset.counts.data.sum(), 0.0) assert_allclose(empty_dataset.counts_off.data.sum(), 0.0) assert_allclose(empty_dataset.acceptance.data.sum(), 0.0) assert_allclose(empty_dataset.acceptance_off.data.sum(), 0.0) assert empty_dataset.psf.psf_map.data.shape == (2, 50, 10, 10) assert empty_dataset.psf.exposure_map.data.shape == (2, 1, 10, 10) assert empty_dataset.edisp.edisp_map.data.shape == (2, 50, 10, 10) assert empty_dataset.edisp.exposure_map.data.shape == (2, 1, 10, 10) assert_allclose(empty_dataset.edisp.edisp_map.data.sum(), 3333.333333) assert_allclose(empty_dataset.gti.time_delta, 0.0 * u.s) @requires_data() def test_map_dataset_onoff_str(images): dataset = get_map_dataset_onoff(images) assert "MapDatasetOnOff" in str(dataset) assert "counts_off" in str(dataset) assert int(str(dataset)[-52:-48]) == 4273 @requires_data() def test_stack_onoff(images): dataset = get_map_dataset_onoff(images) stacked = dataset.copy() stacked.stack(dataset) assert_allclose(stacked.counts.data.sum(), 2 * dataset.counts.data.sum()) assert_allclose(stacked.counts_off.data.sum(), 2 * dataset.counts_off.data.sum()) assert_allclose(stacked.acceptance.data, 2 * dataset.acceptance.data) assert_allclose(np.nansum(stacked.acceptance_off.data), 40351192, rtol=1e-5) assert_allclose(stacked.exposure.data, 2.0 * dataset.exposure.data) @requires_data() def test_stack_onoff_with_masked_input(images): dataset = get_map_dataset_onoff(images) dataset.mask_safe.data[0, 125:, 125:] = False stacked = dataset.copy().to_masked() stacked.stack(dataset) assert_allclose(stacked.counts.data.sum(), 8054) assert_allclose( stacked.counts_off.data[0, :125, :125], 2 * dataset.counts_off.data[0, :125, :125], ) assert_allclose(stacked.counts_off.data[0, 125:, 125:], 0) assert_allclose( stacked.acceptance.data[0, :125, :125], 2 * dataset.acceptance.data[0, :125, :125], ) assert_allclose(stacked.acceptance.data[0, 125:, 125:], 0) assert_allclose(stacked.acceptance.data.sum(), 7957.8296) assert_allclose(np.nansum(stacked.acceptance_off.data), 37661880.0, rtol=1e-5) assert_allclose( stacked.exposure.data[0, :125, :125], 2.0 * dataset.exposure.data[0, :125, 
:125] ) assert_allclose(stacked.exposure.data[0, 125:, 125:], 0) @requires_data() def test_stack_onoff_with_unmasked_input(images): dataset = get_map_dataset_onoff(images) dataset.mask_safe.data[0, 125:, 125:] = False stacked = dataset.copy() stacked.stack(dataset) assert_allclose( stacked.counts.data[0, :125, :125], 2 * dataset.counts.data[0, :125, :125] ) assert_allclose( stacked.counts.data[0, 125:, 125:], dataset.counts.data[0, 125:, 125:] ) assert_allclose( stacked.counts_off.data[0, :125, :125], 2 * dataset.counts_off.data[0, :125, :125], ) assert_allclose( stacked.counts_off.data[0, 125:, 125:], dataset.counts_off.data[0, 125:, 125:] ) assert_allclose( stacked.acceptance.data[0, :125, :125], 2 * dataset.acceptance.data[0, :125, :125], ) assert_allclose( stacked.acceptance.data[0, 125:, 125:], dataset.acceptance.data[0, 125:, 125:] ) assert_allclose( stacked.acceptance_off.data[0, :125, :125], 2 * dataset.acceptance_off.data[0, :125, :125], rtol=1e-5, ) assert_allclose( stacked.acceptance_off.data[0, 125:, 125:], dataset.acceptance_off.data[0, 125:, 125:], rtol=1e-5, ) assert_allclose( stacked.exposure.data[0, :125, :125], 2.0 * dataset.exposure.data[0, :125, :125] ) assert_allclose( stacked.exposure.data[0, 125:, 125:], dataset.exposure.data[0, 125:, 125:] ) def test_dataset_cutout_aligned(geom): dataset = MapDataset.create(geom) kwargs = {"position": geom.center_skydir, "width": 1 * u.deg} geoms = {name: geom.cutout(**kwargs) for name, geom in dataset.geoms.items()} cutout = MapDataset.from_geoms(**geoms, name="cutout") assert dataset.counts.geom.is_aligned(cutout.counts.geom) assert dataset.exposure.geom.is_aligned(cutout.exposure.geom) assert dataset.edisp.edisp_map.geom.is_aligned(cutout.edisp.edisp_map.geom) assert dataset.psf.psf_map.geom.is_aligned(cutout.psf.psf_map.geom) def test_stack_onoff_cutout(geom_image): # Test stacking of cutouts energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=3, name="energy_true" ) dataset = MapDatasetOnOff.create(geom_image, energy_axis_true=energy_axis_true) dataset.gti = GTI.create([0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00") kwargs = {"position": geom_image.center_skydir, "width": 1 * u.deg} geoms = {name: geom.cutout(**kwargs) for name, geom in dataset.geoms.items()} dataset_cutout = MapDatasetOnOff.from_geoms(**geoms, name="cutout-dataset") dataset_cutout.gti = GTI.create( [0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00" ) dataset_cutout.mask_safe.data += True dataset_cutout.counts.data += 1 dataset_cutout.counts_off.data += 1 dataset_cutout.exposure.data += 1 dataset.stack(dataset_cutout) assert_allclose(dataset.counts.data.sum(), 2500) assert_allclose(dataset.counts_off.data.sum(), 2500) assert_allclose(dataset.alpha.data.sum(), 0) assert_allclose(dataset.exposure.data.sum(), 7500) assert dataset_cutout.name == "cutout-dataset" def test_datasets_io_no_model(tmpdir): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=2) geom = WcsGeom.create(npix=(5, 5), axes=[axis]) dataset_1 = MapDataset.create(geom, name="dataset_1") dataset_2 = MapDataset.create(geom, name="dataset_2") datasets = Datasets([dataset_1, dataset_2]) datasets.write(filename=tmpdir / "datasets.yaml") filename_1 = tmpdir / "dataset_1.fits" assert filename_1.exists() filename_2 = tmpdir / "dataset_2.fits" assert filename_2.exists() @requires_data() def test_map_dataset_on_off_to_spectrum_dataset(images): dataset = get_map_dataset_onoff(images) gti = GTI.create([0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00") 
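    # Note: a GTI is attached here so that observation-time information can propagate to the spectrum dataset extracted below.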
dataset.gti = gti on_region = CircleSkyRegion( center=dataset.counts.geom.center_skydir, radius=0.1 * u.deg ) spectrum_dataset = dataset.to_spectrum_dataset(on_region) assert spectrum_dataset.counts.data[0] == 8 assert spectrum_dataset.data_shape == (1, 1, 1) assert spectrum_dataset.counts_off.data[0] == 33914 assert_allclose(spectrum_dataset.alpha.data[0], 0.0002143, atol=1e-7) excess_map = images["counts"] - images["background"] excess_true = excess_map.get_spectrum(on_region, np.sum).data[0] excess = spectrum_dataset.excess.data[0] assert_allclose(excess, excess_true, rtol=1e-3) assert spectrum_dataset.name != dataset.name @requires_data() def test_map_dataset_on_off_to_spectrum_dataset_weights(): e_reco = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy") geom = WcsGeom.create( skydir=(0, 0), width=(2.5, 2.5), binsz=0.5, axes=[e_reco], frame="galactic" ) counts = Map.from_geom(geom) counts.data += 1 counts_off = Map.from_geom(geom) counts_off.data += 2 acceptance = Map.from_geom(geom) acceptance.data += 1 acceptance_off = Map.from_geom(geom) acceptance_off.data += 4 weights = Map.from_geom(geom, dtype="bool") weights.data[1:, 2:4, 2] = True gti = GTI.create([0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00") dataset = MapDatasetOnOff( counts=counts, counts_off=counts_off, acceptance=acceptance, acceptance_off=acceptance_off, mask_safe=weights, gti=gti, ) on_region = CircleSkyRegion( center=dataset.counts.geom.center_skydir, radius=1.5 * u.deg ) spectrum_dataset = dataset.to_spectrum_dataset(on_region) assert_allclose(spectrum_dataset.counts.data[:, 0, 0], [0, 2, 2]) assert_allclose(spectrum_dataset.counts_off.data[:, 0, 0], [0, 4, 4]) assert_allclose(spectrum_dataset.acceptance.data[:, 0, 0], [0, 0.08, 0.08]) assert_allclose(spectrum_dataset.acceptance_off.data[:, 0, 0], [0, 0.32, 0.32]) assert_allclose(spectrum_dataset.alpha.data[:, 0, 0], [0, 0.25, 0.25]) @requires_data() def test_map_dataset_on_off_cutout(images): dataset = get_map_dataset_onoff(images) gti = GTI.create([0 * u.s], [1 * u.h], reference_time="2010-01-01T00:00:00") dataset.gti = gti cutout_dataset = dataset.cutout( images["counts"].geom.center_skydir, ["1 deg", "1 deg"] ) assert cutout_dataset.counts.data.shape == (1, 50, 50) assert cutout_dataset.counts_off.data.shape == (1, 50, 50) assert cutout_dataset.acceptance.data.shape == (1, 50, 50) assert cutout_dataset.acceptance_off.data.shape == (1, 50, 50) assert cutout_dataset.name != dataset.name def test_map_dataset_on_off_fake(geom): rad_axis = MapAxis(nodes=np.linspace(0.0, 1.0, 51), unit="deg", name="rad") energy_true_axis = geom.axes["energy"].copy(name="energy_true") empty_dataset = MapDatasetOnOff.create(geom, energy_true_axis, rad_axis=rad_axis) empty_dataset.acceptance.data = 1.0 empty_dataset.acceptance_off.data = 10.0 empty_dataset.acceptance_off.data[0, 50, 50] = 0 background_map = Map.from_geom(geom, data=1) empty_dataset.fake(background_map, random_state=42) assert_allclose(empty_dataset.counts.data[0, 50, 50], 0) assert_allclose(empty_dataset.counts.data.mean(), 0.99445, rtol=1e-3) assert_allclose(empty_dataset.counts_off.data.mean(), 10.00055, rtol=1e-3) assert empty_dataset.counts.data.dtype == float @requires_data() def test_map_dataset_on_off_to_image(): axis = MapAxis.from_energy_bounds(1, 10, 2, unit="TeV") geom = WcsGeom.create(npix=(10, 10), binsz=0.05, axes=[axis]) counts = Map.from_geom(geom, data=np.ones((2, 10, 10))) counts_off = Map.from_geom(geom, data=np.ones((2, 10, 10))) acceptance = Map.from_geom(geom, data=np.ones((2, 
10, 10))) acceptance_off = Map.from_geom(geom, data=np.ones((2, 10, 10))) acceptance_off *= 2 dataset = MapDatasetOnOff( counts=counts, counts_off=counts_off, acceptance=acceptance, acceptance_off=acceptance_off, ) image_dataset = dataset.to_image() assert image_dataset.counts.data.shape == (1, 10, 10) assert image_dataset.acceptance_off.data.shape == (1, 10, 10) assert_allclose(image_dataset.acceptance, 2) assert_allclose(image_dataset.acceptance_off, 4) assert_allclose(image_dataset.counts_off, 2) assert image_dataset.name != dataset.name # Try with a safe_mask mask_safe = Map.from_geom(geom, data=np.ones((2, 10, 10), dtype="bool")) mask_safe.data[0] = 0 dataset.mask_safe = mask_safe image_dataset = dataset.to_image() assert_allclose(image_dataset.acceptance, 1) assert_allclose(image_dataset.acceptance_off, 2) assert_allclose(image_dataset.counts_off, 1) def test_map_dataset_geom(geom, sky_model): e_true = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=5, name="energy_true") dataset = MapDataset.create(geom, energy_axis_true=e_true) dataset.counts = None dataset.background = None npred = dataset.npred() assert npred.geom == geom dataset.mask_safe = None dataset.mask_fit = None with pytest.raises(ValueError): dataset._geom @requires_data() def test_names(geom, geom_etrue, sky_model): m = Map.from_geom(geom) m.quantity = 0.2 * np.ones(m.data.shape) background_model1 = FoVBackgroundModel(dataset_name="test") assert background_model1.name == "test-bkg" c_map1 = Map.from_geom(geom) c_map1.quantity = 0.3 * np.ones(c_map1.data.shape) model1 = sky_model.copy() assert model1.name != sky_model.name model1 = sky_model.copy(name="model1") assert model1.name == "model1" model2 = sky_model.copy(name="model2") dataset1 = MapDataset( counts=c_map1, models=Models([model1, model2, background_model1]), exposure=get_exposure(geom_etrue), background=m, name="test", ) dataset2 = dataset1.copy() assert dataset2.name != dataset1.name assert dataset2.models is None dataset2 = dataset1.copy(name="dataset2") assert dataset2.name == "dataset2" assert dataset2.models is None def test_stack_dataset_dataset_on_off(): axis = MapAxis.from_edges([1, 10] * u.TeV, name="energy") geom = WcsGeom.create(width=1, axes=[axis]) gti = GTI.create([0 * u.s], [1 * u.h]) dataset = MapDataset.create(geom, gti=gti) dataset_on_off = MapDatasetOnOff.create(geom, gti=gti) dataset_on_off.mask_safe.data += True dataset_on_off.acceptance_off += 5 dataset_on_off.acceptance += 1 dataset_on_off.counts_off += 1 dataset.stack(dataset_on_off) assert_allclose(dataset.npred_background().data, 0.166667, rtol=1e-3) @requires_data() def test_info_dict_on_off(images): dataset = get_map_dataset_onoff(images) info_dict = dataset.info_dict() assert_allclose(info_dict["counts"], 4299, rtol=1e-3) assert_allclose(info_dict["excess"], -22.52295, rtol=1e-3) assert_allclose(info_dict["exposure_min"].value, 1.739467e08, rtol=1e-3) assert_allclose(info_dict["exposure_max"].value, 3.4298378e09, rtol=1e-3) assert_allclose(info_dict["npred"], 4321.518, rtol=1e-3) assert_allclose(info_dict["counts_off"], 20407510.0, rtol=1e-3) assert_allclose(info_dict["acceptance"], 4272.7075, rtol=1e-3) assert_allclose(info_dict["acceptance_off"], 20175596.0, rtol=1e-3) assert_allclose(info_dict["alpha"], 0.00021176, rtol=1e-3) assert_allclose(info_dict["ontime"].value, 3600) def test_slice_by_idx(): axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=17) axis_etrue = MapAxis.from_energy_bounds( "0.1 TeV", "10 TeV", nbin=31, name="energy_true" ) geom = WcsGeom.create( 
skydir=(0, 0), binsz=0.5, width=(2, 2), frame="icrs", axes=[axis], ) dataset = MapDataset.create(geom=geom, energy_axis_true=axis_etrue, binsz_irf=0.5) slices = {"energy": slice(5, 10)} sub_dataset = dataset.slice_by_idx(slices) assert sub_dataset.counts.geom.data_shape == (5, 4, 4) assert sub_dataset.mask_safe.geom.data_shape == (5, 4, 4) assert sub_dataset.npred_background().geom.data_shape == (5, 4, 4) assert sub_dataset.exposure.geom.data_shape == (31, 4, 4) assert sub_dataset.edisp.edisp_map.geom.data_shape == (31, 5, 4, 4) assert sub_dataset.psf.psf_map.geom.data_shape == (31, 66, 4, 4) axis = sub_dataset.counts.geom.axes["energy"] assert_allclose(axis.edges[0].value, 0.387468, rtol=1e-5) slices = {"energy_true": slice(5, 10)} sub_dataset = dataset.slice_by_idx(slices) assert sub_dataset.counts.geom.data_shape == (17, 4, 4) assert sub_dataset.mask_safe.geom.data_shape == (17, 4, 4) assert sub_dataset.npred_background().geom.data_shape == (17, 4, 4) assert sub_dataset.exposure.geom.data_shape == (5, 4, 4) assert sub_dataset.edisp.edisp_map.geom.data_shape == (5, 17, 4, 4) assert sub_dataset.psf.psf_map.geom.data_shape == (5, 66, 4, 4) axis = sub_dataset.counts.geom.axes["energy"] assert_allclose(axis.edges[0].value, 0.1, rtol=1e-5) axis = sub_dataset.exposure.geom.axes["energy_true"] assert_allclose(axis.edges[0].value, 0.210175, rtol=1e-5) def test_plot_residual_onoff(): axis = MapAxis.from_energy_bounds(1, 10, 2, unit="TeV") geom = WcsGeom.create(npix=(10, 10), binsz=0.05, axes=[axis]) counts = Map.from_geom(geom, data=np.ones((2, 10, 10))) counts_off = Map.from_geom(geom, data=np.ones((2, 10, 10))) acceptance = Map.from_geom(geom, data=np.ones((2, 10, 10))) acceptance_off = Map.from_geom(geom, data=np.ones((2, 10, 10))) acceptance_off *= 2 dataset = MapDatasetOnOff( counts=counts, counts_off=counts_off, acceptance=acceptance, acceptance_off=acceptance_off, ) with mpl_plot_check(): dataset.plot_residuals_spatial() def test_to_map_dataset(): axis = MapAxis.from_energy_bounds(1, 10, 2, unit="TeV") geom = WcsGeom.create(npix=(10, 10), binsz=0.05, axes=[axis]) counts = Map.from_geom(geom, data=np.ones((2, 10, 10))) counts_off = Map.from_geom(geom, data=np.ones((2, 10, 10))) acceptance = Map.from_geom(geom, data=np.ones((2, 10, 10))) acceptance_off = Map.from_geom(geom, data=np.ones((2, 10, 10))) acceptance_off *= 2 dataset_onoff = MapDatasetOnOff( counts=counts, counts_off=counts_off, acceptance=acceptance, acceptance_off=acceptance_off, ) dataset = dataset_onoff.to_map_dataset(name="ds") assert dataset.name == "ds" assert_allclose(dataset.npred_background().data.sum(), 100) assert isinstance(dataset, MapDataset) assert dataset.counts == dataset_onoff.counts dataset_onoff.counts_off = None dataset2 = dataset_onoff.to_map_dataset(name="ds2") assert dataset2.background is None def test_downsample_onoff(): axis = MapAxis.from_energy_bounds(1, 10, 4, unit="TeV") geom = WcsGeom.create(npix=(10, 10), binsz=0.05, axes=[axis]) counts = Map.from_geom(geom, data=np.ones((4, 10, 10))) counts_off = Map.from_geom(geom, data=np.ones((4, 10, 10))) acceptance = Map.from_geom(geom, data=np.ones((4, 10, 10))) acceptance_off = Map.from_geom(geom, data=np.ones((4, 10, 10))) acceptance_off *= 2 dataset_onoff = MapDatasetOnOff( counts=counts, counts_off=counts_off, acceptance=acceptance, acceptance_off=acceptance_off, ) downsampled = dataset_onoff.downsample(2, axis_name="energy") assert downsampled.counts.data.shape == (2, 10, 10) assert downsampled.counts.data.sum() == dataset_onoff.counts.data.sum() 
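    # Downsampling by 2 along the energy axis sums adjacent bins, so the totals checked here are conserved; acceptance and acceptance_off are combined consistently, which should leave alpha = acceptance / acceptance_off at 0.5 everywhere (checked below).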
assert downsampled.counts_off.data.sum() == dataset_onoff.counts_off.data.sum() assert_allclose(downsampled.alpha.data, 0.5) @requires_data() def test_source_outside_geom(sky_model, geom, geom_etrue): dataset = get_map_dataset(geom, geom_etrue) dataset.edisp = get_edisp(geom, geom_etrue) models = dataset.models model = SkyModel( PowerLawSpectralModel(), DiskSpatialModel(lon_0=276.4 * u.deg, lat_0=-28.9 * u.deg, r_0=10 * u.deg), ) assert not geom.to_image().contains(model.position)[0] dataset.models = models + [model] dataset.npred() model_npred = dataset.evaluators[model.name].compute_npred().data assert np.sum(np.isnan(model_npred)) == 0 assert np.sum(~np.isfinite(model_npred)) == 0 assert np.sum(model_npred) > 0 # this is a regression test for an issue found, where the model selection fails @requires_data() def test_source_outside_geom_fermi(): dataset = MapDataset.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc.fits.gz", format="gadf" ) catalog = SourceCatalog3FHL() source = catalog["3FHL J1637.8-3448"] dataset.models = source.sky_model() npred = dataset.npred() assert_allclose(npred.data.sum(), 28548.63, rtol=1e-4) def test_region_geom_io(tmpdir): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) geom = RegionGeom.create("icrs;circle(0, 0, 0.2)", axes=[axis]) dataset = MapDataset.create(geom, name="geom-test") filename = tmpdir / "test.fits" dataset.write(filename) dataset = MapDataset.read(filename, format="gadf") assert dataset.name == "geom-test" assert isinstance(dataset.counts.geom, RegionGeom) assert isinstance(dataset.edisp.edisp_map.geom, RegionGeom) assert isinstance(dataset.psf.psf_map.geom, RegionGeom) def test_dataset_mixed_geom(tmpdir): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=7, name="energy_true" ) rad_axis = MapAxis.from_bounds(0, 1, nbin=10, name="rad", unit="deg") geom = WcsGeom.create(npix=5, axes=[energy_axis]) geom_exposure = WcsGeom.create(npix=5, axes=[energy_axis_true]) geom_psf = RegionGeom.create( "icrs;circle(0, 0, 0.2)", axes=[rad_axis, energy_axis_true] ) geom_edisp = RegionGeom.create( "icrs;circle(0, 0, 0.2)", axes=[energy_axis, energy_axis_true] ) dataset = MapDataset.from_geoms( geom=geom, geom_exposure=geom_exposure, geom_psf=geom_psf, geom_edisp=geom_edisp ) assert isinstance(dataset.psf, PSFMap) assert isinstance(dataset._psf_kernel, PSFKernel) filename = tmpdir / "test.fits" dataset.write(filename) dataset = MapDataset.read(filename, format="gadf") assert isinstance(dataset.counts.geom, WcsGeom) assert isinstance(dataset.exposure.geom, WcsGeom) assert isinstance(dataset.background.geom, WcsGeom) assert isinstance(dataset.psf.psf_map.geom.region, CircleSkyRegion) assert isinstance(dataset.edisp.edisp_map.geom.region, CircleSkyRegion) assert isinstance(dataset.psf, PSFMap) assert dataset.psf.has_single_spatial_bin assert isinstance(dataset._psf_kernel, PSFKernel) geom_psf = WcsGeom.create(npix=1, axes=[rad_axis, energy_axis_true]) dataset = MapDataset.from_geoms( geom=geom, geom_exposure=geom_exposure, geom_psf=geom_psf, geom_edisp=geom_edisp ) assert isinstance(dataset.psf, PSFMap) assert dataset.psf.has_single_spatial_bin assert isinstance(dataset._psf_kernel, PSFKernel) psf_1bin = dataset.psf.copy() geom_psf = WcsGeom.create(npix=3, axes=[rad_axis, energy_axis_true]) dataset = MapDataset.from_geoms( geom=geom, geom_exposure=geom_exposure, geom_psf=geom_psf, geom_edisp=geom_edisp ) assert isinstance(dataset.psf, PSFMap) assert not 
dataset.psf.has_single_spatial_bin assert dataset._psf_kernel is None dataset.psf.psf_map = psf_1bin.psf_map assert dataset.psf.has_single_spatial_bin assert isinstance(dataset._psf_kernel, PSFKernel) geom_psf_reco = RegionGeom.create( "icrs;circle(0, 0, 0.2)", axes=[rad_axis, energy_axis] ) dataset = MapDataset.from_geoms( geom=geom, geom_exposure=geom_exposure, geom_psf=geom_psf_reco, geom_edisp=geom_edisp, ) assert dataset.psf.tag == "psf_map_reco" @requires_data() def test_map_dataset_region_geom_npred(): dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") pwl = PowerLawSpectralModel() point = PointSpatialModel(lon_0="0 deg", lat_0="0 deg", frame="galactic") model_1 = SkyModel(pwl, point, name="model-1") pwl = PowerLawSpectralModel(amplitude="1e-11 TeV-1 cm-2 s-1") gauss = GaussianSpatialModel( lon_0="0.3 deg", lat_0="0.3 deg", sigma="0.5 deg", frame="galactic" ) model_2 = SkyModel(pwl, gauss, name="model-2") dataset.models = [model_1, model_2] region = RegionGeom.create("galactic;circle(0, 0, 0.4)").region npred_ref = dataset.npred().to_region_nd_map(region) dataset_spec = dataset.to_region_map_dataset(region) dataset_spec.models = [model_1, model_2] npred = dataset_spec.npred() assert_allclose(npred_ref.data, npred.data, rtol=1e-2) @requires_dependency("healpy") def test_map_dataset_create_hpx_geom(geom_hpx): dataset = MapDataset.create(**geom_hpx, binsz_irf=10 * u.deg) assert isinstance(dataset.counts.geom, HpxGeom) assert dataset.counts.data.shape == (3, 12288) assert isinstance(dataset.background.geom, HpxGeom) assert dataset.background.data.shape == (3, 12288) assert isinstance(dataset.exposure.geom, HpxGeom) assert dataset.exposure.data.shape == (4, 12288) assert isinstance(dataset.edisp.edisp_map.geom, HpxGeom) assert dataset.edisp.edisp_map.data.shape == (4, 3, 768) assert isinstance(dataset.psf.psf_map.geom, HpxGeom) assert dataset.psf.psf_map.data.shape == (4, 66, 768) @requires_dependency("healpy") def test_map_dataset_create_hpx_geom_partial(geom_hpx_partial): dataset = MapDataset.create(**geom_hpx_partial, binsz_irf=2 * u.deg) assert isinstance(dataset.counts.geom, HpxGeom) assert dataset.counts.data.shape == (3, 90) assert isinstance(dataset.background.geom, HpxGeom) assert dataset.background.data.shape == (3, 90) assert isinstance(dataset.exposure.geom, HpxGeom) assert dataset.exposure.data.shape == (4, 90) assert isinstance(dataset.edisp.edisp_map.geom, HpxGeom) assert dataset.edisp.edisp_map.data.shape == (4, 3, 24) assert isinstance(dataset.psf.psf_map.geom, HpxGeom) assert dataset.psf.psf_map.data.shape == (4, 66, 24) @requires_dependency("healpy") def test_map_dataset_stack_hpx_geom(geom_hpx_partial, geom_hpx): dataset_all = MapDataset.create(**geom_hpx, binsz_irf=5 * u.deg) gti = GTI.create(start=0 * u.s, stop=30 * u.min) dataset_cutout = MapDataset.create(**geom_hpx_partial, binsz_irf=5 * u.deg, gti=gti) dataset_cutout.counts.data += 1 dataset_cutout.background.data += 1 dataset_cutout.exposure.data += 1 dataset_cutout.mask_safe.data[...] 
= True dataset_all.stack(dataset_cutout) assert_allclose(dataset_all.counts.data.sum(), 3 * 90) assert_allclose(dataset_all.background.data.sum(), 3 * 90) assert_allclose(dataset_all.exposure.data.sum(), 4 * 90) @requires_data() @requires_dependency("healpy") def test_map_dataset_hpx_geom_npred(geom_hpx_partial): hpx_geom = geom_hpx_partial["geom"] hpx_true = hpx_geom.to_image().to_cube([geom_hpx_partial["energy_axis_true"]]) dataset = get_map_dataset(hpx_geom, hpx_true, edisp="edispkernelmap") pwl = PowerLawSpectralModel() point = PointSpatialModel(lon_0="110 deg", lat_0="75 deg", frame="galactic") sky_model = SkyModel(pwl, point) dataset.models = [sky_model] assert_allclose(dataset.npred().data.sum(), 54, rtol=1e-3) @requires_data() def test_peek(images): dataset = get_map_dataset_onoff(images) with mpl_plot_check(): dataset.peek() def test_create_psf_reco(geom): dat = MapDataset.create(geom, reco_psf=True) assert isinstance(dat.psf, RecoPSFMap) def test_to_masked(): axis = MapAxis.from_energy_bounds(1, 10, 2, unit="TeV") geom = WcsGeom.create(npix=(10, 10), binsz=0.05, axes=[axis]) counts = Map.from_geom(geom, data=1) mask = Map.from_geom(geom, data=True, dtype=bool) mask.data[0][5:8] = False dataset = MapDataset(counts=counts, mask_safe=mask) d1 = dataset.to_masked() assert_allclose(d1.counts.data.sum(), 170) acceptance = Map.from_geom(geom, data=1) acceptance_off = Map.from_geom(geom, data=0.1) counts_off = counts datasetonoff = MapDatasetOnOff( counts=counts, acceptance=acceptance, mask_safe=mask, acceptance_off=acceptance_off, counts_off=counts_off, ) d1 = datasetonoff.to_masked() assert_allclose(d1.counts.data.sum(), 170) def test_get_psf_kernel_multiscale(): energy_axis = MapAxis.from_edges( np.logspace(-1.0, 1.0, 4), unit="TeV", name="energy_true" ) geom = WcsGeom.create(binsz=0.02 * u.deg, width=4.0 * u.deg, axes=[energy_axis]) psf = PSFMap.from_gauss(energy_axis, sigma=[0.1, 0.2, 0.3] * u.deg) kernel = psf.get_psf_kernel(geom=geom, max_radius="3 deg") assert_allclose(kernel.psf_kernel_map.geom.width, 2 * 3 * u.deg, atol=0.02) kernel = psf.get_psf_kernel(geom=geom, max_radius=None) geom_image = kernel.psf_kernel_map.geom.to_image() coords = geom_image.get_coord() sep = coords.skycoord.separation(geom_image.center_skydir) widths = [0.74, 1.34, 1.34] * u.deg for im, width in zip(kernel.psf_kernel_map.iter_by_image(), widths): mask = sep > width assert np.all(im.data[mask] == 0) assert np.any(im.data[~mask] > 0) @requires_data() def test_create_map_dataset_from_observation(): datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") observations = datastore.get_observations() dataset_new = create_map_dataset_from_observation(observations[0]) assert dataset_new.counts.data.sum() == 0 assert_allclose(dataset_new.exposure.data.sum(), 43239974121207.85) @requires_data() def test_create_empty_map_dataset_from_irfs(geom, geom_etrue): datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs = datastore.get_observations()[0] dataset_new = create_empty_map_dataset_from_irfs(obs) assert dataset_new.counts.data.sum() == 0 assert dataset_new.exposure.data.sum() == 0 dataset = get_map_dataset(geom, geom_etrue) obs.psf = dataset.psf obs.edisp = dataset.edisp dataset_new = create_empty_map_dataset_from_irfs(obs) assert dataset_new.counts.data.sum() == 0 assert dataset_new.exposure.data.sum() == 0 obs.edisp = dataset.edisp.to_edisp_kernel_map( dataset.background.geom.axes["energy"] ) dataset_new = create_empty_map_dataset_from_irfs(obs) assert dataset_new.counts.data.sum() == 0 assert 
dataset_new.exposure.data.sum() == 0

gammapy-1.3/gammapy/datasets/tests/test_metadata.py

import pytest from numpy.testing import assert_allclose from astropy.coordinates import SkyCoord from pydantic import ValidationError from gammapy.datasets import MapDatasetMetaData from gammapy.utils.metadata import ObsInfoMetaData, PointingInfoMetaData def test_meta_default(): meta = MapDatasetMetaData() assert meta.creation.creator.split()[0] == "Gammapy" assert meta.obs_info is None def test_mapdataset_metadata(): position = SkyCoord(83.6287, 22.0147, unit="deg", frame="icrs") obs_info_input = { "telescope": "cta-north", "instrument": "lst", "observation_mode": "wobble", "obs_id": 112, } input = { "obs_info": ObsInfoMetaData(**obs_info_input), "pointing": PointingInfoMetaData(radec_mean=position), "optional": dict(test=0.5, other=True), } meta = MapDatasetMetaData(**input) assert meta.obs_info.telescope == "cta-north" assert meta.obs_info.instrument == "lst" assert meta.obs_info.observation_mode == "wobble" assert_allclose(meta.pointing.radec_mean.dec.value, 22.0147) assert_allclose(meta.pointing.radec_mean.ra.deg, 83.6287) assert meta.obs_info.obs_id == 112 assert meta.optional["other"] is True assert meta.creation.creator.split()[0] == "Gammapy" assert meta.event_type is None with pytest.raises(ValidationError): meta.pointing = 2.0 input_bad = input.copy() input_bad["bad"] = position with pytest.raises(ValueError): MapDatasetMetaData(**input_bad) def test_mapdataset_metadata_lists(): obs_info_input1 = { "telescope": "cta-north", "instrument": "lst", "observation_mode": "wobble", "obs_id": 111, } obs_info_input2 = { "telescope": "cta-north", "instrument": "lst", "observation_mode": "wobble", "obs_id": 112, } input = { "obs_info": [ ObsInfoMetaData(**obs_info_input1), ObsInfoMetaData(**obs_info_input2), ], "pointing": [ PointingInfoMetaData( radec_mean=SkyCoord(83.6287, 22.0147, unit="deg", frame="icrs") ), PointingInfoMetaData( radec_mean=SkyCoord(83.1287, 22.5147, unit="deg", frame="icrs") ), ], } meta = MapDatasetMetaData(**input) assert meta.obs_info[0].telescope == "cta-north" assert meta.obs_info[0].instrument == "lst" assert meta.obs_info[0].observation_mode == "wobble" assert_allclose(meta.pointing[0].radec_mean.dec.value, 22.0147) assert_allclose(meta.pointing[1].radec_mean.ra.deg, 83.1287) assert meta.obs_info[0].obs_id == 111 assert meta.obs_info[1].obs_id == 112 assert meta.optional is None assert meta.event_type is None def test_mapdataset_metadata_stack(): input1 = { "obs_info": ObsInfoMetaData(**{"obs_id": 111}), "pointing": PointingInfoMetaData( radec_mean=SkyCoord(83.6287, 22.5147, unit="deg", frame="icrs") ), "optional": dict(test=0.5, other=True), } input2 = { "obs_info": ObsInfoMetaData(**{"instrument": "H.E.S.S.", "obs_id": 112}), "pointing": PointingInfoMetaData( radec_mean=SkyCoord(83.6287, 22.0147, unit="deg", frame="icrs") ), "optional": dict(test=0.1, other=False), } meta1 = MapDatasetMetaData(**input1) meta2 = MapDatasetMetaData(**input2) meta = meta1.stack(meta2) assert meta.creation.creator.split()[0] == "Gammapy" assert meta.obs_info is None

gammapy-1.3/gammapy/datasets/tests/test_simulate.py

# Licensed under a 3-clause BSD style license - see
LICENSE.rst import logging import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import EarthLocation, SkyCoord from astropy.io import fits from astropy.table import Table from astropy.time import Time from gammapy.data import GTI, DataStore, Observation, ObservationsEventsSampler from gammapy.data.pointing import FixedPointingInfo from gammapy.datasets import MapDataset, MapDatasetEventSampler from gammapy.datasets.tests.test_map import get_map_dataset from gammapy.irf import load_irf_dict_from_file from gammapy.makers import MapDatasetMaker from gammapy.maps import MapAxis, RegionGeom, RegionNDMap, WcsGeom from gammapy.modeling.models import ( ConstantSpectralModel, FoVBackgroundModel, GaussianSpatialModel, LightCurveTemplateTemporalModel, Models, PointSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.testing import requires_data LOCATION = EarthLocation(lon="-70d18m58.84s", lat="-24d41m0.34s", height="2000m") @pytest.fixture() def signal_model(): spatial_model = GaussianSpatialModel( lon_0="0 deg", lat_0="0 deg", sigma="0.2 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") t_max = 1000 * u.s time = np.arange(t_max.value) * u.s tau = u.Quantity("2e2 s") norm = np.exp(-time / tau) table = Table() table["TIME"] = time table["NORM"] = norm / norm.max() t_ref = Time("2000-01-01") table.meta = dict(MJDREFI=t_ref.mjd, MJDREFF=0, TIMEUNIT="s", TIMESYS="utc") temporal_model = LightCurveTemplateTemporalModel.from_table(table) return SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, temporal_model=temporal_model, name="test-source", ) @pytest.fixture() def models(signal_model): bkg_model = FoVBackgroundModel(dataset_name="test") return [signal_model, bkg_model] @pytest.fixture() def model_alternative(): spatial_model1 = GaussianSpatialModel( lon_0="0 deg", lat_0="0 deg", sigma="0.2 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") mod1 = SkyModel( spatial_model=spatial_model1, spectral_model=spectral_model, name="test-source", ) spatial_model2 = GaussianSpatialModel( lon_0="0.5 deg", lat_0="0.5 deg", sigma="0.2 deg", frame="galactic" ) mod2 = SkyModel( spatial_model=spatial_model2, spectral_model=spectral_model, name="test-source2", ) spatial_model3 = GaussianSpatialModel( lon_0="0.5 deg", lat_0="0.0 deg", sigma="0.2 deg", frame="galactic" ) mod3 = SkyModel( spatial_model=spatial_model3, spectral_model=spectral_model, name="test-source3", ) bkg_model = FoVBackgroundModel(dataset_name="test") model2 = Models([mod1, mod2, bkg_model, mod3]) return model2 def get_energy_dependent_temporal_model(): energy_axis = MapAxis.from_energy_bounds( energy_min=1 * u.TeV, energy_max=10 * u.TeV, nbin=10, name="energy" ) edges = np.linspace(0, 1000, 101) * u.s time_axis = MapAxis.from_edges(edges=edges, name="time", interp="lin") geom = RegionGeom.create( region=SkyCoord("0.5 deg", "0.5 deg", frame="galactic"), axes=[energy_axis, time_axis], ) coords = geom.get_coord(sparse=True) def f(time, energy): """Toy model for testing""" amplitude = u.Quantity("5e-11 cm-2 s-1 TeV-1") * np.exp(-(time / (200 * u.s))) index = 1 + 2.2 * time.to_value("s") / 1000 return PowerLawSpectralModel.evaluate( amplitude=amplitude, index=index, energy=energy, reference=1 * u.TeV ) data = f(time=coords["time"], energy=coords["energy"]) m = RegionNDMap.from_geom(data=data.value, geom=geom, unit=data.unit) t_ref = Time(51544.00074287037, 
format="mjd", scale="tt") return LightCurveTemplateTemporalModel(m, t_ref=t_ref) @pytest.fixture() @requires_data() def energy_dependent_temporal_sky_model(models): models[0].spatial_model = PointSpatialModel( lon_0="0 deg", lat_0="0 deg", frame="galactic" ) models[0].spectral_model = ConstantSpectralModel(const="1 cm-2 s-1 TeV-1") models[0].temporal_model = get_energy_dependent_temporal_model() return models[0] @pytest.fixture(scope="session") def dataset(): energy_axis = MapAxis.from_bounds( 1, 10, nbin=3, unit="TeV", name="energy", interp="log" ) geom = WcsGeom.create( skydir=(0, 0), binsz=0.05, width="5 deg", frame="galactic", axes=[energy_axis] ) etrue_axis = energy_axis.copy(name="energy_true") geom_true = geom.to_image().to_cube(axes=[etrue_axis]) dataset = get_map_dataset( geom=geom, geom_etrue=geom_true, edisp="edispmap", name="test" ) dataset.background /= 400 dataset.gti = GTI.create( start=0 * u.s, stop=1000 * u.s, reference_time=Time("2000-01-01").tt ) return dataset @requires_data() def test_evaluate_timevar_source(energy_dependent_temporal_sky_model, dataset): dataset.models = energy_dependent_temporal_sky_model evaluator = dataset.evaluators["test-source"] sampler = MapDatasetEventSampler(random_state=0) npred = sampler._evaluate_timevar_source(dataset, evaluator.model) assert_allclose(np.shape(npred.data), (3, 1999, 1, 1)) assert_allclose(npred.data[:, 10, 0, 0], [0.819546, 1.676989, 2.248684], rtol=2e-4) assert_allclose(npred.data[:, 50, 0, 0], [0.729098, 1.442345, 1.869792], rtol=2e-4) filename = "$GAMMAPY_DATA/gravitational_waves/GW_example_DC_map_file.fits.gz" temporal_model = LightCurveTemplateTemporalModel.read(filename, format="map") temporal_model.t_ref.value = 51544.00074287037 dataset.models[0].temporal_model = temporal_model evaluator = dataset.evaluators["test-source"] sampler = MapDatasetEventSampler(random_state=0) npred = sampler._evaluate_timevar_source(dataset, evaluator.model) assert_allclose( npred.data[:, 1000, 0, 0] / 1e-13, [0.038113, 0.033879, 0.010938], rtol=2e-4, ) @requires_data() def test_sample_coord_time_energy_no_spatial( dataset, energy_dependent_temporal_sky_model ): sampler = MapDatasetEventSampler(random_state=0) energy_dependent_temporal_sky_model.spatial_model = None energy_dependent_temporal_sky_model.spectral_model = ConstantSpectralModel( const="1 cm-2 s-1 TeV-1" ) dataset.models = energy_dependent_temporal_sky_model evaluator = dataset.evaluators["test-source"] with pytest.raises(TypeError): sampler._sample_coord_time_energy(dataset, evaluator.model) @requires_data() def test_sample_coord_time_energy_gaussian( dataset, energy_dependent_temporal_sky_model ): sampler = MapDatasetEventSampler(random_state=0) energy_dependent_temporal_sky_model.spatial_model = GaussianSpatialModel() energy_dependent_temporal_sky_model.spectral_model = ConstantSpectralModel( const="1 cm-2 s-1 TeV-1" ) dataset.models = energy_dependent_temporal_sky_model evaluator = dataset.evaluators["test-source"] with pytest.raises(TypeError): sampler._sample_coord_time_energy(dataset, evaluator.model) @requires_data() def test_sample_coord_time_energy(dataset, energy_dependent_temporal_sky_model): sampler = MapDatasetEventSampler(random_state=1) energy_dependent_temporal_sky_model.spatial_model = PointSpatialModel( lon_0="0 deg", lat_0="0 deg", frame="galactic" ) energy_dependent_temporal_sky_model.spectral_model.const.value = 2 dataset.models = energy_dependent_temporal_sky_model evaluator = dataset.evaluators["test-source"] expected = np.array([854.26361, 7.840697, 
266.404988, -28.936178]) events = sampler._sample_coord_time_energy(dataset, evaluator.model) assert len(events) == 2514 assert_allclose( [events[0][0], events[0][1], events[0][2], events[0][3]], expected, rtol=1e-6, ) irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 1.0 * u.hr pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) new_mod = Models( [ energy_dependent_temporal_sky_model, FoVBackgroundModel(dataset_name=dataset.name), ] ) dataset.models = new_mod events = sampler.run(dataset, obs) assert dataset.gti.time_ref.scale == events.table.meta["TIMESYS"] @requires_data() def test_fail_sample_coord_time_energy( dataset, models, energy_dependent_temporal_sky_model ): new_dataset = dataset.copy("my-dataset") new_dataset.gti = GTI.create( start=0 * u.s, stop=2.5 * u.s, reference_time=Time("2000-01-01").tt ) new_dataset.models = energy_dependent_temporal_sky_model new_dataset.models[0].temporal_model.map.data *= 1e20 evaluator = new_dataset.evaluators["test-source"] sampler = MapDatasetEventSampler(random_state=0, oversample_energy_factor=1) with pytest.raises(ValueError): sampler._sample_coord_time_energy(new_dataset, evaluator.model) @requires_data() def test_negative_npred(dataset): spatial_model = PointSpatialModel(lon_0="0 deg", lat_0="0 deg", frame="galactic") spectral_model = PowerLawSpectralModel(amplitude="-4e-10 cm-2 s-1 TeV-1") dataset.models = [ SkyModel(spectral_model=spectral_model, spatial_model=spatial_model), FoVBackgroundModel(dataset_name=dataset.name), ] sampler = MapDatasetEventSampler(random_state=0, n_event_bunch=1000) events = sampler.run(dataset=dataset) assert len(events.table) == 15 @requires_data() def test_sample_coord_time_energy_random_seed( dataset, energy_dependent_temporal_sky_model ): sampler = MapDatasetEventSampler(random_state=2) energy_dependent_temporal_sky_model.temporal_model.map._unit == "" dataset.models = energy_dependent_temporal_sky_model evaluator = dataset.evaluators["test-source"] events = sampler._sample_coord_time_energy(dataset, evaluator.model) assert len(events) == 1256 assert_allclose( [events[0][1], events[0][2], events[0][3]], [1.932196, 266.404988, -28.936178], rtol=1e-3, ) # Important: do not increase the tolerance! assert_allclose( events[0][0], 0.29982, rtol=1.5e-6, ) @requires_data() def test_sample_coord_time_energy_unit(dataset, energy_dependent_temporal_sky_model): sampler = MapDatasetEventSampler(random_state=1) energy_dependent_temporal_sky_model.temporal_model.map._unit == "cm-2 s-1 TeV-1" energy_dependent_temporal_sky_model.spectral_model.parameters[0].unit = "" dataset.models = energy_dependent_temporal_sky_model evaluator = dataset.evaluators["test-source"] events = sampler._sample_coord_time_energy(dataset, evaluator.model) assert len(events) == 1254 assert_allclose( [events[0][1], events[0][2], events[0][3]], [6.22904, 266.404988, -28.936178], rtol=1e-6, ) # Important: do not increase the tolerance! 
assert_allclose( events[0][0], 854.10859, rtol=1.5e-6, ) @requires_data() def test_mde_sample_sources(dataset, models): dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.sample_sources(dataset=dataset) assert len(events.table["ENERGY_TRUE"]) == 90 assert_allclose(events.table["ENERGY_TRUE"][0], 2.383778805, rtol=1e-5) assert events.table["ENERGY_TRUE"].unit == "TeV" assert_allclose(events.table["RA_TRUE"][0], 266.56408893, rtol=1e-5) assert events.table["RA_TRUE"].unit == "deg" assert_allclose(events.table["DEC_TRUE"][0], -28.748145, rtol=1e-5) assert events.table["DEC_TRUE"].unit == "deg" assert_allclose(events.table["TIME"][0], 120.37471, rtol=1e-5) assert events.table["TIME"].unit == "s" assert_allclose(events.table["MC_ID"][0], 1, rtol=1e-5) @requires_data() def test_mde_sample_sources_psf_update(dataset, models): dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.sample_sources(dataset=dataset, psf_update=False) assert len(events.table["ENERGY_TRUE"]) == 90 @requires_data() def test_sample_sources_energy_dependent(dataset, energy_dependent_temporal_sky_model): dataset.models = energy_dependent_temporal_sky_model sampler = MapDatasetEventSampler(random_state=0) events = sampler.sample_sources(dataset=dataset) assert len(events.table["ENERGY_TRUE"]) == 1268 assert_allclose(events.table["ENERGY_TRUE"][0], 6.456526, rtol=1e-5) assert_allclose(events.table["RA_TRUE"][0], 266.404988, rtol=1e-5) assert_allclose(events.table["DEC_TRUE"][0], -28.936178, rtol=1e-5) assert_allclose(events.table["TIME"][0], 95.464699, rtol=1e-5) dt = np.max(events.table["TIME"]) - np.min(events.table["TIME"]) assert dt <= dataset.gti.time_sum.to("s").value + sampler.t_delta.to("s").value @requires_data() def test_mde_sample_weak_src(dataset, models): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 10.0 * u.hr pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) models[0].parameters["amplitude"].value = 1e-25 dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset=dataset, observation=obs) assert len(events.table) == 18 assert_allclose( len(np.where(events.table["MC_ID"] == 0)[0]), len(events.table), rtol=1e-5 ) @requires_data() def test_mde_sample_background(dataset, models): dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.sample_background(dataset=dataset) assert len(events.table) == 15 assert_allclose(np.mean(events.table["ENERGY"]), 4.1, rtol=0.1) assert events.table["ENERGY"].unit == "TeV" assert np.all(events.table["RA"] < 269.95) assert np.all(events.table["RA"] > 262.90) assert events.table["RA"].unit == "deg" assert np.all(events.table["DEC"] < -25.43) assert np.all(events.table["DEC"] > -32.43) assert events.table["DEC"].unit == "deg" assert events.table["DEC_TRUE"][0] == events.table["DEC"][0] assert_allclose(events.table["MC_ID"][0], 0, rtol=1e-5) @requires_data() def test_mde_sample_psf(dataset, models): dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.sample_sources(dataset=dataset) events = sampler.sample_psf(dataset.psf, events) assert len(events.table) == 90 assert_allclose(events.table["ENERGY_TRUE"][0], 2.38377880, rtol=1e-5) assert events.table["ENERGY_TRUE"].unit == "TeV" 
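    # sample_psf draws reconstructed positions from the PSF around the true coordinates, so RA/DEC (checked below) are smeared while the true energies above stay unchanged.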
assert_allclose(events.table["RA"][0], 266.542912, rtol=1e-5) assert events.table["RA"].unit == "deg" assert_allclose(events.table["DEC"][0], -28.78829, rtol=1e-5) assert events.table["DEC"].unit == "deg" @requires_data() def test_mde_sample_edisp(dataset, models): dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.sample_sources(dataset=dataset) events = sampler.sample_edisp(dataset.edisp, events) assert len(events.table) == 90 assert_allclose(events.table["ENERGY"][0], 2.383778805, rtol=1e-5) assert events.table["ENERGY"].unit == "TeV" assert_allclose(events.table["RA_TRUE"][0], 266.564088, rtol=1e-5) assert events.table["RA_TRUE"].unit == "deg" assert_allclose(events.table["DEC_TRUE"][0], -28.7481450, rtol=1e-5) assert events.table["DEC_TRUE"].unit == "deg" assert_allclose(events.table["MC_ID"][0], 1, rtol=1e-5) @requires_data() def test_event_det_coords(dataset, models): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 1.0 * u.hr pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset=dataset, observation=obs) assert len(events.table) == 99 # Check that det coordinates are within sqrt(2) * width of dataset assert np.all(events.table["DETX"] < 3.53) assert np.all(events.table["DETX"] > -3.53) assert events.table["DETX"].unit == "deg" assert np.all(events.table["DETY"] < 3.53) assert np.all(events.table["DETY"] > -3.53) assert events.table["DETY"].unit == "deg" @requires_data() def test_mde_run(dataset, models, caplog, tmp_path): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 1.0 * u.hr pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) dataset.models = models logging.getLogger().setLevel(logging.INFO) sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset=dataset, observation=obs) captured = caplog.records str = [ "Evaluating model: test-source", "Evaluating background...", "Event sampling completed.", ] assert captured[1].message == str[0] assert captured[2].message == str[1] assert captured[4].message == str[2] dataset_bkg = dataset.copy(name="new-dataset") dataset_bkg.models = [FoVBackgroundModel(dataset_name=dataset_bkg.name)] events_bkg = sampler.run(dataset=dataset_bkg, observation=obs) # assert len(events.table) == 99 assert_allclose(np.mean(events.table["ENERGY"]), 3.5, atol=1.0) assert np.all(events.table["ENERGY"] > 1) src_events = events.select_parameter("MC_ID", [0.5, 1.5]) assert_allclose(np.mean(src_events.table["RA"]), 266.40, atol=0.1) assert_allclose(np.mean(src_events.table["DEC"]), -28.93, atol=0.1) separation = src_events.radec.separation(models[0].spatial_model.position) assert np.all(separation.to_value("deg") < 0.7) assert len(events_bkg.table) == 21 assert_allclose(events_bkg.table["MC_ID"][0], 0, rtol=1e-5) meta = events.table.meta assert meta["HDUCLAS1"] == "EVENTS" assert meta["EXTNAME"] == "EVENTS" assert ( meta["HDUDOC"] == "https://github.com/open-gamma-ray-astro/gamma-astro-data-formats" ) assert meta["HDUVERS"] == "0.2" assert meta["HDUCLASS"] == "GADF" assert 
meta["OBS_ID"] == 1001 assert_allclose(meta["TSTART"], 0.0) assert_allclose(meta["TSTOP"], 3600.0) assert_allclose(meta["ONTIME"], 3600.0) assert_allclose(meta["LIVETIME"], 3600.0) assert_allclose(meta["DEADC"], 1.0) assert_allclose(meta["RA_PNT"], 266.4049882865447) assert_allclose(meta["DEC_PNT"], -28.936177761791473) assert meta["EQUINOX"] == "J2000" assert meta["RADECSYS"] == "icrs" assert "Gammapy" in meta["CREATOR"] assert meta["EUNIT"] == "TeV" assert meta["EVTVER"] == "" assert meta["OBSERVER"] == "Gammapy user" assert meta["DSTYP1"] == "TIME" assert meta["DSUNI1"] == "s" assert meta["DSVAL1"] == "TABLE" assert meta["DSREF1"] == ":GTI" assert meta["DSTYP2"] == "ENERGY" assert meta["DSUNI2"] == "TeV" assert ":" in meta["DSVAL2"] assert meta["DSTYP3"] == "POS(RA,DEC) " assert "CIRCLE" in meta["DSVAL3"] assert meta["DSUNI3"] == "deg " assert meta["NDSKEYS"] == " 3 " assert_allclose(meta["RA_OBJ"], 266.4049882865447) assert_allclose(meta["DEC_OBJ"], -28.936177761791473) assert_allclose(meta["TELAPSE"], 1000.0) assert_allclose(meta["MJDREFI"], 51544) assert_allclose(meta["MJDREFF"], 0.0007428703684126958) assert meta["TIMEUNIT"] == "s" assert meta["TIMESYS"] == "tt" assert meta["TIMEREF"] == "LOCAL" assert meta["DATE-OBS"] == "2000-01-01" assert meta["DATE-END"] == "2000-01-01" assert meta["CONV_DEP"] == 0 assert meta["CONV_RA"] == 0 assert meta["CONV_DEC"] == 0 assert meta["MID00000"] == 0 assert meta["MMN00000"] == "test-bkg" assert meta["MID00001"] == 1 assert meta["NMCIDS"] == 2 assert_allclose(meta["ALT_PNT"], -13.5345076464, rtol=1e-7) assert_allclose(meta["AZ_PNT"], 228.82981620065763, rtol=1e-7) assert meta["ORIGIN"] == "Gammapy" assert meta["TELESCOP"] == "CTA" assert meta["INSTRUME"] == "1DC" assert meta["N_TELS"] == "" assert meta["TELLIST"] == "" # test writing out and reading back in works obs.events = events path = tmp_path / "obs.fits.gz" obs.write(path) obs_back = Observation.read(path) assert u.isclose(obs_back.observatory_earth_location.lon, LOCATION.lon) assert u.isclose(obs_back.observatory_earth_location.lat, LOCATION.lat) assert u.isclose(obs_back.observatory_earth_location.height, LOCATION.height) @requires_data() def test_irf_alpha_config(dataset, models): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-caldb/Prod5-South-20deg-AverageAz-14MSTs37SSTs.180000s-v0.1.fits.gz" ) livetime = 1.0 * u.hr pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset=dataset, observation=obs) assert events is not None @requires_data() def test_mde_run_switchoff(dataset, models): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 1.0 * u.hr pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) dataset.models = models dataset.psf = None dataset.edisp = None dataset.background = None sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset=dataset, observation=obs) assert len(events.table) == 90 assert_allclose(events.table["ENERGY"][0], 1.947042, rtol=1e-5) assert_allclose(events.table["RA"][0], 266.875015, rtol=1e-5) assert_allclose(events.table["DEC"][0], -29.115063, rtol=1e-5) meta = 
events.table.meta assert meta["RA_PNT"] == 266.4049882865447 assert_allclose(meta["ONTIME"], 3600.0) assert meta["OBS_ID"] == 1001 assert meta["RADECSYS"] == "icrs" @requires_data() def test_events_datastore(tmp_path, dataset, models): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 10.0 * u.hr pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0, 0, unit="deg", frame="galactic").icrs, ) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) dataset.models = models sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset=dataset, observation=obs) primary_hdu = fits.PrimaryHDU() hdu_evt = fits.BinTableHDU(events.table) hdu_gti = dataset.gti.to_table_hdu(format="gadf") hdu_all = fits.HDUList([primary_hdu, hdu_evt, hdu_gti]) hdu_all.writeto(str(tmp_path / "events.fits")) DataStore.from_events_files([str(tmp_path / "events.fits")]) @requires_data() def test_MC_ID(model_alternative): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 0.1 * u.hr skydir = SkyCoord(0, 0, unit="deg", frame="galactic") pointing = FixedPointingInfo(fixed_icrs=skydir.icrs) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) energy_axis = MapAxis.from_energy_bounds( "1.0 TeV", "10 TeV", nbin=10, per_decade=True ) energy_axis_true = MapAxis.from_energy_bounds( "0.5 TeV", "20 TeV", nbin=20, per_decade=True, name="energy_true" ) migra_axis = MapAxis.from_bounds(0.5, 2, nbin=150, node_type="edges", name="migra") geom = WcsGeom.create( skydir=skydir, width=(2, 2), binsz=0.06, frame="icrs", axes=[energy_axis], ) empty = MapDataset.create( geom, energy_axis_true=energy_axis_true, migra_axis=migra_axis, name="test", ) maker = MapDatasetMaker(selection=["exposure", "background", "psf", "edisp"]) dataset = maker.run(empty, obs) dataset.models = model_alternative sampler = MapDatasetEventSampler(random_state=0) events = sampler.run(dataset=dataset, observation=obs) assert len(events.table) == 215 assert len(np.where(events.table["MC_ID"] == 0)[0]) == 40 meta = events.table.meta assert meta["MID00000"] == 0 assert meta["MMN00000"] == "test-bkg" assert meta["MID00001"] == 1 assert meta["MID00002"] == 2 assert meta["MID00003"] == 3 assert meta["NMCIDS"] == 4 @requires_data() def test_MC_ID_NMCID(model_alternative): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 0.1 * u.hr skydir = SkyCoord(0, 0, unit="deg", frame="galactic") pointing = FixedPointingInfo(fixed_icrs=skydir.icrs) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) energy_axis = MapAxis.from_energy_bounds( "1.0 TeV", "10 TeV", nbin=10, per_decade=True ) energy_axis_true = MapAxis.from_energy_bounds( "0.5 TeV", "20 TeV", nbin=20, per_decade=True, name="energy_true" ) migra_axis = MapAxis.from_bounds(0.5, 2, nbin=150, node_type="edges", name="migra") geom = WcsGeom.create( skydir=skydir, width=(2, 2), binsz=0.06, frame="icrs", axes=[energy_axis], ) empty = MapDataset.create( geom, energy_axis_true=energy_axis_true, migra_axis=migra_axis, name="test", ) maker = MapDatasetMaker(selection=["exposure", "background", "psf", "edisp"]) dataset = maker.run(empty, obs) model_alternative[0].spectral_model.parameters["amplitude"].value = 1e-16 dataset.models = model_alternative sampler = 
MapDatasetEventSampler(random_state=0) events = sampler.run(dataset=dataset, observation=obs) assert len(events.table) == 47 assert len(np.where(events.table["MC_ID"] == 0)[0]) == 47 meta = events.table.meta assert meta["MID00000"] == 0 assert meta["MMN00000"] == "test-bkg" assert meta["MID00001"] == 1 assert meta["MID00002"] == 2 assert meta["MID00003"] == 3 assert meta["NMCIDS"] == 4 @requires_data() def test_MC_ID_flag(model_alternative): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) livetime = 0.1 * u.hr skydir = SkyCoord(0, 0, unit="deg", frame="galactic") pointing = FixedPointingInfo(fixed_icrs=skydir.icrs) obs = Observation.create( obs_id=1001, pointing=pointing, livetime=livetime, irfs=irfs, location=LOCATION, ) energy_axis = MapAxis.from_energy_bounds( "1.0 TeV", "10 TeV", nbin=10, per_decade=True ) energy_axis_true = MapAxis.from_energy_bounds( "0.5 TeV", "20 TeV", nbin=20, per_decade=True, name="energy_true" ) migra_axis = MapAxis.from_bounds(0.5, 2, nbin=150, node_type="edges", name="migra") geom = WcsGeom.create( skydir=skydir, width=(2, 2), binsz=0.06, frame="icrs", axes=[energy_axis], ) empty = MapDataset.create( geom, energy_axis_true=energy_axis_true, migra_axis=migra_axis, name="test", ) maker = MapDatasetMaker(selection=["exposure", "background", "psf", "edisp"]) dataset = maker.run(empty, obs) model_alternative[0].spectral_model.parameters["amplitude"].value = 1e-16 dataset.models = model_alternative sampler = MapDatasetEventSampler(random_state=0, keep_mc_id=False) events = sampler.run(dataset=dataset, observation=obs) meta = events.table.meta assert len(events.table) == 47 assert "MC_ID" not in events.table.colnames assert "MID00000" not in meta.keys() assert "MMN00000" not in meta.keys() assert "MID00001" not in meta.keys() assert "MID00002" not in meta.keys() assert "MID00003" not in meta.keys() assert "NMCIDS" not in meta.keys() @requires_data() def test_bunch_event_number_sample_sources(dataset): spatial_model = GaussianSpatialModel( lon_0="0 deg", lat_0="0 deg", sigma="0.2 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel(amplitude="4e-10 cm-2 s-1 TeV-1") dataset.models = [ SkyModel(spectral_model=spectral_model, spatial_model=spatial_model), FoVBackgroundModel(dataset_name=dataset.name), ] sampler = MapDatasetEventSampler(random_state=0, n_event_bunch=1000) events = sampler.run(dataset=dataset) assert len(events.table) == 24128 @requires_data() def test_sort_evt_by_time(dataset): spatial_model = GaussianSpatialModel( lon_0="0 deg", lat_0="0 deg", sigma="0.2 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel(amplitude="4e-10 cm-2 s-1 TeV-1") dataset.models = [ SkyModel(spectral_model=spectral_model, spatial_model=spatial_model), FoVBackgroundModel(dataset_name=dataset.name), ] sampler = MapDatasetEventSampler(random_state=0, n_event_bunch=1000) events = sampler.run(dataset=dataset) dt = np.diff(events.table["TIME"]) assert np.all(dt >= 0) @requires_data() def test_observation_event_sampler(signal_model, tmp_path): from gammapy.datasets.simulate import ObservationEventSampler datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs = datastore.get_observations()[0] # first test defaults with HESS # otherwise with CTA the EdispMap computation takes too much time and memory maker = ObservationEventSampler() sim_obs = maker.run(obs, None) assert sim_obs.events is not None assert len(sim_obs.events.table) > 0 irfs = load_irf_dict_from_file( 
"$GAMMAPY_DATA/cta-caldb/Prod5-South-20deg-AverageAz-14MSTs37SSTs.180000s-v0.1.fits.gz" ) pointing = FixedPointingInfo( fixed_icrs=SkyCoord(83.63311446, 22.01448714, unit="deg", frame="icrs"), ) time_start = Time("2021-11-20T03:00:00") time_stop = Time("2021-11-20T03:30:00") obs = Observation.create( pointing=pointing, location=LOCATION, obs_id=1, tstart=time_start, tstop=time_stop, irfs=irfs, deadtime_fraction=0.01, ) dataset_kwargs = dict( spatial_width=5 * u.deg, spatial_bin_size=0.01 * u.deg, energy_axis=MapAxis.from_energy_bounds( 10 * u.GeV, 100 * u.TeV, nbin=5, per_decade=True ), energy_axis_true=MapAxis.from_energy_bounds( 10 * u.GeV, 100 * u.TeV, nbin=5, per_decade=True, name="energy_true" ), ) maker = ObservationEventSampler(dataset_kwargs=dataset_kwargs) sim_obs = maker.run(obs, [signal_model]) assert sim_obs.events is not None assert len(sim_obs.events.table) > 0 @pytest.fixture(scope="session") def observations(): pointing = FixedPointingInfo(fixed_icrs=SkyCoord(0 * u.deg, 0 * u.deg)) livetime = 0.5 * u.hr irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-caldb/Prod5-South-20deg-AverageAz-14MSTs37SSTs.180000s-v0.1.fits.gz" ) observations = [ Observation.create( obs_id=100 + k, pointing=pointing, livetime=livetime, irfs=irfs ) for k in range(2) ] return observations @pytest.fixture(scope="session") def models_list(): spectral_model_pwl = PowerLawSpectralModel( index=2, amplitude="1e-12 TeV-1 cm-2 s-1", reference="1 TeV" ) spatial_model_point = PointSpatialModel( lon_0="0 deg", lat_0="0.0 deg", frame="galactic" ) sky_model_pntpwl = SkyModel( spectral_model=spectral_model_pwl, spatial_model=spatial_model_point, name="point-pwl", ) models = Models(sky_model_pntpwl) return models @requires_data() def test_observations_events_sampler(tmpdir, observations): sampler_kwargs = dict(random_state=0) dataset_kwargs = dict( spatial_bin_size_min=0.1 * u.deg, spatial_width_max=0.2 * u.deg, energy_bin_per_decade_max=2, ) sampler = ObservationsEventsSampler( sampler_kwargs=sampler_kwargs, dataset_kwargs=dataset_kwargs, n_jobs=1, outdir=tmpdir, overwrite=True, ) sampler.run(observations, models=None) @requires_data() def test_observations_events_sampler_time( tmpdir, observations, energy_dependent_temporal_sky_model ): models = Models(energy_dependent_temporal_sky_model) sampler_kwargs = dict(random_state=0) dataset_kwargs = dict( spatial_bin_size_min=0.1 * u.deg, spatial_width_max=0.2 * u.deg, energy_bin_per_decade_max=2, ) sampler = ObservationsEventsSampler( sampler_kwargs=sampler_kwargs, dataset_kwargs=dataset_kwargs, n_jobs=1, outdir=tmpdir, overwrite=True, ) sampler.run(observations, models=models) @requires_data() def test_observations_events_sampler_parallel(tmpdir, observations, models_list): sampler_kwargs = dict(random_state=0) dataset_kwargs = dict( spatial_bin_size_min=0.1 * u.deg, spatial_width_max=0.2 * u.deg, energy_bin_per_decade_max=2, ) sampler = ObservationsEventsSampler( sampler_kwargs=sampler_kwargs, dataset_kwargs=dataset_kwargs, n_jobs=2, outdir=tmpdir, overwrite=True, ) sampler.run(observations, models=models_list) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/tests/test_spectrum.py0000644000175100001770000012344314721316200022300 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal import astropy.units as u from astropy.io import fits from astropy.table import Table from 
astropy.time import Time from astropy.utils.exceptions import AstropyUserWarning from gammapy.data import GTI from gammapy.datasets import Datasets, SpectrumDataset, SpectrumDatasetOnOff from gammapy.irf import EDispKernelMap, EffectiveAreaTable2D from gammapy.makers.utils import make_map_exposure_true_energy from gammapy.maps import LabelMapAxis, MapAxis, RegionGeom, RegionNDMap, WcsGeom from gammapy.modeling import Fit from gammapy.modeling.models import ( ConstantSpectralModel, ExpCutoffPowerLawSpectralModel, Models, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.random import get_random_state from gammapy.utils.regions import compound_region_to_regions from gammapy.utils.testing import assert_time_allclose, mpl_plot_check, requires_data def test_data_shape(spectrum_dataset): assert spectrum_dataset.data_shape[0] == 30 def test_str(spectrum_dataset): assert "SpectrumDataset" in str(spectrum_dataset) def test_energy_range(spectrum_dataset): e_min, e_max = spectrum_dataset.energy_range assert e_min.unit == u.TeV assert e_max.unit == u.TeV assert_allclose(e_min, 0.1) assert_allclose(e_max, 10.0) def test_info_dict(spectrum_dataset): info_dict = spectrum_dataset.info_dict() assert_allclose(info_dict["counts"], 907010) assert_allclose(info_dict["background"], 3000.0) assert_allclose(info_dict["sqrt_ts"], 2924.522174) assert_allclose(info_dict["excess"], 904010) assert_allclose(info_dict["ontime"].value, 216000) assert info_dict["name"] == "test" def test_set_model(spectrum_dataset): spectrum_dataset = spectrum_dataset.copy() spectral_model = PowerLawSpectralModel() model = SkyModel(spectral_model=spectral_model, name="test") spectrum_dataset.models = model assert spectrum_dataset.models["test"] is model models = Models([model]) spectrum_dataset.models = models assert spectrum_dataset.models["test"] is model def test_spectrum_dataset_fits_io(spectrum_dataset, tmp_path): spectrum_dataset.meta_table = Table( data=[[1.0 * u.h], [111]], names=["livetime", "obs_id"] ) hdulist = spectrum_dataset.to_hdulist() actual = [hdu.name for hdu in hdulist] desired = [ "PRIMARY", "COUNTS", "COUNTS_BANDS", "COUNTS_REGION", "EXPOSURE", "EXPOSURE_BANDS", "EXPOSURE_REGION", "BACKGROUND", "BACKGROUND_BANDS", "BACKGROUND_REGION", "GTI", "META_TABLE", ] assert actual == desired spectrum_dataset.write(tmp_path / "test.fits") dataset_new = SpectrumDataset.read(tmp_path / "test.fits", name="test") assert_allclose(spectrum_dataset.counts.data, dataset_new.counts.data) assert_allclose( spectrum_dataset.npred_background().data, dataset_new.npred_background().data ) assert dataset_new.edisp is None assert dataset_new.edisp is None assert dataset_new.name == "test" assert_allclose(spectrum_dataset.exposure.data, dataset_new.exposure.data) assert spectrum_dataset.counts.geom == dataset_new.counts.geom assert_allclose(dataset_new.meta_table["obs_id"], 111) def test_npred_models(): e_reco = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) geom = RegionGeom(region=None, axes=[e_reco]) spectrum_dataset = SpectrumDataset.create(geom=geom) spectrum_dataset.exposure.quantity = 1e10 * u.Unit("cm2 h") pwl_1 = PowerLawSpectralModel(index=2) pwl_2 = PowerLawSpectralModel(index=2) model_1 = SkyModel(spectral_model=pwl_1) model_2 = SkyModel(spectral_model=pwl_2) spectrum_dataset.models = Models([model_1, model_2]) npred = spectrum_dataset.npred() assert_allclose(npred.data.sum(), 64.8) npred_sig = spectrum_dataset.npred_signal() assert_allclose(npred_sig.data.sum(), 64.8) npred_sig_model1 = 
spectrum_dataset.npred_signal(model_names=[model_1.name]) assert_allclose(npred_sig_model1.data.sum(), 32.4) assert_allclose( spectrum_dataset.npred_signal( model_names=[model_1.name, model_2.name] ).data.sum(), 64.8, ) npred_model1_not_stack = spectrum_dataset.npred_signal( model_names=[model_1.name], stack=False ) assert_allclose(npred_model1_not_stack.geom.data_shape, (1, 3, 1, 1)) assert_allclose(npred_model1_not_stack.data.sum(), 32.4) assert isinstance(npred_model1_not_stack.geom.axes[-1], LabelMapAxis) assert npred_model1_not_stack.geom.axes[-1].name == "models" assert_equal(npred_model1_not_stack.geom.axes[-1].center, [model_1.name]) npred_all_models_not_stack = spectrum_dataset.npred_signal( model_names=[model_1.name, model_2.name], stack=False ) assert_allclose(npred_all_models_not_stack.geom.data_shape, (2, 3, 1, 1)) assert_allclose( npred_all_models_not_stack.sum_over_axes(["models"]).data.sum(), 64.8 ) def test_npred_spatial_model(spectrum_dataset): model = SkyModel.create("pl", "gauss", name="test") spectrum_dataset.models = [model] npred = spectrum_dataset.npred() model.spatial_model.sigma.value = 1.0 npred_large_sigma = spectrum_dataset.npred() assert_allclose(npred.data.sum(), 3000) assert_allclose(npred_large_sigma.data.sum(), 3000) assert spectrum_dataset.evaluators["test"].psf is None def test_fit(spectrum_dataset): """Simple CASH fit to the on vector""" fit = Fit() result = fit.run(datasets=[spectrum_dataset]) assert result.success assert "minuit" in str(result) npred = spectrum_dataset.npred().data.sum() assert_allclose(npred, 907012.186399, rtol=1e-3) assert_allclose(result.total_stat, -18087404.624, rtol=1e-3) pars = spectrum_dataset.models.parameters assert_allclose(pars["index"].value, 2.1, rtol=1e-2) assert_allclose(pars["index"].error, 0.001276, rtol=1e-2) assert_allclose(pars["amplitude"].value, 1e5, rtol=1e-3) assert_allclose(pars["amplitude"].error, 153.450825, rtol=1e-2) def test_spectrum_dataset_create(): e_reco = MapAxis.from_edges(u.Quantity([0.1, 1, 10.0], "TeV"), name="energy") e_true = MapAxis.from_edges( u.Quantity([0.05, 0.5, 5, 20.0], "TeV"), name="energy_true" ) geom = RegionGeom(region=None, axes=[e_reco]) empty_spectrum_dataset = SpectrumDataset.create( geom, energy_axis_true=e_true, name="test" ) assert empty_spectrum_dataset.name == "test" assert empty_spectrum_dataset.counts.data.sum() == 0 assert empty_spectrum_dataset.data_shape[0] == 2 assert empty_spectrum_dataset.background.data.sum() == 0 assert empty_spectrum_dataset.background.geom.axes[0].nbin == 2 assert empty_spectrum_dataset.exposure.geom.axes[0].nbin == 3 assert empty_spectrum_dataset.edisp.edisp_map.geom.axes["energy"].nbin == 2 assert empty_spectrum_dataset.gti.time_sum.value == 0 assert len(empty_spectrum_dataset.gti.table) == 0 assert np.isnan(empty_spectrum_dataset.energy_range[0]) assert_allclose(empty_spectrum_dataset.mask_safe, 0) def test_spectrum_dataset_stack_diagonal_safe_mask(spectrum_dataset): geom = spectrum_dataset.counts.geom energy = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=30) energy_true = MapAxis.from_energy_bounds( "0.1 TeV", "10 TeV", nbin=30, name="energy_true" ) aeff = EffectiveAreaTable2D.from_parametrization( energy_axis_true=energy_true, instrument="HESS" ) livetime = 100 * u.s gti = GTI.create(start=0 * u.s, stop=livetime) geom_true = geom.as_energy_true exposure = make_map_exposure_true_energy( geom=geom_true, livetime=livetime, pointing=geom_true.center_skydir, aeff=aeff ) edisp = EDispKernelMap.from_diagonal_response( energy, 
energy_true, geom=geom.to_image() ) edisp.exposure_map.data = exposure.data[:, :, np.newaxis, :] background = spectrum_dataset.background mask_safe = RegionNDMap.from_geom(geom=geom, dtype=bool) mask_safe.data += True spectrum_dataset1 = SpectrumDataset( name="ds1", counts=spectrum_dataset.counts.copy(), exposure=exposure.copy(), edisp=edisp.copy(), background=background.copy(), gti=gti.copy(), mask_safe=mask_safe, ) livetime2 = 0.5 * livetime gti2 = GTI.create(start=200 * u.s, stop=200 * u.s + livetime2) bkg2 = RegionNDMap.from_geom(geom=geom, data=2 * background.data) geom = spectrum_dataset.counts.geom data = np.ones(spectrum_dataset.data_shape, dtype="bool") data[0] = False safe_mask2 = RegionNDMap.from_geom(geom=geom, data=data) exposure2 = exposure.copy() edisp = edisp.copy() edisp.exposure_map.data = exposure2.data[:, :, np.newaxis, :] spectrum_dataset2 = SpectrumDataset( name="ds2", counts=spectrum_dataset.counts.copy(), exposure=exposure2, edisp=edisp, background=bkg2, mask_safe=safe_mask2, gti=gti2, ) spectrum_dataset1.stack(spectrum_dataset2) reference = spectrum_dataset.counts.data assert_allclose(spectrum_dataset1.counts.data[1:], reference[1:] * 2) assert_allclose(spectrum_dataset1.counts.data[0], 141363) assert_allclose( spectrum_dataset1.exposure.quantity[0], 4.755644e09 * u.Unit("cm2 s") ) assert_allclose(spectrum_dataset1.background.data[1:], 3 * background.data[1:]) assert_allclose(spectrum_dataset1.background.data[0], background.data[0]) kernel = edisp.get_edisp_kernel() kernel_stacked = spectrum_dataset1.edisp.get_edisp_kernel() assert_allclose(kernel_stacked.pdf_matrix[1:], kernel.pdf_matrix[1:]) assert_allclose(kernel_stacked.pdf_matrix[0], 0.5 * kernel.pdf_matrix[0]) def test_spectrum_dataset_stack_nondiagonal_no_bkg(spectrum_dataset): energy = spectrum_dataset.counts.geom.axes["energy"] geom = spectrum_dataset.counts.geom edisp1 = EDispKernelMap.from_gauss( energy_axis=energy, energy_axis_true=energy.copy(name="energy_true"), sigma=0.1, bias=0, geom=geom.to_image(), ) edisp1.exposure_map.data += 1 aeff = EffectiveAreaTable2D.from_parametrization( energy_axis_true=energy.copy(name="energy_true"), instrument="HESS" ) livetime = 100 * u.s geom_true = geom.as_energy_true exposure = make_map_exposure_true_energy( geom=geom_true, livetime=livetime, pointing=geom_true.center_skydir, aeff=aeff ) geom = spectrum_dataset.counts.geom counts = RegionNDMap.from_geom(geom=geom) gti = GTI.create(start=0 * u.s, stop=livetime) spectrum_dataset1 = SpectrumDataset( counts=counts, exposure=exposure, edisp=edisp1, meta_table=Table({"OBS_ID": [0]}), gti=gti.copy(), ) edisp2 = EDispKernelMap.from_gauss( energy_axis=energy, energy_axis_true=energy.copy(name="energy_true"), sigma=0.2, bias=0.0, geom=geom, ) edisp2.exposure_map.data += 1 gti2 = GTI.create(start=100 * u.s, stop=200 * u.s) spectrum_dataset2 = SpectrumDataset( counts=counts, exposure=exposure.copy(), edisp=edisp2, meta_table=Table({"OBS_ID": [1]}), gti=gti2, ) spectrum_dataset1.stack(spectrum_dataset2) assert_allclose(spectrum_dataset1.meta_table["OBS_ID"][0], [0, 1]) assert spectrum_dataset1.background_model is None assert_allclose(spectrum_dataset1.gti.time_sum.to_value("s"), 200) assert_allclose( spectrum_dataset1.exposure.quantity[2].to_value("m2 s"), 1573851.079861 ) kernel = edisp1.get_edisp_kernel() assert_allclose(kernel.get_bias(1 * u.TeV), 0.0, atol=1.2e-3) assert_allclose(kernel.get_resolution(1 * u.TeV), 0.1581, atol=1e-2) def test_peek(spectrum_dataset): with mpl_plot_check(): spectrum_dataset.peek() with 
mpl_plot_check(): spectrum_dataset.plot_fit() spectrum_dataset.edisp = None with mpl_plot_check(): spectrum_dataset.peek() class TestSpectrumOnOff: """Test ON OFF SpectrumDataset""" def setup_method(self): etrue = np.logspace(-1, 1, 10) * u.TeV self.e_true = MapAxis.from_energy_edges(etrue, name="energy_true") ereco = np.logspace(-1, 1, 5) * u.TeV elo = ereco[:-1] self.e_reco = MapAxis.from_energy_edges(ereco, name="energy") start = u.Quantity([0], "s") stop = u.Quantity([1000], "s") time_ref = Time("2010-01-01 00:00:00.0") self.gti = GTI.create(start, stop, time_ref) self.livetime = self.gti.time_sum self.wcs = WcsGeom.create(npix=300, binsz=0.01, frame="icrs").wcs self.aeff = RegionNDMap.create( region="icrs;circle(0.,1.,0.1)", wcs=self.wcs, axes=[self.e_true], unit="cm2", ) self.aeff.data += 1 data = np.ones(elo.shape) data[-1] = 0 # to test stats calculation with empty bins axis = MapAxis.from_edges(ereco, name="energy", interp="log") self.on_counts = RegionNDMap.create( region="icrs;circle(0.,1.,0.1)", wcs=self.wcs, axes=[axis], meta={"EXPOSURE": self.livetime.to_value("s")}, ) self.on_counts.data += 1 self.on_counts.data[-1] = 0 self.off_counts = RegionNDMap.create( region="icrs;box(0.,1.,0.1, 0.2,30);box(-1.,-1.,0.1, 0.2,150)", wcs=self.wcs, axes=[axis], ) self.off_counts.data += 10 acceptance = RegionNDMap.from_geom(self.on_counts.geom) acceptance.data += 1 data = np.ones(elo.shape) data[-1] = 0 acceptance_off = RegionNDMap.from_geom(self.off_counts.geom) acceptance_off.data += 10 self.edisp = EDispKernelMap.from_diagonal_response( self.e_reco, self.e_true, self.on_counts.geom.to_image() ) exposure = self.aeff * self.livetime exposure.meta["livetime"] = self.livetime mask_safe = RegionNDMap.from_geom(self.on_counts.geom, dtype=bool) mask_safe.data += True self.dataset = SpectrumDatasetOnOff( counts=self.on_counts, counts_off=self.off_counts, exposure=exposure, edisp=self.edisp, acceptance=acceptance, acceptance_off=acceptance_off, name="test", gti=self.gti, mask_safe=mask_safe, ) def test_spectrum_dataset_on_off_create(self): e_reco = MapAxis.from_edges(u.Quantity([0.1, 1, 10.0], "TeV"), name="energy") e_true = MapAxis.from_edges( u.Quantity([0.05, 0.5, 5, 20.0], "TeV"), name="energy_true" ) geom = RegionGeom(region=None, axes=[e_reco]) empty_dataset = SpectrumDatasetOnOff.create(geom=geom, energy_axis_true=e_true) assert empty_dataset.counts.data.sum() == 0 assert empty_dataset.data_shape[0] == 2 assert empty_dataset.counts_off.data.sum() == 0 assert empty_dataset.counts_off.geom.axes[0].nbin == 2 assert_allclose(empty_dataset.acceptance_off, 0) assert_allclose(empty_dataset.acceptance, 0) assert empty_dataset.acceptance.data.shape[0] == 2 assert empty_dataset.acceptance_off.data.shape[0] == 2 assert empty_dataset.gti.time_sum.value == 0 assert len(empty_dataset.gti.table) == 0 assert np.isnan(empty_dataset.energy_range[0]) def test_create_stack(self): geom = RegionGeom(region=None, axes=[self.e_reco]) stacked = SpectrumDatasetOnOff.create(geom=geom, energy_axis_true=self.e_true) stacked.mask_safe.data += True stacked.stack(self.dataset) e_min_stacked, e_max_stacked = stacked.energy_range e_min_dataset, e_max_dataset = self.dataset.energy_range assert_allclose(e_min_stacked, e_min_dataset) assert_allclose(e_max_stacked, e_max_dataset) def test_alpha(self): assert self.dataset.alpha.data.shape == (4, 1, 1) assert_allclose(self.dataset.alpha.data, 0.1) def test_npred_no_edisp(self): const = 1 * u.Unit("cm-2 s-1 TeV-1") model = 
SkyModel(spectral_model=ConstantSpectralModel(const=const)) livetime = 1 * u.s aeff = RegionNDMap.create( region=self.aeff.geom.region, unit="cm2", axes=[self.e_reco.copy(name="energy_true")], ) aeff.data += 1 dataset = SpectrumDatasetOnOff( counts=self.on_counts, counts_off=self.off_counts, exposure=aeff * livetime, models=model, ) energy = aeff.geom.axes[0].edges expected = aeff.data[0] * (energy[-1] - energy[0]) * const * livetime assert_allclose(dataset.npred_signal().data.sum(), expected.value) def test_to_spectrum_dataset(self): ds = self.dataset.to_spectrum_dataset() assert isinstance(ds, SpectrumDataset) assert_allclose(ds.background.data.sum(), 4) def test_peek(self): dataset = self.dataset.copy() dataset.models = SkyModel(spectral_model=PowerLawSpectralModel()) with mpl_plot_check(): dataset.peek() def test_plot_fit(self): dataset = self.dataset.copy() dataset.models = SkyModel(spectral_model=PowerLawSpectralModel()) with mpl_plot_check(): dataset.plot_fit() def test_to_from_ogip_files(self, tmp_path): dataset = self.dataset.copy(name="test") dataset.write(tmp_path / "test.fits") newdataset = SpectrumDatasetOnOff.read(tmp_path / "test.fits") expected_regions = compound_region_to_regions(self.off_counts.geom.region) regions = compound_region_to_regions(newdataset.counts_off.geom.region) assert newdataset.counts.meta["RESPFILE"] == "test_rmf.fits" assert newdataset.counts.meta["BACKFILE"] == "test_bkg.fits" assert newdataset.counts.meta["ANCRFILE"] == "test_arf.fits" assert_allclose(self.on_counts.data, newdataset.counts.data) assert_allclose(self.off_counts.data, newdataset.counts_off.data) assert_allclose(self.edisp.edisp_map.data, newdataset.edisp.edisp_map.data) assert_time_allclose(newdataset.gti.time_start, dataset.gti.time_start) assert len(regions) == len(expected_regions) assert regions[0].center.is_equivalent_frame(expected_regions[0].center) assert_allclose(regions[1].angle, expected_regions[1].angle) def test_to_from_ogip_files_no_mask(self, tmp_path): dataset = self.dataset.copy(name="test") dataset.mask_safe = None dataset.write(tmp_path / "test.fits") newdataset = SpectrumDatasetOnOff.read(tmp_path / "test.fits") assert_allclose(newdataset.mask_safe.data, True) def test_to_from_ogip_files_zip(self, tmp_path): dataset = self.dataset.copy(name="test") dataset.write(tmp_path / "test.fits.gz") newdataset = SpectrumDatasetOnOff.read(tmp_path / "test.fits.gz") assert newdataset.counts.meta["RESPFILE"] == "test_rmf.fits.gz" assert newdataset.counts.meta["BACKFILE"] == "test_bkg.fits.gz" assert newdataset.counts.meta["ANCRFILE"] == "test_arf.fits.gz" def test_to_from_ogip_files_no_edisp(self, tmp_path): mask_safe = RegionNDMap.from_geom(self.on_counts.geom, dtype=bool) mask_safe.data += True acceptance = RegionNDMap.from_geom(self.on_counts.geom, data=1.0) exposure = self.aeff * self.livetime exposure.meta["livetime"] = self.livetime dataset = SpectrumDatasetOnOff( counts=self.on_counts, exposure=exposure, mask_safe=mask_safe, acceptance=acceptance, name="test", ) dataset.write(tmp_path / "pha_obstest.fits") newdataset = SpectrumDatasetOnOff.read(tmp_path / "pha_obstest.fits") assert_allclose(self.on_counts.data, newdataset.counts.data) assert newdataset.counts_off is None assert newdataset.edisp is None assert newdataset.gti is None def test_to_ogip_files_checksum(self, tmp_path): dataset = self.dataset.copy(name="test") dataset.write(tmp_path / "test.fits", format="ogip", checksum=True) for name in ["test.fits", "test_arf.fits", "test_rmf.fits", "test_bkg.fits"]: # TODO: 
this should not emit AstropyUserWarning with fits.open(tmp_path / name, checksum=True) as hdul: for hdu in hdul: assert "CHECKSUM" in hdu.header assert "DATASUM" in hdu.header def replace_in_fits_header(filename, string): with open(filename, "r+b") as file: chunk = file.read(10000) index = chunk.find(string.encode("ascii")) file.seek(index) file.write("bad".encode("ascii")) path = tmp_path / "test.fits" replace_in_fits_header(path, "unknown") with pytest.warns(AstropyUserWarning): SpectrumDatasetOnOff.read(path, checksum=True) def test_spectrum_dataset_onoff_fits_io(self, tmp_path): self.dataset.write(tmp_path / "test.fits", format="gadf") d1 = SpectrumDatasetOnOff.read(tmp_path / "test.fits", format="gadf") assert isinstance(d1.counts.geom, RegionGeom) assert d1.exposure == self.dataset.exposure assert_allclose(d1.counts_off.data, self.dataset.counts_off.data) def test_energy_mask(self): mask = self.dataset.counts.geom.energy_mask( energy_min=0.3 * u.TeV, energy_max=6 * u.TeV ) desired = [False, True, True, False] assert_allclose(mask.data[:, 0, 0], desired) mask = self.dataset.counts.geom.energy_mask(energy_max=6 * u.TeV) desired = [True, True, True, False] assert_allclose(mask.data[:, 0, 0], desired) mask = self.dataset.counts.geom.energy_mask(energy_min=1 * u.TeV) desired = [False, False, True, True] assert_allclose(mask.data[:, 0, 0], desired) def test_str(self): model = SkyModel(spectral_model=PowerLawSpectralModel()) dataset = SpectrumDatasetOnOff( counts=self.on_counts, counts_off=self.off_counts, models=model, exposure=self.aeff * self.livetime, edisp=self.edisp, acceptance=RegionNDMap.from_geom(geom=self.on_counts.geom, data=1), acceptance_off=RegionNDMap.from_geom(geom=self.off_counts.geom, data=10), ) assert "SpectrumDatasetOnOff" in str(dataset) assert "wstat" in str(dataset) def test_fake(self): """Test the fake dataset""" source_model = SkyModel(spectral_model=PowerLawSpectralModel()) dataset = SpectrumDatasetOnOff( name="test", counts=self.on_counts, counts_off=self.off_counts, models=source_model, exposure=self.aeff * self.livetime, edisp=self.edisp, acceptance=RegionNDMap.from_geom(geom=self.on_counts.geom, data=1), acceptance_off=RegionNDMap.from_geom(geom=self.off_counts.geom, data=10), ) real_dataset = dataset.copy() background = RegionNDMap.from_geom(dataset.counts.geom) background.data += 1 dataset.fake(npred_background=background, random_state=314) assert real_dataset.counts.data.shape == dataset.counts.data.shape assert real_dataset.counts_off.data.shape == dataset.counts_off.data.shape assert dataset.counts_off.data.sum() == 39 assert dataset.counts.data.sum() == 5 def test_info_dict(self): info_dict = self.dataset.info_dict() assert_allclose(info_dict["counts"], 3) assert_allclose(info_dict["counts_off"], 40) assert_allclose(info_dict["acceptance"], 4) assert_allclose(info_dict["acceptance_off"], 40) assert_allclose(info_dict["alpha"], 0.1) assert_allclose(info_dict["excess"], -1, rtol=1e-2) assert_allclose(info_dict["ontime"].value, 1e3) assert_allclose(info_dict["sqrt_ts"], -0.501005, rtol=1e-2) assert info_dict["name"] == "test" def test_resample_energy_axis(self): axis = MapAxis.from_edges([0.1, 1, 10] * u.TeV, name="energy", interp="log") grouped = self.dataset.resample_energy_axis(energy_axis=axis) assert grouped.counts.data.shape == (2, 1, 1) # exposure should be untouched assert_allclose(grouped.exposure.data, 1000) assert_allclose(np.squeeze(grouped.counts), [2, 1]) assert_allclose(np.squeeze(grouped.counts_off), [20, 20]) assert 
grouped.edisp.edisp_map.data.shape == (9, 2, 1, 1) assert_allclose(np.squeeze(grouped.acceptance), [2, 2]) assert_allclose(np.squeeze(grouped.acceptance_off), [20, 20]) def test_to_image(self): grouped = self.dataset.to_image() assert grouped.counts.data.shape == (1, 1, 1) # exposure should be untouched assert_allclose(grouped.exposure.data, 1000) assert_allclose(np.squeeze(grouped.counts), 3) assert_allclose(np.squeeze(grouped.counts_off), 40) assert grouped.edisp.edisp_map.data.shape == (9, 1, 1, 1) assert_allclose(np.squeeze(grouped.acceptance), 4) assert_allclose(np.squeeze(grouped.acceptance_off), 40) @requires_data() class TestSpectralFit: """Test fit in astrophysical scenario""" def setup_method(self): path = "$GAMMAPY_DATA/joint-crab/spectra/hess/" self.datasets = Datasets( [ SpectrumDatasetOnOff.read(path + "pha_obs23523.fits"), SpectrumDatasetOnOff.read(path + "pha_obs23592.fits"), ] ) self.pwl = SkyModel( spectral_model=PowerLawSpectralModel( index=2, amplitude=1e-12 * u.Unit("cm-2 s-1 TeV-1"), reference=1 * u.TeV ) ) self.ecpl = SkyModel( spectral_model=ExpCutoffPowerLawSpectralModel( index=2, amplitude=1e-12 * u.Unit("cm-2 s-1 TeV-1"), reference=1 * u.TeV, lambda_=0.1 / u.TeV, ) ) # Example fit for one observation self.datasets[0].models = self.pwl self.fit = Fit() def set_model(self, model): for obs in self.datasets: obs.models = model def test_basic_results(self): self.set_model(self.pwl) result = self.fit.run([self.datasets[0]]) pars = self.datasets.parameters assert self.pwl is self.datasets[0].models[0] assert_allclose(result.total_stat, 38.343, rtol=1e-3) assert_allclose(pars["index"].value, 2.817, rtol=1e-3) assert pars["amplitude"].unit == "cm-2 s-1 TeV-1" assert_allclose(pars["amplitude"].value, 5.142e-11, rtol=1e-3) assert_allclose(self.datasets[0].npred().data[60], 0.6102, rtol=1e-3) pars.to_table() def test_basic_errors(self): self.set_model(self.pwl) self.fit.run([self.datasets[0]]) pars = self.pwl.parameters assert_allclose(pars["index"].error, 0.149633, rtol=1e-3) assert_allclose(pars["amplitude"].error, 6.423139e-12, rtol=1e-3) pars.to_table() def test_ecpl_fit(self): self.set_model(self.ecpl) fit = Fit() fit.run([self.datasets[0]]) actual = self.datasets.parameters["lambda_"].quantity assert actual.unit == "TeV-1" assert_allclose(actual.value, 0.145215, rtol=1e-2) def test_joint_fit(self): self.set_model(self.pwl) fit = Fit() fit.run(self.datasets) actual = self.datasets.parameters["index"].value assert_allclose(actual, 2.7806, rtol=1e-3) actual = self.datasets.parameters["amplitude"].quantity assert actual.unit == "cm-2 s-1 TeV-1" assert_allclose(actual.value, 5.200e-11, rtol=1e-3) def test_stats(self): dataset = self.datasets[0].copy() dataset.models = self.pwl fit = Fit() result = fit.run(datasets=[dataset]) stats = dataset.stat_array() actual = np.sum(stats[dataset.mask_safe]) desired = result.total_stat assert_allclose(actual, desired) def test_fit_range(self): # Fit range not restricted fit range should be the thresholds obs = self.datasets[0] actual = obs.energy_range[0] assert actual.unit == "keV" assert_allclose(actual, 8.912509e08) def test_no_edisp(self): dataset = self.datasets[0].copy() dataset.edisp = None dataset.models = self.pwl fit = Fit() fit.run(datasets=[dataset]) assert_allclose(self.pwl.spectral_model.index.value, 2.7961, atol=0.02) def test_stacked_fit(self): dataset = self.datasets[0].copy() dataset.stack(self.datasets[1]) dataset.models = SkyModel(PowerLawSpectralModel()) fit = Fit() fit.run(datasets=[dataset]) pars = 
dataset.models.parameters assert_allclose(pars["index"].value, 2.7767, rtol=1e-3) assert u.Unit(pars["amplitude"].unit) == "cm-2 s-1 TeV-1" assert_allclose(pars["amplitude"].value, 5.191e-11, rtol=1e-3) def _read_hess_obs(): path = "$GAMMAPY_DATA/joint-crab/spectra/hess/" obs1 = SpectrumDatasetOnOff.read(path + "pha_obs23523.fits") obs2 = SpectrumDatasetOnOff.read(path + "pha_obs23592.fits") return [obs1, obs2] def make_gti(times, time_ref="2010-01-01"): return GTI.create(times["START"], times["STOP"], time_ref) @requires_data("gammapy-data") def make_observation_list(): """obs with dummy IRF""" nbin = 3 energy = np.logspace(-1, 1, nbin + 1) * u.TeV livetime = 2 * u.h data_on = np.arange(nbin) dataoff_1 = np.ones(3) dataoff_2 = np.ones(3) * 3 dataoff_1[1] = 0 dataoff_2[1] = 0 axis = MapAxis.from_edges(energy, name="energy", interp="log") axis_true = axis.copy(name="energy_true") geom = RegionGeom(region=None, axes=[axis]) geom_true = RegionGeom(region=None, axes=[axis_true]) on_vector = RegionNDMap.from_geom(geom=geom, data=data_on) off_vector1 = RegionNDMap.from_geom(geom=geom, data=dataoff_1) off_vector2 = RegionNDMap.from_geom(geom=geom, data=dataoff_2) mask_safe = RegionNDMap.from_geom(geom, dtype=bool) mask_safe.data += True acceptance = RegionNDMap.from_geom(geom=geom, data=1) acceptance_off_1 = RegionNDMap.from_geom(geom=geom, data=2) acceptance_off_2 = RegionNDMap.from_geom(geom=geom, data=4) aeff = RegionNDMap.from_geom(geom_true, data=1, unit="m2") edisp = EDispKernelMap.from_gauss( energy_axis=axis, energy_axis_true=axis_true, sigma=0.2, bias=0, geom=geom ) time_ref = Time("2010-01-01") gti1 = make_gti( {"START": [5, 6, 1, 2] * u.s, "STOP": [8, 7, 3, 4] * u.s}, time_ref=time_ref ) gti2 = make_gti({"START": [14] * u.s, "STOP": [15] * u.s}, time_ref=time_ref) exposure = aeff * livetime exposure.meta["livetime"] = livetime obs1 = SpectrumDatasetOnOff( counts=on_vector, counts_off=off_vector1, exposure=exposure, edisp=edisp, mask_safe=mask_safe, acceptance=acceptance.copy(), acceptance_off=acceptance_off_1, name="1", gti=gti1, ) obs2 = SpectrumDatasetOnOff( counts=on_vector, counts_off=off_vector2, exposure=exposure.copy(), edisp=edisp, mask_safe=mask_safe, acceptance=acceptance.copy(), acceptance_off=acceptance_off_2, name="2", gti=gti2, ) obs_list = [obs1, obs2] return obs_list @requires_data("gammapy-data") class TestSpectrumDatasetOnOffStack: def setup_method(self): self.datasets = _read_hess_obs() # Change threshold to make stuff more interesting geom = self.datasets[0]._geom self.datasets[0].mask_safe = geom.energy_mask( energy_min=1.2 * u.TeV, energy_max=50 * u.TeV ) mask = geom.energy_mask(energy_max=20 * u.TeV) self.datasets[1].mask_safe &= mask self.stacked_dataset = self.datasets[0].to_masked() self.stacked_dataset.stack(self.datasets[1]) def test_basic(self): obs_1, obs_2 = self.datasets counts1 = obs_1.counts.data[obs_1.mask_safe].sum() counts2 = obs_2.counts.data[obs_2.mask_safe].sum() summed_counts = counts1 + counts2 stacked_counts = self.stacked_dataset.counts.data.sum() off1 = obs_1.counts_off.data[obs_1.mask_safe].sum() off2 = obs_2.counts_off.data[obs_2.mask_safe].sum() summed_off = off1 + off2 stacked_off = self.stacked_dataset.counts_off.data.sum() assert summed_counts == stacked_counts assert summed_off == stacked_off def test_thresholds(self): energy_min, energy_max = self.stacked_dataset.energy_range assert energy_min.unit == "keV" assert_allclose(energy_min, 8.912509e08, rtol=1e-3) assert energy_max.unit == "keV" assert_allclose(energy_max, 4.466836e10, 
rtol=1e-3) def test_verify_npred(self): """Verifying npred is preserved during the stacking""" pwl = SkyModel( spectral_model=PowerLawSpectralModel( index=2, amplitude=2e-11 * u.Unit("cm-2 s-1 TeV-1"), reference=1 * u.TeV ) ) self.stacked_dataset.models = pwl npred_stacked = self.stacked_dataset.npred_signal().data npred_stacked[~self.stacked_dataset.mask_safe.data] = 0 npred_summed = np.zeros_like(npred_stacked) for dataset in self.datasets: dataset.models = pwl npred_summed[dataset.mask_safe] += dataset.npred_signal().data[ dataset.mask_safe ] assert_allclose(npred_stacked, npred_summed, rtol=1e-6) def test_stack_backscal(self): """Verify backscal stacking""" obs1, obs2 = make_observation_list() obs1.stack(obs2) assert_allclose(obs1.alpha.data[0], 1.25 / 4.0) # When the OFF stack observation counts=0, the alpha is averaged on the # total OFF counts for each run. assert_allclose(obs1.alpha.data[1], 2.5 / 8.0) def test_stack_gti(self): obs1, obs2 = make_observation_list() obs1.stack(obs2) assert_allclose(obs1.gti.met_start.value, [1.0, 5.0, 14.0]) assert_allclose(obs1.gti.met_stop.value, [4.0, 8.0, 15.0]) @requires_data("gammapy-data") def test_datasets_stack_reduce(): datasets = Datasets() obs_ids = [23523, 23526, 23559, 23592] for obs_id in obs_ids: filename = f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits" ds = SpectrumDatasetOnOff.read(filename) datasets.append(ds) stacked = datasets.stack_reduce(name="stacked") assert_allclose(stacked.exposure.meta["livetime"].to_value("s"), 6313.8116406202325) info_table = datasets.info_table() assert_allclose(info_table["counts"], [124, 126, 119, 90]) info_table_cum = datasets.info_table(cumulative=True) assert_allclose(info_table_cum["counts"], [124, 250, 369, 459]) assert stacked.name == "stacked" @requires_data("gammapy-data") def test_datasets_stack_reduce_no_off(): datasets = Datasets() obs_ids = [23523, 23526, 23559, 23592] for obs_id in obs_ids: filename = f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits" ds = SpectrumDatasetOnOff.read(filename) datasets.append(ds) datasets[-1].counts_off = None with pytest.raises(ValueError): stacked = datasets.stack_reduce(name="stacked") datasets[-1].mask_safe.data[...] = False stacked = datasets.stack_reduce(name="stacked") assert_allclose(stacked.exposure.meta["livetime"].to_value("s"), 4732.5469999) assert stacked.counts == 369 datasets[0].mask_safe.data[...] 
= False stacked = datasets.stack_reduce(name="stacked") assert_allclose(stacked.exposure.meta["livetime"].to_value("s"), 3150.81024152) assert stacked.counts == 245 @requires_data("gammapy-data") def test_stack_livetime(): dataset_ref = SpectrumDatasetOnOff.read( "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits" ) energy_axis = dataset_ref.counts.geom.axes["energy"] energy_axis_true = dataset_ref.exposure.geom.axes["energy_true"] geom = RegionGeom(region=None, axes=[energy_axis]) dataset = SpectrumDatasetOnOff.create(geom=geom, energy_axis_true=energy_axis_true) dataset.stack(dataset_ref) assert_allclose(dataset.exposure.meta["livetime"], 1581.736758 * u.s) dataset.stack(dataset_ref) assert_allclose(dataset.exposure.meta["livetime"], 2 * 1581.736758 * u.s) def test_spectrum_dataset_on_off_to_yaml(tmpdir): spectrum_datasets_on_off = make_observation_list() datasets = Datasets(spectrum_datasets_on_off) datasets.write( filename=tmpdir / "datasets.yaml", filename_models=tmpdir / "models.yaml" ) datasets_read = Datasets.read( filename=tmpdir / "datasets.yaml", filename_models=tmpdir / "models.yaml" ) assert len(datasets_read) == len(datasets) assert datasets_read[0].name == datasets[0].name assert datasets_read[1].name == datasets[1].name assert datasets_read[1].counts.data.sum() == datasets[1].counts.data.sum() class TestFit: """Test fit on counts spectra without any IRFs""" def setup_method(self): self.nbins = 30 energy = np.logspace(-1, 1, self.nbins + 1) * u.TeV self.source_model = SkyModel( spectral_model=PowerLawSpectralModel( index=2, amplitude=1e5 * u.Unit("cm-2 s-1 TeV-1"), reference=0.1 * u.TeV ) ) bkg_model = PowerLawSpectralModel( index=3, amplitude=1e4 * u.Unit("cm-2 s-1 TeV-1"), reference=0.1 * u.TeV ) self.alpha = 0.1 random_state = get_random_state(23) npred = self.source_model.spectral_model.integral(energy[:-1], energy[1:]).value source_counts = random_state.poisson(npred) axis = MapAxis.from_edges(energy, name="energy", interp="log") geom = RegionGeom(region=None, axes=[axis]) self.src = RegionNDMap.from_geom(geom=geom, data=source_counts) self.exposure = RegionNDMap.from_geom(geom.as_energy_true, data=1, unit="cm2 s") npred_bkg = bkg_model.integral(energy[:-1], energy[1:]).value bkg_counts = random_state.poisson(npred_bkg) off_counts = random_state.poisson(npred_bkg * 1.0 / self.alpha) self.bkg = RegionNDMap.from_geom(geom=geom, data=bkg_counts) self.off = RegionNDMap.from_geom(geom=geom, data=off_counts) def test_cash(self): """Simple CASH fit to the on vector""" dataset = SpectrumDataset( models=self.source_model, counts=self.src, exposure=self.exposure, ) npred = dataset.npred().data assert_allclose(npred[5], 660.5171, rtol=1e-5) stat_val = dataset.stat_sum() assert_allclose(stat_val, -107346.5291, rtol=1e-5) self.source_model.parameters["index"].value = 1.12 fit = Fit() fit.run(datasets=[dataset]) # These values are check with sherpa fits, do not change pars = self.source_model.parameters assert_allclose(pars["index"].value, 1.995525, rtol=1e-3) assert_allclose(pars["amplitude"].value, 100245.9, rtol=1e-3) def test_wstat(self): """WStat with on source and background spectrum""" on_vector = self.src.copy() on_vector.data += self.bkg.data acceptance = RegionNDMap.from_geom(self.src.geom, data=1) acceptance_off = RegionNDMap.from_geom(self.bkg.geom, data=1 / self.alpha) dataset = SpectrumDatasetOnOff( counts=on_vector, counts_off=self.off, exposure=self.exposure, acceptance=acceptance, acceptance_off=acceptance_off, ) dataset.models = self.source_model 
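# Perturb the spectral index away from the true value of 2 (set in setup_method)
# so the WStat fit below has to recover it; with acceptance_off = 1/alpha the OFF
# spectrum enters WStat as a profiled background nuisance term.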
self.source_model.parameters.index = 1.12 fit = Fit() result = fit.run(datasets=[dataset]) pars = self.source_model.parameters assert_allclose(pars["index"].value, 1.997342, rtol=1e-3) assert_allclose(pars["amplitude"].value, 100245.187067, rtol=1e-3) assert_allclose(result.total_stat, 30.022316, rtol=1e-3) def test_fit_range(self): """Test fit range without complication of thresholds""" geom = self.src.geom mask_safe = RegionNDMap.from_geom(geom, dtype=bool) mask_safe.data += True dataset = SpectrumDatasetOnOff(counts=self.src, mask_safe=mask_safe) assert np.sum(dataset.mask_safe) == self.nbins energy_min, energy_max = dataset.energy_range assert_allclose(energy_max, 10) assert_allclose(energy_min, 0.1) def test_stat_profile(self): geom = self.src.geom mask_safe = RegionNDMap.from_geom(geom, dtype=bool) mask_safe.data += True dataset = SpectrumDataset( models=self.source_model, exposure=self.exposure, counts=self.src, mask_safe=mask_safe, ) fit = Fit() fit.run(datasets=[dataset]) true_idx = self.source_model.parameters["index"].value values = np.linspace(0.95 * true_idx, 1.05 * true_idx, 100) self.source_model.spectral_model.index.scan_values = values profile = fit.stat_profile(datasets=[dataset], parameter="index") actual = values[np.argmin(profile["stat_scan"])] assert_allclose(actual, true_idx, rtol=0.01) def test_stat_sum(): axis = MapAxis.from_energy_bounds(0.1, 10, 5, unit="TeV") geom = RegionGeom.create(None, axes=[axis]) dataset = SpectrumDatasetOnOff.create(geom) dataset.counts_off = None stat = dataset.stat_sum() assert stat == 0 dataset.mask_safe.data[0] = True with pytest.raises(AttributeError): dataset.stat_sum() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/tests/test_utils.py0000644000175100001770000000626114721316200021574 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from gammapy.datasets import MapDataset from gammapy.datasets.utils import apply_edisp, split_dataset from gammapy.irf import EDispKernel from gammapy.maps import Map, MapAxis from gammapy.modeling.models import ( Models, PowerLawNormSpectralModel, SkyModel, TemplateSpatialModel, ) from gammapy.utils.testing import requires_data @pytest.fixture def region_map_true(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=6, name="energy_true") m = Map.create( region="icrs;circle(83.63, 21.51, 1)", map_type="region", axes=[axis], unit="1/TeV", ) m.data = np.arange(m.data.size, dtype=float).reshape(m.geom.data_shape) return m def test_apply_edisp(region_map_true): e_true = region_map_true.geom.axes[0] e_reco = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) edisp = EDispKernel.from_diagonal_response( energy_axis_true=e_true, energy_axis=e_reco ) m = apply_edisp(region_map_true, edisp) assert m.geom.data_shape == (3, 1, 1) e_reco = m.geom.axes[0].edges assert e_reco.unit == "TeV" assert m.geom.axes[0].name == "energy" assert_allclose(e_reco[[0, -1]].value, [1, 10]) @requires_data() def test_dataset_split(): template_diffuse = TemplateSpatialModel.read( filename="$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz", normalize=False, ) diffuse_iem = SkyModel( spectral_model=PowerLawNormSpectralModel(), spatial_model=template_diffuse, name="diffuse-iem", ) dataset = MapDataset.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc.fits.gz") width = 4 * u.deg margin = 1 * u.deg datasets = 
split_dataset(dataset, width, margin) assert len(datasets) == 15 assert len(datasets.models) == 0 datasets = split_dataset(dataset, width, margin, split_template_models=False) assert len(datasets.models) == 0 dataset.models = Models() datasets = split_dataset(dataset, width, margin) assert len(datasets.models) == 0 dataset.models = Models([diffuse_iem]) datasets = split_dataset(dataset, width, margin, split_template_models=False) assert len(datasets) == 15 assert len(datasets.models) == 1 datasets = split_dataset( dataset, width=width, margin=margin, split_template_models=True ) assert len(datasets.models) == len(datasets) assert len(datasets.parameters.free_parameters) == 1 assert ( datasets[7].models[0].spatial_model.map.geom.width[0][0] == width + 2 * margin ) assert ( datasets[7].models[0].spatial_model.map.geom.width[1][0] == width + 2 * margin ) geom = dataset.counts.geom pixel_width = np.ceil((width / geom.pixel_scales).to_value("")).astype(int) margin_width = np.ceil((margin / geom.pixel_scales).to_value("")).astype(int) assert datasets[7].mask_fit.data[0, :, :].sum() == np.prod(pixel_width) assert (~datasets[7].mask_fit.data[0, :, :]).sum() == np.prod( pixel_width + 2 * margin_width ) - np.prod(pixel_width) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/datasets/utils.py0000644000175100001770000001614614721316200017376 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy.coordinates import SkyCoord from gammapy.maps import Map from gammapy.modeling.models.utils import cutout_template_models from . import Datasets __all__ = ["apply_edisp", "split_dataset"] def apply_edisp(input_map, edisp): """Apply energy dispersion to map. Requires "energy_true" axis. Parameters ---------- input_map : `~gammapy.maps.Map` The map to be convolved with the energy dispersion. It must have an axis named "energy_true". edisp : `~gammapy.irf.EDispKernel` Energy dispersion matrix. Returns ------- map : `~gammapy.maps.Map` Map with energy dispersion applied. Examples -------- >>> from gammapy.irf.edisp import EDispKernel >>> from gammapy.datasets.utils import apply_edisp >>> from gammapy.maps import MapAxis, Map >>> import numpy as np >>> >>> axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=6, name="energy_true") >>> m = Map.create( ... skydir=(0.8, 0.8), ... width=(1, 1), ... binsz=0.02, ... axes=[axis], ... frame="galactic" ... 
) >>> e_true = m.geom.axes[0] >>> e_reco = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) >>> edisp = EDispKernel.from_diagonal_response(energy_axis_true=e_true, energy_axis=e_reco) >>> map_edisp = apply_edisp(m, edisp) >>> print(map_edisp) WcsNDMap geom : WcsGeom axes : ['lon', 'lat', 'energy'] shape : (np.int64(50), np.int64(50), 3) ndim : 3 unit : dtype : float64 """ # TODO: either use sparse matrix multiplication or something like edisp.is_diagonal if edisp is not None: loc = input_map.geom.axes.index("energy_true") data = np.rollaxis(input_map.data, loc, len(input_map.data.shape)) data = np.dot(data, edisp.pdf_matrix) data = np.rollaxis(data, -1, loc) energy_axis = edisp.axes["energy"].copy(name="energy") else: data = input_map.data energy_axis = input_map.geom.axes["energy_true"].copy(name="energy") geom = input_map.geom.to_image().to_cube(axes=[energy_axis]) return Map.from_geom(geom=geom, data=data, unit=input_map.unit) def get_figure(fig, width, height): import matplotlib.pyplot as plt if plt.get_fignums(): if not fig: fig = plt.gcf() fig.clf() else: fig = plt.figure(figsize=(width, height)) return fig def get_axes(ax1, ax2, width, height, args1, args2, kwargs1=None, kwargs2=None): if not ax1 and not ax2: kwargs1 = kwargs1 or {} kwargs2 = kwargs2 or {} fig = get_figure(None, width, height) ax1 = fig.add_subplot(*args1, **kwargs1) ax2 = fig.add_subplot(*args2, **kwargs2) elif not ax1 or not ax2: raise ValueError("Either both or no Axes must be provided") return ax1, ax2 def get_nearest_valid_exposure_position(exposure, position=None): mask_exposure = exposure > 0.0 * exposure.unit mask_exposure = mask_exposure.reduce_over_axes(func=np.logical_or) if not position: position = mask_exposure.geom.center_skydir return mask_exposure.mask_nearest_position(position) def split_dataset(dataset, width, margin, split_template_models=True): """Split dataset in multiple non-overlapping analysis regions. Parameters ---------- dataset : `~gammapy.datasets.Dataset` Dataset to split. width : `~astropy.coordinates.Angle` Angular size of each sub-region. margin : `~astropy.coordinates.Angle` Angular size to be added to the `width`. The margin should be defined such as sources outside the region of interest that contributes inside are well-defined. The mask_fit in the margin region is False and unchanged elsewhere. split_template_models : bool, optional Apply cutout to template models or not. Default is True. Returns ------- datasets : `~gammapy.datasets.Datasets` Split datasets. Examples -------- >>> from gammapy.datasets import MapDataset >>> from gammapy.datasets.utils import split_dataset >>> from gammapy.modeling.models import GaussianSpatialModel, PowerLawSpectralModel, SkyModel >>> import astropy.units as u >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> # Split the dataset >>> width = 4 * u.deg >>> margin = 1 * u.deg >>> split_datasets = split_dataset(dataset, width, margin, split_template_models=False) >>> # Apply a model and split the dataset >>> spatial_model = GaussianSpatialModel() >>> spectral_model = PowerLawSpectralModel() >>> sky_model = SkyModel( ... spatial_model=spatial_model, spectral_model=spectral_model, name="test-model" ... ) >>> dataset.models = [sky_model] >>> split_datasets = split_dataset( ... dataset, width=width, margin=margin, split_template_models=True ... 
) """ if margin >= width / 2.0: raise ValueError("margin should be lower than width/2.") geom = dataset.counts.geom.to_image() pixel_width = np.ceil((width / geom.pixel_scales).to_value("")).astype(int) ilon = range(0, geom.data_shape[1], pixel_width[1]) ilat = range(0, geom.data_shape[0], pixel_width[0]) datasets = Datasets() for il in ilon: for ib in ilat: l, b = geom.pix_to_coord((il, ib)) cutout_kwargs = dict( position=SkyCoord(l, b, frame=geom.frame), width=width + 2 * margin ) d = dataset.cutout(**cutout_kwargs) # apply mask geom_cut = d.counts.geom geom_cut_image = geom_cut.to_image() ilgrid, ibgrid = np.meshgrid( range(geom_cut_image.data_shape[1]), range(geom_cut_image.data_shape[0]) ) il_cut, ib_cut = geom_cut_image.coord_to_pix((l, b)) mask = (ilgrid >= il_cut - pixel_width[1] / 2.0) & ( ibgrid >= ib_cut - pixel_width[0] / 2.0 ) if il == ilon[-1]: mask &= ilgrid <= il_cut + pixel_width[1] / 2.0 else: mask &= ilgrid < il_cut + pixel_width[1] / 2.0 if ib == ilat[-1]: mask &= ibgrid <= ib_cut + pixel_width[0] / 2.0 else: mask &= ibgrid < ib_cut + pixel_width[0] / 2.0 mask = np.expand_dims(mask, 0) mask = np.repeat(mask, geom_cut.data_shape[0], axis=0) d.mask_fit = Map.from_geom(geom_cut, data=mask) if dataset.mask_fit is not None: d.mask_fit &= dataset.mask_fit.interp_to_geom( geom_cut, method="nearest" ) # template models cutout (should limit memory usage in parallel) if split_template_models: d.models = cutout_template_models( dataset.models, cutout_kwargs, [d.name] ) else: d.models = dataset.models datasets.append(d) return datasets ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2006419 gammapy-1.3/gammapy/estimators/0000755000175100001770000000000014721316215016244 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/__init__.py0000644000175100001770000000245314721316200020353 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Estimators.""" from gammapy.utils.registry import Registry from .core import Estimator from .energydependentmorphology import EnergyDependentMorphologyEstimator from .map import ASmoothMapEstimator, ExcessMapEstimator, FluxMaps, TSMapEstimator from .metadata import FluxMetaData from .parameter import ParameterEstimator from .points import ( FluxPoints, FluxPointsEstimator, FluxProfileEstimator, LightCurveEstimator, SensitivityEstimator, ) from .profile import ImageProfile, ImageProfileEstimator __all__ = [ "ASmoothMapEstimator", "Estimator", "ESTIMATOR_REGISTRY", "ExcessMapEstimator", "FluxMaps", "FluxPoints", "FluxPointsEstimator", "FluxProfileEstimator", "ImageProfile", "ImageProfileEstimator", "LightCurveEstimator", "ParameterEstimator", "SensitivityEstimator", "TSMapEstimator", "EnergyDependentMorphologyEstimator", "FluxMetaData", ] ESTIMATOR_REGISTRY = Registry( [ ExcessMapEstimator, TSMapEstimator, FluxPointsEstimator, ASmoothMapEstimator, LightCurveEstimator, SensitivityEstimator, FluxProfileEstimator, ParameterEstimator, ] ) """Registry of estimator classes in Gammapy.""" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/core.py0000644000175100001770000000607014721316200017543 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import html import inspect from copy import deepcopy import numpy as np from astropy import units as u from gammapy.maps import MapAxis 
from gammapy.modeling.models import ModelBase __all__ = ["Estimator"] class Estimator(abc.ABC): """Abstract estimator base class.""" _available_selection_optional = {} @property @abc.abstractmethod def tag(self): pass @abc.abstractmethod def run(self, datasets): pass @property def selection_optional(self): """""" return self._selection_optional @selection_optional.setter def selection_optional(self, selection): """Set optional selection.""" available = self._available_selection_optional if selection is None: self._selection_optional = [] elif "all" in selection: self._selection_optional = available else: if set(selection).issubset(set(available)): self._selection_optional = selection else: difference = set(selection).difference(set(available)) raise ValueError(f"{difference} is not a valid method.") def _get_energy_axis(self, dataset): """Energy axis.""" if self.energy_edges is None: energy_axis = dataset.counts.geom.axes["energy"].squash() if getattr(self, "sum_over_energy_groups", False): energy_edges = [energy_axis.edges[0], energy_axis.edges[1]] energy_axis = MapAxis.from_energy_edges(energy_edges) else: energy_axis = MapAxis.from_energy_edges(self.energy_edges) return energy_axis def copy(self): """Copy estimator.""" return deepcopy(self) @property def config_parameters(self): """Configuration parameters.""" pars = self.__dict__.copy() pars = {key.strip("_"): value for key, value in pars.items()} return pars def __str__(self): s = f"{self.__class__.__name__}\n" s += "-" * (len(s) - 1) + "\n\n" pars = self.config_parameters max_len = np.max([len(_) for _ in pars]) + 1 for name, value in sorted(pars.items()): if isinstance(value, ModelBase): s += f"\t{name:{max_len}s}: {value.tag[0]}\n" elif inspect.isclass(value): s += f"\t{name:{max_len}s}: {value.__name__}\n" elif isinstance(value, u.Quantity): s += f"\t{name:{max_len}s}: {value}\n" elif isinstance(value, Estimator): pass elif isinstance(value, np.ndarray): value = np.array_str(value, precision=2, suppress_small=True) s += f"\t{name:{max_len}s}: {value}\n" else: s += f"\t{name:{max_len}s}: {value}\n" return s.expandtabs(tabsize=2) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"<pre>{html.escape(str(self))}</pre>
" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/energydependentmorphology.py0000644000175100001770000002601014721316200024107 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Implementation of energy-dependent morphology estimator tool.""" import numpy as np from gammapy.datasets import Datasets from gammapy.modeling import Fit from gammapy.modeling.models import FoVBackgroundModel, Models from gammapy.modeling.selection import TestStatisticNested from gammapy.stats.utils import ts_to_sigma from .core import Estimator __all__ = ["weighted_chi2_parameter", "EnergyDependentMorphologyEstimator"] def weighted_chi2_parameter(results_edep, parameters=["sigma"]): r"""Calculate the weighted chi-squared value for the parameters of interest. The chi-squared parameter is defined as follows: .. math:: \chi^2 = \sum_i \frac{(x_i - \bar{\mu})^2}{\sigma_i ^ 2} where the :math:`x_i` and :math:`\sigma_i` are the value and error of the parameter of interest, and the weighted mean is .. math:: \bar{\mu} = \sum_i \frac{(x_i/ \sigma_i ^ 2)}{(1/\sigma_i ^ 2)} Parameters ---------- result_edep : `dict` Dictionary of results for the energy-dependent estimator. parameters : list of str, optional The model parameters to calculate the chi-squared value for. Default is ["sigma"]. Returns ------- chi2_result : `dict` Dictionary with the chi-squared values for the parameters of interest. Notes ----- This chi-square should be utilised with caution as it does not take into account any correlation between the parameters. To properly utilise the chi-squared parameter one must ensure each of the parameters are independent, which cannot be guaranteed in this use case. """ chi2_value = [] df = [] for parameter in parameters: values = results_edep[parameter][1:] errors = results_edep[f"{parameter}_err"][1:] weights = 1 / errors**2 avg = np.average(values, weights=weights) chi2_value += [np.sum((values - avg) ** 2 / errors**2).to_value()] df += [len(values) - 1] significance = [ts_to_sigma(chi2_value[i], df[i]) for i in range(len(chi2_value))] chi2_result = {} chi2_result["parameter"] = parameters chi2_result["chi2"] = chi2_value chi2_result["df"] = df chi2_result["significance"] = significance return chi2_result class EnergyDependentMorphologyEstimator(Estimator): """Test if there is any energy-dependent morphology in a map dataset for a given set of energy bins. Parameters ---------- energy_edges : list of `~astropy.units.Quantity` Energy edges for the energy-dependence test. source : str or int For which source in the model to compute the estimator. fit : `~gammapy.modeling.Fit`, optional Fit instance specifying the backend and fit options. If None, the fit backend default is minuit. Default is None. Examples -------- For a usage example see :doc:`/tutorials/analysis-3d/energy_dependent_estimation` tutorial. """ tag = "EnergyDependentMorphologyEstimator" def __init__(self, energy_edges, source=0, fit=None): self.energy_edges = energy_edges self.source = source self.num_energy_bands = len(self.energy_edges) - 1 if fit is None: fit = Fit() self.fit = fit def _slice_datasets(self, datasets): """Calculate a dataset for each energy slice. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Input datasets to use. Returns ------- slices_src : `~gammapy.datasets.Datasets` Sliced datasets. 
""" model = datasets.models[self.source] filtered_names = [name for name in datasets.models.names if name != self.source] other_models = Models() for name in filtered_names: other_models.append(datasets.models[name]) slices_src = Datasets() for i, (emin, emax) in enumerate( zip(self.energy_edges[:-1], self.energy_edges[1:]) ): for dataset in datasets: sliced_src = dataset.slice_by_energy( emin, emax, name=f"{self.source}_{i}" ) bkg_sliced_model = FoVBackgroundModel(dataset_name=sliced_src.name) sliced_src.models = [ model.copy(name=f"{sliced_src.name}-model"), *other_models, bkg_sliced_model, ] slices_src.append(sliced_src) return slices_src def _estimate_source_significance(self, datasets): """Estimate the significance of the source above the background. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Input datasets to use. Returns ------- result_bkg_src : `dict` Dictionary with the results of the null hypothesis with no source, and alternative hypothesis with the source added in. Entries are: * "Emin" : the minimum energy of the energy band * "Emax" : the maximum energy of the energy band * "delta_ts" : difference in ts * "df" : the degrees of freedom between null and alternative hypothesis * "significance" : significance of the result """ slices_src = self._slice_datasets(datasets) # Norm is free and fit test_results = [] for sliced in slices_src: parameters = [ param for param in sliced.models[ f"{sliced.name}-model" ].parameters.free_parameters ] null_values = [0] + [ param.value for param in sliced.models[ f"{sliced.name}-model" ].spatial_model.parameters.free_parameters ] test = TestStatisticNested( parameters=parameters, null_values=null_values, n_sigma=-np.inf, fit=self.fit, ) test_results.append(test.run(sliced)) delta_ts_bkg_src = [_["ts"] for _ in test_results] df_src = [ len(_["fit_results"].parameters.free_parameters.names) for _ in test_results ] df_bkg = 1 df_bkg_src = df_src[0] - df_bkg sigma_ts_bkg_src = ts_to_sigma(delta_ts_bkg_src, df=df_bkg_src) # Prepare results dictionary for signal above background result_bkg_src = {} result_bkg_src["Emin"] = self.energy_edges[:-1] result_bkg_src["Emax"] = self.energy_edges[1:] result_bkg_src["delta_ts"] = delta_ts_bkg_src result_bkg_src["df"] = [df_bkg_src] * self.num_energy_bands result_bkg_src["significance"] = [elem for elem in sigma_ts_bkg_src] return result_bkg_src def estimate_energy_dependence(self, datasets): """Estimate the potential of energy-dependent morphology. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Input datasets to use. Returns ------- results : `dict` Dictionary with results of the energy-dependence test. 
Entries are: * "delta_ts" : difference in ts between fitting each energy band individually (sliced fit) and the joint fit * "df" : the degrees of freedom between fitting each energy band individually (sliced fit) and the joint fit * "result" : the results for the fitting each energy band individually (sliced fit) and the joint fit """ model = datasets.models[self.source] # Calculate the individually sliced components slices_src = self._slice_datasets(datasets) results_src = [] for sliced in slices_src: results_src.append(self.fit.run(sliced)) results_src_total_stat = [result.total_stat for result in results_src] free_x, free_y = np.shape( [result.parameters.free_parameters.names for result in results_src] ) df_src = free_x * free_y # Calculate the joint fit parameters = model.spatial_model.parameters.free_parameters.names slice0 = slices_src[0] for i, slice_j in enumerate(slices_src[1:]): for param in parameters: setattr( slice_j.models[f"{self.source}_{i+1}-model"].spatial_model, param, slice0.models[f"{self.source}_0-model"].spatial_model.parameters[ param ], ) result_joint = self.fit.run(slices_src) # Compare fit of individual energy slices to the results with joint fit delta_ts_joint = result_joint.total_stat - np.sum(results_src_total_stat) df_joint = len(slices_src.parameters.free_parameters.names) df = df_src - df_joint # Prepare results dictionary joint_values = [result_joint.parameters[param].value for param in parameters] joint_errors = [result_joint.parameters[param].error for param in parameters] parameter_values = np.empty((len(parameters), self.num_energy_bands)) parameter_errors = np.empty((len(parameters), self.num_energy_bands)) for i in range(self.num_energy_bands): parameter_values[:, i] = [ results_src[i].parameters[param].value for param in parameters ] parameter_errors[:, i] = [ results_src[i].parameters[param].error for param in parameters ] result = {} result["Hypothesis"] = ["H0"] + ["H1"] * self.num_energy_bands result["Emin"] = np.append(self.energy_edges[0], self.energy_edges[:-1]) result["Emax"] = np.append(self.energy_edges[-1], self.energy_edges[1:]) units = [result_joint.parameters[param].unit for param in parameters] # Results for H0 in the first row and then H1 -- i.e. individual bands in other rows for i in range(len(parameters)): result[f"{parameters[i]}"] = np.append( joint_values[i] * units[i], parameter_values[i] * units[i] ) result[f"{parameters[i]}_err"] = np.append( joint_errors[i] * units[i], parameter_errors[i] * units[i] ) return dict(delta_ts=delta_ts_joint, df=df, result=result) def run(self, datasets): """Run the energy-dependence estimator. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Input datasets to use. Returns ------- results : `dict` Dictionary with the various energy-dependence estimation values. 
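Entries are: * "energy_dependence" : results of the energy-dependence test, as returned by `estimate_energy_dependence` * "src_above_bkg" : significance of the source above the background in each energy band Examples -------- A minimal usage sketch. The ``datasets`` object and the source name "src" are illustrative assumptions, not taken from a real analysis; only the final line actually requires data: >>> import astropy.units as u >>> from gammapy.estimators import EnergyDependentMorphologyEstimator >>> energy_edges = [1, 3, 10] * u.TeV >>> estimator = EnergyDependentMorphologyEstimator(energy_edges, source="src") >>> results = estimator.run(datasets) # doctest: +SKIP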
""" if not isinstance(datasets, Datasets) or datasets.is_all_same_type is False: raise ValueError("Unsupported datasets type.") results = {} results["energy_dependence"] = self.estimate_energy_dependence(datasets) results["src_above_bkg"] = self._estimate_source_significance(datasets) return results ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/flux.py0000644000175100001770000001314014721316200017565 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np from gammapy.datasets import Datasets from gammapy.datasets.actors import DatasetsActor from gammapy.estimators.parameter import ParameterEstimator from gammapy.estimators.utils import _get_default_norm from gammapy.maps import Map, MapAxis from gammapy.modeling.models import ScaleSpectralModel log = logging.getLogger(__name__) class FluxEstimator(ParameterEstimator): """Flux estimator. Estimates flux for a given list of datasets with their model in a given energy range. To estimate the model flux the amplitude of the reference spectral model is fitted within the energy range. The amplitude is re-normalized using the "norm" parameter, which specifies the deviation of the flux from the reference model in this energy range. Note that there should be only one free norm or amplitude parameter for the estimator to run. Parameters ---------- source : str or int For which source in the model to compute the flux. n_sigma : int Sigma to use for asymmetric error computation. n_sigma_ul : int Sigma to use for upper limit computation. selection_optional : list of str, optional Which additional quantities to estimate. Available options are: * "all": all the optional steps are executed. * "errn-errp": estimate asymmetric errors. * "ul": estimate upper limits. * "scan": estimate fit statistic profiles. Default is None so the optional steps are not executed. fit : `Fit` Fit instance specifying the backend and fit options. reoptimize : bool If True the free parameters of the other models are fitted in each bin independently, together with the norm of the source of interest (but the other parameters of the source of interest are kept frozen). If False only the norm of the source of interest if fitted, and all other parameters are frozen at their current values. Default is False. norm : `~gammapy.modeling.Parameter` or dict Norm parameter used for the fit. Default is None and a new parameter is created automatically, with value=1, name="norm", scan_min=0.2, scan_max=5, and scan_n_values = 11. By default, the min and max are not set and derived from the source model, unless the source model does not have one and only one norm parameter. If a dict is given the entries should be a subset of `~gammapy.modeling.Parameter` arguments. """ tag = "FluxEstimator" def __init__( self, source=0, n_sigma=1, n_sigma_ul=2, selection_optional=None, fit=None, reoptimize=False, norm=None, ): self.source = source self.norm = _get_default_norm(norm, interp="log") super().__init__( null_value=0, n_sigma=n_sigma, n_sigma_ul=n_sigma_ul, selection_optional=selection_optional, fit=fit, reoptimize=reoptimize, ) def get_scale_model(self, models): """Set scale model. Parameters ---------- models : `Models` Models. Returns ------- model : `ScaleSpectralModel` Scale spectral model. 
""" ref_model = models[self.source].spectral_model scale_model = ScaleSpectralModel(ref_model) scale_model.norm = self.norm.copy() return scale_model def estimate_npred_excess(self, datasets): """Estimate npred excess for the source. Parameters ---------- datasets : Datasets Datasets. Returns ------- result : dict Dictionary with an array with one entry per dataset with the sum of the masked npred excess. """ npred_excess = [] for dataset in datasets: name = datasets.models[self.source].name npred_signal = dataset.npred_signal(model_names=[name]) npred = Map.from_geom(dataset.counts.geom) npred.stack(npred_signal) npred_excess.append(npred.data[dataset.mask].sum()) return {"npred_excess": np.array(npred_excess), "datasets": datasets.names} def run(self, datasets): """Estimate flux for a given energy range. Parameters ---------- datasets : list of `~gammapy.datasets.SpectrumDataset` Spectrum datasets. Returns ------- result : dict Dictionary with results for the flux point. """ if not isinstance(datasets, DatasetsActor): datasets = Datasets(datasets) models = datasets.models.copy() model = self.get_scale_model(models) energy_min, energy_max = datasets.energy_ranges energy_axis = MapAxis.from_energy_edges([energy_min.min(), energy_max.max()]) with np.errstate(invalid="ignore", divide="ignore"): result = model.reference_fluxes(energy_axis=energy_axis) # convert to scalar values result = {key: value.item() for key, value in result.items()} # freeze all source model parameters models[self.source].parameters.freeze_all() models[self.source].spectral_model = model datasets.models = models result.update(super().run(datasets, model.norm)) datasets.models[self.source].spectral_model.norm.value = result["norm"] result.update(self.estimate_npred_excess(datasets=datasets)) return result ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2006419 gammapy-1.3/gammapy/estimators/map/0000755000175100001770000000000014721316215017021 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/__init__.py0000644000175100001770000000046414721316200021130 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .asmooth import ASmoothMapEstimator from .core import FluxMaps from .excess import ExcessMapEstimator from .ts import TSMapEstimator __all__ = [ "ASmoothMapEstimator", "ExcessMapEstimator", "FluxMaps", "TSMapEstimator", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/asmooth.py0000644000175100001770000002414314721316200021043 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Implementation of adaptive smoothing algorithms.""" import numpy as np from astropy.convolution import Gaussian2DKernel, Tophat2DKernel from astropy.coordinates import Angle from gammapy.datasets import MapDatasetOnOff from gammapy.maps import Map, Maps, WcsNDMap from gammapy.modeling.models import PowerLawSpectralModel from gammapy.stats import CashCountsStatistic from gammapy.utils.array import scale_cube from gammapy.utils.pbar import progress_bar from ..core import Estimator from ..utils import estimate_exposure_reco_energy from gammapy.utils.deprecation import deprecated_renamed_argument __all__ = ["ASmoothMapEstimator"] def _sqrt_ts_asmooth(counts, background): """Significance according to formula (5) in asmooth paper.""" return (counts - background) 
/ np.sqrt(counts + background) class ASmoothMapEstimator(Estimator): """Adaptively smooth counts image. Achieves a roughly constant sqrt(TS) of features across the whole image. Algorithm based on https://ui.adsabs.harvard.edu/abs/2006MNRAS.368...65E . The algorithm was slightly adapted to also allow the Li & Ma significance to be used to estimate the sqrt(TS) of a feature in the image. Parameters ---------- scales : `~astropy.units.Quantity` Smoothing scales. kernel : `astropy.convolution.Kernel` Smoothing kernel. spectral_model : `~gammapy.modeling.models.SpectralModel`, optional Spectral model assumption. Default is power-law with spectral index of 2. method : {'lima', 'asmooth'} Significance estimation method. Default is 'lima'. threshold : float Significance threshold. Default is 5. energy_edges : list of `~astropy.units.Quantity`, optional Edges of the target maps energy bins. The resulting bin edges won't be exactly equal to the input ones, but rather the closest values to the energy axis edges of the parent dataset. Default is None: apply the estimator in each energy bin of the parent dataset. For further explanation see :ref:`estimators`. Examples -------- >>> import astropy.units as u >>> import numpy as np >>> from gammapy.estimators import ASmoothMapEstimator >>> from gammapy.datasets import MapDataset >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> scales = u.Quantity(np.arange(0.1, 1, 0.1), unit="deg") >>> smooth = ASmoothMapEstimator(threshold=3, scales=scales, energy_edges=[1, 10] * u.TeV) >>> images = smooth.run(dataset) """ tag = "ASmoothMapEstimator" @deprecated_renamed_argument("spectrum", "spectral_model", "v1.3") def __init__( self, scales=None, kernel=Gaussian2DKernel, spectral_model=None, method="lima", threshold=5, energy_edges=None, ): if spectral_model is None: spectral_model = PowerLawSpectralModel(index=2) self.spectral_model = spectral_model if scales is None: scales = self.get_scales(n_scales=9, kernel=kernel) self.scales = scales self.kernel = kernel self.threshold = threshold self.method = method self.energy_edges = energy_edges def selection_all(self): """Which quantities are computed.""" return @staticmethod def get_scales(n_scales, factor=np.sqrt(2), kernel=Gaussian2DKernel): """Create a list of kernel widths. Parameters ---------- n_scales : int Number of scales. factor : float Incremental factor. kernel : `~astropy.convolution.Kernel`, optional Kernel class to compute the widths for. Default is `~astropy.convolution.Gaussian2DKernel`. Returns ------- scales : `~numpy.ndarray` Scale array. """ if kernel == Gaussian2DKernel: sigma_0 = 1.0 / np.sqrt(9 * np.pi) elif kernel == Tophat2DKernel: sigma_0 = 1.0 / np.sqrt(np.pi) return sigma_0 * factor ** np.arange(n_scales) def get_kernels(self, pixel_scale): """Get kernels according to the specified method. Parameters ---------- pixel_scale : `~astropy.coordinates.Angle` Sky image pixel scale. Returns ------- kernels : list List of `~astropy.convolution.Kernel`. """ scales = self.scales.to_value("deg") / Angle(pixel_scale).deg kernels = [] for scale in scales: kernel = self.kernel(scale, mode="oversample") # TODO: check if normalizing here makes sense kernel.normalize("peak") kernels.append(kernel) return kernels @staticmethod def _sqrt_ts_cube(cubes, method): if method in {"lima"}: scube = CashCountsStatistic(cubes["counts"], cubes["background"]).sqrt_ts elif method == "asmooth": scube = _sqrt_ts_asmooth(cubes["counts"], cubes["background"]) elif method == "ts": raise NotImplementedError() else: raise ValueError( "Not a valid sqrt_ts estimation method."
" Choose one of the following: 'lima' or 'asmooth'" ) return scube def run(self, dataset): """Run adaptive smoothing on input MapDataset. Notes ----- The progress bar can be displayed for this function. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.MapDatasetOnOff` The input dataset (with one bin in energy at most). Returns ------- images : dict of `~gammapy.maps.WcsNDMap` Smoothed images; keys are: * 'counts' * 'background' * 'flux' (optional) * 'scales' * 'sqrt_ts'. """ energy_axis = self._get_energy_axis(dataset) results = [] for energy_min, energy_max in progress_bar( energy_axis.iter_by_edges, desc="Energy bins" ): dataset_sliced = dataset.slice_by_energy( energy_min=energy_min, energy_max=energy_max, name=dataset.name ) if dataset.models is not None: models_sliced = dataset.models._slice_by_energy( energy_min=energy_min, energy_max=energy_max, ) dataset_sliced.models = models_sliced result = self.estimate_maps(dataset_sliced) results.append(result) maps = Maps() for name in results[0].keys(): maps[name] = Map.from_stack( maps=[_[name] for _ in results], axis_name="energy" ) return maps def estimate_maps(self, dataset): """Run adaptive smoothing on input Maps. Parameters ---------- dataset : `MapDataset` Dataset. Returns ------- images : dict of `~gammapy.maps.WcsNDMap` Smoothed images; keys are: * 'counts' * 'background' * 'flux' (optional) * 'scales' * 'sqrt_ts'. """ dataset_image = dataset.to_image(name=dataset.name) dataset_image.models = dataset.models # extract 2d arrays counts = dataset_image.counts.data[0].astype(float) background = dataset_image.npred_background().data[0] if isinstance(dataset_image, MapDatasetOnOff): background = dataset_image.background.data[0] if dataset_image.exposure is not None: exposure = estimate_exposure_reco_energy(dataset_image, self.spectral_model) else: exposure = None pixel_scale = dataset_image.counts.geom.pixel_scales.mean() kernels = self.get_kernels(pixel_scale) cubes = {} cubes["counts"] = scale_cube(counts, kernels) cubes["background"] = scale_cube(background, kernels) if exposure is not None: flux = (dataset_image.counts - background) / exposure cubes["flux"] = scale_cube(flux.data[0], kernels) cubes["sqrt_ts"] = self._sqrt_ts_cube(cubes, method=self.method) smoothed = self._reduce_cubes(cubes, kernels) result = {} geom = dataset_image.counts.geom for name, data in smoothed.items(): # set remaining pixels with sqrt_ts < threshold to mean value if name in ["counts", "background"]: mask = np.isnan(data) data[mask] = np.mean(locals()[name][mask]) result[name] = WcsNDMap(geom, data, unit="") else: unit = "deg" if name == "scale" else "" result[name] = WcsNDMap(geom, data, unit=unit) if exposure is not None: data = smoothed["flux"] mask = np.isnan(data) data[mask] = np.mean(flux.data[0][mask]) result["flux"] = WcsNDMap(geom, data, unit=flux.unit) return result def _reduce_cubes(self, cubes, kernels): """ Combine scale cube to image. Parameters ---------- cubes : dict Data cubes. 
""" shape = cubes["counts"].shape[:2] smoothed = {} # Init smoothed data arrays for key in ["counts", "background", "scale", "sqrt_ts"]: smoothed[key] = np.tile(np.nan, shape) if "flux" in cubes: smoothed["flux"] = np.tile(np.nan, shape) for idx, scale in enumerate(self.scales): # slice out 2D image at index idx out of cube slice_ = np.s_[:, :, idx] mask = np.isnan(smoothed["counts"]) mask = (cubes["sqrt_ts"][slice_] > self.threshold) & mask smoothed["scale"][mask] = scale smoothed["sqrt_ts"][mask] = cubes["sqrt_ts"][slice_][mask] # renormalize smoothed data arrays norm = kernels[idx].array.sum() for key in ["counts", "background"]: smoothed[key][mask] = cubes[key][slice_][mask] / norm if "flux" in cubes: smoothed["flux"][mask] = cubes["flux"][slice_][mask] / norm return smoothed ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/core.py0000644000175100001770000011761414721316200020327 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from copy import deepcopy import numpy as np from astropy import units as u from astropy.io import fits from astropy.utils import classproperty from gammapy.data import GTI from gammapy.maps import Map, Maps, TimeMapAxis from gammapy.modeling.models import ( Models, PowerLawSpectralModel, SkyModel, SpectralModel, ) from gammapy.utils.scripts import make_path __all__ = ["FluxMaps"] log = logging.getLogger(__name__) DEFAULT_UNIT = { "dnde": u.Unit("cm-2 s-1 TeV-1"), "e2dnde": u.Unit("erg cm-2 s-1"), "flux": u.Unit("cm-2 s-1"), "eflux": u.Unit("erg cm-2 s-1"), "norm": u.Unit(""), } REQUIRED_MAPS = { "dnde": ["dnde"], "e2dnde": ["e2dnde"], "flux": ["flux"], "eflux": ["eflux"], "likelihood": ["norm"], } REQUIRED_COLUMNS = { "dnde": ["e_ref", "dnde"], "e2dnde": ["e_ref", "e2dnde"], "flux": ["e_min", "e_max", "flux"], "eflux": ["e_min", "e_max", "eflux"], # TODO: extend required columns "likelihood": [ "e_min", "e_max", "e_ref", "ref_dnde", "ref_flux", "ref_eflux", "norm", ], } REQUIRED_QUANTITIES_SCAN = ["stat_scan", "stat"] OPTIONAL_QUANTITIES = { "dnde": ["dnde_err", "dnde_errp", "dnde_errn", "dnde_ul"], "e2dnde": ["e2dnde_err", "e2dnde_errp", "e2dnde_errn", "e2dnde_ul"], "flux": ["flux_err", "flux_errp", "flux_errn", "flux_ul", "flux_sensitivity"], "eflux": ["eflux_err", "eflux_errp", "eflux_errn", "eflux_ul"], "likelihood": ["norm_err", "norm_errn", "norm_errp", "norm_ul"], } VALID_QUANTITIES = [ "norm", "norm_err", "norm_errn", "norm_errp", "norm_ul", "norm_sensitivity", "ts", "sqrt_ts", "npred", "npred_excess", "stat", "stat_null", "stat_scan", "dnde_scan_values", "niter", "is_ul", "counts", "success", "n_dof", "alpha", "acceptance_on", "acceptance_off", ] OPTIONAL_QUANTITIES_COMMON = [ "ts", "sqrt_ts", "npred", "npred_excess", "stat", "stat_null", "stat_scan", "dnde_scan_values", "niter", "is_ul", "counts", "success", "n_dof", ] class FluxMaps: """A flux map / points container. It contains a set of `~gammapy.maps.Map` objects that store the estimated flux as a function of energy as well as associated quantities (typically errors, upper limits, delta TS and possibly raw quantities such counts, excesses etc). It also contains a reference model to convert the flux values in different formats. Usually, this should be the model used to produce the flux map. The associated map geometry can use a `RegionGeom` to store the equivalent of flux points, or a `WcsGeom`/`HpxGeom` to store an energy dependent flux map. 
The container relies internally on the 'Likelihood' SED type defined in :ref:`gadf:flux-points` and offers convenience properties to convert to other flux formats, namely: ``dnde``, ``flux``, ``eflux`` or ``e2dnde``. The conversion is done according to the reference model spectral shape. Parameters ---------- data : dict of `~gammapy.maps.Map` The maps dictionary. Expected entries are the following: * norm : the norm factor. * norm_err : optional, the error on the norm factor. * norm_errn : optional, the negative error on the norm factor. * norm_errp : optional, the positive error on the norm factor. * norm_ul : optional, the upper limit on the norm factor. * norm_scan : optional, the norm values of the test statistic scan. * stat_scan : optional, the test statistic scan values. * ts : optional, the delta test statistic associated with the flux value. * sqrt_ts : optional, the square root of the test statistic, when relevant. * success : optional, a boolean tagging the validity of the estimation. * n_dof : optional, the number of degrees of freedom used in TS computation. * alpha : optional, normalisation factor to account for differences between the test region and the background. * acceptance_off : optional, acceptance from the off region. * acceptance_on : optional, acceptance from the on region. reference_model : `~gammapy.modeling.models.SkyModel`, optional The reference model to use for conversions. If None, a model consisting of a point source with a power law spectrum of index 2 is assumed. meta : dict, optional Dict of metadata. gti : `~gammapy.data.GTI`, optional Maps GTI information. filter_success_nan : boolean, optional Set fitted norm and error to NaN when the fit has not succeeded. """ _expand_slice = (slice(None), np.newaxis, np.newaxis) def __init__( self, data, reference_model, meta=None, gti=None, filter_success_nan=True ): self._data = data if isinstance(reference_model, SpectralModel): reference_model = SkyModel(reference_model) self._reference_model = reference_model if meta is None: meta = {} meta.setdefault("sed_type_init", "likelihood") self.meta = meta self.gti = gti self._filter_success_nan = filter_success_nan @property def filter_success_nan(self): return self._filter_success_nan @filter_success_nan.setter def filter_success_nan(self, value): self._filter_success_nan = value @property def available_quantities(self): """Available quantities.""" return list(self._data.keys()) @staticmethod def all_quantities(sed_type): """All quantities allowed for a given SED type. Parameters ---------- sed_type : {"likelihood", "dnde", "e2dnde", "flux", "eflux"} SED type. Returns ------- list : list of str All allowed quantities for a given SED type.
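Examples -------- >>> from gammapy.estimators import FluxMaps >>> FluxMaps.all_quantities(sed_type="dnde")[:5] ['dnde', 'dnde_err', 'dnde_errp', 'dnde_errn', 'dnde_ul']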
""" quantities = [] quantities += REQUIRED_MAPS[sed_type] quantities += OPTIONAL_QUANTITIES[sed_type] quantities += OPTIONAL_QUANTITIES_COMMON if sed_type == "likelihood": quantities += REQUIRED_QUANTITIES_SCAN return quantities @staticmethod def _validate_data(data, sed_type, check_scan=False): """Check that map input is valid and correspond to one of the SED type.""" try: keys = data.keys() required = set(REQUIRED_MAPS[sed_type]) except KeyError: raise ValueError(f"Unknown SED type: '{sed_type}'") if check_scan: required = required.union(REQUIRED_QUANTITIES_SCAN) if not required.issubset(keys): missing = required.difference(keys) raise ValueError( "Missing data / column for SED type '{}':" " {}".format( sed_type, missing ) ) # TODO: add support for scan def _check_quantity(self, quantity): if quantity not in self.available_quantities: raise AttributeError( f"Quantity '{quantity}' is not defined on current flux estimate." ) @staticmethod def _guess_sed_type(quantities): """Guess SED type from table content.""" valid_sed_types = list(REQUIRED_COLUMNS.keys()) for sed_type in valid_sed_types: required = set(REQUIRED_COLUMNS[sed_type]) if required.issubset(quantities): return sed_type @property def is_convertible_to_flux_sed_type(self): """Check whether differential SED type is convertible to integral SED type.""" if self.sed_type_init in ["dnde", "e2dnde"]: return self.energy_axis.node_type == "edges" return True @property def has_ul(self): """Whether the flux estimate has norm_ul defined.""" return "norm_ul" in self._data @property def has_any_ts(self): """Whether the flux estimate has either sqrt(TS) or test statistic defined.""" return any(_ in self._data for _ in ["ts", "sqrt_ts"]) @property def has_stat_profiles(self): """Whether the flux estimate has test statistic profiles.""" return "stat_scan" in self._data @property def has_success(self): """Whether the flux estimate has the fit status.""" return "success" in self._data @property def n_sigma(self): """n sigma""" return self.meta.get("n_sigma", 1) @property def n_sigma_ul(self): """n sigma UL.""" return self.meta.get("n_sigma_ul") @property def sqrt_ts_threshold_ul(self): """sqrt(TS) threshold for upper limits.""" return self.meta.get("sqrt_ts_threshold_ul", 2) @sqrt_ts_threshold_ul.setter def sqrt_ts_threshold_ul(self, value): """sqrt(TS) threshold for upper limits. Parameters ---------- value : int Threshold value in sqrt(TS) for upper limits. 
""" self.meta["sqrt_ts_threshold_ul"] = value if self.has_any_ts: self.is_ul = self.sqrt_ts < self.sqrt_ts_threshold_ul else: raise ValueError("Either TS or sqrt_ts is required to set the threshold") @property def sed_type_init(self): """Initial SED type.""" return self.meta.get("sed_type_init") @property def sed_type_plot_default(self): """Initial SED type.""" if self.sed_type_init == "likelihood": return "dnde" return self.sed_type_init @property def geom(self): """Reference map geometry as a `~gammapy.maps.Geom`.""" return self.norm.geom @property def energy_axis(self): """Energy axis as a `~gammapy.maps.MapAxis`.""" return self.geom.axes["energy"] @classproperty def reference_model_default(self): """Reference model default as a `~gammapy.modeling.models.SkyModel`.""" return SkyModel(PowerLawSpectralModel(index=2)) @property def reference_model(self): """Reference model as a `~gammapy.modeling.models.SkyModel`.""" return self._reference_model @property def reference_spectral_model(self): """Reference spectral model as a `SpectralModel`.""" return self.reference_model.spectral_model @property def energy_ref(self): """Reference energy. Defined by `energy_ref` column in `FluxPoints.table` or computed as log center, if `energy_min` and `energy_max` columns are present in `FluxEstimate.data`. Returns ------- energy_ref : `~astropy.units.Quantity` Reference energy. """ return self.energy_axis.center @property def energy_min(self): """Energy minimum. Returns ------- energy_min : `~astropy.units.Quantity` Lower bound of energy bin. """ return self.energy_axis.edges[:-1] @property def energy_max(self): """Energy maximum. Returns ------- energy_max : `~astropy.units.Quantity` Upper bound of energy bin. """ return self.energy_axis.edges[1:] # TODO: keep or remove? @property def niter(self): """Number of iterations of fit.""" self._check_quantity("niter") return self._data["niter"] @property def success(self): """Fit success flag.""" self._check_quantity("success") return self._data["success"] @property def is_ul(self): """Whether data is an upper limit.""" # TODO: make this a well defined behaviour is_ul = self.norm.copy(data=False) if "is_ul" in self._data: is_ul = self._data["is_ul"] elif self.has_any_ts and self.has_ul: is_ul.data = self.sqrt_ts.data < self.sqrt_ts_threshold_ul elif self.has_ul: is_ul.data = np.isfinite(self.norm_ul) else: is_ul.data = np.isnan(self.norm) return is_ul @is_ul.setter def is_ul(self, value): """Whether data is an upper limit. Parameters ---------- value : `~Map` Boolean map. 
""" if not isinstance(value, Map): value = self.norm.copy(data=value) self._data["is_ul"] = value @property def counts(self): """Predicted counts null hypothesis.""" self._check_quantity("counts") return self._data["counts"] @property def npred(self): """Predicted counts from best fit hypothesis.""" self._check_quantity("npred") return self._data["npred"] @property def npred_background(self): """Predicted background counts from best fit hypothesis.""" self._check_quantity("npred") self._check_quantity("npred_excess") return self.npred - self.npred_excess @property def npred_excess(self): """Predicted excess count from best fit hypothesis.""" self._check_quantity("npred_excess") return self._data["npred_excess"] def _expand_dims(self, data): # TODO: instead make map support broadcasting axes = self.npred.geom.axes # here we need to rely on broadcasting if "dataset" in axes.names: idx = axes.index_data("dataset") data = np.expand_dims(data, axis=idx) return data @staticmethod def _use_center_as_labels(input_map): """Change the node_type of the input map.""" energy_axis = input_map.geom.axes["energy"] energy_axis.use_center_as_plot_labels = True return input_map @property def npred_excess_ref(self): """Predicted excess reference counts.""" return self.npred_excess / self._expand_dims(self.norm.data) @property def npred_excess_err(self): """Predicted excess counts error.""" return self.npred_excess_ref * self._expand_dims(self.norm_err.data) @property def npred_excess_errp(self): """Predicted excess counts positive error.""" return self.npred_excess_ref * self._expand_dims(self.norm_errp.data) @property def npred_excess_errn(self): """Predicted excess counts negative error.""" return self.npred_excess_ref * self._expand_dims(self.norm_errn.data) @property def npred_excess_ul(self): """Predicted excess counts upper limits.""" return self.npred_excess_ref * self._expand_dims(self.norm_ul.data) @property def stat_scan(self): """Fit statistic scan value.""" self._check_quantity("stat_scan") return self._data["stat_scan"] @property def dnde_scan_values(self): """Fit statistic norm scan values.""" self._check_quantity("dnde_scan_values") return self._data["dnde_scan_values"] @property def stat(self): """Fit statistic value.""" self._check_quantity("stat") return self._data["stat"] @property def stat_null(self): """Fit statistic value for the null hypothesis.""" self._check_quantity("stat_null") return self._data["stat_null"] @property def ts(self): """Test statistic map as a `~gammapy.maps.Map` object.""" self._check_quantity("ts") return self._data["ts"] @property def ts_scan(self): """Test statistic scan as a `~gammapy.maps.Map` object.""" return self.stat_scan - np.expand_dims(self.stat.data, 2) # TODO: always derive sqrt(TS) from TS? @property def sqrt_ts(self): r"""sqrt(TS) as defined by: .. math:: \sqrt{TS} = \left \{ \begin{array}{ll} -\sqrt{TS} & : \text{if} \ norm < 0 \\ \sqrt{TS} & : \text{else} \end{array} \right. Returns ------- sqrt_ts : `Map` sqrt(TS) map. 
""" if "sqrt_ts" in self._data: return self._data["sqrt_ts"] else: with np.errstate(invalid="ignore", divide="ignore"): ts = np.clip(self.ts.data, 0, None) data = np.where(self.norm > 0, np.sqrt(ts), -np.sqrt(ts)) return Map.from_geom(geom=self.geom, data=data) @property def norm(self): """Norm values.""" return self._filter_convergence_failure(self._data["norm"]) @property def norm_err(self): """Norm error.""" self._check_quantity("norm_err") return self._filter_convergence_failure(self._data["norm_err"]) @property def norm_errn(self): """Negative norm error.""" self._check_quantity("norm_errn") return self._data["norm_errn"] @property def norm_errp(self): """Positive norm error.""" self._check_quantity("norm_errp") return self._data["norm_errp"] @property def norm_ul(self): """Norm upper limit.""" self._check_quantity("norm_ul") return self._data["norm_ul"] @property def norm_sensitivity(self): """Norm sensitivity.""" self._check_quantity("norm_sensitivity") return self._data["norm_sensitivity"] @property def n_dof(self): """Number of degrees of freedom of the fit per energy bin.""" self._check_quantity("n_dof") return self._data["n_dof"] @property def alpha(self): """The normalisation, alpha, for differences between the on and off regions.""" self._check_quantity("alpha") return self._data["alpha"] @property def acceptance_on(self): """The acceptance in the on region.""" self._check_quantity("acceptance_on") return self._data["acceptance_on"] @property def acceptance_off(self): """The acceptance in the off region.""" self._check_quantity("acceptance_off") return self._data["acceptance_off"] @property def dnde_ref(self): """Reference differential flux.""" result = self.reference_spectral_model(self.energy_axis.center) return result[self._expand_slice] @property def e2dnde_ref(self): """Reference differential flux * energy ** 2.""" energy = self.energy_axis.center result = self.reference_spectral_model(energy) * energy**2 return result[self._expand_slice] @property def flux_ref(self): """Reference integral flux.""" if not self.is_convertible_to_flux_sed_type: raise ValueError( "Missing energy range definition, cannot convert to SED type 'flux'." ) energy_min = self.energy_axis.edges[:-1] energy_max = self.energy_axis.edges[1:] result = self.reference_spectral_model.integral(energy_min, energy_max) return result[self._expand_slice] @property def eflux_ref(self): """Reference energy flux.""" if not self.is_convertible_to_flux_sed_type: raise ValueError( "Missing energy range definition, cannot convert to SED type 'eflux'." 
) energy_min = self.energy_axis.edges[:-1] energy_max = self.energy_axis.edges[1:] result = self.reference_spectral_model.energy_flux(energy_min, energy_max) return result[self._expand_slice] @property def dnde(self): """Return differential flux (dnde) SED values.""" return self._use_center_as_labels(self.norm * self.dnde_ref) @property def dnde_err(self): """Return differential flux (dnde) SED errors.""" return self._use_center_as_labels(self.norm_err * self.dnde_ref) @property def dnde_errn(self): """Return differential flux (dnde) SED negative errors.""" return self._use_center_as_labels(self.norm_errn * self.dnde_ref) @property def dnde_errp(self): """Return differential flux (dnde) SED positive errors.""" return self._use_center_as_labels(self.norm_errp * self.dnde_ref) @property def dnde_ul(self): """Return differential flux (dnde) SED upper limit.""" return self._use_center_as_labels(self.norm_ul * self.dnde_ref) @property def e2dnde(self): """Return differential energy flux (e2dnde) SED values.""" return self._use_center_as_labels(self.norm * self.e2dnde_ref) @property def e2dnde_err(self): """Return differential energy flux (e2dnde) SED errors.""" return self._use_center_as_labels(self.norm_err * self.e2dnde_ref) @property def e2dnde_errn(self): """Return differential energy flux (e2dnde) SED negative errors.""" return self._use_center_as_labels(self.norm_errn * self.e2dnde_ref) @property def e2dnde_errp(self): """Return differential energy flux (e2dnde) SED positive errors.""" return self._use_center_as_labels(self.norm_errp * self.e2dnde_ref) @property def e2dnde_ul(self): """Return differential energy flux (e2dnde) SED upper limit.""" return self._use_center_as_labels(self.norm_ul * self.e2dnde_ref) @property def flux(self): """Return integral flux (flux) SED values.""" return self.norm * self.flux_ref @property def flux_err(self): """Return integral flux (flux) SED errors.""" return self.norm_err * self.flux_ref @property def flux_errn(self): """Return integral flux (flux) SED negative errors.""" return self.norm_errn * self.flux_ref @property def flux_errp(self): """Return integral flux (flux) SED positive errors.""" return self.norm_errp * self.flux_ref @property def flux_ul(self): """Return integral flux (flux) SED upper limits.""" return self.norm_ul * self.flux_ref @property def flux_sensitivity(self): """Sensitivity given as the flux for which the significance is ``self.meta["n_sigma_sensitivity"]``.""" return self.norm_sensitivity * self.flux_ref @property def eflux(self): """Return energy flux (eflux) SED values.""" return self.norm * self.eflux_ref @property def eflux_err(self): """Return energy flux (eflux) SED errors.""" return self.norm_err * self.eflux_ref @property def eflux_errn(self): """Return energy flux (eflux) SED negative errors.""" return self.norm_errn * self.eflux_ref @property def eflux_errp(self): """Return energy flux (eflux) SED positive errors.""" return self.norm_errp * self.eflux_ref @property def eflux_ul(self): """Return energy flux (eflux) SED upper limits.""" return self.norm_ul * self.eflux_ref def _filter_convergence_failure(self, some_map): """Put NaN where pixels did not converge.""" if not self._filter_success_nan: return some_map if not self.has_success: return some_map if self.success.data.shape == some_map.data.shape: new_map = some_map.copy() new_map.data[~self.success.data] = np.nan else: mask_nan = np.ones(self.success.data.shape) mask_nan[~self.success.data] = np.nan new_map = some_map * np.expand_dims(mask_nan, 2) new_map.data =
new_map.data.astype(some_map.data.dtype) return new_map def get_flux_points(self, position=None): """Extract flux point at a given position. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Position where the flux points are extracted. Returns ------- flux_points : `~gammapy.estimators.FluxPoints` Flux points object. """ from gammapy.estimators import FluxPoints if position is None: position = self.geom.center_skydir data = {} for name in self._data: m = getattr(self, name) if m.data.dtype is np.dtype(bool): data[name] = m.to_region_nd_map( region=position, method="nearest", func=np.any ) else: data[name] = m.to_region_nd_map(region=position, method="nearest") return FluxPoints( data, reference_model=self.reference_model, meta=self.meta.copy(), gti=self.gti, ) def to_maps(self, sed_type=None): """Return maps in a given SED type. Parameters ---------- sed_type : {"likelihood", "dnde", "e2dnde", "flux", "eflux"}, optional SED type to convert to. If None, the initial SED type is used. Default is None. Returns ------- maps : `Maps` Maps object containing the requested maps. """ maps = Maps() if sed_type is None: sed_type = self.sed_type_init for quantity in self.all_quantities(sed_type=sed_type): m = getattr(self, quantity, None) if m is not None: maps[quantity] = m return maps @classmethod def from_stack(cls, maps, axis, meta=None): """Create flux maps by stacking a list of flux maps. The first `FluxMaps` object in the list is taken as a reference to infer column names and units for the stacked object. Parameters ---------- maps : list of `FluxMaps` List of maps to stack. axis : `MapAxis` New axis to create. meta : dict, optional Metadata of the resulting flux maps. Default is None. Returns ------- flux_maps : `FluxMaps` Stacked flux maps along axis. """ reference = maps[0] data = {} for quantity in reference.available_quantities: data[quantity] = Map.from_stack( [_._data[quantity] for _ in maps], axis=axis ) if meta is None: meta = reference.meta.copy() gtis = [_.gti for _ in maps if _.gti] if gtis: gti = GTI.from_stack(gtis) else: gti = None return cls( data=data, reference_model=reference.reference_model, meta=meta, gti=gti ) def iter_by_axis(self, axis_name): """Create a set of FluxMaps by splitting along an axis. Parameters ---------- axis_name : str Name of the axis to split on. Yields ------ flux_maps : `FluxMaps` FluxMaps iteration. """ split_maps = {} axis = self.geom.axes[axis_name] gti = self.gti for amap in self.available_quantities: split_maps[amap] = list(getattr(self, amap).iter_by_axis(axis_name)) for idx in range(axis.nbin): maps = {} for amap in self.available_quantities: maps[amap] = split_maps[amap][idx] if isinstance(axis, TimeMapAxis): gti = self.gti.select_time([axis.time_min[idx], axis.time_max[idx]]) yield self.__class__.from_maps( maps=maps, sed_type=self.sed_type_init, reference_model=self.reference_model, gti=gti, meta=self.meta, ) @classmethod def from_maps(cls, maps, sed_type=None, reference_model=None, gti=None, meta=None): """Create FluxMaps from a dictionary of maps. Parameters ---------- maps : `Maps` Maps object containing the input maps. sed_type : str, optional SED type of the input maps. If None, set to "likelihood". Default is None. reference_model : `~gammapy.modeling.models.SkyModel`, optional Reference model to use for conversions. If None, a model consisting of a point source with a power law spectrum of index 2 is assumed. Default is None. gti : `~gammapy.data.GTI`, optional Maps GTI information. Default is None.
meta : dict, optional Meta dictionary. Default is None. Returns ------- flux_maps : `~gammapy.estimators.FluxMaps` Flux maps object. """ if sed_type is None: sed_type = cls._guess_sed_type(maps.keys()) if sed_type is None: raise ValueError("Specifying the SED type is required") cls._validate_data(data=maps, sed_type=sed_type) if sed_type == "likelihood": return cls(data=maps, reference_model=reference_model, gti=gti, meta=meta) if reference_model is None: log.warning( "No reference model set for FluxMaps. Assuming point source with E^-2 spectrum." ) reference_model = cls.reference_model_default elif isinstance(reference_model, SpectralModel): reference_model = SkyModel(reference_model) map_ref = maps[sed_type] energy_axis = map_ref.geom.axes["energy"] with np.errstate(invalid="ignore", divide="ignore"): fluxes = reference_model.spectral_model.reference_fluxes( energy_axis=energy_axis ) # TODO: handle reshaping in MapAxis factor = fluxes[f"ref_{sed_type}"].to(map_ref.unit)[cls._expand_slice] data = {} data["norm"] = map_ref / factor for key in OPTIONAL_QUANTITIES[sed_type]: if key in maps: norm_type = key.replace(sed_type, "norm") data[norm_type] = maps[key] / factor # We add the remaining maps for key in OPTIONAL_QUANTITIES_COMMON: if key in maps: data[key] = maps[key] return cls(data=data, reference_model=reference_model, gti=gti, meta=meta) def to_hdulist(self, sed_type=None, hdu_bands=None): """Convert flux map to list of HDUs. For now, one cannot export the reference model. Parameters ---------- sed_type : str, optional SED type to convert to. If None, set to "likelihood". Default is None. hdu_bands : str, optional Name of the HDU with the BANDS table. If set to None, each map will have its own hdu_band. Default is None. Returns ------- hdulist : `~astropy.io.fits.HDUList` Map dataset list of HDUs. """ if sed_type is None: sed_type = self.sed_type_init exclude_primary = slice(1, None) hdu_primary = fits.PrimaryHDU() hdu_primary.header["SED_TYPE"] = sed_type hdulist = fits.HDUList([hdu_primary]) maps = self.to_maps(sed_type=sed_type) hdulist.extend(maps.to_hdulist(hdu_bands=hdu_bands)[exclude_primary]) if self.gti: hdu = self.gti.to_table_hdu(format="gadf") hdulist.append(hdu) return hdulist @classmethod def from_hdulist(cls, hdulist, hdu_bands=None, sed_type=None, checksum=False): """Create flux map dataset from list of HDUs. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` List of HDUs. hdu_bands : str, optional Name of the HDU with the BANDS table. If set to None, each map should have its own hdu_band. Default is None. sed_type : {"dnde", "flux", "e2dnde", "eflux", "likelihood"}, optional Sed type. Default is None. Returns ------- flux_maps : `~gammapy.estimators.FluxMaps` Flux maps object. """ maps = Maps.from_hdulist(hdulist=hdulist, hdu_bands=hdu_bands) if sed_type is None: sed_type = hdulist[0].header.get("SED_TYPE", None) filename = hdulist[0].header.get("MODEL", None) if filename: reference_model = Models.read(filename, checksum=checksum)[0] else: reference_model = None if "GTI" in hdulist: gti = GTI.from_table_hdu(hdulist["GTI"]) else: gti = None return cls.from_maps( maps=maps, sed_type=sed_type, reference_model=reference_model, gti=gti ) def write( self, filename, filename_model=None, overwrite=False, sed_type=None, checksum=False, ): """Write flux map to file. Parameters ---------- filename : str Filename to write to. filename_model : str, optional Filename of the model (yaml format). If None, keep string before '.' and add '_model.yaml' suffix. Default is None.
overwrite : bool, optional Overwrite existing file. Default is False. sed_type : str, optional Sed type to convert to. If None, set to "likelihood". Default is None. checksum : bool, optional When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. """ if sed_type is None: sed_type = self.sed_type_init filename = make_path(filename) if filename_model is None: name_string = filename.as_posix() for suffix in filename.suffixes: name_string = name_string.replace(suffix, "") filename_model = name_string + "_model.yaml" filename_model = make_path(filename_model) hdulist = self.to_hdulist(sed_type) models = Models(self.reference_model) models.write(filename_model, overwrite=overwrite, write_covariance=False) hdulist[0].header["MODEL"] = filename_model.as_posix() hdulist.writeto(filename, overwrite=overwrite) @classmethod def read(cls, filename, checksum=False): """Read map dataset from file. Parameters ---------- filename : str Filename to read from. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- flux_maps : `~gammapy.estimators.FluxMaps` Flux maps object. """ with fits.open( str(make_path(filename)), memmap=False, checksum=checksum ) as hdulist: return cls.from_hdulist(hdulist, checksum=checksum) def copy(self, reference_model=None): """Deep copy. Parameters ---------- reference_model : `~gammapy.modeling.models.SkyModel`, optional The reference model to use for conversions. If None, the original model is copied. Flux maps have been obtained for a specific reference model. Changing it will change the fluxes. Handle with care. Returns ------- flux_maps : `~gammapy.estimators.FluxMaps` Copied flux maps object. """ new = deepcopy(self) if reference_model is not None: new._reference_model = reference_model.copy() log.warning( "Changing the reference model will change the fluxes. Handle with care." ) return new def slice_by_idx(self, slices): """Slice flux maps by index. Parameters ---------- slices : dict Dictionary of axes names and integers or `slice` object pairs. Contains one element for each non-spatial dimension. For integer indexing the corresponding axis is dropped from the map. Axes not specified in the dict are kept unchanged. Returns ------- flux_maps : `FluxMaps` Sliced flux maps object. Examples -------- >>> from gammapy.estimators import FluxPoints >>> import astropy.units as u >>> fp = FluxPoints.read("$GAMMAPY_DATA/estimators/crab_hess_fp/crab_hess_fp.fits") >>> slices = {"energy": slice(0, 2)} >>> sliced = fp.slice_by_idx(slices) """ data = {} for key, item in self._data.items(): data[key] = item.slice_by_idx(slices) return self.__class__( data=data, reference_model=self.reference_model, meta=self.meta.copy(), gti=self.gti, ) def slice_by_coord(self, slices): """Slice flux maps by coordinate values. Parameters ---------- slices : dict Dictionary of axes names and `astropy.Quantity` or `astropy.Time` or `slice` object pairs. Contains one element for each non-spatial dimension. For integer indexing the corresponding axis is dropped from the map. Axes not specified in the dict are kept unchanged. Returns ------- flux_maps : `FluxMaps` Sliced flux maps object.
Examples -------- >>> from gammapy.estimators import FluxPoints >>> import astropy.units as u >>> lc_1d = FluxPoints.read("$GAMMAPY_DATA/estimators/pks2155_hess_lc/pks2155_hess_lc.fits") >>> slices = {"time": slice(2035.93*u.day, 2036.05*u.day)} >>> sliced = lc_1d.slice_by_coord(slices) """ idx_intervals = [] for key, interval in zip(slices.keys(), slices.values()): axis = self.geom.axes[key] group = axis.group_table([interval.start, interval.stop]) is_normal = group["bin_type"] == "normal " group = group[is_normal] idx_intervals.append( slice(int(group["idx_min"][0]), int(group["idx_max"][0] + 1)) ) return self.slice_by_idx(dict(zip(slices.keys(), idx_intervals))) def slice_by_time(self, time_min, time_max): """Slice flux maps by coordinate values along the time axis. Parameters ---------- time_min, time_max : `~astropy.time.Time` Time bounds used to slice the flux map. Returns ------- flux_maps : `FluxMaps` Sliced flux maps object. Examples -------- >>> from gammapy.estimators import FluxPoints >>> import astropy.units as u >>> lc_1d = FluxPoints.read("$GAMMAPY_DATA/estimators/pks2155_hess_lc/pks2155_hess_lc.fits") >>> sliced = lc_1d.slice_by_time(time_min=2035.93*u.day, time_max=2036.05*u.day) """ time_slice = slice(time_min, time_max) return self.slice_by_coord({"time": time_slice}) def slice_by_energy(self, energy_min, energy_max): """Slice flux maps by coordinate values along the energy axis. Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Energy bounds used to slice the flux map. Returns ------- flux_maps : `FluxMaps` Sliced flux maps object. Examples -------- >>> from gammapy.estimators import FluxPoints >>> import astropy.units as u >>> fp = FluxPoints.read("$GAMMAPY_DATA/estimators/crab_hess_fp/crab_hess_fp.fits") >>> sliced = fp.slice_by_energy(energy_min=2*u.TeV, energy_max=10*u.TeV) """ energy_slice = slice(energy_min, energy_max) return self.slice_by_coord({"energy": energy_slice}) # TODO: should we allow this? def __getitem__(self, item): return getattr(self, item) def __str__(self): str_ = f"{self.__class__.__name__}\n" str_ += "-" * len(self.__class__.__name__) str_ += "\n\n" str_ += "\t" + f"geom : {self.geom.__class__.__name__}\n" str_ += "\t" + f"axes : {self.geom.axes_names}\n" str_ += "\t" + f"shape : {self.geom.data_shape[::-1]}\n" str_ += "\t" + f"quantities : {list(self.available_quantities)}\n" str_ += ( "\t" + f"ref. 
model : {self.reference_spectral_model.tag[-1]}\n" ) str_ += "\t" + f"n_sigma : {self.n_sigma}\n" str_ += "\t" + f"n_sigma_ul : {self.n_sigma_ul}\n" str_ += "\t" + f"sqrt_ts_threshold_ul : {self.sqrt_ts_threshold_ul}\n" str_ += "\t" + f"sed type init : {self.sed_type_init}\n" return str_.expandtabs(tabsize=2) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/excess.py0000644000175100001770000004226414721316200020667 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy import logging import numpy as np from astropy.convolution import Tophat2DKernel from astropy.coordinates import Angle from gammapy.datasets import MapDataset, MapDatasetOnOff from gammapy.maps import Map from gammapy.modeling.models import PowerLawSpectralModel, SkyModel from gammapy.stats import CashCountsStatistic, WStatCountsStatistic from ..core import Estimator from ..utils import estimate_exposure_reco_energy from .core import FluxMaps __all__ = ["ExcessMapEstimator"] log = logging.getLogger(__name__) def _get_convolved_maps(dataset, kernel, mask, correlate_off): """Return convolved maps. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.MapDatasetOnOff` Map dataset. kernel : `~astropy.convolution.Kernel` Kernel. mask : `~gammapy.maps.Map` Mask map. correlate_off : bool Correlate OFF events. Returns ------- convolved_maps : dict Dictionary of convolved maps. """ # Kernel is modified later make a copy here kernel = copy.deepcopy(kernel) kernel_data = kernel.data / kernel.data.max() # fft convolution adds numerical noise, to ensure integer results we call # np.rint n_on = dataset.counts * mask n_on_conv = np.rint(n_on.convolve(kernel_data).data) convolved_maps = {"n_on_conv": n_on_conv} if isinstance(dataset, MapDatasetOnOff): n_off = dataset.counts_off * mask npred_sig = dataset.npred_signal() * mask acceptance_on = dataset.acceptance * mask acceptance_off = dataset.acceptance_off * mask npred_sig_convolve = npred_sig.convolve(kernel_data) if correlate_off: background = dataset.background * mask background.data[dataset.acceptance_off == 0] = 0.0 background_conv = background.convolve(kernel_data) n_off = n_off.convolve(kernel_data) with np.errstate(invalid="ignore", divide="ignore"): alpha = background_conv / n_off else: acceptance_on_convolve = acceptance_on.convolve(kernel_data) with np.errstate(invalid="ignore", divide="ignore"): alpha = acceptance_on_convolve / acceptance_off convolved_maps.update( { "n_off": n_off, "npred_sig_convolve": npred_sig_convolve, "acceptance_on": acceptance_on, "acceptance_off": acceptance_off, "alpha": alpha, } ) else: npred = dataset.npred() * mask background_conv = npred.convolve(kernel_data) convolved_maps.update( { "background_conv": background_conv, } ) return convolved_maps def convolved_map_dataset_counts_statistics(convolved_maps, stat_type): """Return a `CountsStatistic` object. Parameters ---------- convolved_maps : dict Dictionary of convolved maps. stat_type : str The statistic type, either 'wstat' or 'cash'. Returns ------- counts_statistic : `~gammapy.stats.CashCountsStatistic` or `~gammapy.stats.WStatCountsStatistic` The counts statistic. 
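Notes ----- For ``stat_type="wstat"`` the dictionary must contain the entries ``"n_on_conv"``, ``"n_off"``, ``"alpha"`` and ``"npred_sig_convolve"``; for ``stat_type="cash"`` only ``"n_on_conv"`` and ``"background_conv"`` are required.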
""" if stat_type == "wstat": n_on_conv = convolved_maps["n_on_conv"] n_off = convolved_maps["n_off"] alpha = convolved_maps["alpha"] npred_sig_convolve = convolved_maps["npred_sig_convolve"] return WStatCountsStatistic( n_on_conv.data, n_off.data, alpha.data, npred_sig_convolve.data ) elif stat_type == "cash": n_on_conv = convolved_maps["n_on_conv"] background_conv = convolved_maps["background_conv"] return CashCountsStatistic(n_on_conv.data, background_conv.data) class ExcessMapEstimator(Estimator): """Computes correlated excess, significance and error maps from a map dataset. If a model is set on the dataset the excess map estimator will compute the excess taking into account the predicted counts of the model. .. note:: By default, the excess estimator correlates the off counts as well to avoid artifacts at the edges of the :term:`FoV` for stacked on-off datasets. However, when the on-off dataset has been derived from a ring background estimate, this leads to the off counts being correlated twice. To avoid artifacts and the double correlation, the `ExcessMapEstimator` has to be applied per dataset and the resulting maps need to be stacked, taking the :term:`FoV` cut into account. Parameters ---------- correlation_radius : `~astropy.coordinates.Angle` Correlation radius to use. n_sigma : float Confidence level for the asymmetric errors expressed in number of sigma. n_sigma_ul : float Confidence level for the upper limits expressed in number of sigma. n_sigma_sensitivity : float Confidence level for the sensitivity expressed in number of sigma. gamma_min_sensitivity : float, optional Minimum number of gamma-rays. Default is 10. bkg_syst_fraction_sensitivity : float, optional Fraction of background counts that are above the gamma-ray counts. Default is 0.05. apply_threshold_sensitivity : bool If True, use `bkg_syst_fraction_sensitivity` and `gamma_min_sensitivity` in the sensitivity computation. Default is False which is the same setting as the HGPS catalog. selection_optional : list of str, optional Which additional maps to estimate besides delta TS, significance and symmetric error. Available options are: * "all": all the optional steps are executed. * "errn-errp": estimate asymmetric errors. * "ul": estimate upper limits. * "sensitivity": estimate sensitivity for a given significance. * "alpha": normalisation factor to accounts for differences between the on and off regions. * "acceptance_on": acceptance from the on region. * "acceptance_off": acceptange from the off region. Default is None so the optional steps are not executed. Note: "alpha", "acceptance_on" and "acceptance_off" can only be selected if the dataset is a `~gammapy.datasets.MapDatasetOnOff`. energy_edges : list of `~astropy.units.Quantity`, optional Edges of the target maps energy bins. The resulting bin edges won't be exactly equal to the input ones, but rather the closest values to the energy axis edges of the parent dataset. Default is None: apply the estimator in each energy bin of the parent dataset. For further explanation see :ref:`estimators`. correlate_off : bool Correlate OFF events. Default is True. spectral_model : `~gammapy.modeling.models.SpectralModel` Spectral model used for the computation of the flux map. If None, a `~gammapy.modeling.models.PowerLawSpectralModel` of index 2 is assumed (default). sum_over_energy_groups : bool Only used if ``energy_edges`` is None. If False, apply the estimator in each energy bin of the parent dataset. 
If True, apply the estimator in only one bin defined by the energy edges of the parent dataset. Default is False. Examples -------- >>> from gammapy.datasets import MapDataset >>> from gammapy.estimators import ExcessMapEstimator >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> estimator = ExcessMapEstimator(correlation_radius="0.1 deg") >>> result = estimator.run(dataset) >>> print(result) FluxMaps -------- geom : WcsGeom axes : ['lon', 'lat', 'energy'] shape : (np.int64(320), np.int64(240), 1) quantities : ['npred', 'npred_excess', 'counts', 'ts', 'sqrt_ts', 'norm', 'norm_err'] ref. model : pl n_sigma : 1 n_sigma_ul : 2 sqrt_ts_threshold_ul : 2 sed type init : likelihood """ tag = "ExcessMapEstimator" _available_selection_optional = [ "errn-errp", "ul", "sensitivity", "alpha", "acceptance_on", "acceptance_off", ] def __init__( self, correlation_radius="0.1 deg", n_sigma=1, n_sigma_ul=2, selection_optional=None, energy_edges=None, correlate_off=True, spectral_model=None, n_sigma_sensitivity=5, gamma_min_sensitivity=10, bkg_syst_fraction_sensitivity=0.05, apply_threshold_sensitivity=False, sum_over_energy_groups=False, ): self.correlation_radius = correlation_radius self.n_sigma = n_sigma self.n_sigma_ul = n_sigma_ul self.n_sigma_sensitivity = n_sigma_sensitivity self.gamma_min_sensitivity = gamma_min_sensitivity self.bkg_syst_fraction_sensitivity = bkg_syst_fraction_sensitivity self.apply_threshold_sensitivity = apply_threshold_sensitivity self.selection_optional = selection_optional self.energy_edges = energy_edges self.sum_over_energy_groups = sum_over_energy_groups self.correlate_off = correlate_off if spectral_model is None: spectral_model = PowerLawSpectralModel(index=2) self.spectral_model = spectral_model @property def correlation_radius(self): return self._correlation_radius @correlation_radius.setter def correlation_radius(self, correlation_radius): """Set radius.""" self._correlation_radius = Angle(correlation_radius) def run(self, dataset): """Compute correlated excess, Li & Ma significance and flux maps. If a model is set on the dataset the excess map estimator will compute the excess taking into account the predicted counts of the model. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.MapDatasetOnOff` Map dataset. Returns ------- maps : `FluxMaps` Flux maps. """ if not isinstance(dataset, MapDataset): raise ValueError( "Unsupported dataset type. Excess map is not applicable to 1D datasets." ) axis = self._get_energy_axis(dataset) resampled_dataset = dataset.resample_energy_axis( energy_axis=axis, name=dataset.name ) if dataset.exposure: reco_exposure = estimate_exposure_reco_energy( dataset, self.spectral_model, normalize=False ) reco_exposure = reco_exposure.resample_axis( axis=axis, weights=dataset.mask_safe ) else: reco_exposure = None if isinstance(dataset, MapDatasetOnOff): resampled_dataset.models = dataset.models else: resampled_dataset.background = dataset.npred().resample_axis( axis=axis, weights=dataset.mask_safe ) resampled_dataset.models = None result = self.estimate_excess_map(resampled_dataset, reco_exposure) return result def estimate_kernel(self, dataset): """Get the convolution kernel for the input dataset. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input dataset. Returns ------- kernel : `~gammapy.maps.Map` Kernel map.
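Notes ----- The kernel is evaluated on an image geometry enlarged to an odd number of pixels covering the correlation radius, so that it is always centred on a pixel.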
""" pixel_size = np.mean(np.abs(dataset.counts.geom.wcs.wcs.cdelt)) size = self.correlation_radius.deg / pixel_size kernel = Tophat2DKernel(size) geom = dataset.counts.geom.to_image() geom = geom.to_odd_npix(max_radius=self.correlation_radius) return Map.from_geom(geom, data=kernel.array) @staticmethod def estimate_mask_default(dataset): """Get mask used by the estimator. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input dataset. Returns ------- mask : `~gammapy.maps.Map` Mask map. """ if dataset.mask_fit: mask = dataset.mask elif dataset.mask_safe: mask = dataset.mask_safe else: mask = Map.from_geom(dataset.counts.geom, data=True, dtype=bool) return mask def estimate_exposure_reco_energy(self, dataset, kernel, mask, reco_exposure): """Estimate exposure map in reconstructed energy for a single dataset assuming the given spectral_model shape. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Map dataset. kernel : `~astropy.convolution.Tophat2DKernel` Kernel. mask : `~gammapy.maps.Map` Mask map. Returns ------- reco_exposure : `~gammapy.maps.Map` Reconstructed exposure map. """ if dataset.exposure: with np.errstate(invalid="ignore", divide="ignore"): reco_exposure = reco_exposure.convolve(kernel.data) / mask.convolve( kernel.data ) else: reco_exposure = 1 return reco_exposure def estimate_excess_map(self, dataset, reco_exposure): """Estimate excess and test statistic maps for a single dataset. If exposure is defined, a flux map is also computed. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Map dataset. """ kernel = self.estimate_kernel(dataset) geom = dataset.counts.geom mask = self.estimate_mask_default(dataset) convolved_maps = _get_convolved_maps(dataset, kernel, mask, self.correlate_off) counts_stat = convolved_map_dataset_counts_statistics( convolved_maps=convolved_maps, stat_type=dataset.stat_type ) maps = {} maps["npred"] = Map.from_geom(geom, data=counts_stat.n_on) maps["npred_excess"] = Map.from_geom(geom, data=counts_stat.n_sig) maps["counts"] = maps["npred"] maps["ts"] = Map.from_geom(geom, data=counts_stat.ts) maps["sqrt_ts"] = Map.from_geom(geom, data=counts_stat.sqrt_ts) reco_exposure = self.estimate_exposure_reco_energy( dataset, kernel, mask, reco_exposure ) with np.errstate(invalid="ignore", divide="ignore"): maps["norm"] = maps["npred_excess"] / reco_exposure maps["norm_err"] = ( Map.from_geom(geom, data=counts_stat.error * self.n_sigma) / reco_exposure ) if "errn-errp" in self.selection_optional: maps["norm_errn"] = ( Map.from_geom(geom, data=counts_stat.compute_errn(self.n_sigma)) / reco_exposure ) maps["norm_errp"] = ( Map.from_geom(geom, data=counts_stat.compute_errp(self.n_sigma)) / reco_exposure ) if "ul" in self.selection_optional: maps["norm_ul"] = ( Map.from_geom( geom, data=counts_stat.compute_upper_limit(self.n_sigma_ul) ) / reco_exposure ) if "sensitivity" in self.selection_optional: excess_counts = counts_stat.n_sig_matching_significance( self.n_sigma_sensitivity ) if self.apply_threshold_sensitivity: is_gamma_limited = excess_counts < self.gamma_min_sensitivity excess_counts[is_gamma_limited] = self.gamma_min_sensitivity bkg_syst_limited = ( excess_counts < self.bkg_syst_fraction_sensitivity * dataset.background.data ) excess_counts[bkg_syst_limited] = ( self.bkg_syst_fraction_sensitivity * dataset.background.data[bkg_syst_limited] ) excess = Map.from_geom(geom=geom, data=excess_counts) maps["norm_sensitivity"] = excess / reco_exposure if isinstance(dataset, MapDatasetOnOff): keys_onoff = set(["alpha", 
"acceptance_on", "acceptance_off"]) for key in keys_onoff.intersection(self.selection_optional): maps[key] = convolved_maps[key] # return nan values outside mask for name in maps: maps[name].data[~mask] = np.nan meta = { "n_sigma": self.n_sigma, "n_sigma_ul": self.n_sigma_ul, "n_sigma_sensitivity": self.n_sigma_sensitivity, "sed_type_init": "likelihood", } if self.apply_threshold_sensitivity: meta["gamma_min_sensitivity"] = self.gamma_min_sensitivity meta["bkg_syst_fraction_sensitivity"] = self.bkg_syst_fraction_sensitivity return FluxMaps.from_maps( maps=maps, meta=meta, reference_model=SkyModel(self.spectral_model), sed_type="likelihood", ) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2006419 gammapy-1.3/gammapy/estimators/map/tests/0000755000175100001770000000000014721316215020163 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/tests/__init__.py0000644000175100001770000000010014721316200022255 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/tests/test_asmooth.py0000644000175100001770000001003714721316200023241 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose import astropy.units as u from astropy.convolution import Tophat2DKernel from gammapy.datasets import Datasets, MapDataset, MapDatasetOnOff from gammapy.estimators import ASmoothMapEstimator from gammapy.maps import Map, MapAxis, WcsNDMap from gammapy.utils.testing import requires_data from gammapy.modeling.models import PowerLawSpectralModel from gammapy.utils.deprecation import GammapyDeprecationWarning @pytest.fixture(scope="session") def input_dataset_simple(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) filename = "$GAMMAPY_DATA/tests/unbundled/poisson_stats_image/input_all.fits.gz" bkg_map = Map.read(filename, hdu="background") counts = Map.read(filename, hdu="counts") counts = counts.to_cube(axes=[axis]) bkg_map = bkg_map.to_cube(axes=[axis]) return MapDataset(counts=counts, background=bkg_map, name="test") @pytest.fixture(scope="session") def input_dataset(): filename = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_datasets.yaml" filename_models = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_models.yaml" datasets = Datasets.read(filename=filename, filename_models=filename_models) dataset = datasets[0] dataset.psf = None return dataset @requires_data() def test_asmooth(input_dataset_simple): kernel = Tophat2DKernel scales = ASmoothMapEstimator.get_scales(3, factor=2, kernel=kernel) * 0.1 * u.deg with pytest.warns(GammapyDeprecationWarning): ASmoothMapEstimator(scales=scales, spectrum=PowerLawSpectralModel()) asmooth = ASmoothMapEstimator( scales=scales, kernel=kernel, method="lima", threshold=2.5 ) smoothed = asmooth.estimate_maps(input_dataset_simple) desired = { "counts": 6.454327, "background": 1.0, "scale": 0.056419, "sqrt_ts": 18.125747, } for name in smoothed: actual = smoothed[name].data[0, 100, 100] assert_allclose(actual, desired[name], rtol=1e-5) @requires_data() def test_asmooth_dataset(input_dataset): kernel = Tophat2DKernel scales = ASmoothMapEstimator.get_scales(3, factor=2, kernel=kernel) * 0.1 * u.deg asmooth = ASmoothMapEstimator( scales=scales, kernel=kernel, method="lima", threshold=2.5 ) 
smoothed = asmooth.run(input_dataset) assert smoothed["flux"].data.shape == (1, 40, 50) assert smoothed["flux"].unit == u.Unit("cm-2s-1") assert smoothed["counts"].unit == u.Unit("") assert smoothed["background"].unit == u.Unit("") assert smoothed["scale"].unit == u.Unit("deg") desired = { "counts": 369.479167, "background": 0.184005, "scale": 0.056419, "sqrt_ts": 72.971513, "flux": 1.237119e-09, } for name in smoothed: actual = smoothed[name].data[0, 20, 25] assert_allclose(actual, desired[name], rtol=1e-2) def test_asmooth_map_dataset_on_off(): kernel = Tophat2DKernel scales = ASmoothMapEstimator.get_scales(3, factor=2, kernel=kernel) * 0.1 * u.deg asmooth = ASmoothMapEstimator( kernel=kernel, scales=scales, method="lima", threshold=2.5 ) axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) counts = WcsNDMap.create(npix=(50, 50), binsz=0.02, unit="", axes=[axis]) counts += 2 counts_off = WcsNDMap.create(npix=(50, 50), binsz=0.02, unit="", axes=[axis]) counts_off += 3 acceptance = WcsNDMap.create(npix=(50, 50), binsz=0.02, unit="", axes=[axis]) acceptance += 1 acceptance_off = WcsNDMap.create(npix=(50, 50), binsz=0.02, unit="", axes=[axis]) acceptance_off += 3 dataset = MapDatasetOnOff( counts=counts, counts_off=counts_off, acceptance=acceptance, acceptance_off=acceptance_off, ) smoothed = asmooth.run(dataset) assert_allclose(smoothed["counts"].data[0, 25, 25], 2) assert_allclose(smoothed["background"].data[0, 25, 25], 1) assert_allclose(smoothed["sqrt_ts"].data[0, 25, 25], 4.39, rtol=1e-2) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/tests/test_core.py0000644000175100001770000005310114721316200022516 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from scipy import stats import astropy.units as u from astropy.coordinates import SkyCoord from astropy.io import fits from astropy.time import Time from gammapy.data import GTI from gammapy.estimators import FluxMaps from gammapy.estimators.utils import combine_flux_maps from gammapy.maps import MapAxis, Maps, RegionGeom, TimeMapAxis, WcsNDMap from gammapy.modeling.models import ( LogParabolaSpectralModel, PointSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.testing import mpl_plot_check @pytest.fixture(scope="session") def reference_model(): return SkyModel( spatial_model=PointSpatialModel(), spectral_model=PowerLawSpectralModel(index=2) ) @pytest.fixture(scope="session") def logpar_reference_model(): logpar = LogParabolaSpectralModel( amplitude="2e-12 cm-2s-1TeV-1", alpha=1.5, beta=0.5 ) return SkyModel(spatial_model=PointSpatialModel(), spectral_model=logpar) @pytest.fixture(scope="session") def wcs_flux_map(): energy_axis = MapAxis.from_energy_bounds(0.1, 10, 2, unit="TeV") map_dict = {} map_dict["norm"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["norm"].data += 1.0 map_dict["norm_err"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["norm_err"].data += 0.1 map_dict["norm_errp"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["norm_errp"].data += 0.2 map_dict["norm_errn"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["norm_errn"].data += 0.2 map_dict["norm_ul"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["norm_ul"].data += 2.0 # 
Add another map map_dict["sqrt_ts"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["sqrt_ts"].data += 1.0 # Add another map map_dict["ts"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["ts"].data[1] += 3.0 # Add another map map_dict["success"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="", dtype=np.dtype(bool) ) map_dict["success"].data = True map_dict["success"].data[0, 0, 1] = False return map_dict @pytest.fixture(scope="session") def partial_wcs_flux_map(): energy_axis = MapAxis.from_energy_bounds(0.1, 10, 2, unit="TeV") map_dict = {} map_dict["norm"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["norm"].data += 1.0 map_dict["norm_err"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["norm_err"].data += 0.1 # Add another map map_dict["sqrt_ts"] = WcsNDMap.create( npix=10, frame="galactic", axes=[energy_axis], unit="" ) map_dict["sqrt_ts"].data += 1.0 return map_dict @pytest.fixture(scope="session") def region_map_flux_estimate(): axis = MapAxis.from_energy_edges((0.1, 1.0, 10.0), unit="TeV") geom = RegionGeom.create("galactic;circle(0, 0, 0.1)", axes=[axis]) maps = Maps.from_geom( geom=geom, names=["norm", "norm_err", "norm_errn", "norm_errp", "norm_ul"] ) maps["norm"].data = np.array([1.0, 1.0]) maps["norm_err"].data = np.array([0.1, 0.1]) maps["norm_errn"].data = np.array([0.2, 0.2]) maps["norm_errp"].data = np.array([0.15, 0.15]) maps["norm_ul"].data = np.array([2.0, 2.0]) return maps @pytest.fixture(scope="session") def map_flux_estimate(): axis = MapAxis.from_energy_edges((0.1, 1.0, 10.0), unit="TeV") nmap = WcsNDMap.create(npix=5, axes=[axis]) cols = dict() cols["norm"] = nmap.copy(data=1.0) cols["norm_err"] = nmap.copy(data=0.1) cols["norm_errn"] = nmap.copy(data=0.2) cols["norm_errp"] = nmap.copy(data=0.15) cols["norm_ul"] = nmap.copy(data=2.0) return cols def test_table_properties(region_map_flux_estimate): model = SkyModel(PowerLawSpectralModel(amplitude="1e-10 cm-2s-1TeV-1", index=2)) fe = FluxMaps(data=region_map_flux_estimate, reference_model=model) assert fe.dnde.unit == u.Unit("cm-2s-1TeV-1") assert_allclose(fe.dnde.data.flat, [1e-9, 1e-11]) assert_allclose(fe.dnde_err.data.flat, [1e-10, 1e-12]) assert_allclose(fe.dnde_errn.data.flat, [2e-10, 2e-12]) assert_allclose(fe.dnde_errp.data.flat, [1.5e-10, 1.5e-12]) assert_allclose(fe.dnde_ul.data.flat, [2e-9, 2e-11]) assert fe.e2dnde.unit == u.Unit("TeV cm-2s-1") assert_allclose(fe.e2dnde.data.flat, [1e-10, 1e-10]) assert fe.flux.unit == u.Unit("cm-2s-1") assert_allclose(fe.flux.data.flat, [9e-10, 9e-11]) assert fe.eflux.unit == u.Unit("TeV cm-2s-1") assert_allclose(fe.eflux.data.flat, [2.302585e-10, 2.302585e-10]) def test_missing_column(region_map_flux_estimate): del region_map_flux_estimate["norm_errn"] model = SkyModel(PowerLawSpectralModel(amplitude="1e-10 cm-2s-1TeV-1", index=2)) fe = FluxMaps(data=region_map_flux_estimate, reference_model=model) with pytest.raises(AttributeError): fe.dnde_errn def test_map_properties(map_flux_estimate): model = SkyModel(PowerLawSpectralModel(amplitude="1e-10 cm-2s-1TeV-1", index=2)) fe = FluxMaps(data=map_flux_estimate, reference_model=model) assert fe.dnde.unit == u.Unit("cm-2s-1TeV-1") assert_allclose(fe.dnde.quantity.value[:, 2, 2], [1e-9, 1e-11]) assert_allclose(fe.dnde_err.quantity.value[:, 2, 2], [1e-10, 1e-12]) assert_allclose(fe.dnde_errn.quantity.value[:, 2, 2], [2e-10, 2e-12]) 
assert_allclose(fe.dnde_errp.quantity.value[:, 2, 2], [1.5e-10, 1.5e-12]) assert_allclose(fe.dnde_ul.quantity.value[:, 2, 2], [2e-9, 2e-11]) assert fe.e2dnde.unit == u.Unit("TeV cm-2s-1") assert_allclose(fe.e2dnde.quantity.value[:, 2, 2], [1e-10, 1e-10]) assert_allclose(fe.e2dnde_err.quantity.value[:, 2, 2], [1e-11, 1e-11]) assert_allclose(fe.e2dnde_errn.quantity.value[:, 2, 2], [2e-11, 2e-11]) assert_allclose(fe.e2dnde_errp.quantity.value[:, 2, 2], [1.5e-11, 1.5e-11]) assert_allclose(fe.e2dnde_ul.quantity.value[:, 2, 2], [2e-10, 2e-10]) assert fe.flux.unit == u.Unit("cm-2s-1") assert_allclose(fe.flux.quantity.value[:, 2, 2], [9e-10, 9e-11]) assert_allclose(fe.flux_err.quantity.value[:, 2, 2], [9e-11, 9e-12]) assert_allclose(fe.flux_errn.quantity.value[:, 2, 2], [1.8e-10, 1.8e-11]) assert_allclose(fe.flux_errp.quantity.value[:, 2, 2], [1.35e-10, 1.35e-11]) assert_allclose(fe.flux_ul.quantity.value[:, 2, 2], [1.8e-9, 1.8e-10]) assert fe.eflux.unit == u.Unit("TeV cm-2s-1") assert_allclose(fe.eflux.quantity.value[:, 2, 2], [2.302585e-10, 2.302585e-10]) assert_allclose(fe.eflux_err.quantity.value[:, 2, 2], [2.302585e-11, 2.302585e-11]) assert_allclose(fe.eflux_errn.quantity.value[:, 2, 2], [4.60517e-11, 4.60517e-11]) assert_allclose( fe.eflux_errp.quantity.value[:, 2, 2], [3.4538775e-11, 3.4538775e-11] ) assert_allclose(fe.eflux_ul.quantity.value[:, 2, 2], [4.60517e-10, 4.60517e-10]) def test_combine_flux_maps(map_flux_estimate, wcs_flux_map, reference_model): start = u.Quantity([1, 2], "min") stop = u.Quantity([1.5, 2.5], "min") gti = GTI.create(start, stop) data = map_flux_estimate.copy() ts = data["norm"].copy() data["norm_err"].data[0, 0, 0] = 0.0 ts.data = (data["norm"].data / data["norm_err"].data) ** 2 data["ts"] = ts model = SkyModel(PowerLawSpectralModel(amplitude="1e-10 cm-2s-1TeV-1", index=2)) fe = FluxMaps(data=data, reference_model=model, gti=gti) iE = 0 energy = fe.geom.axes[0].center[iE] fe_new = combine_flux_maps([fe, fe]) assert_allclose(fe_new.dnde.quantity, fe.dnde.quantity) assert_allclose( fe_new.dnde_err.quantity.value, fe.dnde_err.quantity.value / np.sqrt(2) ) fe_new = combine_flux_maps( [fe, fe], reference_model=FluxMaps.reference_model_default.spectral_model ) assert_allclose(fe_new.dnde.quantity, fe.dnde.quantity) assert_allclose( fe_new.dnde_err.quantity.value, fe.dnde_err.quantity.value / np.sqrt(2) ) ratio = model.spectral_model( energy ) / FluxMaps.reference_model_default.spectral_model(energy) assert_allclose( fe_new.norm.quantity.value[iE, :, :] / fe.norm.quantity.value[iE, :, :], ratio ) assert fe_new.meta == fe.meta fe = FluxMaps(data, reference_model) fe_new = combine_flux_maps([fe, fe], reference_model=reference_model) assert_allclose(fe_new.dnde.quantity, fe.dnde.quantity) assert_allclose( fe_new.dnde_err.quantity.value, fe.dnde_err.quantity.value / np.sqrt(2) ) assert_allclose(fe_new.norm.quantity, fe.norm.quantity) ts = ( -2 * np.log( stats.norm.pdf(0, loc=fe.dnde.data, scale=fe.dnde_err.data) / stats.norm.pdf(fe.dnde.data, loc=fe.dnde.data, scale=fe.dnde_err.data) ) + (fe.ts - (fe.norm.data / fe.norm_err.data) ** 2) * 2 ) assert_allclose(fe_new.ts, ts * 2) def test_flux_map_properties(wcs_flux_map, reference_model): fluxmap = FluxMaps(wcs_flux_map, reference_model) assert_allclose(fluxmap.dnde.data[:, 0, 0], [1e-11, 1e-13]) assert_allclose(fluxmap.dnde_err.data[:, 0, 0], [1e-12, 1e-14]) assert_allclose(fluxmap.dnde_err.data[:, 0, 0], [1e-12, 1e-14]) assert_allclose(fluxmap.dnde_errn.data[:, 0, 0], [2e-12, 2e-14]) assert_allclose(fluxmap.dnde_errp.data[:, 
0, 0], [2e-12, 2e-14]) assert_allclose(fluxmap.dnde_ul.data[:, 0, 0], [2e-11, 2e-13]) assert_allclose(fluxmap.flux.data[:, 0, 0], [9e-12, 9e-13]) assert_allclose(fluxmap.flux_err.data[:, 0, 0], [9e-13, 9e-14]) assert_allclose(fluxmap.flux_errn.data[:, 0, 0], [18e-13, 18e-14]) assert_allclose(fluxmap.flux_errp.data[:, 0, 0], [18e-13, 18e-14]) assert_allclose(fluxmap.flux_ul.data[:, 0, 0], [18e-12, 18e-13]) assert_allclose(fluxmap.eflux.data[:, 0, 0], [2.302585e-12, 2.302585e-12]) assert_allclose(fluxmap.eflux_err.data[:, 0, 0], [2.302585e-13, 2.302585e-13]) assert_allclose(fluxmap.eflux_errp.data[:, 0, 0], [4.60517e-13, 4.60517e-13]) assert_allclose(fluxmap.eflux_errn.data[:, 0, 0], [4.60517e-13, 4.60517e-13]) assert_allclose(fluxmap.eflux_ul.data[:, 0, 0], [4.60517e-12, 4.60517e-12]) assert_allclose(fluxmap.e2dnde.data[:, 0, 0], [1e-12, 1e-12]) assert_allclose(fluxmap.e2dnde_err.data[:, 0, 0], [1e-13, 1e-13]) assert_allclose(fluxmap.e2dnde_errn.data[:, 0, 0], [2e-13, 2e-13]) assert_allclose(fluxmap.e2dnde_errp.data[:, 0, 0], [2e-13, 2e-13]) assert_allclose(fluxmap.e2dnde_ul.data[:, 0, 0], [2e-12, 2e-12]) assert_allclose(fluxmap.sqrt_ts.data, 1) assert_allclose(fluxmap.ts.data[:, 0, 0], [0, 3]) assert_allclose(fluxmap.success.data[:, 0, 1], [False, True]) assert_allclose(fluxmap.flux.data[:, 0, 1], [np.nan, 9e-13]) assert_allclose(fluxmap.flux_err.data[:, 0, 1], [np.nan, 9e-14]) assert_allclose(fluxmap.eflux.data[:, 0, 1], [np.nan, 2.30258509e-12]) assert_allclose(fluxmap.e2dnde_err.data[:, 0, 1], [np.nan, 1e-13]) def test_flux_map_failed_properties(wcs_flux_map, reference_model): fluxmap = FluxMaps(wcs_flux_map, reference_model) fluxmap.filter_success_nan = False assert_allclose(fluxmap.success.data[:, 0, 1], [False, True]) assert_allclose(fluxmap.flux.data[:, 0, 1], [9.0e-12, 9e-13]) assert not fluxmap.filter_success_nan def test_flux_map_str(wcs_flux_map, reference_model): fluxmap = FluxMaps(wcs_flux_map, reference_model) fm_str = fluxmap.__str__() assert "WcsGeom" in fm_str assert "errn" in fm_str assert "sqrt_ts" in fm_str assert "pl" in fm_str assert "n_sigma" in fm_str assert "n_sigma_ul" in fm_str assert "sqrt_ts_threshold" in fm_str @pytest.mark.parametrize("sed_type", ["likelihood", "dnde", "flux", "eflux", "e2dnde"]) def test_flux_map_read_write(tmp_path, wcs_flux_map, logpar_reference_model, sed_type): fluxmap = FluxMaps(wcs_flux_map, logpar_reference_model) fluxmap.write(tmp_path / "tmp.fits", sed_type=sed_type, overwrite=True) new_fluxmap = FluxMaps.read(tmp_path / "tmp.fits") assert_allclose(new_fluxmap.norm.data[:, 0, 0], [1, 1]) assert_allclose(new_fluxmap.norm_err.data[:, 0, 0], [0.1, 0.1]) assert_allclose(new_fluxmap.norm_errn.data[:, 0, 0], [0.2, 0.2]) assert_allclose(new_fluxmap.norm_ul.data[:, 0, 0], [2, 2]) # check model assert ( new_fluxmap.reference_model.spectral_model.tag[0] == "LogParabolaSpectralModel" ) assert new_fluxmap.reference_model.spectral_model.alpha.value == 1.5 assert new_fluxmap.reference_model.spectral_model.beta.value == 0.5 assert new_fluxmap.reference_model.spectral_model.amplitude.value == 2e-12 # check existence and content of additional map assert_allclose(new_fluxmap.sqrt_ts.data, 1.0) assert_allclose(new_fluxmap.success.data[:, 0, 1], [False, True]) assert_allclose(new_fluxmap.is_ul.data, True) @pytest.mark.parametrize("sed_type", ["likelihood", "dnde", "flux", "eflux", "e2dnde"]) def test_partial_flux_map_read_write( tmp_path, partial_wcs_flux_map, reference_model, sed_type ): fluxmap = FluxMaps(partial_wcs_flux_map, reference_model) 
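    # round-trip the partial set of maps through FITS to check serialisation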
    fluxmap.write(tmp_path / "tmp.fits", sed_type=sed_type, overwrite=True)
    new_fluxmap = FluxMaps.read(tmp_path / "tmp.fits")

    assert_allclose(new_fluxmap.norm.data[:, 0, 0], [1, 1])
    assert_allclose(new_fluxmap.norm_err.data[:, 0, 0], [0.1, 0.1])

    # check model
    assert new_fluxmap.reference_model.spectral_model.tag[0] == "PowerLawSpectralModel"
    assert new_fluxmap.reference_model.spectral_model.index.value == 2

    # check existence and content of additional map
    assert_allclose(new_fluxmap._data["sqrt_ts"].data, 1.0)

    # the TS map shouldn't exist
    with pytest.raises(AttributeError):
        new_fluxmap.ts


def test_flux_map_read_write_gti(tmp_path, partial_wcs_flux_map, reference_model):
    start = u.Quantity([1, 2], "min")
    stop = u.Quantity([1.5, 2.5], "min")
    gti = GTI.create(start, stop)

    fluxmap = FluxMaps(partial_wcs_flux_map, reference_model, gti=gti)
    fluxmap.write(tmp_path / "tmp.fits", sed_type="dnde")
    new_fluxmap = FluxMaps.read(tmp_path / "tmp.fits")

    assert len(new_fluxmap.gti.table) == 2
    assert_allclose(gti.met_start.to_value("s"), start.to_value("s"))


@pytest.mark.xfail
def test_flux_map_read_write_no_reference_model(tmp_path, wcs_flux_map, caplog):
    fluxmap = FluxMaps(wcs_flux_map)
    fluxmap.write(tmp_path / "tmp.fits")
    new_fluxmap = FluxMaps.read(tmp_path / "tmp.fits")

    assert new_fluxmap.reference_model.spectral_model.tag[0] == "PowerLawSpectralModel"
    assert "WARNING" in [_.levelname for _ in caplog.records]
    assert "No reference model set for FluxMaps." in [_.message for _ in caplog.records]


def test_flux_map_read_write_missing_reference_model(
    tmp_path, wcs_flux_map, reference_model
):
    fluxmap = FluxMaps(wcs_flux_map, reference_model)
    fluxmap.write(tmp_path / "tmp.fits")

    with fits.open(tmp_path / "tmp.fits") as hdulist:
        hdulist[0].header["MODEL"] = "non_existent"

        with pytest.raises(FileNotFoundError):
            _ = FluxMaps.from_hdulist(hdulist)


def test_flux_map_read_checksum(tmp_path, wcs_flux_map, reference_model):
    fluxmap = FluxMaps(wcs_flux_map, reference_model)
    fluxmap.write(
        tmp_path / "tmp.fits",
        filename_model=tmp_path / "test.yaml",
        checksum=True,
        overwrite=True,
    )

    # modify the model file so that its stored checksum no longer matches
    path = tmp_path / "test.yaml"
    text = path.read_text()
    path.write_text(text.replace("1.0e-12", "1.2e-12"))

    with pytest.warns(UserWarning):
        FluxMaps.read(tmp_path / "tmp.fits", checksum=True)


@pytest.mark.xfail
def test_flux_map_init_no_reference_model(wcs_flux_map, caplog):
    fluxmap = FluxMaps(data=wcs_flux_map)

    assert fluxmap.reference_model.spectral_model.tag[0] == "PowerLawSpectralModel"
    assert fluxmap.reference_model.spatial_model.tag[0] == "PointSpatialModel"
    assert fluxmap.reference_model.spectral_model.index.value == 2

    assert "WARNING" in [_.levelname for _ in caplog.records]
    assert "No reference model set for FluxMaps."
in [_.message for _ in caplog.records] def test_get_flux_point(wcs_flux_map, reference_model): fluxmap = FluxMaps(wcs_flux_map, reference_model) coord = SkyCoord(0.0, 0.0, unit="deg", frame="galactic") fp = fluxmap.get_flux_points(coord) table = fp.to_table() assert_allclose(table["e_min"], [0.1, 1.0]) assert_allclose(table["norm"], [1, 1]) assert_allclose(table["norm_err"], [0.1, 0.1]) assert_allclose(table["norm_errn"], [0.2, 0.2]) assert_allclose(table["norm_errp"], [0.2, 0.2]) assert_allclose(table["norm_ul"], [2, 2]) assert_allclose(table["sqrt_ts"], [1, 1]) assert_allclose(table["ts"], [0, 3], atol=1e-15) assert_allclose(fp.dnde.data.flat, [1e-11, 1e-13]) assert fp.dnde.unit == "cm-2s-1TeV-1" with mpl_plot_check(): fp.plot() def test_get_flux_point_missing_map(wcs_flux_map, reference_model): other_data = wcs_flux_map.copy() other_data.pop("norm_errn") other_data.pop("norm_errp") fluxmap = FluxMaps(other_data, reference_model) coord = SkyCoord(0.0, 0.0, unit="deg", frame="galactic") table = fluxmap.get_flux_points(coord).to_table() assert_allclose(table["e_min"], [0.1, 1.0]) assert_allclose(table["norm"], [1, 1]) assert_allclose(table["norm_err"], [0.1, 0.1]) assert_allclose(table["norm_ul"], [2, 2]) assert "norm_errn" not in table.columns assert table["success"].data.dtype == np.dtype(bool) def test_flux_map_from_dict_inconsistent_units(wcs_flux_map, reference_model): ref_map = FluxMaps(wcs_flux_map, reference_model) map_dict = dict() map_dict["eflux"] = ref_map.eflux map_dict["eflux"].quantity = map_dict["eflux"].quantity.to("keV/m2/s") map_dict["eflux_err"] = ref_map.eflux_err map_dict["eflux_err"].quantity = map_dict["eflux_err"].quantity.to("keV/m2/s") flux_map = FluxMaps.from_maps(map_dict, "eflux", reference_model) assert_allclose(flux_map.norm.data[:, 0, 0], 1.0) assert flux_map.norm.unit == "" assert_allclose(flux_map.norm_err.data[:, 0, 0], 0.1) assert flux_map.norm_err.unit == "" def test_flux_map_iter_by_axis(): axis1 = MapAxis.from_energy_edges((0.1, 1.0, 10.0), unit="TeV") axis2 = TimeMapAxis.from_time_bounds( Time(51544, format="mjd"), Time(51548, format="mjd"), 3 ) geom = RegionGeom.create("galactic;circle(0, 0, 0.1)", axes=[axis1, axis2]) maps = Maps.from_geom( geom=geom, names=["norm", "norm_err", "norm_errn", "norm_errp", "norm_ul"] ) val = np.ones(geom.data_shape) maps["norm"].data = val maps["norm_err"].data = 0.1 * val maps["norm_errn"].data = 0.2 * val maps["norm_errp"].data = 0.15 * val maps["norm_ul"].data = 2.0 * val start = u.Quantity([1, 2, 3], "day") stop = u.Quantity([1.5, 2.5, 3.9], "day") gti = GTI.create(start, stop) ref_map = FluxMaps(maps, gti=gti, reference_model=PowerLawSpectralModel()) split_maps = list(ref_map.iter_by_axis("time")) assert len(split_maps) == 3 assert split_maps[0].available_quantities == ref_map.available_quantities assert_allclose(split_maps[0].gti.time_stop.value, 51545.3340, rtol=1e-3) assert split_maps[0].reference_model == ref_map.reference_model def test_slice_by_coord(): axis1 = MapAxis.from_energy_edges((0.1, 1.0, 5.0, 10.0), unit="TeV") axis2 = TimeMapAxis.from_time_bounds( Time(51544, format="mjd"), Time(51548, format="mjd"), 3 ) geom = RegionGeom.create("galactic;circle(0, 0, 0.1)", axes=[axis1, axis2]) maps = Maps.from_geom( geom=geom, names=["norm", "norm_err", "norm_errn", "norm_errp", "norm_ul"] ) val = np.ones(geom.data_shape) maps["norm"].data = val maps["norm_err"].data = 0.1 * val maps["norm_errn"].data = 0.2 * val maps["norm_errp"].data = 0.15 * val maps["norm_ul"].data = 2.0 * val start = u.Quantity([1, 2, 3], 
"day") stop = u.Quantity([1.5, 2.5, 3.9], "day") gti = GTI.create(start, stop) ref_map = FluxMaps(maps, gti=gti, reference_model=PowerLawSpectralModel()) sliced_map = ref_map.slice_by_coord( {"time": slice(Time(51544, format="mjd"), Time(51546, format="mjd"))} ) assert sliced_map.available_quantities == ref_map.available_quantities assert_allclose(sliced_map.gti.time_stop.value, 51545.3340, rtol=1e-3) assert sliced_map.reference_model == ref_map.reference_model sliced_map2 = ref_map.slice_by_coord({"energy": slice(0.5 * u.TeV, 5.0 * u.TeV)}) assert sliced_map2.geom.axes["energy"].nbin == 1 sliced_map3 = ref_map.slice_by_coord({"energy": slice(1.0 * u.TeV, 10.0 * u.TeV)}) assert sliced_map3.geom.axes["energy"].nbin == 2 sliced_map4 = ref_map.slice_by_coord({"time": slice(1 * u.d, 3 * u.d)}) assert sliced_map4.geom.axes["time"].nbin == 1 def test_copy(map_flux_estimate): model = SkyModel(PowerLawSpectralModel(amplitude="1e-10 cm-2s-1TeV-1", index=3)) fe = FluxMaps(data=map_flux_estimate, reference_model=model) fe_new = fe.copy(reference_model=None) assert fe_new.reference_model.spectral_model.index.value == 3 model = SkyModel(PowerLawSpectralModel(amplitude="1e-10 cm-2s-1TeV-1", index=4)) fe_new = fe.copy(reference_model=model) assert fe_new.reference_model.spectral_model.index.value == 4 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/tests/test_excess.py0000644000175100001770000004262714721316200023073 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from gammapy.datasets import MapDataset, MapDatasetOnOff from gammapy.estimators import ExcessMapEstimator from gammapy.estimators.utils import ( estimate_exposure_reco_energy, get_combined_significance_maps, ) from gammapy.irf import PSFMap from gammapy.maps import Map, MapAxis, WcsGeom from gammapy.modeling.models import ( GaussianSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.testing import requires_data def image_to_cube(input_map, energy_min, energy_max): energy_min = u.Quantity(energy_min) energy_max = u.Quantity(energy_max) axis = MapAxis.from_energy_bounds(energy_min, energy_max, nbin=1) geom = input_map.geom.to_cube([axis]) return Map.from_geom(geom, data=input_map.data[np.newaxis, :, :]) @pytest.fixture def simple_dataset(): axis = MapAxis.from_energy_bounds(0.1, 10, 1, unit="TeV") geom = WcsGeom.create(npix=20, binsz=0.02, axes=[axis]) dataset = MapDataset.create(geom) dataset.mask_safe += np.ones(dataset.data_shape, dtype=bool) dataset.counts += 2 dataset.background += 1 return dataset @pytest.fixture def simple_dataset_mask_safe(): axis = MapAxis.from_energy_bounds(0.1, 10, 3, unit="TeV") geom = WcsGeom.create(npix=20, binsz=0.02, axes=[axis]) dataset = MapDataset.create(geom) dataset.mask_safe += np.ones(dataset.data_shape, dtype=bool) dataset.mask_safe.data[0, :10, :] = False dataset.counts += 2 dataset.background += 1 dataset.exposure.data += 1 dataset.exposure.data[0, :, :] = 2 dataset.exposure.data[2, :10, :] = 4 return dataset @pytest.fixture def simple_dataset_on_off(): axis = MapAxis.from_energy_bounds(0.1, 10, 2, unit="TeV") geom = WcsGeom.create(npix=20, binsz=0.02, axes=[axis]) dataset = MapDatasetOnOff.create(geom) dataset.mask_safe += np.ones(dataset.data_shape, dtype=bool) dataset.counts += 2 dataset.counts_off += 1 dataset.acceptance += 1 dataset.acceptance_off += 1 
    dataset.exposure.data += 1e6
    dataset.psf = None
    return dataset


@requires_data()
def test_compute_lima_image():
    """
    Test Li & Ma image against TS image for Tophat kernel.
    """
    filename = "$GAMMAPY_DATA/tests/unbundled/poisson_stats_image/input_all.fits.gz"
    counts = Map.read(filename, hdu="counts")
    counts = image_to_cube(counts, "1 GeV", "100 GeV")
    background = Map.read(filename, hdu="background")
    background = image_to_cube(background, "1 GeV", "100 GeV")
    dataset = MapDataset(counts=counts, background=background)

    estimator = ExcessMapEstimator("0.1 deg")
    result_lima = estimator.run(dataset)

    assert_allclose(result_lima["sqrt_ts"].data[:, 100, 100], 30.814916, atol=1e-3)
    assert_allclose(result_lima["sqrt_ts"].data[:, 1, 1], 0.164, atol=1e-3)
    assert_allclose(result_lima["npred_background"].data[:, 1, 1], 37, atol=1e-3)
    assert_allclose(result_lima["npred"].data[:, 1, 1], 38, atol=1e-3)
    assert_allclose(result_lima["npred_excess"].data[:, 1, 1], 1, atol=1e-3)


@requires_data()
def test_compute_lima_on_off_image():
    """
    Test Li & Ma image with snippet from the H.E.S.S. survey data.
    """
    filename = "$GAMMAPY_DATA/tests/unbundled/hess/survey/hess_survey_snippet.fits.gz"
    n_on = Map.read(filename, hdu="ON")
    counts = image_to_cube(n_on, "1 TeV", "100 TeV")
    n_off = Map.read(filename, hdu="OFF")
    counts_off = image_to_cube(n_off, "1 TeV", "100 TeV")
    a_on = Map.read(filename, hdu="ONEXPOSURE")
    acceptance = image_to_cube(a_on, "1 TeV", "100 TeV")
    a_off = Map.read(filename, hdu="OFFEXPOSURE")
    acceptance_off = image_to_cube(a_off, "1 TeV", "100 TeV")
    dataset = MapDatasetOnOff(
        counts=counts,
        counts_off=counts_off,
        acceptance=acceptance,
        acceptance_off=acceptance_off,
    )

    significance = Map.read(filename, hdu="SIGNIFICANCE")
    significance = image_to_cube(significance, "1 TeV", "10 TeV")
    estimator = ExcessMapEstimator("0.1 deg", correlate_off=False)
    results = estimator.run(dataset)

    # Reproduce safe significance threshold from HESS software
    results["sqrt_ts"].data[results["npred"].data < 5] = 0

    # Crop the image at the boundaries: the reference image is cut out from a
    # larger map, so there is no way to reproduce the result with regular
    # boundary handling
    actual = results["sqrt_ts"].crop((11, 11)).data
    desired = significance.crop((11, 11)).data

    # The boundary is set to NaN in the reference image. The absolute tolerance
    # is loose because the method used here is slightly different from the one
    # used in HGPS: n_off is convolved as well, to ensure the method applies to
    # true ON-OFF datasets
    assert_allclose(actual, desired, atol=0.2, rtol=1e-5)

    actual = np.nan_to_num(results["npred_background"].crop((11, 11)).data)
    background_corr = image_to_cube(
        Map.read(filename, hdu="BACKGROUNDCORRELATED"), "1 TeV", "100 TeV"
    )
    desired = background_corr.crop((11, 11)).data

    # The boundary is set to NaN in the reference image. The absolute tolerance
    # is loose because the method used here is slightly different from the one
    # used in HGPS: n_off is convolved as well, to ensure the method applies to
    # true ON-OFF datasets
    assert_allclose(actual, desired, atol=0.2, rtol=1e-5)


def test_significance_map_estimator_map_dataset(simple_dataset):
    simple_dataset.exposure = None
    estimator = ExcessMapEstimator(0.1 * u.deg, selection_optional=["all"])
    result = estimator.run(simple_dataset)

    assert_allclose(result["npred"].data[0, 10, 10], 162)
    assert_allclose(result["npred_excess"].data[0, 10, 10], 81)
    assert_allclose(result["npred_background"].data[0, 10, 10], 81)
    assert_allclose(result["sqrt_ts"].data[0, 10, 10], 7.910732, atol=1e-5)
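    # the quantities below are only computed because selection_optional=["all"] was requested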
assert_allclose(result["npred_excess_err"].data[0, 10, 10], 12.727922, atol=1e-3) assert_allclose(result["npred_excess_errp"].data[0, 10, 10], 13.063328, atol=1e-3) assert_allclose(result["npred_excess_errn"].data[0, 10, 10], 12.396716, atol=1e-3) assert_allclose(result["npred_excess_ul"].data[0, 10, 10], 107.806275, atol=1e-3) assert_allclose(result["norm_sensitivity"].data[0, 10, 10], 48.997699, atol=1e-3) assert_allclose(result["flux_sensitivity"].data[0, 10, 10], 4.850772e-10, rtol=1e-4) estimator = ExcessMapEstimator( 0.1 * u.deg, selection_optional=["sensitivity"], apply_threshold_sensitivity=True, ) assert_allclose(result["norm_sensitivity"].data[0, 10, 10], 48.997699, atol=1e-3) assert_allclose(result["flux_sensitivity"].data[0, 10, 10], 4.850772e-10, rtol=1e-3) def test_significance_map_estimator_map_dataset_mask_safe(simple_dataset_mask_safe): estimator = ExcessMapEstimator(0.1 * u.deg, selection_optional=["all"]) result = estimator.run(simple_dataset_mask_safe) assert_allclose(result["npred"].data[0, 10, 10], 416) assert_allclose(result["npred_excess"].data[0, 10, 10], 208) assert_allclose(result["npred_background"].data[0, 10, 10], 208) assert_allclose(result["sqrt_ts"].data[0, 10, 10], 12.676681, atol=1e-5) assert_allclose(result["npred_excess_err"].data[0, 10, 10], 20.396078, atol=1e-3) assert_allclose(result["npred_excess_errp"].data[0, 10, 10], 20.730114, atol=1e-3) assert_allclose(result["npred_excess_errn"].data[0, 10, 10], 20.063506, atol=1e-3) assert_allclose(result["npred_excess_ul"].data[0, 10, 10], 250.136257, atol=1e-3) assert_allclose(result["flux"].data[0, 0, 0], 0.016359, atol=1e-3) assert_allclose(result["flux"].data[0, 10, 10], 0.018004, atol=1e-3) reco_exposure = result["npred_excess"] / result["norm"] assert_allclose(np.unique(reco_exposure).min(), 3.1469110e-08, rtol=1e-5) assert_allclose(np.unique(reco_exposure).max(), 1.7745566e-07, rtol=1e-5) simple_dataset_mask_safe.exposure = None result = estimator.run(simple_dataset_mask_safe) assert_allclose(result["npred_excess"].data[0, 10, 10], 208) assert_allclose(result["npred_background"].data[0, 10, 10], 208) assert_allclose(result["sqrt_ts"].data[0, 10, 10], 12.676681, atol=1e-5) assert_allclose(result["npred_excess_err"].data[0, 10, 10], 20.396078, atol=1e-3) assert_allclose(result["npred_excess_errp"].data[0, 10, 10], 20.730114, atol=1e-3) assert_allclose(result["npred_excess_errn"].data[0, 10, 10], 20.063506, atol=1e-3) assert_allclose(result["npred_excess_ul"].data[0, 10, 10], 250.136257, atol=1e-3) assert_allclose(result["flux"].data[0, 0, 0], 5.148e-10, atol=1e-3) assert_allclose(result["flux"].data[0, 10, 10], 2.0592e-09, atol=1e-3) def test_significance_map_estimator_map_dataset_exposure(simple_dataset): simple_dataset.exposure += 1e10 * u.cm**2 * u.s axis = simple_dataset.exposure.geom.axes[0] simple_dataset.psf = PSFMap.from_gauss(axis, sigma="0.05 deg") model = SkyModel( PowerLawSpectralModel(amplitude="1e-9 cm-2 s-1 TeV-1"), GaussianSpatialModel( lat_0=0.0 * u.deg, lon_0=0.0 * u.deg, sigma=0.1 * u.deg, frame="icrs" ), name="sky_model", ) simple_dataset.models = [model] simple_dataset.npred() estimator = ExcessMapEstimator(0.1 * u.deg, selection_optional="all") result = estimator.run(simple_dataset) assert_allclose(result["npred_excess"].data.sum(), 19733.602, rtol=1e-3) assert_allclose(result["sqrt_ts"].data[0, 10, 10], 4.217129, rtol=1e-3) assert_allclose(result["flux_sensitivity"].data[0, 10, 10], 5.742761e-09, rtol=1e-3) # without mask safe simple_dataset_no_mask = MapDataset( 
counts=simple_dataset.counts, exposure=simple_dataset.exposure, background=simple_dataset.background, psf=simple_dataset.psf, edisp=simple_dataset.edisp, ) simple_dataset_no_mask.models = [model] result_no_mask = estimator.run(simple_dataset_no_mask) assert_allclose(result_no_mask["npred_excess"].data.sum(), 19733.602, rtol=1e-3) assert_allclose(result_no_mask["sqrt_ts"].data[0, 10, 10], 4.217129, rtol=1e-3) assert_allclose(result["flux_sensitivity"].data[0, 10, 10], 5.742761e-09, rtol=1e-3) def test_excess_map_estimator_map_dataset_on_off_no_correlation( simple_dataset_on_off, ): # Test with exposure estimator_image = ExcessMapEstimator( 0.11 * u.deg, energy_edges=[0.1, 1] * u.TeV, correlate_off=False ) result_image = estimator_image.run(simple_dataset_on_off) assert result_image["npred"].data.shape == (1, 20, 20) assert_allclose(result_image["npred"].data[0, 10, 10], 194) assert_allclose(result_image["npred_excess"].data[0, 10, 10], 97) assert_allclose(result_image["sqrt_ts"].data[0, 10, 10], 0.780125, atol=1e-3) assert_allclose(result_image["flux"].data[:, 10, 10], 9.7e-9, atol=1e-5) def test_excess_map_estimator_map_dataset_on_off_with_correlation_no_exposure( simple_dataset_on_off, ): # First without exposure simple_dataset_on_off.exposure = None estimator = ExcessMapEstimator( 0.11 * u.deg, energy_edges=[0.1, 1, 10] * u.TeV, correlate_off=True ) result = estimator.run(simple_dataset_on_off) assert result["npred"].data.shape == (2, 20, 20) assert_allclose(result["npred"].data[:, 10, 10], 194) assert_allclose(result["npred_excess"].data[:, 10, 10], 97) assert_allclose(result["sqrt_ts"].data[:, 10, 10], 5.741116, atol=1e-5) def test_excess_map_estimator_map_dataset_on_off_with_correlation( simple_dataset_on_off, ): simple_dataset_on_off.exposure = None estimator_image = ExcessMapEstimator( 0.11 * u.deg, energy_edges=[0.1, 1] * u.TeV, correlate_off=True ) result_image = estimator_image.run(simple_dataset_on_off) assert result_image["npred"].data.shape == (1, 20, 20) assert_allclose(result_image["npred"].data[0, 10, 10], 194) assert_allclose(result_image["npred_excess"].data[0, 10, 10], 97) assert_allclose(result_image["sqrt_ts"].data[0, 10, 10], 5.741116, atol=1e-3) assert_allclose(result_image["flux"].data[:, 10, 10], 9.7e-9, atol=1e-5) def test_excess_map_estimator_map_dataset_on_off_with_correlation_mask_fit( simple_dataset_on_off, ): geom = simple_dataset_on_off.counts.geom mask_fit = Map.from_geom(geom, data=1, dtype=bool) mask_fit.data[:, :, 10] = False mask_fit.data[:, 10, :] = False simple_dataset_on_off.mask_fit = mask_fit estimator_image = ExcessMapEstimator(0.11 * u.deg, correlate_off=True) result_image = estimator_image.run(simple_dataset_on_off) assert result_image["npred"].data.shape == (1, 20, 20) assert_allclose(result_image["sqrt_ts"].data[0, 10, 10], np.nan, atol=1e-3) assert_allclose(result_image["npred"].data[0, 10, 10], np.nan) assert_allclose(result_image["npred_excess"].data[0, 10, 10], np.nan) assert_allclose(result_image["sqrt_ts"].data[0, 9, 9], 7.186745, atol=1e-3) assert_allclose(result_image["npred"].data[0, 9, 9], 304) assert_allclose(result_image["npred_excess"].data[0, 9, 9], 152) assert result_image["flux"].unit == u.Unit("cm-2s-1") assert_allclose(result_image["flux"].data[0, 9, 9], 1.190928e-08, rtol=1e-3) def test_excess_map_estimator_map_dataset_on_off_with_correlation_model( simple_dataset_on_off, ): model = SkyModel( PowerLawSpectralModel(amplitude="1e-9 cm-2 s-1TeV-1"), GaussianSpatialModel( lat_0=0.0 * u.deg, lon_0=0.0 * u.deg, sigma=0.1 * u.deg, 
frame="icrs" ), name="sky_model", ) simple_dataset_on_off.models = [model] estimator_mod = ExcessMapEstimator(0.11 * u.deg, correlate_off=True) result_mod = estimator_mod.run(simple_dataset_on_off) assert result_mod["npred"].data.shape == (1, 20, 20) assert_allclose(result_mod["sqrt_ts"].data[0, 10, 10], 6.240846, atol=1e-3) assert_allclose(result_mod["npred"].data[0, 10, 10], 388) assert_allclose(result_mod["npred_excess"].data[0, 10, 10], 148.68057) assert result_mod["flux"].unit == "cm-2s-1" assert_allclose(result_mod["flux"].data[0, 10, 10], 1.486806e-08, rtol=1e-3) assert_allclose(result_mod["flux"].data.sum(), 5.254442e-06, rtol=1e-3) def test_excess_map_estimator_map_dataset_on_off_reco_exposure( simple_dataset_on_off, ): model = SkyModel( PowerLawSpectralModel(amplitude="1e-9 cm-2 s-1TeV-1"), GaussianSpatialModel( lat_0=0.0 * u.deg, lon_0=0.0 * u.deg, sigma=0.1 * u.deg, frame="icrs" ), name="sky_model", ) simple_dataset_on_off.models = [model] spectral_model = PowerLawSpectralModel(index=15) estimator_mod = ExcessMapEstimator( 0.11 * u.deg, correlate_off=True, spectral_model=spectral_model, ) result_mod = estimator_mod.run(simple_dataset_on_off) assert result_mod["flux"].unit == "cm-2s-1" assert_allclose(result_mod["flux"].data.sum(), 5.254442e-06, rtol=1e-3) reco_exposure = estimate_exposure_reco_energy( simple_dataset_on_off, spectral_model=spectral_model ) assert_allclose(reco_exposure.data.sum(), 8e12, rtol=0.001) def test_incorrect_selection(): with pytest.raises(ValueError): ExcessMapEstimator(0.11 * u.deg, selection_optional=["bad"]) with pytest.raises(ValueError): ExcessMapEstimator(0.11 * u.deg, selection_optional=["ul", "bad"]) estimator = ExcessMapEstimator(0.11 * u.deg) with pytest.raises(ValueError): estimator.selection_optional = "bad" def test_significance_map_estimator_incorrect_dataset(): with pytest.raises(ValueError): ExcessMapEstimator("bad") def test_joint_excess_map(simple_dataset): simple_dataset.exposure += 1e10 * u.cm**2 * u.s axis = simple_dataset.exposure.geom.axes[0] simple_dataset.psf = PSFMap.from_gauss(axis, sigma="0.05 deg") model = SkyModel( PowerLawSpectralModel(amplitude="1e-9 cm-2 s-1 TeV-1"), GaussianSpatialModel( lat_0=0.0 * u.deg, lon_0=0.0 * u.deg, sigma=0.1 * u.deg, frame="icrs" ), name="sky_model", ) simple_dataset.models = [model] simple_dataset.npred() simple_dataset2 = simple_dataset.copy() simple_dataset2.models = [model] simple_dataset2.npred() stacked_dataset = simple_dataset.copy(name="copy") stacked_dataset.counts *= 2 stacked_dataset.exposure *= 2 stacked_dataset.background *= 2 stacked_dataset.models = [model] estimator = ExcessMapEstimator(0.1 * u.deg, sum_over_energy_groups=True) assert estimator.sum_over_energy_groups result = estimator.run(stacked_dataset) assert_allclose(result["npred_excess"].data.sum(), 2 * 19733.602, rtol=1e-3) assert_allclose(result["sqrt_ts"].data[0, 10, 10], 5.960441, rtol=1e-3) result = get_combined_significance_maps( estimator, [simple_dataset, simple_dataset2] ) assert_allclose(result["npred_excess"].data.sum(), 2 * 19733.602, rtol=1e-3) assert_allclose(result["significance"].data[10, 10], 5.618187, rtol=1e-3) assert_allclose( result["df"].data, 2 * (~np.isnan(result["significance"].data)), rtol=1e-3 ) def test_maps_alpha(simple_dataset_on_off): estimator = ExcessMapEstimator( selection_optional=["alpha", "acceptance_on", "acceptance_off"] ) result = estimator.run(simple_dataset_on_off) assert_allclose(result["acceptance_on"].data[:, 10, 10], 2, atol=1e-3) 
assert_allclose(result["acceptance_off"].data[:, 10, 10], 2, atol=1e-3) assert_allclose(result["alpha"].data[:, 10, 10], 1, atol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/tests/test_ts.py0000644000175100001770000006014014721316200022215 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from copy import deepcopy import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import Angle, SkyCoord from gammapy.datasets import Datasets, MapDataset, MapDatasetOnOff from gammapy.estimators import TSMapEstimator from gammapy.estimators.utils import ( approximate_profile_map, combine_flux_maps, get_combined_flux_maps, get_combined_significance_maps, get_flux_map_from_profile, ) from gammapy.irf import EDispKernelMap, PSFMap from gammapy.maps import Map, MapAxis, WcsGeom from gammapy.modeling.models import ( ConstantSpatialModel, GaussianSpatialModel, PointSpatialModel, PowerLawSpectralModel, SkyModel, TemplateSpatialModel, ) from gammapy.utils.testing import requires_data, requires_dependency @pytest.fixture(scope="session") def fake_dataset(): axis = MapAxis.from_energy_bounds(0.1, 10, 5, unit="TeV", name="energy") axis_true = MapAxis.from_energy_bounds(0.05, 20, 10, unit="TeV", name="energy_true") geom = WcsGeom.create(npix=50, binsz=0.02, axes=[axis]) dataset = MapDataset.create(geom) dataset.psf = PSFMap.from_gauss(axis_true, sigma="0.05 deg") dataset.mask_safe += np.ones(dataset.data_shape, dtype=bool) dataset.background += 1 dataset.exposure += 1e12 * u.cm**2 * u.s spatial_model = PointSpatialModel() spectral_model = PowerLawSpectralModel(amplitude="1e-10 cm-2s-1TeV-1", index=2) model = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="source" ) dataset.models = [model] dataset.fake(random_state=42) return dataset @pytest.fixture(scope="session") def input_dataset(): filename = "$GAMMAPY_DATA/tests/unbundled/poisson_stats_image/input_all.fits.gz" energy = MapAxis.from_energy_bounds("0.1 TeV", "1 TeV", 1) energy_true = MapAxis.from_energy_bounds("0.1 TeV", "1 TeV", 1, name="energy_true") counts2D = Map.read(filename, hdu="counts") counts = counts2D.to_cube([energy]) exposure2D = Map.read(filename, hdu="exposure") exposure2D = Map.from_geom(exposure2D.geom, data=exposure2D.data, unit="cm2s") exposure = exposure2D.to_cube([energy_true]) background2D = Map.read(filename, hdu="background") background = background2D.to_cube([energy]) # add mask mask2D_data = np.ones_like(background2D.data).astype("bool") mask2D_data[0:40, :] = False mask2D = Map.from_geom(geom=counts2D.geom, data=mask2D_data) mask = mask2D.to_cube([energy]) name = "test-dataset" return MapDataset( counts=counts, exposure=exposure, background=background, mask_safe=mask, name=name, ) @pytest.fixture(scope="session") def fermi_dataset(): size = Angle("3 deg", "3.5 deg") counts = Map.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts-cube.fits.gz") counts = counts.cutout(counts.geom.center_skydir, size) background = Map.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-background-cube.fits.gz" ) background = background.cutout(background.geom.center_skydir, size) exposure = Map.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-exposure-cube.fits.gz" ) exposure = exposure.cutout(exposure.geom.center_skydir, size) psfmap = PSFMap.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-psf-cube.fits.gz", format="gtpsf" ) edisp = 
EDispKernelMap.from_diagonal_response( energy_axis=counts.geom.axes["energy"], energy_axis_true=exposure.geom.axes["energy_true"], ) return MapDataset( counts=counts, background=background, exposure=exposure, psf=psfmap, name="fermi-3fhl-gc", edisp=edisp, ) @requires_data() def test_compute_ts_map(input_dataset): """Minimal test of compute_ts_image""" spatial_model = GaussianSpatialModel(sigma="0.1 deg") spectral_model = PowerLawSpectralModel(index=2) model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) ts_estimator = TSMapEstimator(model=model, threshold=1, selection_optional=[]) kernel = ts_estimator.estimate_kernel(dataset=input_dataset) assert_allclose(kernel.geom.width, 1.22 * u.deg) assert_allclose(kernel.data.sum(), 1.0) result = ts_estimator.run(input_dataset) assert_allclose(result["ts"].data[0, 99, 99], 1704.23, rtol=1e-2) assert_allclose(result["niter"].data[0, 99, 99], 7) assert_allclose(result["flux"].data[0, 99, 99], 1.02e-09, rtol=1e-2) assert_allclose(result["flux_err"].data[0, 99, 99], 3.84e-11, rtol=1e-2) assert_allclose(result["npred"].data[0, 99, 99], 4744.020361, rtol=1e-2) assert_allclose(result["npred_excess"].data[0, 99, 99], 1026.874063, rtol=1e-2) assert_allclose(result["npred_excess_err"].data[0, 99, 99], 38.470995, rtol=1e-2) assert result["flux"].unit == u.Unit("cm-2s-1") assert result["flux_err"].unit == u.Unit("cm-2s-1") # Check mask is correctly taken into account assert np.isnan(result["ts"].data[0, 30, 40]) energy_axis = result["ts"].geom.axes["energy"] assert_allclose(energy_axis.edges.value, [0.1, 1]) @requires_data() @requires_dependency("ray") def test_compute_ts_map_parallel_ray(input_dataset): """Minimal test of compute_ts_image""" spatial_model = GaussianSpatialModel(sigma="0.1 deg") spectral_model = PowerLawSpectralModel(index=2) model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) ts_estimator = TSMapEstimator( model=model, threshold=1, selection_optional=[], parallel_backend="ray", n_jobs=2, ) assert ts_estimator.parallel_backend == "ray" assert ts_estimator.n_jobs == 2 result = ts_estimator.run(input_dataset) assert_allclose(result["ts"].data[0, 99, 99], 1704.23, rtol=1e-2) assert_allclose(result["niter"].data[0, 99, 99], 7) assert_allclose(result["flux"].data[0, 99, 99], 1.02e-09, rtol=1e-2) assert_allclose(result["flux_err"].data[0, 99, 99], 3.84e-11, rtol=1e-2) assert_allclose(result["npred"].data[0, 99, 99], 4744.020361, rtol=1e-2) assert_allclose(result["npred_excess"].data[0, 99, 99], 1026.874063, rtol=1e-2) assert_allclose(result["npred_excess_err"].data[0, 99, 99], 38.470995, rtol=1e-2) @requires_data() def test_compute_ts_map_parallel_multiprocessing(input_dataset): """Minimal test of compute_ts_image""" spatial_model = GaussianSpatialModel(sigma="0.1 deg") spectral_model = PowerLawSpectralModel(index=2) model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) ts_estimator = TSMapEstimator( model=model, threshold=1, selection_optional=[], n_jobs=3, parallel_backend="multiprocessing", ) result = ts_estimator.run(input_dataset) assert ts_estimator.n_jobs == 3 assert_allclose(result["ts"].data[0, 99, 99], 1704.23, rtol=1e-2) assert_allclose(result["niter"].data[0, 99, 99], 7) assert_allclose(result["flux"].data[0, 99, 99], 1.02e-09, rtol=1e-2) assert_allclose(result["flux_err"].data[0, 99, 99], 3.84e-11, rtol=1e-2) assert_allclose(result["npred"].data[0, 99, 99], 4744.020361, rtol=1e-2) assert_allclose(result["npred_excess"].data[0, 99, 99], 1026.874063, rtol=1e-2) 
assert_allclose(result["npred_excess_err"].data[0, 99, 99], 38.470995, rtol=1e-2) @requires_data() def test_compute_ts_map_psf(fermi_dataset): spatial_model = PointSpatialModel() spectral_model = PowerLawSpectralModel(amplitude="1e-22 cm-2 s-1 keV-1") model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) estimator = TSMapEstimator( model=model, kernel_width="1 deg", selection_optional=["ul", "errn-errp"] ) result = estimator.run(fermi_dataset) assert_allclose(result["ts"].data[0, 29, 29], 830.97957, rtol=2e-3) assert_allclose(result["niter"].data[0, 29, 29], 7) assert_allclose(result["flux"].data[0, 29, 29], 1.339426e-09, rtol=2e-3) assert_allclose(result["flux_err"].data[0, 29, 29], 7.883016e-11, rtol=2e-3) assert_allclose(result["flux_errp"].data[0, 29, 29], 7.913813e-11, rtol=2e-3) assert_allclose(result["flux_errn"].data[0, 29, 29], 7.453983e-11, rtol=2e-3) assert_allclose(result["flux_ul"].data[0, 29, 29], 1.501809e-09, rtol=2e-3) assert result["flux"].unit == u.Unit("cm-2s-1") assert result["flux_err"].unit == u.Unit("cm-2s-1") assert result["flux_ul"].unit == u.Unit("cm-2s-1") @requires_data() def test_compute_ts_map_energy(fermi_dataset): spatial_model = PointSpatialModel() spectral_model = PowerLawSpectralModel(amplitude="1e-22 cm-2 s-1 keV-1") model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) estimator = TSMapEstimator( model=model, kernel_width="0.6 deg", energy_edges=[10, 100, 1000] * u.GeV, sum_over_energy_groups=False, ) result = estimator.run(fermi_dataset) result.filter_success_nan = False assert_allclose(result.ts.data[1, 43, 30], 0.212079, atol=0.01) assert not result["success"].data[1, 43, 30] assert_allclose(result["ts"].data[:, 29, 29], [795.815842, 17.52017], rtol=1e-2) assert_allclose( result["flux"].data[:, 29, 29], [1.233119e-09, 3.590694e-11], rtol=1e-2 ) assert_allclose( result["flux_err"].data[:, 29, 29], [7.382305e-11, 1.338985e-11], rtol=1e-2 ) assert_allclose(result["niter"].data[:, 29, 29], [6, 6]) energy_axis = result["ts"].geom.axes["energy"] assert_allclose(energy_axis.edges.to_value("GeV"), [10, 84.471641, 500], rtol=1e-4) fermi_dataset_maksed = fermi_dataset.copy() mask_safe = Map.from_geom(fermi_dataset.counts.geom, dtype=bool) mask_safe.data[:-3, :, :] = True fermi_dataset_maksed.mask_safe = mask_safe result = estimator.run(fermi_dataset_maksed) result.filter_success_nan = False assert_allclose(result.ts.data[1, 43, 30], 0.164831, atol=0.01) assert not result["success"].data[1, 43, 30] assert_allclose(result["ts"].data[:, 29, 29], [795.815842, 8.777864], rtol=1e-2) assert_allclose( result["flux"].data[:, 29, 29], [1.223901e-09, 3.748007e-11], rtol=1e-2 ) assert_allclose( result["flux_err"].data[:, 29, 29], [7.363390e-11, 1.799367e-11], rtol=1e-2 ) assert_allclose(result["niter"].data[:, 29, 29], [6, 6]) energy_axis = result["ts"].geom.axes["energy"] assert_allclose(energy_axis.edges.to_value("GeV"), [10, 84.471641, 500], rtol=1e-4) @requires_data() def test_compute_ts_map_invalid(fermi_dataset): spatial_model = PointSpatialModel() spectral_model = PowerLawSpectralModel(amplitude="1e-22 cm-2 s-1 keV-1") model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) estimator = TSMapEstimator( model=model, kernel_width="0.6 deg", energy_edges=[10, 100, 1000] * u.GeV, sum_over_energy_groups=False, ) fermi_dataset_empty = fermi_dataset.copy() fermi_dataset_empty.background = None with pytest.raises(ValueError): result = estimator.run(fermi_dataset_empty) fermi_dataset_empty = 
fermi_dataset.copy() fermi_dataset_empty.background.data = 0 with pytest.raises(ValueError): result = estimator.run(fermi_dataset_empty) fermi_dataset_empty = fermi_dataset.copy() mask_safe = Map.from_geom(fermi_dataset.counts.geom, dtype=bool) fermi_dataset_empty.mask_safe = mask_safe with pytest.raises(ValueError): result = estimator.run(fermi_dataset_empty) spatial_model = ConstantSpatialModel(value=0 / u.sr) spectral_model = PowerLawSpectralModel(amplitude="1e-22 cm-2 s-1 keV-1") model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) estimator = TSMapEstimator( model=model, kernel_width="0.6 deg", energy_edges=[10, 100, 1000] * u.GeV, sum_over_energy_groups=False, ) result = estimator.run(fermi_dataset) assert_allclose(result["ts"].data, 0) @requires_data() def test_compute_ts_map_downsampled(input_dataset): """Minimal test of compute_ts_image""" spatial_model = GaussianSpatialModel(sigma="0.11 deg") spectral_model = PowerLawSpectralModel(index=2) model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) ts_estimator = TSMapEstimator( model=model, downsampling_factor=2, kernel_width="1 deg", selection_optional=["ul"], ) result = ts_estimator.run(input_dataset) assert_allclose(result["ts"].data[0, 99, 99], 1661.49, rtol=1e-2) assert_allclose(result["niter"].data[0, 99, 99], 7) assert_allclose(result["flux"].data[0, 99, 99], 1.065988e-09, rtol=1e-2) assert_allclose(result["flux_err"].data[0, 99, 99], 4.005628e-11, rtol=1e-2) assert_allclose(result["flux_ul"].data[0, 99, 99], 1.14818952e-09, rtol=1e-2) assert result["flux"].unit == u.Unit("cm-2s-1") assert result["flux_err"].unit == u.Unit("cm-2s-1") assert result["flux_ul"].unit == u.Unit("cm-2s-1") # Check mask is correctly taken into account assert np.isnan(result["ts"].data[0, 30, 40]) def test_ts_map_stat_scan(fake_dataset): model = fake_dataset.models["source"] dataset = fake_dataset.downsample(25) estimator_ref = TSMapEstimator( model, kernel_width="0.3 deg", energy_edges=[200, 3500] * u.GeV, selection_optional=["errn-errp", "ul"], ) estimator = TSMapEstimator( model, kernel_width="0.3 deg", selection_optional=["stat_scan"], energy_edges=[200, 3500] * u.GeV, ) maps_ref = estimator_ref.run(dataset) maps = estimator.run(dataset) success = maps.success.data assert maps.stat_scan.geom.data_shape == (1, 109, 2, 2) ts = np.abs(maps["stat_scan"].data.min(axis=1)) assert_allclose(ts[success], maps_ref.ts.data[success], rtol=1e-3) dnde_ref = maps.dnde_ref.squeeze() assert maps.dnde_scan_values.unit == dnde_ref.unit ind_best = maps.stat_scan.data.argmin(axis=1) ij, ik, il = np.indices(ind_best.shape) norm = maps.dnde_scan_values.data[ij, ind_best, ik, il] / dnde_ref.value assert_allclose(norm[success], maps_ref.norm.data[success], rtol=1e-5) stat_scan_aprrox = approximate_profile_map(maps, sqrt_ts_threshold_ul="ignore") ts_aprrox = np.abs(stat_scan_aprrox.data.min(axis=1)) assert_allclose(ts_aprrox[success], maps_ref.ts.data[success], rtol=1e-3) stat_scan_aprrox = approximate_profile_map(maps, sqrt_ts_threshold_ul=None) ts_aprrox = np.abs(stat_scan_aprrox.data.min(axis=1)) assert_allclose(ts_aprrox[success], maps_ref.ts.data[success], rtol=1e-3) maps_from_scan = get_flux_map_from_profile(maps) assert_allclose( maps_from_scan.ts.data[success], maps_ref.ts.data[success], rtol=1e-3 ) combined_map = combine_flux_maps([maps, maps], method="profile") assert_allclose(combined_map.ts.data, 2 * ts, rtol=1e-4) assert_allclose(combined_map.norm.data[success], norm[success], rtol=5e-2) maps1 = deepcopy(maps) 
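    # combining a map with an identical copy should double the TS and leave the norm unchanged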
combined_map = combine_flux_maps([maps, maps1], method="gaussian_errors") assert_allclose(combined_map.ts.data[success], 2 * ts[success], rtol=1e-4) assert_allclose(combined_map.norm.data[success], norm[success], rtol=5e-2) combined_map = combine_flux_maps([maps, maps1], method="distrib") assert_allclose(combined_map.ts.data, 2 * ts, rtol=1e-4) assert_allclose(combined_map.norm.data[success], norm[success], rtol=5e-2) combined_results = get_combined_flux_maps( estimator, [dataset, dataset.copy()], method="distrib" ) combined_map = combined_results["flux_maps"] assert len(combined_results["estimator_results"]) == 2 assert_allclose(combined_map.ts.data, 2 * ts, rtol=1e-4) assert_allclose(combined_map.norm.data[success], norm[success], rtol=5e-2) combined_map = combine_flux_maps([maps, maps1], method="profile") assert_allclose(combined_map.ts.data, 2 * ts, rtol=1e-4) assert_allclose(combined_map.norm.data[success], norm[success], rtol=5e-2) maps1._reference_model.parameters["amplitude"].value = 1 combined_map = combine_flux_maps([maps, maps1], method="distrib") assert_allclose(combined_map.ts.data, 2 * ts, rtol=1e-4) assert_allclose(combined_map.norm.data[success], norm[success], rtol=5e-2) combined_map = combine_flux_maps([maps, maps1], method="profile") assert_allclose(combined_map.ts.data, 2 * ts, rtol=1e-4) assert_allclose(combined_map.norm.data[success], norm[success], rtol=5e-2) assert_allclose(maps.norm.data[success], maps_ref.norm.data[success], rtol=1e-4) assert_allclose( maps.norm_errn.data[success], maps_ref.norm_errn.data[success], rtol=5e-2 ) assert_allclose( maps.norm_errp.data[success], maps_ref.norm_errp.data[success], rtol=5e-2 ) assert_allclose( maps.norm_ul.data[success], maps_ref.norm_ul.data[success], rtol=5e-2 ) with pytest.raises(ValueError): combine_flux_maps([maps, maps1], method="test") def test_ts_map_stat_scan_different_energy(fake_dataset): model = fake_dataset.models["source"] dataset = fake_dataset.downsample(25) estimator = TSMapEstimator( model, kernel_width="0.3 deg", energy_edges=[200, 3500] * u.GeV, selection_optional=["stat_scan"], ) estimator_1 = TSMapEstimator( model, kernel_width="0.3 deg", selection_optional=["stat_scan"], energy_edges=[0.2, 10] * u.TeV, ) maps = estimator.run(dataset) maps_1 = estimator_1.run(dataset) combined_map = combine_flux_maps([maps_1, maps], method="profile") assert combined_map.ts.data.shape == (1, 2, 2) def test_ts_map_with_model(fake_dataset): model = fake_dataset.models["source"] fake_dataset = fake_dataset.copy() fake_dataset.models = [] estimator = TSMapEstimator( model, kernel_width="0.3 deg", selection_optional=["ul", "errn-errp"], energy_edges=[200, 3500] * u.GeV, ) maps = estimator.run(fake_dataset) assert_allclose(maps["sqrt_ts"].data[:, 25, 25], 18.369942, atol=0.1) assert_allclose(maps["flux"].data[:, 25, 25], 3.513e-10, atol=1e-12) assert_allclose(maps["flux_err"].data[0, 0, 0], 2.413244e-11, rtol=1e-4) fake_dataset.models = [model] maps = estimator.run(fake_dataset) assert_allclose(maps["sqrt_ts"].data[:, 25, 25], -0.231187, atol=0.1) assert_allclose(maps["flux"].data[:, 25, 25], -5.899423e-12, atol=1e-12) # Try downsampling estimator = TSMapEstimator( model, kernel_width="0.3 deg", selection_optional=[], downsampling_factor=2, energy_edges=[200, 3500] * u.GeV, ) maps = estimator.run(fake_dataset) assert_allclose(maps["sqrt_ts"].data[:, 25, 25], -0.279392, atol=0.1) assert_allclose(maps["flux"].data[:, 25, 25], -2.015715e-13, atol=1e-12) @requires_data() def test_compute_ts_map_with_hole(fake_dataset): """Test 
of compute_ts_image with a null exposure at the center of the map""" holes_dataset = fake_dataset.copy("holes_dataset") i, j, ie = holes_dataset.exposure.geom.center_pix holes_dataset.exposure.data[:, np.int_(i), np.int_(j)] = 0.0 spatial_model = GaussianSpatialModel(sigma="0.1 deg") spectral_model = PowerLawSpectralModel(index=2) model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) ts_estimator = TSMapEstimator(model=model, threshold=1, selection_optional=[]) kernel = ts_estimator.estimate_kernel(dataset=holes_dataset) assert_allclose(kernel.geom.width, 1.0 * u.deg) assert_allclose(kernel.data.sum(), 1.0) holes_dataset.exposure.data[...] = 0.0 with pytest.raises(ValueError): kernel = ts_estimator.estimate_kernel(dataset=holes_dataset) def test_MapDatasetOnOff_error(): """Test that an error is raised when applying TSMapEstimator to a MapDatasetOnOff""" axis = MapAxis.from_edges([1, 10] * u.TeV, name="energy") geom = WcsGeom.create(width=1, axes=[axis]) dataset_on_off = MapDatasetOnOff.create(geom) ts_estimator = TSMapEstimator() with pytest.raises(TypeError): ts_estimator.run(dataset=dataset_on_off) @requires_data() def test_with_TemplateSpatialModel(): # Regression test for the bug reported in issue 4920 dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") dataset = dataset.downsample(10) filename = "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v18/Templates/RXJ1713_2016_250GeV.fits" model = TemplateSpatialModel.read(filename, normalize=False) model.position = SkyCoord(0, 0, unit="deg", frame="galactic") sky_model = SkyModel(spatial_model=model, spectral_model=PowerLawSpectralModel()) dataset.models = sky_model estimator = TSMapEstimator( model=sky_model, energy_edges=[1.0, 5.0] * u.TeV, n_jobs=4, ) result = estimator.run(dataset) assert_allclose(result["sqrt_ts"].data[0, 12, 16], 22.932, rtol=1e-3) def test_joint_ts_map(fake_dataset): model = fake_dataset.models["source"] fake_dataset = fake_dataset.copy() fake_dataset2 = fake_dataset.copy() fake_dataset.models = [model] fake_dataset2.models = [model] estimator = TSMapEstimator( model=model, selection_optional=[], sum_over_energy_groups=True ) assert estimator.sum_over_energy_groups result = estimator.run(fake_dataset) assert_allclose(result["npred_excess"].data.sum(), 902.403647, rtol=1e-3) assert_allclose(result["sqrt_ts"].data[0, 10, 10], 1.360219, rtol=1e-3) result = get_combined_significance_maps(estimator, [fake_dataset, fake_dataset2]) assert_allclose(result["npred_excess"].data.sum(), 2 * 902.403647, rtol=1e-3) assert_allclose(result["significance"].data[10, 10], 1.414529, rtol=1e-3) assert_allclose( result["df"].data, 2 * (~np.isnan(result["significance"].data)), rtol=1e-3 ) estimator = TSMapEstimator( model=model, threshold=1, selection_optional="all", sum_over_energy_groups=True ) result = estimator.run([fake_dataset, fake_dataset2]) assert_allclose(result["sqrt_ts"].data[0, 10, 10], 1.92364, rtol=1e-3) @requires_data() def test_joint_ts_map_hawc(): datasets = Datasets.read("$GAMMAPY_DATA/hawc/DL4/HAWC_pass4_public_Crab.yaml") datasets = Datasets(datasets[-2:]) estimator = TSMapEstimator( kernel_width=2 * u.deg, sum_over_energy_groups=False, n_jobs=4 ) result = estimator.run(datasets) assert_allclose(result["flux"].data[0, 59, 59], 1.909396e-13, rtol=1e-3) assert_allclose(result["sqrt_ts"].data[0, 59, 59], 10.878956, rtol=1e-3) estimator = TSMapEstimator( kernel_width=2 * u.deg, sum_over_energy_groups=False, selection_optional=["stat_scan"], n_jobs=4, ) result = estimator.run(datasets) 
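# Enabling the optional "stat_scan" selection should not change the best-fit flux or sqrt_ts; the assertions below repeat the reference values.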
assert_allclose(result["flux"].data[0, 59, 59], 1.909396e-13, rtol=1e-3) assert_allclose(result["sqrt_ts"].data[0, 59, 59], 10.878956, rtol=1e-3) assert result.stat_scan.geom.data_shape == (1, 109, 120, 120) assert result.dnde_scan_values.geom.data_shape == (1, 109, 120, 120) assert_allclose( result["dnde_scan_values"].data[0, 0, 59, 59], -3.164557e-13, rtol=1e-3 ) assert_allclose(result["stat_scan"].data[0, 0, 59, 59], 5193.588657, rtol=1e-3) estimator = TSMapEstimator( kernel_width=2 * u.deg, sum_over_energy_groups=True, n_jobs=4 ) result = estimator.run(datasets) assert_allclose(result["flux"].data[0, 59, 59], 1.99452e-13, rtol=1e-3) assert_allclose(result["sqrt_ts"].data[0, 59, 59], 11.997135, rtol=1e-3) estimator = TSMapEstimator( kernel_width=2 * u.deg, sum_over_energy_groups=True, selection_optional=["stat_scan"], n_jobs=4, ) result = estimator.run(datasets) assert_allclose(result["flux"].data[0, 59, 59], 1.99452e-13, rtol=1e-3) assert_allclose(result["sqrt_ts"].data[0, 59, 59], 11.997135, rtol=1e-3) assert result.stat_scan.geom.data_shape == (1, 109, 120, 120) assert result.dnde_scan_values.geom.data_shape == (1, 109, 120, 120) assert_allclose( result["dnde_scan_values"].data[0, 0, 59, 59], -3.164557e-13, rtol=1e-3 ) assert_allclose(result["stat_scan"].data[0, 0, 59, 59], 7625.040553, rtol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/map/ts.py0000644000175100001770000010366314721316200020024 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Functions to compute test statistic images.""" import warnings from itertools import repeat import numpy as np import scipy.optimize from scipy.interpolate import InterpolatedUnivariateSpline from astropy.coordinates import Angle from astropy.utils import lazyproperty import gammapy.utils.parallel as parallel from gammapy.datasets import Datasets from gammapy.datasets.map import MapEvaluator from gammapy.datasets.utils import get_nearest_valid_exposure_position from gammapy.maps import Map, MapAxis, Maps from gammapy.modeling.models import PointSpatialModel, PowerLawSpectralModel, SkyModel from gammapy.stats import cash, cash_sum_cython, f_cash_root_cython, norm_bounds_cython from gammapy.utils.array import shape_2N, symmetric_crop_pad_width from gammapy.utils.pbar import progress_bar from gammapy.utils.roots import find_roots from ..core import Estimator from ..utils import ( _generate_scan_values, _get_default_norm, _get_norm_scan_values, estimate_exposure_reco_energy, ) from .core import FluxMaps __all__ = ["TSMapEstimator"] def _extract_array(array, shape, position): """Helper function to extract parts of a larger array. Simple implementation of an array extract function , because `~astropy.ndata.utils.extract_array` introduces too much overhead.` Parameters ---------- array : `~numpy.ndarray` The array from which to extract. shape : tuple or int The shape of the extracted array. position : tuple of numbers or number The position of the small array's center with respect to the large array. """ x_width = shape[2] // 2 y_width = shape[1] // 2 y_lo = position[0] - y_width y_hi = position[0] + y_width + 1 x_lo = position[1] - x_width x_hi = position[1] + x_width + 1 return array[:, y_lo:y_hi, x_lo:x_hi] class TSMapEstimator(Estimator, parallel.ParallelMixin): r"""Compute test statistic map from a MapDataset using different optimization methods. The map is computed fitting by a single parameter norm fit. 
The fit is simplified by finding the roots of the derivative of the fit statistic, using various root-finding algorithms. The approach is described in Appendix A of Stewart (2009). Parameters ---------- model : `~gammapy.modeling.models.SkyModel` Source model kernel. If None, a `~gammapy.modeling.models.PointSpatialModel` spatial model with a `~gammapy.modeling.models.PowerLawSpectralModel` spectral model of index 2 is assumed. kernel_width : `~astropy.coordinates.Angle` Width of the kernel to use: the kernel will be truncated at this size. n_sigma : int Number of sigma for flux error. Default is 1. n_sigma_ul : int Number of sigma for flux upper limits. Default is 2. downsampling_factor : int Downsample the input maps to speed up the computation. Only integer values that are a multiple of 2 are allowed. Note that the kernel is not downsampled, but must be provided with the downsampled bin size. threshold : float, optional If the test statistic value corresponding to the initial flux estimate is not above this threshold, the optimizing step is omitted to save computing time. Default is None. rtol : float Relative precision of the flux estimate. Used as a stopping criterion for the norm fit. Default is 0.01. selection_optional : list of str, optional Which maps to compute besides TS, sqrt(TS), flux and the symmetric error on flux. Available options are: * "all": all the optional steps are executed. * "errn-errp": estimate asymmetric errors on flux. * "ul": estimate upper limits on flux. * "stat_scan": estimate the likelihood profile. Default is None, so the optional steps are not executed. energy_edges : list of `~astropy.units.Quantity`, optional Edges of the target maps energy bins. The resulting bin edges won't be exactly equal to the input ones, but rather the closest values to the energy axis edges of the parent dataset. Default is None: apply the estimator in each energy bin of the parent dataset. For further explanation see :ref:`estimators`. sum_over_energy_groups : bool Whether to sum over the energy groups or fit the norm on the full energy cube. norm : `~gammapy.modeling.Parameter` or dict Norm parameter used for the likelihood profile computation on a fixed norm range. Only used for "stat_scan" in `selection_optional`. Default is None and a new parameter is created automatically, with value=1, name="norm", scan_min=-100, scan_max=100, and values sampled such that a 0.1% relative error on the norm can be probed. If a dict is given, the entries should be a subset of `~gammapy.modeling.Parameter` arguments. n_jobs : int Number of processes used in parallel for the computation. Default is one, unless `~gammapy.utils.parallel.N_JOBS_DEFAULT` was modified. The number of jobs is limited to the number of physical CPUs. parallel_backend : {"multiprocessing", "ray"} Which backend to use for multiprocessing. Defaults to `~gammapy.utils.parallel.BACKEND_DEFAULT`. max_niter : int Maximal number of iterations used by the root-finding algorithm. Default is 100. Notes ----- Negative :math:`TS` values are defined as follows: .. math:: TS = \left \{ \begin{array}{ll} -TS \text{ if } F < 0 \\ TS \text{ else} \end{array} \right. Where :math:`F` is the fitted flux norm. 
Examples -------- >>> import astropy.units as u >>> from gammapy.estimators import TSMapEstimator >>> from gammapy.datasets import MapDataset >>> from gammapy.modeling.models import (SkyModel, PowerLawSpectralModel,PointSpatialModel) >>> spatial_model = PointSpatialModel() >>> spectral_model = PowerLawSpectralModel(amplitude="1e-22 cm-2 s-1 keV-1", index=2) >>> model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) >>> dataset = MapDataset.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc.fits.gz") >>> estimator = TSMapEstimator( ... model, kernel_width="1 deg", energy_edges=[10, 100] * u.GeV, downsampling_factor=4 ... ) >>> maps = estimator.run(dataset) >>> print(maps) FluxMaps -------- geom : WcsGeom axes : ['lon', 'lat', 'energy'] shape : (np.int64(400), np.int64(200), 1) quantities : ['ts', 'norm', 'niter', 'norm_err', 'npred', 'npred_excess', 'stat', 'stat_null', 'success'] ref. model : pl n_sigma : 1 n_sigma_ul : 2 sqrt_ts_threshold_ul : 2 sed type init : likelihood References ---------- [Stewart2009]_ """ tag = "TSMapEstimator" _available_selection_optional = ["errn-errp", "ul", "stat_scan"] def __init__( self, model=None, kernel_width=None, downsampling_factor=None, n_sigma=1, n_sigma_ul=2, threshold=None, rtol=0.01, selection_optional=None, energy_edges=None, sum_over_energy_groups=True, n_jobs=None, parallel_backend=None, norm=None, max_niter=100, ): if kernel_width is not None: kernel_width = Angle(kernel_width) self.kernel_width = kernel_width self.norm = _get_default_norm(norm, scan_values=_generate_scan_values()) if model is None: model = SkyModel( spectral_model=PowerLawSpectralModel(), spatial_model=PointSpatialModel(), name="ts-kernel", ) self.model = model self.downsampling_factor = downsampling_factor self.n_sigma = n_sigma self.n_sigma_ul = n_sigma_ul self.threshold = threshold self.rtol = rtol self.n_jobs = n_jobs self.parallel_backend = parallel_backend self.sum_over_energy_groups = sum_over_energy_groups self.max_niter = max_niter self.selection_optional = selection_optional self.energy_edges = energy_edges self._flux_estimator = BrentqFluxEstimator( rtol=self.rtol, n_sigma=self.n_sigma, n_sigma_ul=self.n_sigma_ul, selection_optional=selection_optional, ts_threshold=threshold, norm=self.norm, max_niter=self.max_niter, ) @property def selection_all(self): """Which quantities are computed.""" selection = [ "ts", "norm", "niter", "norm_err", "npred", "npred_excess", "stat", "stat_null", "success", ] if "stat_scan" in self.selection_optional: selection += [ "dnde_scan_values", "stat_scan", "norm_errp", "norm_errn", "norm_ul", ] else: if "errn-errp" in self.selection_optional: selection += ["norm_errp", "norm_errn"] if "ul" in self.selection_optional: selection += ["norm_ul"] return selection def estimate_kernel(self, dataset): """Get the convolution kernel for the input dataset. Convolves the model with the IRFs at the center of the dataset, or at the nearest position with non-zero exposure. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input dataset. Returns ------- kernel : `~gammapy.maps.Map` Kernel map. 
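Examples
--------
A minimal sketch of the intended usage (assuming ``$GAMMAPY_DATA`` is set; the dataset path is just an example):

>>> from gammapy.datasets import MapDataset
>>> from gammapy.estimators import TSMapEstimator
>>> dataset = MapDataset.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc.fits.gz")  # doctest: +SKIP
>>> kernel = TSMapEstimator(kernel_width="1 deg").estimate_kernel(dataset)  # doctest: +SKIP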
""" geom = dataset.exposure.geom if self.kernel_width is not None: geom = geom.to_odd_npix(max_radius=self.kernel_width / 2) model = self.model.copy() model.spatial_model.position = geom.center_skydir # Creating exposure map with the mean non-null exposure exposure = Map.from_geom(geom, unit=dataset.exposure.unit) position = get_nearest_valid_exposure_position( dataset.exposure, geom.center_skydir ) exposure_position = dataset.exposure.to_region_nd_map(position) if not np.any(exposure_position.data): raise ValueError( "No valid exposure. Impossible to compute kernel for TS Map." ) exposure.data[...] = exposure_position.data # We use global evaluation mode to not modify the geometry evaluator = MapEvaluator(model=model) evaluator.update( exposure=exposure, psf=dataset.psf, edisp=dataset.edisp, geom=dataset.counts.geom, mask=dataset.mask_image, ) kernel = evaluator.compute_npred() kernel.data /= kernel.data.sum() return kernel def estimate_flux_default(self, dataset, kernel=None, exposure=None): """Estimate default flux map using a given kernel. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input dataset. kernel : `~gammapy.maps.WcsNDMap` Source model kernel. exposure : `~gammapy.maps.WcsNDMap` Exposure map on reconstructed energy. Returns ------- flux : `~gammapy.maps.WcsNDMap` Approximate flux map. """ if exposure is None: exposure = estimate_exposure_reco_energy(dataset, self.model.spectral_model) if kernel is None: kernel = self.estimate_kernel(dataset=dataset) kernel = kernel.data / np.sum(kernel.data**2) with np.errstate(invalid="ignore", divide="ignore"): flux = (dataset.counts - dataset.npred()) / exposure flux.data = np.nan_to_num(flux.data) flux.quantity = flux.quantity.to("1 / (cm2 s)") flux = flux.convolve(kernel) if dataset.mask_safe: flux *= dataset.mask_safe return flux.sum_over_axes() @staticmethod def estimate_mask_default(dataset): """Compute default mask where to estimate test statistic values. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input dataset. Returns ------- mask : `WcsNDMap` Mask map. """ geom = dataset.counts.geom.to_image() mask = np.ones(geom.data_shape, dtype=bool) if dataset.mask is not None: mask &= dataset.mask.reduce_over_axes(func=np.logical_or, keepdims=False) # in some image there are pixels, which have exposure, but zero # background, which doesn't make sense and causes the TS computation # to fail, this is a temporary fix npred = dataset.npred() if dataset.mask_safe: npred *= dataset.mask_safe background = npred.sum_over_axes(keepdims=False) mask[background.data == 0] = False return Map.from_geom(data=mask, geom=geom) def estimate_pad_width(self, dataset, kernel=None): """Estimate pad width of the dataset. Parameters ---------- dataset : `MapDataset` Input MapDataset. kernel : `WcsNDMap` Source model kernel. Returns ------- pad_width : tuple Padding width. """ if kernel is None: kernel = self.estimate_kernel(dataset=dataset) geom = dataset.counts.geom.to_image() geom_kernel = kernel.geom.to_image() pad_width = np.array(geom_kernel.data_shape) // 2 if self.downsampling_factor and self.downsampling_factor > 1: shape = tuple(np.array(geom.data_shape) + 2 * pad_width) pad_width = symmetric_crop_pad_width(geom.data_shape, shape_2N(shape))[0] return tuple(pad_width) def estimate_fit_input_maps(self, dataset): """Estimate fit input maps. Parameters ---------- dataset : `MapDataset` Map dataset. Returns ------- maps : dict of `Map` Maps dictionary. 
""" # First create 2D map arrays exposure = estimate_exposure_reco_energy(dataset, self.model.spectral_model) kernel = self.estimate_kernel(dataset) mask = self.estimate_mask_default(dataset=dataset) flux = self.estimate_flux_default( dataset=dataset, kernel=kernel, exposure=exposure ) mask_safe = dataset.mask_safe if dataset.mask_safe else 1.0 counts = dataset.counts * mask_safe background = dataset.npred() * mask_safe exposure *= mask_safe energy_axis = counts.geom.axes["energy"] flux_ref = self.model.spectral_model.integral( energy_axis.edges[0], energy_axis.edges[-1] ) exposure_npred = (exposure * flux_ref * mask.data).to_unit("") norm = (flux / flux_ref).to_unit("") if self.sum_over_energy_groups: if dataset.mask_safe is None: mask_safe = Map.from_geom(counts.geom, data=True, dtype=bool) counts = counts.sum_over_axes() background = background.sum_over_axes() exposure_npred = exposure_npred.sum_over_axes() else: mask_safe = None # already applied return { "counts": counts, "background": background, "norm": norm, "mask": mask, "mask_safe": mask_safe, "exposure": exposure_npred, "kernel": kernel, } def estimate_flux_map(self, datasets): """Estimate flux and test statistic maps for single dataset. Parameters ---------- dataset : `~gammapy.datasets.Datasets` or `~gammapy.datasets.MapDataset` Map dataset or Datasets (list of MapDataset with the same spatial geometry). """ maps = [self.estimate_fit_input_maps(dataset=d) for d in datasets] mask = np.sum([_["mask"].data for _ in maps], axis=0).astype(bool) if not np.any(mask): raise ValueError( """No valid positions found. Check that the dataset background is defined and not only zeros, or that the mask_safe is not all False." """ ) x, y = np.where(np.squeeze(mask)) positions = list(zip(x, y)) inputs = zip( positions, repeat([_["counts"].data.astype(float) for _ in maps]), repeat([_["exposure"].data.astype(float) for _ in maps]), repeat([_["background"].data.astype(float) for _ in maps]), repeat([_["kernel"].data for _ in maps]), repeat([_["norm"].data for _ in maps]), repeat([_["mask_safe"] for _ in maps]), repeat(self._flux_estimator), ) results = parallel.run_multiprocessing( _ts_value, inputs, backend=self.parallel_backend, pool_kwargs=dict(processes=self.n_jobs), task_name="TS map", ) result = {} j, i = zip(*positions) geom = maps[0]["counts"].geom.squash(axis_name="energy") energy_axis = geom.axes["energy"] dnde_ref = self.model.spectral_model(energy_axis.center) for name in self.selection_all: if name in ["dnde_scan_values", "stat_scan"]: norm_bin_axis = MapAxis( range(len(results[0]["dnde_scan_values"])), interp="lin", node_type="center", name="dnde_bin", ) axes = geom.axes + [norm_bin_axis] geom_scan = geom.to_image().to_cube(axes) if name == "dnde_scan_values": unit = dnde_ref.unit factor = dnde_ref.value else: unit = "" factor = 1 m = Map.from_geom(geom_scan, data=np.nan, unit=unit) m.data[:, 0, j, i] = np.array([_[name] for _ in results]).T * factor else: m = Map.from_geom(geom=geom, data=np.nan, unit="") m.data[0, j, i] = [_[name] for _ in results] result[name] = m return result def run(self, datasets): """ Run test statistic map estimation. Requires a MapDataset with counts, exposure and background_model properly set to run. Notes ----- The progress bar can be displayed for this function. Parameters ---------- dataset : `~gammapy.datasets.Datasets` or `~gammapy.datasets.MapDataset` Map dataset or Datasets (list of MapDataset with the same spatial geometry). Returns ------- maps : dict Dictionary containing result maps. 
Keys are: * ts : delta(TS) map * sqrt_ts : sqrt(delta(TS)), or significance map * flux : flux map * flux_err : symmetric error map * flux_ul : upper limit map. """ datasets = Datasets(datasets) geom_ref = datasets[0].counts.geom for dataset in datasets: if dataset.stat_type != "cash": raise TypeError( f"{type(dataset)} is not a valid type for {self.__class__}" ) if dataset.counts.geom.to_image() != geom_ref.to_image(): raise TypeError("Datasets geometries must match") datasets_models = datasets.models pad_width = (0, 0) for dataset in datasets: pad_width_dataset = self.estimate_pad_width(dataset=dataset) pad_width = tuple(np.maximum(pad_width, pad_width_dataset)) datasets_padded = Datasets() for dataset in datasets: dataset = dataset.pad(pad_width, name=dataset.name) dataset = dataset.downsample(self.downsampling_factor, name=dataset.name) datasets_padded.append(dataset) datasets = datasets_padded energy_axis = self._get_energy_axis(dataset=datasets[0]) results = [] for energy_min, energy_max in progress_bar( energy_axis.iter_by_edges, desc="Energy bins" ): datasets_sliced = datasets.slice_by_energy( energy_min=energy_min, energy_max=energy_max ) if datasets_models is not None: models_sliced = datasets_models._slice_by_energy( energy_min=energy_min, energy_max=energy_max, sum_over_energy_groups=self.sum_over_energy_groups, ) datasets_sliced.models = models_sliced result = self.estimate_flux_map(datasets_sliced) results.append(result) maps = Maps() for name in self.selection_all: m = Map.from_stack(maps=[_[name] for _ in results], axis_name="energy") order = 0 if name in ["niter", "success"] else 1 m = m.upsample( factor=self.downsampling_factor, preserve_counts=False, order=order ) maps[name] = m.crop(crop_width=pad_width) maps["success"].data = maps["success"].data.astype(bool) meta = {"n_sigma": self.n_sigma, "n_sigma_ul": self.n_sigma_ul} return FluxMaps( data=maps, reference_model=self.model, gti=dataset.gti, meta=meta, ) # TODO: merge with MapDataset? class SimpleMapDataset: """Simple map dataset. Parameters ---------- counts : `~numpy.ndarray` Counts array. background : `~numpy.ndarray` Background array. model : `~numpy.ndarray` Kernel array. 
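norm_guess : float
    Initial guess for the norm, taken from the approximate flux map at the source position.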
""" def __init__(self, model, counts, background, norm_guess): self.model = model self.counts = counts self.background = background self.norm_guess = norm_guess @lazyproperty def norm_bounds(self): """Bounds for x.""" return norm_bounds_cython(self.counts, self.background, self.model) def npred(self, norm): """Predicted number of counts.""" return self.background + norm * self.model def stat_sum(self, norm): """Statistics sum.""" return cash_sum_cython(self.counts, self.npred(norm)) def stat_derivative(self, norm): """Statistics derivative.""" return f_cash_root_cython(norm, self.counts, self.background, self.model) def stat_2nd_derivative(self, norm): """Statistics 2nd derivative.""" term_top = self.model**2 * self.counts term_bottom = (self.background + norm * self.model) ** 2 mask = term_bottom == 0 return (term_top / term_bottom)[~mask].sum() @classmethod def from_arrays( cls, counts, background, exposure, norm, position, kernel, mask_safe ): """""" if mask_safe: # compute mask_safe weighted kernel for the sum_over_axes case mask_safe = _extract_array(mask_safe.data, kernel.shape, position) kernel = (kernel * mask_safe).sum(axis=0, keepdims=True) with np.errstate(invalid="ignore", divide="ignore"): kernel /= mask_safe.sum(axis=0, keepdims=True) kernel[~np.isfinite(kernel)] = 0 counts_cutout = _extract_array(counts, kernel.shape, position) background_cutout = _extract_array(background, kernel.shape, position) exposure_cutout = _extract_array(exposure, kernel.shape, position) model = kernel * exposure_cutout norm_guess = norm[0, position[0], position[1]] mask_invalid = (counts_cutout == 0) & (background_cutout == 0) & (model == 0) return cls( counts=counts_cutout[~mask_invalid], background=background_cutout[~mask_invalid], model=model[~mask_invalid], norm_guess=norm_guess, ) # TODO: merge with `FluxEstimator`? class BrentqFluxEstimator(Estimator): """Single parameter flux estimator.""" _available_selection_optional = ["errn-errp", "ul", "stat_scan"] tag = "BrentqFluxEstimator" def __init__( self, rtol, n_sigma, n_sigma_ul, selection_optional=None, max_niter=100, ts_threshold=None, norm=None, ): self.rtol = rtol self.n_sigma = n_sigma self.n_sigma_ul = n_sigma_ul self.selection_optional = selection_optional self.max_niter = max_niter self.ts_threshold = ts_threshold self.norm = norm def estimate_best_fit(self, dataset): """Estimate best fit norm parameter. Parameters ---------- dataset : `SimpleMapDataset` Simple map dataset. Returns ------- result : dict Result dictionary including 'norm' and 'norm_err'. 
""" # Compute norm bounds and assert counts > 0 norm_min, norm_max, norm_min_total = dataset.norm_bounds if not dataset.counts.sum() > 0: norm, niter, success = norm_min_total, 0, True else: with warnings.catch_warnings(): warnings.simplefilter("ignore") try: # here we do not use avoid find_roots for performance result_fit = scipy.optimize.brentq( f=dataset.stat_derivative, a=norm_min, b=norm_max, maxiter=self.max_niter, full_output=True, rtol=self.rtol, ) norm = max(result_fit[0], norm_min_total) niter = result_fit[1].iterations success = result_fit[1].converged except (RuntimeError, ValueError): norm, niter, success = norm_min_total, self.max_niter, False with np.errstate(invalid="ignore", divide="ignore"): norm_err = np.sqrt(1 / dataset.stat_2nd_derivative(norm)) * self.n_sigma stat = dataset.stat_sum(norm=norm) stat_null = dataset.stat_sum(norm=0) return { "norm": norm, "norm_err": norm_err, "niter": niter, "ts": stat_null - stat, "stat": stat, "stat_null": stat_null, "success": success, } def _confidence(self, dataset, n_sigma, result, positive): stat_best = result["stat"] norm = result["norm"] norm_err = result["norm_err"] def ts_diff(x): return (stat_best + n_sigma**2) - dataset.stat_sum(x) if positive: min_norm = norm max_norm = norm + 1e2 * norm_err factor = 1 else: min_norm = norm - 1e2 * norm_err max_norm = norm factor = -1 with warnings.catch_warnings(): warnings.simplefilter("ignore") roots, res = find_roots( ts_diff, [min_norm], [max_norm], nbin=1, maxiter=self.max_niter, rtol=self.rtol, ) # Where the root finding fails NaN is set as norm return (roots[0] - norm) * factor def estimate_ul(self, dataset, result): """Compute upper limit using likelihood profile method. Parameters ---------- dataset : `SimpleMapDataset` Simple map dataset. Returns ------- result : dict Result dictionary including 'norm_ul'. """ flux_ul = result["norm"] + self._confidence( dataset=dataset, n_sigma=self.n_sigma_ul, result=result, positive=True ) return {"norm_ul": flux_ul} def estimate_errn_errp(self, dataset, result): """Compute norm errors using likelihood profile method. Parameters ---------- dataset : `SimpleMapDataset` Simple map dataset. Returns ------- result : dict Result dictionary including 'norm_errp' and 'norm_errn'. """ flux_errn = self._confidence( dataset=dataset, result=result, n_sigma=self.n_sigma, positive=False ) flux_errp = self._confidence( dataset=dataset, result=result, n_sigma=self.n_sigma, positive=True ) return {"norm_errn": flux_errn, "norm_errp": flux_errp} def estimate_scan(self, dataset, result): """Compute likelihood profile. Parameters ---------- dataset : `SimpleMapDataset` Simple map dataset. Returns ------- result : dict Result dictionary including 'stat_scan'. 
""" sparse_norms = _get_norm_scan_values(self.norm, result) scale = sparse_norms[None, :] model = dataset.model.ravel()[:, None] background = dataset.background.ravel()[:, None] counts = dataset.counts.ravel()[:, None] stat_scan = cash(counts, model * scale + background) stat_scan_local = stat_scan.sum(axis=0) - result["stat_null"] spline = InterpolatedUnivariateSpline( sparse_norms, stat_scan_local, k=1, ext="raise", check_finite=True ) norms = np.unique(np.concatenate((sparse_norms, self.norm.scan_values))) stat_scan = spline(norms) ts = -stat_scan.min() ind = stat_scan.argmin() norm = norms[ind] maskp = norms > norm stat_diff = stat_scan - stat_scan.min() ind = np.abs(stat_diff - self.n_sigma**2)[~maskp].argmin() norm_errn = norm - norms[~maskp][ind] ind = np.abs(stat_diff - self.n_sigma**2)[maskp].argmin() norm_errp = norms[maskp][ind] - norm ind = np.abs(stat_diff - self.n_sigma_ul**2)[maskp].argmin() norm_ul = norms[maskp][ind] norm_err = (norm_errn + norm_errp) / 2 return dict( ts=ts, norm=norm, norm_err=norm_err, norm_errn=norm_errn, norm_errp=norm_errp, norm_ul=norm_ul, stat_scan=stat_scan_local, dnde_scan_values=sparse_norms, ) def estimate_default(self, dataset): """Estimate default norm. Parameters ---------- dataset : `SimpleMapDataset` Simple map dataset. Returns ------- result : dict Result dictionary including 'norm', 'norm_err' and "niter". """ norm = dataset.norm_guess with np.errstate(invalid="ignore", divide="ignore"): norm_err = np.sqrt(1 / dataset.stat_2nd_derivative(norm)) * self.n_sigma stat = dataset.stat_sum(norm=norm) stat_null = dataset.stat_sum(norm=0) return { "norm": norm, "norm_err": norm_err, "niter": 0, "ts": stat_null - stat, "stat": stat, "stat_null": stat_null, "success": True, } def run(self, dataset): """Run flux estimator. Parameters ---------- dataset : `SimpleMapDataset` Simple map dataset. Returns ------- result : dict Result dictionary. """ if self.ts_threshold is not None: result = self.estimate_default(dataset) if result["ts"] > self.ts_threshold: result = self.estimate_best_fit(dataset) else: result = self.estimate_best_fit(dataset) if "ul" in self.selection_optional: result.update(self.estimate_ul(dataset, result)) if "errn-errp" in self.selection_optional: result.update(self.estimate_errn_errp(dataset, result)) if "stat_scan" in self.selection_optional: result.update(self.estimate_scan(dataset, result)) norm = result["norm"] result["npred"] = dataset.npred(norm=norm).sum() result["npred_excess"] = result["npred"] - dataset.npred(norm=0).sum() result["stat"] = dataset.stat_sum(norm=norm) return result def _ts_value( position, counts, exposure, background, kernel, norm, mask_safe, flux_estimator ): """Compute test statistic value at a given pixel position. Uses approach described in Stewart (2009). Parameters ---------- position : tuple (i, j) Pixel position. counts : `~numpy.ndarray` Counts image. background : `~numpy.ndarray` Background image. exposure : `~numpy.ndarray` Exposure image. kernel : `astropy.convolution.Kernel2D` Source model kernel. norm : `~numpy.ndarray` Norm image. The flux value at the given pixel position is used as starting value for the minimization. Returns ------- TS : float Test statistic value at the given pixel position. 
""" datasets = [] nd = len(counts) for idx in range(nd): datasets.append( SimpleMapDataset.from_arrays( counts=counts[idx], background=background[idx], exposure=exposure[idx], norm=norm[idx], position=position, kernel=kernel[idx], mask_safe=mask_safe[idx], ) ) norm_guess = np.array([d.norm_guess for d in datasets]) mask_valid = np.isfinite(norm_guess) if np.any(mask_valid): norm_guess = np.mean(norm_guess[mask_valid]) else: norm_guess = 1.0 dataset = SimpleMapDataset( counts=np.concatenate([d.counts for d in datasets]), background=np.concatenate([d.background for d in datasets]), model=np.concatenate([d.model for d in datasets]), norm_guess=norm_guess, ) return flux_estimator.run(dataset) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/metadata.py0000644000175100001770000000504514721316200020374 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from enum import Enum from typing import ClassVar, Literal, Optional from gammapy.utils.metadata import ( METADATA_FITS_KEYS, CreatorMetaData, MetaData, TargetMetaData, ) __all__ = ["FluxMetaData"] FLUX_METADATA_FITS_KEYS = { "flux": { "sed_type": "SED_TYPE", "sed_type_init": "SEDTYPEI", "n_sigma": "N_SIGMA", "n_sigma_ul": "NSIGMAUL", "sqrt_ts_threshold_ul": "STSTHUL", "n_sigma_sensitivity": "NSIGMSEN", }, } METADATA_FITS_KEYS.update(FLUX_METADATA_FITS_KEYS) log = logging.getLogger(__name__) class SEDTYPEEnum(str, Enum): dnde = "dnde" flux = "flux" eflux = "eflux" e2dnde = "e2dnde" likelihood = "likelihood" class FluxMetaData(MetaData): """Metadata containing information about the FluxPoints and FluxMaps. Attributes ---------- sed_type : {"dnde", "flux", "eflux", "e2dnde", "likelihood"}, optional SED type. sed_type_init : {"dnde", "flux", "eflux", "e2dnde", "likelihood"}, optional SED type of the initial data. n_sigma : float, optional Significance threshold above which upper limits should be used. n_sigma_ul : float, optional Significance value used for the upper limit computation. sqrt_ts_threshold_ul : float, optional Threshold on the square root of the likelihood value above which upper limits should be used. n_sigma_sensitivity : float, optional Sigma number for which the flux sensitivity is computed target : `~gammapy.utils.TargetMetaData`, optional General metadata information about the target. creation : `~gammapy.utils.CreatorMetaData`, optional The creation metadata. optional : dict, optional additional optional metadata. 
Note : these quantities are serialized in FITS header with the keywords stored in the dictionary FLUX_METADATA_FITS_KEYS """ _tag: ClassVar[Literal["flux"]] = "flux" sed_type: Optional[SEDTYPEEnum] = None sed_type_init: Optional[SEDTYPEEnum] = None n_sigma: Optional[float] = None n_sigma_ul: Optional[float] = None sqrt_ts_threshold_ul: Optional[float] = None n_sigma_sensitivity: Optional[float] = None target: Optional[TargetMetaData] = None # TODO: add obs_ids: Optional[List[int]] and instrument: Optional[str], or a List[ObsInfoMetaData] # TODO : add dataset_names: Optional[List[str]] creation: Optional[CreatorMetaData] = CreatorMetaData() optional: Optional[dict] = None ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/parameter.py0000644000175100001770000002576714721316200020611 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np from gammapy.datasets import Datasets from gammapy.datasets.actors import DatasetsActor from gammapy.modeling import Fit from .core import Estimator log = logging.getLogger(__name__) class ParameterEstimator(Estimator): """Model parameter estimator. Estimates a model parameter for a group of datasets. Compute best fit value, symmetric and delta(TS) for a given null value. Additionally asymmetric errors as well as parameter upper limit and fit statistic profile can be estimated. Parameters ---------- n_sigma : int Sigma to use for asymmetric error computation. Default is 1. n_sigma_ul : int Sigma to use for upper limit computation. Default is 2. null_value : float Which null value to use for the parameter. selection_optional : list of str, optional Which additional quantities to estimate. Available options are: * "all": all the optional steps are executed. * "errn-errp": estimate asymmetric errors on parameter best fit value. * "ul": estimate upper limits. * "scan": estimate fit statistic profiles. Default is None so the optional steps are not executed. fit : `~gammapy.modeling.Fit` Fit instance specifying the backend and fit options. reoptimize : bool Re-optimize other free model parameters. Default is True. Examples -------- >>> from gammapy.datasets import SpectrumDatasetOnOff, Datasets >>> from gammapy.modeling.models import SkyModel, PowerLawSpectralModel >>> from gammapy.estimators import ParameterEstimator >>> >>> filename = "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits" >>> dataset = SpectrumDatasetOnOff.read(filename) >>> datasets = Datasets([dataset]) >>> spectral_model = PowerLawSpectralModel(amplitude="3e-11 cm-2s-1TeV-1", index=2.7) >>> >>> model = SkyModel(spectral_model=spectral_model, name="Crab") >>> model.spectral_model.amplitude.scan_n_values = 10 >>> >>> for dataset in datasets: ... dataset.models = model >>> >>> estimator = ParameterEstimator(selection_optional="all") >>> result = estimator.run(datasets, parameter="amplitude") """ tag = "ParameterEstimator" _available_selection_optional = ["errn-errp", "ul", "scan"] def __init__( self, n_sigma=1, n_sigma_ul=2, null_value=1e-150, selection_optional=None, fit=None, reoptimize=True, ): self.n_sigma = n_sigma self.n_sigma_ul = n_sigma_ul self.null_value = null_value self.selection_optional = selection_optional if fit is None: fit = Fit() self.fit = fit self.reoptimize = reoptimize def estimate_best_fit(self, datasets, parameter): """Estimate parameter asymmetric errors. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Datasets. 
parameter : `Parameter` For which parameter to get the value. Returns ------- result : dict Dictionary with the various parameter estimation values. Entries are: * parameter.name: best fit parameter value. * "stat": best fit total stat. * "success": boolean flag for fit success. * parameter.name_err: covariance-based error estimate on parameter value. """ value, total_stat, success, error = np.nan, 0.0, False, np.nan if np.any(datasets.contributes_to_stat): result = self.fit.run(datasets=datasets) value, error = parameter.value, parameter.error total_stat = result.optimize_result.total_stat success = result.success return { f"{parameter.name}": value, "stat": total_stat, "success": success, f"{parameter.name}_err": error * self.n_sigma, } def estimate_ts(self, datasets, parameter): """Estimate parameter ts. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Datasets. parameter : `Parameter` For which parameter to get the value. Returns ------- result : dict Dictionary with the test statistic of the best fit value compared to the null hypothesis. Entries are: * "ts" : fit statistic difference with null hypothesis. * "npred" : predicted number of counts per dataset. * "stat_null" : total stat corresponding to the null hypothesis """ npred = self.estimate_npred(datasets=datasets) if not np.any(datasets.contributes_to_stat): stat = np.nan npred["npred"][...] = np.nan else: stat = datasets.stat_sum() with datasets.parameters.restore_status(): # compute ts value parameter.value = self.null_value if self.reoptimize: parameter.frozen = True _ = self.fit.optimize(datasets=datasets) ts = datasets.stat_sum() - stat stat_null = datasets.stat_sum() return {"ts": ts, "npred": npred["npred"], "stat_null": stat_null} def estimate_errn_errp(self, datasets, parameter): """Estimate parameter asymmetric errors. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Datasets. parameter : `Parameter` For which parameter to get the value. Returns ------- result : dict Dictionary with the parameter asymmetric errors. Entries are: * {parameter.name}_errp : positive error on parameter value. * {parameter.name}_errn : negative error on parameter value. """ if not np.any(datasets.contributes_to_stat): return { f"{parameter.name}_errp": np.nan, f"{parameter.name}_errn": np.nan, } self.fit.optimize(datasets=datasets) res = self.fit.confidence( datasets=datasets, parameter=parameter, sigma=self.n_sigma, reoptimize=self.reoptimize, ) return { f"{parameter.name}_errp": res["errp"], f"{parameter.name}_errn": res["errn"], } def estimate_scan(self, datasets, parameter): """Estimate parameter statistic scan. Parameters ---------- datasets : `~gammapy.datasets.Datasets` The datasets used to estimate the model parameter. parameter : `~gammapy.modeling.Parameter` For which parameter to get the value. Returns ------- result : dict Dictionary with the parameter fit scan values. Entries are: * parameter.name_scan : parameter values scan. * "stat_scan" : fit statistic values scan. """ scan_values = parameter.scan_values if not np.any(datasets.contributes_to_stat): return { f"{parameter.name}_scan": scan_values, "stat_scan": scan_values * np.nan, } self.fit.optimize(datasets=datasets) profile = self.fit.stat_profile( datasets=datasets, parameter=parameter, reoptimize=self.reoptimize ) return { f"{parameter.name}_scan": scan_values, "stat_scan": profile["stat_scan"], } def estimate_ul(self, datasets, parameter): """Estimate parameter ul. 
Parameters ---------- datasets : `~gammapy.datasets.Datasets` The datasets used to estimate the model parameter. parameter : `~gammapy.modeling.Parameter` For which parameter to get the value. Returns ------- result : dict Dictionary with the parameter upper limits. Entries are: * parameter.name_ul : upper limit on parameter value. """ if not np.any(datasets.contributes_to_stat): return {f"{parameter.name}_ul": np.nan} self.fit.optimize(datasets=datasets) res = self.fit.confidence( datasets=datasets, parameter=parameter, sigma=self.n_sigma_ul, reoptimize=self.reoptimize, ) return {f"{parameter.name}_ul": res["errp"] + parameter.value} @staticmethod def estimate_counts(datasets): """Estimate counts for the flux point. Parameters ---------- datasets : Datasets Datasets. Returns ------- result : dict Dictionary with an array with one entry per dataset with the sum of the masked counts. """ counts = [] for dataset in datasets: mask = dataset.mask counts.append(dataset.counts.data[mask].sum()) return {"counts": np.array(counts, dtype=int), "datasets": datasets.names} @staticmethod def estimate_npred(datasets): """Estimate npred for the flux point. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Datasets. Returns ------- result : dict Dictionary with an array with one entry per dataset with the sum of the masked npred. """ npred = [] for dataset in datasets: mask = dataset.mask npred.append(dataset.npred().data[mask].sum()) return {"npred": np.array(npred), "datasets": datasets.names} def run(self, datasets, parameter): """Run the parameter estimator. Parameters ---------- datasets : `~gammapy.datasets.Datasets` The datasets used to estimate the model parameter. parameter : `str` or `~gammapy.modeling.Parameter` For which parameter to run the estimator. Returns ------- result : dict Dictionary with the various parameter estimation values. 
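Notes
-----
The entries of the result dictionary depend on `selection_optional`; see the individual ``estimate_*`` methods for the quantities added by each optional step.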
""" if not isinstance(datasets, DatasetsActor): datasets = Datasets(datasets) parameter = datasets.parameters[parameter] with datasets.parameters.restore_status(): if not self.reoptimize: datasets.parameters.freeze_all() parameter.frozen = False result = self.estimate_best_fit(datasets, parameter) result.update(self.estimate_ts(datasets, parameter)) if "errn-errp" in self.selection_optional: result.update(self.estimate_errn_errp(datasets, parameter)) if "ul" in self.selection_optional: result.update(self.estimate_ul(datasets, parameter)) if "scan" in self.selection_optional: result.update(self.estimate_scan(datasets, parameter)) result.update(self.estimate_counts(datasets)) return result ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2006419 gammapy-1.3/gammapy/estimators/points/0000755000175100001770000000000014721316215017560 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/__init__.py0000644000175100001770000000062514721316200021666 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .core import FluxPoints from .lightcurve import LightCurveEstimator from .profile import FluxProfileEstimator from .sed import FluxPointsEstimator from .sensitivity import SensitivityEstimator __all__ = [ "FluxPoints", "FluxPointsEstimator", "FluxProfileEstimator", "LightCurveEstimator", "SensitivityEstimator", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/core.py0000644000175100001770000007633514721316200021072 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from copy import deepcopy import numpy as np from scipy import stats from scipy.interpolate import interp1d from scipy.optimize import minimize from astropy.io import fits from astropy.io.registry import IORegistryError from astropy.table import Table, vstack from astropy.time import Time from astropy.visualization import quantity_support import matplotlib.pyplot as plt from gammapy.data import GTI from gammapy.maps import Map, MapAxis, Maps, RegionNDMap, TimeMapAxis from gammapy.maps.axes import UNIT_STRING_FORMAT, flat_if_equal from gammapy.modeling.models import TemplateSpectralModel from gammapy.modeling.models.spectral import scale_plot_flux from gammapy.modeling.scipy import stat_profile_ul_scipy from gammapy.utils.interpolation import interpolate_profile from gammapy.utils.scripts import make_path from gammapy.utils.table import table_standardise_units_copy from gammapy.utils.time import time_ref_to_dict from ..map.core import DEFAULT_UNIT, FluxMaps __all__ = ["FluxPoints"] log = logging.getLogger(__name__) def squash_fluxpoints(flux_point, axis): """Squash a `~FluxPoints` object into one point. The log-likelihoods profiles in each bin are summed to compute the resultant quantities. Stat profiles must be present on the `~FluxPoints` object for this method to work. 
""" value_scan = flux_point.stat_scan.geom.axes["norm"].center stat_scan = np.sum(flux_point.stat_scan.data, axis=0).ravel() f = interp1d(value_scan, stat_scan, kind="quadratic", bounds_error=False) f = interpolate_profile(value_scan, stat_scan) minimizer = minimize( f, x0=value_scan[int(len(value_scan) / 2)], bounds=[(value_scan[0], value_scan[-1])], method="L-BFGS-B", ) maps = Maps() geom = flux_point.geom.to_image() if axis.name != "energy": geom = geom.to_cube([flux_point.geom.axes["energy"]]) maps["norm"] = Map.from_geom(geom, data=minimizer.x) maps["norm_err"] = Map.from_geom( geom, data=np.sqrt(minimizer.hess_inv.todense()).reshape(geom.data_shape) ) maps["n_dof"] = Map.from_geom(geom, data=flux_point.geom.axes[axis.name].nbin) if "norm_ul" in flux_point.available_quantities: delta_ts = flux_point.meta.get("n_sigma_ul", 2) ** 2 ul = stat_profile_ul_scipy(value_scan, stat_scan, delta_ts=delta_ts) maps["norm_ul"] = Map.from_geom(geom, data=ul.value) maps["stat"] = Map.from_geom(geom, data=f(minimizer.x)) geom_scan = geom.to_cube([MapAxis.from_nodes(value_scan, name="norm")]) maps["stat_scan"] = Map.from_geom( geom=geom_scan, data=stat_scan.reshape(geom_scan.data_shape) ) try: maps["stat_null"] = Map.from_geom(geom, data=np.sum(flux_point.stat_null.data)) maps["ts"] = maps["stat_null"] - maps["stat"] except AttributeError: log.info( "Stat null info not present on original FluxPoints object. TS not computed" ) maps["success"] = Map.from_geom(geom=geom, data=minimizer.success, dtype=bool) combined_fp = FluxPoints.from_maps( maps=maps, sed_type=flux_point.sed_type_init, reference_model=flux_point.reference_model, gti=flux_point.gti, meta=flux_point.meta, ) return combined_fp class FluxPoints(FluxMaps): """Flux points container. The supported formats are described here: :ref:`gadf:flux-points`. In summary, the following formats and minimum required columns are: * Format ``dnde``: columns ``e_ref`` and ``dnde`` * Format ``e2dnde``: columns ``e_ref``, ``e2dnde`` * Format ``flux``: columns ``e_min``, ``e_max``, ``flux`` * Format ``eflux``: columns ``e_min``, ``e_max``, ``eflux`` Parameters ---------- table : `~astropy.table.Table` Table with flux point data. Attributes ---------- table : `~astropy.table.Table` Table with flux point data. Examples -------- The `FluxPoints` object is most easily created by reading a file with flux points given in one of the formats documented above:: >>> from gammapy.estimators import FluxPoints >>> filename = '$GAMMAPY_DATA/hawc_crab/HAWC19_flux_points.fits' >>> flux_points = FluxPoints.read(filename) >>> flux_points.plot() #doctest: +SKIP An instance of `FluxPoints` can also be created by passing an instance of `astropy.table.Table`, which contains the required columns, such as `'e_ref'` and `'dnde'`. 
The corresponding `sed_type` has to be defined in the meta data of the table:: >>> import numpy as np >>> from astropy import units as u >>> from astropy.table import Table >>> from gammapy.estimators import FluxPoints >>> from gammapy.modeling.models import PowerLawSpectralModel >>> table = Table() >>> pwl = PowerLawSpectralModel() >>> e_ref = np.geomspace(1, 100, 7) * u.TeV >>> table["e_ref"] = e_ref >>> table["dnde"] = pwl(e_ref) >>> table["dnde_err"] = pwl.evaluate_error(e_ref)[0] >>> table.meta["SED_TYPE"] = "dnde" >>> flux_points = FluxPoints.from_table(table) >>> flux_points.plot(sed_type="flux") #doctest: +SKIP If you have flux points in a different data format, the format can be changed by renaming the table columns and adding meta data:: >>> from astropy import units as u >>> from astropy.table import Table >>> from gammapy.estimators import FluxPoints >>> from gammapy.utils.scripts import make_path >>> filename = make_path('$GAMMAPY_DATA/tests/spectrum/flux_points/flux_points_ctb_37b.txt') >>> table = Table.read(filename ,format='ascii.csv', delimiter=' ', comment='#') >>> table.rename_column('Differential_Flux', 'dnde') >>> table['dnde'].unit = 'cm-2 s-1 TeV-1' >>> table.rename_column('lower_error', 'dnde_errn') >>> table['dnde_errn'].unit = 'cm-2 s-1 TeV-1' >>> table.rename_column('upper_error', 'dnde_errp') >>> table['dnde_errp'].unit = 'cm-2 s-1 TeV-1' >>> table.rename_column('E', 'e_ref') >>> table['e_ref'].unit = 'TeV' >>> flux_points = FluxPoints.from_table(table, sed_type="dnde") >>> flux_points.plot(sed_type="e2dnde") #doctest: +SKIP Note: In order to reproduce the example you need the tests datasets folder. You may download it with the command ``gammapy download datasets --tests --out $GAMMAPY_DATA`` """ @classmethod def read( cls, filename, sed_type=None, format=None, reference_model=None, checksum=False, table_format="ascii.ecsv", **kwargs, ): """Read precomputed flux points. Parameters ---------- filename : str Filename. sed_type : {"dnde", "flux", "eflux", "e2dnde", "likelihood"} SED type. format : {"gadf-sed", "lightcurve", "profile"}, optional Format string. If None, the format is extracted from the input. Default is None. reference_model : `SpectralModel` Reference spectral model. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. table_format : str Format string for the ~astropy.Table object. Default is "ascii.ecsv" **kwargs : dict, optional Keyword arguments passed to `astropy.table.Table.read`. Returns ------- flux_points : `FluxPoints` Flux points. """ filename = make_path(filename) gti = None try: table = Table.read(filename, format=table_format, **kwargs) except (IORegistryError, UnicodeDecodeError): with fits.open(filename, checksum=checksum) as hdulist: if "FLUXPOINTS" in hdulist: fp = hdulist["FLUXPOINTS"] else: fp = hdulist[""] # to handle older files table = Table.read(fp) if "GTI" in hdulist: gti = GTI.from_table_hdu(hdulist["GTI"]) return cls.from_table( table=table, sed_type=sed_type, reference_model=reference_model, format=format, gti=gti, ) def write( self, filename, sed_type=None, format=None, overwrite=False, checksum=False ): """Write flux points. Parameters ---------- filename : str Filename. sed_type : {"dnde", "flux", "eflux", "e2dnde", "likelihood"}, optional SED type. Default is None. format : {"gadf-sed", "lightcurve", "binned-time-series", "profile"}, optional Format specification. 
The following formats are supported: * "gadf-sed": format for SED flux points see :ref:`gadf:flux-points` for details * "lightcurve": Gammapy internal format to store energy dependent lightcurves. Basically a generalisation of the "gadf" format, but currently there is no detailed documentation available. * "binned-time-series": table format support by Astropy's `~astropy.timeseries.BinnedTimeSeries`. * "profile": Gammapy internal format to store energy dependent flux profiles. Basically a generalisation of the "gadf" format, but currently there is no detailed documentation available. If None, the format will be guessed by looking at the axes that are present in the object. Default is None. overwrite : bool, optional Overwrite existing file. Default is False. checksum : bool, optional When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. """ filename = make_path(filename) if sed_type is None: sed_type = self.sed_type_init table = self.to_table(sed_type=sed_type, format=format) if ".fits" not in filename.suffixes: table.write(filename, overwrite=overwrite) return primary_hdu = fits.PrimaryHDU() hdu_evt = fits.BinTableHDU(table, name="FLUXPOINTS") hdu_all = fits.HDUList([primary_hdu, hdu_evt]) if self.gti: hdu_all.append(self.gti.to_table_hdu()) hdu_all.writeto(filename, overwrite=overwrite, checksum=checksum) @staticmethod def _convert_loglike_columns(table): # TODO: check sign and factor 2 here # https://github.com/gammapy/gammapy/pull/2546#issuecomment-554274318 # The idea below is to support the format here: # https://gamma-astro-data-formats.readthedocs.io/en/latest/spectra/flux_points/index.html#likelihood-columns # but internally to go to the uniform "stat" if "loglike" in table.colnames and "stat" not in table.colnames: table["stat"] = 2 * table["loglike"] if "loglike_null" in table.colnames and "stat_null" not in table.colnames: table["stat_null"] = 2 * table["loglike_null"] if "dloglike_scan" in table.colnames and "stat_scan" not in table.colnames: table["stat_scan"] = 2 * table["dloglike_scan"] return table @staticmethod def _table_guess_format(table): """Format of the table to be transformed to FluxPoints.""" names = table.colnames if "time_min" in names: return "lightcurve" elif "x_min" in names: return "profile" else: return "gadf-sed" @classmethod def from_table( cls, table, sed_type=None, format=None, reference_model=None, gti=None ): """Create flux points from a table. The table column names must be consistent with the sed_type. Parameters ---------- table : `~astropy.table.Table` Table. sed_type : {"dnde", "flux", "eflux", "e2dnde", "likelihood"}, optional SED type. Default is None. format : {"gadf-sed", "lightcurve", "profile"}, optional Table format. If None, it is extracted from the table column content. Default is None. reference_model : `SpectralModel`, optional Reference spectral model. Default is None. gti : `GTI`, optional Good time intervals. Default is None. Returns ------- flux_points : `FluxPoints` Flux points. 
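Examples
--------
See the `FluxPoints` class docstring for an example of building a compatible table and creating flux points with ``from_table``.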
""" table = table_standardise_units_copy(table) if format is None: format = cls._table_guess_format(table) log.info("Inferred format: " + format) if sed_type is None: sed_type = table.meta.get("SED_TYPE", None) if sed_type is None: sed_type = cls._guess_sed_type(table.colnames) if sed_type is None: raise ValueError("Specifying the SED type is required") if sed_type == "likelihood": table = cls._convert_loglike_columns(table) if reference_model is None: reference_model = TemplateSpectralModel( energy=flat_if_equal(table["e_ref"].quantity), values=flat_if_equal(table["ref_dnde"].quantity), ) maps = Maps() table.meta.setdefault("SED_TYPE", sed_type) for name in cls.all_quantities(sed_type=sed_type): if name in table.colnames: maps[name] = RegionNDMap.from_table( table=table, colname=name, format=format ) meta = cls._get_meta_gadf(table) return cls.from_maps( maps=maps, reference_model=reference_model, meta=meta, sed_type=sed_type, gti=gti, ) @staticmethod def _get_meta_gadf(table): meta = {} meta.update(table.meta) conf_ul = table.meta.get("UL_CONF") if conf_ul: n_sigma_ul = np.round(stats.norm.isf(0.5 * (1 - conf_ul)), 1) meta["n_sigma_ul"] = n_sigma_ul meta["sed_type_init"] = table.meta.get("SED_TYPE") return meta @staticmethod def _format_table(table): """Format table.""" for column in table.colnames: if column.startswith(("dnde", "eflux", "flux", "e2dnde", "ref")): table[column].format = ".3e" elif column.startswith( ("e_min", "e_max", "e_ref", "sqrt_ts", "norm", "ts", "stat") ): table[column].format = ".3f" return table def _guess_format(self): """Format of the FluxPoints object.""" names = self.geom.axes.names if "time" in names: return "lightcurve" elif "projected-distance" in names: return "profile" else: return "gadf-sed" def to_table(self, sed_type=None, format=None, formatted=False): """Create table for a given SED type. Parameters ---------- sed_type : {"likelihood", "dnde", "e2dnde", "flux", "eflux"} SED type to convert to. Default is `likelihood`. format : {"gadf-sed", "lightcurve", "binned-time-series", "profile"}, optional Format specification. The following formats are supported: * "gadf-sed": format for SED flux points see :ref:`gadf:flux-points` for details * "lightcurve": Gammapy internal format to store energy dependent lightcurves. Basically a generalisation of the "gadf" format, but currently there is no detailed documentation available. * "binned-time-series": table format support by Astropy's `~astropy.timeseries.BinnedTimeSeries`. * "profile": Gammapy internal format to store energy dependent flux profiles. Basically a generalisation of the "gadf" format, but currently there is no detailed documentation available. If None, the format will be guessed by looking at the axes that are present in the object. Default is None. formatted : bool Formatted version with column formats applied. Numerical columns are formatted to .3f and .3e respectively. Default is False. Returns ------- table : `~astropy.table.Table` Flux points table. 
Examples -------- This is how to read and plot example flux points: >>> from gammapy.estimators import FluxPoints >>> fp = FluxPoints.read("$GAMMAPY_DATA/hawc_crab/HAWC19_flux_points.fits") >>> table = fp.to_table(sed_type="flux", formatted=True) >>> print(table[:2]) e_ref e_min e_max flux flux_err flux_ul ts sqrt_ts is_ul TeV TeV TeV 1 / (s cm2) 1 / (s cm2) 1 / (s cm2) ----- ----- ----- ----------- ----------- ----------- -------- ------- ----- 1.334 1.000 1.780 1.423e-11 3.135e-13 nan 2734.000 52.288 False 2.372 1.780 3.160 5.780e-12 1.082e-13 nan 4112.000 64.125 False """ if sed_type is None: sed_type = self.sed_type_init if format is None: format = self._guess_format() log.info("Inferred format: " + format) if format == "gadf-sed": # TODO: what to do with GTI info? if not self.geom.axes.names == ["energy"]: raise ValueError( "Only flux points with a single energy axis " "can be converted to 'gadf-sed'" ) idx = (Ellipsis, 0, 0) table = self.energy_axis.to_table(format="gadf-sed") table.meta["SED_TYPE"] = sed_type if not self.is_convertible_to_flux_sed_type: table.remove_columns(["e_min", "e_max"]) if self.n_sigma_ul: table.meta["UL_CONF"] = np.round( 1 - 2 * stats.norm.sf(self.n_sigma_ul), 7 ) if sed_type == "likelihood": table["ref_dnde"] = self.dnde_ref[idx] table["ref_flux"] = self.flux_ref[idx] table["ref_eflux"] = self.eflux_ref[idx] for quantity in self.all_quantities(sed_type=sed_type): data = getattr(self, quantity, None) if data: table[quantity] = data.quantity[idx] if self.has_stat_profiles: norm_axis = self.stat_scan.geom.axes["norm"] table["norm_scan"] = norm_axis.center.reshape((1, -1)) table["stat"] = self.stat.data[idx] table["stat_scan"] = self.stat_scan.data[idx] table["is_ul"] = self.is_ul.data[idx] if not self.has_ul: table.remove_columns("is_ul") elif format == "lightcurve": time_axis = self.geom.axes["time"] tables = [] for idx, (time_min, time_max) in enumerate(time_axis.iter_by_edges): table_flat = Table() table_flat["time_min"] = [time_min.mjd] table_flat["time_max"] = [time_max.mjd] fp = self.slice_by_idx(slices={"time": idx}) table = fp.to_table(sed_type=sed_type, format="gadf-sed") for column in table.columns: table_flat[column] = table[column][np.newaxis] tables.append(table_flat) table = vstack(tables) # serialize with reference time set to mjd=0.0 ref_time = Time(0.0, format="mjd", scale=time_axis.reference_time.scale) table.meta.update(time_ref_to_dict(ref_time, scale=ref_time.scale)) table.meta["TIMEUNIT"] = "d" elif format == "binned-time-series": message = ( "Format 'binned-time-series' supports a single time axis " f"only. 
Got {self.geom.axes.names}" ) if not self.geom.axes.is_unidimensional: raise ValueError(message) axis = self.geom.axes.primary_axis if not isinstance(axis, TimeMapAxis): raise ValueError(message) table = Table() table["time_bin_start"] = axis.time_min table["time_bin_size"] = axis.time_delta for quantity in self.all_quantities(sed_type=sed_type): data = getattr(self, quantity, None) if data: table[quantity] = data.quantity.squeeze() elif format == "profile": x_axis = self.geom.axes["projected-distance"] tables = [] for idx, (x_min, x_max) in enumerate(x_axis.iter_by_edges): table_flat = Table() table_flat["x_min"] = [x_min] table_flat["x_max"] = [x_max] table_flat["x_ref"] = [(x_max + x_min) / 2] fp = self.slice_by_idx(slices={"projected-distance": idx}) table = fp.to_table(sed_type=sed_type, format="gadf-sed") for column in table.columns: table_flat[column] = table[column][np.newaxis] tables.append(table_flat) table = vstack(tables) else: raise ValueError(f"Not a supported format: {format}") if formatted: table = self._format_table(table=table) return table @staticmethod def _energy_ref_lafferty(model, energy_min, energy_max): """Helper for `to_sed_type`. Compute energy_ref such that the value at energy_ref corresponds to the mean value between energy_min and energy_max. """ flux = model.integral(energy_min, energy_max) dnde_mean = flux / (energy_max - energy_min) return model.inverse(dnde_mean) def _plot_get_flux_err(self, sed_type=None): """Compute flux error for given SED type.""" y_errn, y_errp = None, None if "norm_err" in self.available_quantities: # symmetric error y_errn = getattr(self, sed_type + "_err") y_errp = y_errn.copy() if "norm_errp" in self.available_quantities: y_errn = getattr(self, sed_type + "_errn") y_errp = getattr(self, sed_type + "_errp") return y_errn, y_errp def plot(self, ax=None, sed_type=None, energy_power=0, time_format="iso", **kwargs): """Plot flux points. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axis object to plot on. Default is None. sed_type : {"dnde", "flux", "eflux", "e2dnde"}, optional SED type. Default is None. energy_power : float, optional Power of energy to multiply flux axis with. Default is 0. time_format : {"iso", "mjd"} Time format to use if a time axis is present. Default is "iso". **kwargs : dict, optional Keyword arguments passed to `~RegionNDMap.plot`. Returns ------- ax : `~matplotlib.axes.Axes` Axis object. 
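Examples -------- A short sketch, assuming the example flux points file shipped with ``$GAMMAPY_DATA`` is available:: >>> from gammapy.estimators import FluxPoints >>> fp = FluxPoints.read("$GAMMAPY_DATA/hawc_crab/HAWC19_flux_points.fits") >>> ax = fp.plot(sed_type="e2dnde") # doctest: +SKIP 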
""" if sed_type is None: sed_type = self.sed_type_plot_default if not self.norm.geom.is_region: raise ValueError("Plotting only supported for region based flux points") if ax is None: ax = plt.gca() flux_unit = DEFAULT_UNIT[sed_type] flux = getattr(self, sed_type) # get errors and ul y_errn, y_errp = self._plot_get_flux_err(sed_type=sed_type) is_ul = self.is_ul.data if self.has_ul and y_errn and is_ul.any(): flux_ul = getattr(self, sed_type + "_ul").quantity y_errn.data[is_ul] = np.clip( 0.5 * flux_ul[is_ul].to_value(y_errn.unit), 0, np.inf ) y_errp.data[is_ul] = 0 flux.data[is_ul] = flux_ul[is_ul].to_value(flux.unit) kwargs.setdefault("uplims", is_ul) # set flux points plotting defaults if y_errp and y_errn: y_errp = np.clip( scale_plot_flux(y_errp, energy_power=energy_power).quantity, 0, np.inf ) y_errn = np.clip( scale_plot_flux(y_errn, energy_power=energy_power).quantity, 0, np.inf ) kwargs.setdefault("yerr", (y_errn, y_errp)) else: kwargs.setdefault("yerr", None) flux = scale_plot_flux(flux=flux.to_unit(flux_unit), energy_power=energy_power) if "time" in flux.geom.axes_names: flux.geom.axes["time"].time_format = time_format ax = flux.plot(ax=ax, **kwargs) else: ax = flux.plot(ax=ax, **kwargs) ax.set_xlabel(f"Energy [{ax.xaxis.units.to_string(UNIT_STRING_FORMAT)}]") ax.set_xscale("log", nonpositive="clip") self._plot_format_yax(ax=ax, energy_power=energy_power, sed_type=sed_type) if len(flux.geom.axes) > 1: ax.legend() return ax @staticmethod def _plot_format_yax(ax, energy_power, sed_type): if energy_power > 0: ax.set_ylabel( f"e{energy_power} * {sed_type} [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) else: ax.set_ylabel( f"{sed_type} [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) ax.set_yscale("log", nonpositive="clip") def plot_ts_profiles( self, ax=None, sed_type=None, add_cbar=True, **kwargs, ): """Plot fit statistic SED profiles as a density plot. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axis object to plot on. Default is None. sed_type : {"dnde", "flux", "eflux", "e2dnde"}, optional SED type. Default is None. add_cbar : bool, optional Whether to add a colorbar to the plot. Default is True. **kwargs : dict, optional Keyword arguments passed to `~matplotlib.pyplot.pcolormesh`. Returns ------- ax : `~matplotlib.axes.Axes` Axis object. 
""" if ax is None: ax = plt.gca() if sed_type is None: sed_type = self.sed_type_plot_default if not self.norm.geom.is_region: raise ValueError("Plotting only supported for region based flux points") if not self.geom.axes.is_unidimensional: raise ValueError( "Profile plotting is only supported for unidimensional maps" ) axis = self.geom.axes.primary_axis if isinstance(axis, TimeMapAxis) and not axis.is_contiguous: axis = axis.to_contiguous() if ax.yaxis.units is None: yunits = DEFAULT_UNIT[sed_type] else: yunits = ax.yaxis.units ax.yaxis.set_units(yunits) flux_ref = getattr(self, sed_type + "_ref").to(yunits) ts = self.ts_scan norm_min, norm_max = ts.geom.axes["norm"].bounds.to_value("") flux = MapAxis.from_bounds( norm_min * flux_ref.value.min(), norm_max * flux_ref.value.max(), nbin=500, interp=axis.interp, unit=flux_ref.unit, ) norm = flux.center / flux_ref.reshape((-1, 1)) coords = ts.geom.get_coord() coords["norm"] = norm coords[axis.name] = axis.center.reshape((-1, 1)) z = ts.interp_by_coord(coords, values_scale="stat-profile") kwargs.setdefault("vmax", 0) kwargs.setdefault("vmin", -4) kwargs.setdefault("zorder", 0) kwargs.setdefault("cmap", "Blues") kwargs.setdefault("linewidths", 0) kwargs.setdefault("shading", "auto") # clipped values are set to NaN so that they appear white on the plot z[-z < kwargs["vmin"]] = np.nan with quantity_support(): caxes = ax.pcolormesh(axis.as_plot_edges, flux.edges, -z.T, **kwargs) axis.format_plot_xaxis(ax=ax) ax.set_ylabel(f"{sed_type} [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]") ax.set_yscale("log") if add_cbar: label = "Fit statistic difference" ax.figure.colorbar(caxes, ax=ax, label=label) return ax def recompute_ul(self, n_sigma_ul=2, **kwargs): """Recompute upper limits corresponding to the given value. The pre-computed statistic profiles must exist for the re-computation. Parameters ---------- n_sigma_ul : int Number of sigma to use for upper limit computation. Default is 2. **kwargs : dict, optional Keyword arguments passed to `~scipy.optimize.brentq`. Returns ------- flux_points : `~gammapy.estimators.FluxPoints` A new FluxPoints object with modified upper limits. Examples -------- >>> from gammapy.estimators import FluxPoints >>> filename = '$GAMMAPY_DATA/tests/spectrum/flux_points/binlike.fits' >>> flux_points = FluxPoints.read(filename) >>> flux_points_recomputed = flux_points.recompute_ul(n_sigma_ul=3) >>> print(flux_points.meta["n_sigma_ul"], flux_points.flux_ul.data[0]) 2.0 [[3.95451985e-09]] >>> print(flux_points_recomputed.meta["n_sigma_ul"], flux_points_recomputed.flux_ul.data[0]) 3 [[6.22245374e-09]] """ if not self.has_stat_profiles: raise ValueError( "Stat profiles not present. Upper limit computation is not possible" ) delta_ts = n_sigma_ul**2 flux_points = deepcopy(self) value_scan = self.stat_scan.geom.axes["norm"].center shape_axes = self.stat_scan.geom._shape[slice(3, None)][::-1] for idx in np.ndindex(shape_axes): stat_scan = np.abs( self.stat_scan.data[idx].squeeze() - self.stat.data[idx].squeeze() ) flux_points.norm_ul.data[idx] = stat_profile_ul_scipy( value_scan, stat_scan, delta_ts=delta_ts, **kwargs ) flux_points.meta["n_sigma_ul"] = n_sigma_ul return flux_points def resample_axis(self, axis_new): """Rebin the flux point object along the new axis. The log-likelihoods profiles in each bin are summed to compute the resultant quantities. Stat profiles must be present on the `~gammapy.estimators.FluxPoints` object for this method to work. For now, works only for flat `~gammapy.estimators.FluxPoints`. 
Parameters ---------- axis_new : `MapAxis` or `TimeMapAxis` The new axis to resample along. Returns ------- flux_points : `~gammapy.estimators.FluxPoints` A new FluxPoints object with modified axis. """ if not self.has_stat_profiles: raise ValueError("Stat profiles not present, rebinning is not possible") fluxpoints = [] for edge_min, edge_max in zip(axis_new.edges_min, axis_new.edges_max): if isinstance(axis_new, TimeMapAxis): edge_min = edge_min + axis_new.reference_time edge_max = edge_max + axis_new.reference_time fp = self.slice_by_coord({axis_new.name: slice(edge_min, edge_max)}) fp_new = squash_fluxpoints(fp, axis_new) fluxpoints.append(fp_new) return self.__class__.from_stack(fluxpoints, axis=axis_new) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/lightcurve.py0000644000175100001770000002006614721316200022304 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from itertools import repeat import numpy as np import astropy.units as u import gammapy.utils.parallel as parallel from gammapy.data import GTI from gammapy.datasets import Datasets from gammapy.datasets.actors import DatasetsActor from gammapy.maps import LabelMapAxis, Map, TimeMapAxis from gammapy.utils.pbar import progress_bar from .core import FluxPoints from .sed import FluxPointsEstimator __all__ = ["LightCurveEstimator"] log = logging.getLogger(__name__) class LightCurveEstimator(FluxPointsEstimator): """Estimate light curve. The estimator will apply flux point estimation on the source model component to datasets in each of the provided time intervals. The normalisation, `norm`, is the only parameter of the source model left free to vary. Other model components can be left free to vary with the reoptimize option. If no time intervals are provided, the estimator will use the time intervals defined by the datasets GTIs. To be included in the estimation, a dataset must have its GTI fully overlapping a time interval. Time intervals without any dataset GTI fully overlapping will be dropped. They will not be stored in the final lightcurve `FluxPoints` object. Parameters ---------- time_intervals : list of `astropy.time.Time` Start and stop time for each interval to compute the LC. source : str or int For which source in the model to compute the flux points. Default is 0. atol : `~astropy.units.Quantity` Tolerance value for time comparison with different scales. Default is 1e-6 s. n_sigma : int Number of sigma to use for asymmetric error computation. Default is 1. n_sigma_ul : int Number of sigma to use for upper limit computation. Default is 2. selection_optional : list of str, optional Which steps to execute. Available options are: * "all": all the optional steps are executed. * "errn-errp": estimate asymmetric errors. * "ul": estimate upper limits. * "scan": estimate fit statistic profiles. Default is None so the optional steps are not executed. energy_edges : list of `~astropy.units.Quantity`, optional Edges of the lightcurve energy bins. The resulting bin edges won't be exactly equal to the input ones, but rather the closest values to the energy axis edges of the parent dataset. Default is None: apply the estimator in each energy bin of the parent dataset. For further explanation see :ref:`estimators`. fit : `Fit` Fit instance specifying the backend and fit options. 
reoptimize : bool If True, the free parameters of the other models are fitted in each bin independently, together with the norm of the source of interest (but the other parameters of the source of interest are kept frozen). If False, only the norm of the source of interest is fitted, and all other parameters are frozen at their current values. n_jobs : int Number of processes used in parallel for the computation. Default is one, unless `~gammapy.utils.parallel.N_JOBS_DEFAULT` was modified. The number of jobs is limited to the number of physical CPUs. parallel_backend : {"multiprocessing", "ray"} Which backend to use for multiprocessing. Defaults to `~gammapy.utils.parallel.BACKEND_DEFAULT`. norm : `~gammapy.modeling.Parameter` or dict Norm parameter used for the fit. Default is None and a new parameter is created automatically, with value=1, name="norm", scan_min=0.2, scan_max=5, and scan_n_values = 11. By default the min and max are not set and derived from the source model, unless the source model does not have one and only one norm parameter. If a dict is given, the entries should be a subset of `~gammapy.modeling.Parameter` arguments. Examples -------- For a usage example see :doc:`/tutorials/analysis-time/light_curve` tutorial. """ tag = "LightCurveEstimator" def __init__(self, time_intervals=None, atol="1e-6 s", **kwargs): self.time_intervals = time_intervals self.atol = u.Quantity(atol) super().__init__(**kwargs) def run(self, datasets): """Run light curve extraction. Normalize integral and energy flux between emin and emax. Notes ----- The progress bar can be displayed for this function. Parameters ---------- datasets : list of `~gammapy.datasets.SpectrumDataset` or `~gammapy.datasets.MapDataset` Spectrum or Map datasets. Returns ------- lightcurve : `~gammapy.estimators.FluxPoints` Light curve flux points. """ if not isinstance(datasets, DatasetsActor): datasets = Datasets(datasets) if self.time_intervals is None: gti = datasets.gti else: gti = GTI.from_time_intervals(self.time_intervals) gti = gti.union(overlap_ok=False, merge_equal=False) rows = [] valid_intervals = [] parallel_datasets = [] dataset_names = datasets.names for t_min, t_max in progress_bar( gti.time_intervals, desc="Time intervals selection" ): datasets_to_fit = datasets.select_time( time_min=t_min, time_max=t_max, atol=self.atol ) if len(datasets_to_fit) == 0: log.info( f"No Dataset for the time interval {t_min} to {t_max}. Skipping interval." ) continue valid_intervals.append([t_min, t_max]) if self.n_jobs == 1: fp = self.estimate_time_bin_flux(datasets_to_fit, dataset_names) rows.append(fp) else: parallel_datasets.append(datasets_to_fit) if self.n_jobs > 1: self._update_child_jobs() rows = parallel.run_multiprocessing( self.estimate_time_bin_flux, zip( parallel_datasets, repeat(dataset_names), ), backend=self.parallel_backend, pool_kwargs=dict(processes=self.n_jobs), task_name="Time intervals", ) if len(rows) == 0: raise ValueError("LightCurveEstimator: No datasets in time intervals") gti = GTI.from_time_intervals(valid_intervals) axis = TimeMapAxis.from_gti(gti=gti) return FluxPoints.from_stack( maps=rows, axis=axis, ) @staticmethod def expand_map(m, dataset_names): """Expand map in dataset axis. Parameters ---------- m : `Map` Map to expand. dataset_names : list of str Dataset names. Returns ------- map : `Map` Expanded map. 
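Examples -------- An illustrative sketch; the region string and dataset names here are hypothetical:: >>> from gammapy.estimators import LightCurveEstimator >>> from gammapy.maps import LabelMapAxis, RegionNDMap >>> axis = LabelMapAxis(labels=["dataset_1"], name="dataset") >>> m = RegionNDMap.create(region="icrs;circle(83.63, 22.01, 0.5)", axes=[axis]) >>> expanded = LightCurveEstimator.expand_map(m, ["dataset_1", "dataset_2"]) # doctest: +SKIP 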
""" label_axis = LabelMapAxis(labels=dataset_names, name="dataset") geom = m.geom.replace_axis(axis=label_axis) result = Map.from_geom(geom, data=np.nan) coords = m.geom.get_coord(sparse=True) result.set_by_coord(coords, vals=m.data) return result def estimate_time_bin_flux(self, datasets, dataset_names=None): """Estimate flux point for a single energy group. Parameters ---------- datasets : `~gammapy.modeling.Datasets` List of dataset objects. Returns ------- result : `FluxPoints` Resulting flux points. """ estimator = self.copy() estimator.n_jobs = self._n_child_jobs fp = estimator._run_flux_points(datasets) if dataset_names: for name in ["counts", "npred", "npred_excess"]: fp._data[name] = self.expand_map( fp._data[name], dataset_names=dataset_names ) return fp def _run_flux_points(self, datasets): return super().run(datasets) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/profile.py0000644000175100001770000001506114721316200021567 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Tools to create profiles (i.e. 1D "slices" from 2D images).""" from itertools import repeat import numpy as np from astropy import units as u from regions import CircleAnnulusSkyRegion import gammapy.utils.parallel as parallel from gammapy.datasets import Datasets from gammapy.maps import MapAxis from gammapy.modeling.models import PowerLawSpectralModel, SkyModel from .core import FluxPoints from .sed import FluxPointsEstimator from gammapy.utils.deprecation import deprecated_renamed_argument __all__ = ["FluxProfileEstimator"] class FluxProfileEstimator(FluxPointsEstimator): """Estimate flux profiles. Parameters ---------- regions : list of `~regions.SkyRegion` Regions to use. spectral_model : `~gammapy.modeling.models.SpectralModel`, optional Spectral model to compute the fluxes or brightness. Default is power-law with spectral index of 2. n_jobs : int, optional Number of processes used in parallel for the computation. Default is one, unless `~gammapy.utils.parallel.N_JOBS_DEFAULT` was modified. The number of jobs is limited to the number of physical CPUs. parallel_backend : {"multiprocessing", "ray"}, optional Which backend to use for multiprocessing. Defaults to `~gammapy.utils.parallel.BACKEND_DEFAULT`. **kwargs : dict, optional Keywords forwarded to the `FluxPointsEstimator` (see documentation there for further description of valid keywords). Examples -------- This example shows how to compute a counts profile for the Fermi galactic center region:: >>> from astropy import units as u >>> from astropy.coordinates import SkyCoord >>> from gammapy.data import GTI >>> from gammapy.estimators import FluxProfileEstimator >>> from gammapy.utils.regions import make_orthogonal_rectangle_sky_regions >>> from gammapy.datasets import MapDataset >>> from gammapy.maps import RegionGeom >>> # load example data >>> filename = "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc.fits.gz" >>> dataset = MapDataset.read(filename, name="fermi-dataset") >>> # configuration >>> dataset.gti = GTI.create("0s", "1e7s", "2010-01-01") >>> # creation of the boxes and axis >>> start_pos = SkyCoord("-1d", "0d", frame='galactic') >>> end_pos = SkyCoord("1d", "0d", frame='galactic') >>> regions = make_orthogonal_rectangle_sky_regions( ... start_pos=start_pos, ... end_pos=end_pos, ... wcs=dataset.counts.geom.wcs, ... height=2 * u.deg, ... nbin=21 ... 
) >>> # set up profile estimator and run >>> prof_maker = FluxProfileEstimator(regions=regions, energy_edges=[10, 2000] * u.GeV) >>> fermi_prof = prof_maker.run(dataset) >>> print(fermi_prof) FluxPoints ---------- geom : RegionGeom axes : ['lon', 'lat', 'energy', 'projected-distance'] shape : (1, 1, 1, 21) quantities : ['norm', 'norm_err', 'ts', 'npred', 'npred_excess', 'stat', 'stat_null', 'counts', 'success'] ref. model : pl n_sigma : 1 n_sigma_ul : 2 sqrt_ts_threshold_ul : 2 sed type init : likelihood """ tag = "FluxProfileEstimator" @deprecated_renamed_argument("spectrum", "spectral_model", "v1.3") def __init__(self, regions, spectral_model=None, **kwargs): if len(regions) <= 1: raise ValueError( "Please provide at least two regions for flux profile estimation." ) self.regions = regions if spectral_model is None: spectral_model = PowerLawSpectralModel() self.spectral_model = spectral_model super().__init__(**kwargs) @property def projected_distance_axis(self): """Get the projected distance from the first region. For a normal region this is defined as the angular separation between the center of the first region and the center of the region. For annulus-shaped regions it is the mean of the inner and outer radius. Returns ------- axis : `MapAxis` Projected distance axis. """ distances = [] center = self.regions[0].center for region in self.regions: if isinstance(region, CircleAnnulusSkyRegion): distance = (region.inner_radius + region.outer_radius) / 2.0 else: distance = center.separation(region.center) distances.append(distance) return MapAxis.from_nodes( u.Quantity(distances, "deg"), name="projected-distance" ) def run(self, datasets): """Run flux profile estimation. Parameters ---------- datasets : list of `~gammapy.datasets.MapDataset` Map datasets. Returns ------- profile : `~gammapy.estimators.FluxPoints` Profile flux points. 
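Examples -------- See the class docstring above for a complete, runnable example; schematically, assuming ``regions`` is a list of sky regions and ``dataset`` a `~gammapy.datasets.MapDataset`:: >>> from astropy import units as u >>> from gammapy.estimators import FluxProfileEstimator >>> profile = FluxProfileEstimator(regions=regions, energy_edges=[10, 2000] * u.GeV).run(dataset) # doctest: +SKIP 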
""" datasets = Datasets(datasets=datasets) maps = parallel.run_multiprocessing( self._run_region, zip(repeat(datasets), self.regions), backend=self.parallel_backend, pool_kwargs=dict(processes=self.n_jobs), task_name="Flux profile estimation", ) return FluxPoints.from_stack( maps=maps, axis=self.projected_distance_axis, ) def _run_region(self, datasets, region): # TODO: test if it would be more efficient # to not compute datasets_to_fit in parallel # to avoid copying the full datasets datasets_to_fit = datasets.to_spectrum_datasets(region=region) for dataset_spec, dataset_map in zip(datasets_to_fit, datasets): dataset_spec.background.data = ( dataset_map.npred() .to_region_nd_map(region, func=np.sum, weights=dataset_map.mask_safe) .data ) datasets_to_fit.models = SkyModel(self.spectral_model, name="test-source") estimator = self.copy() estimator.n_jobs = self._n_child_jobs return estimator._run_flux_points(datasets_to_fit) def _run_flux_points(self, datasets): return super().run(datasets) @property def config_parameters(self): """Configuration parameters.""" pars = self.__dict__.copy() pars = {key.strip("_"): value for key, value in pars.items()} pars.pop("regions") return pars ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/sed.py0000644000175100001770000002164114721316200020703 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from itertools import repeat import numpy as np from astropy import units as u from astropy.table import Table import gammapy.utils.parallel as parallel from gammapy.datasets import Datasets from gammapy.datasets.actors import DatasetsActor from gammapy.datasets.flux_points import _get_reference_model from gammapy.maps import MapAxis from gammapy.modeling import Fit from ..flux import FluxEstimator from .core import FluxPoints log = logging.getLogger(__name__) __all__ = ["FluxPointsEstimator"] class FluxPointsEstimator(FluxEstimator, parallel.ParallelMixin): """Flux points estimator. Estimates flux points for a given list of datasets, energies and spectral model. To estimate the flux point the amplitude of the reference spectral model is fitted within the energy range defined by the energy group. This is done for each group independently. The amplitude is re-normalized using the "norm" parameter, which specifies the deviation of the flux from the reference model in this energy group. See https://gamma-astro-data-formats.readthedocs.io/en/latest/spectra/binned_likelihoods/index.html for details. The method is also described in the `Fermi-LAT catalog paper `__ or the `H.E.S.S. Galactic Plane Survey paper `__ Parameters ---------- source : str or int For which source in the model to compute the flux points. n_sigma : int Number of sigma to use for asymmetric error computation. Default is 1. n_sigma_ul : int Number of sigma to use for upper limit computation. Default is 2. selection_optional : list of str, optional Which additional quantities to estimate. Available options are: * "all": all the optional steps are executed. * "errn-errp": estimate asymmetric errors on flux. * "ul": estimate upper limits. * "scan": estimate fit statistic profiles. Default is None so the optional steps are not executed. energy_edges : list of `~astropy.units.Quantity`, optional Edges of the flux points energy bins. The resulting bin edges won't be exactly equal to the input ones, but rather the closest values to the energy axis edges of the parent dataset. 
Default is None: apply the estimator in each energy bin of the parent dataset. For further explanation see :ref:`estimators`. fit : `Fit` Fit instance specifying the backend and fit options. reoptimize : bool If True, the free parameters of the other models are fitted in each bin independently, together with the norm of the source of interest (but the other parameters of the source of interest are kept frozen). If False, only the norm of the source of interest is fitted, and all other parameters are frozen at their current values. sum_over_energy_groups : bool Whether to sum over the energy groups or fit the norm on the full energy grid. Default is False. n_jobs : int Number of processes used in parallel for the computation. Default is one, unless `~gammapy.utils.parallel.N_JOBS_DEFAULT` was modified. The number of jobs is limited to the number of physical CPUs. parallel_backend : {"multiprocessing", "ray"} Which backend to use for multiprocessing. norm : `~gammapy.modeling.Parameter` or dict Norm parameter used for the fit. Default is None and a new parameter is created automatically, with value=1, name="norm", scan_min=0.2, scan_max=5, and scan_n_values = 11. By default the min and max are not set and derived from the source model, unless the source model does not have one and only one norm parameter. If a dict is given, the entries should be a subset of `~gammapy.modeling.Parameter` arguments. """ tag = "FluxPointsEstimator" def __init__( self, energy_edges=[1, 10] * u.TeV, sum_over_energy_groups=False, n_jobs=None, parallel_backend=None, **kwargs, ): self.energy_edges = energy_edges self.sum_over_energy_groups = sum_over_energy_groups self.n_jobs = n_jobs self.parallel_backend = parallel_backend fit = Fit(confidence_opts={"backend": "scipy"}) kwargs.setdefault("fit", fit) super().__init__(**kwargs) def run(self, datasets): """Run the flux point estimator for all energy groups. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Datasets. Returns ------- flux_points : `FluxPoints` Estimated flux points. """ if not isinstance(datasets, DatasetsActor): datasets = Datasets(datasets=datasets) if not datasets.energy_axes_are_aligned: raise ValueError("All datasets must have aligned energy axes.") telescopes = [] for d in datasets: if d.meta_table is not None and "TELESCOP" in d.meta_table.colnames: telescopes.extend(list(d.meta_table["TELESCOP"].flatten())) if len(np.unique(telescopes)) > 1: raise ValueError( "All datasets must use the same value of the" " 'TELESCOP' meta keyword." ) meta = { "n_sigma": self.n_sigma, "n_sigma_ul": self.n_sigma_ul, "sed_type_init": "likelihood", } rows = parallel.run_multiprocessing( self.estimate_flux_point, zip( repeat(datasets), self.energy_edges[:-1], self.energy_edges[1:], ), backend=self.parallel_backend, pool_kwargs=dict(processes=self.n_jobs), task_name="Energy bins", ) table = Table(rows, meta=meta) model = _get_reference_model(datasets.models[self.source], self.energy_edges) return FluxPoints.from_table( table=table, reference_model=model.copy(), gti=datasets.gti, format="gadf-sed", ) def estimate_flux_point(self, datasets, energy_min, energy_max): """Estimate flux point for a single energy group. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Datasets. energy_min, energy_max : `~astropy.units.Quantity` Energy bounds to compute the flux point for. Returns ------- result : dict Dictionary with results for the flux point. 
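Examples -------- This method is normally invoked through `run`; a direct call, assuming ``estimator`` is a configured `FluxPointsEstimator` and ``datasets`` a `~gammapy.datasets.Datasets` object with models set, could look like:: >>> import astropy.units as u >>> result = estimator.estimate_flux_point(datasets, energy_min=1 * u.TeV, energy_max=10 * u.TeV) # doctest: +SKIP 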
""" datasets_sliced = datasets.slice_by_energy( energy_min=energy_min, energy_max=energy_max ) if self.sum_over_energy_groups: datasets_sliced = datasets_sliced.__class__( [_.to_image(name=_.name) for _ in datasets_sliced] ) if len(datasets_sliced) > 0: if datasets.models is not None: models_sliced = datasets.models._slice_by_energy( energy_min=energy_min, energy_max=energy_max, sum_over_energy_groups=self.sum_over_energy_groups, ) datasets_sliced.models = models_sliced return super().run(datasets=datasets_sliced) else: log.warning(f"No dataset contribute in range {energy_min}-{energy_max}") model = _get_reference_model( datasets.models[self.source], self.energy_edges ) return self._nan_result(datasets, model, energy_min, energy_max) def _nan_result(self, datasets, model, energy_min, energy_max): """NaN result.""" energy_axis = MapAxis.from_energy_edges([energy_min, energy_max]) with np.errstate(invalid="ignore", divide="ignore"): result = model.reference_fluxes(energy_axis=energy_axis) # convert to scalar values result = {key: value.item() for key, value in result.items()} result.update( { "norm": np.nan, "stat": np.nan, "success": False, "norm_err": np.nan, "ts": np.nan, "counts": np.zeros(len(datasets)), "npred": np.nan * np.zeros(len(datasets)), "npred_excess": np.nan * np.zeros(len(datasets)), "datasets": datasets.names, } ) if "errn-errp" in self.selection_optional: result.update({"norm_errp": np.nan, "norm_errn": np.nan}) if "ul" in self.selection_optional: result.update({"norm_ul": np.nan}) if "scan" in self.selection_optional: norm_scan = self.norm.copy().scan_values result.update({"norm_scan": norm_scan, "stat_scan": np.nan * norm_scan}) return result ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/sensitivity.py0000644000175100001770000001445514721316200022527 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy.table import Column, Table from gammapy.maps import Map from gammapy.modeling.models import PowerLawSpectralModel, SkyModel from gammapy.stats import WStatCountsStatistic from ..core import Estimator from gammapy.utils.deprecation import deprecated_renamed_argument __all__ = ["SensitivityEstimator"] class SensitivityEstimator(Estimator): """Estimate sensitivity. This class allows to determine for each reconstructed energy bin the flux associated to the number of gamma-ray events for which the significance is ``n_sigma``, and being larger than ``gamma_min`` and ``bkg_sys`` percent larger than the number of background events in the ON region. Parameters ---------- spectral_model : `~gammapy.modeling.models.SpectralModel`, optional Spectral model assumption. Default is power-law with spectral index of 2. n_sigma : float, optional Minimum significance. Default is 5. gamma_min : float, optional Minimum number of gamma-rays. Default is 10. bkg_syst_fraction : float, optional Fraction of background counts above which the number of gamma-rays is. Default is 0.05. Examples -------- For a usage example see :doc:`/tutorials/analysis-1d/cta_sensitivity` tutorial. 
""" tag = "SensitivityEstimator" @deprecated_renamed_argument("spectrum", "spectral_model", "v1.3") def __init__( self, spectral_model=None, n_sigma=5.0, gamma_min=10, bkg_syst_fraction=0.05, ): if spectral_model is None: spectral_model = PowerLawSpectralModel( index=2, amplitude="1 cm-2 s-1 TeV-1" ) self.spectral_model = spectral_model self.n_sigma = n_sigma self.gamma_min = gamma_min self.bkg_syst_fraction = bkg_syst_fraction def estimate_min_excess(self, dataset): """Estimate minimum excess to reach the given significance. Parameters ---------- dataset : `SpectrumDataset` Spectrum dataset. Returns ------- excess : `~gammapy.maps.RegionNDMap` Minimal excess. """ n_off = dataset.counts_off.data stat = WStatCountsStatistic( n_on=dataset.alpha.data * n_off, n_off=n_off, alpha=dataset.alpha.data ) excess_counts = stat.n_sig_matching_significance(self.n_sigma) is_gamma_limited = excess_counts < self.gamma_min excess_counts[is_gamma_limited] = self.gamma_min bkg_syst_limited = ( excess_counts < self.bkg_syst_fraction * dataset.background.data ) excess_counts[bkg_syst_limited] = ( self.bkg_syst_fraction * dataset.background.data[bkg_syst_limited] ) excess = Map.from_geom(geom=dataset._geom, data=excess_counts) return excess def estimate_min_e2dnde(self, excess, dataset): """Estimate dnde from a given minimum excess. Parameters ---------- excess : `~gammapy.maps.RegionNDMap` Minimal excess. dataset : `~gammapy.datasets.SpectrumDataset` Spectrum dataset. Returns ------- e2dnde : `~astropy.units.Quantity` Minimal differential flux. """ energy = dataset._geom.axes["energy"].center dataset.models = SkyModel(spectral_model=self.spectral_model) npred = dataset.npred_signal() phi_0 = excess / npred dnde_model = self.spectral_model(energy=energy) dnde = phi_0.data[:, 0, 0] * dnde_model * energy**2 return dnde.to("erg / (cm2 s)") def _get_criterion(self, excess, bkg): is_gamma_limited = excess == self.gamma_min is_bkg_syst_limited = excess == bkg * self.bkg_syst_fraction criterion = np.chararray(excess.shape, itemsize=12) criterion[is_gamma_limited] = "gamma" criterion[is_bkg_syst_limited] = "bkg" criterion[~np.logical_or(is_gamma_limited, is_bkg_syst_limited)] = ( "significance" ) return criterion def run(self, dataset): """Run the sensitivity estimation. Parameters ---------- dataset : `SpectrumDatasetOnOff` Dataset to compute sensitivity for. Returns ------- sensitivity : `~astropy.table.Table` Sensitivity table. 
""" energy = dataset._geom.axes["energy"].center excess = self.estimate_min_excess(dataset) e2dnde = self.estimate_min_e2dnde(excess, dataset) criterion = self._get_criterion( excess.data.squeeze(), dataset.background.data.squeeze() ) return Table( [ Column( data=energy, name="e_ref", format="5g", description="Energy center", ), Column( data=dataset._geom.axes["energy"].edges_min, name="e_min", format="5g", description="Energy edge low", ), Column( data=dataset._geom.axes["energy"].edges_max, name="e_max", format="5g", description="Energy edge high", ), Column( data=e2dnde, name="e2dnde", format="5g", description="Energy squared times differential flux", ), Column( data=np.atleast_1d(excess.data.squeeze()), name="excess", format="5g", description="Number of excess counts in the bin", ), Column( data=np.atleast_1d(dataset.background.data.squeeze()), name="background", format="5g", description="Number of background counts in the bin", ), Column( data=np.atleast_1d(criterion), name="criterion", description="Sensitivity-limiting criterion", ), ] ) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.204642 gammapy-1.3/gammapy/estimators/points/tests/0000755000175100001770000000000014721316215020722 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/tests/__init__.py0000644000175100001770000000010014721316200023014 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/tests/test_core.py0000644000175100001770000003375314721316200023270 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.table import Table from astropy.time import Time import matplotlib.pyplot as plt from gammapy.catalog.fermi import SourceCatalog3FGL, SourceCatalog4FGL from gammapy.data import GTI from gammapy.estimators import FluxPoints from gammapy.estimators.map.core import DEFAULT_UNIT from gammapy.estimators.utils import get_rebinned_axis from gammapy.maps import MapAxis from gammapy.modeling.models import PowerLawSpectralModel, SpectralModel from gammapy.utils.scripts import make_path from gammapy.utils.testing import ( assert_quantity_allclose, mpl_plot_check, requires_data, ) FLUX_POINTS_FILES = [ "diff_flux_points.ecsv", "diff_flux_points.fits", "flux_points.ecsv", "flux_points.fits", ] class LWTestModel(SpectralModel): @staticmethod def evaluate(x): return 1e4 * np.exp(-6 * x) def integral(self, xmin, xmax, **kwargs): return -1.0 / 6 * 1e4 * (np.exp(-6 * xmax) - np.exp(-6 * xmin)) def inverse(self, y): return -1.0 / 6 * np.log(y * 1e-4) class XSqrTestModel(SpectralModel): @staticmethod def evaluate(x): return x**2 def integral(self, xmin, xmax, **kwargs): return 1.0 / 3 * (xmax**3 - xmin**2) def inverse(self, y): return np.sqrt(y) class ExpTestModel(SpectralModel): @staticmethod def evaluate(x): return np.exp(x * u.Unit("1 / TeV")) def integral(self, xmin, xmax, **kwargs): return np.exp(xmax * u.Unit("1 / TeV")) - np.exp(xmin * u.Unit("1 / TeV")) def inverse(self, y): return np.log(y * u.TeV) * u.TeV def test_energy_ref_lafferty(): """ Tests Lafferty & Wyatt x-point method. Using input function g(x) = 10^4 exp(-6x) against check values from paper Lafferty & Wyatt. Nucl. Instr. 
and Meth. in Phys. Res. A 355 (1995) 541-547, p. 542 Table 1 """ # These are the results from the paper desired = np.array([0.048, 0.190, 0.428, 0.762]) model = LWTestModel() energy_min = np.array([0.0, 0.1, 0.3, 0.6]) energy_max = np.array([0.1, 0.3, 0.6, 1.0]) actual = FluxPoints._energy_ref_lafferty(model, energy_min, energy_max) assert_allclose(actual, desired, atol=1e-3) @pytest.mark.xfail def test_dnde_from_flux(): """Tests y-value normalization adjustment method.""" table = Table() table["e_min"] = np.array([10, 20, 30, 40]) table["e_max"] = np.array([20, 30, 40, 50]) table["flux"] = np.array([42, 52, 62, 72]) # 'True' integral flux in this test bin # Get values model = XSqrTestModel() table["e_ref"] = FluxPoints._energy_ref_lafferty( model, table["e_min"], table["e_max"] ) dnde = FluxPoints.from_table(table, reference_model=model) # Set up test case comparison dnde_model = model(table["e_ref"]) # Test comparison result desired = model.integral(table["e_min"], table["e_max"]) # Test output result actual = table["flux"] * (dnde_model / dnde) # Compare assert_allclose(actual, desired, rtol=1e-6) @pytest.mark.xfail @pytest.mark.parametrize("method", ["table", "lafferty", "log_center"]) def test_compute_flux_points_dnde_exp(method): """ Tests against analytical result or result from a powerlaw. """ model = ExpTestModel() energy_min = [1.0, 10.0] * u.TeV energy_max = [10.0, 100.0] * u.TeV table = Table() table.meta["SED_TYPE"] = "flux" table["e_min"] = energy_min table["e_max"] = energy_max flux = model.integral(energy_min, energy_max) table["flux"] = flux if method == "log_center": energy_ref = np.sqrt(energy_min * energy_max) elif method == "table": energy_ref = [2.0, 20.0] * u.TeV elif method == "lafferty": energy_ref = FluxPoints._energy_ref_lafferty(model, energy_min, energy_max) table["e_ref"] = energy_ref result = FluxPoints.from_table(table, reference_model=model) # Test energy actual = result.energy_ref assert_quantity_allclose(actual, energy_ref, rtol=1e-8) # Test flux actual = result.dnde desired = model(energy_ref) assert_quantity_allclose(actual, desired, rtol=1e-8) @requires_data() def test_fermi_to_dnde(): catalog_4fgl = SourceCatalog4FGL("$GAMMAPY_DATA/catalogs/fermi/gll_psc_v20.fit.gz") src = catalog_4fgl["FGES J1553.8-5325"] fp = src.flux_points assert_allclose( fp.dnde.quantity[1, 0, 0], 4.567393e-10 * u.Unit("cm-2 s-1 MeV-1"), rtol=1e-5, ) @pytest.fixture(params=FLUX_POINTS_FILES, scope="session") def flux_points(request): path = "$GAMMAPY_DATA/tests/spectrum/flux_points/" + request.param return FluxPoints.read(path) @pytest.fixture(scope="session") def flux_points_likelihood(): path = "$GAMMAPY_DATA/tests/spectrum/flux_points/binlike.fits" return FluxPoints.read(path) @requires_data() class TestFluxPoints: def test_info(self, flux_points): info = str(flux_points) assert "geom" in info assert "axes" in info assert "ref. 
model" in info assert "quantities" in info def test_energy_ref(self, flux_points): actual = flux_points.energy_ref desired = np.sqrt(flux_points.energy_min * flux_points.energy_max) assert_quantity_allclose(actual, desired) def test_energy_min(self, flux_points): actual = flux_points.energy_min desired = 299530.97 * u.MeV assert_quantity_allclose(actual.sum(), desired) def test_energy_max(self, flux_points): actual = flux_points.energy_max desired = 399430.975 * u.MeV assert_quantity_allclose(actual.sum(), desired) def test_write_fits(self, tmp_path, flux_points): start = u.Quantity([1, 2], "min") stop = u.Quantity([1.5, 2.5], "min") time_ref = Time("2010-01-01 00:00:00.0") gti = GTI.create(start, stop, time_ref) flux_points.gti = gti flux_points.write(tmp_path / "tmp.fits", sed_type=flux_points.sed_type_init) actual = FluxPoints.read(tmp_path / "tmp.fits") actual._data.pop("is_ul", None) flux_points._data.pop("is_ul", None) assert actual.gti.time_start[0] == gti.time_start[0] assert str(flux_points) == str(actual) def test_write_ecsv(self, tmp_path, flux_points): filename = tmp_path / "flux_points.ecsv" filename.touch() flux_points.write( filename, sed_type=flux_points.sed_type_init, overwrite=True, ) actual = FluxPoints.read(filename) actual._data.pop("is_ul", None) flux_points._data.pop("is_ul", None) assert str(flux_points) == str(actual) def test_quantity_access(self, flux_points_likelihood): assert flux_points_likelihood.sqrt_ts assert flux_points_likelihood.ts assert flux_points_likelihood.stat assert_allclose(flux_points_likelihood.n_sigma_ul, 2) assert flux_points_likelihood.sed_type_init == "likelihood" def test_plot(self, flux_points): fig = plt.figure() ax = fig.add_axes([0.2, 0.2, 0.7, 0.7]) ax.xaxis.set_units(u.eV) yunit = DEFAULT_UNIT[flux_points.sed_type_init] ax.yaxis.set_units(yunit) with mpl_plot_check(): flux_points.plot(ax=ax) def test_plot_likelihood(self, flux_points_likelihood): plt.figure() with mpl_plot_check(): flux_points_likelihood.plot_ts_profiles() def test_plot_likelihood_error(self, flux_points_likelihood): del flux_points_likelihood._data["stat_scan"] with pytest.raises(AttributeError): fig = plt.figure() ax = fig.add_subplot() flux_points_likelihood.plot_ts_profiles(ax=ax) @requires_data() def test_plot_format_yaxis(): path = "$GAMMAPY_DATA/tests/spectrum/flux_points/flux_points.fits" fp = FluxPoints.read(path) with mpl_plot_check(): ax = fp.plot(sed_type="e2dnde") assert ( ax.yaxis.get_label().get_text() == "e2dnde [$\\mathrm{erg\\,s^{-1}\\,cm^{-2}}$]" ) with mpl_plot_check(): ax = fp.plot(sed_type="dnde", energy_power=2) assert ( ax.yaxis.get_label().get_text() == "e2 * dnde [$\\mathrm{TeV\\,s^{-1}\\,cm^{-2}}$]" ) with mpl_plot_check(): ax = fp.plot(sed_type="dnde") assert ( ax.yaxis.get_label().get_text() == "dnde [$\\mathrm{TeV^{-1}\\,s^{-1}\\,cm^{-2}}$]" ) with mpl_plot_check(): ax = fp.plot(sed_type="dnde", energy_power=2.7) assert ( ax.yaxis.get_label().get_text() == "e2.7 * dnde [$\\mathrm{TeV^{17/10}\\,s^{-1}\\,cm^{-2}}$]" ) @requires_data() def test_flux_points_single_bin_dnde(): path = make_path("$GAMMAPY_DATA/tests/spectrum/flux_points/diff_flux_points.fits") table = Table.read(path) table_single_bin = table[1:2] fp = FluxPoints.from_table(table_single_bin, sed_type="dnde") with pytest.raises(ValueError): _ = fp.flux_ref with mpl_plot_check(): fp.plot(sed_type="e2dnde") with pytest.raises(ValueError): fp.to_table(sed_type="flux") table = fp.to_table(sed_type="dnde") assert_allclose(table["e_ref"], 153.992 * u.MeV, rtol=0.001) assert "e_min" 
not in table.colnames assert "e_max" not in table.colnames @requires_data() def test_compute_flux_points_dnde_fermi(): """ Test compute_flux_points_dnde on fermi source. """ fermi_3fgl = SourceCatalog3FGL() source = fermi_3fgl["3FGL J0835.3-4510"] flux_points = source.flux_points table = source.flux_points_table for column in ["e2dnde", "e2dnde_errn", "e2dnde_errp", "e2dnde_ul"]: actual = table[column].quantity desired = getattr(flux_points, column).quantity.squeeze() assert_quantity_allclose(actual[:-1], desired[:-1], rtol=0.05) @requires_data() def test_plot_fp_no_ul(): path = make_path("$GAMMAPY_DATA/tests/spectrum/flux_points/diff_flux_points.fits") table = Table.read(path) table.remove_column("dnde_ul") fp = FluxPoints.from_table(table, sed_type="dnde") with mpl_plot_check(): fp.plot() @requires_data() def test_is_ul(tmp_path): catalog_4fgl = SourceCatalog4FGL("$GAMMAPY_DATA/catalogs/fermi/gll_psc_v20.fit.gz") src = catalog_4fgl["FGES J1553.8-5325"] fp = src.flux_points is_ul = fp._data["is_ul"].data.squeeze() assert_allclose(fp.is_ul.data.squeeze(), is_ul) table = fp.to_table() assert_allclose(table["is_ul"].data.data, is_ul) fp.sqrt_ts_threshold_ul = 100 assert_allclose(fp.is_ul.data.squeeze(), np.ones(is_ul.shape, dtype=bool)) table = fp.to_table() assert_allclose(table["is_ul"].data.data, np.ones(is_ul.shape, dtype=bool)) table.write(tmp_path / "test_modif_ul_threshold.fits") table_read = Table.read(tmp_path / "test_modif_ul_threshold.fits") assert_allclose(table_read["is_ul"].data.data, np.ones(is_ul.shape, dtype=bool)) fp_read = FluxPoints.from_table(table_read) assert_allclose(fp_read.is_ul.data.squeeze(), np.ones(is_ul.shape, dtype=bool)) assert_allclose(fp_read.to_table()["is_ul"], fp_read.is_ul.data.squeeze()) fp.is_ul = is_ul assert_allclose(fp.is_ul.data.squeeze(), is_ul) table = fp.to_table() assert_allclose(table["is_ul"].data.data, is_ul) def test_flux_points_plot_no_error_bar(): table = Table() pwl = PowerLawSpectralModel() e_ref = np.geomspace(1, 100, 7) * u.TeV table["e_ref"] = e_ref table["dnde"] = pwl(e_ref) table.meta["SED_TYPE"] = "dnde" flux_points = FluxPoints.from_table(table) with mpl_plot_check(): _ = flux_points.plot(sed_type="dnde") @requires_data() def test_fp_no_is_ul(): path = make_path("$GAMMAPY_DATA/tests/spectrum/flux_points/flux_points.fits") table = Table.read(path) table.remove_column("is_ul") table.remove_column("flux_ul") fp = FluxPoints.from_table(table) fp_table = fp.to_table() assert "is_ul" not in fp_table.colnames def test_table_columns(): table = Table() table["e_min"] = np.array([10, 20, 30, 40]) * u.TeV table["e_max"] = np.array([20, 30, 40, 50]) * u.TeV table["flux"] = np.array([42, 52, 62, 72]) / (u.s * u.cm * u.cm) table["other"] = np.array([1, 2, 3, 4]) table["n_dof"] = np.array([1, 2, 1, 2]) # Get values model = PowerLawSpectralModel() table["e_ref"] = FluxPoints._energy_ref_lafferty( model, table["e_min"], table["e_max"] ) fp = FluxPoints.from_table(table, reference_model=model) assert fp.available_quantities == ["norm", "n_dof"] assert_allclose(fp.n_dof.data.ravel(), table["n_dof"]) @requires_data() def test_resample_axis(): lc_1d = FluxPoints.read( "$GAMMAPY_DATA/estimators/pks2155_hess_lc/pks2155_hess_lc.fits", format="lightcurve", ) axis_new = get_rebinned_axis( lc_1d, method="fixed-bins", group_size=5, axis_name="time" ) l1 = lc_1d.resample_axis(axis_new=axis_new) assert_allclose(l1.norm.data.ravel()[0:2], [1.56321943, 2.12845751], rtol=1e-3) assert_allclose(l1.norm_err.data.ravel()[0:2], [0.03904136, 0.03977413], rtol=1e-3) 
assert_allclose(l1.n_dof.data.ravel()[0], 5.0) assert l1.success.data.ravel()[0] axis_new = get_rebinned_axis( lc_1d, method="min-ts", ts_threshold=300, axis_name="time" ) l1 = lc_1d.resample_axis(axis_new=axis_new) assert_allclose(l1.norm_err.data.ravel()[0:2], [0.072236, 0.092942], rtol=1e-3) assert_allclose(l1.ts.data.ravel()[0:2], [312.742222, 454.99609], rtol=1e-3) assert_allclose(l1.stat_null.data.ravel()[0:2], [319.8675, 462.329], rtol=1e-3) assert l1.success.data.ravel()[0] assert_allclose(l1.n_dof.data[0][0][0][0], 2) path = make_path("$GAMMAPY_DATA/tests/spectrum/flux_points/flux_points.fits") table = Table.read(path) fp = FluxPoints.from_table(table) with pytest.raises(ValueError): fp.resample_axis(axis_new=MapAxis.from_nodes([0, 1, 2])) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/points/tests/test_lightcurve.py0000644000175100001770000007322714721316200024514 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.table import Column, Table from astropy.time import Time from astropy.timeseries import BinnedTimeSeries, BoxLeastSquares from gammapy.data import GTI from gammapy.datasets import Datasets, FluxPointsDataset from gammapy.datasets.actors import DatasetsActor from gammapy.estimators import FluxPoints, LightCurveEstimator from gammapy.estimators.points.lightcurve import parallel as parallel from gammapy.estimators.points.tests.test_sed import ( simulate_map_dataset, simulate_spectrum_dataset, ) from gammapy.modeling import Fit from gammapy.modeling.models import FoVBackgroundModel, PowerLawSpectralModel, SkyModel from gammapy.utils.testing import ( assert_time_allclose, mpl_plot_check, requires_data, requires_dependency, ) @pytest.fixture(scope="session") def lc(): meta = dict(TIMESYS="utc", SED_TYPE="flux") table = Table( meta=meta, data=[ Column(Time(["2010-01-01", "2010-01-03"]).mjd, "time_min"), Column(Time(["2010-01-03", "2010-01-10"]).mjd, "time_max"), Column([[1.0], [1.0]], "e_min", unit="TeV"), Column([[2.0], [2.0]], "e_max", unit="TeV"), Column([1e-11, 3e-11], "flux", unit="cm-2 s-1"), Column([0.1e-11, 0.3e-11], "flux_err", unit="cm-2 s-1"), Column([np.nan, 3.6e-11], "flux_ul", unit="cm-2 s-1"), Column([False, True], "is_ul"), Column([True, True], "success"), ], ) gti = GTI.create("0h", "1h", "2010-01-01T00:00:00") return FluxPoints.from_table(table=table, gti=gti) @pytest.fixture(scope="session") def lc_2d(): meta = dict(TIMESYS="utc", SED_TYPE="flux") table = Table( meta=meta, data=[ Column(Time(["2010-01-01", "2010-01-03"]).mjd, "time_min"), Column(Time(["2010-01-03", "2010-01-10"]).mjd, "time_max"), Column([[1.0, 2.0], [1.0, 2.0]], "e_min", unit="TeV"), Column([[2.0, 4.0], [2.0, 4.0]], "e_max", unit="TeV"), Column([[1e-11, 1e-12], [3e-11, 3e-12]], "flux", unit="cm-2 s-1"), Column([[0.1e-11, 1e-13], [0.3e-11, 3e-13]], "flux_err", unit="cm-2 s-1"), Column([[np.nan, np.nan], [3.6e-11, 3.6e-12]], "flux_ul", unit="cm-2 s-1"), Column([[False, False], [True, True]], "is_ul"), ], ) return FluxPoints.from_table(table=table) def test_lightcurve_str(lc): info_str = str(lc) assert "time" in info_str def test_lightcurve_properties_time(lc): axis = lc.geom.axes["time"] assert axis.reference_time.scale == "utc" assert axis.reference_time.format == "mjd" # Time-related attributes time = axis.time_mid assert time.scale == "utc" assert time.format == "mjd" 
assert_allclose(time.mjd, [55198, 55202.5]) assert_allclose(axis.time_min.mjd, [55197, 55199]) assert_allclose(axis.time_max.mjd, [55199, 55206]) # Note: I'm not sure why the time delta has this scale and format time_delta = axis.time_delta assert time_delta.scale == "tai" assert time_delta.format == "jd" assert_allclose(time_delta.jd, [2, 7]) def test_lightcurve_properties_flux(lc): table = lc.to_table(sed_type="flux") flux = table["flux"].quantity assert flux.unit == "cm-2 s-1" assert_allclose(flux.value, [[1e-11], [3e-11]]) # TODO: extend these tests to cover other time scales. # In those cases, CSV should not round-trip because there # is no header info in CSV to store the time scale! @pytest.mark.parametrize("sed_type", ["dnde", "flux", "likelihood"]) def test_lightcurve_read_write(tmp_path, lc, sed_type): lc.write(tmp_path / "tmp.fits", sed_type=sed_type) lc = FluxPoints.read(tmp_path / "tmp.fits") assert_allclose(lc.gti.time_start[0].mjd, 55197.000766, rtol=1e-5) # Check if time-related info round-trips axis = lc.geom.axes["time"] assert axis.reference_time.scale == "utc" assert axis.reference_time.format == "mjd" assert_allclose(axis.time_mid.mjd, [55198, 55202.5]) def test_lightcurve_plot(lc, lc_2d): with mpl_plot_check(): lc.plot() with mpl_plot_check(): lc.plot(marker="o", time_format="mjd") with mpl_plot_check(): lc_2d.plot(axis_name="time") @requires_data() def test_lightcurve_to_time_series(): from gammapy.catalog import SourceCatalog4FGL catalog_4fgl = SourceCatalog4FGL("$GAMMAPY_DATA/catalogs/fermi/gll_psc_v20.fit.gz") lightcurve = catalog_4fgl["FGES J1553.8-5325"].lightcurve() table = lightcurve.to_table(sed_type="flux", format="binned-time-series") timeseries = BinnedTimeSeries(data=table) assert_allclose(timeseries.time_bin_center.mjd[0], 54863.97885336907) assert_allclose(timeseries.time_bin_end.mjd[0], 55045.301668796295) time_axis = lightcurve.geom.axes["time"] assert_allclose(timeseries.time_bin_end.mjd[0], time_axis.time_max.mjd[0]) # assert that it interfaces with periodograms p = BoxLeastSquares.from_timeseries( timeseries=timeseries, signal_column_name="flux", uncertainty="flux_errp" ) result = p.power(1 * u.year, 0.5 * u.year) assert_allclose(result.duration, 182.625 * u.d) def get_spectrum_datasets(): model = SkyModel(spectral_model=PowerLawSpectralModel()) dataset_1 = simulate_spectrum_dataset(model=model, random_state=0) dataset_1._name = "dataset_1" gti1 = GTI.create("0h", "1h", Time("2010-01-01T00:00:00").tt) dataset_1.gti = gti1 dataset_2 = simulate_spectrum_dataset(model=model, random_state=1) dataset_2._name = "dataset_2" gti2 = GTI.create("1h", "2h", Time("2010-01-01T00:00:00").tt) dataset_2.gti = gti2 return [dataset_1, dataset_2] @requires_data() def test_group_datasets_in_time_interval(): # Doing a LC on one hour bin datasets = Datasets(get_spectrum_datasets()) time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T01:00:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] group_table = datasets.gti.group_table(time_intervals) assert len(group_table) == 2 assert_allclose(group_table["time_min"], [55197.0, 55197.04166666667]) assert_allclose(group_table["time_max"], [55197.04166666667, 55197.083333333336]) assert_allclose(group_table["group_idx"], [0, 1]) @requires_data() def test_group_datasets_in_time_interval_outflows(): datasets = Datasets(get_spectrum_datasets()) # Check Overflow time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T00:55:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] group_table = 
datasets.gti.group_table(time_intervals) assert group_table["bin_type"][0] == "overflow" # Check underflow time_intervals = [ Time(["2010-01-01T00:05:00", "2010-01-01T01:00:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] group_table = datasets.gti.group_table(time_intervals) assert group_table["bin_type"][0] == "underflow" @requires_data() def test_lightcurve_estimator_fit_options(): # Doing a LC on one hour bin datasets = get_spectrum_datasets() time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T01:00:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] energy_edges = [1, 30] * u.TeV estimator = LightCurveEstimator( energy_edges=energy_edges, time_intervals=time_intervals, selection_optional="all", fit=Fit(backend="minuit", optimize_opts=dict(tol=0.2, strategy=1)), norm=dict(scan_n_values=3), ) assert_allclose(estimator.fit.optimize_opts["tol"], 0.2) result = estimator.fit.run(datasets=datasets) assert_allclose(result.minuit.tol, 0.2) @requires_data() def test_lightcurve_estimator_spectrum_datasets(): # Doing a LC on one hour bin datasets = get_spectrum_datasets() time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T01:00:00"]).tt, Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]).tt, ] estimator = LightCurveEstimator( energy_edges=[1, 30] * u.TeV, time_intervals=time_intervals, selection_optional="all", ) estimator.norm.scan_n_values = 3 lightcurve = estimator.run(datasets) table = lightcurve.to_table() assert_allclose(table["time_min"], [55197.0, 55197.041667]) assert_allclose(table["time_max"], [55197.041667, 55197.083333]) assert_allclose(table["e_ref"], [[5.623413], [5.623413]]) assert_allclose(table["e_min"], [[1], [1]]) assert_allclose(table["e_max"], [[31.622777], [31.622777]]) assert_allclose(table["ref_dnde"], [[3.162278e-14], [3.162278e-14]], rtol=1e-5) assert_allclose(table["ref_flux"], [[9.683772e-13], [9.683772e-13]], rtol=1e-5) assert_allclose(table["ref_eflux"], [[3.453878e-12], [3.453878e-12]], rtol=1e-5) assert_allclose(table["stat"], [[16.824042], [17.391981]], rtol=1e-5) assert_allclose(table["norm"], [[0.911963], [0.9069318]], rtol=1e-2) assert_allclose(table["norm_err"], [[0.057769], [0.057835]], rtol=1e-2) assert_allclose(table["counts"], [[[791, np.nan]], [[np.nan, 784]]]) assert_allclose(table["norm_errp"], [[0.058398], [0.058416]], rtol=1e-2) assert_allclose(table["norm_errn"], [[0.057144], [0.057259]], rtol=1e-2) assert_allclose(table["norm_ul"], [[1.029989], [1.025061]], rtol=1e-2) assert_allclose(table["sqrt_ts"], [[19.384781], [19.161769]], rtol=1e-2) assert_allclose(table["ts"], [[375.769735], [367.173374]], rtol=1e-2) assert_allclose(table[0]["norm_scan"], [[0.2, 1.0, 5.0]]) assert_allclose( table[0]["stat_scan"], [[224.058304, 19.074405, 2063.75636]], rtol=1e-5, ) assert table.meta["TIMESYS"] == "tt" assert table.meta["TIMEUNIT"] == "d" assert table.meta["MJDREFF"] == 0 assert table.meta["MJDREFI"] == 0 # TODO: fix reference model I/O fp = FluxPoints.from_table(table=table, reference_model=PowerLawSpectralModel()) assert fp.norm.geom.axes.names == ["energy", "time"] assert fp.counts.geom.axes.names == ["dataset", "energy", "time"] assert fp.stat_scan.geom.axes.names == ["norm", "energy", "time"] assert_time_allclose( fp.geom.axes["time"].time_min, lightcurve.geom.axes["time"].time_min ) assert_time_allclose( fp.geom.axes["time"].time_max, lightcurve.geom.axes["time"].time_max ) @requires_data() def test_lightcurve_estimator_spectrum_datasets_2_energy_bins(): # Doing a LC on one hour bin datasets = 
get_spectrum_datasets() time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T01:00:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] estimator = LightCurveEstimator( energy_edges=[1, 5, 30] * u.TeV, time_intervals=time_intervals, selection_optional="all", ) estimator.norm.scan_n_values = 3 lightcurve = estimator.run(datasets) table = lightcurve.to_table() assert_allclose(table["time_min"], [55197.0, 55197.041667]) assert_allclose(table["time_max"], [55197.041667, 55197.083333]) assert_allclose(table["e_ref"], [[2.238721, 12.589254], [2.238721, 12.589254]]) assert_allclose(table["e_min"], [[1, 5.011872], [1, 5.011872]]) assert_allclose(table["e_max"], [[5.011872, 31.622777], [5.011872, 31.622777]]) assert_allclose( table["ref_dnde"], [[1.995262e-13, 6.309573e-15], [1.995262e-13, 6.309573e-15]], rtol=1e-5, ) assert_allclose( table["stat"], [[8.234951, 8.30321], [2.037205, 15.300507]], rtol=1e-5, ) assert_allclose( table["norm"], [[0.894723, 0.967419], [0.914283, 0.882351]], rtol=1e-2, ) assert_allclose( table["norm_err"], [[0.065905, 0.121288], [0.06601, 0.119457]], rtol=1e-2, ) assert_allclose( table["counts"], [[[669.0, np.nan], [122.0, np.nan]], [[np.nan, 667.0], [np.nan, 117.0]]], ) assert_allclose( table["norm_errp"], [[0.06664, 0.124741], [0.066815, 0.122832]], rtol=1e-2, ) assert_allclose( table["norm_errn"], [[0.065176, 0.117904], [0.065212, 0.116169]], rtol=1e-2, ) assert_allclose( table["norm_ul"], [[1.029476, 1.224117], [1.049283, 1.134874]], rtol=1e-2, ) assert_allclose( table["sqrt_ts"], [[16.233236, 10.608376], [16.609784, 9.557339]], rtol=1e-2, ) assert_allclose(table[0]["norm_scan"], [[0.2, 1.0, 5.0], [0.2, 1.0, 5.0]]) assert_allclose( table[0]["stat_scan"], [[153.880281, 10.701492, 1649.609684], [70.178023, 8.372913, 414.146676]], rtol=1e-5, ) # those quantities are currently not part of the table so we test separately npred = lightcurve.npred.data.squeeze() assert_allclose( npred, [[[669.36, np.nan], [121.66, np.nan]], [[np.nan, 664.41], [np.nan, 115.09]]], rtol=1e-3, ) npred_excess_err = lightcurve.npred_excess_err.data.squeeze() assert_allclose( npred_excess_err, [[[26.80, np.nan], [11.31, np.nan]], [[np.nan, 26.85], [np.nan, 11.14]]], rtol=1e-3, ) npred_excess_errp = lightcurve.npred_excess_errp.data.squeeze() assert_allclose( npred_excess_errp, [[[27.11, np.nan], [11.63, np.nan]], [[np.nan, 27.15], [np.nan, 11.46]]], rtol=1e-3, ) npred_excess_errn = lightcurve.npred_excess_errn.data.squeeze() assert_allclose( npred_excess_errn, [[[26.50, np.nan], [11.00, np.nan]], [[np.nan, 26.54], [np.nan, 10.84]]], rtol=1e-3, ) npred_excess_ul = lightcurve.npred_excess_ul.data.squeeze() assert_allclose( npred_excess_ul, [[[418.68, np.nan], [114.19, np.nan]], [[np.nan, 426.74], [np.nan, 105.86]]], rtol=1e-3, ) fp = FluxPoints.from_table(table=table, reference_model=PowerLawSpectralModel()) assert fp.norm.geom.axes.names == ["energy", "time"] assert fp.counts.geom.axes.names == ["dataset", "energy", "time"] assert fp.stat_scan.geom.axes.names == ["norm", "energy", "time"] @requires_data() def test_lightcurve_estimator_spectrum_datasets_with_mask_fit(): # Doing a LC on one hour bin datasets = get_spectrum_datasets() time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T01:00:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] energy_min_fit = 1 * u.TeV energy_max_fit = 3 * u.TeV for dataset in datasets: geom = dataset.counts.geom dataset.mask_fit = geom.energy_mask( energy_min=energy_min_fit, energy_max=energy_max_fit ) selection = ["scan"] estimator 
= LightCurveEstimator( energy_edges=[1, 30] * u.TeV, time_intervals=time_intervals, selection_optional=selection, ) estimator.norm.scan_n_values = 3 lightcurve = estimator.run(datasets) table = lightcurve.to_table() assert_allclose(table["time_min"], [55197.0, 55197.041667]) assert_allclose(table["time_max"], [55197.041667, 55197.083333]) assert_allclose(table["stat"], [[6.603043], [0.421051]], rtol=1e-3) assert_allclose(table["norm"], [[0.885124], [0.967054]], rtol=1e-3) @requires_data() def test_lightcurve_estimator_spectrum_datasets_default(): # Test default time interval: each time interval is equal to the gti of each # dataset, here one hour datasets = get_spectrum_datasets() selection = ["scan"] estimator = LightCurveEstimator( energy_edges=[1, 30] * u.TeV, selection_optional=selection ) estimator.norm.scan_n_values = 3 lightcurve = estimator.run(datasets) table = lightcurve.to_table() assert_allclose(table["time_min"], [55197.0, 55197.041667]) assert_allclose(table["time_max"], [55197.041667, 55197.083333]) assert_allclose(table["norm"], [[0.911963], [0.906931]], rtol=1e-3) @requires_data() def test_lightcurve_estimator_spectrum_datasets_notordered(): # Test that if the time intervals given are not ordered in time, it is first ordered # correctly and then compute as expected datasets = get_spectrum_datasets() time_intervals = [ Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), Time(["2010-01-01T00:00:00", "2010-01-01T01:00:00"]), ] estimator = LightCurveEstimator( energy_edges=[1, 100] * u.TeV, time_intervals=time_intervals, selection_optional=["scan"], ) estimator.norm.scan_n_values = 3 lightcurve = estimator.run(datasets) table = lightcurve.to_table() assert_allclose(table["time_min"], [55197.0, 55197.041667]) assert_allclose(table["time_max"], [55197.041667, 55197.083333]) assert_allclose(table["norm"], [[0.911963], [0.906931]], rtol=1e-3) @requires_data() def test_lightcurve_estimator_spectrum_datasets_largerbin(): # Test all dataset in a single LC bin, here two hours datasets = get_spectrum_datasets() time_intervals = [Time(["2010-01-01T00:00:00", "2010-01-01T02:00:00"])] estimator = LightCurveEstimator( energy_edges=[1, 30] * u.TeV, time_intervals=time_intervals, selection_optional=["scan"], ) estimator.norm.scan_n_values = 3 lightcurve = estimator.run(datasets) table = lightcurve.to_table() assert_allclose(table["time_min"], [55197.0]) assert_allclose(table["time_max"], [55197.083333]) assert_allclose(table["e_ref"][0], [5.623413]) assert_allclose(table["e_min"][0], [1]) assert_allclose(table["e_max"][0], [31.622777]) assert_allclose(table["ref_dnde"][0], [3.162278e-14], rtol=1e-5) assert_allclose(table["ref_flux"][0], [9.683772e-13], rtol=1e-5) assert_allclose(table["ref_eflux"][0], [3.453878e-12], rtol=1e-5) assert_allclose(table["stat"][0], [34.219808], rtol=1e-5) assert_allclose(table["norm"][0], [0.909646], rtol=1e-5) assert_allclose(table["norm_err"][0], [0.040874], rtol=1e-3) assert_allclose(table["ts"][0], [742.939324], rtol=1e-4) @requires_data() def test_lightcurve_estimator_spectrum_datasets_emptybin(): # Test all dataset in a single LC bin, here two hours datasets = get_spectrum_datasets() time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T02:00:00"]), Time(["2010-02-01T00:00:00", "2010-02-01T02:00:00"]), ] estimator = LightCurveEstimator( energy_edges=[1, 30] * u.TeV, time_intervals=time_intervals, selection_optional=["scan"], ) estimator.norm.scan_n_values = 3 lightcurve = estimator.run(datasets) table = lightcurve.to_table() 
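    # Only the first interval overlaps the dataset GTIs (both datasets in
    # get_spectrum_datasets are taken on 2010-01-01), so the empty
    # 2010-02-01 bin is expected to be dropped and a single time bin should
    # remain in the table checked below.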
assert_allclose(table["time_min"], [55197.0]) assert_allclose(table["time_max"], [55197.083333]) @requires_data() def test_lightcurve_estimator_spectrum_datasets_timeoverlaped(): # Check that it returns a ValueError if the time intervals overlapped datasets = get_spectrum_datasets() time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T01:30:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] with pytest.raises(ValueError) as excinfo: estimator = LightCurveEstimator(time_intervals=time_intervals) estimator.norm.scan_n_values = 3 estimator.run(datasets) msg = "Overlapping time bins" assert str(excinfo.value) == msg @requires_data() def test_lightcurve_estimator_spectrum_datasets_gti_not_include_in_time_intervals(): # Check that it returns a ValueError if the time intervals are smaller than the dataset GTI. datasets = get_spectrum_datasets() time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T00:05:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T01:05:00"]), ] estimator = LightCurveEstimator( energy_edges=[1, 30] * u.TeV, time_intervals=time_intervals, selection_optional=["scan"], ) estimator.norm.scan_n_values = 3 with pytest.raises(ValueError) as excinfo: estimator.run(datasets) msg = "LightCurveEstimator: No datasets in time intervals" assert str(excinfo.value) == msg def get_map_datasets(): dataset_1 = simulate_map_dataset(random_state=0, name="dataset_1") gti1 = GTI.create("0 h", "1 h", "2010-01-01T00:00:00") dataset_1.gti = gti1 dataset_2 = simulate_map_dataset(random_state=1, name="dataset_2") gti2 = GTI.create("1 h", "2 h", "2010-01-01T00:00:00") dataset_2.gti = gti2 model = dataset_1.models["source"].copy(name="test_source") bkg_model_1 = FoVBackgroundModel(dataset_name="dataset_1") bkg_model_2 = FoVBackgroundModel(dataset_name="dataset_2") dataset_1.models = [model, bkg_model_1] dataset_2.models = [model, bkg_model_2] return [dataset_1, dataset_2] @requires_data() def test_lightcurve_estimator_map_datasets(): datasets = get_map_datasets() time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T01:00:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] estimator = LightCurveEstimator( energy_edges=[1, 100] * u.TeV, source="test_source", time_intervals=time_intervals, selection_optional=["scan"], ) lightcurve = estimator.run(datasets) table = lightcurve.to_table() assert_allclose(table["time_min"], [55197.0, 55197.041667]) assert_allclose(table["time_max"], [55197.041667, 55197.083333]) assert_allclose(table["e_ref"], [[10.857111], [10.857111]]) assert_allclose(table["e_min"], [[1.178769], [1.178769]], rtol=1e-5) assert_allclose(table["e_max"], [[100], [100]]) assert_allclose(table["ref_dnde"], [[8.483429e-14], [8.483429e-14]], rtol=1e-5) assert_allclose(table["ref_flux"], [[8.383429e-12], [8.383429e-12]], rtol=1e-5) assert_allclose(table["ref_eflux"], [[4.4407e-11], [4.4407e-11]], rtol=1e-5) assert_allclose(table["stat"], [[9402.778975], [9517.750207]], rtol=1e-2) assert_allclose(table["norm"], [[0.971592], [0.963286]], rtol=1e-2) assert_allclose(table["norm_err"], [[0.044643], [0.044475]], rtol=1e-2) assert_allclose(table["sqrt_ts"], [[35.880361], [35.636547]], rtol=1e-2) assert_allclose(table["ts"], [[1287.4003], [1269.963491]], rtol=1e-2) datasets = get_map_datasets() time_intervals2 = [Time(["2010-01-01T00:00:00", "2010-01-01T02:00:00"])] estimator2 = LightCurveEstimator( energy_edges=[1, 100] * u.TeV, source="test_source", time_intervals=time_intervals2, selection_optional=["scan"], ) lightcurve2 = estimator2.run(datasets) table = 
lightcurve2.to_table() assert_allclose(table["time_min"][0], [55197.0]) assert_allclose(table["time_max"][0], [55197.083333]) assert_allclose(table["e_ref"][0], [10.857111], rtol=1e-5) assert_allclose(table["e_min"][0], [1.178769], rtol=1e-5) assert_allclose(table["e_max"][0], [100]) assert_allclose(table["ref_dnde"][0], [8.483429e-14], rtol=1e-5) assert_allclose(table["ref_flux"][0], [8.383429e-12], rtol=1e-5) assert_allclose(table["ref_eflux"][0], [4.4407e-11], rtol=1e-5) assert_allclose(table["stat"][0], [18920.54651], rtol=1e-2) assert_allclose(table["norm"][0], [0.967438], rtol=1e-2) assert_allclose(table["norm_err"][0], [0.031508], rtol=1e-2) assert_allclose(table["counts"][0], [[2205, 2220]]) assert_allclose(table["ts"][0], [2557.346464], rtol=1e-2) @requires_data() @requires_dependency("ray") def test_lightcurve_estimator_map_datasets_ray_actors(): datasets = get_map_datasets() datasets = DatasetsActor(datasets) time_intervals = [ Time(["2010-01-01T00:00:00", "2010-01-01T01:00:00"]), Time(["2010-01-01T01:00:00", "2010-01-01T02:00:00"]), ] estimator = LightCurveEstimator( energy_edges=[1, 100] * u.TeV, source="test_source", time_intervals=time_intervals, selection_optional=["scan"], ) lightcurve = estimator.run(datasets) table = lightcurve.to_table() assert_allclose(table["time_min"], [55197.0, 55197.041667]) assert_allclose(table["time_max"], [55197.041667, 55197.083333]) assert_allclose(table["e_ref"], [[10.857111], [10.857111]]) assert_allclose(table["e_min"], [[1.178769], [1.178769]], rtol=1e-5) assert_allclose(table["e_max"], [[100], [100]]) assert_allclose(table["ref_dnde"], [[8.483429e-14], [8.483429e-14]], rtol=1e-5) assert_allclose(table["ref_flux"], [[8.383429e-12], [8.383429e-12]], rtol=1e-5) assert_allclose(table["ref_eflux"], [[4.4407e-11], [4.4407e-11]], rtol=1e-5) assert_allclose(table["stat"], [[9402.778975], [9517.750207]], rtol=1e-2) assert_allclose(table["norm"], [[0.971592], [0.963286]], rtol=1e-2) assert_allclose(table["norm_err"], [[0.044643], [0.044475]], rtol=1e-2) assert_allclose(table["sqrt_ts"], [[35.880361], [35.636547]], rtol=1e-2) assert_allclose(table["ts"], [[1287.4003], [1269.963491]], rtol=1e-2) datasets = get_map_datasets() time_intervals2 = [Time(["2010-01-01T00:00:00", "2010-01-01T02:00:00"])] estimator2 = LightCurveEstimator( energy_edges=[1, 100] * u.TeV, source="test_source", time_intervals=time_intervals2, selection_optional=["scan"], ) lightcurve2 = estimator2.run(datasets) table = lightcurve2.to_table() assert_allclose(table["time_min"][0], [55197.0]) assert_allclose(table["time_max"][0], [55197.083333]) assert_allclose(table["e_ref"][0], [10.857111], rtol=1e-5) assert_allclose(table["e_min"][0], [1.178769], rtol=1e-5) assert_allclose(table["e_max"][0], [100]) assert_allclose(table["ref_dnde"][0], [8.483429e-14], rtol=1e-5) assert_allclose(table["ref_flux"][0], [8.383429e-12], rtol=1e-5) assert_allclose(table["ref_eflux"][0], [4.4407e-11], rtol=1e-5) assert_allclose(table["stat"][0], [18920.54651], rtol=1e-2) assert_allclose(table["norm"][0], [0.967438], rtol=1e-2) assert_allclose(table["norm_err"][0], [0.031508], rtol=1e-2) assert_allclose(table["counts"][0], [[2205, 2220]]) assert_allclose(table["ts"][0], [2557.346464], rtol=1e-2) @requires_data() def test_recompute_ul(): datasets = get_spectrum_datasets() selection = ["all"] estimator = LightCurveEstimator( energy_edges=[1, 3, 10, 30] * u.TeV, selection_optional=selection, n_sigma_ul=2 ) lightcurve = estimator.run(datasets) assert_allclose( lightcurve.dnde_ul.data[0], 
        [[[3.26070325e-13]], [[3.82484628e-14]], [[4.16433528e-15]]],
        rtol=1e-3,
    )

    new_lightcurve = lightcurve.recompute_ul(n_sigma_ul=4)
    assert_allclose(
        new_lightcurve.dnde_ul.data[0],
        [[[3.75813611e-13]], [[4.63590079e-14]], [[5.64634904e-15]]],
        rtol=1e-3,
    )
    assert new_lightcurve.meta["n_sigma_ul"] == 4

    # test if scan is not present
    selection = []
    estimator = LightCurveEstimator(
        energy_edges=[1, 30] * u.TeV, selection_optional=selection
    )
    lightcurve = estimator.run(datasets)
    with pytest.raises(ValueError):
        lightcurve.recompute_ul(n_sigma_ul=4)


@requires_data()
def test_dataset_stat_type():
    datasets = get_spectrum_datasets()
    selection = ["all"]
    estimator = LightCurveEstimator(
        energy_edges=[1, 3, 30] * u.TeV, selection_optional=selection, n_sigma_ul=2
    )
    lightcurve = estimator.run(datasets)
    assert_allclose(
        lightcurve.dnde_ul.data[0], [[[3.260703e-13]], [[1.159354e-14]]], rtol=1e-3
    )

    lightcurve_dataset = FluxPointsDataset(
        data=lightcurve, models=[lightcurve.reference_model]
    )
    lightcurve_dataset.stat_type = "profile"
    assert_allclose(lightcurve_dataset.stat_sum(), 5.741539, rtol=1e-5)

    lightcurve_dataset.stat_type = "distrib"
    assert_allclose(lightcurve_dataset.stat_sum(), 5.824218, rtol=1e-5)

    lightcurve_dataset.stat_type = "chi2"
    assert_allclose(lightcurve_dataset.stat_sum(), 6.014128, rtol=1e-5)

    # test if scan is not present
    selection = []
    estimator = LightCurveEstimator(
        energy_edges=[1, 30] * u.TeV, selection_optional=selection
    )
    lightcurve = estimator.run(datasets)
    lightcurve_dataset = FluxPointsDataset(
        data=lightcurve, models=[lightcurve.reference_model]
    )
    with pytest.raises(AttributeError):
        lightcurve_dataset.stat_type = "profile"
        lightcurve_dataset.stat_sum()


@requires_data()
def test_lightcurve_parallel_multiprocessing():
    datasets = get_spectrum_datasets()
    selection = ["all"]
    estimator = LightCurveEstimator(
        energy_edges=[1, 3, 30] * u.TeV,
        selection_optional=selection,
        n_sigma_ul=2,
        n_jobs=2,
    )
    assert estimator.n_jobs == 2

    lightcurve = estimator.run(datasets)
    assert_allclose(
        lightcurve.dnde_ul.data[0], [[[3.260703e-13]], [[1.159354e-14]]], rtol=1e-3
    )
    assert estimator._get_n_child_jobs == 1

    parallel.ALLOW_CHILD_JOBS = True
    lightcurve = estimator.run(datasets)
    assert_allclose(
        lightcurve.dnde_ul.data[0], [[[3.260703e-13]], [[1.159354e-14]]], rtol=1e-3
    )
    assert estimator._get_n_child_jobs == 2
    parallel.ALLOW_CHILD_JOBS = False


@requires_data()
@requires_dependency("ray")
def test_lightcurve_parallel_ray():
    datasets = get_spectrum_datasets()
    selection = ["all"]
    estimator = LightCurveEstimator(
        energy_edges=[1, 3, 30] * u.TeV,
        selection_optional=selection,
        n_sigma_ul=2,
        n_jobs=2,
        parallel_backend="ray",
    )
    assert estimator.n_jobs == 2

    lightcurve = estimator.run(datasets)
    assert_allclose(
        lightcurve.dnde_ul.data[0], [[[3.260703e-13]], [[1.159354e-14]]], rtol=1e-3
    )
    assert estimator._get_n_child_jobs == 1

    parallel.ALLOW_CHILD_JOBS = True
    lightcurve = estimator.run(datasets)
    assert_allclose(
        lightcurve.dnde_ul.data[0], [[[3.260703e-13]], [[1.159354e-14]]], rtol=1e-3
    )
    assert estimator._get_n_child_jobs == 2
    parallel.ALLOW_CHILD_JOBS = False


# ---- gammapy-1.3/gammapy/estimators/points/tests/test_profile.py ----
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
import numpy as np
from numpy.testing import assert_allclose
import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import
CircleSkyRegion import gammapy.utils.parallel as parallel from gammapy.data import GTI from gammapy.datasets import MapDatasetOnOff from gammapy.estimators import FluxPoints, FluxProfileEstimator from gammapy.estimators.points.tests.test_sed import simulate_map_dataset from gammapy.maps import MapAxis, WcsGeom from gammapy.modeling.models import PowerLawSpectralModel from gammapy.utils.regions import ( make_concentric_annulus_sky_regions, make_orthogonal_rectangle_sky_regions, ) from gammapy.utils.testing import requires_data, requires_dependency from gammapy.utils.deprecation import GammapyDeprecationWarning def get_simple_dataset_on_off(): axis = MapAxis.from_energy_bounds(0.1, 10, 2, unit="TeV") geom = WcsGeom.create(npix=40, binsz=0.01, axes=[axis]) dataset = MapDatasetOnOff.create(geom, name="test-on-off") dataset.mask_safe += True dataset.counts += 5 dataset.counts_off += 1 dataset.acceptance += 1 dataset.acceptance_off += 1 dataset.exposure += 1000 * u.m**2 * u.s dataset.gti = GTI.create([0 * u.s], [5 * u.h], reference_time="2010-01-01T00:00:00") return dataset def make_boxes(wcs): start_line = SkyCoord(0.08, -0.16, unit="deg", frame="icrs") end_line = SkyCoord(359.9, 0.16, unit="deg", frame="icrs") return make_orthogonal_rectangle_sky_regions( start_line, end_line, wcs, 0.1 * u.deg, 8 ) def make_horizontal_boxes(wcs): start_line = SkyCoord(0.08, 0.1, unit="deg", frame="icrs") end_line = SkyCoord(359.9, 0.1, unit="deg", frame="icrs") return make_orthogonal_rectangle_sky_regions( start_line, end_line, wcs, 0.1 * u.deg, 8 ) def test_profile_content(): mapdataset_onoff = get_simple_dataset_on_off() wcs = mapdataset_onoff.counts.geom.wcs boxes = make_horizontal_boxes(wcs) with pytest.warns(GammapyDeprecationWarning): FluxProfileEstimator( regions=boxes, energy_edges=[0.1, 1, 10] * u.TeV, spectrum=PowerLawSpectralModel(), ) prof_maker = FluxProfileEstimator( regions=boxes, energy_edges=[0.1, 1, 10] * u.TeV, selection_optional="all", n_sigma=1, n_sigma_ul=3, ) result = prof_maker.run(mapdataset_onoff) imp_prof = result.to_table(sed_type="flux") assert_allclose(imp_prof[7]["x_min"], 0.1462, atol=1e-4) assert_allclose(imp_prof[7]["x_ref"], 0.1575, atol=1e-4) assert_allclose(imp_prof[7]["counts"], [[100.0], [100.0]], atol=1e-2) assert_allclose(imp_prof[7]["sqrt_ts"], [7.63, 7.63], atol=1e-2) assert_allclose(imp_prof[0]["flux"], [8e-06, 8.0e-06], atol=1e-3) # TODO: npred quantities are not supported by the table serialisation format # so we rely on the FluxPoints object npred_excess = result.npred_excess.data[7].squeeze() assert_allclose(npred_excess, [80.0, 80.0], rtol=1e-3) errn = result.npred_excess_errn.data[7].squeeze() assert_allclose(errn, [10.75, 10.75], atol=1e-2) ul = result.npred_excess_ul.data[7].squeeze() assert_allclose(ul, [111.32, 111.32], atol=1e-2) def test_radial_profile(): dataset = get_simple_dataset_on_off() geom = dataset.counts.geom regions = make_concentric_annulus_sky_regions( center=geom.center_skydir, radius_max=0.2 * u.deg, ) prof_maker = FluxProfileEstimator( regions, energy_edges=[0.1, 1, 10] * u.TeV, selection_optional="all", n_sigma_ul=3, ) result = prof_maker.run(dataset) imp_prof = result.to_table(sed_type="flux") assert_allclose(imp_prof[7]["x_min"], 0.14, atol=1e-4) assert_allclose(imp_prof[7]["x_ref"], 0.15, atol=1e-4) assert_allclose(imp_prof[7]["counts"], [[980.0], [980.0]], atol=1e-2) assert_allclose(imp_prof[7]["sqrt_ts"], [23.886444, 23.886444], atol=1e-5) assert_allclose(imp_prof[0]["flux"], [8e-06, 8.0e-06], atol=1e-3) # TODO: npred quantities are 
not supported by the table serialisation format # so we rely on the FluxPoints object npred_excess = result.npred_excess.data[7].squeeze() assert_allclose(npred_excess, [784.0, 784.0], rtol=1e-3) errn = result.npred_excess_errn.data[7].squeeze() assert_allclose(errn, [34.075, 34.075], rtol=2e-3) ul = result.npred_excess_ul.data[0].squeeze() assert_allclose(ul, [72.074, 72.074], rtol=1e-3) def test_radial_profile_one_interval(): dataset = get_simple_dataset_on_off() geom = dataset.counts.geom regions = make_concentric_annulus_sky_regions( center=geom.center_skydir, radius_max=0.2 * u.deg, ) prof_maker = FluxProfileEstimator( regions, selection_optional="all", energy_edges=[0.1, 10] * u.TeV, n_sigma_ul=3, sum_over_energy_groups=True, ) result = prof_maker.run(dataset) imp_prof = result.to_table(sed_type="flux") assert_allclose(imp_prof[7]["counts"], [[1960]], atol=1e-5) assert_allclose(imp_prof[7]["npred_excess"], [[1568.0]], rtol=1e-3) assert_allclose(imp_prof[7]["sqrt_ts"], [33.780533], rtol=1e-3) assert_allclose(imp_prof[0]["flux"], [16e-06], atol=1e-3) axis = result.counts.geom.axes["dataset"] assert axis.center == ["test-on-off"] errn = result.npred_excess_errn.data[7].squeeze() assert_allclose(errn, [48.278367], rtol=2e-3) ul = result.npred_excess_ul.data[0].squeeze() assert_allclose(ul, [130.394824], rtol=1e-3) def test_serialisation(tmpdir): dataset = get_simple_dataset_on_off() geom = dataset.counts.geom regions = make_concentric_annulus_sky_regions( center=geom.center_skydir, radius_max=0.2 * u.deg, ) est = FluxProfileEstimator(regions, energy_edges=[0.1, 10] * u.TeV) result = est.run(dataset) result.write(tmpdir / "profile.fits") profile = FluxPoints.read( tmpdir / "profile.fits", reference_model=PowerLawSpectralModel(), ) assert_allclose(result.norm, profile.norm, rtol=1e-4) assert_allclose(result.norm_err, profile.norm_err, rtol=1e-4) assert_allclose(result.npred, profile.npred) assert_allclose(result.ts, profile.ts) assert_allclose(profile.gti.time_start[0].mjd, 55197.000766, rtol=1e-5) assert np.all(result.is_ul == profile.is_ul) def test_regions_init(): with pytest.raises(ValueError): FluxProfileEstimator(regions=[]) region = CircleSkyRegion(center=SkyCoord("0d", "0d"), radius=0.1 * u.deg) with pytest.raises(ValueError): FluxProfileEstimator(regions=[region]) @requires_data() def test_profile_with_model_or_mask(): dataset = simulate_map_dataset(name="test-map-pwl") geom = dataset.counts.geom regions = make_concentric_annulus_sky_regions( center=geom.center_skydir, radius_max=0.2 * u.deg, ) prof_maker = FluxProfileEstimator( regions, selection_optional="all", energy_edges=[0.1, 10] * u.TeV, n_sigma_ul=3, sum_over_energy_groups=True, ) result = prof_maker.run(dataset) imp_prof = result.to_table(sed_type="flux") assert_allclose(imp_prof[7]["npred_excess"], [[-1.115967]], rtol=1e-3) dataset.models = None result = prof_maker.run(dataset) imp_prof = result.to_table(sed_type="flux") assert_allclose(imp_prof[7]["npred_excess"], [[112.95312]], rtol=1e-3) dataset.mask_fit = ~geom.region_mask([regions[7]]) result = prof_maker.run(dataset) imp_prof = result.to_table(sed_type="flux") assert_allclose(imp_prof[7]["npred_excess"], [[0]], rtol=1e-3) @requires_data() def test_profile_multiprocessing(): dataset = simulate_map_dataset(name="test-map-pwl") geom = dataset.counts.geom regions = make_concentric_annulus_sky_regions( center=geom.center_skydir, radius_max=0.2 * u.deg, ) prof_maker = FluxProfileEstimator( regions, selection_optional="all", energy_edges=[0.1, 10] * u.TeV, n_sigma_ul=3, 
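        # Note on the remaining keywords (a hedged reading of the test setup,
        # not additional behaviour): n_jobs=2 with
        # parallel_backend="multiprocessing" distributes the per-region flux
        # estimation over two worker processes; the expected npred_excess
        # below matches the serial run in test_profile_with_model_or_mask
        # above.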
        sum_over_energy_groups=True,
        n_jobs=2,
        parallel_backend="multiprocessing",
    )
    result = prof_maker.run(dataset)
    imp_prof = result.to_table(sed_type="flux")
    assert_allclose(imp_prof[7]["npred_excess"], [[-1.115967]], rtol=1e-3)


@requires_data()
@requires_dependency("ray")
def test_profile_multiprocessing_ray_with_manager():
    dataset = simulate_map_dataset(name="test-map-pwl")
    geom = dataset.counts.geom

    regions = make_concentric_annulus_sky_regions(
        center=geom.center_skydir,
        radius_max=0.2 * u.deg,
    )

    with parallel.multiprocessing_manager(
        backend="ray", pool_kwargs=dict(processes=2)
    ):
        prof_maker = FluxProfileEstimator(
            regions,
            selection_optional="all",
            energy_edges=[0.1, 10] * u.TeV,
            n_sigma_ul=3,
            sum_over_energy_groups=True,
        )
        assert prof_maker.n_jobs == 2
        assert prof_maker.parallel_backend == "ray"
        result = prof_maker.run(dataset)

    imp_prof = result.to_table(sed_type="flux")
    assert_allclose(imp_prof[7]["npred_excess"], [[-1.115967]], rtol=1e-3)


@requires_data()
@requires_dependency("ray")
def test_profile_multiprocessing_ray():
    dataset = simulate_map_dataset(name="test-map-pwl")
    geom = dataset.counts.geom

    regions = make_concentric_annulus_sky_regions(
        center=geom.center_skydir,
        radius_max=0.2 * u.deg,
    )

    prof_maker = FluxProfileEstimator(
        regions,
        selection_optional="all",
        energy_edges=[0.1, 10] * u.TeV,
        n_sigma_ul=3,
        sum_over_energy_groups=True,
        n_jobs=2,
        parallel_backend="ray",
    )
    result = prof_maker.run(dataset)
    imp_prof = result.to_table(sed_type="flux")
    assert_allclose(imp_prof[7]["npred_excess"], [[-1.115967]], rtol=1e-3)


# ---- gammapy-1.3/gammapy/estimators/points/tests/test_sed.py ----
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
import numpy as np
from numpy.testing import assert_allclose
from astropy import units as u
from astropy.coordinates import EarthLocation, SkyCoord
from astropy.table import Table
from gammapy.data import Observation
from gammapy.data.pointing import FixedPointingInfo
from gammapy.datasets import (
    Datasets,
    FluxPointsDataset,
    MapDataset,
    SpectrumDatasetOnOff,
)
from gammapy.datasets.spectrum import SpectrumDataset
from gammapy.estimators import FluxPoints, FluxPointsEstimator
from gammapy.irf import EDispKernelMap, EffectiveAreaTable2D, load_irf_dict_from_file
from gammapy.makers import MapDatasetMaker
from gammapy.makers.utils import make_map_exposure_true_energy
from gammapy.maps import MapAxis, RegionGeom, RegionNDMap, WcsGeom
from gammapy.modeling import Fit
from gammapy.modeling.models import (
    ExpCutoffPowerLawSpectralModel,
    FoVBackgroundModel,
    GaussianSpatialModel,
    Models,
    PiecewiseNormSpectralModel,
    PowerLawSpectralModel,
    SkyModel,
    TemplateNPredModel,
    TemplateSpatialModel,
)
from gammapy.utils import parallel
from gammapy.utils.testing import requires_data, requires_dependency


@pytest.fixture()
def fermi_datasets():
    filename = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_datasets.yaml"
    filename_models = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_models.yaml"
    return Datasets.read(filename=filename, filename_models=filename_models)


# TODO: use pre-generated data instead
def simulate_spectrum_dataset(model, random_state=0):
    energy_edges = np.logspace(-0.5, 1.5, 21) * u.TeV
    energy_axis = MapAxis.from_edges(energy_edges, interp="log", name="energy")
    energy_axis_true = energy_axis.copy(name="energy_true")

    aeff =
EffectiveAreaTable2D.from_parametrization(energy_axis_true=energy_axis_true) bkg_model = SkyModel( spectral_model=PowerLawSpectralModel( index=2.5, amplitude="1e-12 cm-2 s-1 TeV-1" ), name="background", ) bkg_model.spectral_model.amplitude.frozen = True bkg_model.spectral_model.index.frozen = True geom = RegionGeom.create(region="icrs;circle(0, 0, 0.1)", axes=[energy_axis]) acceptance = RegionNDMap.from_geom(geom=geom, data=1) edisp = EDispKernelMap.from_diagonal_response( energy_axis=energy_axis, energy_axis_true=energy_axis_true, geom=geom, ) geom_true = RegionGeom.create( region="icrs;circle(0, 0, 0.1)", axes=[energy_axis_true] ) exposure = make_map_exposure_true_energy( pointing=SkyCoord("0d", "0d"), aeff=aeff, livetime=100 * u.h, geom=geom_true ) mask_safe = RegionNDMap.from_geom(geom=geom, dtype=bool) mask_safe.data += True acceptance_off = RegionNDMap.from_geom(geom=geom, data=5) dataset = SpectrumDatasetOnOff( name="test_onoff", exposure=exposure, acceptance=acceptance, acceptance_off=acceptance_off, edisp=edisp, mask_safe=mask_safe, ) dataset.models = bkg_model bkg_npred = dataset.npred_signal() dataset.models = model dataset.fake( random_state=random_state, npred_background=bkg_npred, ) return dataset def create_fpe(model): model = SkyModel(spectral_model=model, name="source") dataset = simulate_spectrum_dataset(model) energy_edges = [0.1, 1, 10, 100] * u.TeV dataset.models = model fpe = FluxPointsEstimator( energy_edges=energy_edges, source="source", selection_optional="all", fit=Fit(backend="minuit", optimize_opts=dict(tol=0.2, strategy=1)), ) fpe.norm.scan_n_values = 11 datasets = [dataset] return datasets, fpe def simulate_map_dataset(random_state=0, name=None): irfs = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) skydir = SkyCoord("0 deg", "0 deg", frame="galactic") pointing = FixedPointingInfo(fixed_icrs=skydir.icrs) energy_edges = np.logspace(-1, 2, 15) * u.TeV energy_axis = MapAxis.from_edges(edges=energy_edges, name="energy", interp="log") geom = WcsGeom.create( skydir=skydir, width=(4, 4), binsz=0.1, axes=[energy_axis], frame="galactic" ) gauss = GaussianSpatialModel( lon_0="0 deg", lat_0="0 deg", sigma="0.4 deg", frame="galactic" ) pwl = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") skymodel = SkyModel(spatial_model=gauss, spectral_model=pwl, name="source") obs = Observation.create( pointing=pointing, livetime=1 * u.h, irfs=irfs, location=EarthLocation(lon="-70d18m58.84s", lat="-24d41m0.34s", height="2000m"), ) empty = MapDataset.create(geom, name=name) maker = MapDatasetMaker(selection=["exposure", "background", "psf", "edisp"]) dataset = maker.run(empty, obs) bkg_model = FoVBackgroundModel(dataset_name=dataset.name) dataset.models = [bkg_model, skymodel] dataset.fake(random_state=random_state) return dataset @pytest.fixture(scope="session") def fpe_map_pwl(): dataset_1 = simulate_map_dataset(name="test-map-pwl") dataset_2 = dataset_1.copy(name="test-map-pwl-2") dataset_2.models = dataset_1.models dataset_2.mask_safe = RegionNDMap.from_geom(dataset_2.counts.geom, dtype=bool) energy_edges = [0.1, 1, 10, 100] * u.TeV datasets = [dataset_1, dataset_2] fpe = FluxPointsEstimator( energy_edges=energy_edges, source="source", selection_optional="all", ) fpe.norm.scan_n_values = 3 return datasets, fpe @pytest.fixture(scope="session") def fpe_map_pwl_ray(): """duplicate of fpe_map_pwl to avoid fails due to execution order""" dataset_1 = simulate_map_dataset(name="test-map-pwl") dataset_2 = 
dataset_1.copy(name="test-map-pwl-2") dataset_2.models = dataset_1.models dataset_2.mask_safe = RegionNDMap.from_geom(dataset_2.counts.geom, dtype=bool) energy_edges = [0.1, 1, 10, 100] * u.TeV datasets = [dataset_1, dataset_2] fpe = FluxPointsEstimator( energy_edges=energy_edges, source="source", selection_optional="all", ) fpe.norm.scan_n_values = 3 return datasets, fpe @pytest.fixture(scope="session") def fpe_map_pwl_reoptimize(): dataset = simulate_map_dataset() energy_edges = [1, 10] * u.TeV dataset.models.parameters["lon_0"].frozen = True dataset.models.parameters["lat_0"].frozen = True dataset.models.parameters["sigma"].frozen = True datasets = [dataset] fpe = FluxPointsEstimator( energy_edges=energy_edges, reoptimize=True, source="source", ) fpe.norm.scan_values = [0.8, 1, 1.2] return datasets, fpe @pytest.fixture(scope="session") def fpe_pwl(): return create_fpe(PowerLawSpectralModel()) @pytest.fixture(scope="session") def fpe_ecpl(): return create_fpe(ExpCutoffPowerLawSpectralModel(lambda_="1 TeV-1")) def test_str(fpe_pwl): datasets, fpe = fpe_pwl assert "FluxPointsEstimator" in str(fpe) def test_run_pwl(fpe_pwl, tmpdir): datasets, fpe = fpe_pwl fp = fpe.run(datasets) table = fp.to_table() actual = table["e_min"].data assert_allclose(actual, [0.316228, 1.0, 10.0], rtol=1e-5) actual = table["e_max"].data assert_allclose(actual, [1.0, 10.0, 31.622777], rtol=1e-5) actual = table["e_ref"].data assert_allclose(actual, [0.562341, 3.162278, 17.782794], rtol=1e-3) actual = table["ref_flux"].quantity desired = [2.162278e-12, 9.000000e-13, 6.837722e-14] * u.Unit("1 / (cm2 s)") assert_allclose(actual, desired, rtol=1e-3) actual = table["ref_dnde"].quantity desired = [3.162278e-12, 1.000000e-13, 3.162278e-15] * u.Unit("1 / (cm2 s TeV)") assert_allclose(actual, desired, rtol=1e-3) actual = table["ref_eflux"].quantity desired = [1.151293e-12, 2.302585e-12, 1.151293e-12] * u.Unit("TeV / (cm2 s)") assert_allclose(actual, desired, rtol=1e-3) actual = table["norm"].data assert_allclose(actual, [1.081434, 0.91077, 0.922176], rtol=1e-3) actual = table["norm_err"].data assert_allclose(actual, [0.066374, 0.061025, 0.179729], rtol=1e-2) actual = table["norm_errn"].data assert_allclose(actual, [0.065803, 0.060403, 0.171376], rtol=1e-2) actual = table["norm_errp"].data assert_allclose(actual, [0.06695, 0.061652, 0.18839], rtol=1e-2) actual = table["counts"].data.squeeze() assert_allclose(actual, [1490, 748, 43]) actual = table["norm_ul"].data assert_allclose(actual, [1.216227, 1.035472, 1.316878], rtol=1e-2) actual = table["sqrt_ts"].data assert_allclose(actual, [18.568429, 18.054651, 7.057121], rtol=1e-2) actual = table["ts"].data assert_allclose(actual, [344.7866, 325.9704, 49.8029], rtol=1e-2) actual = table["stat"].data assert_allclose(actual, [2.76495, 13.11912, 3.70128], rtol=1e-2) actual = table["stat_null"].data assert_allclose(actual, [347.55159, 339.08952, 53.50424], rtol=1e-2) actual = table["norm_scan"][0][[0, 5, -1]] assert_allclose(actual, [0.2, 1.0, 5.0]) actual = table["stat_scan"][0][[0, 5, -1]] assert_allclose(actual, [220.369, 4.301, 1881.626], rtol=1e-2) actual = table["npred"].data assert_allclose(actual, [[1492.966], [749.459], [43.105]], rtol=1e-3) actual = table["npred_excess"].data assert_allclose(actual, [[660.5625], [421.5402], [34.3258]], rtol=1e-3) actual = table.meta["UL_CONF"] assert_allclose(actual, 0.9544997) npred_excess_err = fp.npred_excess_err.data.squeeze() assert_allclose(npred_excess_err, [40.541334, 28.244024, 6.690005], rtol=1e-3) npred_excess_errp = 
fp.npred_excess_errp.data.squeeze() assert_allclose(npred_excess_errp, [40.838806, 28.549508, 7.013377], rtol=1e-3) npred_excess_errn = fp.npred_excess_errn.data.squeeze() assert_allclose(npred_excess_errn, [40.247313, 27.932033, 6.378465], rtol=1e-3) npred_excess_ul = fp.npred_excess_ul.data.squeeze() assert_allclose(npred_excess_ul, [742.87486, 479.169719, 49.019125], rtol=1e-3) # test GADF I/O fp.write(tmpdir / "test.fits") fp_new = FluxPoints.read(tmpdir / "test.fits") assert fp_new.meta["sed_type_init"] == "likelihood" # test datasets stat fp_dataset = FluxPointsDataset(data=fp, models=fp.reference_model) fp_dataset.stat_type = "chi2" assert_allclose(fp_dataset.stat_sum(), 3.82374, rtol=1e-4) fp_dataset.stat_type = "profile" assert_allclose(fp_dataset.stat_sum(), 3.790053, rtol=1e-4) fp_dataset.stat_type = "distrib" assert_allclose(fp_dataset.stat_sum(), 3.783325, rtol=1e-4) def test_run_ecpl(fpe_ecpl, tmpdir): datasets, fpe = fpe_ecpl fp = fpe.run(datasets) table = fp.to_table() actual = table["ref_flux"].quantity desired = [9.024362e-13, 1.781341e-13, 1.260298e-18] * u.Unit("1 / (cm2 s)") assert_allclose(actual, desired, rtol=1e-3) actual = table["ref_dnde"].quantity desired = [1.351382e-12, 7.527318e-15, 2.523659e-22] * u.Unit("1 / (cm2 s TeV)") assert_allclose(actual, desired, rtol=1e-3) actual = table["ref_eflux"].quantity desired = [4.770557e-13, 2.787695e-13, 1.371963e-17] * u.Unit("TeV / (cm2 s)") assert_allclose(actual, desired, rtol=1e-3) actual = table["norm"].data assert_allclose(actual, [1.001683, 1.061821, 1.237512e03], rtol=1e-3) actual = table["norm_err"].data assert_allclose(actual, [1.386091e-01, 2.394241e-01, 3.259756e03], rtol=1e-2) actual = table["norm_errn"].data assert_allclose(actual, [1.374962e-01, 2.361246e-01, 2.888978e03], rtol=1e-2) actual = table["norm_errp"].data assert_allclose(actual, [1.397358e-01, 2.428481e-01, 3.716550e03], rtol=1e-2) actual = table["norm_ul"].data assert_allclose(actual, [1.283433e00, 1.555117e00, 9.698645e03], rtol=1e-2) actual = table["sqrt_ts"].data assert_allclose(actual, [7.678454, 4.735691, 0.399243], rtol=1e-2) # test GADF I/O fp.write(tmpdir / "test.fits") fp_new = FluxPoints.read(tmpdir / "test.fits") assert fp_new.meta["sed_type_init"] == "likelihood" @requires_data() def test_run_map_pwl(fpe_map_pwl, tmpdir): datasets, fpe = fpe_map_pwl fp = fpe.run(datasets) table = fp.to_table() actual = table["e_min"].data assert_allclose(actual, [0.1, 1.178769, 8.48342], rtol=1e-5) actual = table["e_max"].data assert_allclose(actual, [1.178769, 8.483429, 100.0], rtol=1e-5) actual = table["e_ref"].data assert_allclose(actual, [0.343332, 3.162278, 29.126327], rtol=1e-5) actual = table["norm"].data assert_allclose(actual, [0.974726, 0.96342, 0.994251], rtol=1e-2) actual = table["norm_err"].data assert_allclose(actual, [0.067637, 0.052022, 0.087059], rtol=3e-2) actual = table["counts"].data assert_allclose(actual, [[44611, 0], [1923, 0], [282, 0]]) actual = table["norm_ul"].data assert_allclose(actual, [1.111852, 1.07004, 1.17829], rtol=1e-2) actual = table["sqrt_ts"].data assert_allclose(actual, [16.681221, 28.408676, 21.91912], rtol=1e-2) actual = table["norm_scan"][0] assert_allclose(actual, [0.2, 1.0, 5]) actual = table["stat_scan"][0] - table["stat"][0] assert_allclose(actual, [1.628398e02, 1.452456e-01, 2.008018e03], rtol=1e-2) # test GADF I/O fp.write(tmpdir / "test.fits") fp_new = FluxPoints.read(tmpdir / "test.fits") assert fp_new.meta["sed_type_init"] == "likelihood" @requires_data() def 
test_run_map_pwl_reoptimize(fpe_map_pwl_reoptimize): datasets, fpe = fpe_map_pwl_reoptimize fpe = fpe.copy() fpe.selection_optional = ["scan"] fp = fpe.run(datasets) table = fp.to_table() actual = table["norm"].data assert_allclose(actual, 0.962368, rtol=1e-2) actual = table["norm_err"].data assert_allclose(actual, 0.053878, rtol=1e-2) actual = table["sqrt_ts"].data assert_allclose(actual, 25.196585, rtol=1e-2) actual = table["norm_scan"][0] assert_allclose(actual, [0.8, 1, 1.2]) actual = table["stat_scan"][0] - table["stat"][0] assert_allclose(actual, [9.788123, 0.486066, 17.603708], rtol=1e-2) def test_run_no_edip(fpe_pwl, tmpdir): datasets, fpe = fpe_pwl datasets = datasets.copy() datasets[0].models["source"].apply_irf["edisp"] = False fp = fpe.run(datasets) table = fp.to_table() actual = table["norm"].data assert_allclose(actual, [1.081434, 0.91077, 0.922176], rtol=1e-3) datasets[0].edisp = None fp = fpe.run(datasets) table = fp.to_table() actual = table["norm"].data assert_allclose(actual, [1.081434, 0.91077, 0.922176], rtol=1e-3) datasets[0].models["source"].apply_irf["edisp"] = True fp = fpe.run(datasets) table = fp.to_table() actual = table["norm"].data assert_allclose(actual, [1.081434, 0.91077, 0.922176], rtol=1e-3) @requires_dependency("iminuit") @requires_data() def test_run_template_npred(fpe_map_pwl, tmpdir): datasets, fpe = fpe_map_pwl dataset = datasets[0] models = Models(dataset.models) model = TemplateNPredModel(dataset.background, datasets_names=[dataset.name]) models.append(model) dataset.models = models dataset.background.data = 0 fp = fpe.run(dataset) table = fp.to_table() actual = table["norm"].data assert_allclose(actual, [0.974726, 0.96342, 0.994251], rtol=1e-2) fpe.sum_over_energy_groups = True fp = fpe.run(dataset) table = fp.to_table() actual = table["norm"].data assert_allclose(actual, [0.955512, 0.965328, 0.995237], rtol=1e-2) @requires_data() def test_reoptimize_no_free_parameters(fpe_pwl, caplog): datasets, fpe = fpe_pwl fpe.reoptimize = True with pytest.raises(ValueError, match="No free parameters for fitting"): fpe.run(datasets) fpe.reoptimize = False @requires_data() def test_flux_points_estimator_no_norm_scan(fpe_pwl, tmpdir): datasets, fpe = fpe_pwl fpe.selection_optional = None fp = fpe.run(datasets) assert_allclose(fpe.fit.optimize_opts["tol"], 0.2) assert fp.sed_type_init == "likelihood" assert "stat_scan" not in fp._data # test GADF I/O fp.write(tmpdir / "test.fits") fp_new = FluxPoints.read(tmpdir / "test.fits") assert fp_new.meta["sed_type_init"] == "likelihood" def test_no_likelihood_contribution(): dataset = simulate_spectrum_dataset( SkyModel(spectral_model=PowerLawSpectralModel(), name="source") ) dataset_2 = dataset.slice_by_idx(slices={"energy": slice(0, 5)}) dataset.mask_safe = RegionNDMap.from_geom(dataset.counts.geom, dtype=bool) fpe = FluxPointsEstimator(energy_edges=[1.0, 3.0, 10.0] * u.TeV, source="source") table = fpe.run([dataset, dataset_2]).to_table() assert np.isnan(table["norm"]).all() assert np.isnan(table["norm_err"]).all() assert_allclose(table["counts"], 0) def test_mask_shape(): axis = MapAxis.from_edges([1.0, 3.0, 10.0], unit="TeV", interp="log", name="energy") geom_1 = WcsGeom.create(binsz=1, width=3, axes=[axis]) geom_2 = WcsGeom.create(binsz=1, width=5, axes=[axis]) dataset_1 = MapDataset.create(geom_1) dataset_2 = MapDataset.create(geom_2) dataset_1.gti = None dataset_2.gti = None dataset_1.psf = None dataset_2.psf = None dataset_1.edisp = None dataset_2.edisp = None dataset_2.mask_safe = None model = SkyModel( 
spectral_model=PowerLawSpectralModel(), spatial_model=GaussianSpatialModel(), name="source", ) dataset_1.models = model dataset_2.models = model fpe = FluxPointsEstimator(energy_edges=[1, 10] * u.TeV, source="source") fp = fpe.run([dataset_2, dataset_1]) table = fp.to_table() assert_allclose(table["counts"], 0) assert_allclose(table["npred"], 0) def test_run_pwl_parameter_range(fpe_pwl): pl = PowerLawSpectralModel(amplitude="1e-16 cm-2s-1TeV-1") datasets, fpe = create_fpe(pl) fp = fpe.run(datasets) table_no_bounds = fp.to_table() fpe.norm.min = 0 fpe.norm.max = 1e4 fp = fpe.run(datasets) table_with_bounds = fp.to_table() actual = table_with_bounds["norm"].data assert_allclose(actual, [0.0, 0.0, 0.0], atol=1e-2) actual = table_with_bounds["norm_errp"].data assert_allclose(actual, [212.593368, 298.383045, 449.951747], rtol=1e-2) actual = table_with_bounds["norm_ul"].data assert_allclose(actual, [640.067576, 722.571371, 1414.22209], rtol=1e-2) actual = table_with_bounds["sqrt_ts"].data assert_allclose(actual, [0.0, 0.0, 0.0], atol=1e-2) actual = table_no_bounds["norm"].data assert_allclose(actual, [-511.76675, -155.75408, -853.547117], rtol=1e-3) actual = table_no_bounds["norm_err"].data assert_allclose(actual, [504.601499, 416.69248, 851.223077], rtol=1e-2) actual = table_no_bounds["norm_ul"].data assert_allclose(actual, [514.957128, 707.888477, 1167.105962], rtol=1e-2) actual = table_no_bounds["sqrt_ts"].data assert_allclose(actual, [-1.006081, -0.364848, -0.927819], rtol=1e-2) def test_flux_points_estimator_small_edges(): pl = PowerLawSpectralModel(amplitude="1e-11 cm-2s-1TeV-1") datasets, fpe = create_fpe(pl) fpe.energy_edges = datasets[0].counts.geom.axes["energy"].upsample(3).edges[1:4] fpe.selection_optional = [] fp = fpe.run(datasets) assert_allclose(fp.ts.data[0, 0, 0], 2156.96959291) assert np.isnan(fp.ts.data[1, 0, 0]) assert np.isnan(fp.npred.data[1, 0, 0]) def test_flux_points_recompute_ul(fpe_pwl): datasets, fpe = fpe_pwl fpe.selection_optional = ["all"] fp = fpe.run(datasets) assert_allclose( fp.flux_ul.data, [[[2.629819e-12]], [[9.319243e-13]], [[9.004449e-14]]], rtol=1e-3, ) fp1 = fp.recompute_ul(n_sigma_ul=4) assert_allclose( fp1.flux_ul.data, [[[2.92877891e-12]], [[1.04993236e-12]], [[1.22089744e-13]]], rtol=1e-3, ) assert fp1.meta["n_sigma_ul"] == 4 # check that it returns a sensible value fp2 = fp.recompute_ul(n_sigma_ul=2) assert_allclose(fp2.flux_ul.data, fp.flux_ul.data, rtol=1e-2) def test_flux_points_parallel_multiprocessing(fpe_pwl): datasets, fpe = fpe_pwl fpe.selection_optional = ["all"] fpe.n_jobs = 2 assert fpe.n_jobs == 2 fp = fpe.run(datasets) assert_allclose( fp.flux_ul.data, [[[2.629819e-12]], [[9.319243e-13]], [[9.004449e-14]]], rtol=1e-3, ) def test_global_n_jobs_default_handling(): fpe = FluxPointsEstimator(energy_edges=[1, 3, 10] * u.TeV) assert fpe.n_jobs == 1 parallel.N_JOBS_DEFAULT = 2 assert fpe.n_jobs == 2 fpe.n_jobs = 5 assert fpe.n_jobs == 5 fpe.n_jobs = None assert fpe.n_jobs == 2 assert fpe._n_jobs is None parallel.N_JOBS_DEFAULT = 1 assert fpe.n_jobs == 1 @requires_dependency("ray") def test_flux_points_parallel_ray(fpe_pwl): datasets, fpe = fpe_pwl fpe.selection_optional = ["all"] fpe.parallel_backend = "ray" fpe.n_jobs = 2 fp = fpe.run(datasets) assert_allclose( fp.flux_ul.data, [[[2.629819e-12]], [[9.319243e-13]], [[9.004449e-14]]], rtol=1e-3, ) @requires_dependency("ray") def test_flux_points_parallel_ray_actor_spectrum(fpe_pwl): from gammapy.datasets.actors import DatasetsActor datasets, fpe = fpe_pwl with pytest.raises(TypeError): 
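        # Constructing a DatasetsActor from the 1D spectrum datasets of
        # `fpe_pwl` must raise a TypeError: only map datasets are supported
        # as ray actors (cf. the map-based actor test below, which runs
        # successfully).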
DatasetsActor(datasets) @requires_data() @requires_dependency("ray") def test_flux_points_parallel_ray_actor_map(fpe_map_pwl_ray): from gammapy.datasets.actors import DatasetsActor datasets, fpe = fpe_map_pwl_ray actors = DatasetsActor(datasets) fp = fpe.run(actors) table = fp.to_table() actual = table["e_min"].data assert_allclose(actual, [0.1, 1.178769, 8.48342], rtol=1e-5) actual = table["e_max"].data assert_allclose(actual, [1.178769, 8.483429, 100.0], rtol=1e-5) actual = table["e_ref"].data assert_allclose(actual, [0.343332, 3.162278, 29.126327], rtol=1e-5) actual = table["norm"].data assert_allclose(actual, [0.974726, 0.96342, 0.994251], rtol=1e-2) actual = table["norm_err"].data assert_allclose(actual, [0.067637, 0.052022, 0.087059], rtol=3e-2) actual = table["counts"].data assert_allclose(actual, [[44611, 0], [1923, 0], [282, 0]]) actual = table["norm_ul"].data assert_allclose(actual, [1.111852, 1.07004, 1.17829], rtol=1e-2) actual = table["sqrt_ts"].data assert_allclose(actual, [16.681221, 28.408676, 21.91912], rtol=1e-2) actual = table["norm_scan"][0] assert_allclose(actual, [0.2, 1.0, 5]) actual = table["stat_scan"][0] - table["stat"][0] assert_allclose(actual, [1.628398e02, 1.452456e-01, 2.008018e03], rtol=1e-2) def test_fpe_non_aligned_energy_axes(): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10) geom_1 = RegionGeom.create("icrs;circle(0, 0, 0.1)", axes=[energy_axis]) dataset_1 = SpectrumDataset.create(geom=geom_1) energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=7) geom_2 = RegionGeom.create("icrs;circle(0, 0, 0.1)", axes=[energy_axis]) dataset_2 = SpectrumDataset.create(geom=geom_2) fpe = FluxPointsEstimator(energy_edges=[1, 3, 10] * u.TeV) with pytest.raises(ValueError, match="must have aligned energy axes"): fpe.run(datasets=[dataset_1, dataset_2]) def test_fpe_non_uniform_datasets(): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10) geom_1 = RegionGeom.create("icrs;circle(0, 0, 0.1)", axes=[energy_axis]) dataset_1 = SpectrumDataset.create( geom=geom_1, meta_table=Table({"TELESCOP": ["CTA"]}) ) energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10) geom_2 = RegionGeom.create("icrs;circle(0, 0, 0.1)", axes=[energy_axis]) dataset_2 = SpectrumDataset.create( geom=geom_2, meta_table=Table({"TELESCOP": ["CTB"]}) ) fpe = FluxPointsEstimator(energy_edges=[1, 3, 10] * u.TeV) with pytest.raises(ValueError, match="same value of the 'TELESCOP' meta keyword"): fpe.run(datasets=[dataset_1, dataset_2]) @requires_data() def test_flux_points_estimator_norm_spectral_model(fermi_datasets): energy_edges = [10, 30, 100, 300, 1000] * u.GeV model_ref = fermi_datasets.models["Crab Nebula"] estimator = FluxPointsEstimator( energy_edges=energy_edges, source="Crab Nebula", selection_optional=[], reoptimize=True, ) flux_points = estimator.run(fermi_datasets[0]) flux_points_dataset = FluxPointsDataset(data=flux_points, models=model_ref) flux_pred_ref = flux_points_dataset.flux_pred() models = Models([model_ref]) geom = fermi_datasets[0].exposure.geom.to_image() energy_axis = MapAxis.from_energy_bounds( 3 * u.GeV, 1.7 * u.TeV, nbin=30, per_decade=True, name="energy_true" ) geom = geom.to_cube([energy_axis]) model = models.to_template_sky_model(geom, name="test") fermi_datasets.models = [fermi_datasets[0].background_model, model] estimator = FluxPointsEstimator( energy_edges=energy_edges, source="test", selection_optional=[], reoptimize=True ) flux_points = estimator.run(fermi_datasets[0]) flux_points_dataset = 
    FluxPointsDataset(data=flux_points, models=model)
    flux_pred = flux_points_dataset.flux_pred()
    assert_allclose(flux_pred, flux_pred_ref, rtol=2e-4)

    # test model 2d
    norms = (
        model.spatial_model.map.data.sum(axis=(1, 2))
        / model.spatial_model.map.data.sum()
    )
    model.spatial_model = TemplateSpatialModel(
        model.spatial_model.map.reduce_over_axes(), normalize=False
    )
    model.spectral_model = PiecewiseNormSpectralModel(geom.axes[0].center, norms)
    flux_points_dataset = FluxPointsDataset(data=flux_points, models=model)
    flux_pred = flux_points_dataset.flux_pred()
    assert_allclose(flux_pred, flux_pred_ref, rtol=2e-4)


@requires_data()
def test_fpe_diff_lengths():
    dataset = SpectrumDatasetOnOff.read(
        "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits"
    )
    dataset1 = SpectrumDatasetOnOff.read(
        "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23559.fits"
    )
    dataset.meta_table = Table(names=["NAME", "TELESCOP"], data=[["23523"], ["hess"]])
    dataset1.meta_table = Table(names=["NAME", "TELESCOP"], data=[["23559"], ["hess"]])

    dataset2 = Datasets([dataset, dataset1]).stack_reduce(name="dataset2")

    dataset3 = dataset1.copy()
    dataset3.meta_table = None

    pwl = PowerLawSpectralModel()
    datasets = Datasets([dataset1, dataset2, dataset3])
    datasets.models = SkyModel(spectral_model=pwl, name="test")

    energy_edges = [1, 2, 4, 10] * u.TeV
    fpe = FluxPointsEstimator(energy_edges=energy_edges, source="test")
    fp = fpe.run(datasets)
    assert_allclose(
        fp.dnde.data,
        [[[2.034323e-11]], [[3.39049716e-12]], [[2.96231326e-13]]],
        rtol=1e-3,
    )

    dataset4 = dataset1.copy()
    dataset4.meta_table = Table(names=["NAME", "TELESCOP"], data=[["23523"], ["not"]])
    datasets = Datasets([dataset1, dataset2, dataset3, dataset4])
    with pytest.raises(ValueError):
        fp = fpe.run(datasets)


# ---- gammapy-1.3/gammapy/estimators/points/tests/test_sensitivity.py ----
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
from numpy.testing import assert_allclose
from gammapy.datasets import SpectrumDataset, SpectrumDatasetOnOff
from gammapy.estimators import FluxPoints, SensitivityEstimator
from gammapy.irf import EDispKernelMap
from gammapy.maps import MapAxis, RegionNDMap
from gammapy.modeling.models import PowerLawSpectralModel
from gammapy.utils.deprecation import GammapyDeprecationWarning


@pytest.fixture()
def spectrum_dataset():
    e_true = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=20, name="energy_true")
    e_reco = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=4)

    background = RegionNDMap.create(region="icrs;circle(0, 0, 0.1)", axes=[e_reco])
    background.data += 3600
    background.data[0] *= 1e3
    background.data[-1] *= 1e-3

    edisp = EDispKernelMap.from_diagonal_response(
        energy_axis_true=e_true, energy_axis=e_reco, geom=background.geom
    )
    exposure = RegionNDMap.create(
        region="icrs;circle(0, 0, 0.1)", axes=[e_true], unit="m2 h", data=1e6
    )
    return SpectrumDataset(
        name="test", exposure=exposure, edisp=edisp, background=background
    )


def test_cta_sensitivity_estimator(spectrum_dataset, caplog):
    geom = spectrum_dataset.background.geom

    dataset_on_off = SpectrumDatasetOnOff.from_spectrum_dataset(
        dataset=spectrum_dataset,
        acceptance=RegionNDMap.from_geom(geom=geom, data=1),
        acceptance_off=RegionNDMap.from_geom(geom=geom, data=5),
    )

    with pytest.warns(GammapyDeprecationWarning):
        SensitivityEstimator(
            gamma_min=25, bkg_syst_fraction=0.075, spectrum=PowerLawSpectralModel()
        )

    sens = SensitivityEstimator(gamma_min=25, bkg_syst_fraction=0.075)
    table = sens.run(dataset_on_off)

    assert len(table) == 4
    assert table.colnames == [
        "e_ref",
        "e_min",
        "e_max",
        "e2dnde",
        "excess",
        "background",
        "criterion",
    ]
    assert table["e_ref"].unit == "TeV"
    assert table["e2dnde"].unit == "erg / (cm2 s)"

    row = table[0]
    assert_allclose(row["e_ref"], 1.33352, rtol=1e-3)
    assert_allclose(row["e2dnde"], 2.74559e-08, rtol=1e-3)
    assert_allclose(row["excess"], 270000, rtol=1e-3)
    assert_allclose(row["background"], 3.6e06, rtol=1e-3)
    assert row["criterion"] == "bkg"

    row = table[1]
    assert_allclose(row["e_ref"], 2.37137, rtol=1e-3)
    assert_allclose(row["e2dnde"], 6.04795e-11, rtol=1e-3)
    assert_allclose(row["excess"], 334.454, rtol=1e-3)
    assert_allclose(row["background"], 3600, rtol=1e-3)
    assert row["criterion"] == "significance"

    row = table[3]
    assert_allclose(row["e_ref"], 7.49894, rtol=1e-3)
    assert_allclose(row["e2dnde"], 1.42959e-11, rtol=1e-3)
    assert_allclose(row["excess"], 25, rtol=1e-3)
    assert_allclose(row["background"], 3.6, rtol=1e-3)
    assert row["criterion"] == "gamma"


def test_integral_estimation(spectrum_dataset, caplog):
    dataset = spectrum_dataset.to_image()
    geom = dataset.background.geom

    dataset_on_off = SpectrumDatasetOnOff.from_spectrum_dataset(
        dataset=dataset,
        acceptance=RegionNDMap.from_geom(geom=geom, data=1),
        acceptance_off=RegionNDMap.from_geom(geom=geom, data=5),
    )

    sens = SensitivityEstimator(gamma_min=25, bkg_syst_fraction=0.075)
    table = sens.run(dataset_on_off)
    flux_points = FluxPoints.from_table(
        table, sed_type="e2dnde", reference_model=sens.spectral_model
    )

    assert_allclose(table["excess"].data.squeeze(), 270540, rtol=1e-3)
    assert_allclose(flux_points.flux.data.squeeze(), 7.52e-9, rtol=1e-3)


# ---- gammapy-1.3/gammapy/estimators/profile.py ----
# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""Tools to create profiles (i.e. 1D "slices" from 2D images)."""
import numpy as np
import scipy.ndimage
from astropy import units as u
from astropy.convolution import Box1DKernel, Gaussian1DKernel
from astropy.coordinates import Angle
from astropy.table import Table
import matplotlib.pyplot as plt
from gammapy.maps.axes import UNIT_STRING_FORMAT
from .core import Estimator

__all__ = ["ImageProfile", "ImageProfileEstimator"]


# TODO: implement measuring profile along arbitrary directions
# TODO: think better about error handling. e.g. MC based methods
class ImageProfileEstimator(Estimator):
    """Estimate profile from image.

    Parameters
    ----------
    x_edges : `~astropy.coordinates.Angle`, optional
        Coordinate edges to define a custom measurement grid.
    method : {'sum', 'mean'}
        Compute sum or mean within profile bins. Default is 'sum'.
    axis : {'lon', 'lat', 'radial'}
        Along which axis to estimate the profile. Default is 'lon'.
    center : `~astropy.coordinates.SkyCoord`, optional
        Center coordinate for the radial profile option.
Examples -------- This example shows how to compute a counts profile for the Fermi galactic center region:: import matplotlib.pyplot as plt from gammapy.estimators import ImageProfileEstimator from gammapy.maps import Map from astropy import units as u # load example data filename = '$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts.fits.gz' fermi_cts = Map.read(filename) # set up profile estimator and run p = ImageProfileEstimator(axis='lon', method='sum') profile = p.run(fermi_cts) # smooth profile and plot smoothed = profile.smooth(kernel='gauss') smoothed.peek() plt.show() """ tag = "ImageProfileEstimator" def __init__(self, x_edges=None, method="sum", axis="lon", center=None): if method not in ["sum", "mean"]: raise ValueError("Not a valid method, choose either 'sum' or 'mean'") if axis not in ["lon", "lat", "radial"]: raise ValueError("Not a valid axis, choose either 'lon', 'lat' or 'radial'") if axis == "radial" and center is None: raise ValueError("Please provide center coordinate for radial profiles") self._x_edges = x_edges self.parameters = {"method": method, "axis": axis, "center": center} def _get_x_edges(self, image): if self._x_edges is not None: return self._x_edges p = self.parameters coordinates = image.geom.get_coord(mode="edges").skycoord if p["axis"] == "lat": x_edges = coordinates[:, 0].data.lat elif p["axis"] == "lon": lon = coordinates[0, :].data.lon x_edges = lon.wrap_at("180d") elif p["axis"] == "radial": rad_step = image.geom.pixel_scales.mean() corners = [0, 0, -1, -1], [0, -1, 0, -1] rad_max = coordinates[corners].separation(p["center"]).max() x_edges = Angle(np.arange(0, rad_max.deg, rad_step.deg), unit="deg") return x_edges def _estimate_profile(self, image, image_err, mask): p = self.parameters labels = self._label_image(image, mask) profile_err = None index = np.arange(1, len(self._get_x_edges(image))) if p["method"] == "sum": profile = scipy.ndimage.sum(image.data, labels.data, index) if image.unit.is_equivalent("counts"): profile_err = np.sqrt(profile) elif image_err: # gaussian error propagation err_sum = scipy.ndimage.sum(image_err.data**2, labels.data, index) profile_err = np.sqrt(err_sum) elif p["method"] == "mean": # gaussian error propagation profile = scipy.ndimage.mean(image.data, labels.data, index) if image_err: N = scipy.ndimage.sum(~np.isnan(image_err.data), labels.data, index) err_sum = scipy.ndimage.sum(image_err.data**2, labels.data, index) profile_err = np.sqrt(err_sum) / N return profile, profile_err def _label_image(self, image, mask=None): p = self.parameters coordinates = image.geom.get_coord().skycoord x_edges = self._get_x_edges(image) if p["axis"] == "lon": lon = coordinates.data.lon.wrap_at("180d") data = np.digitize(lon.degree, x_edges.deg) elif p["axis"] == "lat": lat = coordinates.data.lat data = np.digitize(lat.degree, x_edges.deg) elif p["axis"] == "radial": separation = coordinates.separation(p["center"]) data = np.digitize(separation.degree, x_edges.deg) if mask is not None: # assign masked values to background data[mask.data] = 0 return image.copy(data=data) def run(self, image, image_err=None, mask=None): """Run image profile estimator. Parameters ---------- image : `~gammapy.maps.Map` Input image to run profile estimator on. image_err : `~gammapy.maps.Map`, optional Input error image to run profile estimator on. Default is None. mask : `~gammapy.maps.Map` Optional mask to exclude regions from the measurement. Returns ------- profile : `ImageProfile` Result image profile object. 
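Notes ----- If the input image is in units of counts, Poisson errors (``sqrt(counts)``) are derived from the image itself and any ``image_err`` passed in is overwritten.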
""" p = self.parameters if image.unit.is_equivalent("count"): image_err = image.copy(data=np.sqrt(image.data)) profile, profile_err = self._estimate_profile(image, image_err, mask) result = Table() x_edges = self._get_x_edges(image) result["x_min"] = x_edges[:-1] result["x_max"] = x_edges[1:] result["x_ref"] = (x_edges[:-1] + x_edges[1:]) / 2 result["profile"] = profile * image.unit if profile_err is not None: result["profile_err"] = profile_err * image.unit result.meta["PROFILE_TYPE"] = p["axis"] return ImageProfile(result) class ImageProfile: """Image profile class. The image profile data is stored in `~astropy.table.Table` object, with the following columns: * `x_ref` Coordinate bin center (required). * `x_min` Coordinate bin minimum (optional). * `x_max` Coordinate bin maximum (optional). * `profile` Image profile data (required). * `profile_err` Image profile data error (optional). Parameters ---------- table : `~astropy.table.Table` Table instance with the columns specified as above. """ def __init__(self, table): self.table = table def smooth(self, kernel="box", radius="0.1 deg", **kwargs): r"""Smooth profile with error propagation. Smoothing is described by a convolution: .. math:: x_j = \sum_i x_{(j - i)} h_i Where :math:`h_i` are the coefficients of the convolution kernel. The corresponding error on :math:`x_j` is then estimated using Gaussian error propagation, neglecting correlations between the individual :math:`x_{(j - i)}`: .. math:: \Delta x_j = \sqrt{\sum_i \Delta x^{2}_{(j - i)} h^{2}_i} Parameters ---------- kernel : {'gauss', 'box'} Kernel shape. Default is "box". radius : `~astropy.units.Quantity`, str or float Smoothing width given as quantity or float. If a float is given it is interpreted as smoothing width in pixels. If an (angular) quantity is given it is converted to pixels using `xref[1] - x_ref[0]`. Default is "0.1 deg". kwargs : dict, optional Keyword arguments passed to `~scipy.ndimage.uniform_filter` ('box') and `~scipy.ndimage.gaussian_filter` ('gauss'). Returns ------- profile : `ImageProfile` Smoothed image profile. """ table = self.table.copy() profile = table["profile"] radius = u.Quantity(radius) radius = np.abs(radius / np.diff(self.x_ref))[0] width = 2 * radius.value + 1 if kernel == "box": smoothed = scipy.ndimage.uniform_filter( profile.astype("float"), width, **kwargs ) # renormalize data if table["profile"].unit.is_equivalent("count"): smoothed *= int(width) smoothed_err = np.sqrt(smoothed) elif "profile_err" in table.colnames: profile_err = table["profile_err"] # use gaussian error propagation box = Box1DKernel(width) err_sum = scipy.ndimage.convolve(profile_err**2, box.array**2) smoothed_err = np.sqrt(err_sum) elif kernel == "gauss": smoothed = scipy.ndimage.gaussian_filter( profile.astype("float"), width, **kwargs ) # use gaussian error propagation if "profile_err" in table.colnames: profile_err = table["profile_err"] gauss = Gaussian1DKernel(width) err_sum = scipy.ndimage.convolve(profile_err**2, gauss.array**2) smoothed_err = np.sqrt(err_sum) else: raise ValueError("Not valid kernel choose either 'box' or 'gauss'") table["profile"] = smoothed * self.table["profile"].unit if "profile_err" in table.colnames: table["profile_err"] = smoothed_err * self.table["profile"].unit return self.__class__(table) def plot(self, ax=None, **kwargs): """Plot image profile. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axes object. Default is None. **kwargs : dict, optional Keyword arguments passed to `~matplotlib.axes.Axes.plot`. 
Returns ------- ax : `~matplotlib.axes.Axes` Axes object. """ if ax is None: ax = plt.gca() y = self.table["profile"].data x = self.x_ref.value ax.plot(x, y, **kwargs) ax.set_xlabel(self.table.meta.get("PROFILE_TYPE", "axis")) ax.set_ylabel("profile") ax.set_xlim(x.max(), x.min()) return ax def plot_err(self, ax=None, **kwargs): """Plot image profile error as band. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axes object. Default is None. **kwargs : dict, optional Keyword arguments passed to `~matplotlib.pyplot.fill_between`. Returns ------- ax : `~matplotlib.axes.Axes` Axes object. """ if ax is None: ax = plt.gca() y = self.table["profile"].data ymin = y - self.table["profile_err"].data ymax = y + self.table["profile_err"].data x = self.x_ref.value # plotting defaults kwargs.setdefault("alpha", 0.5) ax.fill_between(x, ymin, ymax, **kwargs) ax.set_xlabel(f"x [{u.deg.to_string(UNIT_STRING_FORMAT)}]") ax.set_ylabel("profile") return ax @property def x_ref(self): """Reference x coordinates.""" return self.table["x_ref"].quantity @property def x_min(self): """Min. x coordinates.""" return self.table["x_min"].quantity @property def x_max(self): """Max. x coordinates.""" return self.table["x_max"].quantity @property def profile(self): """Image profile quantity.""" return self.table["profile"].quantity @property def profile_err(self): """Image profile error quantity.""" try: return self.table["profile_err"].quantity except KeyError: return None def peek(self, figsize=(8, 4.5), **kwargs): """Show image profile and error. Parameters ---------- figsize : tuple Size of the figure. Default is (8, 4.5). **kwargs : dict, optional Keyword arguments passed to `ImageProfile.plot()`. Returns ------- ax : `~matplotlib.axes.Axes` Axes object. """ fig = plt.figure(figsize=figsize) ax = fig.add_axes([0.1, 0.1, 0.8, 0.8]) ax = self.plot(ax, **kwargs) if "profile_err" in self.table.colnames: ax = self.plot_err(ax, color=kwargs.get("c")) return ax def normalize(self, mode="peak"): """Normalize profile to peak value or integral. Parameters ---------- mode : {'integral', 'peak'} Normalize image profile so that it integrates to unity ('integral') or the maximum value corresponds to one ('peak'). Default is "peak". Returns ------- profile : `ImageProfile` Normalized image profile.
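Examples -------- A minimal sketch, using a made-up cosine profile (any table with the columns described in the class docstring works):: import numpy as np from astropy import units as u from astropy.table import Table from gammapy.estimators import ImageProfile table = Table() table["x_ref"] = np.linspace(-90, 90, 11) * u.deg table["profile"] = np.cos(table["x_ref"].to("rad")) * u.Unit("cm-2 s-1") profile = ImageProfile(table) normalized = profile.normalize(mode="peak") # or mode="integral"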
""" table = self.table.copy() profile = self.table["profile"] if mode == "peak": norm = np.nanmax(profile) elif mode == "integral": norm = np.nansum(profile) else: raise ValueError(f"Invalid normalization mode: {mode!r}") table["profile"] /= norm if "profile_err" in table.colnames: table["profile_err"] /= norm return self.__class__(table) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.204642 gammapy-1.3/gammapy/estimators/tests/0000755000175100001770000000000014721316215017406 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/tests/__init__.py0000644000175100001770000000010014721316200021500 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/tests/test_core.py0000644000175100001770000000030514721316200021737 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from gammapy.estimators import ESTIMATOR_REGISTRY def test_estimator_registry(): assert "Estimator" in str(ESTIMATOR_REGISTRY) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/tests/test_energydependentmorphology.py0000644000175100001770000000701514721316200026314 0ustar00runnerdockerimport pytest from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import SkyCoord from gammapy.datasets import Datasets, MapDataset from gammapy.estimators.energydependentmorphology import ( EnergyDependentMorphologyEstimator, weighted_chi2_parameter, ) from gammapy.modeling.models import ( GaussianSpatialModel, PowerLawSpectralModel, SkyModel, ) @pytest.fixture() def create_model(): source_pos = SkyCoord(5.58, 0.2, unit="deg", frame="galactic") spectral_model = PowerLawSpectralModel( index=2.94, amplitude=9.8e-12 * u.Unit("cm-2 s-1 TeV-1"), reference=1.0 * u.TeV ) spatial_model = GaussianSpatialModel( lon_0=source_pos.l, lat_0=source_pos.b, frame="galactic", sigma=0.2 * u.deg ) model = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="source" ) model.spatial_model.lon_0.frozen = False model.spatial_model.lat_0.frozen = False model.spatial_model.sigma.frozen = False model.spectral_model.amplitude.frozen = False model.spectral_model.index.frozen = True model.spatial_model.lon_0.min = source_pos.galactic.l.deg - 0.8 model.spatial_model.lon_0.max = source_pos.galactic.l.deg + 0.8 model.spatial_model.lat_0.min = source_pos.galactic.b.deg - 0.8 model.spatial_model.lat_0.max = source_pos.galactic.b.deg + 0.8 return model class TestEnergyDependentEstimator: def __init__(self): energy_edges = [1, 3, 5, 20] * u.TeV stacked_dataset = MapDataset.read( "$GAMMAPY_DATA/estimators/mock_DL4/dataset_energy_dependent.fits.gz" ) datasets = Datasets([stacked_dataset]) datasets.models = create_model() estimator = EnergyDependentMorphologyEstimator( energy_edges=energy_edges, source="source" ) self.results = estimator.run(datasets) def test_edep(self): results_edep = self.results["energy_dependence"]["result"] assert_allclose( results_edep["lon_0"], [5.6067162, 5.601791, 5.6180701, 5.5973948] * u.deg, atol=1e-5, ) assert_allclose( results_edep["lat_0"], [0.20237541, 0.21819575, 0.18371523, 0.18106852] * u.deg, atol=1e-5, ) assert_allclose( results_edep["sigma"], [0.21563528, 0.25686477, 0.19736596, 
0.13505605] * u.deg, atol=1e-5, ) assert_allclose( self.results["energy_dependence"]["delta_ts"], 75.61713199794758, atol=1e-5 ) def test_significance(self): results_src = self.results["src_above_bkg"] assert_allclose( results_src["delta_ts"], [998.0521965029693, 712.8735641098574, 289.81556949490914], atol=1e-5, ) assert_allclose( results_src["significance"], [31.27752315246094, 26.34612970747113, 16.54625269423397], atol=1e-5, ) def test_chi2(self): results_edep = self.results["energy_dependence"]["result"] chi2_sigma = weighted_chi2_parameter( results_edep, parameters=["sigma", "lat_0", "lon_0"] ) assert_allclose( chi2_sigma["chi2"], [87.84278516393066, 4.605432972153188, 1.320491077667271], atol=1e-5, ) assert_allclose( chi2_sigma["significance"], [9.107664118611664, 1.6449173252682943, 0.6484028260024965], atol=1e-5, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/tests/test_flux.py0000644000175100001770000001646214721316200022000 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from gammapy.datasets import Datasets, SpectrumDatasetOnOff from gammapy.estimators.flux import FluxEstimator from gammapy.modeling import Parameter from gammapy.modeling.models import ( Models, NaimaSpectralModel, PowerLawNormSpectralModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.testing import requires_data, requires_dependency @pytest.fixture() def fermi_datasets(): from gammapy.datasets import Datasets filename = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_datasets.yaml" filename_models = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_models.yaml" return Datasets.read(filename=filename, filename_models=filename_models) @pytest.fixture(scope="session") def hess_datasets(): datasets = Datasets() pwl = PowerLawSpectralModel(amplitude="3.5e-11 cm-2s-1TeV-1", index=2.7) model = SkyModel(spectral_model=pwl, name="Crab") for obsid in [23523, 23526]: dataset = SpectrumDatasetOnOff.read( f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obsid}.fits" ) dataset.models = model datasets.append(dataset) return datasets @requires_data() def test_flux_estimator_fermi_no_reoptimization(fermi_datasets): norm = Parameter( value=1, name="norm", scan_n_values=5, scan_min=0.5, scan_max=2, interp="log" ) estimator = FluxEstimator( 0, norm=norm, selection_optional="all", reoptimize=False, ) datasets = fermi_datasets.slice_by_energy(energy_min="1 GeV", energy_max="100 GeV") datasets.models = fermi_datasets.models result = estimator.run(datasets) assert_allclose(result["norm"], 0.98949, atol=1e-3) assert_allclose(result["ts"], 25083.75408, rtol=1e-3) assert_allclose(result["norm_err"], 0.01998, atol=1e-3) assert_allclose(result["norm_errn"], 0.0199, atol=1e-3) assert_allclose(result["norm_errp"], 0.0199, atol=1e-3) assert len(result["norm_scan"]) == 5 assert_allclose(result["norm_scan"][0], 0.5) assert_allclose(result["norm_scan"][-1], 2) assert_allclose(result["e_min"], 10 * u.GeV, atol=1e-3) assert_allclose(result["e_max"], 83.255 * u.GeV, atol=1e-3) @requires_data() def test_flux_estimator_fermi_with_reoptimization(fermi_datasets): estimator = FluxEstimator(0, selection_optional=None, reoptimize=True) datasets = fermi_datasets.slice_by_energy(energy_min="1 GeV", energy_max="100 GeV") datasets.models = fermi_datasets.models result = estimator.run(datasets) assert_allclose(result["norm"], 0.989989, 
atol=1e-3) assert_allclose(result["ts"], 18729.368105, rtol=1e-3) assert_allclose(result["norm_err"], 0.01998, atol=1e-3) @requires_data() def test_flux_estimator_1d(hess_datasets): estimator = FluxEstimator( source="Crab", selection_optional=["errn-errp", "ul"], reoptimize=False ) datasets = hess_datasets.slice_by_energy( energy_min=1 * u.TeV, energy_max=10 * u.TeV, ) datasets.models = hess_datasets.models result = estimator.run(datasets) assert_allclose(result["norm"], 1.218139, atol=1e-3) assert_allclose(result["ts"], 527.492959, atol=1e-3) assert_allclose(result["norm_err"], 0.095496, atol=1e-3) assert_allclose(result["norm_errn"], 0.093204, atol=1e-3) assert_allclose(result["norm_errp"], 0.097818, atol=1e-3) assert_allclose(result["norm_ul"], 1.418475, atol=1e-3) assert_allclose(result["e_min"], 1 * u.TeV, atol=1e-3) assert_allclose(result["e_max"], 10 * u.TeV, atol=1e-3) assert_allclose(result["npred"], [93.209263, 93.667283], atol=1e-3) assert_allclose(result["npred_excess"], [86.27813, 88.6715], atol=1e-3) @requires_data() def test_inhomogeneous_datasets(fermi_datasets, hess_datasets): datasets = Datasets() datasets.extend(fermi_datasets) datasets.extend(hess_datasets) datasets = datasets.slice_by_energy( energy_min=1 * u.TeV, energy_max=10 * u.TeV, ) datasets.models = fermi_datasets.models estimator = FluxEstimator( source="Crab Nebula", selection_optional=[], reoptimize=True ) result = estimator.run(datasets) assert_allclose(result["norm"], 1.190622, atol=1e-3) assert_allclose(result["ts"], 612.503013, atol=1e-3) assert_allclose(result["norm_err"], 0.090744, atol=1e-3) assert_allclose(result["e_min"], 0.693145 * u.TeV, atol=1e-3) assert_allclose(result["e_max"], 10 * u.TeV, atol=1e-3) def test_flux_estimator_norm_range(): model = SkyModel.create("pl", "gauss", name="test") norm = Parameter(value=1, name="norm", min=1e-3, max=1e2, interp="log") estimator = FluxEstimator( source="test", norm=norm, selection_optional=[], reoptimize=True ) scale_model = estimator.get_scale_model(Models([model])) assert_allclose(scale_model.norm.min, 1e-3) assert_allclose(scale_model.norm.max, 1e2) assert scale_model.norm.interp == "log" def test_flux_estimator_norm_dict(): norm = dict(value=1, name="norm", min=1e-3, max=1e2, interp="log") estimator = FluxEstimator( source="test", norm=norm, selection_optional=[], reoptimize=True ) assert estimator.norm.value == 1 assert_allclose(estimator.norm.min, 1e-3) assert_allclose(estimator.norm.max, 1e2) assert estimator.norm.interp == "log" def test_flux_estimator_compound_model(): pl = PowerLawSpectralModel() pl.amplitude.min = 1e-15 pl.amplitude.max = 1e-10 pln = PowerLawNormSpectralModel() pln.norm.value = 0.1 pln.norm.frozen = True spectral_model = pl * pln model = SkyModel(spectral_model=spectral_model, name="test") norm = Parameter(value=1, name="norm", min=1e-3, max=1e2, interp="log") estimator = FluxEstimator( source="test", norm=norm, selection_optional=[], reoptimize=True ) scale_model = estimator.get_scale_model(Models([model])) assert_allclose(scale_model.norm.min, 1e-3) assert_allclose(scale_model.norm.max, 1e2) pl2 = PowerLawSpectralModel() pl2.amplitude.min = 1e-14 pl2.amplitude.max = 1e-10 spectral_model2 = pl + pl2 model2 = SkyModel(spectral_model=spectral_model2, name="test") pl2.amplitude.frozen = True scale_model = estimator.get_scale_model(Models([model2])) assert_allclose(scale_model.norm.min, 1e-3) pl.amplitude.frozen = True pl2.amplitude.frozen = False scale_model = estimator.get_scale_model(Models([model2])) 
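# with pl.amplitude frozen and pl2.amplitude free, the explicitly passed norm range should still be preserved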
assert_allclose(scale_model.norm.min, 1e-3) pl2.amplitude.frozen = True scale_model = estimator.get_scale_model(Models([model2])) assert_allclose(scale_model.norm.min, 1e-3) @requires_dependency("naima") def test_flux_estimator_naima_model(): import naima ECPL = naima.models.ExponentialCutoffPowerLaw( 1e36 * u.Unit("1/eV"), 1 * u.TeV, 2.1, 13 * u.TeV ) IC = naima.models.InverseCompton(ECPL, seed_photon_fields=["CMB"]) naima_model = NaimaSpectralModel(IC) model = SkyModel(spectral_model=naima_model, name="test") estimator = FluxEstimator(source="test", selection_optional=[], reoptimize=True) scale_model = estimator.get_scale_model(Models([model])) assert_allclose(scale_model.norm.min, np.nan) assert_allclose(scale_model.norm.max, np.nan) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/tests/test_metadata.py0000644000175100001770000000454214721316200022576 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from astropy import units as u from astropy.coordinates import SkyCoord from pydantic import ValidationError from gammapy.utils.metadata import CreatorMetaData, TargetMetaData from gammapy.utils.testing import requires_data from gammapy.version import version from ..metadata import FluxMetaData @pytest.fixture() def default(): default = FluxMetaData( sed_type="likelihood", creation=CreatorMetaData( date="2022-01-01", creator="gammapy test", origin="CTA" ), target=TargetMetaData( name="PKS2155-304", position=SkyCoord( "21 58 52.06 -30 13 32.11", frame="icrs", unit=(u.hourangle, u.deg) ), ), n_sigma=2.0, n_sigma_ul=None, ) return default @requires_data() def test_creator(default): assert default.creation.creator == "gammapy test" assert default.sed_type == "likelihood" assert default.sed_type_init is None assert default.target.name == "PKS2155-304" assert default.target.position is not None with pytest.raises(ValidationError): default.target.position = 2.0 with pytest.raises(ValidationError): default.sed_type = "test" default = FluxMetaData() assert default.creation.creator == f"Gammapy {version}" def test_from_header(): tdict = { "SED_TYPE": "dnde", "N_SIGMA": "4.3", "OBJECT": "RXJ", "RA_OBJ": "123.5", "DEC_OBJ": "-4.8", "OBS_IDS": ["1", "2", "3", "6"], "DATASETS": "myanalysis", "INSTRU": "HAWC", "CREATED": "2023-10-27 16:05:09.795", "ORIGIN": "Gammapy v2.3", "CREATOR": "mynotebook", "EPHEM": "fermi23.1", } meta = FluxMetaData.from_header(tdict) assert meta.n_sigma == 4.3 assert meta.creation.origin == "Gammapy v2.3" assert meta.target.position == SkyCoord(123.5 * u.deg, -4.8 * u.deg, frame="icrs") assert meta.optional is None # TODO: add for support: assert meta.optional["EPHEM"] == "fermi23.1" @requires_data() def test_to_header(default): hdr = default.to_header() assert hdr["OBJECT"] == "PKS2155-304" assert hdr["ORIGIN"] == "CTA" assert hdr["RA_OBJ"] == 329.7169166666666 assert hdr["N_SIGMA"] == 2.0 assert hdr["CREATOR"] == "gammapy test" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/tests/test_parameter_estimator.py0000644000175100001770000000700614721316200025063 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from gammapy.datasets import Datasets, SpectrumDatasetOnOff from gammapy.estimators.parameter import ParameterEstimator from gammapy.modeling.models import 
PowerLawSpectralModel, SkyModel from gammapy.utils.testing import requires_data @pytest.fixture def crab_datasets_1d(): filename = "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits" dataset = SpectrumDatasetOnOff.read(filename) datasets = Datasets([dataset]) return datasets @pytest.fixture def pwl_model(): return PowerLawSpectralModel(amplitude="3e-11 cm-2s-1TeV-1", index=2.7) @pytest.fixture def crab_datasets_fermi(): filename = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_datasets.yaml" filename_models = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_models.yaml" return Datasets.read(filename=filename, filename_models=filename_models) @requires_data() def test_parameter_estimator_1d(crab_datasets_1d, pwl_model): datasets = crab_datasets_1d model = SkyModel(spectral_model=pwl_model, name="Crab") model.spectral_model.amplitude.scan_n_values = 10 for dataset in datasets: dataset.models = model estimator = ParameterEstimator(selection_optional="all") result = estimator.run(datasets, parameter="amplitude") assert_allclose(result["amplitude"], 5.1428e-11, rtol=1e-3) assert_allclose(result["amplitude_err"], 6.42467e-12, rtol=1e-3) assert_allclose(result["ts"], 353.2092, rtol=1e-3) assert_allclose(result["stat"], 38.3435, rtol=1e-3) assert_allclose(result["stat_null"], 391.5527, rtol=1e-3) assert_allclose(result["amplitude_errp"], 6.703e-12, rtol=5e-3) assert_allclose(result["amplitude_errn"], 6.152e-12, rtol=5e-3) # Add test for scan assert_allclose(result["amplitude_scan"].shape, 10) @requires_data() def test_parameter_estimator_3d_no_reoptimization(crab_datasets_fermi): datasets = crab_datasets_fermi parameter = datasets[0].models.parameters["amplitude"] parameter.scan_n_values = 10 estimator = ParameterEstimator(reoptimize=False, selection_optional=["scan"]) alpha_value = datasets[0].models.parameters["alpha"].value result = estimator.run(datasets, parameter) assert not datasets[0].models.parameters["alpha"].frozen assert_allclose(datasets[0].models.parameters["alpha"].value, alpha_value) assert_allclose(result["amplitude"], 0.018251, rtol=1e-3) assert_allclose(result["amplitude_scan"].shape, 10) assert_allclose(result["amplitude_scan"][0], 0.017282, atol=1e-3) @requires_data() def test_parameter_estimator_no_data(crab_datasets_1d, pwl_model): datasets = crab_datasets_1d model = SkyModel(spectral_model=pwl_model, name="Crab") model.spectral_model.amplitude.scan_n_values = 10 for dataset in datasets: dataset.mask_safe.data[...] 
= False dataset.models = model estimator = ParameterEstimator(selection_optional="all") result = estimator.run(datasets, parameter="amplitude") assert np.isnan(result["amplitude"]) assert np.isnan(result["amplitude_err"]) assert np.isnan(result["amplitude_errp"]) assert np.isnan(result["amplitude_errn"]) assert np.isnan(result["amplitude_ul"]) assert np.isnan(result["ts"]) assert np.isnan(result["npred"]) assert_allclose(result["counts"], 0) # Add test for scan assert_allclose(result["amplitude_scan"].shape, 10) assert np.all(np.isnan(result["stat_scan"])) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/tests/test_profile.py0000644000175100001770000001116614721316200022456 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from astropy import units as u from astropy.coordinates import Angle, SkyCoord from astropy.table import Table from gammapy.estimators import ImageProfile, ImageProfileEstimator from gammapy.maps import WcsGeom, WcsNDMap from gammapy.utils.testing import assert_quantity_allclose, mpl_plot_check @pytest.fixture(scope="session") def checkerboard_image(): nxpix, nypix = 10, 6 # set up data as a checkerboard of 0.5 and 1.5, so that the mean and sum # are not completely trivial to compute data = 1.5 * np.ones((nypix, nxpix)) data[slice(0, nypix + 1, 2), slice(0, nxpix + 1, 2)] = 0.5 data[slice(1, nypix + 1, 2), slice(1, nxpix + 1, 2)] = 0.5 geom = WcsGeom.create(npix=(nxpix, nypix), frame="galactic", binsz=0.02) return WcsNDMap(geom=geom, data=data, unit="cm-2 s-1") @pytest.fixture(scope="session") def cosine_profile(): table = Table() table["x_ref"] = np.linspace(-90, 90, 11) * u.deg table["profile"] = np.cos(table["x_ref"].to("rad")) * u.Unit("cm-2 s-1") table["profile_err"] = 0.1 * table["profile"] return ImageProfile(table) class TestImageProfileEstimator: @staticmethod def test_lat_profile_sum(checkerboard_image): p = ImageProfileEstimator(axis="lat", method="sum") profile = p.run(checkerboard_image) desired = 10 * np.ones(6) * u.Unit("cm-2 s-1") assert_quantity_allclose(profile.profile, desired) @staticmethod def test_lon_profile_sum(checkerboard_image): p = ImageProfileEstimator(axis="lon", method="sum") profile = p.run(checkerboard_image) desired = 6 * np.ones(10) * u.Unit("cm-2 s-1") assert_quantity_allclose(profile.profile, desired) @staticmethod def test_radial_profile_sum(checkerboard_image): center = SkyCoord(0, 0, unit="deg", frame="galactic") p = ImageProfileEstimator(axis="radial", method="sum", center=center) profile = p.run(checkerboard_image) desired = [4.0, 8.0, 20.0, 12.0, 12.0] * u.Unit("cm-2 s-1") assert_quantity_allclose(profile.profile, desired) with pytest.raises(ValueError): ImageProfileEstimator(axis="radial") @staticmethod def test_lat_profile_mean(checkerboard_image): p = ImageProfileEstimator(axis="lat", method="mean") profile = p.run(checkerboard_image) desired = np.ones(6) * u.Unit("cm-2 s-1") assert_quantity_allclose(profile.profile, desired) @staticmethod def test_lon_profile_mean(checkerboard_image): p = ImageProfileEstimator(axis="lon", method="mean") profile = p.run(checkerboard_image) desired = np.ones(10) * u.Unit("cm-2 s-1") assert_quantity_allclose(profile.profile, desired) @staticmethod def test_x_edges_lat(checkerboard_image): x_edges = Angle(np.linspace(-0.06, 0.06, 4), "deg") p = ImageProfileEstimator(x_edges=x_edges, axis="lat", method="sum") profile = p.run(checkerboard_image) desired = 
20 * np.ones(3) * u.Unit("cm-2 s-1") assert_quantity_allclose(profile.profile, desired) @staticmethod def test_x_edges_lon(checkerboard_image): x_edges = Angle(np.linspace(-0.1, 0.1, 6), "deg") p = ImageProfileEstimator(x_edges=x_edges, axis="lon", method="sum") profile = p.run(checkerboard_image) desired = 12 * np.ones(5) * u.Unit("cm-2 s-1") assert_quantity_allclose(profile.profile, desired) class TestImageProfile: @staticmethod def test_normalize(cosine_profile): normalized = cosine_profile.normalize(mode="integral") profile = normalized.profile assert_quantity_allclose(profile.sum(), 1 * u.Unit("cm-2 s-1")) normalized = cosine_profile.normalize(mode="peak") profile = normalized.profile assert_quantity_allclose(profile.max(), 1 * u.Unit("cm-2 s-1")) @staticmethod def test_profile_x_edges(cosine_profile): assert_quantity_allclose(cosine_profile.x_ref.sum(), 0 * u.deg) @staticmethod @pytest.mark.parametrize("kernel", ["gauss", "box"]) def test_smooth(cosine_profile, kernel): # smoothing should preserve the mean desired_mean = cosine_profile.profile.mean() smoothed = cosine_profile.smooth(kernel, radius=3) assert_quantity_allclose(smoothed.profile.mean(), desired_mean) # smoothing should decrease errors assert smoothed.profile_err.mean() < cosine_profile.profile_err.mean() @staticmethod def test_peek(cosine_profile): with mpl_plot_check(): cosine_profile.peek() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/tests/test_utils.py0000644000175100001770000002304714721316200022157 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord from astropy.table import Column, Table from astropy.time import Time from gammapy.datasets import MapDataset from gammapy.estimators import ExcessMapEstimator, FluxPoints from gammapy.estimators.utils import ( compute_lightcurve_doublingtime, compute_lightcurve_fpp, compute_lightcurve_fvar, compute_lightcurve_discrete_correlation, find_peaks, find_peaks_in_flux_map, get_rebinned_axis, resample_energy_edges, ) from gammapy.maps import Map, MapAxis from gammapy.utils.testing import assert_time_allclose, requires_data @pytest.fixture() def lc(): meta = dict(TIMESYS="utc", SED_TYPE="flux") table = Table( meta=meta, data=[ Column(Time(["2010-01-01", "2010-01-03"]).mjd, "time_min"), Column(Time(["2010-01-03", "2010-01-10"]).mjd, "time_max"), Column([[1.0, 2.0], [1.0, 2.0]], "e_min", unit="TeV"), Column([[2.0, 5.0], [2.0, 5.0]], "e_max", unit="TeV"), Column([[1e-11, 4e-12], [3e-11, 7e-12]], "flux", unit="cm-2 s-1"), Column( [[0.1e-11, 0.4e-12], [0.3e-11, 0.7e-12]], "flux_err", unit="cm-2 s-1" ), Column([[np.nan, np.nan], [3.6e-11, 1e-11]], "flux_ul", unit="cm-2 s-1"), Column([[False, False], [True, True]], "is_ul"), Column([[True, True], [True, True]], "success"), ], ) return FluxPoints.from_table(table=table, format="lightcurve") @pytest.fixture(scope="session") def lc2(): meta = dict(TIMESYS="utc", SED_TYPE="flux") table = Table( meta=meta, data=[ Column(Time(["2010-01-01", "2010-01-03", "2010-01-05"]).mjd, "time_min"), Column(Time(["2010-01-03", "2010-01-05", "2010-01-07"]).mjd, "time_max"), Column([[1.0, 2.0], [1.0, 2.0], [1.0, 2.0]], "e_min", unit="GeV"), Column([[2.0, 5.0], [2.0, 5.0], [2.0, 5.0]], "e_max", unit="GeV"), Column( [[1.51e-7, 3.4e-8], [3.1e-7, 6.7e-8], [3.1e-7, 7.5e-8]], "flux", unit="m-2 s-1", ), Column( 
[[0.1e-7, 0.4e-8], [0.3e-7, 0.7e-8], [0.31e-7, 0.72e-8]], "flux_err", unit="m-2 s-1", ), Column( [[np.nan, np.nan], [3.6e-7, 1e-7], [3.6e-7, 1e-7]], "flux_ul", unit="m-2 s-1", ), Column([[False, False], [True, True], [True, True]], "is_ul"), Column([[True, True], [True, True], [True, True]], "success"), ], ) return FluxPoints.from_table(table=table, format="lightcurve") class TestFindPeaks: def test_simple(self): """Test a simple example""" image = Map.create(npix=(10, 5), unit="s") image.data[3, 3] = 11 image.data[3, 4] = 10 image.data[3, 5] = 12 image.data[3, 6] = np.nan image.data[0, 9] = 1e20 table = find_peaks(image, threshold=3) assert len(table) == 3 assert table["value"].unit == "s" assert table["ra"].unit == "deg" assert table["dec"].unit == "deg" row = table[0] assert tuple((row["x"], row["y"])) == (9, 0) assert_allclose(row["value"], 1e20) assert_allclose(row["ra"], 359.55) assert_allclose(row["dec"], -0.2) row = table[1] assert tuple((row["x"], row["y"])) == (5, 3) assert_allclose(row["value"], 12) def test_no_peak(self): image = Map.create(npix=(10, 5)) image.data[3, 5] = 12 table = find_peaks(image, threshold=12.1) assert len(table) == 0 def test_constant(self): image = Map.create(npix=(10, 5)) table = find_peaks(image, threshold=3) assert len(table) == 0 def test_flat_map(self): """Test a simple example""" axis1 = MapAxis.from_edges([1, 2], name="axis1") axis2 = MapAxis.from_edges([9, 10], name="axis2") image = Map.create(npix=(10, 5), unit="s", axes=[axis1, axis2]) image.data[..., 3, 3] = 11 image.data[..., 3, 4] = 10 image.data[..., 3, 5] = 12 image.data[..., 3, 6] = np.nan image.data[..., 0, 9] = 1e20 table = find_peaks(image, threshold=3) row = table[0] assert len(table) == 3 assert_allclose(row["value"], 1e20) assert_allclose(row["ra"], 359.55) assert_allclose(row["dec"], -0.2) class TestFindFluxPeaks: """Tests for find_peaks_in_flux_map""" @requires_data() def test_find_peaks_in_flux_map(self): """Test a simple example""" dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") estimator = ExcessMapEstimator( correlation_radius="0.1 deg", energy_edges=[0.1, 10] * u.TeV ) maps = estimator.run(dataset) table = find_peaks_in_flux_map(maps, threshold=5, min_distance=0.1 * u.deg) assert table["ra"].unit == "deg" assert table["dec"].unit == "deg" @requires_data() def test_no_peak(self): dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") position = SkyCoord(1.2, 1, frame="galactic", unit="deg") dataset_cutout = dataset.cutout( position=position, width=(0.2 * u.deg, 0.2 * u.deg) ) estimator = ExcessMapEstimator( correlation_radius="0.1 deg", energy_edges=[0.1, 10] * u.TeV ) maps = estimator.run(dataset_cutout) table = find_peaks_in_flux_map(maps, threshold=5, min_distance=0.1 * u.deg) assert len(table) == 0 def test_resample_energy_edges(spectrum_dataset): resampled_energy_edges = resample_energy_edges(spectrum_dataset, conditions={}) assert (resampled_energy_edges == spectrum_dataset._geom.axes["energy"].edges).all() with pytest.raises(ValueError): resample_energy_edges( spectrum_dataset, conditions={"counts_min": spectrum_dataset.counts.data.sum() + 1}, ) resampled_energy_edges = resample_energy_edges( spectrum_dataset, conditions={"excess_min": spectrum_dataset.excess.data[-1] + 1}, ) grouped = spectrum_dataset.resample_energy_axis( MapAxis.from_edges(edges=resampled_energy_edges, name="energy") ) assert grouped.counts.data.shape == (29, 1, 1) assert_allclose(np.squeeze(grouped.counts)[-1], 2518.0) 
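# the background is grouped with the same resampled binning as the counts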
assert_allclose(np.squeeze(grouped.background)[-1], 200) def test_compute_lightcurve_fvar(lc): fvar = compute_lightcurve_fvar(lc) ffvar = fvar["fvar"].quantity ffvar_err = fvar["fvar_err"].quantity assert_allclose(ffvar, [[[0.698212]], [[0.37150576]]]) assert_allclose(ffvar_err, [[[0.0795621]], [[0.074706]]]) def test_compute_lightcurve_fpp(lc): fpp = compute_lightcurve_fpp(lc) ffpp = fpp["fpp"].quantity ffpp_err = fpp["fpp_err"].quantity assert_allclose(ffpp, [[[0.99373035]], [[0.53551551]]]) assert_allclose(ffpp_err, [[[0.07930673]], [[0.07397653]]]) def test_compute_lightcurve_doublingtime(lc): dtime = compute_lightcurve_doublingtime(lc) ddtime = dtime["doublingtime"].quantity ddtime_err = dtime["doubling_err"].quantity dcoord = dtime["doubling_coord"] assert_allclose(ddtime, [[[245305.49]], [[481572.59]]] * u.s) assert_allclose(ddtime_err, [[[45999.766]], [[11935.665]]] * u.s) assert_time_allclose( dcoord, Time([[[55197.99960648]], [[55197.99960648]]], format="mjd", scale="utc"), ) def test_compute_dcf(lc, lc2): dict = compute_lightcurve_discrete_correlation(lc, lc2, tau=3 * u.d) assert_allclose(dict["bins"], [-388800.0, -129600.0, 129600.0, 388800.0] * u.s) assert_allclose( dict["discrete_correlation"], [ [-0.760599, -1.052783], [-0.760599, -0.537134], [1.014132, 1.059945], [-1.521198, -1.589918], ], rtol=1e-6, ) assert_allclose( dict["discrete_correlation_err"], [[np.nan, np.nan], [np.nan, np.nan], [0.310513, 0.372241], [np.nan, np.nan]], rtol=1e-6, ) dict2 = compute_lightcurve_discrete_correlation(lc2, tau=3 * u.d) assert_allclose(dict2["bins"], [-388800.0, -129600.0, 129600.0, 388800.0] * u.s) assert_allclose( dict2["discrete_correlation"], [ [-1.11074, -1.448801], [-0.277685, -0.124862], [0.55537, 0.629465], [-1.11074, -1.448801], ], rtol=1e-5, ) assert_allclose( dict2["discrete_correlation_err"], [[np.nan, np.nan], [1.178118, 0.868782], [0.589059, 0.53472], [np.nan, np.nan]], rtol=1e-6, ) dict3 = compute_lightcurve_discrete_correlation(lc2) assert_allclose(dict3["bins"], [-345600.0, -115200.0, 115200.0, 345600.0] * u.s) @requires_data() def test_get_rebinned_axis(): lc_1d = FluxPoints.read( "$GAMMAPY_DATA/estimators/pks2155_hess_lc/pks2155_hess_lc.fits", format="lightcurve", ) axis_new = get_rebinned_axis( lc_1d, method="fixed-bins", group_size=2, axis_name="time" ) assert_allclose(axis_new.bin_width[0], 20 * u.min) axis_new = get_rebinned_axis( lc_1d, method="min-ts", ts_threshold=2500.0, axis_name="time" ) assert_allclose(axis_new.bin_width, [50, 30, 30, 50, 110, 70] * u.min) with pytest.raises(ValueError): get_rebinned_axis(lc_1d, method="error", value=2, axis_name="time") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/estimators/utils.py0000644000175100001770000014067714721316200017767 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np import scipy.ndimage from scipy import special from scipy.interpolate import InterpolatedUnivariateSpline from astropy import units as u from astropy.coordinates import SkyCoord from astropy.table import Table from gammapy.datasets import SpectrumDataset, SpectrumDatasetOnOff from gammapy.datasets.map import MapEvaluator from gammapy.maps import Map, MapAxis, TimeMapAxis, WcsNDMap from gammapy.modeling import Parameter from gammapy.modeling.models import ( ConstantFluxSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.stats import ( compute_flux_doubling, compute_fpp, compute_fvar, discrete_correlation, ) from 
gammapy.stats.utils import ts_to_sigma from .map.core import FluxMaps __all__ = [ "combine_flux_maps", "combine_significance_maps", "estimate_exposure_reco_energy", "find_peaks", "find_peaks_in_flux_map", "resample_energy_edges", "get_rebinned_axis", "get_combined_flux_maps", "get_combined_significance_maps", "compute_lightcurve_fvar", "compute_lightcurve_fpp", "compute_lightcurve_doublingtime", "compute_lightcurve_discrete_correlation", ] def find_peaks(image, threshold, min_distance=1): """Find local peaks in an image. This is a very simple peak finder, that finds local peaks (i.e. maxima) in images above a given ``threshold`` within a given ``min_distance`` around each given pixel. If you get multiple spurious detections near a peak, usually it's best to smooth the image a bit, or to compute it using a different method in the first place to result in a smooth image. You can also increase the ``min_distance`` parameter. The output table contains one row per peak and the following columns: - ``x`` and ``y`` are the pixel coordinates (first pixel at zero). - ``ra`` and ``dec`` are the RA / DEC sky coordinates (ICRS frame). - ``value`` is the pixel value. It is sorted by peak value, starting with the highest value. If there are no pixel values above the threshold, an empty table is returned. There are more featureful peak finding and source detection methods e.g. in the ``photutils`` or ``scikit-image`` Python packages. Parameters ---------- image : `~gammapy.maps.WcsNDMap` Image like Map. threshold : float or array-like The data value or pixel-wise data values to be used for the detection threshold. A 2D ``threshold`` must have the same shape as the map ``data``. min_distance : int or `~astropy.units.Quantity` Minimum distance between peaks. An integer value is interpreted as pixels. Default is 1. Returns ------- output : `~astropy.table.Table` Table with parameters of detected peaks. Examples -------- >>> import astropy.units as u >>> from gammapy.datasets import MapDataset >>> from gammapy.estimators import ExcessMapEstimator >>> from gammapy.estimators.utils import find_peaks >>> >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> estimator = ExcessMapEstimator( ... correlation_radius="0.1 deg", energy_edges=[0.1, 10] * u.TeV ... ) >>> maps = estimator.run(dataset) >>> # Find the peaks which are above 5 sigma >>> sources = find_peaks(maps["sqrt_ts"], threshold=5, min_distance="0.25 deg") >>> print(sources) value x y ra dec deg deg ------ --- --- --------- --------- 32.191 161 118 266.41924 -28.98772 18.7 125 124 266.80571 -28.14079 9.4498 257 122 264.86178 -30.97529 9.3784 204 103 266.14201 -30.10041 5.3493 282 150 263.78083 -31.12704 """ # Input validation if not isinstance(image, WcsNDMap): raise TypeError("find_peaks only supports WcsNDMap") if not image.geom.is_flat: raise ValueError( "find_peaks only supports flat Maps, with no spatial axes of length 1." 
) if isinstance(min_distance, (str, u.Quantity)): min_distance = np.mean(u.Quantity(min_distance) / image.geom.pixel_scales) min_distance = np.round(min_distance).to_value("") size = 2 * min_distance + 1 # Remove non-finite values to avoid warnings or spurious detection data = image.sum_over_axes(keepdims=False).data data[~np.isfinite(data)] = np.nanmin(data) # Handle edge case of constant data; treat as no peak if np.all(data == data.flat[0]): return Table() # Run peak finder data_max = scipy.ndimage.maximum_filter(data, size=size, mode="constant") mask = (data == data_max) & (data > threshold) y, x = mask.nonzero() value = data[y, x] # Make and return results table if len(value) == 0: return Table() coord = SkyCoord.from_pixel(x, y, wcs=image.geom.wcs).icrs table = Table() table["value"] = value * image.unit table["x"] = x table["y"] = y table["ra"] = coord.ra table["dec"] = coord.dec table["ra"].format = ".5f" table["dec"].format = ".5f" table["value"].format = ".5g" table.sort("value") table.reverse() return table def find_peaks_in_flux_map(maps, threshold, min_distance=1): """Find local test statistic peaks for a given Map. Utilises the `~gammapy.estimators.utils.find_peaks` function to find various parameters from FluxMaps. Parameters ---------- maps : `~gammapy.estimators.FluxMaps` Input flux map object. threshold : float or array-like The test statistic data value or pixel-wise test statistic data values to be used for the detection threshold. A 2D ``threshold`` must have the same shape as the map ``data``. min_distance : int or `~astropy.units.Quantity` Minimum distance between peaks. An integer value is interpreted as pixels. Default is 1. Returns ------- output : `~astropy.table.Table` Table with parameters of detected peaks. Examples -------- >>> import astropy.units as u >>> from gammapy.datasets import MapDataset >>> from gammapy.estimators import ExcessMapEstimator >>> from gammapy.estimators.utils import find_peaks_in_flux_map >>> >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> estimator = ExcessMapEstimator( ... correlation_radius="0.1 deg", energy_edges=[0.1, 10]*u.TeV ... ) >>> maps = estimator.run(dataset) >>> # Find the peaks which are above 5 sigma >>> sources = find_peaks_in_flux_map(maps, threshold=5, min_distance=0.1*u.deg) >>> print(sources[:4]) x y ra dec npred npred_excess counts ts sqrt_ts norm norm_err flux flux_err deg deg 1 / (s cm2) 1 / (s cm2) --- --- --------- --------- --------- ------------ --------- -------- ------- ------- -------- ----------- ----------- 158 135 266.05019 -28.70181 192.00000 61.33788 192.00000 25.11839 5.01183 0.28551 0.06450 2.827e-12 6.385e-13 92 133 267.07022 -27.31834 137.00000 51.99467 137.00000 26.78181 5.17511 0.37058 0.08342 3.669e-12 8.259e-13 176 134 265.80492 -29.09805 195.00000 65.15990 195.00000 28.29158 5.31898 0.30561 0.06549 3.025e-12 6.484e-13 282 150 263.78083 -31.12704 84.00000 39.99004 84.00000 28.61526 5.34932 0.55027 0.12611 5.448e-12 1.249e-12 """ if not isinstance(maps, FluxMaps): raise TypeError( f"find_peaks_in_flux_map expects FluxMaps input. Got {type(maps)} instead." ) quantity_for_peaks = maps["sqrt_ts"] if not quantity_for_peaks.geom.is_flat: raise ValueError( "find_peaks_in_flux_map only supports flat Maps, with energy axis of length 1."
) table = find_peaks(quantity_for_peaks, threshold, min_distance) if len(table) == 0: return Table() x = np.array(table["x"]) y = np.array(table["y"]) table.remove_column("value") for name in maps.available_quantities: values = maps[name].quantity peaks = values[0, y, x] table[name] = peaks flux_data = maps["flux"].quantity table["flux"] = flux_data[0, y, x] flux_err_data = maps["flux_err"].quantity table["flux_err"] = flux_err_data[0, y, x] for column in table.colnames: if column.startswith(("flux", "flux_err")): table[column].format = ".3e" elif column.startswith( ( "npred", "npred_excess", "counts", "sqrt_ts", "norm", "ts", "norm_err", "stat", "stat_null", ) ): table[column].format = ".5f" table.reverse() return table def estimate_exposure_reco_energy(dataset, spectral_model=None, normalize=True): """Estimate an exposure map in reconstructed energy. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.MapDatasetOnOff` The input dataset. spectral_model : `~gammapy.modeling.models.SpectralModel`, optional Assumed spectral shape. If None, a Power Law of index 2 is assumed. Default is None. normalize : bool Normalize the exposure to the total integrated flux of the spectral model. When not normalized it directly gives the predicted counts from the spectral model. Default is True. Returns ------- exposure : `~gammapy.maps.Map` Exposure map in reconstructed energy. """ if spectral_model is None: spectral_model = PowerLawSpectralModel() model = SkyModel( spatial_model=ConstantFluxSpatialModel(), spectral_model=spectral_model ) energy_axis = dataset._geom.axes["energy"] if dataset.edisp is not None: edisp = dataset.edisp.get_edisp_kernel(position=None, energy_axis=energy_axis) else: edisp = None eval = MapEvaluator(model=model, exposure=dataset.exposure, edisp=edisp) reco_exposure = eval.compute_npred() if normalize: ref_flux = spectral_model.integral( energy_axis.edges[:-1], energy_axis.edges[1:] ) reco_exposure = reco_exposure / ref_flux[:, np.newaxis, np.newaxis] return reco_exposure def _satisfies_conditions(info_dict, conditions): satisfies = True for key in conditions.keys(): satisfies &= info_dict[key.strip("_min")] > conditions[key] return satisfies def resample_energy_edges(dataset, conditions={}): """Return energy edges that satisfy given condition on the per bin statistics. Parameters ---------- dataset : `~gammapy.datasets.SpectrumDataset` or `~gammapy.datasets.SpectrumDatasetOnOff` The input dataset. conditions : dict Keyword arguments containing the per-bin conditions used to resample the axis. Available options are: 'counts_min', 'background_min', 'excess_min', 'sqrt_ts_min', 'npred_min', 'npred_background_min', 'npred_signal_min'. Default is {}. Returns ------- energy_edges : list of `~astropy.units.Quantity` Energy edges for the resampled energy axis. Examples -------- >>> from gammapy.datasets import Datasets, SpectrumDatasetOnOff >>> from gammapy.estimators.utils import resample_energy_edges >>> >>> datasets = Datasets() >>> >>> for obs_id in [23523, 23526]: ... dataset = SpectrumDatasetOnOff.read( ... f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits" ... ) ... datasets.append(dataset) >>> >>> spectrum_dataset = Datasets(datasets).stack_reduce() >>> # Resample the energy edges so the minimum sqrt_ts is 2 >>> resampled_energy_edges = resample_energy_edges( ... spectrum_dataset, ... conditions={"sqrt_ts_min": 2} ... 
) """ if not isinstance(dataset, (SpectrumDataset, SpectrumDatasetOnOff)): raise NotImplementedError( "This method is currently supported for spectral datasets only." ) available_conditions = [ "counts_min", "background_min", "excess_min", "sqrt_ts_min", "npred_min", "npred_background_min", "npred_signal_min", ] for key in conditions.keys(): if key not in available_conditions: raise ValueError( f"Unrecognized option {key}. The available methods are: {available_conditions}." ) axis = dataset.counts.geom.axes["energy"] energy_min_all, energy_max_all = dataset.energy_range_total energy_edges = [energy_max_all] while energy_edges[-1] > energy_min_all: for energy_min in reversed(axis.edges_min): if energy_min >= energy_edges[-1]: continue elif len(energy_edges) == 1 and energy_min == energy_min_all: raise ValueError("The given conditions cannot be met.") sliced = dataset.slice_by_energy( energy_min=energy_min, energy_max=energy_edges[-1] ) with np.errstate(invalid="ignore"): info = sliced.info_dict() if _satisfies_conditions(info, conditions): energy_edges.append(energy_min) break return u.Quantity(energy_edges[::-1]) def compute_lightcurve_fvar(lightcurve, flux_quantity="flux"): """ Compute the fractional excess variance of the input lightcurve. Internally calls the `~gammapy.stats.compute_fvar` function. Parameters ---------- lightcurve : `~gammapy.estimators.FluxPoints` The lightcurve object. flux_quantity : str Flux quantity to use for calculation. Should be 'dnde', 'flux', 'e2dnde' or 'eflux'. Default is 'flux'. Returns ------- fvar : `~astropy.table.Table` Table of fractional excess variance and associated error for each energy bin of the lightcurve. """ flux = getattr(lightcurve, flux_quantity) flux_err = getattr(lightcurve, flux_quantity + "_err") time_id = flux.geom.axes.index_data("time") fvar, fvar_err = compute_fvar(flux.data, flux_err.data, axis=time_id) significance = fvar / fvar_err energies = lightcurve.geom.axes["energy"].edges table = Table( [energies[:-1], energies[1:], fvar, fvar_err, significance], names=("min_energy", "max_energy", "fvar", "fvar_err", "significance"), meta=lightcurve.meta, ) return table def compute_lightcurve_fpp(lightcurve, flux_quantity="flux"): """ Compute the point-to-point excess variance of the input lightcurve. Internally calls the `~gammapy.stats.compute_fpp` function Parameters ---------- lightcurve : `~gammapy.estimators.FluxPoints` The lightcurve object. flux_quantity : str Flux quantity to use for calculation. Should be 'dnde', 'flux', 'e2dnde' or 'eflux'. Default is 'flux'. Returns ------- table : `~astropy.table.Table` Table of point-to-point excess variance and associated error for each energy bin of the lightcurve. """ flux = getattr(lightcurve, flux_quantity) flux_err = getattr(lightcurve, flux_quantity + "_err") time_id = flux.geom.axes.index_data("time") fpp, fpp_err = compute_fpp(flux.data, flux_err.data, axis=time_id) significance = fpp / fpp_err energies = lightcurve.geom.axes["energy"].edges table = Table( [energies[:-1], energies[1:], fpp, fpp_err, significance], names=("min_energy", "max_energy", "fpp", "fpp_err", "significance"), meta=dict(quantity=flux_quantity), ) return table def compute_lightcurve_doublingtime(lightcurve, flux_quantity="flux"): """ Compute the minimum characteristic flux doubling and halving time for the input lightcurve. Internally calls the `~gammapy.stats.compute_flux_doubling` function. 
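For two fluxes :math:`F_1` and :math:`F_2` measured at times separated by :math:`\Delta t`, the doubling (halving) timescale is commonly defined as :math:`\Delta t \, \ln 2 / \ln(F_2 / F_1)`.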
The characteristic doubling time is estimated to obtain the minimum variability timescale for the light curves in which rapid variations are clearly evident: for example, it is useful in AGN flaring episodes. This quantity, especially for AGN flares, is often expressed as the pair of doubling time and halving time, or the minimum characteristic time for the rising and falling components respectively. Parameters ---------- lightcurve : `~gammapy.estimators.FluxPoints` The lightcurve object. flux_quantity : str Flux quantity to use for calculation. Should be 'dnde', 'flux', 'e2dnde' or 'eflux'. Default is 'flux'. Returns ------- table : `~astropy.table.Table` Table of flux doubling/halving and associated error for each energy bin of the lightcurve with axis coordinates at which they were found. References ---------- .. [Brown2013] "Locating the γ-ray emission region of the flat spectrum radio quasar PKS 1510−089", Brown et al. (2013) https://academic.oup.com/mnras/article/431/1/824/1054498 """ flux = getattr(lightcurve, flux_quantity) flux_err = getattr(lightcurve, flux_quantity + "_err") coords = lightcurve.geom.axes["time"].center axis = flux.geom.axes.index_data("time") doubling_dict = compute_flux_doubling(flux.data, flux_err.data, coords, axis=axis) energies = lightcurve.geom.axes["energy"].edges table = Table( [ energies[:-1], energies[1:], doubling_dict["doubling"], doubling_dict["doubling_err"], lightcurve.geom.axes["time"].reference_time + doubling_dict["doubling_coord"], doubling_dict["halving"], doubling_dict["halving_err"], lightcurve.geom.axes["time"].reference_time + doubling_dict["halving_coord"], ], names=( "min_energy", "max_energy", "doublingtime", "doubling_err", "doubling_coord", "halvingtime", "halving_err", "halving_coord", ), meta=dict(flux_quantity=flux_quantity), ) return table def compute_lightcurve_discrete_correlation( lightcurve1, lightcurve2=None, flux_quantity="flux", tau=None ): """Compute the discrete correlation function for two lightcurves, or the discrete autocorrelation if only one lightcurve is provided. NaN values will be ignored in the computation in order to account for possible gaps in the data. Internally calls the `~gammapy.stats.discrete_correlation` function. Parameters ---------- lightcurve1 : `~gammapy.estimators.FluxPoints` The first lightcurve object. lightcurve2 : `~gammapy.estimators.FluxPoints`, optional The second lightcurve object. If not provided, the autocorrelation for the first lightcurve will be computed. Default is None. flux_quantity : str Flux quantity to use for calculation. Should be 'dnde', 'flux', 'e2dnde' or 'eflux'. The choice does not affect the computation. Default is 'flux'. tau : `~astropy.units.Quantity`, optional Size of the bins to compute the discrete correlation. If None, the bin size will be double the bins of the first lightcurve. Default is None. Returns ------- discrete_correlation_dict : dict Dictionary containing the discrete correlation results. Entries are: * "bins" : the array of discrete time bins * "discrete_correlation" : discrete correlation function values * "discrete_correlation_err" : associated error References ---------- .. [Edelson1988] "THE DISCRETE CORRELATION FUNCTION: A NEW METHOD FOR ANALYZING UNEVENLY SAMPLED VARIABILITY DATA", Edelson et al.
(1988) https://ui.adsabs.harvard.edu/abs/1988ApJ...333..646E/abstract """ flux1 = getattr(lightcurve1, flux_quantity) flux_err1 = getattr(lightcurve1, flux_quantity + "_err") coords1 = lightcurve1.geom.axes["time"].center axis = flux1.geom.axes.index_data("time") if tau is None: tau = (coords1[-1] - coords1[0]) / (0.5 * len(coords1)) if lightcurve2: flux2 = getattr(lightcurve2, flux_quantity) flux_err2 = getattr(lightcurve2, flux_quantity + "_err") coords2 = lightcurve2.geom.axes["time"].center bins, dcf, dcf_err = discrete_correlation( flux1.data, flux_err1.data, flux2.data, flux_err2.data, coords1, coords2, tau, axis, ) else: bins, dcf, dcf_err = discrete_correlation( flux1.data, flux_err1.data, flux1.data, flux_err1.data, coords1, coords1, tau, axis, ) discrete_correlation_dict = { "bins": bins, "discrete_correlation": dcf, "discrete_correlation_err": dcf_err, } return discrete_correlation_dict def get_edges_fixed_bins(fluxpoint, group_size, axis_name="energy"): """Rebin the flux points to combine adjacent bins. Parameters ---------- fluxpoint : `~gammapy.estimators.FluxPoints` The flux points object to rebin. group_size : int Number of bins to combine. axis_name : str, optional The axis name to combine along. Default is 'energy'. Returns ------- edges_min : `~astropy.units.Quantity` or `~astropy.time.Time` Minimum bin edge for the new axis. edges_max : `~astropy.units.Quantity` or `~astropy.time.Time` Maximum bin edge for the new axis. """ ax = fluxpoint.geom.axes[axis_name] nbin = ax.nbin if not isinstance(group_size, int): raise ValueError("Only integer number of bins can be combined") idx = np.arange(0, nbin, group_size) if idx[-1] < nbin: idx = np.append(idx, nbin) edges_min = ax.edges_min[idx[:-1]] edges_max = ax.edges_max[idx[1:] - 1] return edges_min, edges_max def get_edges_min_ts(fluxpoint, ts_threshold, axis_name="energy"): """Rebin the flux points to combine adjacent bins until a minimum TS is obtained. Note that to convert TS to significance, it is necessary to take the number of degrees of freedom into account. Parameters ---------- fluxpoint : `~gammapy.estimators.FluxPoints` The flux points object to rebin. ts_threshold : float The minimum TS desired per combined bin. axis_name : str, optional The axis name to combine along. Default is 'energy'. Returns ------- edges_min : `~astropy.units.Quantity` or `~astropy.time.Time` Minimum bin edge for the new axis. edges_max : `~astropy.units.Quantity` or `~astropy.time.Time` Maximum bin edge for the new axis. """ ax = fluxpoint.geom.axes[axis_name] nbin = ax.nbin e_min, e_max = ax.edges_min[0], ax.edges_max[0] edges_min = np.zeros(nbin) * e_min.unit edges_max = np.zeros(nbin) * e_max.unit i, i1 = 0, 0 while e_max < ax.edges_max[-1]: ts = fluxpoint.ts.data[i] e_min = ax.edges_min[i] while ts < ts_threshold and i < ax.nbin - 1: i = i + 1 ts = ts + fluxpoint.ts.data[i] e_max = ax.edges_max[i] i = i + 1 edges_min[i1] = e_min edges_max[i1] = e_max i1 = i1 + 1 edges_max = edges_max[:i1] edges_min = edges_min[:i1] return edges_min, edges_max RESAMPLE_METHODS = { "fixed-bins": get_edges_fixed_bins, "min-ts": get_edges_min_ts, } def get_rebinned_axis(fluxpoint, axis_name="energy", method=None, **kwargs): """Get the rebinned axis for resampling the flux point object along the given axis. Parameters ---------- fluxpoint : `~gammapy.estimators.FluxPoints` The flux point object to rebin. axis_name : str, optional The axis name to combine along. Default is 'energy'. method : str The method to resample the axis.
Supported options are 'fixed_bins' and 'min-ts'. kwargs : dict Keywords passed to `get_edges_fixed_bins` or `get_edges_min_ts`. If method is 'fixed-bins', keyword should be `group_size`. If method is 'min-ts', keyword should be `ts_threshold`. Returns ------- axis_new : `~gammapy.maps.MapAxis` or `~gammapy.maps.TimeMapAxis` The new axis. Examples -------- >>> from gammapy.estimators.utils import get_rebinned_axis >>> from gammapy.estimators import FluxPoints >>> >>> # Rebin lightcurve axis >>> lc_1d = FluxPoints.read( ... "$GAMMAPY_DATA/estimators/pks2155_hess_lc/pks2155_hess_lc.fits", ... format="lightcurve", ... ) >>> # Rebin axis by combining adjacent bins as per the group_size >>> new_axis = get_rebinned_axis( ... lc_1d, method="fixed-bins", group_size=2, axis_name="time" ... ) >>> >>> # Rebin HESS flux points axis >>> fp = FluxPoints.read( ... "$GAMMAPY_DATA/estimators/crab_hess_fp/crab_hess_fp.fits" ... ) >>> # Rebin according to a minimum significance >>> axis_new = get_rebinned_axis( ... fp, method='min-ts', ts_threshold=4, axis_name='energy' ... ) """ # TODO: Make fixed_bins and fixed_edges work for multidimensions if not fluxpoint.geom.axes.is_unidimensional: raise ValueError( "Rebinning is supported only for Unidimensional FluxPoints \n " "Please use `iter_by_axis` to create Unidimensional FluxPoints" ) if method not in RESAMPLE_METHODS.keys(): raise ValueError("Incorrect option. Choose from", RESAMPLE_METHODS.keys()) edges_min, edges_max = RESAMPLE_METHODS[method]( fluxpoint=fluxpoint, axis_name=axis_name, **kwargs ) ax = fluxpoint.geom.axes[axis_name] if isinstance(ax, TimeMapAxis): axis_new = TimeMapAxis.from_time_edges( time_min=edges_min + ax.reference_time, time_max=edges_max + ax.reference_time, ) else: edges = np.append(edges_min, edges_max[-1]) axis_new = MapAxis.from_edges(edges, name=axis_name, interp=ax.interp) return axis_new def combine_significance_maps(maps): """Computes excess and significance for a set of datasets. The significance computation assumes that the model contains one degree of freedom per valid energy bin in each dataset. The method implemented here is valid under the assumption that the TS in each independent bin follows a Chi2 distribution, then the sum of the TS also follows a Chi2 distribution (with the sum of the degrees of freedom). See, Zhen (2014): https://www.sciencedirect.com/science/article/abs/pii/S0167947313003204, Lancaster (1961): https://onlinelibrary.wiley.com/doi/10.1111/j.1467-842X.1961.tb00058.x Parameters ---------- maps : list of `~gammapy.estimators.FluxMaps` List of maps with the same geometry. Returns ------- results : dict Dictionary with entries: * "significance" : joint significance map. * "df" : degree of freedom map (one norm per valid bin). * "npred_excess" : summed excess map. * "estimator_results" : dictionary containing the flux maps computed for each dataset. See also -------- get_combined_significance_maps : same method but computing the significance maps from estimators and datasets. 
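Examples
--------
A minimal usage sketch, assuming ``datasets`` is an existing list of
`~gammapy.datasets.MapDataset` objects with compatible geometries::

    from gammapy.estimators import ExcessMapEstimator
    from gammapy.estimators.utils import combine_significance_maps

    estimator = ExcessMapEstimator(correlation_radius="0.1 deg")
    results = [estimator.run(dataset) for dataset in datasets]
    combined = combine_significance_maps(results)
    significance = combined["significance"]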
""" geom = maps[0].ts.geom.to_image() ts_sum = Map.from_geom(geom) ts_sum_sign = Map.from_geom(geom) npred_excess_sum = Map.from_geom(geom) df = Map.from_geom(geom) for result in maps: df += np.sum(result["ts"].data > 0, axis=0) # one dof (norm) per valid bin ts_sum += result["ts"].reduce_over_axes() ts_sum_sign += ( result["ts"] * np.sign(result["npred_excess"]) ).reduce_over_axes() npred_excess_sum += result["npred_excess"].reduce_over_axes() significance = Map.from_geom(geom) significance.data = ts_to_sigma(ts_sum.data, df.data) * np.sign(ts_sum_sign) return dict( significance=significance, df=df, npred_excess=npred_excess_sum, estimator_results=maps, ) def get_combined_significance_maps(estimator, datasets): """Compute excess and significance for a set of datasets. The significance computation assumes that the model contains one degree of freedom per valid energy bin in each dataset. This method implemented here is valid under the assumption that the TS in each independent bin follows a Chi2 distribution, then the sum of the TS also follows a Chi2 distribution (with the sum of degree of freedom). See, Zhen (2014): https://www.sciencedirect.com/science/article/abs/pii/S0167947313003204, Lancaster (1961): https://onlinelibrary.wiley.com/doi/10.1111/j.1467-842X.1961.tb00058.x Parameters ---------- estimator : `~gammapy.estimators.ExcessMapEstimator` or `~gammapy.estimators.TSMapEstimator` Excess Map Estimator or TS Map Estimator dataset : `~gammapy.datasets.Datasets` Datasets containing only `~gammapy.datasets.MapDataset`. Returns ------- results : dict Dictionary with entries: * "significance" : joint significance map. * "df" : degree of freedom map (one norm per valid bin). * "npred_excess" : summed excess map. * "estimator_results" : dictionary containing the flux maps computed for each dataset. See also -------- combine_significance_maps : same method but using directly the significance maps from estimators """ from .map.excess import ExcessMapEstimator from .map.ts import TSMapEstimator if not isinstance(estimator, (ExcessMapEstimator, TSMapEstimator)): raise TypeError( f"estimator type should be ExcessMapEstimator or TSMapEstimator), got {type(estimator)} instead." ) results = [] for dataset in datasets: results.append(estimator.run(dataset)) return combine_significance_maps(results) def combine_flux_maps( maps, method="gaussian_errors", reference_model=None, dnde_scan_axis=None ): """Create a FluxMaps by combining a list of flux maps with the same geometry. This assumes the flux maps are independent measurements of the same true value. The GTI is stacked in the process. Parameters ---------- maps : list of `~gammapy.estimators.FluxMaps` List of maps with the same geometry. method : str * gaussian_errors : Under the gaussian error approximation the likelihood is given by the gaussian distibution. The product of gaussians is also a gaussian so can derive dnde, dnde_err, and ts. * distrib : Likelihood profile approximation assuming that probabilities distributions for flux points correspond to asymmetric gaussians and for upper limits to complementary error functions. Use available quantities among dnde, dnde_err, dnde_errp, dnde_errn, dnde_ul, and ts. * profile : Sum the likelihood profile maps. The flux maps must contains the `stat_scan` maps. Default is "gaussian_errors" which is the faster but least accurate solution, "distrib" will be more accurate if dnde_errp and dnde_errn are available, "profile" will be even more accurate if "stat_scan" is available. 
reference_model : `~gammapy.modeling.models.SkyModel`, optional Reference model to use for conversions. Default is None and is will use the reference_model of the first FluxMaps in the list. dnde_scan_axis : `~gammapy.maps.MapAxis` Map axis providing the dnde values used to compute the profile. Default is None and it will be derived from the first FluxMaps in the list. Used only if `method` is distrib or profile. Returns ------- flux_maps : `~gammapy.estimators.FluxMaps` Joint flux map. See also -------- get_combined_flux_maps : same method but using directly the flux maps from estimators """ gtis = [map_.gti for map_ in maps if map_.gti is not None] if np.any(gtis): gti = gtis[0].copy() for k in range(1, len(gtis)): gti.stack(gtis[k]) else: gti = None # TODO : change this once we have stackable metadata objets metas = [map_.meta for map_ in maps if map_.meta is not None] meta = {} if np.any(metas): for data in metas: meta.update(data) if reference_model is None: reference_model = maps[0].reference_model if method == "gaussian_errors": means = [map_.dnde.copy() for map_ in maps] sigmas = [map_.dnde_err.copy() for map_ in maps] # compensate for the ts deviation from gaussian approximation expectation in each map ts_diff = np.nansum( [ map_.ts.data - (map_.dnde.data / map_.dnde_err.data) ** 2 for map_ in maps ], axis=0, ) mean = means[0] sigma = sigmas[0] for k in range(1, len(means)): mean_k = means[k].quantity.to_value(mean.unit) sigma_k = sigmas[k].quantity.to_value(sigma.unit) mask_valid = np.isfinite(mean) & np.isfinite(sigma) & (sigma.data != 0) mask_valid_k = np.isfinite(mean_k) & np.isfinite(sigma_k) & (sigma_k != 0) mask = mask_valid & mask_valid_k mask_k = ~mask_valid & mask_valid_k mean.data[mask] = ( (mean.data * sigma_k**2 + mean_k * sigma.data**2) / (sigma.data**2 + sigma_k**2) )[mask] sigma.data[mask] = ( sigma.data * sigma_k / np.sqrt(sigma.data**2 + sigma_k**2) )[mask] mean.data[mask_k] = mean_k[mask_k] sigma.data[mask_k] = sigma_k[mask_k] ts = mean * mean / sigma / sigma + ts_diff ts.data[~np.isfinite(ts.data)] = np.nan kwargs = dict( sed_type="dnde", reference_model=reference_model, meta=meta, gti=gti ) return FluxMaps.from_maps(dict(dnde=mean, dnde_err=sigma, ts=ts), **kwargs) elif method in ["distrib", "profile"]: if dnde_scan_axis is None: dnde_scan_axis = _default_scan_map(maps[0]).geom.axes["dnde"] for k, map_ in enumerate(maps): if method == "profile": map_stat_scan = interpolate_profile_map(map_, dnde_scan_axis) else: map_stat_scan = approximate_profile_map(map_, dnde_scan_axis) map_stat_scan.data[np.isnan(map_stat_scan.data)] = 0.0 if k == 0: stat_scan = map_stat_scan else: stat_scan.data += map_stat_scan.data return get_flux_map_from_profile( {"stat_scan": stat_scan}, reference_model=reference_model, meta=meta, gti=gti, ) else: raise ValueError( f'Invalid method provided : {method}. Available methods are : "gaussian_errors", "distrib", "profile"' ) def get_combined_flux_maps( estimator, datasets, method="gaussian_errors", reference_model=None, dnde_scan_axis=None, ): """Create a `~gammapy.estimators.FluxMaps` by combining a list of flux maps with the same geometry. This assumes the flux maps are independent measurements of the same true value. The GTI is stacked in the process. 
Parameters ---------- estimator : `~gammapy.estimators.ExcessMapEstimator` or `~gammapy.estimators.TSMapEstimator` Excess Map Estimator or TS Map Estimator dataset : `~gammapy.datasets.Datasets` or list of `~gammapy.datasets.MapDataset` Datasets containing only `~gammapy.datasets.MapDataset`. method : str * gaussian_errors : Under the gaussian error approximation the likelihood is given by the gaussian distibution. The product of gaussians is also a gaussian so can derive dnde, dnde_err, and ts. * distrib : Likelihood profile approximation assuming that probabilities distributions for flux points correspond to asymmetric gaussians and for upper limits to complementary error functions. Use available quantities among dnde, dnde_err, dnde_errp, dnde_errn, dnde_ul, and ts. * profile : Sum the likelihood profile maps. The flux maps must contains the `stat_scan` maps. Default is "gaussian_errors" which is the faster but least accurate solution, "distrib" will be more accurate if dnde_errp and dnde_errn are available, "profile" will be even more accurate if "stat_scan" is available. reference_model : `~gammapy.modeling.models.SkyModel`, optional Reference model to use for conversions. Default is None and is will use the reference_model of the first FluxMaps in the list. dnde_scan_axis : `~gammapy.maps.MapAxis`, optional Map axis providing the dnde values used to compute the profile. If None, it will be derived from the first FluxMaps in the list. Default is None. Used only if `method` is "distrib" or "profile". Returns ------- results : dict Dictionary with entries: * "flux_maps" : `gammapy.estimators.FluxMaps` * "estimator_results" : dictionary containing the flux maps computed for each dataset. See also -------- combine_flux_maps : same method but using directly the flux maps from estimators """ from .map.excess import ExcessMapEstimator from .map.ts import TSMapEstimator if not isinstance(estimator, (ExcessMapEstimator, TSMapEstimator)): raise TypeError( f"`estimator` type should be ExcessMapEstimator or TSMapEstimator), got {type(estimator)} instead." ) results = [] for dataset in datasets: results.append(estimator.run(dataset)) output = dict() output["flux_maps"] = combine_flux_maps( results, method=method, reference_model=reference_model, dnde_scan_axis=dnde_scan_axis, ) output["estimator_results"] = results return output def _default_scan_map(flux_map, dnde_scan_axis=None): if dnde_scan_axis is None: dnde_scan_axis = MapAxis( _generate_scan_values() * flux_map.dnde_ref.squeeze(), interp="lin", node_type="center", name="dnde", unit=flux_map.dnde_ref.unit, ) geom = flux_map.dnde.geom geom_scan = geom.to_image().to_cube([dnde_scan_axis] + list(geom.axes)) return Map.from_geom(geom_scan, data=np.nan, unit="") def interpolate_profile_map(flux_map, dnde_scan_axis=None): """Interpolate sparse likelihood profile to regular grid. Parameters ---------- flux_map : `~gammapy.estimators.FluxMaps` Flux map. dnde_scan_axis : `~gammapy.maps.MapAxis` Map axis providing the dnde values used to compute the profile. Default is None and it will be derived from the flux_map. Returns ------- scan_map: `~gammapy.estimators.Maps` Likelihood profile map. 
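Examples
--------
A minimal sketch, assuming ``flux_map`` is an existing
`~gammapy.estimators.FluxMaps` that contains the sparse ``stat_scan``
and ``dnde_scan_values`` quantities::

    from gammapy.estimators.utils import interpolate_profile_map

    stat_scan = interpolate_profile_map(flux_map)
    print(stat_scan.geom.axes["dnde"].nbin)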
""" stat_scan = _default_scan_map(flux_map, dnde_scan_axis) dnde_scan_axis = stat_scan.geom.axes["dnde"] mask_valid = ~np.isnan(flux_map.dnde.data) dnde_scan_values = flux_map.dnde_scan_values.quantity.to_value(dnde_scan_axis.unit) for ij, il, ik in zip(*np.where(mask_valid)): spline = InterpolatedUnivariateSpline( dnde_scan_values[ij, :, il, ik], flux_map.stat_scan.data[ij, :, il, ik], k=1, ext="raise", check_finite=True, ) stat_scan.data[ij, :, il, ik] = spline(dnde_scan_axis.center) return stat_scan def approximate_profile_map( flux_map, dnde_scan_axis=None, sqrt_ts_threshold_ul="ignore" ): """Likelihood profile approximation assuming that probabilities distributions for flux points correspond to asymmetric gaussians and for upper limits to complementary error functions. Use available quantities among dnde, dnde_err, dnde_errp, dnde_errn, dnde_ul and ts. Parameters ---------- flux_map : `~gammapy.estimators.FluxMaps` Flux map. dnde_scan_axis : `~gammapy.maps.MapAxis` Map axis providing the dnde values used to compute the profile. Default is None and it will be derived from the flux_map. sqrt_ts_threshold_ul : int Threshold value in sqrt(TS) for upper limits. Default is `ignore` and no threshold is applied. Setting to `None` will use the one of `flux_map`. Returns ------- scan_map: `~gammapy.estimators.Maps` Likelihood profile map. """ stat_approx = _default_scan_map(flux_map, dnde_scan_axis) dnde_coord = stat_approx.geom.get_coord()["dnde"].value if sqrt_ts_threshold_ul is None: sqrt_ts_threshold_ul = flux_map.sqrt_ts_threshold_ul mask_valid = ~np.isnan(flux_map.dnde.data) ij, il, ik = np.where(mask_valid) loc = flux_map.dnde.data[mask_valid][:, None] value = dnde_coord[ij, :, il, ik] try: mask_p = dnde_coord >= flux_map.dnde.data mask_p2d = mask_p[ij, :, il, ik] new_axis = np.ones(mask_p2d.shape[1], dtype=bool)[None, :] scale = np.zeros(mask_p2d.shape) scale[mask_p2d] = (flux_map.dnde_errp.data[mask_valid][:, None] * new_axis)[ mask_p2d ] scale[~mask_p2d] = (flux_map.dnde_errn.data[mask_valid][:, None] * new_axis)[ ~mask_p2d ] except AttributeError: scale = flux_map.dnde_err.data[mask_valid] scale = scale[:, None] stat_approx.data[ij, :, il, ik] = ((value - loc) / scale) ** 2 try: invalid_value = 999 stat_min_p = (stat_approx.data + invalid_value * (~mask_p)).min( axis=1, keepdims=True ) stat_min_m = (stat_approx.data + invalid_value * mask_p).min( axis=1, keepdims=True ) mask_minp = mask_p & (stat_min_p > stat_min_m) stat_approx.data[mask_minp] = (stat_approx.data + stat_min_m - stat_min_p)[ mask_minp ] mask_minn = ~mask_p & (stat_min_m >= stat_min_p) stat_approx.data[mask_minn] = (stat_approx.data + stat_min_p - stat_min_m)[ mask_minn ] except NameError: pass if not sqrt_ts_threshold_ul == "ignore" and sqrt_ts_threshold_ul is not None: mask_ul = (flux_map.sqrt_ts.data < sqrt_ts_threshold_ul) & ~np.isnan( flux_map.dnde_ul.data ) ij, il, ik = np.where(mask_ul) value = dnde_coord[ij, :, il, ik] loc_ul = flux_map.dnde_ul.data[mask_ul][:, None] scale_ul = flux_map.dnde_ul.data[mask_ul][:, None] stat_approx.data[ij, :, il, ik] = -2 * np.log( (special.erfc((-loc_ul + value) / scale_ul) / 2) / (special.erfc((-loc_ul + 0) / scale_ul) / 2) ) stat_approx.data[np.isnan(stat_approx.data)] = np.inf stat_approx.data += -flux_map.ts.data - stat_approx.data.min(axis=1) return stat_approx def get_flux_map_from_profile( flux_map, n_sigma=1, n_sigma_ul=2, reference_model=None, meta=None, gti=None ): """Create a new flux map using the likehood profile (stat_scan) to get ts, dnde, dnde_err, dnde_errp, 
dnde_errn, and dnde_ul. Parameters ---------- flux_maps : `~gammapy.estimators.FluxMaps` or dict of `~gammapy.maps.WcsNDMap` Flux map or dict containing a `stat_scan` entry n_sigma : int Number of sigma for flux error. Default is 1. n_sigma_ul : int Number of sigma for flux upper limits. Default is 2. reference_model : `~gammapy.modeling.models.SkyModel`, optional The reference model to use for conversions. If None, a model consisting of a point source with a power law spectrum of index 2 is assumed. Default is None and the one of `flux_map` will be used if available meta : dict, optional Dict of metadata. Default is None and the one of `flux_map` will be used if available gti : `~gammapy.data.GTI`, optional Maps GTI information. Default is None and the one of `flux_map` will be used if available Returns ------- flux_maps : `~gammapy.estimators.FluxMaps` Flux map. """ if isinstance(flux_map, dict): output_maps = flux_map else: if reference_model is None: reference_model = flux_map.reference_model if gti is None: gti = flux_map.gti if meta is None: meta = flux_map.meta output_maps = dict( stat_scan=flux_map.stat_scan, dnde_scan_values=flux_map.dnde_scan_values ) if getattr(flux_map, "dnde_scan_values", False): dnde_coord = flux_map["dnde_scan_values"].quantity else: dnde_coord = flux_map["stat_scan"].geom.get_coord()["dnde"] geom = ( flux_map["stat_scan"] .geom.to_image() .to_cube([flux_map["stat_scan"].geom.axes["energy"]]) ) ts = -flux_map["stat_scan"].data.min(axis=1) * u.Unit("") ind = flux_map["stat_scan"].data.argmin(axis=1) ij, ik, il = np.indices(ind.shape) dnde = dnde_coord[ij, ind, ik, il] maskp = dnde_coord > dnde stat_diff = flux_map["stat_scan"].data - flux_map["stat_scan"].data.min(axis=1) invalid_value = 999 ind = np.abs(stat_diff + invalid_value * maskp - n_sigma**2).argmin(axis=1) dnde_errn = dnde - dnde_coord[ij, ind, ik, il] ind = np.abs(stat_diff + invalid_value * (~maskp) - n_sigma**2).argmin(axis=1) dnde_errp = dnde_coord[ij, ind, ik, il] - dnde ind = np.abs(stat_diff + invalid_value * (~maskp) - n_sigma_ul**2).argmin(axis=1) dnde_ul = dnde_coord[ij, ind, ik, il] dnde_err = (dnde_errn + dnde_errp) / 2 maps = dict( ts=ts, dnde=dnde, dnde_err=dnde_err, dnde_errn=dnde_errn, dnde_errp=dnde_errp, dnde_ul=dnde_ul, ) for key in maps.keys(): maps[key] = Map.from_geom(geom, data=maps[key].value, unit=maps[key].unit) kwargs = dict(sed_type="dnde", gti=gti, reference_model=reference_model, meta=meta) output_maps.update(maps) return FluxMaps.from_maps(output_maps, **kwargs) def _generate_scan_values(power_min=-6, power_max=2, relative_error=1e-2): """Values sampled such as we can probe a given `relative_error` on the norm between 10**`power_min` and 10**`power_max`. 
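For instance, the default grid is symmetric around zero, which a quick
sketch can verify::

    values = _generate_scan_values()
    assert values[0] == -values[-1]  # one-sided grid mirrored around zero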
""" arrays = [] for power in range(power_min, power_max): vmin = 10**power vmax = 10 ** (power + 1) bin_per_decade = int((vmax - vmin) / (vmin * relative_error)) arrays.append(np.linspace(vmin, vmax, bin_per_decade + 1, dtype=np.float32)) scan_1side = np.unique(np.concatenate(arrays)) return np.concatenate((-scan_1side[::-1], [0], scan_1side)) def _get_default_norm( norm, scan_min=0.2, scan_max=5, scan_n_values=11, scan_values=None, interp="lin", ): """Create default norm parameter.""" if norm is None or isinstance(norm, dict): norm_kwargs = dict( name="norm", value=1, unit="", interp=interp, frozen=False, scan_min=scan_min, scan_max=scan_max, scan_n_values=scan_n_values, scan_values=scan_values, ) if isinstance(norm, dict): norm_kwargs.update(norm) try: norm = Parameter(**norm_kwargs) except TypeError as error: raise TypeError(f"Invalid dict key for norm init : {error}") if norm.name != "norm": raise ValueError("norm.name is not 'norm'") return norm def _get_norm_scan_values(norm, result): """Compute norms based on the fit result to sample the stat profile at different scales.""" norm_err = result["norm_err"] norm_value = result["norm"] if ~np.isfinite(norm_err) or norm_err == 0: norm_err = 0.1 if ~np.isfinite(norm_value) or norm_value == 0: norm_value = 1.0 sparse_norms = np.concatenate( ( norm_value + np.linspace(-2.5, 2.5, 51) * norm_err, norm_value + np.linspace(-10, 10, 21) * norm_err, np.abs(norm_value) * np.linspace(-10, 10, 21), np.linspace(-10, 10, 21), np.linspace(norm.scan_values[0], norm.scan_values[-1], 2), ) ) sparse_norms = np.unique(sparse_norms) if len(sparse_norms) != 109: rand_norms = 20 * np.random.rand(109 - len(sparse_norms)) - 10 sparse_norms = np.concatenate((sparse_norms, rand_norms)) return np.sort(sparse_norms) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.204642 gammapy-1.3/gammapy/extern/0000755000175100001770000000000014721316215015357 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/extern/__init__.py0000644000175100001770000000065514721316200017470 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """ This package contains `extern` code, i.e. code that we just copied here into the `gammapy.extern` package, because we wanted to use it, but not have an extra dependency (these are single-file external packages). 
* ``xmltodict.py`` for easily converting XML from / to Python dicts Origin: https://github.com/martinblech/xmltodict/blob/master/xmltodict.py """ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/extern/xmltodict.py0000644000175100001770000002775614721316200017753 0ustar00runnerdocker#!/usr/bin/env python "Makes working with XML feel like you are working with JSON" from xml.parsers import expat from xml.sax.saxutils import XMLGenerator from xml.sax.xmlreader import AttributesImpl try: # pragma no cover from cStringIO import StringIO except ImportError: # pragma no cover try: from StringIO import StringIO except ImportError: from io import StringIO try: # pragma no cover from collections import OrderedDict except ImportError: # pragma no cover try: from ordereddict import OrderedDict except ImportError: OrderedDict = dict try: # pragma no cover _basestring = basestring except NameError: # pragma no cover _basestring = str try: # pragma no cover _unicode = unicode except NameError: # pragma no cover _unicode = str __author__ = "Martin Blech" __version__ = "0.9.0" __license__ = "MIT" class ParsingInterrupted(Exception): pass class _DictSAXHandler(object): def __init__( self, item_depth=0, item_callback=lambda *args: True, xml_attribs=True, attr_prefix="@", cdata_key="#text", force_cdata=False, cdata_separator="", postprocessor=None, dict_constructor=OrderedDict, strip_whitespace=True, namespace_separator=":", namespaces=None, ): self.path = [] self.stack = [] self.data = None self.item = None self.item_depth = item_depth self.xml_attribs = xml_attribs self.item_callback = item_callback self.attr_prefix = attr_prefix self.cdata_key = cdata_key self.force_cdata = force_cdata self.cdata_separator = cdata_separator self.postprocessor = postprocessor self.dict_constructor = dict_constructor self.strip_whitespace = strip_whitespace self.namespace_separator = namespace_separator self.namespaces = namespaces def _build_name(self, full_name): if not self.namespaces: return full_name i = full_name.rfind(self.namespace_separator) if i == -1: return full_name namespace, name = full_name[:i], full_name[i + 1 :] short_namespace = self.namespaces.get(namespace, namespace) if not short_namespace: return name else: return self.namespace_separator.join((short_namespace, name)) def _attrs_to_dict(self, attrs): if isinstance(attrs, dict): return attrs return self.dict_constructor(zip(attrs[0::2], attrs[1::2])) def startElement(self, full_name, attrs): name = self._build_name(full_name) attrs = self._attrs_to_dict(attrs) self.path.append((name, attrs or None)) if len(self.path) > self.item_depth: self.stack.append((self.item, self.data)) if self.xml_attribs: attrs = self.dict_constructor( (self.attr_prefix + key, value) for (key, value) in attrs.items() ) else: attrs = None self.item = attrs or None self.data = None def endElement(self, full_name): name = self._build_name(full_name) if len(self.path) == self.item_depth: item = self.item if item is None: item = self.data should_continue = self.item_callback(self.path, item) if not should_continue: raise ParsingInterrupted() if len(self.stack): item, data = self.item, self.data self.item, self.data = self.stack.pop() if self.strip_whitespace and data is not None: data = data.strip() or None if data and self.force_cdata and item is None: item = self.dict_constructor() if item is not None: if data: self.push_data(item, self.cdata_key, data) self.item = self.push_data(self.item, name, item) else: 
self.item = self.push_data(self.item, name, data) else: self.item = self.data = None self.path.pop() def characters(self, data): if not self.data: self.data = data else: self.data += self.cdata_separator + data def push_data(self, item, key, data): if self.postprocessor is not None: result = self.postprocessor(self.path, key, data) if result is None: return item key, data = result if item is None: item = self.dict_constructor() try: value = item[key] if isinstance(value, list): value.append(data) else: item[key] = [value, data] except KeyError: item[key] = data return item def parse( xml_input, encoding=None, expat=expat, process_namespaces=False, namespace_separator=":", **kwargs, ): """Parse the given XML input and convert it into a dictionary. `xml_input` can either be a `string` or a file-like object. If `xml_attribs` is `True`, element attributes are put in the dictionary among regular child elements, using `@` as a prefix to avoid collisions. If set to `False`, they are just ignored. Simple example:: >>> from gammapy.extern import xmltodict >>> doc = xmltodict.parse(\"\"\" ... ... 1 ... 2 ... ... \"\"\") >>> doc['a']['@prop'] 'x' >>> doc['a']['b'] ['1', '2'] If `item_depth` is `0`, the function returns a dictionary for the root element (default behavior). Otherwise, it calls `item_callback` every time an item at the specified depth is found and returns `None` in the end (streaming mode). The callback function receives two parameters: the `path` from the document root to the item (name-attribs pairs), and the `item` (dict). If the callback's return value is false-ish, parsing will be stopped with the :class:`ParsingInterrupted` exception. Streaming example:: >>> def handle(path, item): ... print('path:%s item:%s' % (path, item)) ... return True ... >>> xmltodict.parse(\"\"\" ... ... 1 ... 2 ... \"\"\", item_depth=2, item_callback=handle) path:[('a', OrderedDict([('prop', 'x')])), ('b', None)] item: 1 path:[('a', OrderedDict([('prop', 'x')])), ('b', None)] item: 2 The optional argument `postprocessor` is a function that takes `path`, `key` and `value` as positional arguments and returns a new `(key, value)` pair where both `key` and `value` may have changed. Usage example:: >>> def postprocessor(path, key, value): ... try: ... return key + ':int', int(value) ... except (ValueError, TypeError): ... return key, value >>> xmltodict.parse('12x', ... postprocessor=postprocessor) OrderedDict([('a', OrderedDict([('b:int', [1, 2]), ('b', 'x')]))]) You can pass an alternate version of `expat` (such as `defusedexpat`) by using the `expat` parameter. 
E.g: >>> import defusedexpat >>> xmltodict.parse('hello', expat=defusedexpat.pyexpat) OrderedDict([(u'a', u'hello')]) """ handler = _DictSAXHandler(namespace_separator=namespace_separator, **kwargs) if isinstance(xml_input, _unicode): if not encoding: encoding = "utf-8" xml_input = xml_input.encode(encoding) if not process_namespaces: namespace_separator = None parser = expat.ParserCreate(encoding, namespace_separator) try: parser.ordered_attributes = True except AttributeError: # Jython's expat does not support ordered_attributes pass parser.StartElementHandler = handler.startElement parser.EndElementHandler = handler.endElement parser.CharacterDataHandler = handler.characters parser.buffer_text = True try: parser.ParseFile(xml_input) except (TypeError, AttributeError): parser.Parse(xml_input, True) return handler.item def _emit( key, value, content_handler, attr_prefix="@", cdata_key="#text", depth=0, preprocessor=None, pretty=False, newl="\n", indent="\t", ): if preprocessor is not None: result = preprocessor(key, value) if result is None: return key, value = result if not isinstance(value, (list, tuple)): value = [value] if depth == 0 and len(value) > 1: raise ValueError("document with multiple roots") for v in value: if v is None: v = OrderedDict() elif not isinstance(v, dict): v = _unicode(v) if isinstance(v, _basestring): v = OrderedDict(((cdata_key, v),)) cdata = None attrs = OrderedDict() children = [] for ik, iv in v.items(): if ik == cdata_key: cdata = iv continue if ik.startswith(attr_prefix): attrs[ik[len(attr_prefix) :]] = iv continue children.append((ik, iv)) if pretty: content_handler.ignorableWhitespace(depth * indent) content_handler.startElement(key, AttributesImpl(attrs)) if pretty and children: content_handler.ignorableWhitespace(newl) for child_key, child_value in children: _emit( child_key, child_value, content_handler, attr_prefix, cdata_key, depth + 1, preprocessor, pretty, newl, indent, ) if cdata is not None: content_handler.characters(cdata) if pretty and children: content_handler.ignorableWhitespace(depth * indent) content_handler.endElement(key) if pretty and depth: content_handler.ignorableWhitespace(newl) def unparse(input_dict, output=None, encoding="utf-8", full_document=True, **kwargs): """Emit an XML document for the given `input_dict` (reverse of `parse`). The resulting XML document is returned as a string, but if `output` (a file-like object) is specified, it is written there instead. Dictionary keys prefixed with `attr_prefix` (default=`'@'`) are interpreted as XML node attributes, whereas keys equal to `cdata_key` (default=`'#text'`) are treated as character data. The `pretty` parameter (default=`False`) enables pretty-printing. In this mode, lines are terminated with `'\n'` and indented with `'\t'`, but this can be customized with the `newl` and `indent` parameters. 
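Simple example, inverting the kind of dictionary produced by ``parse``::

    >>> from gammapy.extern import xmltodict
    >>> xmltodict.unparse({'a': {'@prop': 'x', 'b': ['1', '2']}}, full_document=False)
    '<a prop="x"><b>1</b><b>2</b></a>'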
""" ((key, value),) = input_dict.items() must_return = False if output is None: output = StringIO() must_return = True content_handler = XMLGenerator(output, encoding) if full_document: content_handler.startDocument() _emit(key, value, content_handler, **kwargs) if full_document: content_handler.endDocument() if must_return: value = output.getvalue() try: # pragma no cover value = value.decode(encoding) except AttributeError: # pragma no cover pass return value if __name__ == "__main__": # pragma: no cover import marshal import sys (item_depth,) = sys.argv[1:] item_depth = int(item_depth) def handle_item(path, item): marshal.dump((path, item), sys.stdout) return True try: root = parse( sys.stdin, item_depth=item_depth, item_callback=handle_item, dict_constructor=dict, ) if item_depth == 0: handle_item([], root) except KeyboardInterrupt: pass ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.204642 gammapy-1.3/gammapy/irf/0000755000175100001770000000000014721316215014632 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/__init__.py0000644000175100001770000000247514721316200016745 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Instrument response functions (IRFs).""" from gammapy.utils.registry import Registry from .background import Background2D, Background3D, BackgroundIRF from .core import IRF, FoVAlignment, IRFMap from .edisp import EDispKernel, EDispKernelMap, EDispMap, EnergyDispersion2D from .effective_area import EffectiveAreaTable2D from .io import load_irf_dict_from_file from .psf import ( PSF3D, EnergyDependentMultiGaussPSF, ParametricPSF, PSFKernel, PSFKing, PSFMap, RecoPSFMap, ) from .rad_max import RadMax2D __all__ = [ "Background2D", "Background3D", "BackgroundIRF", "EDispKernel", "EDispKernelMap", "EDispMap", "EffectiveAreaTable2D", "EnergyDependentMultiGaussPSF", "EnergyDispersion2D", "FoVAlignment", "IRF_REGISTRY", "IRFMap", "IRF", "load_irf_dict_from_file", "ParametricPSF", "PSF3D", "PSFKernel", "PSFKing", "PSFMap", "RecoPSFMap", "RadMax2D", ] IRF_REGISTRY = Registry( [ EffectiveAreaTable2D, EnergyDispersion2D, PSF3D, EnergyDependentMultiGaussPSF, PSFKing, Background3D, Background2D, PSFMap, RecoPSFMap, EDispKernelMap, RadMax2D, EDispMap, ] ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/background.py0000644000175100001770000003770014721316200017324 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np import astropy.units as u from astropy.coordinates import angular_separation from astropy.visualization import quantity_support import matplotlib.pyplot as plt from matplotlib.colors import LogNorm from gammapy.maps import MapAxes, MapAxis from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.visualization.utils import add_colorbar from .core import IRF from .io import gadf_is_pointlike __all__ = ["Background3D", "Background2D", "BackgroundIRF"] log = logging.getLogger(__name__) class BackgroundIRF(IRF): """Background IRF base class. Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Axes. data : `~np.ndarray` Data array. unit : str or `~astropy.units.Unit` Data unit usually ``s^-1 MeV^-1 sr^-1``. meta : dict Metadata dictionary. 
""" default_interp_kwargs = dict(bounds_error=False, fill_value=0.0, values_scale="log") """Default Interpolation kwargs to extrapolate.""" @classmethod def from_table(cls, table, format="gadf-dl3"): """Read from `~astropy.table.Table`. Parameters ---------- table : `~astropy.table.Table` Table with background data. format : {"gadf-dl3"} Format specification. Default is "gadf-dl3". Returns ------- bkg : `Background2D` or `Background2D` Background IRF class. """ # TODO: some of the existing background files have missing HDUCLAS keywords # which are required to define the correct Gammapy axis names if "HDUCLAS2" not in table.meta: log.warning("Missing 'HDUCLAS2' keyword assuming 'BKG'") table = table.copy() table.meta["HDUCLAS2"] = "BKG" axes = MapAxes.from_table(table, format=format)[cls.required_axes] # TODO: spec says key should be "BKG", but there are files around # (e.g. CTA 1DC) that use "BGD". For now we support both if "BKG" in table.colnames: bkg_name = "BKG" elif "BGD" in table.colnames: bkg_name = "BGD" else: raise ValueError("Invalid column names. Need 'BKG' or 'BGD'.") data = table[bkg_name].quantity[0].T if data.unit == "" or isinstance(data.unit, u.UnrecognizedUnit): data = u.Quantity(data.value, "s-1 MeV-1 sr-1", copy=COPY_IF_NEEDED) log.warning( "Invalid unit found in background table! Assuming (s-1 MeV-1 sr-1)" ) # TODO: The present HESS and CTA background fits files # have a reverse order (lon, lat, E) than recommended in GADF(E, lat, lon) # For now, we support both. if axes.shape == axes.shape[::-1]: log.error("Ambiguous axes order in Background fits files!") if np.shape(data) != axes.shape: log.debug("Transposing background table on read") data = data.transpose() return cls( axes=axes, data=data.value, meta=table.meta, unit=data.unit, is_pointlike=gadf_is_pointlike(table.meta), fov_alignment=table.meta.get("FOVALIGN", "RADEC"), ) class Background3D(BackgroundIRF): """Background 3D. Data format specification: :ref:`gadf:bkg_3d`. Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Required axes (in the given order) are: * energy (reconstructed energy axis) * fov_lon (field of view longitude) * fov_lon (field of view latitude) data : `~np.ndarray` Data array. unit : str or `~astropy.units.Unit` Data unit usually ``s^-1 MeV^-1 sr^-1``. fov_alignment : `~gammapy.irf.FoVAlignment` The orientation of the field of view coordinate system. meta : dict Metadata dictionary. Examples -------- Here's an example you can use to learn about this class: >>> from gammapy.irf import Background3D >>> filename = '$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits' >>> bkg_3d = Background3D.read(filename, hdu='BACKGROUND') >>> print(bkg_3d) Background3D ------------ axes : ['energy', 'fov_lon', 'fov_lat'] shape : (21, 36, 36) ndim : 3 unit : 1 / (MeV s sr) dtype : >f4 """ tag = "bkg_3d" required_axes = ["energy", "fov_lon", "fov_lat"] default_unit = u.s**-1 * u.MeV**-1 * u.sr**-1 def to_2d(self): """Convert to `Background2D`. This takes the values at Y = 0 and X >= 0. """ # TODO: this is incorrect as it misses the Jacobian? 
idx_lon = self.axes["fov_lon"].coord_to_idx(0 * u.deg)[0] idx_lat = self.axes["fov_lat"].coord_to_idx(0 * u.deg)[0] data = self.quantity[:, idx_lon:, idx_lat].copy() offset = self.axes["fov_lon"].edges[idx_lon:] offset_axis = MapAxis.from_edges(offset, name="offset") return Background2D( axes=[self.axes["energy"], offset_axis], data=data.value, unit=data.unit ) def peek(self, figsize=(10, 8)): """Quick-look summary plots. Parameters ---------- figsize : tuple, optional Size of the figure. Default is (10, 8). """ return self.to_2d().peek(figsize) def plot_at_energy( self, energy=1 * u.TeV, add_cbar=True, ncols=3, figsize=None, axes_loc=None, kwargs_colorbar=None, **kwargs, ): """Plot the background rate in FoV coordinates at a given energy. Parameters ---------- energy : `~astropy.units.Quantity`, optional List of energies. Default is 1 TeV. add_cbar : bool, optional Add color bar. Default is True. ncols : int, optional Number of columns to plot. Default is 3. figsize : tuple, optional Figure size. Default is None. axes_loc : dict, optional Keyword arguments passed to `~mpl_toolkits.axes_grid1.axes_divider.AxesDivider.append_axes`. kwargs_colorbar : dict, optional Keyword arguments passed to `~matplotlib.pyplot.colorbar`. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.pcolormesh`. """ kwargs_colorbar = kwargs_colorbar or {} n = len(energy) cols = min(ncols, n) rows = 1 + (n - 1) // cols width = 12 cfraction = 0.0 if add_cbar: cfraction = 0.15 if figsize is None: figsize = (width, rows * width // (cols * (1 + cfraction))) fig, axes = plt.subplots( ncols=cols, nrows=rows, figsize=figsize, gridspec_kw={"hspace": 0.2, "wspace": 0.3}, ) x = self.axes["fov_lat"].edges y = self.axes["fov_lon"].edges X, Y = np.meshgrid(x, y) for i, ee in enumerate(energy): if len(energy) == 1: ax = axes else: ax = axes.flat[i] bkg = self.evaluate(energy=ee) bkg_unit = bkg.unit bkg = bkg.value with quantity_support(): caxes = ax.pcolormesh(X, Y, bkg.squeeze(), **kwargs) self.axes["fov_lat"].format_plot_xaxis(ax) self.axes["fov_lon"].format_plot_yaxis(ax) ax.set_title(str(ee)) if add_cbar: label = f"Background [{bkg_unit.to_string(UNIT_STRING_FORMAT)}]" kwargs_colorbar.setdefault("label", label) cbar = add_colorbar(caxes, ax=ax, axes_loc=axes_loc, **kwargs_colorbar) cbar.formatter.set_powerlimits((0, 0)) row, col = np.unravel_index(i, shape=(rows, cols)) if col > 0: ax.set_ylabel("") if row < rows - 1: ax.set_xlabel("") ax.set_aspect("equal", "box") class Background2D(BackgroundIRF): """Background 2D. Data format specification: :ref:`gadf:bkg_2d` Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Required axes (in the given order) are: * energy (reconstructed energy axis) * offset (field of view offset axis) data : `~np.ndarray` Data array. unit : str or `~astropy.units.Unit` Data unit usually ``s^-1 MeV^-1 sr^-1``. meta : dict Metadata dictionary. 
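Examples
--------
A minimal construction sketch with dummy data; the axis binning below is
arbitrary:

>>> import numpy as np
>>> from gammapy.maps import MapAxis
>>> from gammapy.irf import Background2D
>>> energy_axis = MapAxis.from_energy_bounds("0.1 TeV", "100 TeV", nbin=5, name="energy")
>>> offset_axis = MapAxis.from_bounds(0, 3, nbin=6, unit="deg", name="offset")
>>> data = np.ones((energy_axis.nbin, offset_axis.nbin))
>>> bkg_2d = Background2D(axes=[energy_axis, offset_axis], data=data, unit="s-1 MeV-1 sr-1")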
""" tag = "bkg_2d" required_axes = ["energy", "offset"] default_unit = u.s**-1 * u.MeV**-1 * u.sr**-1 def to_3d(self): """Convert to Background3D.""" offsets = self.axes["offset"].edges edges_neg = np.negative(offsets)[::-1] edges_neg = edges_neg[edges_neg <= 0] edges = np.concatenate((edges_neg, offsets[offsets > 0])) fov_lat = MapAxis.from_edges(edges=edges, name="fov_lat") fov_lon = MapAxis.from_edges(edges=edges, name="fov_lon") axes = MapAxes([self.axes["energy"], fov_lon, fov_lat]) coords = axes.get_coord() offset = angular_separation( 0 * u.rad, 0 * u.rad, coords["fov_lon"], coords["fov_lat"] ) data = self.evaluate(offset=offset, energy=coords["energy"]) return Background3D( axes=axes, data=data, ) def plot_at_energy( self, energy=1 * u.TeV, add_cbar=True, ncols=3, figsize=None, **kwargs ): """Plot the background rate in FoV coordinates at a given energy. Parameters ---------- energy : `~astropy.units.Quantity`, optional List of energy. Default is 1 TeV. add_cbar : bool, optional Add color bar. Default is True. ncols : int, optional Number of columns to plot. Default is 3. figsize : tuple, optional Figure size. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.pcolormesh`. """ bkg_3d = self.to_3d() bkg_3d.plot_at_energy( energy=energy, add_cbar=add_cbar, ncols=ncols, figsize=figsize, **kwargs ) def plot( self, ax=None, add_cbar=True, axes_loc=None, kwargs_colorbar=None, **kwargs ): """Plot energy offset dependence of the background model. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. add_cbar : bool, optional Add a colorbar to the plot. Default is True. axes_loc : dict, optional Keyword arguments passed to `~mpl_toolkits.axes_grid1.axes_divider.AxesDivider.append_axes`. kwargs_colorbar : dict, optional Keyword arguments passed to `~matplotlib.pyplot.colorbar`. kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.pcolormesh`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax energy_axis, offset_axis = self.axes["energy"], self.axes["offset"] data = self.quantity.value kwargs.setdefault("cmap", "GnBu") kwargs.setdefault("edgecolors", "face") kwargs.setdefault("norm", LogNorm()) kwargs_colorbar = kwargs_colorbar or {} with quantity_support(): caxes = ax.pcolormesh( energy_axis.edges, offset_axis.edges, data.T, **kwargs ) energy_axis.format_plot_xaxis(ax=ax) offset_axis.format_plot_yaxis(ax=ax) if add_cbar: label = ( f"Background rate [{self.quantity.unit.to_string(UNIT_STRING_FORMAT)}]" ) kwargs_colorbar.setdefault("label", label) add_colorbar(caxes, ax=ax, axes_loc=axes_loc, **kwargs_colorbar) def plot_offset_dependence(self, ax=None, energy=None, **kwargs): """Plot background rate versus offset for a given energy. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. energy : `~astropy.units.Quantity`, optional Energy. Default is None. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. 
""" ax = plt.gca() if ax is None else ax if energy is None: energy_axis = self.axes["energy"] e_min, e_max = np.log10(energy_axis.center.value[[0, -1]]) energy = np.logspace(e_min, e_max, 4) * energy_axis.unit offset_axis = self.axes["offset"] for ee in energy: bkg = self.evaluate(offset=offset_axis.center, energy=ee) if np.isnan(bkg).all(): continue label = f"energy = {ee:.1f}" with quantity_support(): ax.plot(offset_axis.center, bkg, label=label, **kwargs) offset_axis.format_plot_xaxis(ax=ax) ax.set_ylabel( f"Background rate [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) ax.set_yscale("log") ax.legend(loc="upper right") return ax def plot_energy_dependence(self, ax=None, offset=None, **kwargs): """Plot background rate versus energy for a given offset. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. offset : `~astropy.coordinates.Angle`, optional Offset. Default is None. kwargs : dict Forwarded to plt.plot(). Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax if offset is None: offset_axis = self.axes["offset"] off_min, off_max = offset_axis.center.value[[0, -1]] offset = np.linspace(off_min, off_max, 4) * offset_axis.unit energy_axis = self.axes["energy"] for off in offset: bkg = self.evaluate(offset=off, energy=energy_axis.center) label = f"offset = {off:.2f}" with quantity_support(): ax.plot(energy_axis.center, bkg, label=label, **kwargs) energy_axis.format_plot_xaxis(ax=ax) ax.set_yscale("log") ax.set_ylabel( f"Background rate [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) ax.legend(loc="best") return ax def plot_spectrum(self, ax=None, **kwargs): """Plot angle integrated background rate versus energy. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. **kwargs : dict Keyword arguments forwarded to `~matplotib.pyplot.plot`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. 
""" ax = plt.gca() if ax is None else ax offset_axis = self.axes["offset"] energy_axis = self.axes["energy"] bkg = self.integral(offset=offset_axis.bounds[1], axis_name="offset") bkg = bkg.to(u.Unit("s-1") / energy_axis.unit) with quantity_support(): ax.plot(energy_axis.center, bkg, label="integrated spectrum", **kwargs) energy_axis.format_plot_xaxis(ax=ax) ax.set_yscale("log") ax.set_ylabel( f"Background rate [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) ax.legend(loc="best") return ax def peek(self, figsize=(10, 8)): """Quick-look summary plots.""" fig, axes = plt.subplots(nrows=2, ncols=2, figsize=figsize) self.plot(ax=axes[1][1]) self.plot_offset_dependence(ax=axes[0][0]) self.plot_energy_dependence(ax=axes[1][0]) self.plot_spectrum(ax=axes[0][1]) plt.tight_layout() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/core.py0000644000175100001770000010015014721316200016123 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import html import logging from copy import deepcopy from enum import Enum import numpy as np from astropy import units as u from astropy.io import fits from astropy.table import Table from astropy.utils import lazyproperty from gammapy.maps import Map, MapAxes, MapAxis, RegionGeom from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.integrate import trapz_loglog from gammapy.utils.interpolation import ( ScaledRegularGridInterpolator, interpolation_scale, ) from gammapy.utils.scripts import make_path from .io import IRF_DL3_HDU_SPECIFICATION, IRF_MAP_HDU_SPECIFICATION, gadf_is_pointlike log = logging.getLogger(__name__) class FoVAlignment(str, Enum): """ Orientation of the Field of View Coordinate System. Currently, only two possible alignments are supported: alignment with the horizontal coordinate system (ALTAZ) and alignment with the equatorial coordinate system (RADEC). """ ALTAZ = "ALTAZ" RADEC = "RADEC" # used for backward compatibility of old HESS data REVERSE_LON_RADEC = "REVERSE_LON_RADEC" class IRF(metaclass=abc.ABCMeta): """IRF base class for DL3 instrument response functions. Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Axes. data : `~numpy.ndarray` or `~astropy.units.Quantity`, optional Data. Default is 0. unit : str or `~astropy.units.Unit`, optional Unit, ignored if data is a Quantity. Default is "". is_pointlike : bool, optional Whether the IRF is point-like. True for point-like IRFs, False for full-enclosure. Default is False. fov_alignment : `FoVAlignment`, optional The orientation of the field of view coordinate system. Default is FoVAlignment.RADEC. meta : dict, optional Metadata dictionary. Default is None. """ default_interp_kwargs = dict( bounds_error=False, fill_value=0.0, ) def __init__( self, axes, data=0, unit="", is_pointlike=False, fov_alignment=FoVAlignment.RADEC, meta=None, interp_kwargs=None, ): axes = MapAxes(axes) axes.assert_names(self.required_axes) self._axes = axes self._fov_alignment = FoVAlignment(fov_alignment) self._is_pointlike = is_pointlike if isinstance(data, u.Quantity): self.data = data.value if not self.default_unit.is_equivalent(data.unit): raise ValueError( f"Error: {data.unit} is not an allowed unit. {self.tag} " f"requires {self.default_unit} data quantities." 
) else: self._unit = data.unit else: self.data = data self._unit = unit self.meta = meta or {} if interp_kwargs is None: interp_kwargs = self.default_interp_kwargs.copy() self.interp_kwargs = interp_kwargs @property @abc.abstractmethod def tag(self): pass @property @abc.abstractmethod def required_axes(self): pass @property def is_pointlike(self): """Whether the IRF is pointlike of full containment.""" return self._is_pointlike @property def has_offset_axis(self): """Whether the IRF explicitly depends on offset.""" return "offset" in self.required_axes @property def fov_alignment(self): """Alignment of the field of view coordinate axes, see `FoVAlignment`.""" return self._fov_alignment @property def data(self): return self._data @data.setter def data(self, value): """Set data. Parameters ---------- value : `~numpy.ndarray` Data array. """ required_shape = self.axes.shape if np.isscalar(value): value = value * np.ones(required_shape) if isinstance(value, u.Quantity): raise TypeError("Map data must be a Numpy array. Set unit separately") if np.shape(value) != required_shape: raise ValueError( f"data shape {value.shape} does not match" f"axes shape {required_shape}" ) self._data = value # reset cached interpolators self.__dict__.pop("_interpolate", None) self.__dict__.pop("_integrate_rad", None) def interp_missing_data(self, axis_name): """Interpolate missing data along a given axis.""" data = self.data.copy() values_scale = self.interp_kwargs.get("values_scale", "lin") scale = interpolation_scale(values_scale) axis = self.axes.index(axis_name) mask = ~np.isfinite(data) | (data == 0.0) coords = np.where(mask) xp = np.arange(data.shape[axis]) for coord in zip(*coords): idx = list(coord) idx[axis] = slice(None) fp = data[tuple(idx)] valid = ~mask[tuple(idx)] if np.any(valid): value = np.interp( x=coord[axis], xp=xp[valid], fp=scale(fp[valid]), left=np.nan, right=np.nan, ) if not np.isnan(value): data[coord] = scale.inverse(value) self.data = data # reset cached values @property def unit(self): """Map unit as a `~astropy.units.Unit` object.""" return self._unit @lazyproperty def _interpolate(self): kwargs = self.interp_kwargs.copy() # Allow extrapolation with in bins kwargs["fill_value"] = None points = [a.center for a in self.axes] points_scale = tuple([a.interp for a in self.axes]) return ScaledRegularGridInterpolator( points, self.quantity, points_scale=points_scale, **kwargs, ) @property def quantity(self): """Quantity as a `~astropy.units.Quantity` object.""" return u.Quantity(self.data, unit=self.unit, copy=COPY_IF_NEEDED) @quantity.setter def quantity(self, val): """Set data and unit. Parameters ---------- value : `~astropy.units.Quantity` Quantity. """ val = u.Quantity(val, copy=COPY_IF_NEEDED) self.data = val.value self._unit = val.unit def to_unit(self, unit): """Convert IRF to different unit. Parameters ---------- unit : `~astropy.unit.Unit` or str New unit. Returns ------- irf : `IRF` IRF with new unit and converted data. 
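Examples
--------
A minimal sketch, assuming ``bkg_2d`` is an existing
`~gammapy.irf.Background2D` with data in ``s-1 MeV-1 sr-1``::

    bkg_tev = bkg_2d.to_unit("TeV-1 s-1 sr-1")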
""" data = self.quantity.to_value(unit) return self.__class__( self.axes, data=data, meta=self.meta, interp_kwargs=self.interp_kwargs ) @property def axes(self): """`MapAxes`.""" return self._axes def __str__(self): str_ = f"{self.__class__.__name__}\n" str_ += "-" * len(self.__class__.__name__) + "\n\n" str_ += f"\taxes : {self.axes.names}\n" str_ += f"\tshape : {self.data.shape}\n" str_ += f"\tndim : {len(self.axes)}\n" str_ += f"\tunit : {self.unit}\n" str_ += f"\tdtype : {self.data.dtype}\n" return str_.expandtabs(tabsize=2) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def evaluate(self, method=None, **kwargs): """Evaluate IRF. Parameters ---------- **kwargs : dict Coordinates at which to evaluate the IRF. method : str {'linear', 'nearest'}, optional Interpolation method. Returns ------- array : `~astropy.units.Quantity` Interpolated values. """ # TODO: change to coord dict? non_valid_axis = set(kwargs).difference(self.axes.names) if non_valid_axis: raise ValueError( f"Not a valid coordinate axis {non_valid_axis}" f" Choose from: {self.axes.names}" ) coords_default = self.axes.get_coord() for key, value in kwargs.items(): coord = kwargs.get(key, value) if coord is not None: coords_default[key] = u.Quantity(coord, copy=COPY_IF_NEEDED) data = self._interpolate(coords_default.values(), method=method) if self.interp_kwargs["fill_value"] is not None: idxs = self.axes.coord_to_idx(coords_default, clip=False) invalid = np.broadcast_arrays(*[idx == -1 for idx in idxs]) mask = self._mask_out_bounds(invalid) if not data.shape: mask = mask.squeeze() data[mask] = self.interp_kwargs["fill_value"] data[~np.isfinite(data)] = self.interp_kwargs["fill_value"] return data @staticmethod def _mask_out_bounds(invalid): return np.any(invalid, axis=0) def integrate_log_log(self, axis_name, **kwargs): """Integrate along a given axis. This method uses log-log trapezoidal integration. Parameters ---------- axis_name : str Along which axis to integrate. **kwargs : dict Coordinates at which to evaluate the IRF. Returns ------- array : `~astropy.units.Quantity` Returns 2D array with axes offset. """ axis = self.axes.index(axis_name) data = self.evaluate(**kwargs, method="linear") values = kwargs[axis_name] return trapz_loglog(data, values, axis=axis) def cumsum(self, axis_name): """Compute cumsum along a given axis. Parameters ---------- axis_name : str Along which axis to integrate. Returns ------- irf : `~IRF` Cumsum IRF. """ axis = self.axes[axis_name] axis_idx = self.axes.index(axis_name) shape = [1] * len(self.axes) shape[axis_idx] = -1 values = self.quantity * axis.bin_width.reshape(shape) if axis_name in ["rad", "offset"]: # take Jacobian into account values = 2 * np.pi * axis.center.reshape(shape) * values data = values.cumsum(axis=axis_idx) axis_shifted = MapAxis.from_nodes( axis.edges[1:], name=axis.name, interp=axis.interp ) axes = self.axes.replace(axis_shifted) return self.__class__(axes=axes, data=data.value, unit=data.unit) def integral(self, axis_name, **kwargs): """Compute integral along a given axis. This method uses interpolation of the cumulative sum. Parameters ---------- axis_name : str Along which axis to integrate. **kwargs : dict Coordinates at which to evaluate the IRF. Returns ------- array : `~astropy.units.Quantity` Returns 2D array with axes offset. """ cumsum = self.cumsum(axis_name=axis_name) return cumsum.evaluate(**kwargs) def normalize(self, axis_name): """Normalise data in place along a given axis. Parameters ---------- axis_name : str Along which axis to normalize. """ cumsum = self.cumsum(axis_name=axis_name).quantity with np.errstate(invalid="ignore", divide="ignore"): axis = self.axes.index(axis_name=axis_name) normed = self.quantity / cumsum.max(axis=axis, keepdims=True) self.quantity = np.nan_to_num(normed) @classmethod def from_hdulist(cls, hdulist, hdu=None, format="gadf-dl3"): """Create from `~astropy.io.fits.HDUList`. Parameters ---------- hdulist : `~astropy.io.HDUList` HDU list. hdu : str HDU name. format : {"gadf-dl3"} Format specification. Default is "gadf-dl3". Returns ------- irf : `IRF` IRF class. 
""" if hdu is None: hdu = IRF_DL3_HDU_SPECIFICATION[cls.tag]["extname"] return cls.from_table(Table.read(hdulist[hdu]), format=format) @classmethod def read(cls, filename, hdu=None, format="gadf-dl3"): """Read from file. Parameters ---------- filename : str or `~pathlib.Path` Filename. hdu : str HDU name. format : {"gadf-dl3"}, optional Format specification. Default is "gadf-dl3". Returns ------- irf : `IRF` IRF class. """ with fits.open(str(make_path(filename)), memmap=False) as hdulist: return cls.from_hdulist(hdulist, hdu=hdu) @classmethod def from_table(cls, table, format="gadf-dl3"): """Read from `~astropy.table.Table`. Parameters ---------- table : `~astropy.table.Table` Table with IRF data. format : {"gadf-dl3"}, optional Format specification. Default is "gadf-dl3". Returns ------- irf : `IRF` IRF class. """ axes = MapAxes.from_table(table=table, format=format) axes = axes[cls.required_axes] column_name = IRF_DL3_HDU_SPECIFICATION[cls.tag]["column_name"] data = table[column_name].quantity[0].transpose() return cls( axes=axes, data=data.value, meta=table.meta, unit=data.unit, is_pointlike=gadf_is_pointlike(table.meta), fov_alignment=table.meta.get("FOVALIGN", "RADEC"), ) def to_table(self, format="gadf-dl3"): """Convert to table. Parameters ---------- format : {"gadf-dl3"}, optional Format specification. Default is "gadf-dl3". Returns ------- table : `~astropy.table.Table` IRF data table. """ table = self.axes.to_table(format=format) if format == "gadf-dl3": table.meta = self.meta.copy() spec = IRF_DL3_HDU_SPECIFICATION[self.tag] table.meta.update(spec["mandatory_keywords"]) if "FOVALIGN" in table.meta: table.meta["FOVALIGN"] = self.fov_alignment.value if self.is_pointlike: table.meta["HDUCLAS3"] = "POINT-LIKE" else: table.meta["HDUCLAS3"] = "FULL-ENCLOSURE" table[spec["column_name"]] = self.quantity.T[np.newaxis] else: raise ValueError(f"Not a valid supported format: '{format}'") return table def to_table_hdu(self, format="gadf-dl3"): """Convert to `~astropy.io.fits.BinTableHDU`. Parameters ---------- format : {"gadf-dl3"}, optional Format specification. Default is "gadf-dl3". Returns ------- hdu : `~astropy.io.fits.BinTableHDU` IRF data table HDU. """ name = IRF_DL3_HDU_SPECIFICATION[self.tag]["extname"] return fits.BinTableHDU(self.to_table(format=format), name=name) def to_hdulist(self, format="gadf-dl3"): """ Write the HDU list. Parameters ---------- format : {"gadf-dl3"}, optional Format specification. Default is "gadf-dl3". """ hdu = self.to_table_hdu(format=format) return fits.HDUList([fits.PrimaryHDU(), hdu]) def write(self, filename, *args, **kwargs): """Write IRF to fits. Calls `~astropy.io.fits.HDUList.writeto`, forwarding all arguments. """ self.to_hdulist().writeto(str(make_path(filename)), *args, **kwargs) def pad(self, pad_width, axis_name, **kwargs): """Pad IRF along a given axis. Parameters ---------- pad_width : {sequence, array_like, int} Number of pixels padded to the edges of each axis. axis_name : str Axis to downsample. By default, spatial axes are padded. **kwargs : dict Keyword argument forwarded to `~numpy.pad`. Returns ------- irf : `IRF` Padded IRF. 
""" if np.isscalar(pad_width): pad_width = (pad_width, pad_width) idx = self.axes.index(axis_name) pad_width_np = [(0, 0)] * self.data.ndim pad_width_np[idx] = pad_width kwargs.setdefault("mode", "constant") axes = self.axes.pad(axis_name=axis_name, pad_width=pad_width) data = np.pad(self.data, pad_width=pad_width_np, **kwargs) return self.__class__( data=data, axes=axes, meta=self.meta.copy(), unit=self.unit ) def slice_by_idx(self, slices): """Slice sub IRF from IRF object. Parameters ---------- slices : dict Dictionary of axes names and `slice` object pairs. Contains one element for each non-spatial dimension. Axes not specified in the dictionary are kept unchanged. Returns ------- sliced : `IRF` Sliced IRF object. """ axes = self.axes.slice_by_idx(slices) diff = set(self.axes.names).difference(axes.names) if diff: diff_slice = {key: value for key, value in slices.items() if key in diff} raise ValueError(f"Integer indexing not supported, got {diff_slice}") slices = tuple([slices.get(ax.name, slice(None)) for ax in self.axes]) data = self.data[slices] return self.__class__(axes=axes, data=data, unit=self.unit, meta=self.meta) def is_allclose(self, other, rtol_axes=1e-3, atol_axes=1e-6, **kwargs): """Compare two data IRFs for equivalency. Parameters ---------- other : `~gammapy.irfs.IRF` The IRF to compare against. rtol_axes : float, optional Relative tolerance for the axis comparison. Default is 1e-3. atol_axes : float, optional Absolute tolerance for the axis comparison. Default is 1e-6. **kwargs : dict Keywords passed to `numpy.allclose`. Returns ------- is_allclose : bool Whether the IRF is all close. """ if not isinstance(other, self.__class__): return TypeError(f"Cannot compare {type(self)} and {type(other)}") if self.data.shape != other.data.shape: return False axes_eq = self.axes.is_allclose(other.axes, rtol=rtol_axes, atol=atol_axes) data_eq = np.allclose(self.quantity, other.quantity, **kwargs) return axes_eq and data_eq def __eq__(self, other): if not isinstance(other, self.__class__): return False return self.is_allclose(other=other, rtol=1e-3, rtol_axes=1e-6) class IRFMap: """IRF map base class for DL4 instrument response functions.""" def __init__(self, irf_map, exposure_map): self._irf_map = irf_map self.exposure_map = exposure_map # TODO: only allow for limited set of additional axes? irf_map.geom.axes.assert_names(self.required_axes, allow_extra=True) @property @abc.abstractmethod def tag(self): pass @property @abc.abstractmethod def required_axes(self): pass @lazyproperty def has_single_spatial_bin(self): return self._irf_map.geom.to_image().data_shape == (1, 1) # TODO: add mask safe to IRFMap as a regular attribute and don't derive it from the data @property def mask_safe_image(self): """Mask safe for the map.""" mask = self._irf_map > (0 * self._irf_map.unit) return mask.reduce_over_axes(func=np.logical_or) def to_region_nd_map(self, region): """Extract IRFMap in a given region or position. If a region is given a mean IRF is computed, if a position is given the IRF is interpolated. Parameters ---------- region : `~regions.SkyRegion` or `~astropy.coordinates.SkyCoord` Region or position where to get the map. Returns ------- irf : `IRFMap` IRF map with region geometry. 
""" if region is None: region = self._irf_map.geom.center_skydir # TODO: compute an exposure weighted mean PSF here kwargs = {"region": region, "func": np.nanmean} if "energy" in self._irf_map.geom.axes.names: kwargs["method"] = "nearest" irf_map = self._irf_map.to_region_nd_map(**kwargs) if self.exposure_map: exposure_map = self.exposure_map.to_region_nd_map(**kwargs) else: exposure_map = None return self.__class__(irf_map, exposure_map=exposure_map) def _get_nearest_valid_position(self, position): """Get nearest valid position.""" is_valid = np.nan_to_num(self.mask_safe_image.get_by_coord(position))[0] if not is_valid and np.any(self.mask_safe_image > 0): log.warning( f"Position {position} is outside " "valid IRF map range, using nearest IRF defined within" ) position = self.mask_safe_image.mask_nearest_position(position) return position @classmethod def from_hdulist( cls, hdulist, hdu=None, hdu_bands=None, exposure_hdu=None, exposure_hdu_bands=None, format="gadf", ): """Create from `~astropy.io.fits.HDUList`. Parameters ---------- hdulist : `~astropy.fits.HDUList` HDU list. hdu : str, optional Name or index of the HDU with the IRF map. Default is None. hdu_bands : str, optional Name or index of the HDU with the IRF map BANDS table. Default is None. exposure_hdu : str, optional Name or index of the HDU with the exposure map data. Default is None. exposure_hdu_bands : str, optional Name or index of the HDU with the exposure map BANDS table. Default is None. format : {"gadf", "gtpsf"}, optional File format. Default is "gadf". Returns ------- irf_map : `IRFMap` IRF map. """ output_class = cls if format == "gadf": if hdu is None: hdu = IRF_MAP_HDU_SPECIFICATION[cls.tag] irf_map = Map.from_hdulist( hdulist, hdu=hdu, hdu_bands=hdu_bands, format=format ) if exposure_hdu is None: exposure_hdu = IRF_MAP_HDU_SPECIFICATION[cls.tag] + "_exposure" if exposure_hdu in hdulist: exposure_map = Map.from_hdulist( hdulist, hdu=exposure_hdu, hdu_bands=exposure_hdu_bands, format=format, ) else: exposure_map = None if cls.tag == "psf_map" and "energy" in irf_map.geom.axes.names: from .psf import RecoPSFMap output_class = RecoPSFMap if cls.tag == "edisp_map" and irf_map.geom.axes[0].name == "energy": from .edisp import EDispKernelMap output_class = EDispKernelMap elif format == "gtpsf": rad_axis = MapAxis.from_table_hdu(hdulist["THETA"], format=format) table = Table.read(hdulist["PSF"]) energy_axis_true = MapAxis.from_table(table, format=format) geom_psf = RegionGeom.create(region=None, axes=[rad_axis, energy_axis_true]) psf_map = Map.from_geom(geom=geom_psf, data=table["Psf"].data, unit="sr-1") geom_exposure = geom_psf.squash("rad") exposure_map = Map.from_geom( geom=geom_exposure, data=table["Exposure"].data.reshape(geom_exposure.data_shape), unit="cm2 s", ) return cls(psf_map=psf_map, exposure_map=exposure_map) else: raise ValueError(f"Format {format} not supported") return output_class(irf_map, exposure_map) @classmethod def read(cls, filename, format="gadf", hdu=None, checksum=False): """Read an IRF_map from file and create corresponding object. Parameters ---------- filename : str or `~pathlib.Path` File name. format : {"gadf", "gtpsf"}, optional File format. Default is "gadf". hdu : str or int HDU location. Default is None. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- irf_map : `PSFMap`, `EDispMap` or `EDispKernelMap` IRF map. 
""" filename = make_path(filename) # TODO: this will test all hdus and the one specifically of interest with fits.open(filename, memmap=False, checksum=checksum) as hdulist: return cls.from_hdulist(hdulist, format=format, hdu=hdu) def to_hdulist(self, format="gadf"): """Convert to `~astropy.io.fits.HDUList`. Parameters ---------- format : {"gadf", "gtpsf"}, optional File format. Default is "gadf". Returns ------- hdu_list : `~astropy.io.fits.HDUList` HDU list. """ if format == "gadf": hdu = IRF_MAP_HDU_SPECIFICATION[self.tag] hdulist = self._irf_map.to_hdulist(hdu=hdu, format=format) exposure_hdu = hdu + "_exposure" if self.exposure_map is not None: new_hdulist = self.exposure_map.to_hdulist( hdu=exposure_hdu, format=format ) hdulist.extend(new_hdulist[1:]) elif format == "gtpsf": if not self._irf_map.geom.is_region: raise ValueError( "Format 'gtpsf' is only supported for region geometries" ) rad_hdu = self._irf_map.geom.axes["rad"].to_table_hdu(format=format) psf_table = self._irf_map.geom.axes["energy_true"].to_table(format=format) psf_table["Exposure"] = self.exposure_map.quantity[..., 0, 0].to("cm^2 s") psf_table["Psf"] = self._irf_map.quantity[..., 0, 0].to("sr^-1") psf_hdu = fits.BinTableHDU(data=psf_table, name="PSF") hdulist = fits.HDUList([fits.PrimaryHDU(), rad_hdu, psf_hdu]) else: raise ValueError(f"Format {format} not supported") return hdulist def write(self, filename, overwrite=False, format="gadf", checksum=False): """Write IRF map to fits. Parameters ---------- filename : str or `~pathlib.Path` Filename to write to. overwrite : bool, optional Overwrite existing file. Default is False. format : {"gadf", "gtpsf"}, optional File format. Default is "gadf". checksum : bool, optional When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. """ hdulist = self.to_hdulist(format=format) hdulist.writeto(str(filename), overwrite=overwrite, checksum=checksum) def stack(self, other, weights=None, nan_to_num=True): """Stack IRF map with another one in place. Parameters ---------- other : `~gammapy.irf.IRFMap` IRF map to be stacked with this one. weights : `~gammapy.maps.Map`, optional Map with stacking weights. Default is None. nan_to_num: bool, optional Non-finite values are replaced by zero if True. Default is True. """ if self.exposure_map is None or other.exposure_map is None: raise ValueError( f"Missing exposure map for {self.__class__.__name__}.stack" ) cutout_info = getattr(other._irf_map.geom, "cutout_info", None) if cutout_info is not None: slices = cutout_info["parent-slices"] parent_slices = Ellipsis, slices[0], slices[1] else: parent_slices = slice(None) self._irf_map.data[parent_slices] *= self.exposure_map.data[parent_slices] self._irf_map.stack( other._irf_map * other.exposure_map.data, weights=weights, nan_to_num=nan_to_num, ) # stack exposure map if weights and "energy" in weights.geom.axes.names: weights = weights.reduce( axis_name="energy", func=np.logical_or, keepdims=True ) self.exposure_map.stack( other.exposure_map, weights=weights, nan_to_num=nan_to_num ) with np.errstate(invalid="ignore"): self._irf_map.data[parent_slices] /= self.exposure_map.data[parent_slices] self._irf_map.data = np.nan_to_num(self._irf_map.data) def copy(self): """Copy IRF map.""" return deepcopy(self) def cutout(self, position, width, mode="trim", min_npix=3): """Cutout IRF map. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. 
        width : tuple of `~astropy.coordinates.Angle`
            Angular sizes of the region in (lon, lat) in that specific order.
            If only one value is passed, a square region is extracted.
        mode : {'trim', 'partial', 'strict'}, optional
            Mode option for Cutout2D, for details see `~astropy.nddata.utils.Cutout2D`.
            Default is "trim".
        min_npix : int, optional
            Force width to a minimum number of pixels. Default is 3. The
            default of 3 pixels ensures the interpolation is done correctly
            if the binning of the IRF is larger than the width of the
            analysis region.

        Returns
        -------
        cutout : `IRFMap`
            Cutout IRF map.
        """
        irf_map = self._irf_map.cutout(position, width, mode, min_npix=min_npix)

        if self.exposure_map:
            exposure_map = self.exposure_map.cutout(
                position, width, mode, min_npix=min_npix
            )
        else:
            exposure_map = None

        return self.__class__(irf_map, exposure_map=exposure_map)

    def downsample(self, factor, axis_name=None, weights=None):
        """Downsample the spatial dimension by a given factor.

        Parameters
        ----------
        factor : int
            Downsampling factor.
        axis_name : str, optional
            Axis to downsample. By default, spatial axes are downsampled.
        weights : `~gammapy.maps.Map`, optional
            Map with weights for the downsampling. Default is None.

        Returns
        -------
        map : `IRFMap`
            Downsampled IRF map.
        """
        irf_map = self._irf_map.downsample(
            factor=factor, axis_name=axis_name, preserve_counts=True, weights=weights
        )
        if axis_name is None:
            exposure_map = self.exposure_map.downsample(
                factor=factor, preserve_counts=False
            )
        else:
            exposure_map = self.exposure_map.copy()

        return self.__class__(irf_map, exposure_map=exposure_map)

    def slice_by_idx(self, slices):
        """Slice sub dataset.

        The slicing only applies to the maps that define the corresponding axes.

        Parameters
        ----------
        slices : dict
            Dictionary of axes names and integers or `slice` object pairs.
            Contains one element for each non-spatial dimension. For integer
            indexing the corresponding axis is dropped from the map. Axes not
            specified in the dictionary are kept unchanged.

        Returns
        -------
        map_out : `IRFMap`
            Sliced IRF map object.
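Examples
--------
A minimal sketch, keeping only the first three true-energy bins of a diagonal
energy dispersion kernel map (axis names and binnings are illustrative):

>>> from gammapy.irf import EDispKernelMap
>>> from gammapy.maps import MapAxis
>>> energy_axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=5)
>>> energy_axis_true = energy_axis.copy(name="energy_true")
>>> edisp = EDispKernelMap.from_diagonal_response(energy_axis, energy_axis_true)
>>> sliced = edisp.slice_by_idx({"energy_true": slice(0, 3)})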
""" irf_map = self._irf_map.slice_by_idx(slices=slices) if "energy_true" in slices and self.exposure_map: exposure_map = self.exposure_map.slice_by_idx(slices=slices) else: exposure_map = self.exposure_map return self.__class__(irf_map, exposure_map=exposure_map) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.208642 gammapy-1.3/gammapy/irf/edisp/0000755000175100001770000000000014721316215015736 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/edisp/__init__.py0000644000175100001770000000042114721316200020036 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .core import EnergyDispersion2D from .kernel import EDispKernel from .map import EDispKernelMap, EDispMap __all__ = [ "EDispKernel", "EDispKernelMap", "EDispMap", "EnergyDispersion2D", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/edisp/core.py0000644000175100001770000002500714721316200017236 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np import scipy.special from astropy import units as u from astropy.coordinates import Angle, SkyCoord from astropy.units import Quantity from astropy.visualization import quantity_support import matplotlib.pyplot as plt from matplotlib.colors import PowerNorm from gammapy.maps import MapAxes, MapAxis, RegionGeom from gammapy.utils.deprecation import deprecated_renamed_argument from gammapy.visualization.utils import add_colorbar from ..core import IRF __all__ = ["EnergyDispersion2D"] log = logging.getLogger(__name__) class EnergyDispersion2D(IRF): """Offset-dependent energy dispersion matrix. Data format specification: :ref:`gadf:edisp_2d` Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Required axes (in the given order) are: * energy_true (true energy axis) * migra (energy migration axis) * offset (field of view offset axis) data : `~numpy.ndarray` Energy dispersion probability density. Examples -------- Read energy dispersion IRF from disk: >>> from gammapy.maps import MapAxis, MapAxes >>> from gammapy.irf import EnergyDispersion2D >>> filename = '$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz' >>> edisp2d = EnergyDispersion2D.read(filename, hdu="EDISP") Create energy dispersion matrix (`~gammapy.irf.EnergyDispersion`) for a given field of view offset and energy binning: >>> energy_axis = MapAxis.from_bounds(0.1, 20, nbin=60, unit="TeV", interp="log", name='energy') >>> edisp = edisp2d.to_edisp_kernel(offset='1.2 deg', energy_axis=energy_axis, ... energy_axis_true=energy_axis.copy(name='energy_true')) Create energy dispersion IRF from axes: >>> energy_axis_true = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10, name="energy_true") >>> offset_axis = MapAxis.from_bounds(0, 1, nbin=3, unit="deg", name="offset", node_type="edges") >>> migra_axis = MapAxis.from_bounds(0, 3, nbin=3, name="migra", node_type="edges") >>> axes = MapAxes([energy_axis_true, migra_axis, offset_axis]) >>> edisp2d_axes = EnergyDispersion2D(axes=axes) See Also -------- EnergyDispersion. 
""" tag = "edisp_2d" required_axes = ["energy_true", "migra", "offset"] default_unit = u.one @property def _default_offset(self): if self.axes["offset"].nbin == 1: default_offset = self.axes["offset"].center else: default_offset = [1.0] * u.deg return default_offset def _mask_out_bounds(self, invalid): return ( invalid[self.axes.index("energy_true")] & invalid[self.axes.index("migra")] ) | invalid[self.axes.index("offset")] @classmethod def from_gauss( cls, energy_axis_true, migra_axis, offset_axis, bias, sigma, pdf_threshold=1e-6 ): """Create Gaussian energy dispersion matrix (`EnergyDispersion2D`). The output matrix will be Gaussian in (energy_true / energy). The ``bias`` and ``sigma`` should be either floats or arrays of same dimension than ``energy_true``. ``bias`` refers to the mean value of the ``migra`` distribution minus one, i.e. ``bias=0`` means no bias. Note that, the output matrix is flat in offset. Parameters ---------- energy_axis_true : `MapAxis` True energy axis. migra_axis : `~astropy.units.Quantity` Migra axis. offset_axis : `~astropy.units.Quantity` Bin edges of offset. bias : float or `~numpy.ndarray` Center of Gaussian energy dispersion, bias. sigma : float or `~numpy.ndarray` RMS width of Gaussian energy dispersion, resolution. pdf_threshold : float, optional Zero suppression threshold. Default is 1e-6. """ axes = MapAxes([energy_axis_true, migra_axis, offset_axis]) coords = axes.get_coord(mode="edges", axis_name="migra") migra_min = coords["migra"][:, :-1, :] migra_max = coords["migra"][:, 1:, :] # Analytical formula for integral of Gaussian s = np.sqrt(2) * sigma t1 = (migra_max - 1 - bias) / s t2 = (migra_min - 1 - bias) / s pdf = (scipy.special.erf(t1) - scipy.special.erf(t2)) / 2 pdf = pdf / (migra_max - migra_min) # no offset dependence data = pdf.T * np.ones(axes.shape) data[data < pdf_threshold] = 0 return cls( axes=axes, data=data.value, ) @deprecated_renamed_argument( ["energy_true", "energy"], ["energy_axis_true", "energy_axis"], ["v1.3", "v1.3"], arg_in_kwargs=True, ) def to_edisp_kernel(self, offset, energy_axis_true=None, energy_axis=None): """Detector response R(Delta E_reco, Delta E_true). Probability to reconstruct an energy in a given true energy band in a given reconstructed energy band. Parameters ---------- offset : `~astropy.coordinates.Angle` Offset. energy_axis_true : `~gammapy.maps.MapAxis`, optional True energy axis. Default is None. energy_axis : `~gammapy.maps.MapAxis`, optional Reconstructed energy axis. Default is None. Returns ------- edisp : `~gammapy.irf.EDispKernel` Energy dispersion matrix. 
""" from gammapy.makers.utils import make_edisp_kernel_map offset = Angle(offset) if isinstance(energy_axis, Quantity): energy_axis = MapAxis.from_energy_edges(energy_axis) if energy_axis is None: energy_axis = self.axes["energy_true"].copy(name="energy") if isinstance(energy_axis_true, Quantity): energy_axis_true = MapAxis.from_energy_edges( energy_axis_true, name="energy_true", ) if energy_axis_true is None: energy_axis_true = self.axes["energy_true"] pointing = SkyCoord("0d", "0d") center = pointing.directional_offset_by( position_angle=0 * u.deg, separation=offset ) geom = RegionGeom.create(region=center, axes=[energy_axis, energy_axis_true]) edisp = make_edisp_kernel_map(geom=geom, edisp=self, pointing=pointing) return edisp.get_edisp_kernel() def normalize(self): """Normalise energy dispersion.""" super().normalize(axis_name="migra") def plot_migration(self, ax=None, offset=None, energy_true=None, **kwargs): """Plot energy dispersion for given offset and true energy. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. offset : `~astropy.coordinates.Angle`, optional Offset. Default is None. energy_true : `~astropy.units.Quantity`, optional True energy. Default is None. **kwargs : dict Keyword arguments forwarded to `~matplotlib.pyplot.plot`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax if offset is None: offset = self._default_offset else: offset = np.atleast_1d(Angle(offset)) if energy_true is None: energy_true = u.Quantity([0.1, 1, 10], "TeV") else: energy_true = np.atleast_1d(u.Quantity(energy_true)) migra = self.axes["migra"] with quantity_support(): for ener in energy_true: for off in offset: disp = self.evaluate( offset=off, energy_true=ener, migra=migra.center ) label = f"offset = {off:.1f}\nenergy = {ener:.1f}" ax.plot(migra.center, disp, label=label, **kwargs) migra.format_plot_xaxis(ax=ax) ax.set_ylabel("Probability density") ax.legend(loc="upper left") return ax def plot_bias( self, ax=None, offset=None, add_cbar=False, axes_loc=None, kwargs_colorbar=None, **kwargs, ): """Plot migration as a function of true energy for a given offset. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. offset : `~astropy.coordinates.Angle`, optional Offset. Default is None. add_cbar : bool, optional Add a colorbar to the plot. Default is False. axes_loc : dict, optional Keyword arguments passed to `~mpl_toolkits.axes_grid1.axes_divider.AxesDivider.append_axes`. kwargs_colorbar : dict, optional Keyword arguments passed to `~matplotlib.pyplot.colorbar`. kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.pcolormesh`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ kwargs.setdefault("cmap", "GnBu") kwargs.setdefault("norm", PowerNorm(gamma=0.5)) kwargs_colorbar = kwargs_colorbar or {} ax = plt.gca() if ax is None else ax if offset is None: offset = self._default_offset energy_true = self.axes["energy_true"] migra = self.axes["migra"] z = self.evaluate( offset=offset, energy_true=energy_true.center.reshape(1, -1, 1), migra=migra.center.reshape(1, 1, -1), ).value[0] with quantity_support(): caxes = ax.pcolormesh(energy_true.edges, migra.edges, z.T, **kwargs) energy_true.format_plot_xaxis(ax=ax) migra.format_plot_yaxis(ax=ax) if add_cbar: label = "Probability density [A.U]." 
kwargs_colorbar.setdefault("label", label) add_colorbar(caxes, ax=ax, axes_loc=axes_loc, **kwargs_colorbar) return ax def peek(self, figsize=(15, 5)): """Quick-look summary plots. Parameters ---------- figsize : tuple, optional Size of the resulting plot. Default is (15, 5). """ fig, axes = plt.subplots(nrows=1, ncols=3, figsize=figsize) self.plot_bias(ax=axes[0]) self.plot_migration(ax=axes[1]) edisp = self.to_edisp_kernel(offset=self._default_offset[0]) edisp.plot_matrix(ax=axes[2]) plt.tight_layout() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/edisp/kernel.py0000644000175100001770000005140614721316200017570 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy.io import fits from astropy.table import Table from astropy.units import Quantity from astropy.visualization import quantity_support import matplotlib.pyplot as plt from matplotlib.colors import PowerNorm from gammapy.maps import MapAxis from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.utils.scripts import make_path from gammapy.visualization.utils import add_colorbar from ..core import IRF __all__ = ["EDispKernel"] class EDispKernel(IRF): """Energy dispersion matrix. Data format specification: :ref:`gadf:ogip-rmf`. Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Required axes (in the given order) are: * energy_true (true energy axis) * energy (reconstructed energy axis) data : array_like 2D energy dispersion matrix. Examples -------- Create a Gaussian energy dispersion matrix:: >>> from gammapy.maps import MapAxis >>> from gammapy.irf import EDispKernel >>> energy = MapAxis.from_energy_bounds(0.1, 10, 10, unit='TeV') >>> energy_true = MapAxis.from_energy_bounds(0.1, 10, 10, unit='TeV', name='energy_true') >>> edisp = EDispKernel.from_gauss(energy_axis_true=energy_true, energy_axis=energy, sigma=0.1, bias=0) Have a quick look: >>> print(edisp) EDispKernel ----------- axes : ['energy_true', 'energy'] shape : (10, 10) ndim : 2 unit : dtype : float64 >>> edisp.peek() """ tag = "edisp_kernel" required_axes = ["energy_true", "energy"] default_interp_kwargs = dict(bounds_error=False, fill_value=0, method="nearest") """Default Interpolation kwargs for `~IRF`. Fill zeros and do not interpolate""" @property def pdf_matrix(self): """Energy dispersion PDF matrix as a `~numpy.ndarray`. Rows (first index): True Energy Columns (second index): Reco Energy """ return self.data def pdf_in_safe_range(self, lo_threshold, hi_threshold): """PDF matrix with bins outside threshold set to 0. Parameters ---------- lo_threshold : `~astropy.units.Quantity` Low reconstructed energy threshold. hi_threshold : `~astropy.units.Quantity` High reconstructed energy threshold. """ data = self.pdf_matrix.copy() energy = self.axes["energy"].edges if lo_threshold is None and hi_threshold is None: idx = slice(None) else: idx = (energy[:-1] < lo_threshold) | (energy[1:] > hi_threshold) data[:, idx] = 0 return data def to_image(self, lo_threshold=None, hi_threshold=None): """Return a 2D edisp by summing the pdf matrix over the ereco axis. Parameters ---------- lo_threshold : `~astropy.units.Quantity`, optional Low reconstructed energy threshold. Default is None. hi_threshold : `~astropy.units.Quantity`, optional High reconstructed energy threshold. Default is None. 
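Examples
--------
A minimal sketch, collapsing an illustrative Gaussian kernel onto a single
reconstructed-energy bin; the thresholds and binning are illustrative:

>>> import astropy.units as u
>>> from gammapy.irf import EDispKernel
>>> from gammapy.maps import MapAxis
>>> energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10)
>>> energy_axis_true = energy_axis.copy(name="energy_true")
>>> edisp = EDispKernel.from_gauss(
...     energy_axis_true=energy_axis_true, energy_axis=energy_axis, sigma=0.1, bias=0
... )
>>> edisp.to_image(lo_threshold=2 * u.TeV, hi_threshold=8 * u.TeV).data.shape
(10, 1)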
""" energy_axis = self.axes["energy"] lo_threshold = lo_threshold or energy_axis.edges[0] hi_threshold = hi_threshold or energy_axis.edges[-1] data = self.pdf_in_safe_range(lo_threshold, hi_threshold) return self.__class__( axes=self.axes.squash("energy"), data=np.sum(data, axis=1, keepdims=True), ) @classmethod def from_gauss(cls, energy_axis_true, energy_axis, sigma, bias, pdf_threshold=1e-6): """Create Gaussian energy dispersion matrix (`EnergyDispersion`). Calls :func:`gammapy.irf.EnergyDispersion2D.from_gauss`. Parameters ---------- energy_axis_true : `~astropy.units.Quantity` Bin edges of true energy axis. energy_axis : `~astropy.units.Quantity` Bin edges of reconstructed energy axis. bias : float or `~numpy.ndarray` Center of Gaussian energy dispersion, bias. sigma : float or `~numpy.ndarray` RMS width of Gaussian energy dispersion, resolution. pdf_threshold : float, optional Zero suppression threshold. Default is 1e-6. Returns ------- edisp : `EDispKernel` Energy dispersion kernel. """ from .core import EnergyDispersion2D migra_axis = MapAxis.from_bounds(1.0 / 3, 3, nbin=200, name="migra") # A dummy offset axis (need length 2 for interpolation to work) offset_axis = MapAxis.from_edges([0, 1, 2], unit="deg", name="offset") edisp = EnergyDispersion2D.from_gauss( energy_axis_true=energy_axis_true, migra_axis=migra_axis, offset_axis=offset_axis, sigma=sigma, bias=bias, pdf_threshold=pdf_threshold, ) return edisp.to_edisp_kernel( offset=offset_axis.center[0], energy_axis=energy_axis ) @classmethod def from_diagonal_response(cls, energy_axis_true, energy_axis=None): """Create energy dispersion from a diagonal response, i.e. perfect energy resolution. This creates the matrix corresponding to a perfect energy response. It contains ones where the energy_true center is inside the e_reco bin. It is a square diagonal matrix if energy_true = e_reco. This is useful in cases where code always applies an edisp, but you don't want it to do anything. Parameters ---------- energy_axis_true : `~gammapy.maps.MapAxis` True energy axis. energy_axis : `~gammapy.maps.MapAxis`, optional Reconstructed energy axis. Default is None. Examples -------- If ``energy_true`` equals ``energy``, you get a diagonal matrix: >>> from gammapy.irf import EDispKernel >>> from gammapy.maps import MapAxis >>> import astropy.units as u >>> energy_true_axis = MapAxis.from_energy_edges( ... [0.5, 1, 2, 4, 6] * u.TeV, name="energy_true" ... ) >>> edisp = EDispKernel.from_diagonal_response(energy_true_axis) >>> edisp.plot_matrix() # doctest: +SKIP Example with different energy binnings: >>> energy_true_axis = MapAxis.from_energy_edges( ... [0.5, 1, 2, 4, 6] * u.TeV, name="energy_true" ... ) >>> energy_axis = MapAxis.from_energy_edges([2, 4, 6] * u.TeV) >>> edisp = EDispKernel.from_diagonal_response(energy_true_axis, energy_axis) >>> edisp.plot_matrix() # doctest: +SKIP """ from .map import get_overlap_fraction energy_axis_true.assert_name("energy_true") if energy_axis is None: energy_axis = energy_axis_true.copy(name="energy") data = get_overlap_fraction(energy_axis, energy_axis_true) return cls(axes=[energy_axis_true, energy_axis], data=data.value) @classmethod def from_hdulist(cls, hdulist, hdu1="MATRIX", hdu2="EBOUNDS"): """Create `EnergyDispersion` object from `~astropy.io.fits.HDUList`. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` HDU list with ``MATRIX`` and ``EBOUNDS`` extensions. hdu1 : str, optional HDU containing the energy dispersion matrix. Default is "MATRIX". 
hdu2 : str, optional HDU containing the energy axis information. Default is "EBOUNDS". """ matrix_hdu = hdulist[hdu1] ebounds_hdu = hdulist[hdu2] data = matrix_hdu.data header = matrix_hdu.header pdf_matrix = np.zeros([len(data), header["DETCHANS"]], dtype=np.float64) for i, l in enumerate(data): if l.field("N_GRP"): m_start = 0 for k in range(l.field("N_GRP")): chan_min = l.field("F_CHAN")[k] chan_max = l.field("F_CHAN")[k] + l.field("N_CHAN")[k] pdf_matrix[i, chan_min:chan_max] = l.field("MATRIX")[ m_start : m_start + l.field("N_CHAN")[ k ] # noqa: E203 ] m_start += l.field("N_CHAN")[k] table = Table.read(ebounds_hdu) energy_axis = MapAxis.from_table(table, format="ogip") table = Table.read(matrix_hdu) energy_axis_true = MapAxis.from_table(table, format="ogip-arf") return cls(axes=[energy_axis_true, energy_axis], data=pdf_matrix) @classmethod def read(cls, filename, hdu1="MATRIX", hdu2="EBOUNDS", checksum=False): """Read from file. Parameters ---------- filename : `pathlib.Path` or str File to read. hdu1 : str, optional HDU containing the energy dispersion matrix. Default is "MATRIX". hdu2 : str, optional HDU containing the energy axis information. Default is "EBOUNDS". checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. """ with fits.open( str(make_path(filename)), memmap=False, checksum=checksum ) as hdulist: return cls.from_hdulist(hdulist, hdu1=hdu1, hdu2=hdu2) def to_hdulist(self, format="ogip", **kwargs): """Convert RMF to FITS HDU list format. Parameters ---------- format : {"ogip", "ogip-sherpa"} Format to use. Default is "ogip". Returns ------- hdulist : `~astropy.io.fits.HDUList` RMF in HDU list format. Notes ----- For more information on the RMF FITS file format see: https://heasarc.gsfc.nasa.gov/docs/heasarc/caldb/docs/summary/cal_gen_92_002_summary.html """ # Cannot use table_to_fits here due to variable length array # http://docs.astropy.org/en/v1.0.4/io/fits/usage/unfamiliar.html format_arf = format.replace("ogip", "ogip-arf") table = self.to_table(format=format_arf) name = table.meta.pop("name") header = fits.Header() header.update(table.meta) cols = table.columns c0 = fits.Column( name=cols[0].name, format="E", array=cols[0], unit=str(cols[0].unit) ) c1 = fits.Column( name=cols[1].name, format="E", array=cols[1], unit=str(cols[1].unit) ) c2 = fits.Column(name=cols[2].name, format="I", array=cols[2]) c3 = fits.Column(name=cols[3].name, format="PI()", array=cols[3]) c4 = fits.Column(name=cols[4].name, format="PI()", array=cols[4]) c5 = fits.Column(name=cols[5].name, format="PE()", array=cols[5]) hdu = fits.BinTableHDU.from_columns( [c0, c1, c2, c3, c4, c5], header=header, name=name ) ebounds_hdu = self.axes["energy"].to_table_hdu(format=format) prim_hdu = fits.PrimaryHDU() return fits.HDUList([prim_hdu, hdu, ebounds_hdu]) def to_table(self, format="ogip"): """Convert to `~astropy.table.Table`. The output table is in the OGIP RMF format. https://heasarc.gsfc.nasa.gov/docs/heasarc/caldb/docs/memos/cal_gen_92_002/cal_gen_92_002.html#Tab:1 # noqa: E501 Parameters ---------- format : {"ogip", "ogip-sherpa"} Format to use. Default is "ogip". Returns ------- table : `~astropy.table.Table` Matrix table. 
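Examples
--------
A minimal sketch (skipped in doctests, since the exact table layout is defined
by the OGIP writer); ``edisp`` is any `EDispKernel` instance, and the
``MATRIX`` column stores only the non-zero runs of each matrix row:

>>> table = edisp.to_table()  # doctest: +SKIP
>>> table["MATRIX"][0]  # doctest: +SKIP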
""" table = self.axes["energy_true"].to_table(format=format) rows = self.pdf_matrix.shape[0] n_grp = [] f_chan = np.ndarray(dtype=object, shape=rows) n_chan = np.ndarray(dtype=object, shape=rows) matrix = np.ndarray(dtype=object, shape=rows) # Make RMF type matrix for idx, row in enumerate(self.data): pos = np.nonzero(row)[0] borders = np.where(np.diff(pos) != 1)[0] # add 1 to borders for correct behaviour of np.split groups = np.split(pos, borders + 1) n_grp_temp = len(groups) if len(groups) > 0 else 1 n_chan_temp = np.asarray([val.size for val in groups]) try: f_chan_temp = np.asarray([val[0] for val in groups]) except IndexError: f_chan_temp = np.zeros(1) n_grp.append(n_grp_temp) f_chan[idx] = f_chan_temp n_chan[idx] = n_chan_temp matrix[idx] = row[pos] n_grp = np.asarray(n_grp, dtype=np.int16) # Get total number of groups and channel subsets numgrp, numelt = 0, 0 for val, val2 in zip(n_grp, n_chan): numgrp += np.sum(val) numelt += np.sum(val2) table["N_GRP"] = n_grp table["F_CHAN"] = f_chan table["N_CHAN"] = n_chan table["MATRIX"] = matrix table.meta = { "name": "MATRIX", "chantype": "PHA", "hduclass": "OGIP", "hduclas1": "RESPONSE", "hduclas2": "RSP_MATRIX", "detchans": self.axes["energy"].nbin, "numgrp": numgrp, "numelt": numelt, "tlmin4": 0, } return table def write(self, filename, format="ogip", checksum=False, **kwargs): """Write to file. Parameters ---------- filename : str Filename. format : {"ogip", "ogip-sherpa"} Format to use. Default is "ogip". checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. """ filename = str(make_path(filename)) hdulist = self.to_hdulist(format=format) hdulist.writeto(filename, checksum=checksum, **kwargs) def get_resolution(self, energy_true): """Get energy resolution for a given true energy. The resolution is given as a percentage of the true energy. Parameters ---------- energy_true : `~astropy.units.Quantity` True energy. """ energy_axis_true = self.axes["energy_true"] var = self._get_variance(energy_true) idx_true = energy_axis_true.coord_to_idx(energy_true) energy_true_real = energy_axis_true.center[idx_true] return np.sqrt(var) / energy_true_real def get_bias(self, energy_true): r"""Get reconstruction bias for a given true energy. Bias is defined as .. math:: \frac{E_{reco}-E_{true}}{E_{true}} Parameters ---------- energy_true : `~astropy.units.Quantity` True energy. """ energy_axis_true = self.axes["energy_true"] energy = self.get_mean(energy_true) idx_true = energy_axis_true.coord_to_idx(energy_true) energy_true_real = energy_axis_true.center[idx_true] bias = (energy - energy_true_real) / energy_true_real return bias def get_bias_energy(self, bias, energy_min=None, energy_max=None): """Find energy corresponding to a given bias. In case the solution is not unique, provide the ``energy_min`` or ``energy_max`` arguments to limit the solution to the given range. By default, the peak energy of the bias is chosen as ``energy_min``. Parameters ---------- bias : float Bias value. energy_min : `~astropy.units.Quantity` Lower bracket value in case solution is not unique. energy_max : `~astropy.units.Quantity` Upper bracket value in case solution is not unique. Returns ------- bias_energy : `~astropy.units.Quantity` Reconstructed energy corresponding to the given bias. 
""" from gammapy.modeling.models import TemplateSpectralModel energy_true = self.axes["energy_true"].center values = self.get_bias(energy_true) if energy_min is None: # use the peak bias energy as default minimum energy_min = energy_true[np.nanargmax(values)] if energy_max is None: energy_max = energy_true[-1] bias_spectrum = TemplateSpectralModel(energy=energy_true, values=values) energy_true_bias = bias_spectrum.inverse( Quantity(bias), energy_min=energy_min, energy_max=energy_max ) if np.isnan(energy_true_bias[0]): energy_true_bias[0] = energy_min # return reconstructed energy return energy_true_bias * (1 + bias) def get_mean(self, energy_true): """Get mean reconstructed energy for a given true energy.""" idx = self.axes["energy_true"].coord_to_idx(energy_true) pdf = self.data[idx] # compute sum along reconstructed energy norm = np.sum(pdf, axis=-1) temp = np.sum(pdf * self.axes["energy"].center, axis=-1) with np.errstate(invalid="ignore"): # corm can be zero mean = np.nan_to_num(temp / norm) return mean def _get_variance(self, energy_true): """Get variance of log reconstructed energy.""" # evaluate the pdf at given true energies idx = self.axes["energy_true"].coord_to_idx(energy_true) pdf = self.data[idx] # compute mean mean = self.get_mean(energy_true) # create array of reconstructed-energy nodes # for each given true energy value # (first axis is reconstructed energy) erec = self.axes["energy"].center erec = np.repeat(erec, max(np.sum(mean.shape), 1)).reshape( erec.shape + mean.shape ) # compute deviation from mean # (and move reconstructed energy axis to last axis) temp_ = (erec - mean) ** 2 temp = np.rollaxis(temp_, 1) # compute sum along reconstructed energy # axis to determine the variance norm = np.sum(pdf, axis=-1) var = np.sum(temp * pdf, axis=-1) return var / norm def plot_matrix( self, ax=None, add_cbar=False, axes_loc=None, kwargs_colorbar=None, **kwargs ): """Plot PDF matrix. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. add_cbar : bool, optional Add a colorbar to the plot. Default is False. axes_loc : dict, optional Keyword arguments passed to `~mpl_toolkits.axes_grid1.axes_divider.AxesDivider.append_axes`. kwargs_colorbar : dict, optional Keyword arguments passed to `~matplotlib.pyplot.colorbar`. kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.pcolormesh`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ kwargs.setdefault("cmap", "GnBu") norm = PowerNorm(gamma=0.5, vmin=0, vmax=1) kwargs.setdefault("norm", norm) kwargs_colorbar = kwargs_colorbar or {} ax = plt.gca() if ax is None else ax energy_axis_true = self.axes["energy_true"] energy_axis = self.axes["energy"] with quantity_support(): caxes = ax.pcolormesh( energy_axis_true.edges, energy_axis.edges, self.data.T, **kwargs ) if add_cbar: label = "Probability density (A.U.)" kwargs_colorbar.setdefault("label", label) add_colorbar(caxes, ax=ax, axes_loc=axes_loc, **kwargs_colorbar) energy_axis_true.format_plot_xaxis(ax=ax) energy_axis.format_plot_yaxis(ax=ax) return ax def plot_bias(self, ax=None, **kwargs): """Plot reconstruction bias. See `~gammapy.irf.EnergyDispersion.get_bias` method. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. **kwargs : dict Keyword arguments. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. 
""" ax = plt.gca() if ax is None else ax energy = self.axes["energy_true"].center bias = self.get_bias(energy) with quantity_support(): ax.plot(energy, bias, **kwargs) ax.set_xlabel( f"$E_\\mathrm{{True}}$ [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) ax.set_ylabel( "($E_\\mathrm{{Reco}} - E_\\mathrm{{True}}) / E_\\mathrm{{True}}$" ) ax.set_xscale("log") return ax def peek(self, figsize=(15, 5)): """Quick-look summary plots. Parameters ---------- figsize : tuple, optional Size of the figure. Default is (15, 5). """ fig, axes = plt.subplots(nrows=1, ncols=2, figsize=figsize) self.plot_bias(ax=axes[0]) self.plot_matrix(ax=axes[1]) plt.tight_layout() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/edisp/map.py0000644000175100001770000004347414721316200017073 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from gammapy.maps import Map, MapAxis, MapCoord, RegionGeom, WcsGeom from gammapy.utils.random import InverseCDFSampler, get_random_state from ..core import IRFMap from .kernel import EDispKernel __all__ = ["EDispMap", "EDispKernelMap"] def get_overlap_fraction(energy_axis, energy_axis_true): a_min = energy_axis.edges[:-1] a_max = energy_axis.edges[1:] b_min = energy_axis_true.edges[:-1][:, np.newaxis] b_max = energy_axis_true.edges[1:][:, np.newaxis] xmin = np.fmin(a_max, b_max) xmax = np.fmax(a_min, b_min) return (np.clip(xmin - xmax, 0, np.inf) / (b_max - b_min)).to("") class EDispMap(IRFMap): """Energy dispersion map. Parameters ---------- edisp_map : `~gammapy.maps.Map` The input Energy Dispersion Map. Should be a Map with 2 non-spatial axes. migra and true energy axes should be given in this specific order. exposure_map : `~gammapy.maps.Map`, optional Associated exposure map. Needs to have a consistent map geometry. 
Examples -------- :: # Energy dispersion map for CTAO data import numpy as np from astropy import units as u from astropy.coordinates import SkyCoord from gammapy.maps import WcsGeom, MapAxis from gammapy.irf import EnergyDispersion2D, EffectiveAreaTable2D from gammapy.makers.utils import make_edisp_map, make_map_exposure_true_energy # Define energy dispersion map geometry energy_axis_true = MapAxis.from_edges(np.logspace(-1, 1, 10), unit="TeV", name="energy_true") migra_axis = MapAxis.from_edges(np.linspace(0, 3, 100), name="migra") pointing = SkyCoord(0, 0, unit="deg") geom = WcsGeom.create( binsz=0.25 * u.deg, width=10 * u.deg, skydir=pointing, axes=[migra_axis, energy_axis_true], ) # Extract EnergyDispersion2D from CTA 1DC IRF filename = "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" edisp2D = EnergyDispersion2D.read(filename, hdu="ENERGY DISPERSION") aeff2d = EffectiveAreaTable2D.read(filename, hdu="EFFECTIVE AREA") # Create the exposure map exposure_geom = geom.squash(axis_name="migra") exposure_map = make_map_exposure_true_energy(pointing, "1 h", aeff2d, exposure_geom) # Create the EDispMap for the specified pointing edisp_map = make_edisp_map(edisp2D, pointing, geom, exposure_map) # Get an Energy Dispersion (1D) at any position in the image pos = SkyCoord(2.0, 2.5, unit="deg") energy_axis = MapAxis.from_energy_bounds(0.1, 10, 5, unit="TeV", name="energy") edisp = edisp_map.get_edisp_kernel(energy_axis, position=pos) # Write map to disk edisp_map.write("edisp_map.fits") """ tag = "edisp_map" required_axes = ["migra", "energy_true"] def __init__(self, edisp_map, exposure_map=None): super().__init__(irf_map=edisp_map, exposure_map=exposure_map) @property def edisp_map(self): return self._irf_map @edisp_map.setter def edisp_map(self, value): del self.has_single_spatial_bin self._irf_map = value def normalize(self): """Normalize PSF map.""" self.edisp_map.normalize(axis_name="migra") def get_edisp_kernel(self, energy_axis, position=None): """Get energy dispersion at a given position. Parameters ---------- energy_axis : `~gammapy.maps.MapAxis` Reconstructed energy axis. position : `~astropy.coordinates.SkyCoord` The target position. Should be a single coordinates. Returns ------- edisp : `~gammapy.irf.EnergyDispersion` The energy dispersion (i.e. rmf object). """ edisp_map = self.to_region_nd_map(region=position) edisp_kernel_map = edisp_map.to_edisp_kernel_map(energy_axis=energy_axis) return edisp_kernel_map.get_edisp_kernel() def to_edisp_kernel_map(self, energy_axis): """Convert to map with energy dispersion kernels. Parameters ---------- energy_axis : `~gammapy.maps.MapAxis` Reconstructed energy axis. Returns ------- edisp : `~gammapy.maps.EDispKernelMap` Energy dispersion kernel map. 
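Examples
--------
A minimal sketch starting from an all-sky diagonal `EDispMap`, so no data
files are needed (the binnings are illustrative):

>>> from gammapy.irf import EDispMap
>>> from gammapy.maps import MapAxis
>>> energy_axis_true = MapAxis.from_energy_bounds(
...     "0.1 TeV", "10 TeV", nbin=5, name="energy_true"
... )
>>> edisp_map = EDispMap.from_diagonal_response(energy_axis_true)
>>> energy_axis = MapAxis.from_energy_bounds("0.3 TeV", "3 TeV", nbin=3)
>>> kernel_map = edisp_map.to_edisp_kernel_map(energy_axis)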
""" energy_axis_true = self.edisp_map.geom.axes["energy_true"] geom_image = self.edisp_map.geom.to_image() geom = geom_image.to_cube([energy_axis, energy_axis_true]) coords = geom.get_coord(sparse=True, mode="edges", axis_name="energy") migra = coords["energy"] / coords["energy_true"] coords = { "skycoord": coords.skycoord, "energy_true": coords["energy_true"], "migra": migra, } values = self.edisp_map.integral(axis_name="migra", coords=coords) axis = self.edisp_map.geom.axes.index_data("migra") data = np.clip(np.diff(values, axis=axis), 0, np.inf) edisp_kernel_map = Map.from_geom(geom=geom, data=data.to_value(""), unit="") if self.exposure_map: geom = geom.squash(axis_name=energy_axis.name) exposure_map = self.exposure_map.copy(geom=geom) else: exposure_map = None return EDispKernelMap( edisp_kernel_map=edisp_kernel_map, exposure_map=exposure_map ) @classmethod def from_geom(cls, geom): """Create energy dispersion map from geometry. By default, a diagonal energy dispersion matrix is created. Parameters ---------- geom : `~gammapy.maps.Geom` Energy dispersion map geometry. Returns ------- edisp_map : `~gammapy.maps.EDispMap` Energy dispersion map. """ if "energy_true" not in [ax.name for ax in geom.axes]: raise ValueError("EDispMap requires true energy axis") exposure_map = Map.from_geom(geom=geom.squash(axis_name="migra"), unit="m2 s") edisp_map = Map.from_geom(geom, unit="") migra_axis = geom.axes["migra"] migra_0 = migra_axis.coord_to_pix(1) # distribute over two pixels migra = geom.get_idx()[2] data = np.abs(migra - migra_0) data = np.where(data < 1, 1 - data, 0) edisp_map.quantity = data / migra_axis.bin_width.reshape((1, -1, 1, 1)) return cls(edisp_map, exposure_map) def sample_coord(self, map_coord, random_state=0, chunk_size=10000): """Apply the energy dispersion corrections on the coordinates of a set of simulated events. Parameters ---------- map_coord : `~gammapy.maps.MapCoord` Sequence of coordinates and energies of sampled events. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is 0. chunk_size : int If set, this will slice the input MapCoord into smaller chunks of chunk_size elements. Default is 10000. Returns ------- `~gammapy.maps.MapCoord`. Sequence of energy dispersion corrected coordinates of the input map_coord map. """ random_state = get_random_state(random_state) migra_axis = self.edisp_map.geom.axes["migra"] position = map_coord.skycoord energy_true = map_coord["energy_true"] size = position.size energy_reco = np.ones(size) * map_coord["energy_true"].unit chunk_size = size if chunk_size is None else chunk_size index = 0 while index < size: chunk = slice(index, index + chunk_size, 1) coord = { "skycoord": position[chunk].reshape(-1, 1), "energy_true": energy_true[chunk].reshape(-1, 1), "migra": migra_axis.center, } pdf_edisp = self.edisp_map.interp_by_coord(coord) sample_edisp = InverseCDFSampler( pdf_edisp, axis=1, random_state=random_state ) pix_edisp = sample_edisp.sample_axis() migra = migra_axis.pix_to_coord(pix_edisp) energy_reco[chunk] = energy_true[chunk] * migra index += chunk_size return MapCoord.create({"skycoord": position, "energy": energy_reco}) @classmethod def from_diagonal_response(cls, energy_axis_true, migra_axis=None): """Create an all-sky EDisp map with diagonal response. Parameters ---------- energy_axis_true : `~gammapy.maps.MapAxis` True energy axis. 
migra_axis : `~gammapy.maps.MapAxis`, optional Migra axis. Default is None. Returns ------- edisp_map : `~gammapy.maps.EDispMap` Energy dispersion map. """ migra_res = 1e-5 migra_axis_default = MapAxis.from_bounds( 1 - migra_res, 1 + migra_res, nbin=3, name="migra", node_type="edges" ) migra_axis = migra_axis or migra_axis_default geom = WcsGeom.create( npix=(2, 1), proj="CAR", binsz=180, axes=[migra_axis, energy_axis_true] ) return cls.from_geom(geom) def peek(self, figsize=(15, 5)): """Quick-look summary plots. Plots corresponding to the center of the map. Parameters ---------- figsize : tuple Size of figure. """ e_true = self.edisp_map.geom.axes[1] e_reco = MapAxis.from_energy_bounds( e_true.edges.min(), e_true.edges.max(), nbin=len(e_true.center), name="energy", ) self.get_edisp_kernel(energy_axis=e_reco).peek(figsize) class EDispKernelMap(IRFMap): """Energy dispersion kernel map. Parameters ---------- edisp_kernel_map : `~gammapy.maps.Map` The input energy dispersion kernel map. Should be a Map with 2 non-spatial axes. Reconstructed and true energy axes should be given in this specific order. exposure_map : `~gammapy.maps.Map`, optional Associated exposure map. Needs to have a consistent map geometry. """ tag = "edisp_kernel_map" required_axes = ["energy", "energy_true"] def __init__(self, edisp_kernel_map, exposure_map=None): super().__init__(irf_map=edisp_kernel_map, exposure_map=exposure_map) @property def edisp_map(self): return self._irf_map @edisp_map.setter def edisp_map(self, value): self._irf_map = value @classmethod def from_geom(cls, geom): """Create energy dispersion map from geometry. By default, a diagonal energy dispersion matrix is created. Parameters ---------- geom : `~gammapy.maps.Geom` Energy dispersion map geometry. Returns ------- edisp_map : `EDispKernelMap` Energy dispersion kernel map. """ # TODO: allow only list of additional axes geom.axes.assert_names(cls.required_axes, allow_extra=True) geom_exposure = geom.squash(axis_name="energy") exposure = Map.from_geom(geom_exposure, unit="m2 s") energy_axis = geom.axes["energy"] energy_axis_true = geom.axes["energy_true"] data = get_overlap_fraction(energy_axis, energy_axis_true) edisp_kernel_map = Map.from_geom(geom, unit="") edisp_kernel_map.quantity += np.resize(data, geom.data_shape_axes) return cls(edisp_kernel_map=edisp_kernel_map, exposure_map=exposure) def get_edisp_kernel(self, position=None, energy_axis=None): """Get energy dispersion at a given position. Parameters ---------- position : `~astropy.coordinates.SkyCoord` or `~regions.SkyRegion`, optional The target position. Should be a single coordinates. Default is None. energy_axis : `MapAxis`, optional Reconstructed energy axis, only used for checking. Default is None. Returns ------- edisp : `~gammapy.irf.EnergyDispersion` The energy dispersion (i.e. rmf object). """ if energy_axis: assert energy_axis == self.edisp_map.geom.axes["energy"] if isinstance(self.edisp_map.geom, RegionGeom): kernel_map = self.edisp_map else: if position is None: position = self.edisp_map.geom.center_skydir position = self._get_nearest_valid_position(position) kernel_map = self.edisp_map.to_region_nd_map(region=position) return EDispKernel( axes=kernel_map.geom.axes[["energy_true", "energy"]], data=kernel_map.data[..., 0, 0], ) @classmethod def from_diagonal_response(cls, energy_axis, energy_axis_true, geom=None): """Create an energy dispersion map with diagonal response. Parameters ---------- energy_axis : `~gammapy.maps.MapAxis` Energy axis. 
energy_axis_true : `~gammapy.maps.MapAxis` True energy axis geom : `~gammapy.maps.Geom`, optional The (2D) geometry object to use. If None, an all sky geometry with 2 bins is created. Default is None. Returns ------- edisp_map : `EDispKernelMap` Energy dispersion kernel map. """ if geom is None: geom = WcsGeom.create( npix=(2, 1), proj="CAR", binsz=180, axes=[energy_axis, energy_axis_true] ) else: geom = geom.to_image().to_cube([energy_axis, energy_axis_true]) return cls.from_geom(geom) @classmethod def from_edisp_kernel(cls, edisp, geom=None): """Create an energy dispersion map from the input 1D kernel. The kernel will be duplicated over all spatial bins. Parameters ---------- edisp : `~gammapy.irf.EDispKernel` The input 1D kernel. geom : `~gammapy.maps.Geom`, optional The (2D) geometry object to use. If None, an all sky geometry with 2 bins is created. Default is None. Returns ------- edisp_map : `EDispKernelMap` Energy dispersion kernel map. """ edisp_map = cls.from_diagonal_response( edisp.axes["energy"], edisp.axes["energy_true"], geom=geom ) edisp_map.edisp_map.data *= 0 edisp_map.edisp_map.data[:, :, ...] = edisp.pdf_matrix[ :, :, np.newaxis, np.newaxis ] return edisp_map @classmethod def from_gauss( cls, energy_axis, energy_axis_true, sigma, bias, pdf_threshold=1e-6, geom=None ): """Create an energy dispersion map from the input 1D kernel. The kernel will be duplicated over all spatial bins. Parameters ---------- energy_axis_true : `~astropy.units.Quantity` Bin edges of true energy axis. energy_axis : `~astropy.units.Quantity` Bin edges of reconstructed energy axis. bias : float or `~numpy.ndarray` Center of Gaussian energy dispersion, bias. sigma : float or `~numpy.ndarray` RMS width of Gaussian energy dispersion, resolution. pdf_threshold : float, optional Zero suppression threshold. Default is 1e-6. geom : `~gammapy.maps.Geom`, optional The (2D) geometry object to use. If None, an all sky geometry with 2 bins is created. Default is None. Returns ------- edisp_map : `EDispKernelMap` Energy dispersion kernel map. """ kernel = EDispKernel.from_gauss( energy_axis=energy_axis, energy_axis_true=energy_axis_true, sigma=sigma, bias=bias, pdf_threshold=pdf_threshold, ) return cls.from_edisp_kernel(kernel, geom=geom) def to_image(self, weights=None): """Return a 2D EdispKernelMap by summing over the reconstructed energy axis. Parameters ---------- weights: `~gammapy.maps.Map`, optional Weights to be applied. Default is None. Returns ------- edisp : `EDispKernelMap` Energy dispersion kernel map. """ edisp = self.edisp_map.data if weights: edisp = edisp * weights.data data = np.sum(edisp, axis=1, keepdims=True) geom = self.edisp_map.geom.squash(axis_name="energy") edisp_map = Map.from_geom(geom=geom, data=data) return self.__class__( edisp_kernel_map=edisp_map, exposure_map=self.exposure_map ) def resample_energy_axis(self, energy_axis, weights=None): """Return a resampled `EDispKernelMap`. Bins are grouped according to the edges of the reconstructed energy axis provided. The true energy is left unchanged. Parameters ---------- energy_axis : `~gammapy.maps.MapAxis` The reconstructed energy axis to use for the grouping. weights: `~gammapy.maps.Map`, optional Weights to be applied. Default is None. Returns ------- edisp : `EDispKernelMap` Energy dispersion kernel map. 
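Examples
--------
A minimal sketch, grouping ten reconstructed-energy bins into two; note that
the coarse axis edges must coincide with a subset of the original edges:

>>> from gammapy.irf import EDispKernelMap
>>> from gammapy.maps import MapAxis
>>> energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10)
>>> energy_axis_true = energy_axis.copy(name="energy_true")
>>> edisp = EDispKernelMap.from_diagonal_response(energy_axis, energy_axis_true)
>>> coarse = edisp.resample_energy_axis(
...     MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=2)
... )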
""" new_edisp_map = self.edisp_map.resample_axis(axis=energy_axis, weights=weights) return self.__class__( edisp_kernel_map=new_edisp_map, exposure_map=self.exposure_map ) def peek(self, figsize=(15, 5)): """Quick-look summary plots. Plots corresponding to the center of the map. Parameters ---------- figsize : tuple, optional Size of the figure. Default is (15, 5). """ self.get_edisp_kernel().peek(figsize) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.208642 gammapy-1.3/gammapy/irf/edisp/tests/0000755000175100001770000000000014721316215017100 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/edisp/tests/__init__.py0000644000175100001770000000010014721316200021172 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/edisp/tests/test_core.py0000644000175100001770000001121714721316200021435 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from copy import deepcopy import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal import astropy.units as u from astropy.coordinates import Angle from gammapy.irf import EnergyDispersion2D from gammapy.maps import MapAxes, MapAxis from gammapy.utils.testing import mpl_plot_check, requires_data @requires_data() class TestEnergyDispersion2D: @classmethod def setup_class(cls): filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" cls.edisp = EnergyDispersion2D.read(filename, hdu="EDISP") # Make a test case energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "100 TeV", nbin=50, name="energy_true" ) migra_axis = MapAxis.from_bounds( 0, 4, nbin=1000, node_type="edges", name="migra" ) offset_axis = MapAxis.from_bounds(0, 2.5, nbin=5, unit="deg", name="offset") energy_true = energy_axis_true.edges[:-1] sigma = 0.15 / (energy_true / (1 * u.TeV)).value ** 0.3 bias = 1e-3 * (energy_true - 1 * u.TeV).value cls.edisp2 = EnergyDispersion2D.from_gauss( energy_axis_true=energy_axis_true, migra_axis=migra_axis, bias=bias, sigma=sigma, offset_axis=offset_axis, ) def test_str(self): assert "EnergyDispersion2D" in str(self.edisp) def test_evaluation(self): # Check output shape energy = [1, 2] * u.TeV migra = np.array([0.98, 0.97, 0.7]) offset = [0.1, 0.2, 0.3, 0.4] * u.deg actual = self.edisp.evaluate( energy_true=energy.reshape(-1, 1, 1), migra=migra.reshape(1, -1, 1), offset=offset.reshape(1, 1, -1), ) assert_allclose(actual.shape, (2, 3, 4)) # Check evaluation at all nodes actual = self.edisp.evaluate().shape desired = ( self.edisp.axes["energy_true"].nbin, self.edisp.axes["migra"].nbin, self.edisp.axes["offset"].nbin, ) assert_equal(actual, desired) def test_exporter(self): # Check RMF exporter offset = Angle(0.612, "deg") e_reco_axis = MapAxis.from_energy_bounds(1, 10, 7, "TeV") e_true_axis = MapAxis.from_energy_bounds(0.8, 5, 5, "TeV", name="energy_true") rmf = self.edisp.to_edisp_kernel( offset, energy_axis_true=e_true_axis, energy_axis=e_reco_axis ) assert_allclose(rmf.data.data[2, 3], 0.08, atol=5e-2) # same tolerance as above # Test for v1.3 rmf2 = self.edisp.to_edisp_kernel( offset, energy_axis_true=e_true_axis.edges, energy_axis=e_reco_axis.edges ) assert_allclose(rmf2.data.data[2, 3], 0.08, atol=5e-2) def test_write(self): energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", 
nbin=10, name="energy_true" ) offset_axis = MapAxis.from_bounds( 0, 1, nbin=3, unit="deg", name="offset", node_type="edges" ) migra_axis = MapAxis.from_bounds(0, 3, nbin=3, name="migra", node_type="edges") axes = MapAxes([energy_axis_true, migra_axis, offset_axis]) data = np.ones(shape=axes.shape) edisp_test = EnergyDispersion2D(axes=axes) with pytest.raises(ValueError) as error: wrong_unit = u.m**2 EnergyDispersion2D(axes=axes, data=data * wrong_unit) assert error.match( f"Error: {wrong_unit} is not an allowed unit. {edisp_test.tag} " f"requires {edisp_test.default_unit} data quantities." ) edisp = EnergyDispersion2D(axes=axes, data=data) hdu = edisp.to_table_hdu() energy = edisp.axes["energy_true"].edges assert_equal(hdu.data["ENERG_LO"][0], energy[:-1].value) assert hdu.header["TUNIT1"] == edisp.axes["energy_true"].unit def test_plot_migration(self): with mpl_plot_check(): self.edisp.plot_migration() def test_plot_bias(self): with mpl_plot_check(): self.edisp.plot_bias() def test_peek(self): with mpl_plot_check(): self.edisp.peek() def test_eq(self): assert not self.edisp2 == self.edisp edisp1 = deepcopy(self.edisp) assert self.edisp == edisp1 @requires_data("gammapy-data") def test_edisp2d_pointlike(): filename = "$GAMMAPY_DATA/joint-crab/dl3/magic/run_05029748_DL3.fits" edisp = EnergyDispersion2D.read(filename) hdu = edisp.to_table_hdu() assert edisp.is_pointlike assert hdu.header["HDUCLAS3"] == "POINT-LIKE" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/edisp/tests/test_kernel.py0000644000175100001770000001117214721316200021765 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import warnings import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal import astropy.units as u from astropy.io import fits from gammapy.irf import EDispKernel from gammapy.maps import MapAxis from gammapy.utils.testing import mpl_plot_check, requires_data class TestEDispKernel: def setup_method(self): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=100) energy_axis_true = energy_axis.copy(name="energy_true") self.resolution = 0.1 self.bias = 0 self.edisp = EDispKernel.from_gauss( energy_axis_true=energy_axis_true, energy_axis=energy_axis, pdf_threshold=1e-7, sigma=self.resolution, bias=self.bias, ) def test_from_diagonal_response(self): energy_axis_true = MapAxis.from_energy_edges( [0.5, 1, 2, 4, 6] * u.TeV, name="energy_true" ) energy_axis = MapAxis.from_energy_edges([2, 4, 6] * u.TeV) edisp = EDispKernel.from_diagonal_response(energy_axis_true, energy_axis) assert edisp.pdf_matrix.shape == (4, 2) expected = [[0, 0], [0, 0], [1, 0], [0, 1]] assert_equal(edisp.pdf_matrix, expected) # Test with different energy units energy_axis = MapAxis.from_energy_edges([2000, 4000, 6000] * u.GeV) edisp = EDispKernel.from_diagonal_response(energy_axis_true, energy_axis) assert edisp.pdf_matrix.shape == (4, 2) assert_allclose(edisp.pdf_matrix, expected, atol=1e-5) # Test square matrix edisp = EDispKernel.from_diagonal_response(energy_axis_true) assert_allclose(edisp.axes["energy"].edges, edisp.axes["energy_true"].edges) assert edisp.axes["energy"].unit == "TeV" assert_equal(edisp.pdf_matrix[0][0], 1) assert_equal(edisp.pdf_matrix[2][0], 0) assert edisp.pdf_matrix.sum() == 4 def test_to_image(self): energy_axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=3) energy_axis_true = MapAxis.from_energy_bounds( "0.08 TeV", "20 TeV", nbin=5, name="energy_true" ) edisp = 
EDispKernel.from_gauss( energy_axis=energy_axis, energy_axis_true=energy_axis_true, sigma=0.2, bias=0.1, ) im = edisp.to_image() assert im.pdf_matrix.shape == (5, 1) assert_allclose( im.pdf_matrix, [[0.97142], [1.0], [1.0], [1.0], [0.12349]], rtol=1e-3 ) assert_allclose(im.axes["energy"].edges, [0.1, 10] * u.TeV) def test_str(self): assert "EDispKernel" in str(self.edisp) def test_evaluate(self): # Check for correct normalization pdf = self.edisp.evaluate(energy_true=3.34 * u.TeV) assert_allclose(np.sum(pdf), 1, atol=1e-2) def test_get_bias(self): bias = self.edisp.get_bias(3.34 * u.TeV) assert_allclose(bias, self.bias, atol=1e-2) def test_get_resolution(self): resolution = self.edisp.get_resolution(3.34 * u.TeV) assert_allclose(resolution, self.resolution, atol=1e-2) def test_io(self, tmp_path): indices = np.array([[1, 3, 6], [3, 3, 2]]) desired = self.edisp.pdf_matrix[indices] self.edisp.write(tmp_path / "tmp.fits") edisp2 = EDispKernel.read(tmp_path / "tmp.fits") actual = edisp2.pdf_matrix[indices] assert_allclose(actual, desired) def test_plot_matrix(self): with mpl_plot_check(): self.edisp.plot_matrix() def test_plot_bias(self): with mpl_plot_check(): self.edisp.plot_bias() def test_peek(self): with mpl_plot_check(): self.edisp.peek() @requires_data("gammapy-data") def test_get_bias_energy(): """Obs read from file""" rmffile = "$GAMMAPY_DATA/joint-crab/spectra/hess/rmf_obs23523.fits" edisp = EDispKernel.read(rmffile) thresh_lo = edisp.get_bias_energy(0.1) assert_allclose(thresh_lo.to("TeV").value, 0.9174, rtol=1e-4) @pytest.mark.xfail def test_io_ogip_checksum(tmp_path): """Obs read from file""" energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=100) energy_axis_true = energy_axis.copy(name="energy_true") edisp = EDispKernel.from_diagonal_response( energy_axis=energy_axis, energy_axis_true=energy_axis_true ) path = tmp_path / "test.fits" edisp.write(path, checksum=True) # TODO: why does this fail? # MATRIX hdu checksum is changed. 
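    # The simplefilter("error") below escalates the checksum-mismatch
    # warning emitted by fits.open(..., checksum=True) into an exception,
    # which is what currently makes this test xfail.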
with warnings.catch_warnings(): warnings.simplefilter("error") fits.open(path, checksum=True) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/edisp/tests/test_map.py0000644000175100001770000002527114721316200021267 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord from astropy.units import Unit from gammapy.irf import ( EDispKernel, EDispKernelMap, EDispMap, EffectiveAreaTable2D, EnergyDispersion2D, ) from gammapy.makers.utils import make_edisp_map, make_map_exposure_true_energy from gammapy.maps import MapAxis, MapCoord, RegionGeom, WcsGeom from gammapy.utils.testing import mpl_plot_check def fake_aeff2d(area=1e6 * u.m**2): offsets = np.array((0.0, 1.0, 2.0, 3.0)) * u.deg energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "10 TeV", nbin=4, name="energy_true" ) offset_axis = MapAxis.from_edges(offsets, name="offset") return EffectiveAreaTable2D( axes=[energy_axis_true, offset_axis], data=area.value, unit=area.unit ) def make_edisp_map_test(): pointing = SkyCoord(0, 0, unit="deg") energy_axis_true = MapAxis.from_energy_edges( energy_edges=[0.2, 0.7, 1.5, 2.0, 10.0] * u.TeV, name="energy_true", ) migra_axis = MapAxis(nodes=np.linspace(0.0, 3.0, 51), unit="", name="migra") offset_axis = MapAxis.from_nodes([0.0, 1.0, 2.0, 3.0] * u.deg, name="offset") edisp2d = EnergyDispersion2D.from_gauss( energy_axis_true=energy_axis_true, migra_axis=migra_axis, offset_axis=offset_axis, bias=0, sigma=0.2, ) geom = WcsGeom.create( skydir=pointing, binsz=1.0, width=5.0, axes=[migra_axis, energy_axis_true] ) aeff2d = fake_aeff2d() exposure_geom = geom.squash(axis_name="migra") exposure_map = make_map_exposure_true_energy(pointing, "1 h", aeff2d, exposure_geom) return make_edisp_map(edisp2d, pointing, geom, exposure_map) def test_make_edisp_map(): energy_axis = MapAxis( nodes=[0.2, 0.7, 1.5, 2.0, 10.0], unit="TeV", name="energy_true", node_type="edges", interp="log", ) migra_axis = MapAxis(nodes=np.linspace(0.0, 3.0, 51), unit="", name="migra") edmap = make_edisp_map_test() assert edmap.edisp_map.geom.axes[0] == migra_axis assert edmap.edisp_map.geom.axes[1] == energy_axis assert edmap.edisp_map.unit == Unit("") assert edmap.edisp_map.data.shape == (4, 50, 5, 5) def test_edisp_map_to_from_hdulist(): edmap = make_edisp_map_test() hdulist = edmap.to_hdulist() assert "EDISP" in hdulist assert "EDISP_BANDS" in hdulist assert "EDISP_EXPOSURE" in hdulist assert "EDISP_EXPOSURE_BANDS" in hdulist new_edmap = EDispMap.from_hdulist(hdulist) assert_allclose(edmap.edisp_map.data, new_edmap.edisp_map.data) assert new_edmap.edisp_map.geom == edmap.edisp_map.geom assert new_edmap.exposure_map.geom == edmap.exposure_map.geom def test_edisp_map_read_write(tmp_path): edisp_map = make_edisp_map_test() edisp_map.write(tmp_path / "tmp.fits") new_edmap = EDispMap.read(tmp_path / "tmp.fits") assert_allclose(edisp_map.edisp_map.quantity, new_edmap.edisp_map.quantity) def test_edisp_map_to_energydispersion(): edmap = make_edisp_map_test() position = SkyCoord(0, 0, unit="deg") energy_axis = MapAxis.from_edges(np.logspace(-0.3, 0.2, 200) * u.TeV, name="energy") edisp = edmap.get_edisp_kernel(position=position, energy_axis=energy_axis) # Note that the bias and resolution are rather poorly evaluated on an EnergyDispersion object assert_allclose(edisp.get_bias(energy_true=1.0 * u.TeV), 0.0, 
atol=3e-2) assert_allclose(edisp.get_resolution(energy_true=1.0 * u.TeV), 0.2, atol=3e-2) def test_edisp_map_from_geom_error(): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=3, name="energy_true" ) geom = WcsGeom.create(npix=(1, 1), axes=[energy_axis_true, energy_axis]) with pytest.raises(ValueError): EDispKernelMap.from_geom(geom=geom) def test_edisp_map_stacking(): edmap1 = make_edisp_map_test() edmap2 = make_edisp_map_test() edmap2.exposure_map.quantity *= 2 edmap_stack = edmap1.copy() edmap_stack.stack(edmap2) assert_allclose(edmap_stack.edisp_map.data, edmap1.edisp_map.data) assert_allclose(edmap_stack.exposure_map.data, edmap1.exposure_map.data * 3) def test_sample_coord(): edisp_map = make_edisp_map_test() coords = MapCoord( {"lon": [0, 0] * u.deg, "lat": [0, 0.5] * u.deg, "energy_true": [1, 3] * u.TeV}, frame="icrs", ) coords_corrected = edisp_map.sample_coord(map_coord=coords) assert len(coords_corrected["energy"]) == 2 assert coords_corrected["energy"].unit == "TeV" assert_allclose(coords_corrected["energy"].value, [1.024664, 3.34484], rtol=1e-5) @pytest.mark.parametrize("position", ["0d 0d", "180d 0d", "0d 90d", "180d -90d"]) def test_edisp_from_diagonal_response(position): position = SkyCoord(position) energy_axis_true = MapAxis.from_energy_bounds( "0.3 TeV", "10 TeV", nbin=31, name="energy_true" ) energy_axis = MapAxis.from_energy_bounds( "0.3 TeV", "10 TeV", nbin=31, name="energy" ) edisp_map = EDispMap.from_diagonal_response(energy_axis_true) edisp_kernel = edisp_map.get_edisp_kernel( position=position, energy_axis=energy_axis ) sum_kernel = np.sum(edisp_kernel.data, axis=1) # We exclude the first and last bin, where there is no # e_reco to contribute to assert_allclose(sum_kernel[1:-1], 1) def test_edisp_map_to_edisp_kernel_map(): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=5) energy_axis_true = MapAxis.from_energy_bounds( "0.3 TeV", "30 TeV", nbin=10, per_decade=True, name="energy_true" ) migra_axis = MapAxis(nodes=np.linspace(0.0, 3.0, 51), unit="", name="migra") edisp_map = EDispMap.from_diagonal_response(energy_axis_true, migra_axis) edisp_kernel_map = edisp_map.to_edisp_kernel_map(energy_axis) position = SkyCoord(0, 0, unit="deg") kernel = edisp_kernel_map.get_edisp_kernel(position=position) assert edisp_kernel_map.exposure_map.geom.axes[0].name == "energy" actual = kernel.pdf_matrix.sum(axis=0) assert_allclose(actual, 2.0) def test_edisp_kernel_map_stack(): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=5) energy_axis_true = MapAxis.from_energy_bounds( "0.3 TeV", "30 TeV", nbin=10, per_decade=True, name="energy_true" ) edisp_1 = EDispKernelMap.from_diagonal_response( energy_axis=energy_axis, energy_axis_true=energy_axis_true ) edisp_1.exposure_map.data += 1 edisp_2 = EDispKernelMap.from_diagonal_response( energy_axis=energy_axis, energy_axis_true=energy_axis_true ) edisp_2.exposure_map.data += 2 geom = edisp_1.edisp_map.geom weights = geom.energy_mask(energy_min=2 * u.TeV) edisp_1.stack(edisp_2, weights=weights) position = SkyCoord(0, 0, unit="deg") kernel = edisp_1.get_edisp_kernel(position=position) actual = kernel.pdf_matrix.sum(axis=0) exposure = edisp_1.exposure_map.data[:, 0, 0, 0] assert_allclose(actual, [2.0 / 3.0, 2.0 / 3.0, 2.0, 2.0, 2.0]) assert_allclose(exposure, 3.0) def test_incorrect_edisp_kernel_map_stack(): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=5) energy_axis_true = MapAxis.from_energy_bounds( 
"0.3 TeV", "30 TeV", nbin=10, per_decade=True, name="energy_true" ) edisp_1 = EDispKernelMap.from_diagonal_response( energy_axis=energy_axis, energy_axis_true=energy_axis_true ) edisp_1.exposure_map.data += 1 edisp_2 = EDispKernelMap.from_diagonal_response( energy_axis=energy_axis, energy_axis_true=energy_axis_true ) edisp_2.exposure_map = None with pytest.raises(ValueError) as except_info: edisp_1.stack(edisp_2) assert except_info.match("Missing exposure map for EDispKernelMap.stack") def test_edispkernel_from_diagonal_response(): energy_axis_true = MapAxis.from_energy_bounds( "0.3 TeV", "10 TeV", nbin=11, name="energy_true" ) energy_axis = MapAxis.from_energy_bounds( "0.3 TeV", "10 TeV", nbin=11, name="energy" ) geom = RegionGeom.create("fk5;circle(0, 0, 10)") region_edisp = EDispKernelMap.from_diagonal_response( energy_axis, energy_axis_true, geom=geom ) sum_kernel = np.sum(region_edisp.edisp_map.data[..., 0, 0], axis=1) # We exclude the first and last bin, where there is no # e_reco to contribute to assert_allclose(sum_kernel[1:-1], 1) def test_edispkernel_from_1d(): energy_axis_true = MapAxis.from_energy_bounds( "0.5 TeV", "5 TeV", nbin=31, name="energy_true" ) energy_axis = MapAxis.from_energy_bounds( "0.1 TeV", "10 TeV", nbin=11, name="energy" ) edisp = EDispKernel.from_gauss(energy_axis_true, energy_axis, 0.1, 0.0) geom = RegionGeom.create("fk5;circle(0, 0, 10)") region_edisp = EDispKernelMap.from_edisp_kernel(edisp, geom=geom) sum_kernel = np.sum(region_edisp.edisp_map.data[..., 0, 0], axis=1) assert_allclose(sum_kernel, 1, rtol=1e-5) allsky_edisp = EDispKernelMap.from_edisp_kernel(edisp) sum_kernel = np.sum(allsky_edisp.edisp_map.data[..., 0, 0], axis=1) assert allsky_edisp.edisp_map.data.shape == (31, 11, 1, 2) assert_allclose(sum_kernel, 1, rtol=1e-5) def test_edisp_kernel_map_to_image(): e_reco = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=3) e_true = MapAxis.from_energy_bounds( "0.08 TeV", "20 TeV", nbin=5, name="energy_true" ) edisp = EDispKernelMap.from_diagonal_response(e_reco, e_true) im = edisp.to_image() assert im.edisp_map.data.shape == (5, 1, 1, 2) assert_allclose(im.edisp_map.data[0, 0, 0, 0], 0.87605894, rtol=1e-5) def test_edisp_kernel_map_resample_axis(): e_reco = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=4) e_true = MapAxis.from_energy_bounds( "0.08 TeV", "20 TeV", nbin=10, name="energy_true" ) edisp = EDispKernelMap.from_diagonal_response(e_reco, e_true) e_reco = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=2) im = edisp.resample_energy_axis(energy_axis=e_reco) res = np.sum(im.edisp_map.data[4, :, 0, 0]) assert im.edisp_map.data.shape == (10, 2, 1, 2) assert_allclose(res, 1.0, rtol=1e-5) def test_peek(): e_reco = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=3) e_true = MapAxis.from_energy_bounds( "0.08 TeV", "20 TeV", nbin=5, name="energy_true" ) edisp = EDispKernelMap.from_diagonal_response(e_reco, e_true) with mpl_plot_check(): edisp.peek() edisp = EDispMap.from_diagonal_response(e_true) with mpl_plot_check(): edisp.peek() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/effective_area.py0000644000175100001770000002257014721316200020134 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import warnings import numpy as np import astropy.units as u from astropy.visualization import quantity_support import matplotlib.pyplot as plt from gammapy.maps import MapAxes, MapAxis from gammapy.maps.axes import UNIT_STRING_FORMAT from 
gammapy.visualization.utils import add_colorbar from .core import IRF from gammapy.utils.deprecation import GammapyDeprecationWarning __all__ = ["EffectiveAreaTable2D"] class EffectiveAreaTable2D(IRF): """2D effective area table. Data format specification: :ref:`gadf:aeff_2d`. Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Required axes (in the given order) are: * energy_true (true energy axis) * offset (field of view offset axis) data : `~astropy.units.Quantity` Effective area. meta : dict Metadata dictionary. Examples -------- Here's an example you can use to learn about this class: >>> from gammapy.irf import EffectiveAreaTable2D >>> filename = '$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits' >>> aeff = EffectiveAreaTable2D.read(filename, hdu='EFFECTIVE AREA') >>> print(aeff) EffectiveAreaTable2D -------------------- axes : ['energy_true', 'offset'] shape : (42, 6) ndim : 2 unit : m2 dtype : >f4 Here's another one, created from scratch, without reading a file: >>> from gammapy.irf import EffectiveAreaTable2D >>> from gammapy.maps import MapAxis >>> energy_axis_true = MapAxis.from_energy_bounds( ... "0.1 TeV", "100 TeV", nbin=30, name="energy_true" ... ) >>> offset_axis = MapAxis.from_bounds(0, 5, nbin=4, name="offset") >>> aeff = EffectiveAreaTable2D(axes=[energy_axis_true, offset_axis], data=1e10, unit="cm2") >>> print(aeff) EffectiveAreaTable2D -------------------- axes : ['energy_true', 'offset'] shape : (30, 4) ndim : 2 unit : cm2 dtype : float64 """ tag = "aeff_2d" required_axes = ["energy_true", "offset"] default_unit = u.m**2 def plot_energy_dependence(self, ax=None, offset=None, **kwargs): """Plot effective area versus energy for a given offset. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. offset : list of `~astropy.coordinates.Angle`, optional Offset. Default is None. kwargs : dict Forwarded to plt.plot(). Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax if offset is None: off_min, off_max = self.axes["offset"].bounds offset = np.linspace(off_min, off_max, 4) energy_axis = self.axes["energy_true"] for off in offset: area = self.evaluate(offset=off, energy_true=energy_axis.center) label = kwargs.pop("label", f"offset = {off:.1f}") with quantity_support(): ax.plot(energy_axis.center, area, label=label, **kwargs) energy_axis.format_plot_xaxis(ax=ax) ax.set_ylabel( f"Effective Area [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) ax.legend() return ax def plot_offset_dependence(self, ax=None, energy=None, **kwargs): """Plot effective area versus offset for a given energy. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. energy : `~astropy.units.Quantity` Energy. **kwargs : dict Keyword argument passed to `~matplotlib.pyplot.plot`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. 
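        Examples
        --------
        A short sketch, assuming the ``$GAMMAPY_DATA`` IRF file from the
        class-level example is available:

        >>> import astropy.units as u
        >>> from gammapy.irf import EffectiveAreaTable2D
        >>> filename = '$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits'
        >>> aeff = EffectiveAreaTable2D.read(filename, hdu='EFFECTIVE AREA')
        >>> ax = aeff.plot_offset_dependence(energy=[1, 10] * u.TeV)  # doctest: +SKIP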
""" ax = plt.gca() if ax is None else ax if energy is None: energy_axis = self.axes["energy_true"] e_min, e_max = energy_axis.center[[0, -1]] energy = np.geomspace(e_min, e_max, 4) offset_axis = self.axes["offset"] for ee in energy: area = self.evaluate(offset=offset_axis.center, energy_true=ee) area /= np.nanmax(area) if np.isnan(area).all(): continue label = f"energy = {ee:.1f}" with quantity_support(): ax.plot(offset_axis.center, area, label=label, **kwargs) offset_axis.format_plot_xaxis(ax=ax) ax.set_ylim(0, 1.1) ax.set_ylabel("Relative Effective Area") ax.legend(loc="best") return ax def plot( self, ax=None, add_cbar=True, axes_loc=None, kwargs_colorbar=None, **kwargs ): """Plot effective area image. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. add_cbar : bool, optional Add a colorbar to the plot. Default is True. axes_loc : dict, optional Keyword arguments passed to `~mpl_toolkits.axes_grid1.axes_divider.AxesDivider.append_axes`. kwargs_colorbar : dict, optional Keyword arguments passed to `~matplotlib.pyplot.colorbar`. kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.pcolormesh`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax energy = self.axes["energy_true"] offset = self.axes["offset"] aeff = self.evaluate( offset=offset.center, energy_true=energy.center[:, np.newaxis] ) vmin, vmax = np.nanmin(aeff.value), np.nanmax(aeff.value) kwargs.setdefault("cmap", "GnBu") kwargs.setdefault("edgecolors", "face") kwargs.setdefault("vmin", vmin) kwargs.setdefault("vmax", vmax) kwargs_colorbar = kwargs_colorbar or {} with quantity_support(): caxes = ax.pcolormesh(energy.edges, offset.edges, aeff.value.T, **kwargs) energy.format_plot_xaxis(ax=ax) offset.format_plot_yaxis(ax=ax) if add_cbar: label = f"Effective Area [{aeff.unit.to_string(UNIT_STRING_FORMAT)}]" kwargs_colorbar.setdefault("label", label) add_colorbar(caxes, ax=ax, axes_loc=axes_loc, **kwargs_colorbar) return ax def peek(self, figsize=(15, 5)): """Quick-look summary plots. Parameters ---------- figsize : tuple, optional Size of the figure. Default is (15, 5). """ ncols = 2 if self.is_pointlike else 3 fig, axes = plt.subplots(nrows=1, ncols=ncols, figsize=figsize) self.plot(ax=axes[ncols - 1]) self.plot_energy_dependence(ax=axes[0]) if self.is_pointlike is False: self.plot_offset_dependence(ax=axes[1]) plt.tight_layout() @classmethod def from_parametrization(cls, energy_axis_true=None, instrument="HESS"): r"""Create parametrized effective area. Parametrizations of the effective areas of different Cherenkov telescopes taken from Appendix B of Abramowski et al. (2010), see https://ui.adsabs.harvard.edu/abs/2010MNRAS.402.1342A . .. math:: A_{eff}(E) = g_1 \left(\frac{E}{\mathrm{MeV}}\right)^{-g_2}\exp{\left(-\frac{g_3}{E}\right)} This method does not model the offset dependence of the effective area, but just assumes that it is constant. Parameters ---------- energy_axis_true : `MapAxis`, optional Energy binning, analytic function is evaluated at log centers. Default is None. instrument : {'HESS', 'HESS2', 'CTAO'} Instrument name. Default is 'HESS'. Returns ------- aeff : `EffectiveAreaTable2D` Effective area table. """ # noqa: E501 # Put the parameters g in a dictionary. 
# Units: g1 (cm^2), g2 (), g3 (MeV) pars = { "HESS": [6.85e9, 0.0891, 5e5], "HESS2": [2.05e9, 0.0891, 1e5], "CTAO": [1.71e11, 0.0891, 1e5], } if instrument == "CTA": instrument = "CTAO" warnings.warn( "Since v1.3, the value 'CTA' is replaced by 'CTAO' for the argument instrument.", GammapyDeprecationWarning, ) if instrument not in pars.keys(): ss = f"Unknown instrument: {instrument}\n" ss += f"Valid instruments: {list(pars.keys())}" raise ValueError(ss) if energy_axis_true is None: energy_axis_true = MapAxis.from_energy_bounds( "2 GeV", "200 TeV", nbin=20, per_decade=True, name="energy_true" ) g1, g2, g3 = pars[instrument] offset_axis = MapAxis.from_edges([0.0, 5.0] * u.deg, name="offset") axes = MapAxes([energy_axis_true, offset_axis]) coords = axes.get_coord() energy, offset = coords["energy_true"].to_value("MeV"), coords["offset"] data = np.ones_like(offset.value) * g1 * energy ** (-g2) * np.exp(-g3 / energy) # TODO: fake offset dependence? meta = {"TELESCOP": instrument} return cls(axes=axes, data=data, unit="cm2", meta=meta) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/io.py0000644000175100001770000001557214721316200015617 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from astropy.io import fits from gammapy.data.hdu_index_table import HDUIndexTable from gammapy.utils.fits import HDULocation from gammapy.utils.scripts import make_path __all__ = ["load_irf_dict_from_file"] log = logging.getLogger(__name__) IRF_DL3_AXES_SPECIFICATION = { "THETA": {"name": "offset", "interp": "lin"}, "ENERG": {"name": "energy_true", "interp": "log"}, "ETRUE": {"name": "energy_true", "interp": "log"}, "RAD": {"name": "rad", "interp": "lin"}, "DETX": {"name": "fov_lon", "interp": "lin"}, "DETY": {"name": "fov_lat", "interp": "lin"}, "MIGRA": {"name": "migra", "interp": "lin"}, } COMMON_HEADERS = { "HDUCLASS": "GADF", "HDUDOC": "https://github.com/open-gamma-ray-astro/gamma-astro-data-formats", "HDUVERS": "0.2", } COMMON_IRF_HEADERS = { **COMMON_HEADERS, "HDUCLAS1": "RESPONSE", } # The key is the class tag. 
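# Each entry maps an IRF class tag to its GADF extension name, the data
# column name(s) and the mandatory FITS header keywords for that HDU.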
# TODO: extend the info here with the minimal header info IRF_DL3_HDU_SPECIFICATION = { "bkg_3d": { "extname": "BACKGROUND", "column_name": "BKG", "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "BKG", "HDUCLAS3": "FULL-ENCLOSURE", # added here to have HDUCLASN in order "HDUCLAS4": "BKG_3D", "FOVALIGN": "RADEC", }, }, "bkg_2d": { "extname": "BACKGROUND", "column_name": "BKG", "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "BKG", "HDUCLAS3": "FULL-ENCLOSURE", # added here to have HDUCLASN in order "HDUCLAS4": "BKG_2D", }, }, "edisp_2d": { "extname": "ENERGY DISPERSION", "column_name": "MATRIX", "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "EDISP", "HDUCLAS3": "FULL-ENCLOSURE", # added here to have HDUCLASN in order "HDUCLAS4": "EDISP_2D", }, }, "psf_table": { "extname": "PSF_2D_TABLE", "column_name": "RPSF", "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "RPSF", "HDUCLAS3": "FULL-ENCLOSURE", # added here to have HDUCLASN in order "HDUCLAS4": "PSF_TABLE", }, }, "psf_3gauss": { "extname": "PSF_2D_GAUSS", "column_name": { "sigma_1": "SIGMA_1", "sigma_2": "SIGMA_2", "sigma_3": "SIGMA_3", "scale": "SCALE", "ampl_2": "AMPL_2", "ampl_3": "AMPL_3", }, "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "RPSF", "HDUCLAS3": "FULL-ENCLOSURE", # added here to have HDUCLASN in order "HDUCLAS4": "PSF_3GAUSS", }, }, "psf_king": { "extname": "PSF_2D_KING", "column_name": { "sigma": "SIGMA", "gamma": "GAMMA", }, "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "RPSF", "HDUCLAS3": "FULL-ENCLOSURE", # added here to have HDUCLASN in order "HDUCLAS4": "PSF_KING", }, }, "aeff_2d": { "extname": "EFFECTIVE AREA", "column_name": "EFFAREA", "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "EFF_AREA", "HDUCLAS3": "FULL-ENCLOSURE", # added here to have HDUCLASN in order "HDUCLAS4": "AEFF_2D", }, }, "rad_max_2d": { "extname": "RAD_MAX", "column_name": "RAD_MAX", "mandatory_keywords": { **COMMON_IRF_HEADERS, "HDUCLAS2": "RAD_MAX", "HDUCLAS3": "POINT-LIKE", "HDUCLAS4": "RAD_MAX_2D", }, }, } IRF_MAP_HDU_SPECIFICATION = { "edisp_kernel_map": "edisp", "edisp_map": "edisp", "psf_map": "psf", "psf_map_reco": "psf", } def gadf_is_pointlike(header): """Check if a GADF IRF is pointlike based on the header.""" return header.get("HDUCLAS3") == "POINT-LIKE" class UnknownHDUClass(IOError): """Raised when a file contains an unknown HDUCLASS.""" def _get_hdu_type_and_class(header): """Get gammapy hdu_type and class from FITS header. Contains a workaround to support CTA 1DC irf file. """ hdu_clas2 = header.get("HDUCLAS2", "") hdu_clas4 = header.get("HDUCLAS4", "") clas2_to_type = {"rpsf": "psf", "eff_area": "aeff"} hdu_type = clas2_to_type.get(hdu_clas2.lower(), hdu_clas2.lower()) hdu_class = hdu_clas4.lower() if hdu_type not in HDUIndexTable.VALID_HDU_TYPE: raise UnknownHDUClass(f"HDUCLAS2={hdu_clas2}, HDUCLAS4={hdu_clas4}") # workaround for CTA 1DC files with non-compliant HDUCLAS4 names if hdu_class not in HDUIndexTable.VALID_HDU_CLASS: hdu_class = f"{hdu_type}_{hdu_class}" if hdu_class not in HDUIndexTable.VALID_HDU_CLASS: raise UnknownHDUClass(f"HDUCLAS2={hdu_clas2}, HDUCLAS4={hdu_clas4}") return hdu_type, hdu_class def load_irf_dict_from_file(filename): """Load all available IRF components from given file into a dictionary. If multiple IRFs of the same type are present, the first encountered is returned. 
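    In that case a warning is logged and the additional HDUs of the same
    type are skipped.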
Parameters ---------- filename : str or `~pathlib.Path` Path to the file containing the IRF components, if EVENTS and GTI HDUs are included in the file, they are ignored. Returns ------- irf_dict : dict of `~gammapy.irf.IRF` Dictionary with instances of the Gammapy objects corresponding to the IRF components. """ from .rad_max import RadMax2D filename = make_path(filename) irf_dict = {} is_pointlike = False with fits.open(filename) as hdulist: for hdu in hdulist: hdu_clas1 = hdu.header.get("HDUCLAS1", "").lower() # not an IRF component if hdu_clas1 != "response": continue is_pointlike |= hdu.header.get("HDUCLAS3") == "POINT-LIKE" try: hdu_type, hdu_class = _get_hdu_type_and_class(hdu.header) except UnknownHDUClass as e: log.warning("File has unknown class %s", e) continue loc = HDULocation( hdu_class=hdu_class, hdu_name=hdu.name, file_dir=filename.parent, file_name=filename.name, ) if hdu_type in irf_dict.keys(): log.warning(f"more than one HDU of {hdu_type} type found") log.warning( f"loaded the {irf_dict[hdu_type].meta['EXTNAME']} HDU in the dictionary" ) continue data = loc.load() irf_dict[hdu_type] = data if is_pointlike and "rad_max" not in irf_dict: irf_dict["rad_max"] = RadMax2D.from_irf(irf_dict["aeff"]) return irf_dict ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.208642 gammapy-1.3/gammapy/irf/psf/0000755000175100001770000000000014721316215015422 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/__init__.py0000644000175100001770000000057514721316200017534 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .kernel import PSFKernel from .map import PSFMap, RecoPSFMap from .parametric import EnergyDependentMultiGaussPSF, ParametricPSF, PSFKing from .table import PSF3D __all__ = [ "EnergyDependentMultiGaussPSF", "ParametricPSF", "PSF3D", "PSFKernel", "PSFKing", "PSFMap", "RecoPSFMap", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/core.py0000644000175100001770000002311514721316200016720 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy import units as u from astropy.visualization import quantity_support import matplotlib.pyplot as plt import matplotlib.ticker as mtick from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.utils.array import array_stats_str from gammapy.visualization.utils import add_colorbar from ..core import IRF class PSF(IRF): """PSF base class.""" def normalize(self): """Normalise PSF.""" super().normalize(axis_name="rad") def containment(self, rad, **kwargs): """Containment tof the PSF at given axes coordinates. Parameters ---------- rad : `~astropy.units.Quantity` Rad value. **kwargs : dict Other coordinates. Returns ------- containment : `~numpy.ndarray` Containment. """ containment = self.integral(axis_name="rad", rad=rad, **kwargs) return containment.to("") def containment_radius(self, fraction, factor=20, **kwargs): """Containment radius at given axes coordinates. Parameters ---------- fraction : float or `~numpy.ndarray` Containment fraction. factor : int, optional Up-sampling factor of the rad axis, determines the precision of the computed containment radius. Default is 20. **kwargs : dict Other coordinates. Returns ------- radius : `~astropy.coordinates.Angle` Containment radius. 
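        Examples
        --------
        A minimal sketch, assuming ``$GAMMAPY_DATA`` is set so that the IRF
        file below can be read:

        >>> import astropy.units as u
        >>> from gammapy.irf import load_irf_dict_from_file
        >>> filename = "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits"
        >>> psf = load_irf_dict_from_file(filename)["psf"]
        >>> psf.containment_radius(
        ...     fraction=0.68, energy_true=1 * u.TeV, offset=0.5 * u.deg
        ... )  # doctest: +SKIP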
""" # TODO: this uses a lot of numpy broadcasting tricks, maybe simplify... from gammapy.datasets.map import RAD_AXIS_DEFAULT output = np.broadcast(*kwargs.values(), fraction) try: rad_axis = self.axes["rad"] except KeyError: rad_axis = RAD_AXIS_DEFAULT # upsample for better precision rad = rad_axis.upsample(factor=factor).center axis = tuple(range(output.ndim)) rad = np.expand_dims(rad, axis=axis).T containment = self.containment(rad=rad, **kwargs) fraction_idx = np.argmin(np.abs(containment - fraction), axis=0) return rad[fraction_idx].reshape(output.shape) def info( self, fraction=(0.68, 0.95), energy_true=([1.0], [10.0]) * u.TeV, offset=0 * u.deg, ): """ Print PSF summary information. The containment radius for given fraction, energies and thetas is computed and printed on the command line. Parameters ---------- fraction : list, optional Containment fraction to compute containment radius for, between 0 and 1. Default is (0.68, 0.95). energy_true : `~astropy.units.u.Quantity`, optional Energies to compute containment radius for. Default is ([1.0], [10.0]) TeV. offset : `~astropy.units.u.Quantity`, optional Offset to compute containment radius for. Default is 0 deg. Returns ------- info : string Formatted string containing the summary information. """ info = "\nSummary PSF info\n" info += "----------------\n" info += array_stats_str(self.axes["offset"].center.to("deg"), "Theta") info += array_stats_str(self.axes["energy_true"].edges[1:], "Energy hi") info += array_stats_str(self.axes["energy_true"].edges[:-1], "Energy lo") containment_radius = self.containment_radius( energy_true=energy_true, offset=offset, fraction=fraction ) energy_true, offset, fraction = np.broadcast_arrays( energy_true, offset, fraction, subok=True ) for idx in np.ndindex(containment_radius.shape): info += f"{100 * fraction[idx]:.2f} containment radius " info += f"at offset = {offset[idx]} " info += f"and energy_true = {energy_true[idx]:4.1f}: " info += f"{containment_radius[idx]:.3f}\n" return info def plot_containment_radius_vs_energy( self, ax=None, fraction=(0.68, 0.95), offset=(0, 1) * u.deg, **kwargs ): """Plot containment fraction as a function of energy. Parameters ---------- ax : `~matplotlib.pyplot.Axes`, optional Matplotlib axes. Default is None. fraction : list of float or `~numpy.ndarray`, optional Containment fraction between 0 and 1. Default is (0.68, 0.95). offset : `~astropy.units.Quantity`, optional Offset array. Default is (0, 1) deg. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.plot`. Returns ------- ax : `~matplotlib.pyplot.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax energy_true = self.axes["energy_true"] for theta in offset: for frac in fraction: plot_kwargs = kwargs.copy() radius = self.containment_radius( energy_true=energy_true.center, offset=theta, fraction=frac ) plot_kwargs.setdefault("label", f"{theta}, {100 * frac:.1f}%") with quantity_support(): ax.plot(energy_true.center, radius, **plot_kwargs) energy_true.format_plot_xaxis(ax=ax) ax.legend(loc="best") ax.set_ylabel( f"Containment radius [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) ax.yaxis.set_major_formatter(mtick.FormatStrFormatter("%.2f")) return ax def plot_containment_radius( self, ax=None, fraction=0.68, add_cbar=True, axes_loc=None, kwargs_colorbar=None, **kwargs, ): """Plot containment image with energy and theta axes. Parameters ---------- ax : `~matplotlib.pyplot.Axes`, optional Matplotlib axes. Default is None. fraction : float, optional Containment fraction between 0 and 1. 
Default is 0.68. add_cbar : bool, optional Add a colorbar. Default is True. axes_loc : dict, optional Keyword arguments passed to `~mpl_toolkits.axes_grid1.axes_divider.AxesDivider.append_axes`. kwargs_colorbar : dict, optional Keyword arguments passed to `~matplotlib.pyplot.colorbar`. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.pcolormesh`. Returns ------- ax : `~matplotlib.pyplot.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax energy = self.axes["energy_true"] offset = self.axes["offset"] # Set up and compute data containment = self.containment_radius( energy_true=energy.center[:, np.newaxis], offset=offset.center, fraction=fraction, ) # plotting defaults kwargs.setdefault("cmap", "GnBu") kwargs.setdefault("vmin", np.nanmin(containment.value)) kwargs.setdefault("vmax", np.nanmax(containment.value)) kwargs_colorbar = kwargs_colorbar or {} # Plotting with quantity_support(): caxes = ax.pcolormesh( energy.edges, offset.edges, containment.value.T, **kwargs ) energy.format_plot_xaxis(ax=ax) offset.format_plot_yaxis(ax=ax) if add_cbar: label = f"Containment radius R{100 * fraction:.0f} ({containment.unit})" kwargs_colorbar.setdefault("label", label) add_colorbar(caxes, ax=ax, axes_loc=axes_loc, **kwargs_colorbar) return ax def plot_psf_vs_rad( self, ax=None, offset=[0] * u.deg, energy_true=[0.1, 1, 10] * u.TeV, **kwargs ): """Plot PSF vs rad. Parameters ---------- ax : `~matplotlib.pyplot.Axes`, optional Matplotlib axes. Default is None. offset : `~astropy.coordinates.Angle`, optional Offset in the field of view. Default is 0 deg. energy_true : `~astropy.units.Quantity`, optional True energy at which to plot the profile. Default is [0.1, 1, 10] TeV. kwargs : dict Keyword arguments. """ from gammapy.datasets.map import RAD_AXIS_DEFAULT ax = plt.gca() if ax is None else ax try: rad = self.axes["rad"] except KeyError: rad = RAD_AXIS_DEFAULT for theta in offset: for energy in energy_true: psf_value = self.evaluate( rad=rad.center, energy_true=energy, offset=theta ) label = f"Offset: {theta:.1f}, Energy: {energy:.1f}" with quantity_support(): ax.plot(rad.center, psf_value, label=label, **kwargs) rad.format_plot_xaxis(ax=ax) ax.set_yscale("log") ax.set_ylabel(f"PSF [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]") ax.legend() return ax def peek(self, figsize=(15, 5)): """Quick-look summary plots. Parameters ---------- figsize : tuple, optional Size of the figure. Default is (15, 5). """ fig, axes = plt.subplots(nrows=1, ncols=3, figsize=figsize) self.plot_containment_radius(fraction=0.68, ax=axes[0]) self.plot_containment_radius(fraction=0.95, ax=axes[1]) self.plot_containment_radius_vs_energy(ax=axes[2]) plt.tight_layout() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/kernel.py0000644000175100001770000002167014721316200017254 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import numpy as np import astropy.units as u import matplotlib.pyplot as plt from gammapy.maps import Map from gammapy.modeling.models import PowerLawSpectralModel from gammapy.utils.deprecation import deprecated_renamed_argument __all__ = ["PSFKernel"] class PSFKernel: """PSF kernel for `~gammapy.maps.Map`. This is a container class to store a PSF kernel that can be used to convolve `~gammapy.maps.WcsNDMap` objects. It is usually computed from an `~gammapy.irf.PSFMap`. Parameters ---------- psf_kernel_map : `~gammapy.maps.Map` PSF kernel stored in a Map. 
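    normalize : bool, optional
        Whether to normalize the kernel on initialisation, so that it sums
        to one over the spatial axes in each energy bin. Default is True.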
Examples -------- .. testcode:: from gammapy.maps import Map, WcsGeom, MapAxis from gammapy.irf import PSFMap from astropy import units as u # Define energy axis energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "10 TeV", nbin=3, name="energy_true" ) # Create WcsGeom and map geom = WcsGeom.create(binsz=0.02 * u.deg, width=2.0 * u.deg, axes=[energy_axis_true]) some_map = Map.from_geom(geom) # Fill map at three positions some_map.fill_by_coord([[0, 0, 0], [0, 0, 0], [0.3, 1, 3]]) psf = PSFMap.from_gauss( energy_axis_true=energy_axis_true, sigma=[0.3, 0.2, 0.1] * u.deg ) kernel = psf.get_psf_kernel(geom=geom, max_radius=1*u.deg) # Do the convolution some_map_convolved = some_map.convolve(kernel) some_map_convolved.plot_grid() # doctest: +SKIP """ def __init__(self, psf_kernel_map, normalize=True): self._psf_kernel_map = psf_kernel_map if normalize: self.normalize() def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def normalize(self): """Force normalisation of the kernel.""" data = self.psf_kernel_map.data if self.psf_kernel_map.geom.is_image: axis = (0, 1) else: axis = (1, 2) with np.errstate(divide="ignore", invalid="ignore"): data = np.nan_to_num(data / data.sum(axis=axis, keepdims=True)) self.psf_kernel_map.data = data @property def data(self): """Access the PSFKernel numpy array.""" return self._psf_kernel_map.data @property def psf_kernel_map(self): """The map object holding the kernel as a `~gammapy.maps.Map`.""" return self._psf_kernel_map @classmethod def read(cls, *args, **kwargs): """Read kernel Map from file.""" psf_kernel_map = Map.read(*args, **kwargs) return cls(psf_kernel_map) @classmethod def from_spatial_model(cls, model, geom, max_radius=None, factor=4): """Create PSF kernel from spatial model. Parameters ---------- geom : `~gammapy.maps.WcsGeom` Map geometry. model : `~gammapy.modeling.models.SpatiaModel` Gaussian width. max_radius : `~astropy.coordinates.Angle`, optional Desired kernel map size. Default is None. factor : int, optional Oversample factor to compute the PSF. Default is 4. Returns ------- kernel : `~gammapy.irf.PSFKernel` The kernel Map with reduced geometry according to the max_radius. """ if max_radius is None: max_radius = model.evaluation_radius geom = geom.to_odd_npix(max_radius=max_radius) model.position = geom.center_skydir geom = geom.upsample(factor=factor) map = model.integrate_geom(geom, oversampling_factor=1) return cls(psf_kernel_map=map.downsample(factor=factor)) @classmethod def from_gauss(cls, geom, sigma, max_radius=None, factor=4): """Create Gaussian PSF. This is used for testing and examples. The map geometry parameters (pixel size, energy bins) are taken from ``geom``. The Gaussian width ``sigma`` is a scalar. TODO : support array input if it should vary along the energy axis. Parameters ---------- geom : `~gammapy.maps.WcsGeom` Map geometry. sigma : `~astropy.coordinates.Angle` Gaussian width. max_radius : `~astropy.coordinates.Angle`, optional Desired kernel map size. Default is None. factor : int, optional Oversample factor to compute the PSF. Default is 4. Returns ------- kernel : `~gammapy.irf.PSFKernel` The kernel Map with reduced geometry according to the max_radius. """ from gammapy.modeling.models import GaussianSpatialModel gauss = GaussianSpatialModel(sigma=sigma) return cls.from_spatial_model( model=gauss, geom=geom, max_radius=max_radius, factor=factor ) def write(self, *args, **kwargs): """Write the Map object which contains the PSF kernel to file.""" self.psf_kernel_map.write(*args, **kwargs) @deprecated_renamed_argument("spectrum", "spectral_model", "v1.3") def to_image(self, spectral_model=None, exposure=None, keepdims=True): """Transform 3D PSFKernel into a 2D PSFKernel. Parameters ---------- spectral_model : `~gammapy.modeling.models.SpectralModel`, optional Spectral model to compute the weights. Default is power-law with spectral index of 2. exposure : `~astropy.units.Quantity` or `~numpy.ndarray`, optional 1D array containing exposure in each true energy bin. It must have the same size as the PSFKernel energy axis. Default is uniform exposure over energy. keepdims : bool, optional If true, the resulting PSFKernel will keep an energy axis with one bin. Default is True. Returns ------- weighted_kernel : `~gammapy.irf.PSFKernel` The weighted kernel summed over energy. 
""" map = self.psf_kernel_map if spectral_model is None: spectral_model = PowerLawSpectralModel(index=2.0) if exposure is None: exposure = np.ones(map.geom.axes[0].center.shape) exposure = u.Quantity(exposure) if exposure.shape != map.geom.axes[0].center.shape: raise ValueError("Incorrect exposure_array shape") # Compute weights vector for name in map.geom.axes.names: if "energy" in name: energy_name = name energy_edges = map.geom.axes[energy_name].edges weights = spectral_model.integral( energy_min=energy_edges[:-1], energy_max=energy_edges[1:], intervals=True ) weights *= exposure weights /= weights.sum() spectrum_weighted_kernel = map.copy() spectrum_weighted_kernel.quantity *= weights[:, np.newaxis, np.newaxis] return self.__class__(spectrum_weighted_kernel.sum_over_axes(keepdims=keepdims)) def slice_by_idx(self, slices): """Slice by index.""" kernel = self.psf_kernel_map.slice_by_idx(slices=slices) return self.__class__(psf_kernel_map=kernel) def plot_kernel(self, ax=None, energy=None, add_cbar=False, **kwargs): """Plot PSF kernel. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. energy : `~astropy.units.Quantity`, optional If None, the PSF kernel is summed over the energy axis. Otherwise, the kernel corresponding to the energy bin including the given energy is shown. add_cbar : bool, optional Add a colorbar. Default is False. kwargs: dict Keyword arguments passed to `~matplotlib.axes.Axes.imshow`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax if energy is not None: kernel_map = self.psf_kernel_map energy_center = kernel_map.geom.axes["energy_true"].center.to(energy.unit) idx = np.argmin(np.abs(energy_center.value - energy.value)) kernel_map.get_image_by_idx([idx]).plot(ax=ax, add_cbar=add_cbar, **kwargs) else: self.to_image().psf_kernel_map.plot(ax=ax, add_cbar=add_cbar, **kwargs) return ax def peek(self, figsize=(15, 5)): """Quick-look summary plots. Parameters ---------- figsize : tuple, optional Size of the figure. Default is (15, 5). 
""" fig, axes = plt.subplots(nrows=1, ncols=2, figsize=figsize) axes[0].set_title("Energy-integrated PSF kernel") self.plot_kernel(ax=axes[0], add_cbar=True) axes[1].set_title("PSF kernel at 1 TeV") self.plot_kernel(ax=axes[1], add_cbar=True, energy=1 * u.TeV) plt.tight_layout() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/map.py0000644000175100001770000006057314721316200016556 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np import astropy.units as u from astropy.visualization import quantity_support import matplotlib.pyplot as plt from matplotlib.ticker import FormatStrFormatter from gammapy.maps import HpxGeom, Map, MapAxes, MapAxis, MapCoord, WcsGeom from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.modeling.models import PowerLawSpectralModel from gammapy.utils.gauss import Gauss2DPDF from gammapy.utils.random import InverseCDFSampler, get_random_state from ..core import IRFMap from .core import PSF from .kernel import PSFKernel from gammapy.utils.deprecation import deprecated_renamed_argument __all__ = ["PSFMap", "RecoPSFMap"] PSF_MAX_OVERSAMPLING = 4 # for backward compatibility def _psf_upsampling_factor(psf, geom, position, energy=None, precision_factor=12): """Minimal factor between the bin half-width of the geom and the median R68% containment radius.""" if energy is None: energy = geom.axes[psf.energy_name].center psf_r68s = psf.containment_radius( 0.68, geom.axes[psf.energy_name].center, position=position ) factors = [] for psf_r68 in psf_r68s: base_factor = (2 * psf_r68 / geom.pixel_scales.max()).to_value("") factor = np.minimum( int(np.ceil(precision_factor / base_factor)), PSF_MAX_OVERSAMPLING ) if isinstance(geom, HpxGeom): factor = int(2 ** np.ceil(np.log(factor) / np.log(2))) factors.append(factor) return factors class IRFLikePSF(PSF): required_axes = ["energy_true", "rad", "lat_idx", "lon_idx"] tag = "irf_like_psf" class PSFMap(IRFMap): """Class containing the Map of PSFs and allowing to interact with it. Parameters ---------- psf_map : `~gammapy.maps.Map` The input PSF Map. Should be a Map with 2 non spatial axes. rad and true energy axes should be given in this specific order. exposure_map : `~gammapy.maps.Map` Associated exposure map. Needs to have a consistent map geometry. Examples -------- .. testcode:: from astropy.coordinates import SkyCoord from gammapy.maps import WcsGeom, MapAxis from gammapy.data import Observation, FixedPointingInfo from gammapy.irf import load_irf_dict_from_file from gammapy.makers import MapDatasetMaker # Define observation pointing_position = SkyCoord(0, 0, unit="deg", frame="galactic") pointing = FixedPointingInfo( fixed_icrs=pointing_position.icrs, ) filename = "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" irfs = load_irf_dict_from_file(filename) obs = Observation.create(pointing=pointing, irfs=irfs, livetime="1h") # Define energy axis. Note that the name is fixed. energy_axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=3, name="energy_true") # Define rad axis. 
Again note the axis name rad_axis = MapAxis.from_bounds(0, 0.5, nbin=100, name="rad", unit="deg") # Create WcsGeom geom = WcsGeom.create( binsz=0.25, width="5 deg", skydir=pointing_position, axes=[rad_axis, energy_axis] ) maker = MapDatasetMaker() psf = maker.make_psf(geom=geom, observation=obs) # Get a PSF kernel at the center of the image upsample_geom = geom.upsample(factor=10).drop("rad") psf_kernel = psf.get_psf_kernel(geom=upsample_geom) """ tag = "psf_map" required_axes = ["rad", "energy_true"] def __init__(self, psf_map, exposure_map=None): super().__init__(irf_map=psf_map, exposure_map=exposure_map) @property def energy_name(self): return self.required_axes[-1] @property def psf_map(self): return self._irf_map @psf_map.setter def psf_map(self, value): del self.has_single_spatial_bin self._irf_map = value def normalize(self): """Normalize PSF map.""" self.psf_map.normalize(axis_name="rad") @classmethod def from_geom(cls, geom): """Create PSF map from geometry. Parameters ---------- geom : `Geom` PSF map geometry. Returns ------- psf_map : `PSFMap` Point spread function map. """ geom_exposure = geom.squash(axis_name="rad") exposure_psf = Map.from_geom(geom_exposure, unit="m2 s") psf_map = Map.from_geom(geom, unit="sr-1") return cls(psf_map, exposure_psf) # TODO: this is a workaround for now, probably add Map.integral() or similar @property def _psf_irf(self): geom = self.psf_map.geom npix_x, npix_y = geom.npix axis_lon = MapAxis.from_edges(np.arange(npix_x[0] + 1) - 0.5, name="lon_idx") axis_lat = MapAxis.from_edges(np.arange(npix_y[0] + 1) - 0.5, name="lat_idx") axes = MapAxes( [geom.axes[self.energy_name], geom.axes["rad"], axis_lat, axis_lon] ) psf = IRFLikePSF psf.required_axes = axes.names return psf( axes=axes, data=self.psf_map.data, unit=self.psf_map.unit, ) def _get_irf_coords(self, **kwargs): coords = MapCoord.create(kwargs) geom = self.psf_map.geom.to_image() lon_pix, lat_pix = geom.coord_to_pix(coords.skycoord) coords_irf = { "lon_idx": lon_pix, "lat_idx": lat_pix, self.energy_name: coords[self.energy_name], } try: coords_irf["rad"] = coords["rad"] except KeyError: pass return coords_irf def containment(self, rad, energy_true, position=None): """Containment at given coordinates. Parameters ---------- rad : `~astropy.units.Quantity` Rad value. energy_true : `~astropy.units.Quantity` Energy true value. position : `~astropy.coordinates.SkyCoord`, optional Sky position. If None, the center of the map is chosen. Default is None. Returns ------- containment : `~astropy.units.Quantity` Containment values. """ if position is None: position = self.psf_map.geom.center_skydir coords = {"skycoord": position, "rad": rad, self.energy_name: energy_true} return self.psf_map.integral(axis_name="rad", coords=coords).to("") def containment_radius(self, fraction, energy_true, position=None): """Containment at given coordinates. Parameters ---------- fraction : float Containment fraction. energy_true : `~astropy.units.Quantity` Energy true value. position : `~astropy.coordinates.SkyCoord`, optional Sky position. If None, the center of the map is chosen. Default is None. Returns ------- containment : `~astropy.units.Quantity` Containment values. """ if position is None: position = self.psf_map.geom.center_skydir kwargs = {self.energy_name: energy_true, "skycoord": position} coords = self._get_irf_coords(**kwargs) return self._psf_irf.containment_radius(fraction, **coords) def containment_radius_map(self, energy_true, fraction=0.68): """Containment radius map. 
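        The containment radius is evaluated at every spatial pixel of the
        PSF map geometry for the given true energy.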
Parameters ---------- energy_true : `~astropy.units.Quantity` Energy true at which to compute the containment radius. fraction : float, optional Containment fraction (range: 0 to 1). Default is 0.68. Returns ------- containment_radius_map : `~gammapy.maps.Map` Containment radius map. """ geom = self.psf_map.geom.to_image() data = self.containment_radius( fraction, energy_true, geom.get_coord().skycoord, ) return Map.from_geom(geom=geom, data=data.value, unit=data.unit) def get_psf_kernel( self, geom, position=None, max_radius=None, containment=0.999, precision_factor=12, ): """Return a PSF kernel at the given position. The PSF is returned in the form a WcsNDMap defined by the input Geom. Parameters ---------- geom : `~gammapy.maps.Geom` Target geometry to use. position : `~astropy.coordinates.SkyCoord`, optional Target position. Should be a single coordinate. By default, the center position is used. max_radius : `~astropy.coordinates.Angle`, optional Maximum angular size of the kernel map. Default is None and it will be computed for the `containment` fraction set. containment : float, optional Containment fraction to use as size of the kernel. The radius can be overwritten using the `max_radius` argument. Default is 0.999. precision_factor : int, optional Factor between the bin half-width of the geom and the median R68% containment radius. Used only for the oversampling method. Default is 10. Returns ------- kernel : `~gammapy.irf.PSFKernel` or list of `PSFKernel` The resulting kernel. """ if geom.is_region or geom.is_hpx: geom = geom.to_wcs_geom() if position is None: position = self.psf_map.geom.center_skydir position = self._get_nearest_valid_position(position) energy_axis = self.psf_map.geom.axes[self.energy_name] kwargs = { "fraction": containment, "position": position, self.energy_name: energy_axis.center, } radii = self.containment_radius(**kwargs) if max_radius is None: max_radius = np.max(radii) else: max_radius = u.Quantity(max_radius) radii[radii > max_radius] = max_radius n_radii = len(radii) factor = _psf_upsampling_factor(self, geom, position, precision_factor) geom = geom.to_odd_npix(max_radius=max_radius) kernel_map = Map.from_geom(geom=geom) for im, ind in zip(kernel_map.iter_by_image(keepdims=True), range(n_radii)): geom_image_cut = im.geom.to_odd_npix(max_radius=radii[ind]).upsample( factor=factor[ind] ) coords = geom_image_cut.get_coord(sparse=True) rad = coords.skycoord.separation(geom.center_skydir) coords = { self.energy_name: coords[self.energy_name], "rad": rad, "skycoord": position, } data = self.psf_map.interp_by_coord( coords=coords, method="linear", ) kernel_image = Map.from_geom( geom=geom_image_cut, data=np.clip(data, 0, np.inf) ) kernel_image = kernel_image.downsample( factor=factor[ind], preserve_counts=True ) coords = kernel_image.geom.get_coord() im.fill_by_coord(coords, weights=kernel_image.data) return PSFKernel(kernel_map, normalize=True) def sample_coord(self, map_coord, random_state=0, chunk_size=10000): """Apply PSF corrections on the coordinates of a set of simulated events. Parameters ---------- map_coord : `~gammapy.maps.MapCoord` object. Sequence of coordinates and energies of sampled events. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is 0. chunk_size : int If set, this will slice the input MapCoord into smaller chunks of chunk_size elements. Default is 10000. 
Returns ------- corr_coord : `~gammapy.maps.MapCoord` Sequence of PSF-corrected coordinates of the input map_coord map. """ random_state = get_random_state(random_state) rad_axis = self.psf_map.geom.axes["rad"] position = map_coord.skycoord energy = map_coord[self.energy_name] size = position.size separation = np.ones(size) * u.deg chunk_size = size if chunk_size is None else chunk_size index = 0 while index < size: chunk = slice(index, index + chunk_size, 1) coord = { "skycoord": position[chunk].reshape(-1, 1), self.energy_name: energy[chunk].reshape(-1, 1), "rad": rad_axis.center, } pdf = ( self.psf_map.interp_by_coord(coord) * rad_axis.center.value * rad_axis.bin_width.value ) sample_pdf = InverseCDFSampler(pdf, axis=1, random_state=random_state) pix_coord = sample_pdf.sample_axis() separation[chunk] = rad_axis.pix_to_coord(pix_coord) index += chunk_size position_angle = random_state.uniform(360, size=len(map_coord.lon)) * u.deg event_positions = map_coord.skycoord.directional_offset_by( position_angle=position_angle, separation=separation ) return MapCoord.create({"skycoord": event_positions, self.energy_name: energy}) @classmethod def from_gauss(cls, energy_axis_true, rad_axis=None, sigma=0.1 * u.deg, geom=None): """Create all-sky PSF map from Gaussian width. This is used for testing and examples. The width can be the same for all energies or be an array with one value per energy node. It does not depend on position. Parameters ---------- energy_axis_true : `~gammapy.maps.MapAxis` True energy axis. rad_axis : `~gammapy.maps.MapAxis` Offset angle wrt source position axis. sigma : `~astropy.coordinates.Angle` Gaussian width. geom : `Geom` Image geometry. By default, an all-sky geometry is created. Returns ------- psf_map : `PSFMap` Point spread function map. """ from gammapy.datasets.map import RAD_AXIS_DEFAULT if rad_axis is None: rad_axis = RAD_AXIS_DEFAULT.copy() if geom is None: geom = WcsGeom.create( npix=(2, 1), proj="CAR", binsz=180, ) geom = geom.to_cube([rad_axis, energy_axis_true]) coords = geom.get_coord(sparse=True) sigma = u.Quantity(sigma).reshape((-1, 1, 1, 1)) gauss = Gauss2DPDF(sigma=sigma) data = gauss(coords["rad"]) * np.ones(geom.data_shape) psf_map = Map.from_geom(geom=geom, data=data.to_value("sr-1"), unit="sr-1") exposure_map = Map.from_geom( geom=geom.squash(axis_name="rad"), unit="m2 s", data=1.0 ) return cls(psf_map=psf_map, exposure_map=exposure_map) @deprecated_renamed_argument("spectrum", "spectral_model", "v1.3") def to_image(self, spectral_model=None, keepdims=True): """Reduce to a 2D map after weighing with the associated exposure and a spectrum. Parameters ---------- spectral_model : `~gammapy.modeling.models.SpectralModel`, optional Spectral model to compute the weights. Default is power-law with spectral index of 2. keepdims : bool, optional If True, the energy axis is kept with one bin. If False, the axis is removed. Returns ------- psf_out : `PSFMap` `PSFMap` with the energy axis summed over. 
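        Examples
        --------
        A minimal sketch using the Gaussian test PSF (the widths below are
        illustrative only):

        >>> import astropy.units as u
        >>> from gammapy.maps import MapAxis
        >>> from gammapy.irf import PSFMap
        >>> energy_axis_true = MapAxis.from_energy_bounds(
        ...     "0.1 TeV", "10 TeV", nbin=3, name="energy_true"
        ... )
        >>> psf_map = PSFMap.from_gauss(energy_axis_true, sigma=[0.3, 0.2, 0.1] * u.deg)
        >>> psf_image = psf_map.to_image()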
""" from gammapy.makers.utils import _map_spectrum_weight if spectral_model is None: spectral_model = PowerLawSpectralModel(index=2.0) exp_weighed = _map_spectrum_weight(self.exposure_map, spectral_model) exposure = exp_weighed.sum_over_axes( axes_names=[self.energy_name], keepdims=keepdims ) psf_data = exp_weighed.data * self.psf_map.data / exposure.data psf_map = Map.from_geom(geom=self.psf_map.geom, data=psf_data, unit="sr-1") psf = psf_map.sum_over_axes(axes_names=[self.energy_name], keepdims=keepdims) return self.__class__(psf_map=psf, exposure_map=exposure) def plot_containment_radius_vs_energy( self, ax=None, fraction=(0.68, 0.95), **kwargs ): """Plot containment fraction as a function of energy. The method plots the containment radius at the center of the map. Parameters ---------- ax : `~matplotlib.pyplot.Axes`, optional Matplotlib axes. Default is None. fraction : list of float or `~numpy.ndarray` Containment fraction between 0 and 1. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.plot` Returns ------- ax : `~matplotlib.pyplot.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax position = self.psf_map.geom.center_skydir energy_axis = self.psf_map.geom.axes[self.energy_name] energy_true = energy_axis.center for frac in fraction: radius = self.containment_radius(frac, energy_true, position) label = f"Containment: {100 * frac:.1f}%" with quantity_support(): ax.plot(energy_true, radius, label=label, **kwargs) ax.semilogx() ax.legend(loc="best") ax.yaxis.set_major_formatter(FormatStrFormatter("%.2f")) energy_axis.format_plot_xaxis(ax=ax) ax.set_ylabel( f"Containment radius [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) return ax def plot_psf_vs_rad(self, ax=None, energy_true=[0.1, 1, 10] * u.TeV, **kwargs): """Plot PSF vs radius. The method plots the profile at the center of the map. Parameters ---------- ax : `~matplotlib.pyplot.Axes`, optional Matplotlib axes. Default is None. energy : `~astropy.units.Quantity` Energies where to plot the PSF. **kwargs : dict Keyword arguments pass to `~matplotlib.pyplot.plot`. Returns ------- ax : `~matplotlib.pyplot.Axes` Matplotlib axes. """ ax = plt.gca() if ax is None else ax rad = self.psf_map.geom.axes["rad"].center for value in energy_true: psf_value = self.psf_map.interp_by_coord( { "skycoord": self.psf_map.geom.center_skydir, self.energy_name: value, "rad": rad, } ) label = f"{value:.0f}" psf_value *= self.psf_map.unit with quantity_support(): ax.plot(rad, psf_value, label=label, **kwargs) ax.set_yscale("log") ax.set_xlabel(f"Rad [{ax.xaxis.units.to_string(UNIT_STRING_FORMAT)}]") ax.set_ylabel(f"PSF [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]") ax.xaxis.set_major_formatter(FormatStrFormatter("%.2f")) ax.legend() return ax def __str__(self): return str(self.psf_map) def peek(self, figsize=(12, 10)): """Quick-look summary plots. Parameters ---------- figsize : tuple Size of figure. 
""" fig, axes = plt.subplots( ncols=2, nrows=2, subplot_kw={"projection": self.psf_map.geom.wcs}, figsize=figsize, gridspec_kw={"hspace": 0.3, "wspace": 0.3}, ) axes = axes.flat axes[0].remove() ax0 = fig.add_subplot(2, 2, 1) ax0.set_title("Containment radius at center of map") self.plot_containment_radius_vs_energy(ax=ax0) axes[1].remove() ax1 = fig.add_subplot(2, 2, 2) ax1.set_ylim(1e-4, 1e4) ax1.set_title("PSF at center of map") self.plot_psf_vs_rad(ax=ax1) axes[2].set_title("Exposure") if self.exposure_map is not None: self.exposure_map.reduce_over_axes().plot(ax=axes[2], add_cbar=True) axes[3].set_title("Containment radius at 1 TeV") kwargs = {self.energy_name: 1 * u.TeV} self.containment_radius_map(**kwargs).plot(ax=axes[3], add_cbar=True) class RecoPSFMap(PSFMap): """Class containing the Map of PSFs in reconstructed energy and allowing to interact with it. Parameters ---------- psf_map : `~gammapy.maps.Map` the input PSF Map. Should be a Map with 2 non spatial axes. rad and energy axes should be given in this specific order. exposure_map : `~gammapy.maps.Map` Associated exposure map. Needs to have a consistent map geometry. """ tag = "psf_map_reco" required_axes = ["rad", "energy"] @property def energy_name(self): return self.required_axes[-1] @classmethod def from_gauss(cls, energy_axis, rad_axis=None, sigma=0.1 * u.deg, geom=None): """Create all -sky PSF map from Gaussian width. This is used for testing and examples. The width can be the same for all energies or be an array with one value per energy node. It does not depend on position. Parameters ---------- energy_axis : `~gammapy.maps.MapAxis` Energy axis. rad_axis : `~gammapy.maps.MapAxis` Offset angle wrt source position axis. sigma : `~astropy.coordinates.Angle` Gaussian width. geom : `Geom` Image geometry. By default, an all-sky geometry is created. Returns ------- psf_map : `PSFMap` Point spread function map. """ return super().from_gauss(energy_axis, rad_axis, sigma, geom) def containment(self, rad, energy, position=None): """Containment at given coordinates. Parameters ---------- rad : `~astropy.units.Quantity` Rad value. energy : `~astropy.units.Quantity` Energy value. position : `~astropy.coordinates.SkyCoord`, optional Sky position. By default, the center of the map is chosen. Returns ------- containment : `~astropy.units.Quantity` Containment values. """ return super().containment(rad, energy, position) def containment_radius(self, fraction, energy, position=None): """Containment at given coordinates. Parameters ---------- fraction : float Containment fraction. energy : `~astropy.units.Quantity` Energy value. position : `~astropy.coordinates.SkyCoord`, optional Sky position. By default, the center of the map is chosen. Returns ------- containment : `~astropy.units.Quantity` Containment values. """ return super().containment_radius(fraction, energy, position) def containment_radius_map(self, energy, fraction=0.68): """Containment radius map. Parameters ---------- energy : `~astropy.units.Quantity` Energy at which to compute the containment radius fraction : float, optional Containment fraction (range: 0 to 1). Default is 0.68. Returns ------- containment_radius_map : `~gammapy.maps.Map` Containment radius map. """ return super().containment_radius_map(energy, fraction=0.68) def plot_psf_vs_rad(self, ax=None, energy=[0.1, 1, 10] * u.TeV, **kwargs): """Plot PSF vs radius. The method plots the profile at the center of the map. Parameters ---------- ax : `~matplotlib.pyplot.Axes`, optional Matplotlib axes. Default is None. 
energy : `~astropy.units.Quantity` Energies at which to plot the PSF. Default is [0.1, 1, 10] TeV. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.plot`. Returns ------- ax : `~matplotlib.pyplot.Axes` Matplotlib axes. """ return super().plot_psf_vs_rad(ax, energy_true=energy, **kwargs) def stack(self, other, weights=None, nan_to_num=True): """Stack IRF map with another one in place.""" raise NotImplementedError( "Stacking is not supported for PSF in reconstructed energy." ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/parametric.py0000644000175100001770000003106614721316200020123 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import logging import numpy as np from astropy import units as u from gammapy.maps import MapAxes, MapAxis from gammapy.utils.gauss import MultiGauss2D from gammapy.utils.interpolation import ScaledRegularGridInterpolator from .core import PSF __all__ = ["ParametricPSF", "EnergyDependentMultiGaussPSF", "PSFKing"] log = logging.getLogger(__name__) class ParametricPSF(PSF): """Parametric PSF base class. Parameters ---------- axes : list of `MapAxis` or `MapAxes` Axes. data : dict of `~numpy.ndarray` or `~numpy.recarray` Data. unit : dict of str or `~astropy.units.Unit` Unit. meta : dict Metadata dictionary. """ @property @abc.abstractmethod def required_parameters(self): return [] @abc.abstractmethod def evaluate_direct(self, rad, **kwargs): pass @abc.abstractmethod def evaluate_containment(self, rad, **kwargs): pass def normalize(self): """Normalize parametric PSF.""" raise NotImplementedError @property def quantity(self): """Parameter quantities as a dict of `~astropy.units.Quantity`.""" quantity = {} for name in self.required_parameters: quantity[name] = self.data[name] * self.unit[name] return quantity @property def unit(self): """Units as a dict of `~astropy.units.Unit`, one entry per parameter.""" return self._unit def to_unit(self, unit): """Convert IRF to unit.""" raise NotImplementedError @property def _interpolators(self): interps = {} for name in self.required_parameters: points = [a.center for a in self.axes] points_scale = tuple([a.interp for a in self.axes]) interps[name] = ScaledRegularGridInterpolator( points, values=self.quantity[name], points_scale=points_scale ) return interps def evaluate_parameters(self, energy_true, offset): """Evaluate analytic PSF parameters at a given energy and offset. The parameters are interpolated on the IRF axes using a `~gammapy.utils.interpolation.ScaledRegularGridInterpolator`. Parameters ---------- energy_true : `~astropy.units.Quantity` Energy value. offset : `~astropy.coordinates.Angle` Offset in the field of view. Returns ------- pars : dict of `~astropy.units.Quantity` Interpolated parameter values. """ pars = {} for name in self.required_parameters: value = self._interpolators[name]((energy_true, offset)) pars[name] = value return pars def to_table(self, format="gadf-dl3"): """Convert PSF data to a table. Parameters ---------- format : {"gadf-dl3"} Format specification. Default is "gadf-dl3". Returns ------- table : `~astropy.table.Table` PSF data in table format. """ from gammapy.irf.io import IRF_DL3_HDU_SPECIFICATION table = self.axes.to_table(format="gadf-dl3") spec = IRF_DL3_HDU_SPECIFICATION[self.tag]["column_name"] for name in self.required_parameters: column_name = spec[name] table[column_name] = self.data[name].T[np.newaxis] table[column_name].unit = self.unit[name] return table @classmethod def from_table(cls, table, format="gadf-dl3"): """Create parametric PSF from `~astropy.table.Table`.
Parameters ---------- table : `~astropy.table.Table` Table information. format : {"gadf-dl3"}, optional Format specification. Default is "gadf-dl3". Returns ------- psf : `~ParametricPSF` PSF class. """ from gammapy.irf.io import IRF_DL3_HDU_SPECIFICATION axes = MapAxes.from_table(table, format=format)[cls.required_axes] dtype = { "names": cls.required_parameters, "formats": len(cls.required_parameters) * (np.float32,), } data = np.empty(axes.shape, dtype=dtype) unit = {} spec = IRF_DL3_HDU_SPECIFICATION[cls.tag]["column_name"] for name in cls.required_parameters: column = table[spec[name]] values = column.data[0].transpose() # This fixes some files where sigma is written as zero if "sigma" in name: values[values == 0] = 1.0 data[name] = values.reshape(axes.shape) unit[name] = column.unit or "" unit = {key: u.Unit(val) for key, val in unit.items()} return cls(axes=axes, data=data, meta=table.meta.copy(), unit=unit) def to_psf3d(self, rad=None): """Create a PSF3D from a parametric PSF. It will be defined on the same energy and offset values as the input PSF. Parameters ---------- rad : `~astropy.units.Quantity` Rad values. Returns ------- psf3d : `~gammapy.irf.PSF3D` 3D PSF. """ from gammapy.datasets.map import RAD_AXIS_DEFAULT from gammapy.irf import PSF3D offset_axis = self.axes["offset"] energy_axis_true = self.axes["energy_true"] if rad is None: rad_axis = RAD_AXIS_DEFAULT else: rad_axis = MapAxis.from_edges(rad, name="rad") axes = MapAxes([energy_axis_true, offset_axis, rad_axis]) data = self.evaluate(**axes.get_coord()) return PSF3D(axes=axes, data=data.value, unit=data.unit, meta=self.meta.copy()) def __str__(self): str_ = f"{self.__class__.__name__}\n" str_ += "-" * len(self.__class__.__name__) + "\n\n" str_ += f"\taxes : {self.axes.names}\n" str_ += f"\tshape : {self.data.shape}\n" str_ += f"\tndim : {len(self.axes)}\n" str_ += f"\tparameters: {self.required_parameters}\n" return str_.expandtabs(tabsize=2) def containment(self, rad, **kwargs): """Containment of the PSF at given axes coordinates. Parameters ---------- rad : `~astropy.units.Quantity` Rad value. **kwargs : dict Other coordinates. Returns ------- containment : `~numpy.ndarray` Containment. """ pars = self.evaluate_parameters(**kwargs) containment = self.evaluate_containment(rad=rad, **pars) return containment def evaluate(self, rad, **kwargs): """Evaluate the PSF model. Parameters ---------- rad : `~astropy.coordinates.Angle` Offset from PSF center used for evaluating the PSF on a grid. **kwargs : dict Other coordinates. Returns ------- psf_value : `~astropy.units.Quantity` PSF value. """ pars = self.evaluate_parameters(**kwargs) value = self.evaluate_direct(rad=rad, **pars) return value def is_allclose(self, other, rtol_axes=1e-3, atol_axes=1e-6, **kwargs): """Compare two data IRFs for equivalency. Parameters ---------- other : `~gammapy.irf.ParametricPSF` The PSF to compare against. rtol_axes : float, optional Relative tolerance for the axis comparison. Default is 1e-3. atol_axes : float, optional Absolute tolerance for the axis comparison. Default is 1e-6. **kwargs : dict Keywords passed to `numpy.allclose`. Returns ------- is_allclose : bool Whether the two IRFs are all close.
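Examples
--------
A minimal sketch; here `psf` stands for any `ParametricPSF` instance already
in memory (mirroring the equality check in the test suite): ::

    from copy import deepcopy

    psf_copy = deepcopy(psf)
    psf.is_allclose(psf_copy)  # True for an unmodified copy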
""" if not isinstance(other, self.__class__): return TypeError(f"Cannot compare {type(self)} and {type(other)}") data_eq = True for key in self.quantity.keys(): if self.quantity[key].shape != other.quantity[key].shape: return False data_eq &= np.allclose(self.quantity[key], other.quantity[key], **kwargs) axes_eq = self.axes.is_allclose(other.axes, rtol=rtol_axes, atol=atol_axes) return axes_eq and data_eq def get_sigmas_and_norms(**kwargs): """Convert scale and amplitude to norms.""" sigmas = u.Quantity([kwargs[f"sigma_{idx}"] for idx in [1, 2, 3]]) scale = kwargs["scale"] ones = np.ones(scale.shape) amplitudes = u.Quantity([ones, kwargs["ampl_2"], kwargs["ampl_3"]]) norms = 2 * scale * amplitudes * sigmas**2 return sigmas, norms class EnergyDependentMultiGaussPSF(ParametricPSF): """Triple Gauss analytical PSF depending on true energy and offset. Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Required axes (in the given order) are: * energy_true (true energy axis) * migra_axis (energy migration axis) * offset_axis (field of view offset axis) data : `~numpy.recarray` Data array. meta : dict Metadata dictionary. Examples -------- Plot R68 of the PSF vs. offset and true energy: .. plot:: :include-source: import matplotlib.pyplot as plt from gammapy.irf import EnergyDependentMultiGaussPSF filename = '$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits' psf = EnergyDependentMultiGaussPSF.read(filename, hdu='POINT SPREAD FUNCTION') psf.plot_containment_radius(fraction=0.68) plt.show() """ tag = "psf_3gauss" required_axes = ["energy_true", "offset"] required_parameters = ["sigma_1", "sigma_2", "sigma_3", "scale", "ampl_2", "ampl_3"] @staticmethod def evaluate_containment(rad, **kwargs): """Containment of the PSF at given axes coordinates. Parameters ---------- rad : `~astropy.units.Quantity` Rad value. **kwargs : dict Parameters, see `required_parameters`. Returns ------- containment : `~numpy.ndarray` Containment. """ sigmas, norms = get_sigmas_and_norms(**kwargs) m = MultiGauss2D(sigmas=sigmas, norms=norms) m.normalize() containment = m.containment_fraction(rad) return containment @staticmethod def evaluate_direct(rad, **kwargs): """Evaluate PSF model. Parameters ---------- rad : `~astropy.units.Quantity` Rad value. **kwargs : dict Parameters, see `required_parameters`. Returns ------- value : `~numpy.ndarray` PSF value. """ sigmas, norms = get_sigmas_and_norms(**kwargs) m = MultiGauss2D(sigmas=sigmas, norms=norms) m.normalize() return m(rad) class PSFKing(ParametricPSF): """King profile analytical PSF depending on energy and offset. This PSF parametrisation and FITS data format is described here: :ref:`gadf:psf_king`. Parameters ---------- axes : list of `MapAxis` or `MapAxes` Data axes, required are ["energy_true", "offset"]. meta : dict Metadata dictionary. """ tag = "psf_king" required_axes = ["energy_true", "offset"] required_parameters = ["gamma", "sigma"] default_interp_kwargs = dict(bounds_error=False, fill_value=None) @staticmethod def evaluate_containment(rad, gamma, sigma): """Containment of the PSF at given axes coordinates. Parameters ---------- rad : `~astropy.units.Quantity` Rad value. gamma : `~astropy.units.Quantity` Gamma parameter. sigma : `~astropy.units.Quantity` Sigma parameter. Returns ------- containment : `~numpy.ndarray` Containment. 
""" with np.errstate(divide="ignore", invalid="ignore"): powterm = 1 - gamma term = (1 + rad**2 / (2 * gamma * sigma**2)) ** powterm containment = 1 - term return containment @staticmethod def evaluate_direct(rad, gamma, sigma): """Evaluate the PSF model. Formula is given here: :ref:`gadf:psf_king`. Parameters ---------- rad : `~astropy.coordinates.Angle` Offset from PSF center used for evaluating the PSF on a grid. Returns ------- psf_value : `~astropy.units.Quantity` PSF value. """ with np.errstate(divide="ignore"): term1 = 1 / (2 * np.pi * sigma**2) term2 = 1 - 1 / gamma term3 = (1 + rad**2 / (2 * gamma * sigma**2)) ** (-gamma) return term1 * term2 * term3 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/table.py0000644000175100001770000000150314721316200017054 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import astropy.units as u from .core import PSF __all__ = ["PSF3D"] log = logging.getLogger(__name__) class PSF3D(PSF): """PSF with axes: energy, offset, rad. Data format specification: :ref:`gadf:psf_table` Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Required axes (in the given order) are: * energy_true (true energy axis) * migra (energy migration axis) * rad (rad axis) data : `~astropy.units.Quantity` PSF (3-dim with axes: psf[rad_index, offset_index, energy_index]. meta : dict Metadata dictionary. """ tag = "psf_table" required_axes = ["energy_true", "offset", "rad"] default_unit = u.sr**-1 ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.208642 gammapy-1.3/gammapy/irf/psf/tests/0000755000175100001770000000000014721316215016564 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/tests/__init__.py0000644000175100001770000000010014721316200020656 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.208642 gammapy-1.3/gammapy/irf/psf/tests/data/0000755000175100001770000000000014721316215017475 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/tests/data/psf_info.txt0000644000175100001770000000106614721316200022036 0ustar00runnerdocker Summary PSF info ---------------- Theta : size = 12, min = 0.000 deg, max = 2.200 deg Energy hi : size = 15, min = 0.158 TeV, max = 100.000 TeV Energy lo : size = 15, min = 0.100 TeV, max = 63.096 TeV 68.00 containment radius at offset = 0.0 deg and energy_true = 1.0 TeV: 0.148 deg 95.00 containment radius at offset = 0.0 deg and energy_true = 1.0 TeV: 0.315 deg 68.00 containment radius at offset = 0.0 deg and energy_true = 10.0 TeV: 0.167 deg 95.00 containment radius at offset = 0.0 deg and energy_true = 10.0 TeV: 0.355 deg ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/tests/test_kernel.py0000644000175100001770000000563514721316200021460 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from gammapy.irf import PSFKernel from gammapy.maps import MapAxis, WcsGeom from gammapy.modeling.models import DiskSpatialModel, PowerLawSpectralModel 
from gammapy.utils.testing import mpl_plot_check from gammapy.utils.deprecation import GammapyDeprecationWarning @pytest.fixture def kernel_gaussian(): sigma = 0.5 * u.deg binsz = 0.1 * u.deg geom = WcsGeom.create( binsz=binsz, npix=150, axes=[MapAxis((0, 1, 2), unit=u.TeV, name="energy_true")] ) return PSFKernel.from_gauss(geom, sigma) def test_psf_kernel_from_gauss(kernel_gaussian): # Check that both maps are identical assert_allclose( kernel_gaussian.psf_kernel_map.data[0], kernel_gaussian.psf_kernel_map.data[1] ) # Is there an odd number of pixels assert_allclose(np.array(kernel_gaussian.psf_kernel_map.geom.npix) % 2, 1) def test_psf_kernel_read_write(kernel_gaussian, tmp_path): kernel_gaussian.write(tmp_path / "tmp.fits", overwrite=True) kernel2 = PSFKernel.read(tmp_path / "tmp.fits") assert_allclose(kernel_gaussian.psf_kernel_map.data, kernel2.psf_kernel_map.data) def test_psf_kernel_to_image(): sigma1 = 0.5 * u.deg sigma2 = 0.2 * u.deg binsz = 0.1 * u.deg axis = MapAxis.from_energy_bounds(1, 10, 2, unit="TeV", name="energy_true") geom = WcsGeom.create(binsz=binsz, npix=50, axes=[axis]) disk_1 = DiskSpatialModel(r_0=sigma1) disk_2 = DiskSpatialModel(r_0=sigma2) rad_max = 2.5 * u.deg kernel1 = PSFKernel.from_spatial_model(disk_1, geom, max_radius=rad_max, factor=4) kernel2 = PSFKernel.from_spatial_model(disk_2, geom, max_radius=rad_max, factor=4) with pytest.warns(GammapyDeprecationWarning): kernel1.to_image(spectrum=PowerLawSpectralModel()) kernel1.psf_kernel_map.data[1, :, :] = kernel2.psf_kernel_map.data[1, :, :] kernel_image_1 = kernel1.to_image() kernel_image_2 = kernel1.to_image(exposure=[1, 2]) assert_allclose(kernel_image_1.psf_kernel_map.data.sum(), 1.0, atol=1e-5) assert_allclose(kernel_image_1.psf_kernel_map.data[0, 25, 25], 0.028096, atol=1e-5) assert_allclose(kernel_image_1.psf_kernel_map.data[0, 22, 22], 0.009615, atol=1e-5) assert_allclose(kernel_image_1.psf_kernel_map.data[0, 20, 20], 0.0, atol=1e-5) assert_allclose(kernel_image_2.psf_kernel_map.data.sum(), 1.0, atol=1e-5) assert_allclose(kernel_image_2.psf_kernel_map.data[0, 25, 25], 0.037555, atol=1e-5) assert_allclose(kernel_image_2.psf_kernel_map.data[0, 22, 22], 0.007752, atol=1e-5) assert_allclose(kernel_image_2.psf_kernel_map.data[0, 20, 20], 0.0, atol=1e-5) def test_plot_kernel(kernel_gaussian): with mpl_plot_check(): kernel_gaussian.plot_kernel() def test_peek(kernel_gaussian): with mpl_plot_check(): kernel_gaussian.peek() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/tests/test_map.py0000644000175100001770000004765714721316200020767 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord from astropy.units import Unit from gammapy.data import DataStore from gammapy.irf import PSF3D, EffectiveAreaTable2D, PSFMap, RecoPSFMap from gammapy.makers.utils import make_map_exposure_true_energy, make_psf_map from gammapy.maps import Map, MapAxis, MapCoord, RegionGeom, WcsGeom from gammapy.utils.testing import mpl_plot_check, requires_data from gammapy.modeling.models import PowerLawSpectralModel from gammapy.utils.deprecation import GammapyDeprecationWarning @pytest.fixture(scope="session") def data_store(): return DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") def fake_psf3d(sigma=0.15 * u.deg, shape="gauss"): offset_axis = MapAxis.from_nodes([0, 1, 2, 3] * u.deg, 
name="offset") energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "10 TeV", nbin=4, name="energy_true" ) rad = np.linspace(0, 1.0, 101) * u.deg rad_axis = MapAxis.from_edges(rad, name="rad") O, R, E = np.meshgrid(offset_axis.center, rad_axis.edges, energy_axis_true.center) Rmid = 0.5 * (R[:-1] + R[1:]) if shape == "gauss": val = np.exp(-0.5 * Rmid**2 / sigma**2) else: val = Rmid < sigma drad = 2 * np.pi * (np.cos(R[:-1]) - np.cos(R[1:])) * u.Unit("sr") psf_value = val / ((val * drad).sum(0)[0]) return PSF3D( axes=[energy_axis_true, offset_axis, rad_axis], data=psf_value.T.value, unit=psf_value.unit, ) def fake_aeff2d(area=1e6 * u.m**2): energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "10 TeV", nbin=4, name="energy_true" ) offset_axis = MapAxis.from_edges([0.0, 1.0, 2.0, 3.0] * u.deg, name="offset") return EffectiveAreaTable2D( axes=[energy_axis_true, offset_axis], data=area.value, unit=area.unit ) def test_make_psf_map(): psf = fake_psf3d(0.3 * u.deg) pointing = SkyCoord(0, 0, unit="deg") energy_axis = MapAxis( nodes=[0.2, 0.7, 1.5, 2.0, 10.0], unit="TeV", name="energy_true" ) rad_axis = MapAxis(nodes=np.linspace(0.0, 1.0, 51), unit="deg", name="rad") geom = WcsGeom.create( skydir=pointing, binsz=0.2, width=5, axes=[rad_axis, energy_axis] ) psfmap = make_psf_map(psf, pointing, geom) assert psfmap.psf_map.geom.axes[0] == rad_axis assert psfmap.psf_map.geom.axes[1] == energy_axis assert psfmap.psf_map.unit == "deg-2" assert psfmap.psf_map.data.shape == (4, 50, 25, 25) def make_test_psfmap(size, shape="gauss"): psf = fake_psf3d(size, shape) aeff2d = fake_aeff2d() pointing = SkyCoord(0, 0, unit="deg") energy_axis = MapAxis( nodes=[0.2, 0.7, 1.5, 2.0, 10.0], unit="TeV", name="energy_true" ) rad_axis = MapAxis.from_edges( edges=np.linspace(0.0, 1, 101), unit="deg", name="rad" ) geom = WcsGeom.create( skydir=pointing, binsz=0.2, width=5, axes=[rad_axis, energy_axis] ) exposure_geom = geom.squash(axis_name="rad") exposure_map = make_map_exposure_true_energy(pointing, "1 h", aeff2d, exposure_geom) return make_psf_map(psf, pointing, geom, exposure_map) def test_psf_map_containment_radius(): psf_map = make_test_psfmap(0.15 * u.deg) psf = fake_psf3d(0.15 * u.deg) position = SkyCoord(0, 0, unit="deg") # Check that containment radius is consistent between psf_table and psf3d assert_allclose( psf_map.containment_radius( energy_true=1 * u.TeV, position=position, fraction=0.9 ), psf.containment_radius(energy_true=1 * u.TeV, offset=0 * u.deg, fraction=0.9), rtol=1e-2, ) assert_allclose( psf_map.containment_radius( energy_true=1 * u.TeV, position=position, fraction=0.5 ), psf.containment_radius(energy_true=1 * u.TeV, offset=0 * u.deg, fraction=0.5), rtol=1e-2, ) def test_psf_map_containment(): psf_map = make_test_psfmap(0.15 * u.deg) assert_allclose(psf_map.containment(rad=10 * u.deg, energy_true=[10] * u.TeV), 1) def test_psf_map_cutout(): psf_map = make_test_psfmap(0.15 * u.deg) psf_map_cutout = psf_map.cutout( position=SkyCoord(1, 1, unit="deg"), width=(1, 0.01) ) assert_allclose(psf_map_cutout.psf_map.geom.data_shape, (4, 100, 3, 5)) def test_psfmap_to_psf_kernel(): psfmap = make_test_psfmap(0.15 * u.deg) energy_axis = psfmap.psf_map.geom.axes[1] # create PSFKernel kern_geom = WcsGeom.create(binsz=0.02, width=5.0, axes=[energy_axis]) psfkernel = psfmap.get_psf_kernel( position=SkyCoord(1, 1, unit="deg"), geom=kern_geom, max_radius=1 * u.deg ) assert_allclose(psfkernel.psf_kernel_map.geom.width, 2.02 * u.deg) assert_allclose(psfkernel.psf_kernel_map.data.sum(axis=(1, 2)), 1.0, atol=1e-7) 
psfkernel = psfmap.get_psf_kernel( position=SkyCoord(1, 1, unit="deg"), geom=kern_geom, ) assert_allclose(psfkernel.psf_kernel_map.geom.width, 1.14 * u.deg) assert_allclose(psfkernel.psf_kernel_map.data.sum(axis=(1, 2)), 1.0, atol=1e-7) def test_psfmap_to_from_hdulist(): psfmap = make_test_psfmap(0.15 * u.deg) hdulist = psfmap.to_hdulist() assert "PSF" in hdulist assert "PSF_BANDS" in hdulist assert "PSF_EXPOSURE" in hdulist assert "PSF_EXPOSURE_BANDS" in hdulist new_psfmap = PSFMap.from_hdulist(hdulist) assert_allclose(psfmap.psf_map.data, new_psfmap.psf_map.data) assert new_psfmap.psf_map.geom == psfmap.psf_map.geom assert new_psfmap.exposure_map.geom == psfmap.exposure_map.geom def test_psfmap_read_write(tmp_path): psfmap = make_test_psfmap(0.15 * u.deg) psfmap.write(tmp_path / "tmp.fits") new_psfmap = PSFMap.read(tmp_path / "tmp.fits") assert_allclose(psfmap.psf_map.quantity, new_psfmap.psf_map.quantity) def test_containment_radius_map(): psf = fake_psf3d(0.15 * u.deg) pointing = SkyCoord(0, 0, unit="deg") energy_axis = MapAxis(nodes=[0.2, 1, 2], unit="TeV", name="energy_true") psf_theta_axis = MapAxis(nodes=np.linspace(0.0, 0.6, 30), unit="deg", name="rad") geom = WcsGeom.create( skydir=pointing, binsz=0.5, width=(4, 3), axes=[psf_theta_axis, energy_axis] ) psfmap = make_psf_map(psf=psf, pointing=pointing, geom=geom) m = psfmap.containment_radius_map(energy_true=1 * u.TeV) coord = SkyCoord(0.3, 0, unit="deg") val = m.interp_by_coord(coord) assert_allclose(val, 0.226477, rtol=1e-2) def test_psfmap_stacking(): psfmap1 = make_test_psfmap(0.1 * u.deg, shape="flat") psfmap2 = make_test_psfmap(0.1 * u.deg, shape="flat") psfmap2.exposure_map.quantity *= 2 psfmap_stack = psfmap1.copy() psfmap_stack.stack(psfmap2) mask = psfmap_stack.psf_map.data > 0 assert_allclose(psfmap_stack.psf_map.data[mask], psfmap1.psf_map.data[mask]) assert_allclose(psfmap_stack.exposure_map.data, psfmap1.exposure_map.data * 3) psfmap3 = make_test_psfmap(0.3 * u.deg, shape="flat") psfmap_stack = psfmap1.copy() psfmap_stack.stack(psfmap3) assert_allclose(psfmap_stack.psf_map.data[0, 40, 20, 20], 0.0) assert_allclose(psfmap_stack.psf_map.data[0, 20, 20, 20], 1.768388, rtol=1e-6) assert_allclose(psfmap_stack.psf_map.data[0, 0, 20, 20], 17.683883, rtol=1e-6) # TODO: add a test comparing make_mean_psf and PSFMap.stack for a set of # observations in an Observations def test_sample_coord(): psf_map = make_test_psfmap(0.1 * u.deg, shape="gauss") coords_in = MapCoord( {"lon": [0, 0] * u.deg, "lat": [0, 0.5] * u.deg, "energy_true": [1, 3] * u.TeV}, frame="icrs", ) coords = psf_map.sample_coord(map_coord=coords_in) assert coords.frame == "icrs" assert len(coords.lon) == 2 assert_allclose(coords.lon, [0.074855, 0.042655], rtol=1e-3) assert_allclose(coords.lat, [-0.101561, 0.347365], rtol=1e-3) def test_sample_coord_gauss(): psf_map = make_test_psfmap(0.1 * u.deg, shape="gauss") lon, lat = np.zeros(10000) * u.deg, np.zeros(10000) * u.deg energy = np.ones(10000) * u.TeV coords_in = MapCoord.create( {"lon": lon, "lat": lat, "energy_true": energy}, frame="icrs" ) coords = psf_map.sample_coord(coords_in) assert_allclose(np.mean(coords.skycoord.data.lon.wrap_at("180d").deg), 0, atol=2e-3) assert_allclose(np.mean(coords.lat), 0, atol=2e-3) def make_psf_map_obs(geom, obs): exposure_map = make_map_exposure_true_energy( geom=geom.squash(axis_name="rad"), pointing=obs.get_pointing_icrs(obs.tmid), aeff=obs.aeff, livetime=obs.observation_live_time_duration, ) psf_map = make_psf_map( geom=geom, psf=obs.psf, 
pointing=obs.get_pointing_icrs(obs.tmid), exposure_map=exposure_map, ) return psf_map @requires_data() @pytest.mark.parametrize( "pars", [ { "energy": None, "rad": None, "energy_shape": 32, "psf_energy": 0.8659643, "rad_shape": 144, "psf_rad": 0.0015362848, "psf_exposure": 3.14711e12 * u.Unit("cm2 s"), "psf_value_shape": (32, 144), "psf_value": 4369.96391 * u.Unit("sr-1"), }, { "energy": MapAxis.from_energy_bounds(1, 10, 100, "TeV", name="energy_true"), "rad": None, "energy_shape": 100, "psf_energy": 1.428893959, "rad_shape": 144, "psf_rad": 0.0015362848, "psf_exposure": 4.723409e12 * u.Unit("cm2 s"), "psf_value_shape": (100, 144), "psf_value": 3714.303683 * u.Unit("sr-1"), }, { "energy": None, "rad": MapAxis.from_nodes(np.arange(0, 2, 0.002), unit="deg", name="rad"), "energy_shape": 32, "psf_energy": 0.8659643, "rad_shape": 1000, "psf_rad": 0.000524, "psf_exposure": 3.14711e12 * u.Unit("cm2 s"), "psf_value_shape": (32, 1000), "psf_value": 7.902016 * u.Unit("deg-2"), }, { "energy": MapAxis.from_energy_bounds(1, 10, 100, "TeV", name="energy_true"), "rad": MapAxis.from_nodes(np.arange(0, 2, 0.002), unit="deg", name="rad"), "energy_shape": 100, "psf_energy": 1.428893959, "rad_shape": 1000, "psf_rad": 0.000524, "psf_exposure": 4.723409e12 * u.Unit("cm2 s"), "psf_value_shape": (100, 1000), "psf_value": 6.868102 * u.Unit("deg-2"), }, ], ) def test_make_psf(pars, data_store): obs = data_store.obs(23523) psf = obs.psf if pars["energy"] is None: energy_axis = psf.axes["energy_true"] else: energy_axis = pars["energy"] if pars["rad"] is None: rad_axis = psf.axes["rad"] else: rad_axis = pars["rad"] position = SkyCoord(83.63, 22.01, unit="deg") geom = WcsGeom.create( skydir=position, npix=(3, 3), axes=[rad_axis, energy_axis], binsz=0.2 ) psf_map = make_psf_map_obs(geom, obs) psf = psf_map.to_region_nd_map(position) axis = psf.psf_map.geom.axes["energy_true"] assert axis.unit == "TeV" assert axis.nbin == pars["energy_shape"] assert_allclose(axis.center.value[15], pars["psf_energy"], rtol=1e-3) rad_axis = psf.psf_map.geom.axes["rad"] assert rad_axis.unit == "deg" assert rad_axis.nbin == pars["rad_shape"] assert_allclose(rad_axis.center.to_value("rad")[15], pars["psf_rad"], rtol=1e-3) exposure = psf.exposure_map.quantity.squeeze() assert exposure.unit == "m2 s" assert exposure.shape == (pars["energy_shape"],) assert_allclose(exposure[15], pars["psf_exposure"], rtol=1e-3) data = psf.psf_map.quantity.squeeze() assert data.unit == "deg-2" assert data.shape == pars["psf_value_shape"] assert_allclose(data[15, 50], pars["psf_value"], rtol=1e-3) @requires_data() def test_make_mean_psf(data_store): observations = data_store.get_observations([23523, 23526]) position = SkyCoord(83.63, 22.01, unit="deg") psf = observations[0].psf geom = WcsGeom.create( skydir=position, npix=(3, 3), axes=psf.axes[["rad", "energy_true"]], binsz=0.2, ) psf_map_1 = make_psf_map_obs(geom, observations[0]) psf_map_2 = make_psf_map_obs(geom, observations[1]) stacked_psf = psf_map_1.copy() stacked_psf.stack(psf_map_2) psf = stacked_psf.to_region_nd_map(position).psf_map assert not np.isnan(psf.quantity.squeeze()).any() assert_allclose(psf.quantity.squeeze()[22, 22], 12206.1665 / u.sr, rtol=1e-3) @requires_data() @pytest.mark.parametrize("position", ["0d 0d", "180d 0d", "0d 90d", "180d -90d"]) def test_psf_map_read(position): position = SkyCoord(position) filename = "$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_psf_gc.fits.gz" psf = PSFMap.read(filename, format="gtpsf") value = psf.containment(position=position, energy_true=100 * u.GeV, rad=0.1 * 
u.deg) assert_allclose(value, 0.682022, rtol=1e-5) assert psf.psf_map.unit == "sr-1" def test_psf_map_write_gtpsf(tmpdir): energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=3, name="energy_true" ) geom = RegionGeom.create("icrs;circle(0, 0, 0.1)") psf = PSFMap.from_gauss( energy_axis_true=energy_axis_true, sigma=[0.1, 0.2, 0.3] * u.deg, geom=geom ) psf.exposure_map = Map.from_geom(geom.to_cube([energy_axis_true]), unit="cm2 s") filename = tmpdir / "test_psf.fits" psf.write(filename, format="gtpsf") psf = PSFMap.read(filename, format="gtpsf") value = psf.containment_radius(energy_true=energy_axis_true.center, fraction=0.394) assert_allclose(value, [0.1, 0.2, 0.3] * u.deg, rtol=1e-5) assert psf.psf_map.unit == "sr-1" def test_to_image(): psfmap = make_test_psfmap(0.15 * u.deg) with pytest.warns(GammapyDeprecationWarning): psfmap.to_image(spectrum=PowerLawSpectralModel()) psf2D = psfmap.to_image() assert_allclose(psf2D.psf_map.geom.data_shape, (1, 100, 25, 25)) assert_allclose(psf2D.exposure_map.geom.data_shape, (1, 1, 25, 25)) assert_allclose(psf2D.psf_map.data[0][0][12][12], 7.068315, rtol=1e-2) def test_psf_map_from_gauss(): energy_axis = MapAxis.from_nodes( [1, 3, 10], name="energy_true", interp="log", unit="TeV" ) rad = np.linspace(0, 1.5, 100) * u.deg rad_axis = MapAxis.from_nodes(rad, name="rad", unit="deg") # define one sigma value per energy node sigma = [0.1, 0.2, 0.4] * u.deg # with energy-dependent sigma psfmap = PSFMap.from_gauss(energy_axis, rad_axis, sigma) assert psfmap.psf_map.geom.axes[0] == rad_axis assert psfmap.psf_map.geom.axes[1] == energy_axis assert psfmap.exposure_map.geom.axes["rad"].nbin == 1 assert psfmap.exposure_map.geom.axes["energy_true"] == psfmap.psf_map.geom.axes[1] assert psfmap.psf_map.unit == "sr-1" assert psfmap.psf_map.data.shape == (3, 100, 1, 2) radius = psfmap.containment_radius(fraction=0.394, energy_true=[1, 3, 10] * u.TeV) assert_allclose(radius, sigma, rtol=0.01) # test that it won't work with different number of sigmas and energies with pytest.raises(ValueError): PSFMap.from_gauss(energy_axis, rad_axis, sigma=[1, 2] * u.deg) def test_psf_map_from_gauss_const_sigma(): energy_axis = MapAxis.from_nodes( [1, 3, 10], name="energy_true", interp="log", unit="TeV" ) rad = np.linspace(0, 1.5, 100) * u.deg rad_axis = MapAxis.from_nodes(rad, name="rad", unit="deg") # with constant sigma psfmap = PSFMap.from_gauss(energy_axis, rad_axis, sigma=0.1 * u.deg) assert psfmap.psf_map.geom.axes[0] == rad_axis assert psfmap.psf_map.geom.axes[1] == energy_axis assert psfmap.psf_map.unit == Unit("sr-1") assert psfmap.psf_map.data.shape == (3, 100, 1, 2) radius = psfmap.containment_radius(energy_true=[1, 3, 10] * u.TeV, fraction=0.394) assert_allclose(radius, 0.1 * u.deg, rtol=0.01) @requires_data() def test_psf_map_plot_containment_radius(): filename = "$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_psf_gc.fits.gz" psf = PSFMap.read(filename, format="gtpsf") with mpl_plot_check(): psf.plot_containment_radius_vs_energy() @requires_data() def test_psf_map_plot_psf_vs_rad(): filename = "$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_psf_gc.fits.gz" psf = PSFMap.read(filename, format="gtpsf") with mpl_plot_check(): psf.plot_psf_vs_rad() @requires_data() def test_psf_containment_coords(): # regression test to check the coordinate conversion for PSFMap.containment psf = PSFMap.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz", hdu="PSF") position = SkyCoord("266.415d", "-29.006d", frame="icrs") radius = psf.containment_radius( energy_true=1 * u.TeV,
fraction=0.99, position=position ) assert_allclose(radius, 0.10575 * u.deg, rtol=1e-5) @requires_data() def test_peek(): psf_map = PSFMap.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz", hdu="PSF") with mpl_plot_check(): psf_map.peek() def test_psf_map_reco(tmpdir): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3, name="energy") geom = RegionGeom.create("icrs;circle(0, 0, 0.1)") psf_map = RecoPSFMap.from_gauss( energy_axis=energy_axis, sigma=[0.1, 0.2, 0.3] * u.deg, geom=geom ) filename = tmpdir / "test_psf_reco.fits" psf_map.write(filename, format="gadf") psf_map = RecoPSFMap.read(filename, format="gadf") assert psf_map.psf_map.unit == "sr-1" assert "energy" in psf_map.psf_map.geom.axes.names assert psf_map.energy_name == "energy" assert psf_map.required_axes == ["rad", "energy"] value = psf_map.containment(rad=0.1, energy=energy_axis.center) assert_allclose(value, [0.3938, 0.1175, 0.0540], rtol=1e-2) value = psf_map.containment_radius(energy=energy_axis.center, fraction=0.394) assert_allclose(value, [0.1, 0.2, 0.3] * u.deg, rtol=1e-2) value = psf_map.containment_radius_map(energy=1 * u.TeV, fraction=0.394) assert_allclose(value.data[0], 0.11875, rtol=1e-2) kern_geom = WcsGeom.create(binsz=0.02, width=5.0, axes=[energy_axis]) psfkernel = psf_map.get_psf_kernel( position=SkyCoord(1, 1, unit="deg"), geom=kern_geom, max_radius=1 * u.deg ) assert "energy" in kern_geom.axes.names psfkernel.to_image() psf_map.to_image() coords_in = MapCoord( {"lon": [0, 0] * u.deg, "lat": [0, 0.5] * u.deg, "energy": [1, 3] * u.TeV}, frame="icrs", ) coords = psf_map.sample_coord(map_coord=coords_in) assert coords.frame == "icrs" assert len(coords.lon) == 2 with mpl_plot_check(): psf_map.plot_containment_radius_vs_energy() with mpl_plot_check(): psf_map.plot_psf_vs_rad() @requires_data() def test_psf_map_reco_hawc(): filename = ( "$GAMMAPY_DATA/hawc/crab_events_pass4/irfs/PSFMap_Crab_fHitbin5NN.fits.gz" ) reco_psf_map = RecoPSFMap.read(filename, format="gadf") assert "energy" in reco_psf_map.psf_map.geom.axes.names assert reco_psf_map.energy_name == "energy" assert reco_psf_map.required_axes == ["rad", "energy"] assert reco_psf_map.exposure_map is None with mpl_plot_check(): reco_psf_map.plot_containment_radius_vs_energy() with mpl_plot_check(): reco_psf_map.plot_psf_vs_rad() assert_allclose( reco_psf_map.containment_radius(0.68, [1, 2] * u.TeV), [0.001, 0.43733357] * u.deg, ) reco_psf_map = PSFMap.read(filename, format="gadf") assert isinstance(reco_psf_map, RecoPSFMap) assert "energy" in reco_psf_map.psf_map.geom.axes.names assert reco_psf_map.energy_name == "energy" assert reco_psf_map.required_axes == ["rad", "energy"] assert reco_psf_map.exposure_map is None with mpl_plot_check(): reco_psf_map.plot_containment_radius_vs_energy() with mpl_plot_check(): reco_psf_map.plot_psf_vs_rad() assert_allclose( reco_psf_map.containment_radius(0.68, [1, 2] * u.TeV), [0.001, 0.43733357] * u.deg, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/tests/test_parametric.py0000644000175100001770000001345714721316200022330 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from copy import deepcopy import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import Angle from astropy.io import fits from astropy.utils.data import get_pkg_data_filename from gammapy.irf import EnergyDependentMultiGaussPSF, PSFKing from 
gammapy.utils.testing import mpl_plot_check, requires_data @requires_data() class TestEnergyDependentMultiGaussPSF: @pytest.fixture(scope="session") def psf(self): filename = "$GAMMAPY_DATA/tests/unbundled/irfs/psf.fits" return EnergyDependentMultiGaussPSF.read(filename, hdu="POINT SPREAD FUNCTION") def test_info(self, psf): info_str = open(get_pkg_data_filename("./data/psf_info.txt")).read() print(psf.info()) assert psf.info() == info_str def test_write(self, tmp_path, psf): psf.write(tmp_path / "tmp.fits") with fits.open(tmp_path / "tmp.fits", memmap=False) as hdu_list: assert len(hdu_list) == 2 def test_to_table_psf(self, psf): energy = 1 * u.TeV theta = 0 * u.deg containment = [0.68, 0.8, 0.9] desired = psf.containment_radius( energy_true=energy, offset=theta, fraction=containment ) assert_allclose(desired, [0.14775, 0.18675, 0.25075] * u.deg, rtol=1e-3) def test_to_unit(self, psf): with pytest.raises(NotImplementedError): psf.to_unit("deg-2") def test_to_psf3d(self, psf): rads = np.linspace(0.0, 1.0, 101) * u.deg psf_3d = psf.to_psf3d(rads) rad_axis = psf_3d.axes["rad"] assert rad_axis.nbin == 100 assert rad_axis.unit == "deg" theta = 0.5 * u.deg energy = 0.5 * u.TeV containment = [0.68, 0.8, 0.9] desired = psf.containment_radius( energy_true=energy, offset=theta, fraction=containment ) actual = psf_3d.containment_radius( energy_true=energy, offset=theta, fraction=containment ) assert_allclose(np.squeeze(desired), actual, atol=0.005) # test default case psf_3d_def = psf.to_psf3d() assert psf_3d_def.axes["rad"].nbin == 66 def test_eq(self, psf): psf1 = deepcopy(psf) assert psf1 == psf psf1.data[0][0] = 10 assert not psf1 == psf def test_peek(self, psf): with mpl_plot_check(): psf.peek() @requires_data() def test_psf_cta_1dc(): filename = ( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) psf_irf = EnergyDependentMultiGaussPSF.read(filename, hdu="POINT SPREAD FUNCTION") # Check that PSF is filled with 0 for energy / offset where no PSF info is given. # This is needed so that stacked PSF computation doesn't error out, # trying to interpolate for observations / energies where this occurs. 
value = psf_irf.evaluate( energy_true=0.05 * u.TeV, rad=0.03 * u.deg, offset=4.5 * u.deg ) assert_allclose(value, 0 * u.Unit("deg-2")) # Check that evaluation works for an energy / offset where an energy is available radius = psf_irf.containment_radius( fraction=0.68, energy_true=1 * u.TeV, offset=2 * u.deg ) assert_allclose(radius, 0.052841 * u.deg, atol=1e-4) @requires_data() def test_get_sigmas_and_norms(): filename = "$GAMMAPY_DATA/cta-caldb/Prod5-South-20deg-AverageAz-14MSTs37SSTs.180000s-v0.1.fits.gz" # noqa: E501 psf_irf = EnergyDependentMultiGaussPSF.read(filename, hdu="POINT SPREAD FUNCTION") value = psf_irf.evaluate( energy_true=1 * u.TeV, rad=0.03 * u.deg, offset=3.5 * u.deg ) assert_allclose(value, 78.25826069 * u.Unit("deg-2")) @pytest.fixture(scope="session") def psf_king(): return PSFKing.read("$GAMMAPY_DATA/tests/hess_psf_king_023523.fits.gz") @requires_data() def test_psf_king_evaluate(psf_king): param_off1 = psf_king.evaluate_parameters(energy_true=1 * u.TeV, offset=0 * u.deg) param_off2 = psf_king.evaluate_parameters(energy_true=1 * u.TeV, offset=1 * u.deg) assert_allclose(param_off1["gamma"].value, 1.733179, rtol=1e-5) assert_allclose(param_off2["gamma"].value, 1.812795, rtol=1e-5) assert_allclose(param_off1["sigma"], 0.040576 * u.deg, rtol=1e-5) assert_allclose(param_off2["sigma"], 0.040765 * u.deg, rtol=1e-5) @requires_data() def test_psf_king_containment_radius(psf_king): radius = psf_king.containment_radius( fraction=0.68, energy_true=1 * u.TeV, offset=0.0 * u.deg ) assert_allclose(radius, 0.14575 * u.deg, rtol=1e-5) @requires_data() def test_psf_king_evaluate_2(psf_king): theta1 = Angle(0, "deg") theta2 = Angle(1, "deg") rad = Angle(1, "deg") # energy = Quantity(1, "TeV") match with bin number 8 # offset equal 1 degree match with the bin 200 in the psf_table value_off1 = psf_king.evaluate(rad=rad, energy_true=1 * u.TeV, offset=theta1) value_off2 = psf_king.evaluate(rad=rad, energy_true=1 * u.TeV, offset=theta2) # Test that the value at 1 degree in the histogram for the energy 1 Tev and # theta=0 or 1 degree is equal to the one obtained from the self.evaluate_direct() # method at 1 degree assert_allclose(0.005234 * u.Unit("deg-2"), value_off1, rtol=1e-4) assert_allclose(0.004015 * u.Unit("deg-2"), value_off2, rtol=1e-4) @requires_data() def test_psf_king_write(psf_king, tmp_path): psf_king.write(tmp_path / "tmp.fits") psf_king2 = PSFKing.read(tmp_path / "tmp.fits") assert_allclose( psf_king2.axes["energy_true"].edges, psf_king.axes["energy_true"].edges ) assert_allclose(psf_king2.axes["offset"].center, psf_king.axes["offset"].center) assert_allclose(psf_king2.data["gamma"], psf_king.data["gamma"]) assert_allclose(psf_king2.data["sigma"], psf_king.data["sigma"]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/psf/tests/test_table.py0000644000175100001770000000601014721316200021253 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from gammapy.irf import PSF3D from gammapy.maps import MapAxis from gammapy.utils.testing import mpl_plot_check, requires_data @pytest.fixture(scope="session") def psf_3d(): filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" return PSF3D.read(filename, hdu="PSF") def test_psf_3d_wrong_units(): energy_axis = MapAxis.from_energy_edges([80, 125] * u.GeV, name="energy_true") offset_axis = MapAxis.from_edges([0, 
1, 2], unit="deg", name="offset") rad_axis = MapAxis.from_edges([0, 1, 2], unit="deg", name="rad") wrong_unit = u.cm**2 * u.s data = np.ones((energy_axis.nbin, offset_axis.nbin, rad_axis.nbin)) * wrong_unit psf3d_test = PSF3D(axes=[energy_axis, offset_axis, rad_axis]) with pytest.raises(ValueError) as error: PSF3D(axes=[energy_axis, offset_axis, rad_axis], data=data) assert error.match( f"Error: {wrong_unit} is not an allowed unit. {psf3d_test.tag} requires " f"{psf3d_test.default_unit} data quantities." ) @requires_data() def test_psf_3d_basics(psf_3d): rad_axis = psf_3d.axes["rad"] assert_allclose(rad_axis.edges[-2].value, 0.659048, rtol=1e-5) assert rad_axis.nbin == 144 assert rad_axis.unit == "deg" energy_axis_true = psf_3d.axes["energy_true"] assert_allclose(energy_axis_true.edges[0].value, 0.01) assert energy_axis_true.nbin == 32 assert energy_axis_true.unit == "TeV" assert psf_3d.data.shape == (32, 6, 144) assert psf_3d.unit == "sr-1" psf_3d_new_unit = psf_3d.to_unit("deg-2") assert_allclose(psf_3d_new_unit.data, psf_3d.data / 3282.8063, rtol=1e-6) assert_allclose(psf_3d.meta["LO_THRES"], 0.01) assert "PSF3D" in str(psf_3d) with pytest.raises(ValueError): PSF3D(axes=psf_3d.axes, data=psf_3d.data.T) @requires_data() def test_psf_3d_evaluate(psf_3d): q = psf_3d.evaluate(energy_true="1 TeV", offset="0.3 deg", rad="0.1 deg") assert_allclose(q.value, 25847.249548) # TODO: is this the shape we want here? assert q.shape == () assert q.unit == "sr-1" @requires_data() def test_psf_3d_containment_radius(psf_3d): q = psf_3d.containment_radius( energy_true=1 * u.TeV, fraction=0.68, offset=0 * u.deg ) assert_allclose(q.value, 0.123352, rtol=1e-2) assert q.isscalar assert q.unit == "deg" q = psf_3d.containment_radius( energy_true=[1, 3] * u.TeV, fraction=0.68, offset=0 * u.deg ) assert_allclose(q.value, [0.123261, 0.13131], rtol=1e-2) assert q.shape == (2,) @requires_data() def test_psf_3d_plot_vs_rad(psf_3d): with mpl_plot_check(): psf_3d.plot_psf_vs_rad() @requires_data() def test_psf_3d_plot_containment(psf_3d): with mpl_plot_check(): psf_3d.plot_containment_radius() @requires_data() def test_psf_3d_peek(psf_3d): with mpl_plot_check(): psf_3d.peek() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/rad_max.py0000644000175100001770000000754414721316200016623 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import astropy.units as u from astropy.visualization import quantity_support import matplotlib.pyplot as plt from matplotlib.ticker import FormatStrFormatter from gammapy.maps.axes import UNIT_STRING_FORMAT from .core import IRF __all__ = [ "RadMax2D", ] class RadMax2D(IRF): """2D Rad Max table. This is not directly an IRF component but is needed as additional information for point-like IRF components when an energy- or field-of-view-dependent directional cut has been applied. Data format specification: :ref:`gadf:rad_max_2d`. Parameters ---------- axes : list of `~gammapy.maps.MapAxis` or `~gammapy.maps.MapAxes` Required axes (in the given order) are: * energy (reconstructed energy axis) * offset (field of view offset axis) data : `~astropy.units.Quantity` Applied directional cut. meta : dict Metadata dictionary. """ tag = "rad_max_2d" required_axes = ["energy", "offset"] default_unit = u.deg @classmethod def from_irf(cls, irf): """Create a RadMax2D instance from another IRF component.
This reads the RAD_MAX metadata keyword from the IRF and creates a RadMax2D with a single bin in energy and offset using the ranges from the input IRF. Parameters ---------- irf : `~gammapy.irf.EffectiveAreaTable2D` or `~gammapy.irf.EnergyDispersion2D` IRF instance from which to read the RAD_MAX and limit information. Returns ------- rad_max : `RadMax2D` `RadMax2D` object with a single bin corresponding to the fixed RAD_MAX cut. Notes ----- This assumes the true energy axis limits are also valid for the reconstructed energy limits. """ if not irf.is_pointlike: raise ValueError("RadMax2D.from_irf requires a point-like irf") if "RAD_MAX" not in irf.meta: raise ValueError("Irf does not contain RAD_MAX keyword") rad_max_value = irf.meta["RAD_MAX"] if not isinstance(rad_max_value, float): raise ValueError( f"RAD_MAX must be a float, got '{type(rad_max_value)}' instead" ) energy_axis = irf.axes["energy_true"].copy(name="energy").squash() offset_axis = irf.axes["offset"].squash() return cls( data=rad_max_value, axes=[energy_axis, offset_axis], unit="deg", interp_kwargs={"method": "nearest", "fill_value": None}, ) def plot_rad_max_vs_energy(self, ax=None, **kwargs): """Plot rad max value against energy. Parameters ---------- ax : `~matplotlib.pyplot.Axes`, optional Matplotlib axes. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.plot`. Returns ------- ax : `~matplotlib.pyplot.Axes` Matplotlib axes. """ if ax is None: ax = plt.gca() energy_axis = self.axes["energy"] offset_axis = self.axes["offset"] with quantity_support(): for value in offset_axis.center: rad_max = self.evaluate(offset=value) # copy per iteration so each offset curve gets its own label plot_kwargs = kwargs.copy() plot_kwargs.setdefault("label", f"Offset {value:.2f}") ax.plot(energy_axis.center, rad_max, **plot_kwargs) energy_axis.format_plot_xaxis(ax=ax) ax.set_ylim(0 * u.deg, None) ax.legend(loc="best") ax.set_ylabel(f"Rad max.
[{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]") ax.yaxis.set_major_formatter(FormatStrFormatter("%.2f")) return ax @property def is_fixed_rad_max(self): """Return True if rad_max axes are flat.""" return self.axes.is_flat ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.212642 gammapy-1.3/gammapy/irf/tests/0000755000175100001770000000000014721316215015774 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/tests/__init__.py0000644000175100001770000000010014721316200020066 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/tests/test_background.py0000644000175100001770000003213314721316200021520 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from copy import deepcopy import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from gammapy.irf import Background2D, Background3D, FoVAlignment from gammapy.maps import MapAxis from gammapy.utils.testing import mpl_plot_check, requires_data @pytest.fixture(scope="session") def bkg_3d(): """Example with simple values to test evaluate""" energy = [0.1, 10, 1000] * u.TeV energy_axis = MapAxis.from_energy_edges(energy) fov_lon = [0, 1, 2, 3] * u.deg fov_lon_axis = MapAxis.from_edges(fov_lon, name="fov_lon") fov_lat = [0, 1, 2, 3] * u.deg fov_lat_axis = MapAxis.from_edges(fov_lat, name="fov_lat") data = np.ones((2, 3, 3)) # Axis order is (energy, fov_lon, fov_lat) # data.value[1, 0, 0] = 1 data[1, 1, 1] = 100 return Background3D( axes=[energy_axis, fov_lon_axis, fov_lat_axis], data=data, unit="s-1 GeV-1 sr-1" ) @pytest.fixture(scope="session") def bkg_3d_interp(): """Example with simple values to test evaluate""" energy = np.logspace(-1, 3, 6) * u.TeV energy_axis = MapAxis.from_energy_edges(energy) fov_lon = [0, 1, 2, 3] * u.deg fov_lon_axis = MapAxis.from_edges(fov_lon, name="fov_lon") fov_lat = [0, 1, 2, 3] * u.deg fov_lat_axis = MapAxis.from_edges(fov_lat, name="fov_lat") data = np.ones((5, 3, 3)) data[-2, :, :] = 0.0 # clipping of value before last will cause extrapolation problems # as found with CTA background IRF bkg = Background3D( axes=[energy_axis, fov_lon_axis, fov_lat_axis], data=data, unit="s-1 GeV-1 sr-1", ) return bkg @requires_data() def test_background_3d_basics(bkg_3d): assert "Background3D" in str(bkg_3d) axis = bkg_3d.axes["energy"] assert axis.nbin == 2 assert axis.unit == "TeV" axis = bkg_3d.axes["fov_lon"] assert axis.nbin == 3 assert axis.unit == "deg" axis = bkg_3d.axes["fov_lat"] assert axis.nbin == 3 assert axis.unit == "deg" data = bkg_3d.quantity assert data.shape == (2, 3, 3) assert data.unit == "s-1 GeV-1 sr-1" bkg_2d = bkg_3d.to_2d() assert bkg_2d.data.data.shape == (2, 3) bkg_3d_new_unit = bkg_3d.to_unit("s-1 MeV-1 sr-1") assert_allclose(bkg_3d_new_unit.data[1, 1, 1], 0.1) def test_background_3d_read_write(tmp_path, bkg_3d): bkg_3d.to_table_hdu().writeto(tmp_path / "bkg3d.fits") bkg_3d_2 = Background3D.read(tmp_path / "bkg3d.fits") axis = bkg_3d_2.axes["energy"] assert axis.nbin == 2 assert axis.unit == "TeV" axis = bkg_3d_2.axes["fov_lon"] assert axis.nbin == 3 assert axis.unit == "deg" axis = bkg_3d_2.axes["fov_lat"] assert axis.nbin == 3 assert axis.unit == "deg" data = bkg_3d_2.quantity assert data.shape == (2, 3, 3) assert data.unit == "s-1 GeV-1 sr-1" def 
test_background_3d_evaluate(bkg_3d): # Evaluate at nodes where we put a non-zero value res = bkg_3d.evaluate( fov_lon=[0.5, 1.5] * u.deg, fov_lat=[0.5, 1.5] * u.deg, energy=[100, 100] * u.TeV, ) assert_allclose(res.value, [1, 100]) assert res.shape == (2,) assert res.unit == "s-1 GeV-1 sr-1" res = bkg_3d.evaluate( fov_lon=[1, 0.5] * u.deg, fov_lat=[1, 0.5] * u.deg, energy=[100, 100] * u.TeV, ) assert_allclose(res.value, [3.162278, 1], rtol=1e-5) res = bkg_3d.evaluate( fov_lon=[[1, 0.5], [1, 0.5]] * u.deg, fov_lat=[[1, 0.5], [1, 0.5]] * u.deg, energy=[[1, 1], [100, 100]] * u.TeV, ) assert_allclose(res.value, [[1, 1], [3.162278, 1]], rtol=1e-5) assert res.shape == (2, 2) def test_plot_at_energy(bkg_3d): with mpl_plot_check(): bkg_3d.plot_at_energy(energy=[5] * u.TeV) def test_background_3d_missing_values(bkg_3d_interp): res = bkg_3d_interp.evaluate( fov_lon=0.5 * u.deg, fov_lat=0.5 * u.deg, energy=2000 * u.TeV, ) assert_allclose(res.value, 0.0) res = bkg_3d_interp.evaluate( fov_lon=0.5 * u.deg, fov_lat=0.5 * u.deg, energy=999 * u.TeV, ) assert_allclose(res.value, 8.796068e18) # Without missing-value interpolation, extrapolation within the last bin # gives a much too high value (as above) bkg_3d_interp.interp_missing_data(axis_name="energy") assert np.all(bkg_3d_interp.data != 0) res = bkg_3d_interp.evaluate( fov_lon=0.5 * u.deg, fov_lat=0.5 * u.deg, energy=999 * u.TeV, ) assert_allclose(res.value, 1.0) def test_background_3d_integrate(bkg_3d): # Example has bkg rate = 0.1 s-1 MeV-1 sr-1 at this node: # fov_lon=1.5 deg, fov_lat=1.5 deg, energy=100 TeV rate = bkg_3d.integrate_log_log( fov_lon=[1.5, 1.5] * u.deg, fov_lat=[1.5, 1.5] * u.deg, energy=[100, 100 + 2e-6] * u.TeV, axis_name="energy", ) assert rate.shape == (1,) # Expect approximately `rate * de` # with `rate = 0.1 s-1 sr-1 MeV-1` and `de = 2 MeV` assert_allclose(rate.to("s-1 sr-1").value, 0.2, rtol=1e-5) rate = bkg_3d.integrate_log_log( fov_lon=0.5 * u.deg, fov_lat=0.5 * u.deg, energy=[1, 100] * u.TeV, axis_name="energy", ) assert_allclose(rate.to("s-1 sr-1").value, 99000) rate = bkg_3d.integrate_log_log( fov_lon=[[1, 0.5], [1, 0.5]] * u.deg, fov_lat=[[1, 1], [0.5, 0.5]] * u.deg, energy=[[1, 1], [100, 100]] * u.TeV, axis_name="energy", ) assert rate.shape == (1, 2) assert_allclose(rate.to("s-1 sr-1").value, [[99000.0, 99000.0]], rtol=1e-5) @requires_data() def test_background_3d_read(): filename = ( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) bkg = Background3D.read(filename) data = bkg.quantity assert bkg.axes.names == ["energy", "fov_lon", "fov_lat"] assert data.shape == (21, 36, 36) assert data.unit == "s-1 MeV-1 sr-1" @requires_data() def test_background_3d_read_gadf(): filename = "$GAMMAPY_DATA/tests/irf/bkg_3d_full_example.fits" bkg = Background3D.read(filename) data = bkg.quantity assert bkg.axes.names == ["energy", "fov_lon", "fov_lat"] assert data.shape == (20, 15, 15) assert data.unit == "s-1 MeV-1 sr-1" def test_bkg_3d_wrong_units(): energy = [0.1, 10, 1000] * u.TeV energy_axis = MapAxis.from_energy_edges(energy) fov_lon = [0, 1, 2, 3] * u.deg fov_lon_axis = MapAxis.from_edges(fov_lon, name="fov_lon") fov_lat = [0, 1, 2, 3] * u.deg fov_lat_axis = MapAxis.from_edges(fov_lat, name="fov_lat") wrong_unit = u.cm**2 * u.s data = np.ones((2, 3, 3)) * wrong_unit with pytest.raises(ValueError) as error: Background3D(axes=[energy_axis, fov_lon_axis, fov_lat_axis], data=data) assert error.match( "Error: (.*) is not an allowed unit.
(.*) requires (.*) data quantities." ) def test_bkg_2d_wrong_units(): energy = [0.1, 10, 1000] * u.TeV energy_axis = MapAxis.from_energy_edges(energy) offset_axis = MapAxis.from_edges([0, 1, 2], unit="deg", name="offset") wrong_unit = u.cm**2 * u.s data = np.ones((energy_axis.nbin, offset_axis.nbin)) * wrong_unit bkg2d_test = Background2D(axes=[energy_axis, offset_axis]) with pytest.raises(ValueError) as error: Background2D(axes=[energy_axis, offset_axis], data=data) assert error.match( f"Error: {wrong_unit} is not an allowed unit. {bkg2d_test.tag}" f" requires {bkg2d_test.default_unit} data quantities." ) def test_background_2d_read_missing_hducls(): energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) offset_axis = MapAxis.from_edges([0, 1, 2], unit="deg", name="offset") bkg = Background2D(axes=[energy_axis, offset_axis], unit="s-1 MeV-1 sr-1") table = bkg.to_table() table.meta.pop("HDUCLAS2") bkg = Background2D.from_table(table) assert bkg.axes[0].name == "energy" @pytest.fixture(scope="session") def bkg_2d(): """A simple Background2D test case""" energy = [0.1, 10, 1000] * u.TeV energy_axis = MapAxis.from_energy_edges(energy) offset = [0, 1, 2, 3] * u.deg offset_axis = MapAxis.from_edges(offset, name="offset") data = np.ones((2, 3)) data[1, 0] = 2 data[1, 1] = 4 return Background2D( axes=[energy_axis, offset_axis], data=data, unit="s-1 MeV-1 sr-1" ) def test_background_2d_evaluate(bkg_2d): # TODO: the test cases here can probably be improved a bit # There's some redundancy, and no case exactly at a node in energy # Evaluate at log center between nodes in energy res = bkg_2d.evaluate(offset=[1, 0.5] * u.deg, energy=[1, 1] * u.TeV) assert_allclose(res.value, [1, 1]) assert res.shape == (2,) assert res.unit == "s-1 MeV-1 sr-1" res = bkg_2d.evaluate(offset=[1, 0.5] * u.deg, energy=[100, 100] * u.TeV) assert_allclose(res.value, [2.8284, 2], rtol=1e-4) res = bkg_2d.evaluate( offset=[[1, 0.5], [1, 0.5]] * u.deg, energy=[[1, 1], [100, 100]] * u.TeV, ) assert_allclose(res.value, [[1, 1], [2.8284, 2]], rtol=1e-4) assert res.shape == (2, 2) res = bkg_2d.evaluate(offset=[1, 1] * u.deg, energy=[1, 100] * u.TeV) assert_allclose(res.value, [1, 2.8284], rtol=1e-4) assert res.shape == (2,) def test_background_2d_read_write(tmp_path, bkg_2d): bkg_2d.to_table_hdu().writeto(tmp_path / "tmp.fits") bkg_2d_2 = Background2D.read(tmp_path / "tmp.fits") axis = bkg_2d_2.axes["energy"] assert axis.nbin == 2 assert axis.unit == "TeV" axis = bkg_2d_2.axes["offset"] assert axis.nbin == 3 assert axis.unit == "deg" data = bkg_2d_2.data assert data.shape == (2, 3) assert bkg_2d_2.unit == "s-1 MeV-1 sr-1" @requires_data() def test_background_2d_read_gadf(): filename = "$GAMMAPY_DATA/tests/irf/bkg_2d_full_example.fits" bkg = Background2D.read(filename) data = bkg.quantity assert data.shape == (20, 5) assert bkg.axes.names == ["energy", "offset"] assert data.unit == "s-1 MeV-1 sr-1" def test_background_2d_integrate(bkg_2d): # TODO: change test case to something better (with known answer) # e.g. constant spectrum or power-law. 
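    # Note: integrate_log_log integrates after interpolating the rate in
    # log-log space along the given axis, so the expected values below
    # reflect power-law-like behaviour between the energy nodes rather
    # than a simple rectangle rule.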
rate = bkg_2d.integrate_log_log( offset=[1, 0.51] * u.deg, energy=[0.11, 0.5] * u.TeV, axis_name="energy" ) assert rate.shape == (1,) assert_allclose(rate.to("s-1 sr-1").value[0], 304211.869056) rate = bkg_2d.integrate_log_log( offset=[1, 0.5] * u.deg, energy=[1, 100] * u.TeV, axis_name="energy" ) assert_allclose(rate.to("s-1 sr-1").value[0], 1.7296602e08) rate = bkg_2d.integrate_log_log( offset=[[1, 0.5], [1, 0.5]] * u.deg, energy=[1, 100] * u.TeV, axis_name="energy" ) assert rate.shape == (1, 2) assert_allclose(rate.value, [[99, 198]]) def test_to_3d(bkg_2d): bkg_3d = bkg_2d.to_3d() assert bkg_3d.data.shape == (2, 6, 6) assert_allclose(bkg_3d.data[1, 3, 3], 2.31, rtol=0.1) # assert you get back same after joint to 2d # need high rtol due to interpolation effects? b2 = bkg_3d.to_2d() assert_allclose(bkg_2d.data, b2.data, rtol=0.2) assert b2.unit == bkg_2d.unit def test_plot(bkg_2d): with mpl_plot_check(): bkg_2d.plot() with mpl_plot_check(): bkg_2d.plot_energy_dependence() with mpl_plot_check(): bkg_2d.plot_offset_dependence() with mpl_plot_check(): bkg_2d.plot_spectrum() with mpl_plot_check(): bkg_2d.peek() with mpl_plot_check(): bkg_2d.plot_at_energy(energy=[1.0, 5.0] * u.TeV) def test_eq(bkg_2d): bkg1 = deepcopy(bkg_2d) assert bkg1 == bkg_2d bkg1.data[0][0] = 10 assert not bkg1 == bkg_2d def test_write_bkg_3d(): e_reco = MapAxis.from_energy_bounds(0.1, 10, 6, unit="TeV", name="energy") lon_axis = MapAxis.from_bounds( -2.3, 2.3, 10, interp="lin", unit="deg", name="fov_lon" ) lat_axis = MapAxis.from_bounds( -2.3, 2.3, 10, interp="lin", unit="deg", name="fov_lat" ) bg_3d = Background3D( axes=[e_reco, lon_axis, lat_axis], data=np.ones((6, 10, 10)), unit=u.Unit("s-1 MeV-1 sr-1"), fov_alignment=FoVAlignment.ALTAZ, ) hduBackground = bg_3d.to_table_hdu() hduBackground.writeto("background.fits", overwrite=True) bg = Background3D.read("background.fits", hdu="BACKGROUND") assert bg.fov_alignment.value == "ALTAZ" def test_to2d_to3d(): # Test for issue 4950 energy = [0.1, 10, 1000] * u.TeV energy_axis = MapAxis.from_energy_edges(energy) nbin = 21 fov_lon_axis = MapAxis.from_bounds(-2 * u.deg, 2 * u.deg, nbin=nbin, name="fov_lon") fov_lat_axis = MapAxis.from_bounds(-2 * u.deg, 2 * u.deg, nbin=nbin, name="fov_lat") data = np.ones((2, nbin, nbin)) * 1.5 bkg = Background3D( axes=[energy_axis, fov_lon_axis, fov_lat_axis], data=data, unit="s-1 GeV-1 sr-1" ) bkg2d = bkg.to_2d() assert_allclose(bkg2d.axes["offset"].center[0].value, 0.0, atol=1e-5) bkg3d = bkg2d.to_3d() assert bkg3d.axes["fov_lon"].is_allclose(fov_lon_axis) assert_allclose(bkg3d.data[0][10], bkg.data[0][10], rtol=1e-5) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/tests/test_core.py0000644000175100001770000000532614721316200020335 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np import astropy.units as u from gammapy.irf.core import IRF, FoVAlignment from gammapy.maps import MapAxis class MyCustomIRF(IRF): tag = "myirf" required_axes = ["energy", "offset"] default_unit = u.deg def test_irf_init_quantity(): energy_axis = MapAxis.from_energy_bounds(10, 100, 10, unit="TeV", name="energy") offset_axis = MapAxis.from_bounds(0, 2.5, 5, unit="deg", name="offset") data = np.full((10, 5), 1) irf = MyCustomIRF(axes=[energy_axis, offset_axis], data=data, unit=u.deg) irf2 = MyCustomIRF(axes=[energy_axis, offset_axis], data=data * u.deg) assert np.all(irf.quantity == irf2.quantity) def test_immutable(): energy_axis 
= MapAxis.from_energy_bounds(10, 100, 10, unit="TeV", name="energy") offset_axis = MapAxis.from_bounds(0, 2.5, 5, unit="deg", name="offset") data = np.full((10, 5), 1) * u.deg test_irf = MyCustomIRF( axes=[energy_axis, offset_axis], data=data, is_pointlike=False, fov_alignment=FoVAlignment.RADEC, ) with pytest.raises(AttributeError): test_irf.is_pointlike = True with pytest.raises(AttributeError): test_irf.fov_alignment = FoVAlignment.ALTAZ def test_slice_by_idx(): energy_axis = MapAxis.from_energy_bounds(10, 100, 10, unit="TeV", name="energy") offset_axis = MapAxis.from_bounds(0, 2.5, 5, unit="deg", name="offset") data = np.full((10, 5), 1) irf = MyCustomIRF(axes=[energy_axis, offset_axis], data=data, unit=u.deg) irf_sliced = irf.slice_by_idx({"energy": slice(3, 7)}) assert irf_sliced.data.shape == (4, 5) assert irf_sliced.axes["energy"].nbin == 4 irf_sliced = irf.slice_by_idx({"offset": slice(3, 5)}) assert irf_sliced.data.shape == (10, 2) assert irf_sliced.axes["offset"].nbin == 2 irf_sliced = irf.slice_by_idx({"energy": slice(3, 7), "offset": slice(3, 5)}) assert irf_sliced.data.shape == (4, 2) assert irf_sliced.axes["offset"].nbin == 2 assert irf_sliced.axes["energy"].nbin == 4 with pytest.raises(ValueError) as exc_info: _ = irf.slice_by_idx({"energy": 3, "offset": 7}) assert ( str(exc_info.value) == "Integer indexing not supported, got {'energy': 3, 'offset': 7}" ) def test_cum_sum(): energy_axis = MapAxis.from_energy_bounds(10, 100, 10, unit="TeV", name="energy") offset_axis = MapAxis.from_bounds(0, 2.5, 1, unit="deg", name="offset") data = np.full((10, 1), 1) irf = MyCustomIRF(axes=[energy_axis, offset_axis], data=data, unit="") cumsum = irf.cumsum(axis_name="offset") assert cumsum.unit == u.Unit("deg^2") assert cumsum.data[0, 0] == 2.5**2 * np.pi ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/tests/test_effective_area.py0000644000175100001770000001241714721316200022334 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal import astropy.units as u from gammapy.irf import EffectiveAreaTable2D from gammapy.maps import MapAxis from gammapy.utils.testing import ( assert_quantity_allclose, mpl_plot_check, requires_data, ) from gammapy.utils.deprecation import GammapyDeprecationWarning @pytest.fixture(scope="session") def aeff(): filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" return EffectiveAreaTable2D.read(filename, hdu="AEFF") @requires_data() def test_basic(aeff): assert aeff.axes["energy_true"].nbin == 96 assert aeff.axes["offset"].nbin == 6 assert aeff.data.shape == (96, 6) assert aeff.axes["energy_true"].unit == "TeV" assert aeff.axes["offset"].unit == "deg" assert aeff.unit == "m2" assert_quantity_allclose(aeff.meta["HI_THRES"], 100, rtol=1e-3) assert_quantity_allclose(aeff.meta["LO_THRES"], 0.870964, rtol=1e-3) test_val = aeff.evaluate(energy_true="14 TeV", offset="0.2 deg") assert_allclose(test_val.value, 683177.5, rtol=1e-3) def test_from_parametrization(): # Log center of this is 100 GeV area_ref = 1.65469579e07 * u.cm**2 axis = MapAxis.from_energy_edges([80, 125] * u.GeV, name="energy_true") area = EffectiveAreaTable2D.from_parametrization(axis, "HESS") assert_allclose(area.quantity, area_ref) assert area.unit == area_ref.unit # Log center of this is 0.1, 2 TeV area_ref = [1.65469579e07, 1.46451957e09] * u.cm * u.cm axis = MapAxis.from_energy_edges([0.08, 
0.125, 32] * u.TeV, name="energy_true") area = EffectiveAreaTable2D.from_parametrization(axis, "HESS") assert_allclose(area.quantity[:, 0], area_ref) assert area.unit == area_ref.unit assert area.meta["TELESCOP"] == "HESS" with pytest.warns(GammapyDeprecationWarning): area2 = EffectiveAreaTable2D.from_parametrization(axis, "CTA") assert area2.unit == area_ref.unit with pytest.raises(ValueError): area2 = EffectiveAreaTable2D.from_parametrization(axis, "SWIFT") @requires_data() def test_plot(aeff): with mpl_plot_check(): aeff.plot() with mpl_plot_check(): aeff.plot_energy_dependence() with mpl_plot_check(): aeff.plot_offset_dependence() def test_to_table(): energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=10, name="energy_true" ) offset_axis = MapAxis.from_bounds(0, 1, nbin=4, name="offset", unit="deg") aeff = EffectiveAreaTable2D( axes=[energy_axis_true, offset_axis], data=1, unit="cm2" ) hdu = aeff.to_table_hdu() assert_equal(hdu.data["ENERG_LO"][0], aeff.axes["energy_true"].edges[:-1].value) assert hdu.header["TUNIT1"] == aeff.axes["energy_true"].unit assert "FOVALIGN" not in hdu.header def test_to_table_is_pointlike(): energy_axis = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=3, name="energy_true" ) offset_axis = MapAxis.from_bounds(0 * u.deg, 2 * u.deg, nbin=2, name="offset") aeff = EffectiveAreaTable2D( data=np.ones((3, 2)) * u.m**2, axes=[energy_axis, offset_axis] ) hdu = aeff.to_table_hdu() assert "is_pointlike" not in hdu.header def test_wrong_axis_order(): energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=10, name="energy_true" ) offset = np.linspace(0, 1, 4) * u.deg offset_axis = MapAxis.from_nodes(offset, name="offset") data = np.ones(shape=(offset_axis.nbin, energy_axis_true.nbin)) with pytest.raises(ValueError): EffectiveAreaTable2D( axes=[energy_axis_true, offset_axis], data=data, unit="cm2" ) def test_wrong_units(): energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=10, name="energy_true" ) offset_axis = MapAxis.from_bounds(0 * u.deg, 2 * u.deg, nbin=2, name="offset") wrong_unit = u.TeV data = np.ones((energy_axis_true.nbin, offset_axis.nbin)) * wrong_unit area_test = EffectiveAreaTable2D(axes=[energy_axis_true, offset_axis]) with pytest.raises(ValueError) as error: EffectiveAreaTable2D(data=data, axes=[energy_axis_true, offset_axis]) assert error.match( f"Error: {wrong_unit} is not an allowed unit. {area_test.tag} " f"requires {area_test.default_unit} data quantities." 
) @requires_data("gammapy-data") def test_aeff2d_pointlike(): filename = "$GAMMAPY_DATA/joint-crab/dl3/magic/run_05029748_DL3.fits" aeff = EffectiveAreaTable2D.read(filename) hdu = aeff.to_table_hdu() assert aeff.is_pointlike assert hdu.header["HDUCLAS3"] == "POINT-LIKE" def test_eq(): energy1 = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=2, name="energy_true") energy2 = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3, name="energy_true") offset_axis = MapAxis.from_bounds(0 * u.deg, 2 * u.deg, nbin=2, name="offset") data1 = np.ones((energy1.nbin, offset_axis.nbin)) * u.cm**2 data2 = np.ones((energy2.nbin, offset_axis.nbin)) * u.cm**2 aeff1 = EffectiveAreaTable2D(data=data1, axes=[energy1, offset_axis]) aeff2 = EffectiveAreaTable2D(data=data2, axes=[energy2, offset_axis]) assert not aeff1 == aeff2 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/tests/test_gadf.py0000644000175100001770000000777214721316200020315 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np import astropy.units as u def test_effective_area_2d_to_gadf(): from gammapy.irf import EffectiveAreaTable2D from gammapy.maps import MapAxis energy_axis = MapAxis.from_energy_bounds( 1 * u.TeV, 10 * u.TeV, nbin=3, name="energy_true" ) offset_axis = MapAxis.from_bounds(0 * u.deg, 2 * u.deg, nbin=2, name="offset") data = np.ones((energy_axis.nbin, offset_axis.nbin)) * u.m**2 aeff = EffectiveAreaTable2D(data=data, axes=[energy_axis, offset_axis]) hdu = aeff.to_table_hdu(format="gadf-dl3") columns = {column.name for column in hdu.columns} mandatory_columns = { "ENERG_LO", "ENERG_HI", "THETA_LO", "THETA_HI", "EFFAREA", } missing = mandatory_columns.difference(columns) assert len(missing) == 0, f"GADF HDU missing required column(s) {missing}" header = hdu.header assert header["HDUCLASS"] == "GADF" assert ( header["HDUDOC"] == "https://github.com/open-gamma-ray-astro/gamma-astro-data-formats" ) assert header["HDUVERS"] == "0.2" assert header["HDUCLAS1"] == "RESPONSE" assert header["HDUCLAS2"] == "EFF_AREA" assert header["HDUCLAS3"] == "FULL-ENCLOSURE" assert header["HDUCLAS4"] == "AEFF_2D" assert header["EXTNAME"] == "EFFECTIVE AREA" def test_energy_dispersion_2d_to_gadf(): from gammapy.irf import EnergyDispersion2D from gammapy.maps import MapAxis energy_axis = MapAxis.from_energy_bounds( 1 * u.TeV, 10 * u.TeV, nbin=3, name="energy_true" ) offset_axis = MapAxis.from_bounds(0 * u.deg, 2 * u.deg, nbin=2, name="offset") migra_axis = MapAxis.from_bounds(0.2, 5, nbin=5, interp="log", name="migra") data = np.zeros((energy_axis.nbin, migra_axis.nbin, offset_axis.nbin)) edisp = EnergyDispersion2D(data=data, axes=[energy_axis, migra_axis, offset_axis]) hdu = edisp.to_table_hdu(format="gadf-dl3") mandatory_columns = { "ENERG_LO", "ENERG_HI", "MIGRA_LO", "MIGRA_HI", "THETA_LO", "THETA_HI", "MATRIX", } columns = {column.name for column in hdu.columns} missing = mandatory_columns.difference(columns) assert len(missing) == 0, f"GADF HDU missing required column(s) {missing}" header = hdu.header assert header["HDUCLASS"] == "GADF" assert ( header["HDUDOC"] == "https://github.com/open-gamma-ray-astro/gamma-astro-data-formats" ) assert header["HDUVERS"] == "0.2" assert header["HDUCLAS1"] == "RESPONSE" assert header["HDUCLAS2"] == "EDISP" assert header["HDUCLAS3"] == "FULL-ENCLOSURE" assert header["HDUCLAS4"] == "EDISP_2D" def test_psf_3d_to_gadf(): from gammapy.irf import PSF3D from gammapy.maps import MapAxis energy_axis = 
MapAxis.from_energy_bounds( 1 * u.TeV, 10 * u.TeV, nbin=3, name="energy_true" ) offset_axis = MapAxis.from_bounds(0 * u.deg, 2 * u.deg, nbin=2, name="offset") rad_axis = MapAxis.from_bounds(0.0 * u.deg, 1 * u.deg, nbin=10, name="rad") data = np.zeros((energy_axis.nbin, offset_axis.nbin, rad_axis.nbin)) / u.sr psf = PSF3D(data=data, axes=[energy_axis, offset_axis, rad_axis]) hdu = psf.to_table_hdu(format="gadf-dl3") mandatory_columns = { "ENERG_LO", "ENERG_HI", "THETA_LO", "THETA_HI", "RAD_LO", "RAD_HI", "RPSF", } columns = {column.name for column in hdu.columns} missing = mandatory_columns.difference(columns) assert len(missing) == 0, f"GADF HDU missing required column(s) {missing}" header = hdu.header assert header["HDUCLASS"] == "GADF" assert ( header["HDUDOC"] == "https://github.com/open-gamma-ray-astro/gamma-astro-data-formats" ) assert header["HDUVERS"] == "0.2" assert header["HDUCLAS1"] == "RESPONSE" assert header["HDUCLAS2"] == "RPSF" assert header["HDUCLAS3"] == "FULL-ENCLOSURE" assert header["HDUCLAS4"] == "PSF_TABLE" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/tests/test_io.py0000644000175100001770000002013014721316200020002 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.io import fits from astropy.units import Quantity from gammapy.irf import ( Background3D, EffectiveAreaTable2D, EnergyDependentMultiGaussPSF, EnergyDispersion2D, RadMax2D, load_irf_dict_from_file, ) from gammapy.maps import MapAxis from gammapy.utils.scripts import make_path from gammapy.utils.testing import requires_data @requires_data() def test_load_irf_dict_from_file(): """Test that the IRF components in a dictionary loaded from a DL3 file can be loaded in a dictionary and correctly used""" irf = load_irf_dict_from_file( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" ) energy = Quantity(1, "TeV") offset = Quantity(0.5, "deg") val = irf["aeff"].evaluate(energy_true=energy, offset=offset) assert_allclose(val.value, 273372.44851054, rtol=1e-5) assert val.unit == "m2" val = irf["edisp"].evaluate(offset=offset, energy_true=energy, migra=1) assert_allclose(val.value, 1.84269482, rtol=1e-5) assert val.unit == "" val = irf["psf"].evaluate( rad=Quantity(0.1, "deg"), energy_true=energy, offset=offset ) assert_allclose(val, 6.75981573 * u.Unit("deg-2"), rtol=1e-5) val = irf["bkg"].evaluate(energy=energy, fov_lon=offset, fov_lat="0.1 deg") assert_allclose(val.value, 0.00031552, rtol=1e-5) assert val.unit == "1 / (MeV s sr)" @requires_data() def test_irf_dict_from_file_duplicate_irfs(caplog, tmp_path): """catch the warning message about two type of IRF with the same hdu class encountered in the same file""" original_file = make_path( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz" ) dummy_file = tmp_path / "020136_duplicated_psf.fits" # create a dummy file with the PSF HDU repeated twice f = fits.open(original_file) f.append(f[5].copy()) f[7].name = "PSF2" f.writeto(dummy_file) load_irf_dict_from_file(dummy_file) assert "more than one HDU" in caplog.text assert "loaded the PSF HDU in the dictionary" in caplog.text @requires_data() def test_irf_dict_from_file_fixed_rad_max(): """test that for point-like IRF without RAD_MAX_2D HDU a RadMax2D with a single value is generated from the RAD_MAX header keyword""" irf = load_irf_dict_from_file( 
"$GAMMAPY_DATA/joint-crab/dl3/magic/run_05029748_DL3.fits" ) assert "RAD_MAX" in irf["aeff"].meta assert "rad_max" in irf assert isinstance(irf["rad_max"], RadMax2D) # check that has a single-bin in energy and offset assert irf["rad_max"].axes["energy"].nbin == 1 assert irf["rad_max"].axes["offset"].nbin == 1 assert irf["rad_max"].quantity.to_value("deg") == irf["aeff"].meta["RAD_MAX"] class TestIRFWrite: def setup_method(self): self.energy_lo = np.logspace(0, 1, 10)[:-1] * u.TeV self.energy_hi = np.logspace(0, 1, 10)[1:] * u.TeV self.energy_axis_true = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=9, name="energy_true" ) self.offset_lo = np.linspace(0, 1, 4)[:-1] * u.deg self.offset_hi = np.linspace(0, 1, 4)[1:] * u.deg self.offset_axis = MapAxis.from_bounds( 0, 1, nbin=3, unit="deg", name="offset", node_type="edges" ) self.migra_lo = np.linspace(0, 3, 4)[:-1] self.migra_hi = np.linspace(0, 3, 4)[1:] self.migra_axis = MapAxis.from_bounds( 0, 3, nbin=3, name="migra", node_type="edges" ) self.fov_lon_lo = np.linspace(-6, 6, 11)[:-1] * u.deg self.fov_lon_hi = np.linspace(-6, 6, 11)[1:] * u.deg self.fov_lon_axis = MapAxis.from_bounds(-6, 6, nbin=10, name="fov_lon") self.fov_lat_lo = np.linspace(-6, 6, 11)[:-1] * u.deg self.fov_lat_hi = np.linspace(-6, 6, 11)[1:] * u.deg self.fov_lat_axis = MapAxis.from_bounds(-6, 6, nbin=10, name="fov_lat") self.aeff_data = np.random.rand(9, 3) * u.cm * u.cm self.edisp_data = np.random.rand(9, 3, 3) self.bkg_data = np.random.rand(9, 10, 10) / u.MeV / u.s / u.sr self.aeff = EffectiveAreaTable2D( axes=[self.energy_axis_true, self.offset_axis], data=self.aeff_data.value, unit=self.aeff_data.unit, ) self.edisp = EnergyDispersion2D( axes=[ self.energy_axis_true, self.migra_axis, self.offset_axis, ], data=self.edisp_data, ) axes = [ self.energy_axis_true.copy(name="energy"), self.fov_lon_axis, self.fov_lat_axis, ] self.bkg = Background3D( axes=axes, data=self.bkg_data.value, unit=self.bkg_data.unit ) def test_array_to_container(self): assert_allclose(self.aeff.quantity, self.aeff_data) assert_allclose(self.edisp.quantity, self.edisp_data) assert_allclose(self.bkg.quantity, self.bkg_data) def test_container_to_table(self): assert_allclose(self.aeff.to_table()["ENERG_LO"].quantity[0], self.energy_lo) assert_allclose(self.edisp.to_table()["ENERG_LO"].quantity[0], self.energy_lo) assert_allclose(self.bkg.to_table()["ENERG_LO"].quantity[0], self.energy_lo) assert_allclose(self.aeff.to_table()["EFFAREA"].quantity[0].T, self.aeff_data) assert_allclose(self.edisp.to_table()["MATRIX"].quantity[0].T, self.edisp_data) assert_allclose(self.bkg.to_table()["BKG"].quantity[0].T, self.bkg_data) assert self.aeff.to_table()["EFFAREA"].quantity[0].unit == self.aeff_data.unit assert self.bkg.to_table()["BKG"].quantity[0].unit == self.bkg_data.unit def test_container_to_fits(self): assert_allclose(self.aeff.to_table()["ENERG_LO"].quantity[0], self.energy_lo) assert self.aeff.to_table_hdu().header["EXTNAME"] == "EFFECTIVE AREA" assert self.edisp.to_table_hdu().header["EXTNAME"] == "ENERGY DISPERSION" assert self.bkg.to_table_hdu().header["EXTNAME"] == "BACKGROUND" hdu = self.aeff.to_table_hdu() assert_allclose( hdu.data[hdu.header["TTYPE1"]][0], self.aeff.axes[0].edges[:-1].value ) hdu = self.aeff.to_table_hdu() assert_allclose(hdu.data[hdu.header["TTYPE5"]][0].T, self.aeff.data) hdu = self.edisp.to_table_hdu() assert_allclose( hdu.data[hdu.header["TTYPE1"]][0], self.edisp.axes[0].edges[:-1].value ) hdu = self.edisp.to_table_hdu() 
assert_allclose(hdu.data[hdu.header["TTYPE7"]][0].T, self.edisp.data) hdu = self.bkg.to_table_hdu() assert_allclose( hdu.data[hdu.header["TTYPE1"]][0], self.bkg.axes[0].edges[:-1].value ) hdu = self.bkg.to_table_hdu() assert_allclose(hdu.data[hdu.header["TTYPE7"]][0].T, self.bkg.data) def test_writeread(self, tmp_path): path = tmp_path / "tmp.fits" fits.HDUList( [ fits.PrimaryHDU(), self.aeff.to_table_hdu(), self.edisp.to_table_hdu(), self.bkg.to_table_hdu(), ] ).writeto(path) read_aeff = EffectiveAreaTable2D.read(path, hdu="EFFECTIVE AREA") assert_allclose(read_aeff.quantity, self.aeff_data) read_edisp = EnergyDispersion2D.read(path, hdu="ENERGY DISPERSION") assert_allclose(read_edisp.quantity, self.edisp_data) read_bkg = Background3D.read(path, hdu="BACKGROUND") assert_allclose(read_bkg.quantity, self.bkg_data) @requires_data() def test_load_irf_dict_from_file_cta(): """Test that CTA IRFs can be loaded and evaluated.""" irf = load_irf_dict_from_file( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) assert set(irf.keys()) == {"aeff", "edisp", "psf", "bkg"} assert isinstance(irf["aeff"], EffectiveAreaTable2D) assert isinstance(irf["edisp"], EnergyDispersion2D) assert isinstance(irf["psf"], EnergyDependentMultiGaussPSF) assert isinstance(irf["bkg"], Background3D) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/irf/tests/test_rad_max.py0000644000175100001770000000630514721316200021016 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from gammapy.irf import EffectiveAreaTable2D, RadMax2D from gammapy.maps import MapAxis from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.testing import mpl_plot_check @pytest.fixture() def rad_max_2d(): n_energy = 10 energy_axis = MapAxis.from_energy_bounds( 50 * u.GeV, 100 * u.TeV, n_energy, name="energy" ) n_offset = 5 offset_axis = MapAxis.from_bounds(0, 2, n_offset, unit=u.deg, name="offset") shape = (n_energy, n_offset) rad_max = np.linspace(0.1, 0.5, n_energy * n_offset).reshape(shape) return RadMax2D( axes=[ energy_axis, offset_axis, ], data=rad_max, unit=u.deg, ) def test_rad_max_roundtrip(rad_max_2d, tmp_path): rad_max_2d.write(tmp_path / "rad_max.fits") rad_max_read = RadMax2D.read(tmp_path / "rad_max.fits") assert_allclose(rad_max_read.data, rad_max_2d.data) assert rad_max_2d.axes == rad_max_read.axes def test_rad_max_plot(rad_max_2d): with mpl_plot_check(): rad_max_2d.plot_rad_max_vs_energy() def test_rad_max_from_irf(): e_bins = 3 o_bins = 2 energy_axis = MapAxis.from_energy_bounds( 1 * u.TeV, 10 * u.TeV, nbin=e_bins, name="energy_true" ) offset_axis = MapAxis.from_bounds(0 * u.deg, 3 * u.deg, nbin=o_bins, name="offset") aeff = EffectiveAreaTable2D( data=u.Quantity(np.ones((e_bins, o_bins)), u.m**2, copy=COPY_IF_NEEDED), axes=[energy_axis, offset_axis], is_pointlike=True, ) with pytest.raises(ValueError): # not a point-like IRF RadMax2D.from_irf(aeff) with pytest.raises(ValueError): # missing rad_max RadMax2D.from_irf(aeff) aeff.meta["RAD_MAX"] = "0.2 deg" with pytest.raises(ValueError): # invalid format RadMax2D.from_irf(aeff) aeff.meta["RAD_MAX"] = 0.2 rad_max = RadMax2D.from_irf(aeff) assert rad_max.axes["energy"].nbin == 1 assert rad_max.axes["offset"].nbin == 1 assert rad_max.axes["energy"].edges[0] == aeff.axes["energy_true"].edges[0] assert rad_max.axes["energy"].edges[1] == 
aeff.axes["energy_true"].edges[-1] assert rad_max.axes["offset"].edges[0] == aeff.axes["offset"].edges[0] assert rad_max.axes["offset"].edges[1] == aeff.axes["offset"].edges[-1] assert rad_max.quantity.shape == (1, 1) assert rad_max.quantity[0, 0] == 0.2 * u.deg def test_rad_max_single_bin(): energy_axis = MapAxis.from_energy_bounds(0.01, 100, 1, unit="TeV") offset_axis = MapAxis.from_bounds( 0.0, 5, 1, unit="deg", name="offset", ) rad_max = RadMax2D( data=[[0.1]] * u.deg, axes=[energy_axis, offset_axis], interp_kwargs={"method": "nearest", "fill_value": None}, ) value = rad_max.evaluate(energy=1 * u.TeV, offset=1 * u.deg) assert_allclose(value, 0.1 * u.deg) value = rad_max.evaluate(energy=[1, 2, 3] * u.TeV, offset=[[1]] * u.deg) assert value.shape == (1, 3) assert_allclose(value, 0.1 * u.deg) assert rad_max.is_fixed_rad_max ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.212642 gammapy-1.3/gammapy/makers/0000755000175100001770000000000014721316215015334 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/__init__.py0000644000175100001770000000233714721316200017444 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from gammapy.utils.registry import Registry from .background import ( AdaptiveRingBackgroundMaker, FoVBackgroundMaker, PhaseBackgroundMaker, ReflectedRegionsBackgroundMaker, ReflectedRegionsFinder, RegionsFinder, RingBackgroundMaker, WobbleRegionsFinder, ) from .core import Maker from .map import MapDatasetMaker from .reduce import DatasetsMaker from .safe import SafeMaskMaker from .spectrum import SpectrumDatasetMaker MAKER_REGISTRY = Registry( [ ReflectedRegionsBackgroundMaker, AdaptiveRingBackgroundMaker, FoVBackgroundMaker, PhaseBackgroundMaker, RingBackgroundMaker, SpectrumDatasetMaker, MapDatasetMaker, SafeMaskMaker, DatasetsMaker, ] ) """Registry of maker classes in Gammapy.""" __all__ = [ "AdaptiveRingBackgroundMaker", "DatasetsMaker", "FoVBackgroundMaker", "Maker", "MAKER_REGISTRY", "MapDatasetMaker", "PhaseBackgroundMaker", "ReflectedRegionsBackgroundMaker", "ReflectedRegionsFinder", "RegionsFinder", "RingBackgroundMaker", "SafeMaskMaker", "SpectrumDatasetMaker", "WobbleRegionsFinder", ] ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.212642 gammapy-1.3/gammapy/makers/background/0000755000175100001770000000000014721316215017453 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/__init__.py0000644000175100001770000000111714721316200021556 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .fov import FoVBackgroundMaker from .phase import PhaseBackgroundMaker from .reflected import ( ReflectedRegionsBackgroundMaker, ReflectedRegionsFinder, RegionsFinder, WobbleRegionsFinder, ) from .ring import AdaptiveRingBackgroundMaker, RingBackgroundMaker __all__ = [ "AdaptiveRingBackgroundMaker", "FoVBackgroundMaker", "PhaseBackgroundMaker", "ReflectedRegionsBackgroundMaker", "ReflectedRegionsFinder", "RegionsFinder", "RingBackgroundMaker", "WobbleRegionsFinder", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/fov.py0000644000175100001770000002147714721316200020624 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """FoV 
background estimation.""" import logging import numpy as np from gammapy.maps import Map, RegionGeom from gammapy.modeling import Fit from gammapy.modeling.models import FoVBackgroundModel, Model from ..core import Maker __all__ = ["FoVBackgroundMaker"] log = logging.getLogger(__name__) class FoVBackgroundMaker(Maker): """Normalize template background on the whole field-of-view. The dataset background model can be simply scaled (method="scale") or fitted (method="fit") on the dataset counts. The normalization is performed outside the exclusion mask that is passed on init. This also internally takes into account the dataset fit mask. If a SkyModel is set on the input dataset its parameters are frozen during the FoV re-normalization. If the requirement (greater than) of either min_counts or min_npred_background is not satisfied, the background will not be normalised. Parameters ---------- method : {'scale', 'fit'} The normalization method to be applied. Default 'scale'. exclusion_mask : `~gammapy.maps.WcsNDMap` Exclusion mask. spectral_model : SpectralModel or str Reference norm spectral model to use for the `FoVBackgroundModel`, if none is defined on the dataset. By default, use pl-norm. min_counts : int Minimum number of counts, or residuals counts if a SkyModel is set, required outside the exclusion region. min_npred_background : float Minimum number of predicted background counts required outside the exclusion region. """ tag = "FoVBackgroundMaker" available_methods = ["fit", "scale"] def __init__( self, method="scale", exclusion_mask=None, spectral_model="pl-norm", min_counts=0, min_npred_background=0, fit=None, ): self.method = method self.exclusion_mask = exclusion_mask self.min_counts = min_counts self.min_npred_background = min_npred_background if isinstance(spectral_model, str): spectral_model = Model.create(tag=spectral_model, model_type="spectral") if not spectral_model.is_norm_spectral_model: raise ValueError("Spectral model must be a norm spectral model") self.default_spectral_model = spectral_model if fit is None: fit = Fit() self.fit = fit @property def method(self): """Method property.""" return self._method @method.setter def method(self, value): """Method setter.""" if value not in self.available_methods: raise ValueError( f"Not a valid method for FoVBackgroundMaker: {value}." f" Choose from {self.available_methods}" ) self._method = value def make_default_fov_background_model(self, dataset): """Add FoV background model to the model definition. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input map dataset. Returns ------- dataset : `~gammapy.datasets.MapDataset` Map dataset including background model. """ bkg_model = FoVBackgroundModel( dataset_name=dataset.name, spectral_model=self.default_spectral_model.copy() ) if dataset.models is None: dataset.models = bkg_model else: dataset.models = dataset.models + bkg_model return dataset def make_exclusion_mask(self, dataset): """Project input exclusion mask to dataset geometry. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input map dataset. Returns ------- mask : `~gammapy.maps.WcsNDMap` Projected exclusion mask. 
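        Notes
        -----
        If no exclusion mask was given on init, a mask that is ``True``
        everywhere is returned, i.e. the whole field of view is used for
        the normalization.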
""" geom = dataset._geom if self.exclusion_mask: mask = self.exclusion_mask.interp_to_geom(geom=geom) else: mask = Map.from_geom(geom=geom, data=1, dtype=bool) return mask def _make_masked_summed_counts(self, dataset): """Compute the sums of the counts, npred, and background maps within the mask.""" npred = dataset.npred() mask = dataset.mask & ~np.isnan(npred) count_tot = dataset.counts.data[mask].sum() npred_tot = npred.data[mask].sum() bkg_tot = dataset.npred_background().data[mask].sum() return { "counts": count_tot, "npred": npred_tot, "bkg": bkg_tot, } def _verify_requirements(self, dataset): """Verify that the requirements of min_counts and min_npred_background are satisfied.""" total = self._make_masked_summed_counts(dataset) not_bkg_tot = total["npred"] - total["bkg"] value = (total["counts"] - not_bkg_tot) / total["bkg"] if not np.isfinite(value): log.warning( f"FoVBackgroundMaker failed. Non-finite normalisation value for {dataset.name}. " "Setting mask to False." ) return False elif total["bkg"] <= self.min_npred_background: log.warning( f"FoVBackgroundMaker failed. Only {int(total['bkg'])} background" f" counts outside exclusion mask for {dataset.name}. " "Setting mask to False." ) return False elif total["counts"] <= self.min_counts: log.warning( f"FoVBackgroundMaker failed. Only {int(total['counts'])} counts " f"outside exclusion mask for {dataset.name}. " "Setting mask to False." ) return False elif total["counts"] - not_bkg_tot <= 0: log.warning( f"FoVBackgroundMaker failed. Negative residuals counts for {dataset.name}. " "Setting mask to False." ) return False else: return True def run(self, dataset, observation=None): """Run FoV background maker. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input map dataset. """ if isinstance(dataset.counts.geom, RegionGeom): raise TypeError( "FoVBackgroundMaker does not support region based datasets." ) mask_fit = dataset.mask_fit if mask_fit: dataset.mask_fit *= self.make_exclusion_mask(dataset) else: dataset.mask_fit = self.make_exclusion_mask(dataset) if dataset.background_model is None: dataset = self.make_default_fov_background_model(dataset) if self._verify_requirements(dataset) is True: if self.method == "fit": dataset = self.make_background_fit(dataset) else: # always scale the background first dataset = self.make_background_scale(dataset) else: dataset.mask_safe.data[...] = False dataset.mask_fit = mask_fit return dataset def make_background_fit(self, dataset): """Fit the FoV background model on the dataset counts data. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input dataset. Returns ------- dataset : `~gammapy.datasets.MapDataset` Map dataset with fitted background model. """ # freeze all model components not related to background model models = dataset.models.select(tag="sky-model") with models.restore_status(restore_values=False): models.select(tag="sky-model").freeze() fit_result = self.fit.run(datasets=[dataset]) if not fit_result.success: log.warning( f"FoVBackgroundMaker failed. Fit did not converge for {dataset.name}. " "Setting mask to False." ) dataset.mask_safe.data[...] = False return dataset def make_background_scale(self, dataset): """Fit the FoV background model on the dataset counts data. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input dataset. Returns ------- dataset : `~gammapy.datasets.MapDataset` Map dataset with scaled background model. 
""" total = self._make_masked_summed_counts(dataset) not_bkg_tot = total["npred"] - total["bkg"] value = (total["counts"] - not_bkg_tot) / total["bkg"] error = np.sqrt(total["counts"] - not_bkg_tot) / total["bkg"] dataset.models[f"{dataset.name}-bkg"].spectral_model.norm.value = value dataset.models[f"{dataset.name}-bkg"].spectral_model.norm.error = error return dataset ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/phase.py0000644000175100001770000001202014721316200021112 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from copy import deepcopy import numpy as np from regions import PointSkyRegion from gammapy.data import EventList from gammapy.datasets import MapDatasetOnOff, SpectrumDataset from gammapy.makers.utils import make_counts_rad_max from gammapy.maps import Map from ..core import Maker __all__ = ["PhaseBackgroundMaker"] class PhaseBackgroundMaker(Maker): """Background estimation with on and off phases. Parameters ---------- on_phase : `tuple` or list of tuples On-phase defined by the two edges of each interval (edges are excluded). off_phase : `tuple` or list of tuples Off-phase defined by the two edges of each interval (edges are excluded). phase_column_name : `str`, optional The name of the column in the event file from which the phase information are extracted. Default is 'PHASE'. """ tag = "PhaseBackgroundMaker" def __init__(self, on_phase, off_phase, phase_column_name="PHASE"): self.on_phase = self._check_intervals(deepcopy(on_phase)) self.off_phase = self._check_intervals(deepcopy(off_phase)) self.phase_column_name = phase_column_name def __str__(self): s = self.__class__.__name__ s += f"\nOn phase interval : {self.on_phase}" s += f"\nOff phase interval : {self.off_phase}" s += f"\nPhase column name : {self.phase_column_name}" return s @staticmethod def _make_counts(dataset, observation, phases, phase_column_name): event_lists = [] for interval in phases: events = observation.events.select_parameter( parameter=phase_column_name, band=interval ) event_lists.append(events) events = EventList.from_stack(event_lists) geom = dataset.counts.geom if geom.is_region and isinstance(geom.region, PointSkyRegion): counts = make_counts_rad_max(geom, observation.rad_max, events) else: counts = Map.from_geom(geom) counts.fill_events(events) return counts def make_counts_off(self, dataset, observation): """Make off counts. Parameters ---------- dataset : `SpectrumDataset` Input dataset. observation : `DatastoreObservation` Data store observation. Returns ------- counts_off : `RegionNDMap` Off counts. """ return self._make_counts( dataset, observation, self.off_phase, self.phase_column_name ) def make_counts(self, dataset, observation): """Make on counts. Parameters ---------- dataset : `SpectrumDataset` Input dataset. observation : `DatastoreObservation` Data store observation. Returns ------- counts : `RegionNDMap` On counts. """ return self._make_counts( dataset, observation, self.on_phase, self.phase_column_name ) def run(self, dataset, observation): """Make on off dataset. Parameters ---------- dataset : `SpectrumDataset` or `MapDataset` Input dataset. observation : `Observation` Data store observation. Returns ------- dataset_on_off : `SpectrumDatasetOnOff` or `MapDatasetOnOff` On off dataset. 
""" counts_off = self.make_counts_off(dataset, observation) counts = self.make_counts(dataset, observation) acceptance = Map.from_geom(geom=dataset.counts.geom) acceptance.data = np.sum([_[1] - _[0] for _ in self.on_phase]) acceptance_off = Map.from_geom(geom=dataset.counts.geom) acceptance_off.data = np.sum([_[1] - _[0] for _ in self.off_phase]) dataset_on_off = MapDatasetOnOff.from_map_dataset( dataset=dataset, counts_off=counts_off, acceptance=acceptance, acceptance_off=acceptance_off, ) dataset_on_off.counts = counts if isinstance(dataset, SpectrumDataset): dataset_on_off = dataset_on_off.to_spectrum_dataset(dataset._geom.region) return dataset_on_off @staticmethod def _check_intervals(intervals): """Split phase intervals that go below phase 0 and above phase 1. Parameters ---------- intervals: `tuple`or list of `tuple` Phase interval or list of phase intervals to check. Returns ------- intervals: list of `tuple` Phase interval checked. """ if isinstance(intervals, tuple): intervals = [intervals] for phase_interval in intervals: if phase_interval[0] % 1 > phase_interval[1] % 1: intervals.remove(phase_interval) intervals.append((phase_interval[0] % 1, 1)) if phase_interval[1] % 1 != 0: intervals.append((0, phase_interval[1] % 1)) return intervals ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/reflected.py0000644000175100001770000005152414721316200021763 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import logging from abc import ABCMeta, abstractmethod from itertools import combinations import numpy as np from astropy import units as u from astropy.coordinates import Angle from regions import CircleSkyRegion, PixCoord, PointSkyRegion from gammapy.datasets import SpectrumDatasetOnOff from gammapy.maps import RegionGeom, RegionNDMap, WcsGeom, WcsNDMap from ..core import Maker from ..utils import make_counts_off_rad_max __all__ = [ "ReflectedRegionsBackgroundMaker", "ReflectedRegionsFinder", "RegionsFinder", "WobbleRegionsFinder", ] log = logging.getLogger(__name__) FULL_CIRCLE = Angle(2 * np.pi, "rad") def are_regions_overlapping_rad_max(regions, rad_max, offset, e_min, e_max): """Calculate pair-wise separations between all regions and compare with rad_max to find overlaps.""" separations = u.Quantity( [a.center.separation(b.center) for a, b in combinations(regions, 2)] ) rad_max_at_offset = rad_max.evaluate(offset=offset) # do not check bins outside of energy range edges_min = rad_max.axes["energy"].edges_min edges_max = rad_max.axes["energy"].edges_max # to be sure all possible values are included, we check # for the *upper* energy bin to be larger than e_min and the *lower* edge # to be larger than e_max mask = (edges_max >= e_min) & (edges_min <= e_max) rad_max_at_offset = rad_max_at_offset[mask] return np.any(separations[np.newaxis, :] < (2 * rad_max_at_offset)) def is_rad_max_compatible_region_geom(rad_max, geom, rtol=1e-3): """Check if input RegionGeom is compatible with rad_max for point-like analysis. Parameters ---------- geom : `~gammapy.maps.RegionGeom` Input RegionGeom. rtol : float, optional Relative tolerance. Default is 1e-3. Returns ------- valid : bool True if rad_max is fixed and region is a CircleSkyRegion with compatible radius True if region is a PointSkyRegion False otherwise. 
""" if geom.is_all_point_sky_regions: valid = True elif isinstance(geom.region, CircleSkyRegion) and rad_max.is_fixed_rad_max: valid = np.allclose(geom.region.radius, rad_max.quantity, rtol) if not valid: raise ValueError( f"CircleSkyRegion radius must be equal to RADMAX " f"for point-like IRFs with fixed RADMAX. " f"Expected {rad_max.quantity} got {geom.region.radius}." ) else: valid = False return valid class RegionsFinder(metaclass=ABCMeta): """Base class for regions finders. Parameters ---------- binsz : `~astropy.coordinates.Angle` Bin size of the reference map used for region finding. """ def __init__(self, binsz=0.01 * u.deg): """Create a new RegionFinder.""" self.binsz = Angle(binsz) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @abstractmethod def run(self, region, center, exclusion_mask=None): """Find reflected regions. Parameters ---------- region : `~regions.SkyRegion` Region to rotate. center : `~astropy.coordinates.SkyCoord` Rotation point. exclusion_mask : `~gammapy.maps.WcsNDMap`, optional Exclusion mask. Regions intersecting with this mask will not be included in the returned regions. Default is None. Returns ------- regions : list of `~regions.SkyRegion` Reflected regions. wcs : `~astropy.wcs.WCS` WCS for the determined regions. """ def _create_reference_geometry(self, region, center): """Reference geometry. The size of the map is chosen such that all reflected regions are contained on the image. To do so, the reference map width is taken to be 4 times the distance between the target region center and the rotation point. This distance is larger than the typical dimension of the region itself (otherwise the rotation point would lie inside the region). A minimal width value is added by default in case the region center and the rotation center are too close. The WCS of the map is the TAN projection at the `center` in the coordinate system used by the `region` center. """ frame = region.center.frame.name # width is the full width of an image (not the radius) width = 4 * region.center.separation(center) + Angle("0.3 deg") return WcsGeom.create( skydir=center, binsz=self.binsz, width=width, frame=frame, proj="TAN" ) @staticmethod def _get_center_pixel(center, reference_geom): """Center pixel coordinate.""" return PixCoord.from_sky(center, reference_geom.wcs) @staticmethod def _get_region_pixels(region, reference_geom): """Pixel region.""" return region.to_pixel(reference_geom.wcs) @staticmethod def _exclusion_mask_ref(reference_geom, exclusion_mask): """Exclusion mask reprojected.""" if exclusion_mask: mask = exclusion_mask.interp_to_geom(reference_geom, fill_value=True) else: mask = WcsNDMap.from_geom(geom=reference_geom, data=True) return mask @staticmethod def _get_excluded_pixels(reference_geom, exclusion_mask): """Excluded pixel coordinates.""" # find excluded PixCoords exclusion_mask = ReflectedRegionsFinder._exclusion_mask_ref( reference_geom, exclusion_mask, ) pix_y, pix_x = np.nonzero(~exclusion_mask.data) return PixCoord(pix_x, pix_y) class WobbleRegionsFinder(RegionsFinder): """Find the OFF regions symmetric to the ON region. This is a simpler version of the `ReflectedRegionsFinder`, that will place ``n_off_regions`` regions at symmetric positions on the circle created by the center position and the on region. Returns no regions if the regions are found to be overlapping, in that case reduce the number of off regions and/or their size. Parameters ---------- n_off_regions : int Number of off regions to create. Actual number of off regions might be smaller if an ``exclusion_mask`` is given to `WobbleRegionsFinder.run`. binsz : `~astropy.coordinates.Angle` Bin size of the reference map used for region finding. """ def __init__(self, n_off_regions, binsz=0.01 * u.deg): super().__init__(binsz=binsz) self.n_off_regions = n_off_regions def run(self, region, center, exclusion_mask=None): """Find off regions. Parameters ---------- region : `~regions.SkyRegion` Region to rotate. center : `~astropy.coordinates.SkyCoord` Rotation point. exclusion_mask : `~gammapy.maps.WcsNDMap`, optional Exclusion mask. Regions intersecting with this mask will not be included in the returned regions. Default is None. Returns ------- regions : list of `~regions.SkyRegion` Reflected regions. 
wcs : `~astropy.wcs.WCS` WCS for the determined regions. """ reference_geom = self._create_reference_geometry(region, center) center_pixel = self._get_center_pixel(center, reference_geom) region_pix = self._get_region_pixels(region, reference_geom) excluded_pixels = self._get_excluded_pixels(reference_geom, exclusion_mask) n_positions = self.n_off_regions + 1 increment = FULL_CIRCLE / n_positions regions = [] for i in range(1, n_positions): angle = i * increment region_test = region_pix.rotate(center_pixel, angle) # for PointSkyRegion, we test if the point is inside the exclusion mask # otherwise we test if there is overlap excluded = False if exclusion_mask is not None: if isinstance(region, PointSkyRegion): excluded = ( excluded_pixels.separation(region_test.center) < 1 ).any() else: excluded = region_test.contains(excluded_pixels).any() if not excluded: regions.append(region_test) # We cannot check for overlap of PointSkyRegion here, this is done later # in make_counts_off_rad_max in the rad_max case if not isinstance(region, PointSkyRegion): if self._are_regions_overlapping(regions, reference_geom): log.warning("Found overlapping off regions, returning no regions") return [], reference_geom.wcs return [r.to_sky(reference_geom.wcs) for r in regions], reference_geom.wcs @staticmethod def _are_regions_overlapping(regions, reference_geom): # check for overlap masks = [ region.to_mask().to_image(reference_geom._shape) > 0 for region in regions ] for mask_a, mask_b in combinations(masks, 2): if np.any(mask_a & mask_b): return True return False class ReflectedRegionsFinder(RegionsFinder): """Find reflected regions. This class is responsible for placing a reflected region for a given input region and pointing position. It converts to pixel coordinates internally assuming a tangent projection at center position. If the center lies inside the input region, no reflected regions can be found. If you want to make a background estimate for an IACT observation using the reflected regions method, see also `~gammapy.makers.ReflectedRegionsBackgroundMaker`. Parameters ---------- angle_increment : `~astropy.coordinates.Angle`, optional Rotation angle applied when a region falls in an excluded region. min_distance : `~astropy.coordinates.Angle`, optional Minimum rotation angle between two consecutive reflected regions. min_distance_input : `~astropy.coordinates.Angle`, optional Minimum rotation angle between the input region and the first reflected region. max_region_number : int, optional Maximum number of regions to use. binsz : `~astropy.coordinates.Angle` Bin size of the reference map used for region finding. Examples -------- >>> from astropy.coordinates import SkyCoord, Angle >>> from regions import CircleSkyRegion >>> from gammapy.makers import ReflectedRegionsFinder >>> pointing = SkyCoord(83.2, 22.7, unit='deg', frame='icrs') >>> target_position = SkyCoord(80.2, 23.5, unit='deg', frame='icrs') >>> theta = Angle(0.4, 'deg') >>> on_region = CircleSkyRegion(target_position, theta) >>> finder = ReflectedRegionsFinder(min_distance_input='1 rad') >>> regions, wcs = finder.run(region=on_region, center=pointing) >>> print(regions[0]) # doctest: +ELLIPSIS Region: CircleSkyRegion center: radius: 1438.3... 
arcsec """ def __init__( self, angle_increment="0.1 rad", min_distance="0 rad", min_distance_input="0.1 rad", max_region_number=10000, binsz="0.01 deg", ): super().__init__(binsz=binsz) self.angle_increment = Angle(angle_increment) if self.angle_increment <= Angle(0, "deg"): raise ValueError("angle_increment is too small") self.min_distance = Angle(min_distance) self.min_distance_input = Angle(min_distance_input) self.max_region_number = max_region_number self.binsz = Angle(binsz) @staticmethod def _region_angular_size(region, reference_geom, center_pix): """Compute maximum angular size of a group of pixels as seen from center. This assumes that the center lies outside the group of pixel. Returns ------- angular_size : `~astropy.coordinates.Angle` The maximum angular size. """ mask = reference_geom.region_mask([region]).data pix_y, pix_x = np.nonzero(mask) pixels = PixCoord(pix_x, pix_y) dx, dy = center_pix.x - pixels.x, center_pix.y - pixels.y angles = Angle(np.arctan2(dx, dy), "rad") angular_size = np.max(angles) - np.min(angles) if angular_size.value > np.pi: angle_wrapped = angles.wrap_at(0 * u.rad) angular_size = np.max(angle_wrapped) - np.min(angle_wrapped) return angular_size def _get_angle_range(self, region, reference_geom, center_pix): """Minimum and maximum angle.""" region_angular_size = self._region_angular_size( region=region, reference_geom=reference_geom, center_pix=center_pix ) # Minimum angle a region has to be moved to not overlap with previous one # Add required minimal distance between two off regions angle_min = region_angular_size + self.min_distance angle_max = FULL_CIRCLE - angle_min - self.min_distance_input return angle_min, angle_max def run(self, region, center, exclusion_mask=None): """Find reflected regions. Parameters ---------- region : `~regions.SkyRegion` Region to rotate. center : `~astropy.coordinates.SkyCoord` Rotation point. exclusion_mask : `~gammapy.maps.WcsNDMap`, optional Exclusion mask. Regions intersecting with this mask will not be included in the returned regions. Default is None. Returns ------- regions : list of `~regions.SkyRegion` Reflected regions. wcs : `~astropy.wcs.WCS` WCS for the determined regions. """ if isinstance(region, PointSkyRegion): raise TypeError( "ReflectedRegionsFinder does not work with PointSkyRegion. Use WobbleRegionsFinder instead." ) regions = [] reference_geom = self._create_reference_geometry(region, center) center_pixel = self._get_center_pixel(center, reference_geom) region_pix = self._get_region_pixels(region, reference_geom) excluded_pixels = self._get_excluded_pixels(reference_geom, exclusion_mask) angle_min, angle_max = self._get_angle_range( region=region, reference_geom=reference_geom, center_pix=center_pixel, ) angle = angle_min + self.min_distance_input while angle < angle_max: region_test = region_pix.rotate(center_pixel, angle) if not np.any(region_test.contains(excluded_pixels)): region = region_test.to_sky(reference_geom.wcs) regions.append(region) if len(regions) >= self.max_region_number: break angle += angle_min else: angle += self.angle_increment return regions, reference_geom.wcs class ReflectedRegionsBackgroundMaker(Maker): """Reflected regions background maker. Attributes ---------- region_finder: RegionsFinder If not given, a `ReflectedRegionsFinder` will be created and any of the ``**kwargs`` will be forwarded to the `ReflectedRegionsFinder`. exclusion_mask : `~gammapy.maps.WcsNDMap`, optional Exclusion mask. 
The map must contain at max one non-spatial dimension, and this dimension must be one bin. """ tag = "ReflectedRegionsBackgroundMaker" def __init__( self, region_finder=None, exclusion_mask=None, **kwargs, ): if exclusion_mask and not exclusion_mask.is_mask: raise ValueError("Exclusion mask must contain boolean values") if exclusion_mask and not exclusion_mask.geom.is_flat: raise ValueError("Exclusion mask must only contain spatial dimension") if exclusion_mask: exclusion_mask = exclusion_mask.sum_over_axes(keepdims=False) exclusion_mask.data = exclusion_mask.data.astype("bool") self.exclusion_mask = exclusion_mask if region_finder is None: self.region_finder = ReflectedRegionsFinder(**kwargs) else: if kwargs: raise ValueError("No kwargs can be given if providing a region_finder") self.region_finder = region_finder @staticmethod def _filter_regions_off_rad_max(regions_off, energy_axis, geom, events, rad_max): # check for overlap offset = geom.region.center.separation(events.pointing_radec) e_min, e_max = energy_axis.bounds regions = [geom.region] + regions_off overlap = are_regions_overlapping_rad_max( regions, rad_max, offset, e_min, e_max ) if overlap: log.warning("Found overlapping on/off regions, choose less off regions") return [] return regions_off def make_counts_off(self, dataset, observation): """Make off counts. **NOTE for 1D analysis:** as for `~gammapy.makers.map.MapDatasetMaker.make_counts`, if the geometry of the dataset is a `~regions.CircleSkyRegion` then only a single instance of the `ReflectedRegionsFinder` will be called. If, on the other hand, the geometry of the dataset is a `~regions.PointSkyRegion`, then we have to call the `ReflectedRegionsFinder` several time, each time with a different size of the on region that we will read from the `RAD_MAX_2D` table. Parameters ---------- dataset : `~gammapy.datasets.SpectrumDataset` Spectrum dataset. observation : `~gammapy.data.Observation` Observation container. Returns ------- counts_off : `~gammapy.maps.RegionNDMap` Counts vs estimated energy extracted from the OFF regions. """ geom = dataset.counts.geom energy_axis = geom.axes["energy"] events = observation.events rad_max = observation.rad_max is_point_sky_region = geom.is_all_point_sky_regions if rad_max and not is_rad_max_compatible_region_geom( rad_max=rad_max, geom=geom ): raise ValueError( "Must use `PointSkyRegion` or `CircleSkyRegion` with rad max " "equivalent radius in point-like analysis," f" got {type(geom.region)} instead" ) regions_off, wcs = self.region_finder.run( center=observation.get_pointing_icrs(observation.tmid), region=geom.region, exclusion_mask=self.exclusion_mask, ) if geom.is_all_point_sky_regions and len(regions_off) > 0: regions_off = self._filter_regions_off_rad_max( regions_off, energy_axis, geom, events, rad_max ) if len(regions_off) == 0: log.warning( "ReflectedRegionsBackgroundMaker failed. No OFF region found " f"outside exclusion mask for dataset '{dataset.name}'." ) return None, RegionNDMap.from_geom(geom=geom, data=0) geom_off = RegionGeom.from_regions( regions=regions_off, axes=[energy_axis], wcs=wcs, ) if is_point_sky_region: counts_off = make_counts_off_rad_max( geom_off=geom_off, rad_max=rad_max, events=events, ) else: counts_off = RegionNDMap.from_geom(geom=geom_off) counts_off.fill_events(events) acceptance_off = RegionNDMap.from_geom(geom=geom_off, data=len(regions_off)) return counts_off, acceptance_off def run(self, dataset, observation): """Run reflected regions background maker. 
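        The off counts and off acceptance are obtained from
        `make_counts_off`. If no off region could be found outside the
        exclusion mask, the ``mask_safe`` of the returned dataset is set
        to ``False``.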
Parameters ---------- dataset : `~gammapy.datasets.SpectrumDataset` Spectrum dataset. observation : `~gammapy.data.Observation` Data store observation. Returns ------- dataset_on_off : `~gammapy.datasets.SpectrumDatasetOnOff` On-Off dataset. """ counts_off, acceptance_off = self.make_counts_off(dataset, observation) acceptance = RegionNDMap.from_geom(geom=dataset.counts.geom, data=1) dataset_onoff = SpectrumDatasetOnOff.from_spectrum_dataset( dataset=dataset, acceptance=acceptance, acceptance_off=acceptance_off, counts_off=counts_off, name=dataset.name, ) if dataset_onoff.counts_off is None: dataset_onoff.mask_safe.data[...] = False log.warning( f"ReflectedRegionsBackgroundMaker failed. Setting {dataset_onoff.name} " "mask to False." ) return dataset_onoff ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/ring.py0000644000175100001770000002560214721316200020763 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Ring background estimation.""" import itertools import numpy as np from astropy.convolution import Ring2DKernel, Tophat2DKernel from astropy.coordinates import Angle from gammapy.maps import Map from gammapy.utils.array import scale_cube from ..core import Maker __all__ = ["AdaptiveRingBackgroundMaker", "RingBackgroundMaker"] class AdaptiveRingBackgroundMaker(Maker): """Adaptive ring background algorithm. This algorithm extends the `RingBackgroundMaker` method by adapting the size of the ring to achieve a minimum on / off exposure ratio (alpha) in regions where the area to estimate the background from is limited. Parameters ---------- r_in : `~astropy.units.Quantity` Inner radius of the ring. r_out_max : `~astropy.units.Quantity` Maximum outer radius of the ring. width : `~astropy.units.Quantity` Width of the ring. stepsize : `~astropy.units.Quantity` Stepsize used for increasing the radius. threshold_alpha : float Threshold on alpha above which the adaptive ring takes action. theta : `~astropy.units.Quantity` Integration radius used for alpha computation. method : {'fixed_width', 'fixed_r_in'} Adaptive ring method. Default is 'fixed_width'. exclusion_mask : `~gammapy.maps.WcsNDMap` Exclusion mask. See Also -------- RingBackgroundMaker. """ tag = "AdaptiveRingBackgroundMaker" def __init__( self, r_in, r_out_max, width, stepsize="0.02 deg", threshold_alpha=0.1, theta="0.22 deg", method="fixed_width", exclusion_mask=None, ): if method not in ["fixed_width", "fixed_r_in"]: raise ValueError("Not a valid adaptive ring method.") self.r_in = Angle(r_in) self.r_out_max = Angle(r_out_max) self.width = Angle(width) self.stepsize = Angle(stepsize) self.threshold_alpha = threshold_alpha self.theta = Angle(theta) self.method = method self.exclusion_mask = exclusion_mask def kernels(self, image): """Ring kernels according to the specified method. Parameters ---------- image : `~gammapy.maps.WcsNDMap` Map specifying the WCS information. Returns ------- kernels : list List of `~astropy.convolution.Ring2DKernel`. 
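Examples
--------
A minimal, illustrative sketch of inspecting the kernel list; the map
geometry and ring radii below are made up for demonstration and are not
taken from real data.

>>> from astropy.coordinates import SkyCoord
>>> from gammapy.maps import Map
>>> from gammapy.makers import AdaptiveRingBackgroundMaker
>>> maker = AdaptiveRingBackgroundMaker(r_in="0.3 deg", r_out_max="2 deg", width="0.3 deg")
>>> image = Map.create(skydir=SkyCoord(83.63, 22.01, unit="deg"), binsz=0.02, width=5)
>>> kernels = maker.kernels(image)
>>> len(kernels) > 1
True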
""" scale = image.geom.pixel_scales[0] r_in = (self.r_in / scale).to_value("") r_out_max = (self.r_out_max / scale).to_value("") width = (self.width / scale).to_value("") stepsize = (self.stepsize / scale).to_value("") if self.method == "fixed_width": r_ins = np.arange(r_in, (r_out_max - width), stepsize) widths = [width] elif self.method == "fixed_r_in": widths = np.arange(width, (r_out_max - r_in), stepsize) r_ins = [r_in] else: raise ValueError(f"Invalid method: {self.method!r}") kernels = [] for r_in, width in itertools.product(r_ins, widths): kernel = Ring2DKernel(r_in, width) kernel.normalize("peak") kernels.append(kernel) return kernels @staticmethod def _alpha_approx_cube(cubes): acceptance = cubes["acceptance"] acceptance_off = cubes["acceptance_off"] with np.errstate(divide="ignore", invalid="ignore"): alpha_approx = np.where( acceptance_off > 0, acceptance / acceptance_off, np.inf ) return alpha_approx def _reduce_cubes(self, cubes, dataset): """Compute off and off acceptance map. Calculated by reducing the cubes. The data is iterated along the third axis (i.e. increasing ring sizes), the value with the first approximate alpha < threshold is taken. """ threshold = self.threshold_alpha alpha_approx_cube = self._alpha_approx_cube(cubes) counts_off_cube = cubes["counts_off"] acceptance_off_cube = cubes["acceptance_off"] acceptance_cube = cubes["acceptance"] shape = alpha_approx_cube.shape[:2] counts_off = np.tile(np.nan, shape) acceptance_off = np.tile(np.nan, shape) acceptance = np.tile(np.nan, shape) for idx in np.arange(alpha_approx_cube.shape[-1]): mask = (alpha_approx_cube[:, :, idx] <= threshold) & np.isnan(counts_off) counts_off[mask] = counts_off_cube[:, :, idx][mask] acceptance_off[mask] = acceptance_off_cube[:, :, idx][mask] acceptance[mask] = acceptance_cube[:, :, idx][mask] counts = dataset.counts acceptance = counts.copy(data=acceptance[np.newaxis, Ellipsis]) acceptance_off = counts.copy(data=acceptance_off[np.newaxis, Ellipsis]) counts_off = counts.copy(data=counts_off[np.newaxis, Ellipsis]) return acceptance, acceptance_off, counts_off def make_cubes(self, dataset): """Make acceptance, off acceptance, off counts cubes. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input map dataset. Returns ------- cubes : dict of `~gammapy.maps.WcsNDMap` Dictionary containing ``counts_off``, ``acceptance`` and ``acceptance_off`` cubes. """ counts = dataset.counts background = dataset.npred_background() kernels = self.kernels(counts) if self.exclusion_mask: exclusion = self.exclusion_mask.interp_to_geom(geom=counts.geom) else: exclusion = Map.from_geom(geom=counts.geom, data=True, dtype=bool) cubes = {} cubes["counts_off"] = scale_cube( (counts.data * exclusion.data)[0, Ellipsis], kernels ) cubes["acceptance_off"] = scale_cube( (background.data * exclusion.data)[0, Ellipsis], kernels ) scale = background.geom.pixel_scales[0].to("deg") theta = self.theta * scale tophat = Tophat2DKernel(theta.value) tophat.normalize("peak") acceptance = background.convolve(tophat.array) acceptance_data = acceptance.data[0, Ellipsis] cubes["acceptance"] = np.repeat( acceptance_data[Ellipsis, np.newaxis], len(kernels), axis=2 ) return cubes def run(self, dataset, observation=None): """Run adaptive ring background maker. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input map dataset. Returns ------- dataset_on_off : `~gammapy.datasets.MapDatasetOnOff` On off dataset. 
""" from gammapy.datasets import MapDatasetOnOff cubes = self.make_cubes(dataset) acceptance, acceptance_off, counts_off = self._reduce_cubes(cubes, dataset) mask_safe = dataset.mask_safe.copy() not_has_off_acceptance = acceptance_off.data <= 0 mask_safe.data[not_has_off_acceptance] = 0 dataset_on_off = MapDatasetOnOff.from_map_dataset( dataset=dataset, counts_off=counts_off, acceptance=acceptance, acceptance_off=acceptance_off, name=dataset.name, ) dataset_on_off.mask_safe = mask_safe return dataset_on_off class RingBackgroundMaker(Maker): """Perform a local renormalisation of the existing background template, using a ring kernel. Expected signal regions should be removed by passing an exclusion mask. Parameters ---------- r_in : `~astropy.units.Quantity` Inner ring radius. width : `~astropy.units.Quantity` Ring width. exclusion_mask : `~gammapy.maps.WcsNDMap` Exclusion mask. Examples -------- For a usage example, see :doc:`/tutorials/analysis-2d/ring_background` tutorial. See Also -------- AdaptiveRingBackgroundEstimator. """ tag = "RingBackgroundMaker" def __init__(self, r_in, width, exclusion_mask=None): self.r_in = Angle(r_in) self.width = Angle(width) self.exclusion_mask = exclusion_mask def kernel(self, image): """Ring kernel. Parameters ---------- image : `~gammapy.maps.WcsNDMap` Input map. Returns ------- ring : `~astropy.convolution.Ring2DKernel` Ring kernel. """ scale = image.geom.pixel_scales[0].to("deg") r_in = self.r_in.to("deg") / scale width = self.width.to("deg") / scale ring = Ring2DKernel(r_in.value, width.value) ring.normalize("peak") return ring def make_maps_off(self, dataset): """Make off maps. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input map dataset. Returns ------- maps_off : dict of `~gammapy.maps.WcsNDMap` Dictionary containing `counts_off` and `acceptance_off` maps. """ counts = dataset.counts background = dataset.npred_background() if self.exclusion_mask is not None: # reproject exclusion mask coords = counts.geom.get_coord() data = self.exclusion_mask.get_by_coord(coords) exclusion = Map.from_geom(geom=counts.geom, data=data) else: data = np.ones(counts.geom.data_shape, dtype=bool) exclusion = Map.from_geom(geom=counts.geom, data=data) maps_off = {} ring = self.kernel(counts) counts_excluded = counts * exclusion maps_off["counts_off"] = counts_excluded.convolve(ring.array) background_excluded = background * exclusion maps_off["acceptance_off"] = background_excluded.convolve(ring.array) return maps_off def run(self, dataset, observation=None): """Run ring background maker. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Input map dataset. Returns ------- dataset_on_off : `~gammapy.datasets.MapDatasetOnOff` On off dataset. 
""" from gammapy.datasets import MapDatasetOnOff maps_off = self.make_maps_off(dataset) maps_off["acceptance"] = dataset.npred_background() mask_safe = dataset.mask_safe.copy() not_has_off_acceptance = maps_off["acceptance_off"].data <= 0 mask_safe.data[not_has_off_acceptance] = 0 dataset_on_off = MapDatasetOnOff.from_map_dataset( dataset=dataset, name=dataset.name, **maps_off ) dataset_on_off.mask_safe = mask_safe return dataset_on_off def __str__(self): return ( "RingBackground parameters: \n" f"r_in : {self.parameters['r_in']}\n" f"width: {self.parameters['width']}\n" f"Exclusion mask: {self.exclusion_mask}" ) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.212642 gammapy-1.3/gammapy/makers/background/tests/0000755000175100001770000000000014721316215020615 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/tests/__init__.py0000644000175100001770000000010014721316200022707 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/tests/test_fov.py0000644000175100001770000003014314721316200023013 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion from gammapy.data import DataStore from gammapy.datasets import MapDataset, SpectrumDataset from gammapy.makers import FoVBackgroundMaker, MapDatasetMaker, SafeMaskMaker from gammapy.maps import Map, MapAxis, RegionGeom, WcsGeom from gammapy.modeling import Fit from gammapy.modeling.models import ( FoVBackgroundModel, PointSpatialModel, PowerLawNormSpectralModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.testing import requires_data @pytest.fixture(scope="session") def observation(): """Example observation list for testing.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs_id = 23523 return datastore.obs(obs_id) @pytest.fixture(scope="session") def geom(): energy_axis = MapAxis.from_edges([1, 10], unit="TeV", name="energy", interp="log") return WcsGeom.create( skydir=SkyCoord(83.633, 22.014, unit="deg"), binsz=0.02, width=(5, 5), frame="galactic", proj="CAR", axes=[energy_axis], ) @pytest.fixture(scope="session") def reference(geom): return MapDataset.create(geom) @pytest.fixture(scope="session") def exclusion_mask(geom): """Example mask for testing.""" pos = SkyCoord(83.633, 22.014, unit="deg", frame="icrs") region = CircleSkyRegion(pos, Angle(0.3, "deg")) return ~geom.region_mask([region]) @pytest.fixture(scope="session") def obs_dataset(geom, observation): safe_mask_maker = SafeMaskMaker(methods=["offset-max"], offset_max="2 deg") map_dataset_maker = MapDatasetMaker(selection=["counts", "background", "exposure"]) reference = MapDataset.create(geom) cutout = reference.cutout( observation.get_pointing_icrs(observation.tmid), width="4 deg", name="test-fov" ) dataset = map_dataset_maker.run(cutout, observation) dataset = safe_mask_maker.run(dataset, observation) return dataset def test_fov_bkg_maker_incorrect_method(): with pytest.raises(ValueError): FoVBackgroundMaker(method="bad") @requires_data() def test_fov_bkg_maker_scale(obs_dataset, exclusion_mask): fov_bkg_maker = FoVBackgroundMaker(method="scale", 
exclusion_mask=exclusion_mask) test_dataset = obs_dataset.copy(name="test-fov") dataset = fov_bkg_maker.run(test_dataset) model = dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(model.norm.value, 0.83, rtol=1e-2) assert_allclose(model.norm.error, 0.0207, rtol=1e-2) assert_allclose(model.tilt.value, 0.0, rtol=1e-2) @requires_data() def test_fov_bkg_maker_scale_nocounts(obs_dataset, exclusion_mask, caplog): fov_bkg_maker = FoVBackgroundMaker(method="scale", exclusion_mask=exclusion_mask) test_dataset = obs_dataset.copy(name="test-fov") test_dataset.counts *= 0 dataset = fov_bkg_maker.run(test_dataset) model = dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(model.norm.value, 1, rtol=1e-4) assert_allclose(model.tilt.value, 0.0, rtol=1e-2) assert "WARNING" in [_.levelname for _ in caplog.records] message1 = ( "FoVBackgroundMaker failed. Only 0 counts outside exclusion mask " "for test-fov. Setting mask to False." ) assert message1 in [_.message for _ in caplog.records] @requires_data() def test_fov_bkg_maker_fit(obs_dataset, exclusion_mask): fov_bkg_maker = FoVBackgroundMaker(method="fit", exclusion_mask=exclusion_mask) dataset1 = obs_dataset.copy(name="test-fov") dataset = fov_bkg_maker.run(dataset1) model = dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(model.norm.value, 0.83077, rtol=1e-4) assert_allclose(model.norm.error, 0.02069, rtol=1e-2) assert_allclose(model.tilt.value, 0.0, rtol=1e-4) assert_allclose(model.tilt.error, 0.0, rtol=1e-2) assert_allclose(fov_bkg_maker.default_spectral_model.tilt.value, 0.0) assert_allclose(fov_bkg_maker.default_spectral_model.norm.value, 1.0) spectral_model = PowerLawNormSpectralModel() spectral_model.tilt.frozen = False fov_bkg_maker = FoVBackgroundMaker( method="fit", exclusion_mask=exclusion_mask, spectral_model=spectral_model ) dataset2 = obs_dataset.copy(name="test-fov") dataset = fov_bkg_maker.run(dataset2) model = dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(model.norm.value, 0.901523, rtol=1e-4) assert_allclose(model.norm.error, 0.60880, rtol=1e-4) assert_allclose(model.tilt.value, 0.071069, rtol=1e-4) assert_allclose(model.tilt.error, 0.586600, rtol=1e-4) assert_allclose(fov_bkg_maker.default_spectral_model.tilt.value, 0.0) assert_allclose(fov_bkg_maker.default_spectral_model.norm.value, 1.0) @requires_data() def test_fov_bkg_maker_fit_nocounts(obs_dataset, exclusion_mask): fov_bkg_maker = FoVBackgroundMaker(method="fit", exclusion_mask=exclusion_mask) test_dataset = obs_dataset.copy(name="test-fov") test_dataset.counts.data[...] 
= 0 dataset = fov_bkg_maker.run(test_dataset) assert np.all(dataset.mask_safe.data == 0) @requires_data() def test_fov_bkg_maker_with_source_model(obs_dataset, exclusion_mask, caplog): test_dataset = obs_dataset.copy(name="test-fov") # crab model spatial_model = PointSpatialModel( lon_0="83.619deg", lat_0="22.024deg", frame="icrs" ) spectral_model = PowerLawSpectralModel( index=2.6, amplitude="4.5906e-11 cm-2 s-1 TeV-1", reference="1 TeV" ) model = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="test-source" ) bkg_model = FoVBackgroundModel(dataset_name="test-fov") test_dataset.models = [model, bkg_model] # pre-fit both source and background to get reference model Fit().run(test_dataset) bkg_model_spec = test_dataset.models[f"{test_dataset.name}-bkg"].spectral_model norm_ref = 0.897 assert not bkg_model_spec.norm.frozen assert_allclose(bkg_model_spec.norm.value, norm_ref, rtol=1e-4) assert_allclose(bkg_model_spec.tilt.value, 0.0, rtol=1e-4) # apply scale method with pre-fitted source model and no exclusion_mask bkg_model_spec.norm.value = 1 fov_bkg_maker = FoVBackgroundMaker(method="scale", exclusion_mask=None) dataset = fov_bkg_maker.run(test_dataset) bkg_model_spec = test_dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(bkg_model_spec.norm.value, norm_ref, rtol=1e-4) assert_allclose(bkg_model_spec.tilt.value, 0.0, rtol=1e-4) # apply fit method with pre-fitted source model and no exclusion mask bkg_model_spec.norm.value = 1 fov_bkg_maker = FoVBackgroundMaker(method="fit", exclusion_mask=None) dataset = fov_bkg_maker.run(test_dataset) bkg_model_spec = test_dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(bkg_model_spec.norm.value, norm_ref, rtol=1e-4) assert_allclose(bkg_model_spec.tilt.value, 0.0, rtol=1e-4) # apply scale method with pre-fitted source model and exclusion_mask bkg_model_spec.norm.value = 1 fov_bkg_maker = FoVBackgroundMaker(method="scale", exclusion_mask=exclusion_mask) dataset = fov_bkg_maker.run(test_dataset) bkg_model_spec = test_dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(bkg_model_spec.norm.value, 0.830779, rtol=1e-4) assert_allclose(bkg_model_spec.tilt.value, 0.0, rtol=1e-4) # apply fit method with pre-fitted source model and exclusion mask bkg_model_spec.norm.value = 1 fov_bkg_maker = FoVBackgroundMaker(method="fit", exclusion_mask=exclusion_mask) dataset = fov_bkg_maker.run(test_dataset) bkg_model_spec = test_dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(bkg_model_spec.norm.value, 0.830779, rtol=1e-4) assert_allclose(bkg_model_spec.tilt.value, 0.0, rtol=1e-4) # Here we check that source parameters are correctly thawed after fit. assert not dataset.models.parameters["index"].frozen assert not dataset.models.parameters["lon_0"].frozen # test model.spectral_model.amplitude.value *= 1e5 fov_bkg_maker = FoVBackgroundMaker(method="scale") dataset = fov_bkg_maker.run(test_dataset) assert "WARNING" in [_.levelname for _ in caplog.records] message1 = ( "FoVBackgroundMaker failed. Negative residuals counts for" " test-fov. Setting mask to False." 
) assert message1 in [_.message for _ in caplog.records] @requires_data() def test_fov_bkg_maker_fit_with_tilt(obs_dataset, exclusion_mask): fov_bkg_maker = FoVBackgroundMaker( method="fit", exclusion_mask=exclusion_mask, ) test_dataset = obs_dataset.copy(name="test-fov") model = FoVBackgroundModel(dataset_name="test-fov") model.spectral_model.tilt.frozen = False test_dataset.models = [model] dataset = fov_bkg_maker.run(test_dataset) model = dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(model.norm.value, 0.901523, rtol=1e-4) assert_allclose(model.tilt.value, 0.071069, rtol=1e-4) @requires_data() def test_fov_bkg_maker_fit_fail(obs_dataset, exclusion_mask, caplog): fov_bkg_maker = FoVBackgroundMaker(method="fit", exclusion_mask=exclusion_mask) test_dataset = obs_dataset.copy(name="test-fov") # Putting null background model to prevent convergence test_dataset.background.data *= 0 dataset = fov_bkg_maker.run(test_dataset) model = dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(model.norm.value, 1, rtol=1e-4) assert "WARNING" in [_.levelname for _ in caplog.records] message1 = ( "FoVBackgroundMaker failed. Non-finite normalisation value for " "test-fov. Setting mask to False." ) assert message1 in [_.message for _ in caplog.records] @requires_data() def test_fov_bkg_maker_scale_fail(obs_dataset, exclusion_mask, caplog): fov_bkg_maker = FoVBackgroundMaker(method="scale", exclusion_mask=exclusion_mask) test_dataset = obs_dataset.copy(name="test-fov") # Putting negative background model to prevent correct scaling test_dataset.background.data *= -1 dataset = fov_bkg_maker.run(test_dataset) model = dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(model.norm.value, 1, rtol=1e-4) assert "WARNING" in [_.levelname for _ in caplog.records] message1 = ( "FoVBackgroundMaker failed. Only -1940 background counts outside" " exclusion mask for test-fov. Setting mask to False." 
) assert message1 in [_.message for _ in caplog.records] @requires_data() def test_fov_bkg_maker_mask_fit_handling(obs_dataset, exclusion_mask): fov_bkg_maker = FoVBackgroundMaker(method="scale", exclusion_mask=exclusion_mask) test_dataset = obs_dataset.copy(name="test-fov") region = CircleSkyRegion(obs_dataset._geom.center_skydir, Angle(0.4, "deg")) mask_fit = obs_dataset._geom.region_mask(regions=[region]) test_dataset.mask_fit = mask_fit dataset = fov_bkg_maker.run(test_dataset) assert np.all(test_dataset.mask_fit == mask_fit) model = dataset.models[f"{dataset.name}-bkg"].spectral_model assert_allclose(model.norm.value, 0.9975, rtol=1e-3) assert_allclose(model.norm.error, 0.1115, rtol=1e-3) assert_allclose(model.tilt.value, 0.0, rtol=1e-2) @requires_data() def test_fov_bkg_maker_spectrumdataset(obs_dataset): from regions import CircleSkyRegion maker = FoVBackgroundMaker() energy_axis = MapAxis.from_edges([1, 10], unit="TeV", name="energy", interp="log") region = CircleSkyRegion(obs_dataset._geom.center_skydir, Angle("0.1 deg")) geom = RegionGeom.create(region, axes=[energy_axis]) dataset = SpectrumDataset.create(geom) with pytest.raises(TypeError): maker.run(dataset) region_dataset = obs_dataset.to_region_map_dataset(region) with pytest.raises(TypeError): maker.run(region_dataset) def test_fov_background_maker_str(): exclusion_mask = Map.create(binsz=0.2, width=(2, 2)) maker_fov = FoVBackgroundMaker(exclusion_mask=exclusion_mask) assert "FoVBackgroundMaker" in str(maker_fov) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/tests/test_phase.py0000644000175100001770000001324214721316200023322 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion, PointSkyRegion from gammapy.data import DataStore, EventList from gammapy.datasets import MapDataset, SpectrumDataset from gammapy.makers import MapDatasetMaker, PhaseBackgroundMaker, SpectrumDatasetMaker from gammapy.maps import MapAxis, RegionGeom, WcsGeom from gammapy.utils.testing import requires_data POS_CRAB = SkyCoord(83.6331, +22.0145, frame="icrs", unit="deg") POS_VELA = SkyCoord("08h35m20.65525s", "-45d10m35.1545s", frame="icrs") @pytest.fixture(scope="session") def observations_cta(): datastore_cta = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps") return datastore_cta.get_observations([111630]) @pytest.fixture(scope="session") def observations_magic(): datastore_magic = DataStore.from_dir("$GAMMAPY_DATA/magic/rad_max/data") return datastore_magic.get_observations(required_irf="point-like")[0] @pytest.fixture(scope="session") def observations_hess(): datastore_hess = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") return datastore_hess.obs(23523) @pytest.fixture(scope="session") def phase_bkg_maker(): """Example background estimator for testing.""" return PhaseBackgroundMaker(on_phase=(0.5, 0.6), off_phase=(0.7, 1)) @requires_data() def test_basic(phase_bkg_maker): assert "PhaseBackgroundMaker" in str(phase_bkg_maker) @requires_data() def test_run_spectrum(observations_cta, phase_bkg_maker): maker = SpectrumDatasetMaker() e_reco = MapAxis.from_edges(np.logspace(0, 2, 5) * u.TeV, name="energy") e_true = MapAxis.from_edges(np.logspace(-0.5, 2, 11) * u.TeV, name="energy_true") radius = Angle(0.2, "deg") region = 
CircleSkyRegion(POS_VELA, radius) geom = RegionGeom.create(region=region, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=e_true) obs = observations_cta[0] dataset = maker.run(dataset_empty, obs) dataset_on_off = phase_bkg_maker.run(dataset, obs) assert_allclose(dataset_on_off.acceptance, 0.1) assert_allclose(dataset_on_off.acceptance_off, 0.3) assert_allclose(dataset_on_off.counts.data.sum(), 28) assert_allclose(dataset_on_off.counts_off.data.sum(), 57) @requires_data() def test_run_map(observations_cta, phase_bkg_maker): maker = MapDatasetMaker() e_reco = MapAxis.from_edges(np.logspace(0, 2, 5) * u.TeV, name="energy") e_true = MapAxis.from_edges(np.logspace(-0.5, 2, 11) * u.TeV, name="energy_true") binsz = Angle(0.02, "deg") geom = WcsGeom.create(binsz=binsz, skydir=POS_VELA, width="2 deg", axes=[e_reco]) dataset_empty = MapDataset.create(geom=geom, energy_axis_true=e_true) obs = observations_cta[0] dataset = maker.run(dataset_empty, obs) dataset_on_off = phase_bkg_maker.run(dataset, obs) assert_allclose(dataset_on_off.acceptance, 0.1) assert_allclose(dataset_on_off.acceptance_off, 0.3) assert_allclose(dataset_on_off.counts.data.sum(), 78) assert_allclose(dataset_on_off.counts_off.data.sum(), 263) @pytest.mark.parametrize( "pars", [ {"p_in": [[0.2, 0.3]], "p_out": [[0.2, 0.3]]}, {"p_in": [[0.9, 0.1]], "p_out": [[0.9, 1], [0, 0.1]]}, {"p_in": [[-0.2, 0.1]], "p_out": [[0.8, 1], [0, 0.1]]}, {"p_in": [[0.8, 1.2]], "p_out": [[0.8, 1], [0, 0.2]]}, {"p_in": [[0.8, 1]], "p_out": [[0.8, 1]]}, {"p_in": [[0.2, 0.4], [0.8, 0.9]], "p_out": [[0.2, 0.4], [0.8, 0.9]]}, ], ) def test_check_phase_intervals(pars): assert_allclose( PhaseBackgroundMaker._check_intervals(pars["p_in"]), pars["p_out"], rtol=1e-5 ) def test_copy_interval(): on = (0.8, 1.2) off = [(0.1, 0.3), (0.4, 0.5)] PhaseBackgroundMaker(on_phase=on, off_phase=off) assert on == (0.8, 1.2) assert off == [(0.1, 0.3), (0.4, 0.5)] @requires_data() @pytest.mark.parametrize( "pars", [ { "p_in": ["observations_magic", PointSkyRegion(POS_CRAB)], "p_out": [ np.array([58, 19, 2, 2, 0, 0]), np.array([163, 46, 15, 1, 0, 0]), ], }, { "p_in": ["observations_magic", CircleSkyRegion(POS_CRAB, 0.1 * u.deg)], "p_out": [ np.array([23, 18, 2, 2, 0, 0]), np.array([51, 35, 15, 1, 0, 0]), ], }, { "p_in": ["observations_hess", CircleSkyRegion(POS_CRAB, 0.1 * u.deg)], "p_out": [ np.array([0, 1, 7, 4, 0, 0]), np.array([0, 9, 32, 12, 1, 0]), ], }, ], ) def test_make_counts(phase_bkg_maker, pars, request): maker = SpectrumDatasetMaker( containment_correction=False, selection=["counts", "exposure", "edisp"] ) e_reco = MapAxis.from_energy_bounds(0.05, 100, nbin=6, unit="TeV", name="energy") e_true = MapAxis.from_energy_bounds( 0.01, 300, nbin=15, unit="TeV", name="energy_true" ) obs = request.getfixturevalue(pars["p_in"][0]) table = obs.events.table table["PHASE"] = np.linspace(0, 1, len(table["TIME"])) obs._events = EventList(table) on_region = pars["p_in"][1] geom = RegionGeom.create(region=on_region, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=e_true) dataset = maker.run(dataset_empty, obs) dataset_on_off = phase_bkg_maker.run(dataset, obs) assert_allclose( [ np.squeeze(dataset_on_off.counts.data), np.squeeze(dataset_on_off.counts_off.data), ], pars["p_out"], ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/tests/test_reflected.py0000644000175100001770000003751214721316200024165 0ustar00runnerdocker# Licensed under 
a 3-clause BSD style license - see LICENSE.rst import logging import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import Angle, SkyCoord from regions import ( CircleSkyRegion, EllipseAnnulusSkyRegion, EllipseSkyRegion, PointSkyRegion, RectangleSkyRegion, ) from gammapy.data import DataStore from gammapy.datasets import SpectrumDataset from gammapy.makers import ( ReflectedRegionsBackgroundMaker, ReflectedRegionsFinder, SafeMaskMaker, SpectrumDatasetMaker, WobbleRegionsFinder, ) from gammapy.maps import Map, MapAxis, RegionGeom, WcsGeom from gammapy.utils.regions import compound_region_to_regions from gammapy.utils.testing import assert_quantity_allclose, requires_data @pytest.fixture(scope="session") def exclusion_mask(): """Example mask for testing.""" pos = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") exclusion_region = CircleSkyRegion(pos, Angle(0.3, "deg")) geom = WcsGeom.create(skydir=pos, binsz=0.02, width=10.0) return ~geom.region_mask([exclusion_region]) @pytest.fixture(scope="session") def on_region(): """Example on_region for testing.""" pos = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") radius = Angle(0.11, "deg") region = CircleSkyRegion(pos, radius) return region @pytest.fixture(scope="session") def observations(): """Example observation list for testing.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") obs_ids = [23523, 23526] return datastore.get_observations(obs_ids) @pytest.fixture(scope="session") def observations_fixed_rad_max(): """Example observation list for testing.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/joint-crab/dl3/magic/") obs_ids = [5029748] return datastore.get_observations(obs_ids, required_irf="point-like") @pytest.fixture() def reflected_bkg_maker(exclusion_mask): finder = ReflectedRegionsFinder() return ReflectedRegionsBackgroundMaker( region_finder=finder, exclusion_mask=exclusion_mask, ) region_finder_param = [ (SkyCoord(83.2, 22.5, unit="deg"), 15, Angle("82.592 deg"), 17, 17), (SkyCoord(84.2, 22.5, unit="deg"), 17, Angle("83.636 deg"), 19, 19), (SkyCoord(83.2, 21.5, unit="deg"), 15, Angle("83.672 deg"), 17, 17), ] @requires_data() @pytest.mark.parametrize( "pointing_pos, nreg1, reg3_ra, nreg2, nreg3", region_finder_param ) def test_find_reflected_regions( exclusion_mask, on_region, pointing_pos, nreg1, reg3_ra, nreg2, nreg3 ): pointing = pointing_pos finder = ReflectedRegionsFinder( min_distance_input="0 deg", ) regions, _ = finder.run( center=pointing, region=on_region, exclusion_mask=exclusion_mask, ) assert len(regions) == nreg1 assert_quantity_allclose(regions[3].center.icrs.ra, reg3_ra, rtol=1e-2) # Test without exclusion regions, _ = finder.run(center=pointing, region=on_region) assert len(regions) == nreg2 # Test with too small exclusion small_mask = exclusion_mask.cutout(pointing, Angle("0.1 deg")) regions, _ = finder.run( center=pointing, region=on_region, exclusion_mask=small_mask, ) assert len(regions) == nreg3 # Test with maximum number of regions finder.max_region_number = 5 regions, _ = finder.run( center=pointing, region=on_region, exclusion_mask=small_mask, ) assert len(regions) == 5 # Test with an other type of region on_ellipse_annulus = EllipseAnnulusSkyRegion( center=on_region.center.galactic, inner_width=0.1 * u.deg, outer_width=0.2 * u.deg, inner_height=0.3 * u.deg, outer_height=0.6 * u.deg, angle=130 * u.deg, ) regions, _ = finder.run( region=on_ellipse_annulus, center=pointing, exclusion_mask=small_mask, ) assert 
len(regions) == 5 center = SkyCoord(0.5, 0.0, unit="deg") other_region_finder_param = [ (RectangleSkyRegion(center, 0.5 * u.deg, 0.5 * u.deg, angle=0 * u.deg), 3), (RectangleSkyRegion(center, 0.5 * u.deg, 1 * u.deg, angle=0 * u.deg), 1), (RectangleSkyRegion(center, 0.5 * u.deg, 1 * u.deg, angle=90 * u.deg), 1), (EllipseSkyRegion(center, 0.1 * u.deg, 1 * u.deg, angle=0 * u.deg), 2), (EllipseSkyRegion(center, 0.1 * u.deg, 1 * u.deg, angle=60 * u.deg), 3), (EllipseSkyRegion(center, 0.1 * u.deg, 1 * u.deg, angle=90 * u.deg), 2), ] @pytest.mark.parametrize("region, nreg", other_region_finder_param) def test_non_circular_regions(region, nreg): pointing = SkyCoord(0.0, 0.0, unit="deg") finder = ReflectedRegionsFinder(min_distance_input="0 deg") regions, _ = finder.run(center=pointing, region=region) assert len(regions) == nreg def test_bad_on_region(exclusion_mask, on_region): pointing = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") finder = ReflectedRegionsFinder( min_distance_input="0 deg", ) regions, _ = finder.run( center=pointing, region=on_region, exclusion_mask=exclusion_mask, ) assert len(regions) == 0 @requires_data() def test_reflected_bkg_maker(on_region, reflected_bkg_maker, observations): datasets = [] e_reco = MapAxis.from_edges(np.logspace(0, 2, 5) * u.TeV, name="energy") e_true = MapAxis.from_edges(np.logspace(-0.5, 2, 11) * u.TeV, name="energy_true") geom = RegionGeom(region=on_region, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=e_true) maker = SpectrumDatasetMaker(selection=["counts"]) for obs in observations: dataset = maker.run(dataset_empty, obs) dataset_on_off = reflected_bkg_maker.run(dataset, obs) datasets.append(dataset_on_off) assert_allclose(datasets[0].counts_off.data.sum(), 76) assert_allclose(datasets[1].counts_off.data.sum(), 60) regions_0 = compound_region_to_regions(datasets[0].counts_off.geom.region) regions_1 = compound_region_to_regions(datasets[1].counts_off.geom.region) assert_allclose(len(regions_0), 11) assert_allclose(len(regions_1), 11) @requires_data() def test_reflected_bkg_maker_no_off(reflected_bkg_maker, observations, caplog): pos = SkyCoord(83.6333313, 21.51444435, unit="deg", frame="icrs") radius = Angle(0.11, "deg") region = CircleSkyRegion(pos, radius) maker = SpectrumDatasetMaker(selection=["counts", "exposure"]) datasets = [] e_reco = MapAxis.from_edges(np.logspace(0, 2, 5) * u.TeV, name="energy") e_true = MapAxis.from_edges(np.logspace(-0.5, 2, 11) * u.TeV, name="energy_true") geom = RegionGeom.create(region=region, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=e_true) safe_mask_masker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) for obs in observations: dataset = maker.run(dataset_empty, obs) dataset_on_off = reflected_bkg_maker.run(dataset, obs) dataset_on_off = safe_mask_masker.run(dataset_on_off, obs) datasets.append(dataset_on_off) assert datasets[0].counts_off is None assert_allclose(datasets[0].acceptance_off, 0) assert_allclose(datasets[0].mask_safe.data, False) assert "WARNING" in [record.levelname for record in caplog.records] message1 = ( f"ReflectedRegionsBackgroundMaker failed. " f"No OFF region found outside exclusion mask for dataset '{datasets[0].name}'." ) message2 = ( f"ReflectedRegionsBackgroundMaker failed. " f"Setting {datasets[0].name} mask to False." 
) assert message1 in [record.message for record in caplog.records] assert message2 in [record.message for record in caplog.records] @requires_data() def test_reflected_bkg_maker_no_off_background(reflected_bkg_maker, observations): pos = SkyCoord(83.6333313, 21.51444435, unit="deg", frame="icrs") radius = Angle(0.11, "deg") region = CircleSkyRegion(pos, radius) maker = SpectrumDatasetMaker(selection=["counts", "background"]) datasets = [] e_reco = MapAxis.from_edges(np.logspace(0, 2, 5) * u.TeV, name="energy") e_true = MapAxis.from_edges(np.logspace(-0.5, 2, 11) * u.TeV, name="energy_true") geom = RegionGeom.create(region=region, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=e_true) for obs in observations: dataset = maker.run(dataset_empty, obs) dataset_on_off = reflected_bkg_maker.run(dataset, obs) datasets.append(dataset_on_off) assert_allclose(datasets[0].counts_off.data, 0) assert_allclose(datasets[0].acceptance_off, 0) def test_wobble_regions_finder(): center = SkyCoord(83.6333313, 21.51444435, unit="deg", frame="icrs") source_angle = 35 * u.deg on_region = CircleSkyRegion( center=center, radius=0.15 * u.deg, ) pointing = on_region.center.directional_offset_by( separation=0.4 * u.deg, position_angle=180 * u.deg + source_angle, ) assert u.isclose( pointing.position_angle(on_region.center).to(u.deg), source_angle, rtol=0.05 ) n_off_regions = 3 finder = WobbleRegionsFinder(n_off_regions) regions, _ = finder.run(on_region, pointing) assert len(regions) == 3 for i, off_region in enumerate(regions, start=1): assert u.isclose(pointing.separation(off_region.center), 0.4 * u.deg) expected = source_angle + 360 * u.deg * i / (n_off_regions + 1) assert u.isclose( pointing.position_angle(off_region.center).to(u.deg), expected.to(u.deg), rtol=0.001, ) # test with exclusion region pos = pointing.directional_offset_by( separation=0.4 * u.deg, position_angle=source_angle + 90 * u.deg, ) exclusion_region = CircleSkyRegion(pos, Angle(0.2, "deg")) geom = WcsGeom.create(skydir=pos, binsz=0.01, width=10.0) exclusion_mask = ~geom.region_mask([exclusion_region]) finder = WobbleRegionsFinder(n_off_regions) regions, _ = finder.run(on_region, pointing, exclusion_mask) assert len(regions) == 2 def test_wobble_regions_finder_overlapping(caplog): """Test that overlapping regions are not produced""" center = SkyCoord(83.6333313, 21.51444435, unit="deg", frame="icrs") source_angle = 35 * u.deg on_region = CircleSkyRegion( center=center, radius=0.5 * u.deg, ) pointing = on_region.center.directional_offset_by( separation=0.4 * u.deg, position_angle=180 * u.deg + source_angle, ) n_off_regions = 3 finder = WobbleRegionsFinder(n_off_regions) with caplog.at_level(logging.WARNING): regions, _ = finder.run(on_region, pointing) assert len(regions) == 0 assert caplog.record_tuples == [ ( "gammapy.makers.background.reflected", logging.WARNING, "Found overlapping off regions, returning no regions", ) ] @requires_data() def test_reflected_bkg_maker_with_wobble_finder( on_region, observations, exclusion_mask ): datasets = [] reflected_bkg_maker = ReflectedRegionsBackgroundMaker( region_finder=WobbleRegionsFinder(n_off_regions=3), exclusion_mask=exclusion_mask, ) e_reco = MapAxis.from_edges(np.logspace(0, 2, 5) * u.TeV, name="energy") geom = RegionGeom(region=on_region, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom) spectrum_dataset_maker = SpectrumDatasetMaker(selection=["counts"]) for obs in observations: dataset = spectrum_dataset_maker.run(dataset_empty, obs) 
dataset_on_off = reflected_bkg_maker.run(dataset, obs) datasets.append(dataset_on_off) regions_0 = compound_region_to_regions(datasets[0].counts_off.geom.region) regions_1 = compound_region_to_regions(datasets[1].counts_off.geom.region) assert_allclose(len(regions_0), 3) assert_allclose(len(regions_1), 3) @requires_data() def test_reflected_bkg_maker_fixed_rad_max( reflected_bkg_maker, observations_fixed_rad_max ): e_reco = MapAxis.from_energy_bounds(0.1, 10, 5, unit="TeV") e_true = MapAxis.from_energy_bounds(0.1, 10, 5, unit="TeV", name="energy_true") pos = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") radius = Angle(0.1414, "deg") region = CircleSkyRegion(pos, radius) geom = RegionGeom(region=region, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=e_true) maker = SpectrumDatasetMaker(selection=["counts"]) obs = observations_fixed_rad_max[0] dataset = maker.run(dataset_empty, obs) dataset_on_off = reflected_bkg_maker.run(dataset, obs) assert_allclose(dataset_on_off.counts_off.data.sum(), 217) regions_0 = compound_region_to_regions(dataset_on_off.counts_off.geom.region) assert_allclose(len(regions_0), 6) @requires_data() def test_reflected_bkg_maker_fixed_rad_max_wobble( exclusion_mask, observations_fixed_rad_max ): reflected_bkg_maker = ReflectedRegionsBackgroundMaker( region_finder=WobbleRegionsFinder(n_off_regions=3), exclusion_mask=exclusion_mask, ) e_reco = MapAxis.from_energy_bounds(0.1, 10, 5, unit="TeV") e_true = MapAxis.from_energy_bounds(0.1, 10, 5, unit="TeV", name="energy_true") pos = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") radius = Angle(0.1414, "deg") region = CircleSkyRegion(pos, radius) geom = RegionGeom(region=region, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=e_true) maker = SpectrumDatasetMaker(selection=["counts"]) obs = observations_fixed_rad_max[0] dataset = maker.run(dataset_empty, obs) dataset_on_off = reflected_bkg_maker.run(dataset, obs) assert_allclose(dataset_on_off.counts_off.data.sum(), 102) regions_0 = compound_region_to_regions(dataset_on_off.counts_off.geom.region) assert_allclose(len(regions_0), 3) @requires_data() def test_reflected_bkg_maker_fixed_rad_max_bad( reflected_bkg_maker, observations_fixed_rad_max ): e_reco = MapAxis.from_energy_bounds(0.1, 10, 5, unit="TeV") pos = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") bad_radius = Angle(0.11, "deg") region_bad_size = CircleSkyRegion(pos, bad_radius) maker = SpectrumDatasetMaker(selection=["counts"]) obs = observations_fixed_rad_max[0] geom_bad_size = RegionGeom(region=region_bad_size, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom_bad_size) dataset = maker.run(dataset_empty, obs) with pytest.raises(ValueError): reflected_bkg_maker.run(dataset, obs) region_bad_shape = RectangleSkyRegion(pos, 0.2 * u.deg, 0.2 * u.deg) geom_bad_shape = RegionGeom(region_bad_shape, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom_bad_shape) dataset = maker.run(dataset_empty, obs) with pytest.raises(ValueError): reflected_bkg_maker.run(dataset, obs) region_point_shape = PointSkyRegion(pos) geom_point_shape = RegionGeom(region_point_shape, axes=[e_reco]) dataset_empty = SpectrumDataset.create(geom=geom_point_shape) dataset = maker.run(dataset_empty, obs) with pytest.raises(TypeError): reflected_bkg_maker.run(dataset, obs) def test_reflected_bkg_exclsion_error(exclusion_mask): energy = MapAxis.from_energy_bounds(1 * u.TeV, 2 * u.TeV, nbin=1, name="energy") exclusion_cube = exclusion_mask.to_cube([energy]) 
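# Illustrative check (an added sketch, not part of the original test): a cube
# with a single-bin non-spatial axis is still "flat", which is why the maker
# below accepts it and squashes it back to a purely spatial mask internally.
assert exclusion_cube.geom.is_flat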
ReflectedRegionsBackgroundMaker(exclusion_mask=exclusion_cube) energy = MapAxis.from_energy_bounds(2 * u.TeV, 3 * u.TeV, nbin=1, name="energy") exclusion_cube_2 = exclusion_mask.to_cube([energy]) exclusion_stack = Map.from_stack([exclusion_cube, exclusion_cube_2]) with pytest.raises(ValueError): ReflectedRegionsBackgroundMaker(exclusion_mask=exclusion_stack) exclusion_mask.data = exclusion_mask.data.astype("int") with pytest.raises(ValueError): ReflectedRegionsBackgroundMaker(exclusion_mask=exclusion_mask) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/background/tests/test_ring.py0000644000175100001770000001153314721316200023162 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose from astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion from gammapy.data import DataStore from gammapy.datasets import MapDataset from gammapy.makers import ( AdaptiveRingBackgroundMaker, MapDatasetMaker, RingBackgroundMaker, SafeMaskMaker, ) from gammapy.maps import MapAxis, WcsGeom from gammapy.utils.testing import requires_data @pytest.fixture(scope="session") def observations(): """Example observation list for testing.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs_ids = [23523, 23526] return datastore.get_observations(obs_ids) @pytest.fixture(scope="session") def geom(): energy_axis = MapAxis.from_edges([1, 10], unit="TeV", name="energy", interp="log") return WcsGeom.create( skydir=SkyCoord(83.633, 22.014, unit="deg"), binsz=0.02, width=(10, 10), frame="galactic", proj="CAR", axes=[energy_axis], ) @pytest.fixture(scope="session") def exclusion_mask(geom): """Example mask for testing.""" pos = SkyCoord(83.633, 22.014, unit="deg", frame="icrs") region = CircleSkyRegion(pos, Angle(0.15, "deg")) return ~geom.region_mask([region]) @requires_data() def test_ring_bkg_maker(geom, observations, exclusion_mask): ring_bkg_maker = RingBackgroundMaker( r_in="0.2 deg", width="0.3 deg", exclusion_mask=exclusion_mask ) safe_mask_maker = SafeMaskMaker(methods=["offset-max"], offset_max="2 deg") map_dataset_maker = MapDatasetMaker(selection=["counts", "background", "exposure"]) reference = MapDataset.create(geom) datasets = [] for obs in observations: cutout = reference.cutout(obs.get_pointing_icrs(obs.tmid), width="4 deg") dataset = map_dataset_maker.run(cutout, obs) dataset = safe_mask_maker.run(dataset, obs) dataset = dataset.to_image() dataset_on_off = ring_bkg_maker.run(dataset) datasets.append(dataset_on_off) mask = dataset.mask_safe assert_allclose(datasets[0].counts_off.data[mask].sum(), 2511333) assert_allclose(datasets[1].counts_off.data[mask].sum(), 2143577.0) assert_allclose(datasets[0].acceptance_off.data[mask].sum(), 2961300, rtol=1e-5) assert_allclose(datasets[1].acceptance_off.data[mask].sum(), 2364657.2, rtol=1e-5) assert_allclose(datasets[0].alpha.data[0][100][100], 0.00063745599, rtol=1e-5) assert_allclose( datasets[0].exposure.data[0][100][100], 806254444.8480084, rtol=1e-5 ) @pytest.mark.parametrize( "pars", [ { "obs_idx": 0, "method": "fixed_r_in", "counts_off": 2511417.0, "acceptance_off": 2960679.594648, "alpha": 0.000637456020, "exposure": 806254444.8480084, }, { "obs_idx": 0, "method": "fixed_width", "counts_off": 2511417.0, "acceptance_off": 2960679.594648, "alpha": 0.000637456020, "exposure": 806254444.8480084, }, { "obs_idx": 1, "method": "fixed_r_in", "counts_off": 2143577.0, 
"acceptance_off": 2364657.352647, "alpha": 0.00061841976, "exposure": 779613265.2688407, }, { "obs_idx": 1, "method": "fixed_width", "counts_off": 2143577.0, "acceptance_off": 2364657.352647, "alpha": 0.00061841976, "exposure": 779613265.2688407, }, ], ) @requires_data() def test_adaptive_ring_bkg_maker(pars, geom, observations, exclusion_mask): adaptive_ring_bkg_maker = AdaptiveRingBackgroundMaker( r_in="0.2 deg", width="0.3 deg", r_out_max="2 deg", stepsize="0.2 deg", exclusion_mask=exclusion_mask, method=pars["method"], ) safe_mask_maker = SafeMaskMaker(methods=["offset-max"], offset_max="2 deg") map_dataset_maker = MapDatasetMaker(selection=["counts", "background", "exposure"]) obs = observations[pars["obs_idx"]] dataset = MapDataset.create(geom).cutout( obs.get_pointing_icrs(obs.tmid), width="4 deg" ) dataset = map_dataset_maker.run(dataset, obs) dataset = safe_mask_maker.run(dataset, obs) dataset = dataset.to_image() dataset_on_off = adaptive_ring_bkg_maker.run(dataset) mask = dataset.mask_safe assert_allclose(dataset_on_off.counts_off.data[mask].sum(), pars["counts_off"]) assert_allclose( dataset_on_off.acceptance_off.data[mask].sum(), pars["acceptance_off"], rtol=1e-5, ) assert_allclose(dataset_on_off.alpha.data[0][100][100], pars["alpha"], rtol=1e-5) assert_allclose( dataset_on_off.exposure.data[0][100][100], pars["exposure"], rtol=1e-5 ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/core.py0000644000175100001770000000165214721316200016634 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import html import numpy as np __all__ = ["Maker"] class Maker(abc.ABC): """Abstract maker base class.""" @property @abc.abstractmethod def tag(self): pass @abc.abstractmethod def run(self): pass def __str__(self): s = f"{self.__class__.__name__}\n" s += "-" * (len(s) - 1) + "\n\n" names = self.__init__.__code__.co_varnames max_len = np.max([len(_) for _ in names]) + 1 for name in names: value = getattr(self, name, None) if value is None: continue else: s += f"\t{name:{max_len}s}: {value}\n" return s.expandtabs(tabsize=2) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/map.py0000644000175100001770000003257314721316200016467 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import astropy.units as u from astropy.table import Table from regions import PointSkyRegion from gammapy.datasets import MapDatasetMetaData from gammapy.irf import EDispKernelMap, PSFMap, RecoPSFMap from gammapy.maps import Map from .core import Maker from .utils import ( make_counts_rad_max, make_edisp_kernel_map, make_edisp_map, make_map_background_irf, make_map_exposure_true_energy, make_psf_map, ) __all__ = ["MapDatasetMaker"] log = logging.getLogger(__name__) class MapDatasetMaker(Maker): """Make binned maps for a single IACT observation. Parameters ---------- selection : list of str, optional Select which maps to make, the available options are: 'counts', 'exposure', 'background', 'psf', 'edisp'. By default, all maps are made. background_oversampling : int Background evaluation oversampling factor in energy. background_interp_missing_data : bool, optional Interpolate missing values in background 3d map. Default is True, have to be set to True for CTAO IRF. background_pad_offset : bool, optional Pad one bin in offset for 2d background map. This avoids extrapolation at edges and use the nearest value. Default is True. Examples -------- This example shows how to run the MapMaker for a single observation. >>> from gammapy.data import DataStore >>> from gammapy.datasets import MapDataset >>> from gammapy.maps import WcsGeom, MapAxis >>> from gammapy.makers import MapDatasetMaker >>> # Load an observation >>> data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") >>> obs = data_store.obs(23523) >>> # Prepare the geometry >>> energy_axis = MapAxis.from_energy_bounds(1.0, 10.0, 4, unit="TeV") >>> energy_axis_true = MapAxis.from_energy_bounds( 0.5, 20, 10, unit="TeV", name="energy_true") >>> geom = WcsGeom.create( ... skydir=(83.633, 22.014), ... binsz=0.02, ... width=(2, 2), ... frame="icrs", ... proj="CAR", ... axes=[energy_axis], ... 
) >>> # Run the maker >>> empty = MapDataset.create(geom=geom, energy_axis_true=energy_axis_true, name="empty") >>> maker = MapDatasetMaker() >>> dataset = maker.run(empty, obs) >>> print(dataset) MapDataset ---------- Name : empty Total counts : 787 Total background counts : 684.52 Total excess counts : 102.48 Predicted counts : 684.52 Predicted background counts : 684.52 Predicted excess counts : nan Exposure min : 7.01e+07 m2 s Exposure max : 1.10e+09 m2 s Number of total bins : 40000 Number of fit bins : 40000 Fit statistic type : cash Fit statistic value (-2 log(L)) : nan Number of models : 0 Number of parameters : 0 Number of free parameters : 0 """ tag = "MapDatasetMaker" available_selection = ["counts", "exposure", "background", "psf", "edisp"] def __init__( self, selection=None, background_oversampling=None, background_interp_missing_data=True, background_pad_offset=True, ): self.background_oversampling = background_oversampling self.background_interp_missing_data = background_interp_missing_data self.background_pad_offset = background_pad_offset if selection is None: selection = self.available_selection selection = set(selection) if not selection.issubset(self.available_selection): difference = selection.difference(self.available_selection) raise ValueError(f"{difference} is not a valid method.") self.selection = selection @staticmethod def make_counts(geom, observation): """Make counts map. Parameters ---------- geom : `~gammapy.maps.Geom` Reference map geometry. observation : `~gammapy.data.Observation` Observation container. Returns ------- counts : `~gammapy.maps.Map` Counts map. """ if geom.is_region and isinstance(geom.region, PointSkyRegion): counts = make_counts_rad_max(geom, observation.rad_max, observation.events) else: counts = Map.from_geom(geom) counts.fill_events(observation.events) return counts @staticmethod def make_exposure(geom, observation, use_region_center=True): """Make exposure map. Parameters ---------- geom : `~gammapy.maps.Geom` Reference map geometry. observation : `~gammapy.data.Observation` Observation container. use_region_center : bool, optional For geom as a `~gammapy.maps.RegionGeom`. If True, consider the values at the region center. If False, average over the whole region. Default is True. Returns ------- exposure : `~gammapy.maps.Map` Exposure map. """ if isinstance(observation.aeff, Map): return observation.aeff.interp_to_geom( geom=geom, ) return make_map_exposure_true_energy( pointing=observation.get_pointing_icrs(observation.tmid), livetime=observation.observation_live_time_duration, aeff=observation.aeff, geom=geom, use_region_center=use_region_center, ) @staticmethod def make_exposure_irf(geom, observation, use_region_center=True): """Make exposure map with IRF geometry. Parameters ---------- geom : `~gammapy.maps.Geom` Reference geometry. observation : `~gammapy.data.Observation` Observation container. use_region_center : bool, optional For geom as a `~gammapy.maps.RegionGeom`. If True, consider the values at the region center. If False, average over the whole region. Default is True. Returns ------- exposure : `~gammapy.maps.Map` Exposure map. """ return make_map_exposure_true_energy( pointing=observation.get_pointing_icrs(observation.tmid), livetime=observation.observation_live_time_duration, aeff=observation.aeff, geom=geom, use_region_center=use_region_center, ) def make_background(self, geom, observation): """Make background map. Parameters ---------- geom : `~gammapy.maps.Geom` Reference geometry. 
observation : `~gammapy.data.Observation` Observation container. Returns ------- background : `~gammapy.maps.Map` Background map. """ bkg = observation.bkg if isinstance(bkg, Map): return bkg.interp_to_geom(geom=geom, preserve_counts=True) use_region_center = getattr(self, "use_region_center", True) if self.background_interp_missing_data: bkg.interp_missing_data(axis_name="energy") if self.background_pad_offset and bkg.has_offset_axis: bkg = bkg.pad(1, mode="edge", axis_name="offset") return make_map_background_irf( pointing=observation.pointing, ontime=observation.observation_time_duration, bkg=bkg, geom=geom, oversampling=self.background_oversampling, use_region_center=use_region_center, obstime=observation.tmid, ) def make_edisp(self, geom, observation): """Make energy dispersion map. Parameters ---------- geom : `~gammapy.maps.Geom` Reference geometry. observation : `~gammapy.data.Observation` Observation container. Returns ------- edisp : `~gammapy.irf.EDispMap` Energy dispersion map. """ exposure = self.make_exposure_irf(geom.squash(axis_name="migra"), observation) use_region_center = getattr(self, "use_region_center", True) return make_edisp_map( edisp=observation.edisp, pointing=observation.get_pointing_icrs(observation.tmid), geom=geom, exposure_map=exposure, use_region_center=use_region_center, ) def make_edisp_kernel(self, geom, observation): """Make energy dispersion kernel map. Parameters ---------- geom : `~gammapy.maps.Geom` Reference geometry. Must contain "energy" and "energy_true" axes in that order. observation : `~gammapy.data.Observation` Observation container. Returns ------- edisp : `~gammapy.irf.EDispKernelMap` Energy dispersion kernel map. """ if isinstance(observation.edisp, EDispKernelMap): exposure = None interp_map = observation.edisp.edisp_map.interp_to_geom(geom) return EDispKernelMap(edisp_kernel_map=interp_map, exposure_map=exposure) exposure = self.make_exposure_irf(geom.squash(axis_name="energy"), observation) use_region_center = getattr(self, "use_region_center", True) return make_edisp_kernel_map( edisp=observation.edisp, pointing=observation.get_pointing_icrs(observation.tmid), geom=geom, exposure_map=exposure, use_region_center=use_region_center, ) def make_psf(self, geom, observation): """Make PSF map. Parameters ---------- geom : `~gammapy.maps.Geom` Reference geometry. observation : `~gammapy.data.Observation` Observation container. Returns ------- psf : `~gammapy.irf.PSFMap` PSF map. """ psf = observation.psf if isinstance(psf, RecoPSFMap): return RecoPSFMap(psf.psf_map.interp_to_geom(geom)) elif isinstance(psf, PSFMap): return PSFMap(psf.psf_map.interp_to_geom(geom)) exposure = self.make_exposure_irf(geom.squash(axis_name="rad"), observation) return make_psf_map( psf=psf, pointing=observation.get_pointing_icrs(observation.tmid), geom=geom, exposure_map=exposure, ) @staticmethod def make_meta_table(observation): """Make information meta table. Parameters ---------- observation : `~gammapy.data.Observation` Observation. Returns ------- meta_table : `~astropy.table.Table` Meta table. 
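Examples
--------
Sketch of the expected usage; ``obs`` is assumed to be an existing
`~gammapy.data.Observation`, so the calls are skipped here.

>>> meta_table = MapDatasetMaker.make_meta_table(obs)  # doctest: +SKIP
>>> meta_table["OBS_ID"]  # doctest: +SKIP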
""" row = {} row["TELESCOP"] = observation.aeff.meta.get("TELESCOP", "Unknown") row["OBS_ID"] = observation.obs_id row.update(observation.pointing.to_fits_header()) meta_table = Table([row]) if "ALT_PNT" in meta_table.colnames: meta_table["ALT_PNT"].unit = u.deg meta_table["AZ_PNT"].unit = u.deg if "RA_PNT" in meta_table.colnames: meta_table["RA_PNT"].unit = u.deg meta_table["DEC_PNT"].unit = u.deg return meta_table @staticmethod def _make_metadata(table): return MapDatasetMetaData._from_meta_table(table) def run(self, dataset, observation): """Make map dataset. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` Reference dataset. observation : `~gammapy.data.Observation` Observation. Returns ------- dataset : `~gammapy.datasets.MapDataset` Map dataset. """ kwargs = {"gti": observation.gti} kwargs["meta_table"] = self.make_meta_table(observation) kwargs["meta"] = self._make_metadata(kwargs["meta_table"]) mask_safe = Map.from_geom(dataset.counts.geom, dtype=bool) mask_safe.data[...] = True kwargs["mask_safe"] = mask_safe if "counts" in self.selection: counts = self.make_counts(dataset.counts.geom, observation) else: counts = Map.from_geom(dataset.counts.geom, data=0) kwargs["counts"] = counts if "exposure" in self.selection: exposure = self.make_exposure(dataset.exposure.geom, observation) kwargs["exposure"] = exposure if "background" in self.selection: kwargs["background"] = self.make_background( dataset.counts.geom, observation ) if "psf" in self.selection: psf = self.make_psf(dataset.psf.psf_map.geom, observation) kwargs["psf"] = psf if "edisp" in self.selection: if dataset.edisp.edisp_map.geom.axes[0].name.upper() == "MIGRA": edisp = self.make_edisp(dataset.edisp.edisp_map.geom, observation) else: edisp = self.make_edisp_kernel( dataset.edisp.edisp_map.geom, observation ) kwargs["edisp"] = edisp return dataset.__class__(name=dataset.name, **kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/reduce.py0000644000175100001770000001424414721316200017154 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from astropy.coordinates import Angle import gammapy.utils.parallel as parallel from gammapy.datasets import Datasets, MapDataset, MapDatasetOnOff, SpectrumDataset from .core import Maker from .safe import SafeMaskMaker log = logging.getLogger(__name__) __all__ = [ "DatasetsMaker", ] class DatasetsMaker(Maker, parallel.ParallelMixin): """Run makers in a chain. Parameters ---------- makers : list of `Maker` objects Makers. stack_datasets : bool, optional If True, stack into the reference dataset (see `run` method arguments). Default is True. n_jobs : int, optional Number of processes to run in parallel. Default is one, unless `~gammapy.utils.parallel.N_JOBS_DEFAULT` was modified. cutout_mode : {'trim', 'partial', 'strict'} Used only to cutout the reference `MapDataset` around each processed observation. Mode is an option for Cutout2D, for details see `~astropy.nddata.utils.Cutout2D`. Default is "trim". cutout_width : tuple of `~astropy.coordinates.Angle`, optional Angular sizes of the region in (lon, lat) in that specific order. If only one value is passed, a square region is extracted. If None it returns an error, except if the list of makers includes a `SafeMaskMaker` with the offset-max method defined. In that case it is set to two times `offset_max`. Default is None. 
    parallel_backend : {'multiprocessing', 'ray'}, optional
        Which backend to use for multiprocessing. Default is None.
    """

    tag = "DatasetsMaker"

    def __init__(
        self,
        makers,
        stack_datasets=True,
        n_jobs=None,
        cutout_mode="trim",
        cutout_width=None,
        parallel_backend=None,
    ):
        self.log = logging.getLogger(__name__)
        self.makers = makers
        self.cutout_mode = cutout_mode
        if cutout_width is not None:
            cutout_width = Angle(cutout_width)
        self.cutout_width = cutout_width
        self._apply_cutout = True
        if self.cutout_width is None:
            if self.offset_max is None:
                self._apply_cutout = False
            else:
                self.cutout_width = 2 * self.offset_max
        self.n_jobs = n_jobs
        self.parallel_backend = parallel_backend
        self.stack_datasets = stack_datasets
        self._datasets = []
        self._error = False

    @property
    def offset_max(self):
        maker = self.safe_mask_maker
        if maker is not None and hasattr(maker, "offset_max"):
            return maker.offset_max

    @property
    def safe_mask_maker(self):
        for m in self.makers:
            if isinstance(m, SafeMaskMaker):
                return m

    def make_dataset(self, dataset, observation):
        """Make single dataset.

        Parameters
        ----------
        dataset : `~gammapy.datasets.MapDataset`
            Reference dataset.
        observation : `Observation`
            Observation.
        """
        if self._apply_cutout:
            cutouts_kwargs = {
                "position": observation.get_pointing_icrs(observation.tmid).galactic,
                "width": self.cutout_width,
                "mode": self.cutout_mode,
            }
            dataset_obs = dataset.cutout(
                **cutouts_kwargs,
            )
        else:
            dataset_obs = dataset.copy()
        if dataset.models is not None:
            models = dataset.models.copy()
            models.reassign(dataset.name, dataset_obs.name)
            dataset_obs.models = models

        log.info(f"Computing dataset for observation {observation.obs_id}")
        for maker in self.makers:
            log.info(f"Running {maker.tag}")
            dataset_obs = maker.run(dataset=dataset_obs, observation=observation)
        return dataset_obs

    def callback(self, dataset):
        if self.stack_datasets:
            if type(self._dataset) is MapDataset and type(dataset) is MapDatasetOnOff:
                dataset = dataset.to_map_dataset(name=dataset.name)
            self._dataset.stack(dataset)
        else:
            self._datasets.append(dataset)

    def error_callback(self, dataset):
        # a parallel run can fail with a non-explicit memory error
        self._error = True

    def run(self, dataset, observations, datasets=None):
        """Run data reduction.

        Parameters
        ----------
        dataset : `~gammapy.datasets.MapDataset`
            Reference dataset (used only for stacking if datasets are provided).
        observations : `Observations`
            Observations.
        datasets : `~gammapy.datasets.Datasets`
            Base datasets; if provided, its length must match the number of observations.

        Returns
        -------
        datasets : `~gammapy.datasets.Datasets`
            Datasets.
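
        Examples
        --------
        A minimal sketch of a reduction loop over two observations; the data
        path, observation IDs, and geometry are placeholders modelled on the
        test files in this distribution::

            from astropy import units as u
            from astropy.coordinates import SkyCoord
            from gammapy.data import DataStore
            from gammapy.datasets import MapDataset
            from gammapy.makers import DatasetsMaker, MapDatasetMaker, SafeMaskMaker
            from gammapy.maps import MapAxis, WcsGeom

            data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/")
            observations = data_store.get_observations([23523, 23526])

            # reference geometry and empty dataset to be filled per observation
            energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=5)
            geom = WcsGeom.create(
                skydir=SkyCoord("83.63 deg", "22.01 deg"),
                width=4 * u.deg,
                binsz=0.05,
                axes=[energy_axis],
            )
            reference = MapDataset.create(geom=geom)

            makers = [
                MapDatasetMaker(),
                SafeMaskMaker(methods=["offset-max"], offset_max="2 deg"),
            ]
            datasets_maker = DatasetsMaker(makers, stack_datasets=True, cutout_width="5 deg")
            datasets = datasets_maker.run(reference, observations)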
""" if isinstance(dataset, MapDataset): # also valid for Spectrum as it inherits from MapDataset self._dataset = dataset else: raise TypeError("Invalid reference dataset.") if isinstance(dataset, SpectrumDataset): self._apply_cutout = False if datasets is not None: self._apply_cutout = False else: datasets = len(observations) * [dataset] n_jobs = min(self.n_jobs, len(observations)) parallel.run_multiprocessing( self.make_dataset, zip(datasets, observations), backend=self.parallel_backend, pool_kwargs=dict(processes=n_jobs), method="apply_async", method_kwargs=dict( callback=self.callback, error_callback=self.error_callback, ), task_name="Data reduction", ) if self._error: raise RuntimeError("Execution of a sub-process failed") if self.stack_datasets: return Datasets([self._dataset]) lookup = { d.meta_table["OBS_ID"][0]: idx for idx, d in enumerate(self._datasets) } return Datasets([self._datasets[lookup[obs.obs_id]] for obs in observations]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/safe.py0000644000175100001770000003235514721316200016626 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np from astropy import units as u from astropy.coordinates import Angle from gammapy.irf import EDispKernelMap from gammapy.maps import Map from gammapy.modeling.models import TemplateSpectralModel from .core import Maker __all__ = ["SafeMaskMaker"] log = logging.getLogger(__name__) class SafeMaskMaker(Maker): """Make safe data range mask for a given observation. For more information see :ref:`safe-data-range`. .. warning:: Currently, some methods computing a safe energy range ("aeff-default", "aeff-max" and "edisp-bias") determine a true energy range and apply it to reconstructed energy, effectively neglecting the energy dispersion. Parameters ---------- methods : {"aeff-default", "aeff-max", "edisp-bias", "offset-max", "bkg-peak"} Method to use for the safe energy range. Can be a list with a combination of those. Resulting masks are combined with logical `and`. "aeff-default" uses the energy ranged specified in the DL3 data files, if available. aeff_percent : float Percentage of the maximal effective area to be used as lower energy threshold for method "aeff-max". bias_percent : float Percentage of the energy bias to be used as lower energy threshold for method "edisp-bias". position : `~astropy.coordinates.SkyCoord` Position at which the `aeff_percent` or `bias_percent` are computed. fixed_offset : `~astropy.coordinates.Angle` Offset, calculated from the pointing position, at which the `aeff_percent` or `bias_percent` are computed. If neither the position nor fixed_offset is specified, it uses the position of the center of the map by default. offset_max : str or `~astropy.units.Quantity` Maximum offset cut. irfs : {"DL4", "DL3"} Whether to use reprojected ("DL4") or raw ("DL3") irfs. Default is "DL4". 
""" tag = "SafeMaskMaker" available_methods = { "aeff-default", "aeff-max", "edisp-bias", "offset-max", "bkg-peak", } def __init__( self, methods=["aeff-default"], aeff_percent=10, bias_percent=10, position=None, fixed_offset=None, offset_max="3 deg", irfs="DL4", ): methods = set(methods) if not methods.issubset(self.available_methods): difference = methods.difference(self.available_methods) raise ValueError(f"{difference} is not a valid method.") self.methods = methods self.aeff_percent = aeff_percent self.bias_percent = bias_percent self.position = position self.fixed_offset = fixed_offset self.offset_max = Angle(offset_max) if self.position and self.fixed_offset: raise ValueError( "`position` and `fixed_offset` attributes are mutually exclusive" ) if irfs not in ["DL3", "DL4"]: ValueError( "Invalid option for irfs: expected 'DL3' or 'DL4', got {irfs} instead." ) self.irfs = irfs def make_mask_offset_max(self, dataset, observation): """Make maximum offset mask. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset` Dataset to compute mask for. observation : `~gammapy.data.Observation` Observation to compute mask for. Returns ------- mask_safe : `~numpy.ndarray` Maximum offset mask. """ if observation is None: raise ValueError("Method 'offset-max' requires an observation object.") separation = dataset._geom.separation( observation.get_pointing_icrs(observation.tmid) ) return separation < self.offset_max @staticmethod def make_mask_energy_aeff_default(dataset, observation): """Make safe energy mask from aeff default. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset` Dataset to compute mask for. observation : `~gammapy.data.Observation` Observation to compute mask for. Returns ------- mask_safe : `~numpy.ndarray` Safe data range mask. """ if observation is None: raise ValueError("Method 'aeff-default' requires an observation object.") energy_max = observation.aeff.meta.get("HI_THRES", None) if energy_max: energy_max = energy_max * u.TeV else: log.warning( f"No default upper safe energy threshold defined for obs {observation.obs_id}" ) energy_min = observation.aeff.meta.get("LO_THRES", None) if energy_min: energy_min = energy_min * u.TeV else: log.warning( f"No default lower safe energy threshold defined for obs {observation.obs_id}" ) return dataset._geom.energy_mask(energy_min=energy_min, energy_max=energy_max) def _get_offset(self, observation): offset = self.fixed_offset if offset is None: if self.position: offset = observation.get_pointing_icrs(observation.tmid).separation( self.position ) else: offset = 0.0 * u.deg return offset def _get_position(self, observation, geom): if self.fixed_offset is not None and observation is not None: pointing = observation.get_pointing_icrs(observation.tmid) return pointing.directional_offset_by( position_angle=0 * u.deg, separation=self.fixed_offset ) elif self.position is not None: return self.position else: return geom.center_skydir def make_mask_energy_aeff_max(self, dataset, observation=None): """Make safe energy mask from effective area maximum value. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset` Dataset to compute mask for. observation : `~gammapy.data.Observation` Observation to compute mask for. It is a mandatory argument when fixed_offset is set. Returns ------- mask_safe : `~numpy.ndarray` Safe data range mask. 
""" if self.fixed_offset is not None and observation is None: raise ValueError( f"{observation} argument is mandatory with {self.fixed_offset}" ) geom, exposure = dataset._geom, dataset.exposure if self.irfs == "DL3": offset = self._get_offset(observation) values = observation.aeff.evaluate( offset=offset, energy_true=observation.aeff.axes["energy_true"].edges ) valid = observation.aeff.axes["energy_true"].edges[ values > self.aeff_percent * np.max(values) / 100 ] energy_min = np.min(valid) else: position = self._get_position(observation, geom) aeff = exposure.get_spectrum(position) / exposure.meta["livetime"] if not np.any(aeff.data > 0.0): log.warning( f"Effective area is all zero at [{position.to_string('dms')}]. " f"No safe energy band can be defined for the dataset '{dataset.name}': " "setting `mask_safe` to all False." ) return Map.from_geom(geom, data=False, dtype="bool") model = TemplateSpectralModel.from_region_map(aeff) energy_true = model.energy energy_min = energy_true[np.where(model.values > 0)[0][0]] energy_max = energy_true[-1] aeff_thres = (self.aeff_percent / 100) * aeff.quantity.max() inversion = model.inverse( aeff_thres, energy_min=energy_min, energy_max=energy_max ) if not np.isnan(inversion[0]): energy_min = inversion[0] return geom.energy_mask(energy_min=energy_min) def make_mask_energy_edisp_bias(self, dataset, observation=None): """Make safe energy mask from energy dispersion bias. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset` Dataset to compute mask for. observation : `~gammapy.data.Observation` Observation to compute mask for. It is a mandatory argument when fixed_offset is set. Returns ------- mask_safe : `~numpy.ndarray` Safe data range mask. """ if self.fixed_offset is not None and observation is None: raise ValueError( f"{observation} argument is mandatory with {self.fixed_offset}" ) edisp, geom = dataset.edisp, dataset._geom if self.irfs == "DL3": offset = self._get_offset(observation) edisp = observation.edisp.to_edisp_kernel(offset) else: kwargs = dict() kwargs["position"] = self._get_position(observation, geom) if not isinstance(edisp, EDispKernelMap): kwargs["energy_axis"] = dataset._geom.axes["energy"] edisp = edisp.get_edisp_kernel(**kwargs) energy_min = edisp.get_bias_energy(self.bias_percent / 100)[0] return geom.energy_mask(energy_min=energy_min) def make_mask_energy_bkg_peak(self, dataset, observation=None): """Make safe energy mask based on the binned background. The energy threshold is defined as the lower edge of the energy bin with the highest predicted background rate. This is to ensure analysis in a region where a Powerlaw approximation to the background spectrum is valid. The is motivated by its use in the H.E.S.S. DL3 validation paper: https://arxiv.org/pdf/1910.08088.pdf Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset` Dataset to compute mask for. observation: `~gammapy.data.Observation` Observation to compute mask for. It is a mandatory argument when DL3 irfs are used. Returns ------- mask_safe : `~numpy.ndarray` Safe data range mask. 
""" geom = dataset._geom if self.irfs == "DL3": bkg = observation.bkg.to_2d() background_spectrum = np.ravel( bkg.integral(axis_name="offset", offset=bkg.axes["offset"].bounds[1]) ) energy_axis = bkg.axes["energy"] else: background_spectrum = dataset.npred_background().get_spectrum() energy_axis = geom.axes["energy"] idx = np.argmax(background_spectrum.data, axis=0) return geom.energy_mask(energy_min=energy_axis.edges[idx]) @staticmethod def make_mask_bkg_invalid(dataset): """Mask non-finite values and zeros values in background maps. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset` Dataset to compute mask for. Returns ------- mask_safe : `~numpy.ndarray` Safe data range mask. """ bkg = dataset.background.data mask = np.isfinite(bkg) if not dataset.stat_type == "wstat": mask &= bkg > 0.0 return mask def run(self, dataset, observation=None): """Make safe data range mask. Parameters ---------- dataset : `~gammapy.datasets.MapDataset` or `~gammapy.datasets.SpectrumDataset` Dataset to compute mask for. observation : `~gammapy.data.Observation` Observation to compute mask for. Returns ------- dataset : `Dataset` Dataset with defined safe range mask. """ if self.irfs == "DL3": if observation is None: raise ValueError("observation argument is mandatory with DL3 irfs") if dataset.mask_safe: mask_safe = dataset.mask_safe.data else: mask_safe = np.ones(dataset._geom.data_shape, dtype=bool) if dataset.background is not None: # apply it first so only clipped values are removed for "bkg-peak" mask_safe &= self.make_mask_bkg_invalid(dataset) if "offset-max" in self.methods: mask_safe &= self.make_mask_offset_max(dataset, observation) if "aeff-default" in self.methods: mask_safe &= self.make_mask_energy_aeff_default(dataset, observation) if "aeff-max" in self.methods: mask_safe &= self.make_mask_energy_aeff_max(dataset, observation) if "edisp-bias" in self.methods: mask_safe &= self.make_mask_energy_edisp_bias(dataset, observation) if "bkg-peak" in self.methods: mask_safe &= self.make_mask_energy_bkg_peak(dataset, observation) dataset.mask_safe = Map.from_geom(dataset._geom, data=mask_safe, dtype=bool) return dataset ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/spectrum.py0000644000175100001770000001065114721316200017545 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from regions import CircleSkyRegion from .map import MapDatasetMaker __all__ = ["SpectrumDatasetMaker"] log = logging.getLogger(__name__) class SpectrumDatasetMaker(MapDatasetMaker): """Make spectrum for a single IACT observation. The IRFs and background are computed at a single fixed offset, which is recommended only for point-sources. Parameters ---------- selection : list of str, optional Select which maps to make, the available options are: 'counts', 'exposure', 'background', 'edisp'. By default, all maps are made. containment_correction : bool Apply containment correction for point sources and circular on regions. background_oversampling : int Background evaluation oversampling factor in energy. use_region_center : bool If True, approximate the IRFs by the value at the center of the region. If False, the IRFs are averaged over the entire. 
""" tag = "SpectrumDatasetMaker" available_selection = ["counts", "background", "exposure", "edisp"] def __init__( self, selection=None, containment_correction=False, background_oversampling=None, use_region_center=True, ): self.containment_correction = containment_correction self.use_region_center = use_region_center super().__init__( selection=selection, background_oversampling=background_oversampling ) def make_exposure(self, geom, observation): """Make exposure. Parameters ---------- geom : `~gammapy.maps.RegionGeom` Reference map geometry. observation : `~gammapy.data.Observation` Observation to compute effective area for. Returns ------- exposure : `~gammapy.maps.RegionNDMap` Exposure map. """ exposure = super().make_exposure( geom, observation, use_region_center=self.use_region_center ) is_pointlike = exposure.meta.get("is_pointlike", False) if is_pointlike and self.use_region_center is False: log.warning( "MapMaker: use_region_center=False should not be used with point-like IRF. " "Results are likely inaccurate." ) if self.containment_correction: if is_pointlike: raise ValueError( "Cannot apply containment correction for point-like IRF." ) if not isinstance(geom.region, CircleSkyRegion): raise TypeError( "Containment correction only supported for circular regions." ) offset = geom.separation(observation.get_pointing_icrs(observation.tmid)) containment = observation.psf.containment( rad=geom.region.radius, offset=offset, energy_true=geom.axes["energy_true"].center, ) exposure.quantity *= containment.reshape(geom.data_shape) return exposure @staticmethod def make_counts(geom, observation): """Make counts map. If the `~gammapy.maps.RegionGeom` is built from a `~regions.CircleSkyRegion`, the latter will be directly used to extract the counts. If instead the `~gammapy.maps.RegionGeom` is built from a `~regions.PointSkyRegion`, the size of the ON region is taken from the `RAD_MAX_2D` table containing energy-dependent theta2 cuts. Parameters ---------- geom : `~gammapy.maps.Geom` Reference map geometry. observation : `~gammapy.data.Observation` Observation container. Returns ------- counts : `~gammapy.maps.RegionNDMap` Counts map. """ return super(SpectrumDatasetMaker, SpectrumDatasetMaker).make_counts( geom, observation ) def run(self, dataset, observation): """Make spectrum dataset. Parameters ---------- dataset : `~gammapy.spectrum.SpectrumDataset` Reference dataset. observation : `~gammapy.data.Observation` Observation. Returns ------- dataset : `~gammapy.spectrum.SpectrumDataset` Spectrum dataset. 
""" return super(SpectrumDatasetMaker, self).run(dataset, observation) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.216642 gammapy-1.3/gammapy/makers/tests/0000755000175100001770000000000014721316215016476 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/tests/__init__.py0000644000175100001770000000010014721316200020570 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/tests/test_asymm_irf.py0000644000175100001770000001071414721316200022072 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np import scipy.special from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord from gammapy.irf import IRF from gammapy.makers.utils import make_edisp_kernel_map, make_map_exposure_true_energy from gammapy.maps import MapAxes, MapAxis, WcsGeom class EffectiveArea3D(IRF): tag = "aeff_3d" required_axes = ["energy_true", "fov_lon", "fov_lat"] default_unit = u.m**2 @pytest.fixture def aeff_3d(): energy_axis = MapAxis.from_energy_edges( [0.1, 0.3, 1.0, 3.0, 10.0] * u.TeV, name="energy_true" ) fov_lon_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lon") fov_lat_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lat") data = np.ones((4, 3, 3)) for i in range(1, 4): data[i] = data[i - 1] * 1.5 return EffectiveArea3D( [energy_axis, fov_lon_axis, fov_lat_axis], data=data, unit=u.m**2 ) class EnergyDispersion3D(IRF): tag = "edisp_3d" required_axes = ["energy_true", "migra", "fov_lon", "fov_lat"] default_unit = u.one @classmethod def from_gauss( cls, energy_axis_true, migra_axis, fov_lon_axis, fov_lat_axis, bias, sigma, pdf_threshold=1e-6, ): axes = MapAxes([energy_axis_true, migra_axis, fov_lon_axis, fov_lat_axis]) coords = axes.get_coord(mode="edges", axis_name="migra") migra_min = coords["migra"][:, :-1, :] migra_max = coords["migra"][:, 1:, :] # Analytical formula for integral of Gaussian s = np.sqrt(2) * sigma t1 = (migra_max - 1 - bias) / s t2 = (migra_min - 1 - bias) / s pdf = (scipy.special.erf(t1) - scipy.special.erf(t2)) / 2 pdf = pdf / (migra_max - migra_min) r1 = np.rollaxis(pdf, -1, 1) r2 = np.rollaxis(r1, 0, -1) data = r2 * np.ones(axes.shape) data[data < pdf_threshold] = 0 return cls( axes=axes, data=data.value, ) @pytest.fixture def edisp_3d(): energy_axis_true = MapAxis.from_energy_bounds( "0.1 TeV", "100 TeV", nbin=5, name="energy_true" ) migra_axis = MapAxis.from_bounds(0, 4, nbin=10, node_type="edges", name="migra") fov_lon_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lon") fov_lat_axis = MapAxis.from_edges([-1.5, -0.5, 0.5, 1.5] * u.deg, name="fov_lat") energy_true = energy_axis_true.edges[:-1] sigma = 0.15 / (energy_true / (1 * u.TeV)).value ** 0.3 bias = 1e-3 * (energy_true - 1 * u.TeV).value return EnergyDispersion3D.from_gauss( energy_axis_true=energy_axis_true, migra_axis=migra_axis, fov_lon_axis=fov_lon_axis, fov_lat_axis=fov_lat_axis, bias=bias, sigma=sigma, ) def test_aeff_3d(aeff_3d): res = aeff_3d.evaluate( fov_lon=[-0.5, 0.8] * u.deg, fov_lat=[-0.5, 1.0] * u.deg, energy_true=[0.2, 8.0] * u.TeV, ) assert_allclose(res.data, [1.06246937, 3.74519106], rtol=1e-5) axis = MapAxis.from_energy_bounds(0.1 * u.TeV, 10 * u.TeV, 6, 
name="energy_true") pointing = SkyCoord(2, 1, unit="deg") geom = WcsGeom.create(npix=(4, 3), binsz=2, axes=[axis], skydir=pointing) exposure_map = make_map_exposure_true_energy( pointing=pointing, livetime="42 h", aeff=aeff_3d, geom=geom ) assert_allclose( exposure_map.data[3][1][1:3], [323894.44971479, 323894.44971479], rtol=1e-5 ) def test_edisp_3d(edisp_3d): energy = [1, 2] * u.TeV migra = np.array([0.98, 0.97, 0.7]) fov_lon = [0.1, 1.5] * u.deg fov_lat = [0.0, 0.3] * u.deg eval = edisp_3d.evaluate( energy_true=energy.reshape(-1, 1, 1, 1), migra=migra.reshape(1, -1, 1, 1), fov_lon=fov_lon.reshape(1, 1, -1, 1), fov_lat=fov_lat.reshape(1, 1, 1, -1), ) assert_allclose( eval[0][2], [[0.71181427, 0.71181427], [0.71181427, 0.71181427]], rtol=1e-3 ) etrue = MapAxis.from_energy_bounds(0.5, 2, 6, unit="TeV", name="energy_true") ereco = MapAxis.from_energy_bounds(0.5, 2, 3, unit="TeV", name="energy") pointing = SkyCoord(2, 1, unit="deg") geom = WcsGeom.create(10, binsz=0.5, axes=[ereco, etrue], skydir=pointing) edispmap = make_edisp_kernel_map(edisp_3d, pointing, geom) assert_allclose(edispmap.edisp_map.data[3][1][5][5], 0.623001, rtol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/tests/test_core.py0000644000175100001770000000026114721316200021030 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from gammapy.makers import MAKER_REGISTRY def test_maker_registry(): assert "Maker" in str(MAKER_REGISTRY) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/tests/test_map.py0000644000175100001770000004605314721316200020666 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord from astropy.table import Table from astropy.time import Time from regions import CircleSkyRegion from gammapy.data import ( GTI, DataStore, EventList, FixedPointingInfo, HDUIndexTable, Observation, ObservationTable, ) from gammapy.datasets import MapDataset, MapDatasetMetaData from gammapy.datasets.map import RAD_AXIS_DEFAULT from gammapy.irf import Background2D, EDispKernelMap, EDispMap, PSFMap from gammapy.makers import FoVBackgroundMaker, MapDatasetMaker, SafeMaskMaker from gammapy.maps import HpxGeom, Map, MapAxis, WcsGeom from gammapy.utils.testing import requires_data, requires_dependency @pytest.fixture(scope="session") def observations(): data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps/") obs_id = [110380, 111140] return data_store.get_observations(obs_id) def geom(ebounds, binsz=0.5): skydir = SkyCoord(0, -1, unit="deg", frame="galactic") energy_axis = MapAxis.from_edges(ebounds, name="energy", unit="TeV", interp="log") return WcsGeom.create( skydir=skydir, binsz=binsz, width=(10, 5), frame="galactic", axes=[energy_axis] ) @pytest.fixture(scope="session") def geom_config_hpx(): energy_axis = MapAxis.from_energy_bounds("0.5 TeV", "30 TeV", nbin=3) energy_axis_true = MapAxis.from_energy_bounds( "0.3 TeV", "30 TeV", nbin=3, per_decade=True, name="energy_true" ) geom_hpx = HpxGeom.create( binsz=0.1, frame="galactic", axes=[energy_axis], region="DISK(0, 0, 5.)" ) return {"geom": geom_hpx, "energy_axis_true": energy_axis_true} @requires_data() @pytest.mark.parametrize( "pars", [ { # Default, same e_true and reco "geom": geom(ebounds=[0.1, 1, 10]), "e_true": 
None, "counts": 34366, "exposure": 9.995376e08, "exposure_image": 3.99815e11, "background": 27989.05, "binsz_irf": 0.5, "migra": None, }, { # Test single energy bin "geom": geom(ebounds=[0.1, 10]), "e_true": None, "counts": 34366, "exposure": 5.843302e08, "exposure_image": 1.16866e11, "background": 30424.451, "binsz_irf": 0.5, "migra": None, }, { # Test single energy bin with exclusion mask "geom": geom(ebounds=[0.1, 10]), "e_true": None, "exclusion_mask": Map.from_geom(geom(ebounds=[0.1, 10])), "counts": 34366, "exposure": 5.843302e08, "exposure_image": 1.16866e11, "background": 30424.451, "binsz_irf": 0.5, "migra": None, }, { # Test for different e_true and e_reco bins "geom": geom(ebounds=[0.1, 1, 10]), "e_true": MapAxis.from_edges( [0.1, 0.5, 2.5, 10.0], name="energy_true", unit="TeV", interp="log" ), "counts": 34366, "exposure": 9.951827e08, "exposure_image": 5.971096e11, "background": 28760.283, "background_oversampling": 2, "binsz_irf": 0.5, "migra": None, }, { # Test for different e_true and e_reco and spatial bins "geom": geom(ebounds=[0.1, 1, 10]), "e_true": MapAxis.from_edges( [0.1, 0.5, 2.5, 10.0], name="energy_true", unit="TeV", interp="log" ), "counts": 34366, "exposure": 9.951827e08, "exposure_image": 5.971096e11, "background": 28760.283, "background_oversampling": 2, "binsz_irf": 1.0, "migra": None, }, { # Test for different e_true and e_reco and use edispmap "geom": geom(ebounds=[0.1, 1, 10]), "e_true": MapAxis.from_edges( [0.1, 0.5, 2.5, 10.0], name="energy_true", unit="TeV", interp="log" ), "counts": 34366, "exposure": 9.951827e08, "exposure_image": 5.971096e11, "background": 28760.283, "background_oversampling": 2, "binsz_irf": 0.5, "migra": MapAxis.from_edges( np.linspace(0.0, 3.0, 100), name="migra", unit="" ), }, ], ) def test_map_maker(pars, observations): stacked = MapDataset.create( geom=pars["geom"], energy_axis_true=pars["e_true"], binsz_irf=pars["binsz_irf"], migra_axis=pars["migra"], ) maker = MapDatasetMaker(background_oversampling=pars.get("background_oversampling")) safe_mask_maker = SafeMaskMaker(methods=["offset-max"], offset_max="2 deg") for obs in observations: cutout = stacked.cutout(position=obs.get_pointing_icrs(obs.tmid), width="4 deg") dataset = maker.run(cutout, obs) dataset = safe_mask_maker.run(dataset, obs) stacked.stack(dataset) counts = stacked.counts assert counts.unit == "" assert_allclose(counts.data.sum(), pars["counts"], rtol=1e-5) exposure = stacked.exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.mean(), pars["exposure"], rtol=3e-3) background = stacked.npred_background() assert background.unit == "" assert_allclose(background.data.sum(), pars["background"], rtol=1e-4) image_dataset = stacked.to_image() counts = image_dataset.counts assert counts.unit == "" assert_allclose(counts.data.sum(), pars["counts"], rtol=1e-4) exposure = image_dataset.exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.sum(), pars["exposure_image"], rtol=1e-3) background = image_dataset.npred_background() assert background.unit == "" assert_allclose(background.data.sum(), pars["background"], rtol=1e-4) @requires_data() def test_map_maker_obs(observations): # Test for different spatial geoms and etrue, ereco bins geom_reco = geom(ebounds=[0.1, 1, 10]) e_true = MapAxis.from_edges( [0.1, 0.5, 2.5, 10.0], name="energy_true", unit="TeV", interp="log" ) reference = MapDataset.create( geom=geom_reco, energy_axis_true=e_true, binsz_irf=1.0 ) maker_obs = MapDatasetMaker() map_dataset = maker_obs.run(reference, observations[0]) 
assert map_dataset.counts.geom == geom_reco assert map_dataset.npred_background().geom == geom_reco assert isinstance(map_dataset.edisp, EDispKernelMap) assert map_dataset.edisp.edisp_map.data.shape == (3, 2, 5, 10) assert map_dataset.edisp.exposure_map.data.shape == (3, 1, 5, 10) assert map_dataset.psf.psf_map.data.shape == (3, 66, 5, 10) assert map_dataset.psf.exposure_map.data.shape == (3, 1, 5, 10) assert_allclose(map_dataset.gti.time_delta, 1800.0 * u.s) @requires_data() def test_map_maker_obs_with_migra(observations): # Test for different spatial geoms and etrue, ereco bins migra = MapAxis.from_edges(np.linspace(0, 2.0, 50), unit="", name="migra") geom_reco = geom(ebounds=[0.1, 1, 10]) e_true = MapAxis.from_edges( [0.1, 0.5, 2.5, 10.0], name="energy_true", unit="TeV", interp="log" ) reference = MapDataset.create( geom=geom_reco, energy_axis_true=e_true, migra_axis=migra, binsz_irf=1.0 ) maker_obs = MapDatasetMaker() map_dataset = maker_obs.run(reference, observations[0]) assert map_dataset.counts.geom == geom_reco assert isinstance(map_dataset.edisp, EDispMap) assert map_dataset.edisp.edisp_map.data.shape == (3, 49, 5, 10) assert map_dataset.edisp.exposure_map.data.shape == (3, 1, 5, 10) @requires_data() def test_make_meta_table(observations): maker_obs = MapDatasetMaker() map_dataset_meta_table = maker_obs.make_meta_table(observation=observations[0]) assert_allclose(map_dataset_meta_table["RA_PNT"], 267.68121338) assert_allclose(map_dataset_meta_table["DEC_PNT"], -29.6075) assert_allclose(map_dataset_meta_table["OBS_ID"], 110380) assert map_dataset_meta_table["OBS_MODE"] == "POINTING" meta = maker_obs._make_metadata(map_dataset_meta_table) assert meta.obs_info[0].observation_mode == "POINTING" assert meta.obs_info[0].telescope == "CTA" assert meta.obs_info[0].obs_id == 110380 assert_allclose(meta.pointing[0].radec_mean.dec.value, -29.6075) @requires_data() def test_make_map_no_count(observations): dataset = MapDataset.create(geom((0.1, 1, 10))) maker_obs = MapDatasetMaker(selection=["exposure"]) map_dataset = maker_obs.run(dataset, observation=observations[0]) assert map_dataset.counts is not None assert_allclose(map_dataset.counts.data, 0) assert map_dataset.counts.geom == dataset.counts.geom @requires_data() @requires_dependency("healpy") def test_map_dataset_maker_hpx(geom_config_hpx, observations): reference = MapDataset.create(**geom_config_hpx, binsz_irf=5 * u.deg) maker = MapDatasetMaker() safe_mask_maker = SafeMaskMaker( offset_max="2.5 deg", methods=["aeff-default", "offset-max"] ) dataset = maker.run(reference, observation=observations[0]) dataset = safe_mask_maker.run(dataset, observation=observations[0]).to_masked() assert_allclose(dataset.counts.data.sum(), 4264) assert_allclose(dataset.background.data.sum(), 2964.5369, rtol=1e-5) assert_allclose(dataset.exposure.data[4, 1000], 5.987e09, rtol=1e-4) coords = SkyCoord([0, 3], [0, 0], frame="galactic", unit="deg") coords = {"skycoord": coords, "energy": 1 * u.TeV} assert_allclose(dataset.mask_safe.get_by_coord(coords), [True, False]) kernel = dataset.edisp.get_edisp_kernel() assert_allclose(kernel.data.sum(axis=1)[3], 1, rtol=0.01) def test_interpolate_map_dataset(): energy = MapAxis.from_energy_bounds("1 TeV", "300 TeV", nbin=5, name="energy") energy_true = MapAxis.from_nodes( np.logspace(-1, 3, 20), name="energy_true", interp="log", unit="TeV" ) # make dummy map IRFs geom_allsky = WcsGeom.create( npix=(5, 3), proj="CAR", binsz=60, axes=[energy], skydir=(0, 0) ) geom_allsky_true = 
geom_allsky.drop("energy").to_cube([energy_true]) # background geom_background = WcsGeom.create( skydir=(0, 0), width=(5, 5), binsz=0.2 * u.deg, axes=[energy] ) value = 30 bkg_map = Map.from_geom(geom_background, unit="") bkg_map.data = value * np.ones(bkg_map.data.shape) # effective area - with a gradient that also depends on energy aeff_map = Map.from_geom(geom_allsky_true, unit="cm2 s") ra_arr = np.arange(aeff_map.data.shape[1]) dec_arr = np.arange(aeff_map.data.shape[2]) for i in np.arange(aeff_map.data.shape[0]): aeff_map.data[i, :, :] = ( (i + 1) * 10 * np.meshgrid(dec_arr, ra_arr)[0] + 10 * np.meshgrid(dec_arr, ra_arr)[1] + 10 ) aeff_map.meta["TELESCOP"] = "HAWC" # psf map width = 0.2 * u.deg rad_axis = MapAxis.from_nodes(np.linspace(0, 2, 50), name="rad", unit="deg") psfMap = PSFMap.from_gauss(energy_true, rad_axis, width) # edispmap edispmap = EDispKernelMap.from_gauss( energy, energy_true, sigma=0.1, bias=0.0, geom=geom_allsky ) # events and gti nr_ev = 10 ev_t = Table() gti_t = Table() ev_t["EVENT_ID"] = np.arange(nr_ev) ev_t["TIME"] = nr_ev * [Time("2011-01-01 00:00:00", scale="utc", format="iso")] ev_t["RA"] = np.linspace(-1, 1, nr_ev) * u.deg ev_t["DEC"] = np.linspace(-1, 1, nr_ev) * u.deg ev_t["ENERGY"] = np.logspace(0, 2, nr_ev) * u.TeV gti_t["START"] = [Time("2010-12-31 00:00:00", scale="utc", format="iso")] gti_t["STOP"] = [Time("2011-01-02 00:00:00", scale="utc", format="iso")] events = EventList(ev_t) gti = GTI(gti_t) # define observation obs = Observation( obs_id=0, gti=gti, aeff=aeff_map, edisp=edispmap, psf=psfMap, bkg=bkg_map, events=events, obs_filter=None, pointing=FixedPointingInfo(fixed_icrs=SkyCoord(0 * u.deg, 0 * u.deg)), ) # define analysis geometry geom_target = WcsGeom.create( skydir=(0, 0), width=(5, 5), binsz=0.1 * u.deg, axes=[energy] ) maker = MapDatasetMaker( selection=["exposure", "counts", "background", "edisp", "psf"] ) dataset = MapDataset.create( geom=geom_target, energy_axis_true=energy_true, rad_axis=rad_axis, name="test" ) dataset = maker.run(dataset, obs) # test counts assert dataset.counts.data.sum() == nr_ev # test background assert np.floor(np.sum(dataset.npred_background().data)) == np.sum(bkg_map.data) coords_bg = {"skycoord": SkyCoord("0 deg", "0 deg"), "energy": energy.center[0]} assert_allclose( dataset.npred_background().get_by_coord(coords_bg)[0], 7.5, atol=1e-4 ) # test effective area coords_aeff = { "skycoord": SkyCoord("0 deg", "0 deg"), "energy_true": energy_true.center[0], } assert_allclose( aeff_map.get_by_coord(coords_aeff)[0], dataset.exposure.interp_by_coord(coords_aeff)[0], atol=1e-3, ) # test edispmap pdfmatrix_preinterp = edispmap.get_edisp_kernel( position=SkyCoord("0 deg", "0 deg") ).pdf_matrix pdfmatrix_postinterp = dataset.edisp.get_edisp_kernel( position=SkyCoord("0 deg", "0 deg") ).pdf_matrix assert_allclose(pdfmatrix_preinterp, pdfmatrix_postinterp, atol=1e-7) # test psfmap geom_psf = geom_target.drop("energy").to_cube([energy_true]) psfkernel_preinterp = psfMap.get_psf_kernel( position=SkyCoord("0 deg", "0 deg"), geom=geom_psf, max_radius=2 * u.deg ).data psfkernel_postinterp = dataset.psf.get_psf_kernel( position=SkyCoord("0 deg", "0 deg"), geom=geom_psf, max_radius=2 * u.deg ).data assert_allclose(psfkernel_preinterp, psfkernel_postinterp, atol=1e-4) @requires_data() @pytest.mark.xfail def test_minimal_datastore(): """ "Check that a standard analysis runs on a minimal datastore""" energy_axis = MapAxis.from_energy_bounds( 1, 10, nbin=3, per_decade=False, unit="TeV", name="energy" ) geom = WcsGeom.create( 
skydir=(83.633, 22.014), binsz=0.5, width=(2, 2), frame="icrs", proj="CAR", axes=[energy_axis], ) data_store = DataStore.from_dir("$GAMMAPY_DATA/tests/minimal_datastore") observations = data_store.get_observations() maker = MapDatasetMaker() offset_max = 2.3 * u.deg maker_safe_mask = SafeMaskMaker(methods=["offset-max"], offset_max=offset_max) circle = CircleSkyRegion( center=SkyCoord("83.63 deg", "22.14 deg"), radius=0.2 * u.deg ) exclusion_mask = ~geom.region_mask(regions=[circle]) maker_fov = FoVBackgroundMaker(method="fit", exclusion_mask=exclusion_mask) stacked = MapDataset.create(geom=geom, name="crab-stacked") for obs in observations: dataset = maker.run(stacked, obs) dataset = maker_safe_mask.run(dataset, obs) dataset = maker_fov.run(dataset) stacked.stack(dataset) assert_allclose(stacked.exposure.data.sum(), 6.01909e10) assert_allclose(stacked.counts.data.sum(), 1446) assert_allclose(stacked.background.data.sum(), 1445.9841) assert stacked.meta.creation.creator.split()[0] == "Gammapy" @requires_data() def test_dataset_hawc(): # create the energy reco axis energy_axis = MapAxis.from_edges( [1.00, 1.78, 3.16, 5.62, 10.0, 17.8, 31.6, 56.2, 100, 177, 316] * u.TeV, name="energy", interp="log", ) # and energy true axis energy_axis_true = MapAxis.from_energy_bounds( 1e-3, 1e4, nbin=140, unit="TeV", name="energy_true" ) # create a geometry around the Crab location geom = WcsGeom.create( skydir=SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs"), width=3 * u.deg, axes=[energy_axis], binsz=0.1, ) maker = MapDatasetMaker( selection=["counts", "background", "exposure", "edisp", "psf"] ) safemask_maker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) results = {} results["GP"] = [6.53623241669e16, 58, 0.72202391] results["NN"] = [6.57154247837e16, 62, 0.76743538] for which in ["GP", "NN"]: # paths and file names data_path = "$GAMMAPY_DATA/hawc/crab_events_pass4/" hdu_filename = "hdu-index-table-" + which + "-Crab.fits.gz" obs_filename = "obs-index-table-" + which + "-Crab.fits.gz" # We want the last event lass for speed obs_table = ObservationTable.read(data_path + obs_filename) hdu_table = HDUIndexTable.read(data_path + hdu_filename, hdu=9) data_store = DataStore(hdu_table=hdu_table, obs_table=obs_table) observations = data_store.get_observations() # create empty dataset that will contain the data geom_exposure = geom.to_image().to_cube([energy_axis_true]) geom_psf = geom.to_image().to_cube([RAD_AXIS_DEFAULT, energy_axis]) geom_edisp = geom.to_cube([energy_axis_true]) dataset_empty = MapDataset.from_geoms( geom=geom, name="nHit-9", geom_exposure=geom_exposure, geom_psf=geom_psf, geom_edisp=geom_edisp, ) # run the maker dataset = maker.run(dataset_empty, observations[0]) dataset.exposure.meta["livetime"] = "1 s" dataset = safemask_maker.run(dataset) assert_allclose(dataset.exposure.data.sum(), results[which][0]) assert_allclose(dataset.counts.data.sum(), results[which][1]) assert_allclose(dataset.background.data.sum(), results[which][2]) @requires_data() def test_make_background_2d(observations): filename = "$GAMMAPY_DATA/tests/irf/bkg_2d_full_example.fits" bkg = Background2D.read(filename) # TODO: better example file for 2d bkg bkg.axes[0]._unit = "TeV" bkg.axes[1]._unit = "deg" bkg._unit = "s-1 TeV-1 sr-1" obs = observations[0] obs.bkg = bkg geom_reco = geom(ebounds=[0.1, 1, 10]) e_true = MapAxis.from_edges( [0.1, 0.5, 2.5, 10.0], name="energy_true", unit="TeV", interp="log" ) reference = MapDataset.create( geom=geom_reco, energy_axis_true=e_true, binsz_irf=1.0 ) maker_obs = 
MapDatasetMaker(selection=["background"], background_pad_offset=True) map_dataset = maker_obs.run(reference, obs) assert_allclose(map_dataset.background.data.sum(), 17636.60091226549) @requires_data() def test_meta_data_creation(observations): reference = MapDataset.create(geom=geom(ebounds=[0.1, 1, 10])) maker = MapDatasetMaker() dataset0 = maker.run(reference, observations[0]) dataset1 = maker.run(reference, observations[1]) assert dataset0.meta.obs_info[0].obs_id == 110380 assert dataset1.meta.obs_info[0].obs_id == 111140 # test stacking dataset0.stack(dataset1) stacked_meta = MapDatasetMetaData._from_meta_table(dataset0.meta_table) assert stacked_meta.obs_info[0].obs_id == 110380 assert stacked_meta.obs_info[1].obs_id == 111140 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/tests/test_reduce.py0000644000175100001770000003602614721316200021357 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import pytest from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion, PointSkyRegion from gammapy.data import DataStore, Observation from gammapy.datasets import MapDataset, SpectrumDataset from gammapy.makers import ( DatasetsMaker, FoVBackgroundMaker, MapDatasetMaker, ReflectedRegionsBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, WobbleRegionsFinder, ) from gammapy.maps import MapAxis, RegionGeom, WcsGeom from gammapy.utils.testing import requires_data, requires_dependency @pytest.fixture() def observations_cta(): data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps/") return data_store.get_observations()[:3] @pytest.fixture() def observations_cta_with_issue(): data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps/") list = data_store.get_observations()[:2] empty_obs = Observation() list.append(empty_obs) return list @pytest.fixture() def observations_hess(): datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs_ids = [23523, 23526, 23559, 23592] return datastore.get_observations(obs_ids) @pytest.fixture(scope="session") def observations_magic_rad_max(): observations = [ Observation.read( "$GAMMAPY_DATA/magic/rad_max/data/20131004_05029747_DL3_CrabNebula-W0.40+035.fits" ), Observation.read( "$GAMMAPY_DATA/magic/rad_max/data/20131004_05029748_DL3_CrabNebula-W0.40+215.fits" ), ] return observations @pytest.fixture() def map_dataset(): skydir = SkyCoord(0, -1, unit="deg", frame="galactic") energy_axis = MapAxis.from_edges( [0.1, 1, 10], name="energy", unit="TeV", interp="log" ) geom = WcsGeom.create( skydir=skydir, binsz=0.5, width=(10, 5), frame="galactic", axes=[energy_axis] ) return MapDataset.create(geom=geom) @pytest.fixture() def spectrum_dataset(): target_position = SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs") on_region_radius = Angle("0.11 deg") on_region = CircleSkyRegion(center=target_position, radius=on_region_radius) energy_axis = MapAxis.from_energy_bounds( 0.1, 40, nbin=15, per_decade=True, unit="TeV", name="energy" ) energy_axis_true = MapAxis.from_energy_bounds( 0.05, 100, nbin=20, per_decade=True, unit="TeV", name="energy_true" ) geom = RegionGeom.create(region=on_region, axes=[energy_axis]) return SpectrumDataset.create( geom=geom, energy_axis_true=energy_axis_true, ) def get_spectrum_dataset_rad_max(name, e_min=0.005 * u.TeV): """get the spectrum dataset maker for the energy-dependent spectrum extraction""" 
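    # Using a PointSkyRegion (rather than a CircleSkyRegion) as the on region
    # makes SpectrumDatasetMaker.make_counts take the ON-region size from the
    # energy-dependent RAD_MAX_2D (theta2 cut) table, as described in the
    # maker's make_counts docstring.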
target_position = SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs") on_center = PointSkyRegion(target_position) energy_axis = MapAxis.from_energy_bounds( e_min, 50, nbin=28, per_decade=False, unit="TeV", name="energy" ) energy_axis_true = MapAxis.from_energy_bounds( e_min, 50, nbin=20, per_decade=False, unit="TeV", name="energy_true" ) geom = RegionGeom.create(region=on_center, axes=[energy_axis]) return SpectrumDataset.create( geom=geom, energy_axis_true=energy_axis_true, name=name ) @pytest.fixture(scope="session") def exclusion_mask(): exclusion_region = CircleSkyRegion( center=SkyCoord(183.604, -8.708, unit="deg", frame="galactic"), radius=0.5 * u.deg, ) skydir = SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs") geom = WcsGeom.create( npix=(150, 150), binsz=0.05, skydir=skydir, proj="TAN", frame="icrs" ) return ~geom.region_mask([exclusion_region]) @pytest.fixture() def full_exclusion_mask(): exclusion_region = CircleSkyRegion( center=SkyCoord(183.604, -8.708, unit="deg", frame="galactic"), radius=15 * u.deg, ) skydir = SkyCoord(ra=83.63, dec=22.01, unit="deg", frame="icrs") geom = WcsGeom.create( npix=(150, 150), binsz=0.05, skydir=skydir, proj="TAN", frame="icrs" ) return ~geom.region_mask([exclusion_region]) @pytest.fixture(scope="session") def makers_map(): return [ MapDatasetMaker(), SafeMaskMaker(methods=["offset-max"], offset_max="2 deg"), FoVBackgroundMaker(method="scale"), ] @pytest.fixture(scope="session") def makers_spectrum(exclusion_mask): return [ SpectrumDatasetMaker( containment_correction=True, selection=["counts", "exposure", "edisp"] ), ReflectedRegionsBackgroundMaker(exclusion_mask=exclusion_mask), SafeMaskMaker(methods=["aeff-max"], aeff_percent=10), ] @pytest.fixture() def makers_spectrum_large_region(full_exclusion_mask): return [ SpectrumDatasetMaker( containment_correction=True, selection=["counts", "exposure", "edisp"] ), ReflectedRegionsBackgroundMaker(exclusion_mask=full_exclusion_mask), SafeMaskMaker(methods=["aeff-max"], aeff_percent=10), ] @requires_data() @pytest.mark.parametrize( "pars", [ { "stack_datasets": True, "cutout_width": None, "n_jobs": 1, "backend": None, }, { "stack_datasets": False, "cutout_width": None, "n_jobs": 2, "backend": "multiprocessing", }, { "stack_datasets": True, "cutout_width": None, "n_jobs": 2, "backend": "multiprocessing", }, ], ) def test_datasets_maker_map(pars, observations_cta, makers_map, map_dataset): makers = DatasetsMaker( makers_map, stack_datasets=pars["stack_datasets"], cutout_mode="partial", cutout_width=pars["cutout_width"], n_jobs=pars["n_jobs"], parallel_backend=pars["backend"], ) datasets = makers.run(map_dataset, observations_cta) if len(datasets) == 1: counts = datasets[0].counts assert counts.unit == "" assert_allclose(counts.data.sum(), 46716, rtol=1e-5) exposure = datasets[0].exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.mean(), 1.350841e09, rtol=3e-3) else: assert len(datasets) == 3 # get by name because of async counts = datasets[0].counts assert counts.unit == "" assert_allclose(counts.data.sum(), 26318, rtol=1e-5) exposure = datasets[0].exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.mean(), 2.436063e09, rtol=3e-3) @requires_data() def test_failure_datasets_maker_map( observations_cta_with_issue, makers_map, map_dataset ): makers = DatasetsMaker( makers_map, stack_datasets=True, cutout_mode="partial", cutout_width="15 deg", n_jobs=4, parallel_backend="multiprocessing", ) with pytest.raises(RuntimeError): makers.run(map_dataset, 
observations_cta_with_issue) @requires_data() @requires_dependency("ray") def test_datasets_maker_map_ray(observations_cta, makers_map, map_dataset): makers = DatasetsMaker( makers_map, stack_datasets=True, cutout_mode="partial", cutout_width=None, n_jobs=2, parallel_backend="ray", ) datasets = makers.run(dataset=map_dataset, observations=observations_cta) counts = datasets[0].counts assert counts.unit == "" assert_allclose(counts.data.sum(), 46716, rtol=1e-5) exposure = datasets[0].exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.mean(), 1.350841e09, rtol=3e-3) @requires_data() def test_datasets_maker_map_cutout_width(observations_cta, makers_map, map_dataset): makers = DatasetsMaker( makers_map, stack_datasets=True, cutout_mode="partial", cutout_width="5 deg", n_jobs=1, ) datasets = makers.run(map_dataset, observations_cta) counts = datasets[0].counts assert counts.unit == "" assert_allclose(counts.data.sum(), 46716, rtol=1e-5) exposure = datasets[0].exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.mean(), 1.350841e09, rtol=3e-3) @requires_data() def test_datasets_maker_map_2_steps(observations_cta, map_dataset): makers = DatasetsMaker( [MapDatasetMaker()], stack_datasets=False, cutout_mode="partial", cutout_width="5 deg", n_jobs=1, ) datasets = makers.run(map_dataset, observations_cta) makers_list = [ SafeMaskMaker(methods=["offset-max"], offset_max="2 deg"), FoVBackgroundMaker(method="scale"), ] makers = DatasetsMaker( makers_list, stack_datasets=True, cutout_mode="partial", cutout_width="5 deg", n_jobs=1, ) datasets = makers.run(map_dataset, observations_cta, datasets) counts = datasets[0].counts assert counts.unit == "" assert_allclose(counts.data.sum(), 46716, rtol=1e-5) exposure = datasets[0].exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.mean(), 1.350841e09, rtol=3e-3) @requires_data() def test_datasets_maker_spectrum(observations_hess, makers_spectrum, spectrum_dataset): makers = DatasetsMaker(makers_spectrum, stack_datasets=False, n_jobs=2) datasets = makers.run(spectrum_dataset, observations_hess) counts = datasets[0].counts assert counts.unit == "" assert_allclose(counts.data.sum(), 192, rtol=1e-5) assert_allclose(datasets[0].background.data.sum(), 18.66666664, rtol=1e-5) exposure = datasets[0].exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.mean(), 3.94257338e08, rtol=3e-3) @requires_data() def test_datasets_maker_spectrum_large_region( observations_hess, makers_spectrum_large_region, spectrum_dataset ): makers = DatasetsMaker(makers_spectrum_large_region, stack_datasets=True, n_jobs=4) datasets = makers.run(spectrum_dataset, observations_hess) counts = datasets[0].counts assert counts.unit == "" assert_allclose(counts.data.sum(), 0, rtol=1e-5) assert_allclose(datasets[0].background.data.sum(), 0, rtol=1e-5) @requires_data() def test_dataset_maker_spectrum_rad_max(observations_magic_rad_max): """test the energy-dependent spectrum extraction""" observation = observations_magic_rad_max[0] maker = SpectrumDatasetMaker( containment_correction=False, selection=["counts", "exposure", "edisp"] ) dataset = maker.run(get_spectrum_dataset_rad_max("spec"), observation) finder = WobbleRegionsFinder(n_off_regions=1) bkg_maker = ReflectedRegionsBackgroundMaker(region_finder=finder) dataset_on_off = bkg_maker.run(dataset, observation) counts = dataset_on_off.counts counts_off = dataset_on_off.counts_off assert counts.unit == "" assert counts_off is not None, "Extracting off counts failed" assert 
counts_off.unit == "" assert_allclose(counts.data.sum(), 964, rtol=1e-5) assert_allclose(counts_off.data.sum(), 530, rtol=1e-5) exposure = dataset_on_off.exposure assert exposure.unit == "m2 s" assert_allclose(exposure.data.mean(), 68067647.733834, rtol=1e-5) @requires_data() def test_dataset_maker_spectrum_global_rad_max(): """test the energy-dependent spectrum extraction""" observation = Observation.read( "$GAMMAPY_DATA/joint-crab/dl3/magic/run_05029748_DL3.fits" ) maker = SpectrumDatasetMaker( containment_correction=False, selection=["counts", "exposure", "edisp"] ) dataset = maker.run(get_spectrum_dataset_rad_max("spec"), observation) finder = WobbleRegionsFinder(n_off_regions=3) bkg_maker = ReflectedRegionsBackgroundMaker(region_finder=finder) dataset_on_off = bkg_maker.run(dataset, observation) counts = dataset_on_off.counts counts_off = dataset_on_off.counts_off assert counts.unit == "" assert counts_off.unit == "" assert_allclose(counts.data.sum(), 438, rtol=1e-5) assert_allclose(counts_off.data.sum(), 276, rtol=1e-5) @requires_data() def test_dataset_maker_spectrum_rad_max_overlapping(observations_magic_rad_max, caplog): """test the energy-dependent spectrum extraction""" observation = observations_magic_rad_max[0] maker = SpectrumDatasetMaker( containment_correction=False, selection=["counts", "exposure", "edisp"] ) finder = WobbleRegionsFinder(n_off_regions=5) bkg_maker = ReflectedRegionsBackgroundMaker(region_finder=finder) with caplog.at_level(logging.WARNING): dataset = maker.run(get_spectrum_dataset_rad_max("spec"), observation) dataset_on_off = bkg_maker.run(dataset, observation) assert len(caplog.record_tuples) == 3 assert caplog.record_tuples[0] == ( "gammapy.makers.background.reflected", logging.WARNING, "Found overlapping on/off regions, choose less off regions", ) # overlapping off regions means not counts will be filled assert dataset_on_off.counts_off is None assert (dataset_on_off.acceptance_off.data == 0).all() # test that it works if we only look at higher energies with lower # rad max, allowing more off regions caplog.clear() with caplog.at_level(logging.WARNING): dataset = maker.run( get_spectrum_dataset_rad_max("spec", e_min=250 * u.GeV), observation ) dataset_on_off = bkg_maker.run(dataset, observation) assert dataset_on_off.counts_off is not None assert len(caplog.records) == 0 @requires_data() def test_dataset_maker_spectrum_rad_max_all_excluded( observations_magic_rad_max, caplog ): """test the energy-dependent spectrum extraction""" observation = observations_magic_rad_max[0] maker = SpectrumDatasetMaker( containment_correction=False, selection=["counts", "exposure", "edisp"] ) dataset = maker.run(get_spectrum_dataset_rad_max("spec"), observation) # excludes all possible off regions exclusion_region = CircleSkyRegion( center=observation.get_pointing_icrs(observation.tmid), radius=1 * u.deg, ) geom = WcsGeom.create( npix=(150, 150), binsz=0.05, skydir=observation.get_pointing_icrs(observation.tmid), proj="TAN", frame="icrs", ) exclusion_mask = ~geom.region_mask([exclusion_region]) finder = WobbleRegionsFinder(n_off_regions=1) bkg_maker = ReflectedRegionsBackgroundMaker( region_finder=finder, exclusion_mask=exclusion_mask, ) with caplog.at_level(logging.WARNING): dataset_on_off = bkg_maker.run(dataset, observation) # overlapping off regions means not counts will be filled assert dataset_on_off.counts_off is None assert (dataset_on_off.acceptance_off.data == 0).all() assert len(caplog.record_tuples) == 2 assert caplog.record_tuples[0] == ( 
"gammapy.makers.background.reflected", logging.WARNING, "ReflectedRegionsBackgroundMaker failed. No OFF region found outside " "exclusion mask for dataset 'spec'.", ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/tests/test_safe.py0000644000175100001770000002550314721316200021024 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from regions import CircleSkyRegion from gammapy.data import DataStore from gammapy.datasets import MapDataset, SpectrumDatasetOnOff from gammapy.makers import MapDatasetMaker, SafeMaskMaker from gammapy.maps import MapAxis, RegionGeom, WcsGeom from gammapy.utils.testing import requires_data @pytest.fixture(scope="session") def observations(): data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps/") obs_id = [110380, 111140] return data_store.get_observations(obs_id) @pytest.fixture def observations_hess_dl3(): """HESS DL3 observation list.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs_ids = [23523, 23526] return datastore.get_observations(obs_ids) @pytest.fixture(scope="session") def observation_cta_1dc(): data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps/") return data_store.obs(110380) @pytest.fixture(scope="session") def dataset(observation_cta_1dc): axis = MapAxis.from_bounds( 0.1, 10, nbin=16, unit="TeV", name="energy", interp="log" ) axis_true = MapAxis.from_bounds( 0.1, 50, nbin=30, unit="TeV", name="energy_true", interp="log" ) pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) geom = WcsGeom.create(npix=(11, 11), axes=[axis], skydir=pointing) empty_dataset = MapDataset.create(geom=geom, energy_axis_true=axis_true) dataset_maker = MapDatasetMaker() return dataset_maker.run(dataset=empty_dataset, observation=observation_cta_1dc) @pytest.fixture(scope="session") def shifted_dataset(observation_cta_1dc): axis = MapAxis.from_bounds(0.1, 1, nbin=5, unit="TeV", name="energy", interp="log") axis_true = MapAxis.from_bounds( 0.1, 2, nbin=10, unit="TeV", name="energy_true", interp="log" ) pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) skydir = pointing.directional_offset_by( position_angle=0.0 * u.deg, separation=10 * u.deg ) geom = WcsGeom.create(npix=(11, 11), axes=[axis], skydir=skydir) empty_dataset = MapDataset.create( geom=geom, energy_axis_true=axis_true, name="shifted" ) dataset_maker = MapDatasetMaker() return dataset_maker.run(dataset=empty_dataset, observation=observation_cta_1dc) @pytest.fixture(scope="session") def spectrum_dataset_on_off(observation_cta_1dc): axis = MapAxis.from_bounds( 0.1, 10, nbin=16, unit="TeV", name="energy", interp="log" ) axis_true = MapAxis.from_bounds( 0.1, 50, nbin=30, unit="TeV", name="energy_true", interp="log" ) axis_migra = MapAxis.from_bounds(0.2, 5.0, nbin=48, name="migra") pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) region = CircleSkyRegion(pointing, radius=0.3 * u.deg) geom = RegionGeom.create(region, axes=[axis]) return SpectrumDatasetOnOff.create( geom, energy_axis_true=axis_true, migra_axis=axis_migra ) @requires_data() def test_safe_mask_maker_offset_max(dataset, observation_cta_1dc): pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) safe_mask_maker = SafeMaskMaker( offset_max="3 deg", position=pointing, ) mask_offset = 
safe_mask_maker.make_mask_offset_max( dataset=dataset, observation=observation_cta_1dc ) assert_allclose(mask_offset.sum(), 109) @requires_data() def test_safe_mask_maker_aeff_default(dataset, observation_cta_1dc, caplog): pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) safe_mask_maker = SafeMaskMaker(position=pointing) mask_energy_aeff_default = safe_mask_maker.make_mask_energy_aeff_default( dataset=dataset, observation=observation_cta_1dc ) assert_allclose(mask_energy_aeff_default.data.sum(), 1936) assert "WARNING" in [_.levelname for _ in caplog.records] messages = [_.message for _ in caplog.records] message = "No default upper safe energy threshold defined for obs 110380" assert message == messages[0] message = "No default lower safe energy threshold defined for obs 110380" assert message == messages[1] @requires_data() def test_safe_mask_maker_aeff_max(dataset, observation_cta_1dc): pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) safe_mask_maker = SafeMaskMaker(position=pointing) mask_aeff_max = safe_mask_maker.make_mask_energy_aeff_max(dataset) assert_allclose(mask_aeff_max.data.sum(), 1210) @requires_data() def test_safe_mask_maker_aeff_max_fixed_observation( dataset, shifted_dataset, observation_cta_1dc, caplog ): safe_mask_maker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=20) mask_aeff_max = safe_mask_maker.make_mask_energy_aeff_max( dataset, observation=observation_cta_1dc ) assert_allclose(mask_aeff_max.data.sum(), 847) with caplog.at_level(logging.WARNING): mask_aeff_max_bis = safe_mask_maker.make_mask_energy_aeff_max( shifted_dataset, observation=observation_cta_1dc ) assert len(caplog.record_tuples) == 1 assert caplog.record_tuples[0] == ( "gammapy.makers.safe", logging.WARNING, "Effective area is all zero at [267d40m52.368168s -19d36m27s]. 
No safe " "energy band can be defined for the dataset 'shifted': setting `mask_safe`" " to all False.", ) assert_allclose(mask_aeff_max_bis.data.sum(), 0) maker = SafeMaskMaker(irfs="DL3") with pytest.raises(ValueError): maker.run(dataset) @requires_data() def test_safe_mask_maker_aeff_max_fixed_offset(dataset, observation_cta_1dc): safe_mask_maker = SafeMaskMaker( methods=["aeff-max"], aeff_percent=20, fixed_offset=5 * u.deg ) mask_aeff_max = safe_mask_maker.make_mask_energy_aeff_max( dataset, observation=observation_cta_1dc ) assert_allclose(mask_aeff_max.data.sum(), 726) with pytest.raises(ValueError): mask_aeff_max = safe_mask_maker.make_mask_energy_aeff_max(dataset) safe_mask_maker1 = SafeMaskMaker( methods=["aeff-max"], aeff_percent=20, fixed_offset=None, irfs="DL3" ) mask1 = safe_mask_maker1.make_mask_energy_aeff_max(dataset, observation_cta_1dc) assert_allclose(mask1.data.sum(), 847) @requires_data() def test_safe_mask_maker_offset_max_fixed_offset(dataset, observation_cta_1dc): safe_mask_maker_offset = SafeMaskMaker(offset_max="3 deg", fixed_offset=1.5 * u.deg) mask_aeff_max_offset = safe_mask_maker_offset.make_mask_energy_aeff_max( dataset=dataset, observation=observation_cta_1dc ) assert_allclose(mask_aeff_max_offset.data.sum(), 1210) @requires_data() def test_safe_mask_maker_edisp_bias(dataset, observation_cta_1dc): pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) safe_mask_maker = SafeMaskMaker(bias_percent=0.02, position=pointing) mask_edisp_bias = safe_mask_maker.make_mask_energy_edisp_bias(dataset=dataset) assert_allclose(mask_edisp_bias.data.sum(), 1815) @requires_data() def test_safe_mask_maker_spectrum_dataset_edisp_bias_no_position( spectrum_dataset_on_off, ): safe_mask_maker = SafeMaskMaker(methods=["edisp-bias"], bias_percent=0.02) safe_mask_maker.run(spectrum_dataset_on_off) mask_edisp_bias = safe_mask_maker.make_mask_energy_edisp_bias( dataset=spectrum_dataset_on_off ) assert_allclose(mask_edisp_bias.data.sum(), 14) @requires_data() def test_safe_mask_maker_edisp_bias_fixed_offset(dataset, observation_cta_1dc): safe_mask_maker_offset = SafeMaskMaker( offset_max="3 deg", bias_percent=0.02, fixed_offset=1.5 * u.deg ) mask_edisp_bias_offset = safe_mask_maker_offset.make_mask_energy_edisp_bias( dataset=dataset, observation=observation_cta_1dc ) assert_allclose(mask_edisp_bias_offset.data.sum(), 1694) safe_mask_maker1 = SafeMaskMaker( irfs="DL3", bias_percent=0.02, fixed_offset=1.5 * u.deg ) mask_edisp_bias_offset = safe_mask_maker1.make_mask_energy_edisp_bias( dataset, observation_cta_1dc ) assert_allclose(mask_edisp_bias_offset.data.sum(), 1694) @requires_data() def test_safe_mask_maker_bkg_peak(dataset, observation_cta_1dc): pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) safe_mask_maker = SafeMaskMaker(position=pointing) mask_bkg_peak = safe_mask_maker.make_mask_energy_bkg_peak(dataset) assert_allclose(mask_bkg_peak.data.sum(), 1936) safe_mask_maker1 = SafeMaskMaker(fixed_offset=1.0 * u.deg, irfs="DL3") mask_bkg_peak = safe_mask_maker1.make_mask_energy_bkg_peak( dataset, observation_cta_1dc ) assert_allclose(mask_bkg_peak.data.sum(), 1936) @requires_data() def test_safe_mask_maker_bkg_peak_first_bin(dataset, observation_cta_1dc): pointing = observation_cta_1dc.get_pointing_icrs(observation_cta_1dc.tmid) safe_mask_maker = SafeMaskMaker(position=pointing) dataset_maker = MapDatasetMaker() axis = MapAxis.from_bounds(1.0, 10, nbin=6, unit="TeV", name="energy", interp="log") geom = WcsGeom.create(npix=(5, 5), 
axes=[axis], skydir=pointing) empty_dataset = MapDataset.create(geom=geom) dataset = dataset_maker.run(empty_dataset, observation_cta_1dc) mask_bkg_peak = safe_mask_maker.make_mask_energy_bkg_peak(dataset) assert np.all(mask_bkg_peak) @requires_data() def test_safe_mask_maker_no_root(dataset): safe_mask_maker_noroot = SafeMaskMaker( offset_max="3 deg", aeff_percent=-10, bias_percent=-10 ) mask_aeff_max_noroot = safe_mask_maker_noroot.make_mask_energy_aeff_max(dataset) mask_edisp_bias_noroot = safe_mask_maker_noroot.make_mask_energy_edisp_bias(dataset) assert_allclose(mask_aeff_max_noroot.data.sum(), 1815) assert_allclose(mask_edisp_bias_noroot.data.sum(), 1936) @requires_data() def test_safe_mask_maker_bkg_invalid(observations_hess_dl3): obs = observations_hess_dl3[0] pointing = obs.get_pointing_icrs(obs.tmid) axis = MapAxis.from_bounds( 0.1, 10, nbin=16, unit="TeV", name="energy", interp="log" ) axis_true = MapAxis.from_bounds( 0.1, 50, nbin=30, unit="TeV", name="energy_true", interp="log" ) geom = WcsGeom.create(npix=(9, 9), axes=[axis], skydir=pointing) empty_dataset = MapDataset.create(geom=geom, energy_axis_true=axis_true) dataset_maker = MapDatasetMaker() safe_mask_maker_nonan = SafeMaskMaker([]) dataset = dataset_maker.run(empty_dataset, obs) bkg = dataset.background.data bkg[0, 0, 0] = np.nan mask_nonan = safe_mask_maker_nonan.make_mask_bkg_invalid(dataset) assert not mask_nonan[0, 0, 0] assert_allclose(bkg[mask_nonan].max(), 20.656366) dataset = safe_mask_maker_nonan.run(dataset, obs) assert_allclose(dataset.mask_safe, mask_nonan) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/tests/test_spectrum.py0000644000175100001770000003114014721316200021742 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import Angle, SkyCoord from regions import CircleSkyRegion from gammapy.data import DataStore from gammapy.datasets import SpectrumDataset from gammapy.irf import FoVAlignment from gammapy.makers import ( ReflectedRegionsBackgroundMaker, SafeMaskMaker, SpectrumDatasetMaker, ) from gammapy.maps import MapAxis, RegionGeom, WcsGeom from gammapy.utils.testing import assert_quantity_allclose, requires_data @pytest.fixture def observations_hess_dl3(): """HESS DL3 observation list.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") obs_ids = [23523, 23526] return datastore.get_observations(obs_ids) @pytest.fixture def observations_magic_dl3(): """MAGIC DL3 observation list.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/joint-crab/dl3/magic/") obs_ids = [5029748] return datastore.get_observations(obs_ids, required_irf=["aeff", "edisp"]) @pytest.fixture def observations_cta_dc1(): """CTA DC1 observation list.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps/") obs_ids = [110380, 111140] return datastore.get_observations(obs_ids) @pytest.fixture() def spectrum_dataset_gc(): e_reco = MapAxis.from_energy_edges(np.logspace(0, 2, 5) * u.TeV, name="energy") e_true = MapAxis.from_energy_edges( np.logspace(-1, 2, 13) * u.TeV, name="energy_true" ) geom = RegionGeom.create("galactic;circle(0, 0, 0.11)", axes=[e_reco]) return SpectrumDataset.create(geom=geom, energy_axis_true=e_true) @pytest.fixture() def spectrum_dataset_magic_crab(): e_reco = MapAxis.from_edges(np.logspace(0, 2, 5) * u.TeV, name="energy") e_true = 
MapAxis.from_edges(np.logspace(-0.5, 2, 11) * u.TeV, name="energy_true") geom = RegionGeom.create( "icrs;circle(83.63, 22.01, 0.14)", axes=[e_reco], binsz_wcs="0.01deg" ) return SpectrumDataset.create(geom=geom, energy_axis_true=e_true) @pytest.fixture() def spectrum_dataset_crab(): e_reco = MapAxis.from_energy_edges(np.logspace(0, 2, 5) * u.TeV) e_true = MapAxis.from_energy_edges( np.logspace(-0.5, 2, 11) * u.TeV, name="energy_true" ) geom = RegionGeom.create( "icrs;circle(83.63, 22.01, 0.11)", axes=[e_reco], binsz_wcs="0.01deg" ) return SpectrumDataset.create(geom=geom, energy_axis_true=e_true) @pytest.fixture() def spectrum_dataset_crab_fine(): e_true = MapAxis.from_energy_edges( np.logspace(-2, 2.5, 109) * u.TeV, name="energy_true" ) e_reco = MapAxis.from_energy_edges(np.logspace(-2, 2, 73) * u.TeV) geom = RegionGeom.create("icrs;circle(83.63, 22.01, 0.11)", axes=[e_reco]) return SpectrumDataset.create(geom=geom, energy_axis_true=e_true) @pytest.fixture def reflected_regions_bkg_maker(): pos = SkyCoord(83.63, 22.01, unit="deg", frame="icrs") exclusion_region = CircleSkyRegion(pos, Angle(0.3, "deg")) geom = WcsGeom.create(skydir=pos, binsz=0.02, width=10.0) exclusion_mask = ~geom.region_mask([exclusion_region]) return ReflectedRegionsBackgroundMaker( exclusion_mask=exclusion_mask, min_distance_input="0.2 deg" ) @requires_data() def test_region_center_spectrum_dataset_maker_hess_dl3( spectrum_dataset_crab, observations_hess_dl3 ): datasets = [] maker = SpectrumDatasetMaker(use_region_center=True) for obs in observations_hess_dl3: dataset = maker.run(spectrum_dataset_crab, obs) datasets.append(dataset) assert isinstance(datasets[0], SpectrumDataset) assert not datasets[0].exposure.meta["is_pointlike"] assert_allclose(datasets[0].counts.data.sum(), 100) assert_allclose(datasets[1].counts.data.sum(), 92) assert_allclose(datasets[0].exposure.meta["livetime"].value, 1581.736758) assert_allclose(datasets[1].exposure.meta["livetime"].value, 1572.686724) assert_allclose(datasets[0].npred_background().data.sum(), 7.747881, rtol=1e-5) assert_allclose(datasets[1].npred_background().data.sum(), 5.731624, rtol=1e-5) @requires_data() def test_spectrum_dataset_maker_hess_dl3(spectrum_dataset_crab, observations_hess_dl3): datasets = [] maker = SpectrumDatasetMaker(use_region_center=False) datasets = [] for obs in observations_hess_dl3: assert obs.meta.optional["CREATOR"] == "SASH FITS::EventListWriter" assert obs.meta.optional["HDUVERS"] == "0.2" assert obs.bkg.fov_alignment == FoVAlignment.REVERSE_LON_RADEC dataset = maker.run(spectrum_dataset_crab, obs) datasets.append(dataset) # Exposure assert_allclose(datasets[0].exposure.data.sum(), 7.3111e09) assert_allclose(datasets[1].exposure.data.sum(), 6.634534e09) # Background assert_allclose(datasets[0].npred_background().data.sum(), 7.7429157, rtol=1e-5) assert_allclose(datasets[1].npred_background().data.sum(), 5.7314076, rtol=1e-5) # Compare background with using bigger region e_reco = datasets[0].background.geom.axes["energy"] e_true = datasets[0].exposure.geom.axes["energy_true"] geom_bigger = RegionGeom.create("icrs;circle(83.63, 22.01, 0.22)", axes=[e_reco]) datasets_big_region = [] bigger_region_dataset = SpectrumDataset.create( geom=geom_bigger, energy_axis_true=e_true ) for obs in observations_hess_dl3: dataset = maker.run(bigger_region_dataset, obs) datasets_big_region.append(dataset) ratio_regions = ( datasets[0].counts.geom.solid_angle() / datasets_big_region[1].counts.geom.solid_angle() ) ratio_bg_1 = ( 
datasets[0].npred_background().data.sum() / datasets_big_region[0].npred_background().data.sum() ) ratio_bg_2 = ( datasets[1].npred_background().data.sum() / datasets_big_region[1].npred_background().data.sum() ) assert_allclose(ratio_bg_1, ratio_regions, rtol=1e-2) assert_allclose(ratio_bg_2, ratio_regions, rtol=1e-2) # Edisp -> it isn't exactly 8, is that right? it also isn't without averaging assert_allclose( datasets[0].edisp.edisp_map.data[:, :, 0, 0].sum(), e_reco.nbin * 2, rtol=1e-1 ) assert_allclose( datasets[1].edisp.edisp_map.data[:, :, 0, 0].sum(), e_reco.nbin * 2, rtol=1e-1 ) @requires_data() def test_spectrum_dataset_maker_hess_cta(spectrum_dataset_gc, observations_cta_dc1): maker = SpectrumDatasetMaker(use_region_center=True) datasets = [] for obs in observations_cta_dc1: dataset = maker.run(spectrum_dataset_gc, obs) datasets.append(dataset) assert_allclose(datasets[0].counts.data.sum(), 53) assert_allclose(datasets[1].counts.data.sum(), 47) assert_allclose(datasets[0].exposure.meta["livetime"].value, 1764.000034) assert_allclose(datasets[1].exposure.meta["livetime"].value, 1764.000034) assert_allclose(datasets[0].npred_background().data.sum(), 2.238805, rtol=1e-5) assert_allclose(datasets[1].npred_background().data.sum(), 2.165188, rtol=1e-5) @requires_data() def test_safe_mask_maker_dl3(spectrum_dataset_crab, observations_hess_dl3): safe_mask_maker = SafeMaskMaker(bias_percent=20) maker = SpectrumDatasetMaker() obs = observations_hess_dl3[0] dataset = maker.run(spectrum_dataset_crab, obs) dataset = safe_mask_maker.run(dataset, obs) assert_allclose(dataset.energy_range[0], 1) assert dataset.energy_range[0].unit == "TeV" mask_safe = safe_mask_maker.make_mask_energy_aeff_max(dataset) assert mask_safe.data.sum() == 4 mask_safe = safe_mask_maker.make_mask_energy_edisp_bias(dataset) assert mask_safe.data.sum() == 3 mask_safe = safe_mask_maker.make_mask_energy_bkg_peak(dataset) assert mask_safe.data.sum() == 4 @requires_data() def test_safe_mask_maker_dc1(spectrum_dataset_gc, observations_cta_dc1): safe_mask_maker = SafeMaskMaker(methods=["aeff-max"]) empty = SpectrumDataset.from_geoms(**spectrum_dataset_gc.geoms) obs = observations_cta_dc1[1] maker = SpectrumDatasetMaker() dataset = maker.run(empty, obs) dataset = safe_mask_maker.run(dataset, obs) assert_allclose(dataset.energy_range[0].data, 1.0, rtol=1e-3) assert dataset.energy_range[0].unit == "TeV" @requires_data() def test_make_meta_table(observations_hess_dl3): maker_obs = SpectrumDatasetMaker() map_spectrumdataset_meta_table = maker_obs.make_meta_table( observation=observations_hess_dl3[0] ) assert_allclose(map_spectrumdataset_meta_table["RA_PNT"], 83.63333129882812) assert_allclose(map_spectrumdataset_meta_table["DEC_PNT"], 21.51444435119629) assert_allclose(map_spectrumdataset_meta_table["OBS_ID"], 23523) @requires_data() def test_region_center_spectrum_dataset_maker_magic_dl3( spectrum_dataset_magic_crab, observations_magic_dl3, caplog ): maker = SpectrumDatasetMaker(use_region_center=True, selection=["exposure"]) maker_average = SpectrumDatasetMaker( use_region_center=False, selection=["exposure"] ) maker_correction = SpectrumDatasetMaker( containment_correction=True, selection=["exposure"] ) # containment correction should fail with pytest.raises(ValueError): maker_correction.run(spectrum_dataset_magic_crab, observations_magic_dl3[0]) # use_center = True should run and raise no warning dataset = maker.run(spectrum_dataset_magic_crab, observations_magic_dl3[0]) assert isinstance(dataset, SpectrumDataset) assert 
dataset.exposure.meta["is_pointlike"] assert "WARNING" not in [record.levelname for record in caplog.records] # use_center = False should raise a warning dataset_average = maker_average.run( spectrum_dataset_magic_crab, observations_magic_dl3[0] ) assert "WARNING" in [record.levelname for record in caplog.records] message = ( "MapMaker: use_region_center=False should not be used with point-like IRF. " "Results are likely inaccurate." ) assert message in [record.message for record in caplog.records] assert dataset_average.exposure.meta["is_pointlike"] @requires_data() class TestSpectrumMakerChain: @staticmethod @pytest.mark.parametrize( "pars, results", [ ( dict(containment_correction=False), dict(n_on=125, sigma=18.953014, aeff=580254.9 * u.m**2, edisp=0.235864), ), ( dict(containment_correction=True), dict( n_on=125, sigma=18.953014, aeff=375314.356461 * u.m**2, edisp=0.235864, ), ), ], ) def test_extract( pars, results, observations_hess_dl3, spectrum_dataset_crab_fine, reflected_regions_bkg_maker, ): """Test quantitative output for various configs""" safe_mask_maker = SafeMaskMaker() maker = SpectrumDatasetMaker( containment_correction=pars["containment_correction"] ) obs = observations_hess_dl3[0] dataset = maker.run(spectrum_dataset_crab_fine, obs) dataset = reflected_regions_bkg_maker.run(dataset, obs) dataset = safe_mask_maker.run(dataset, obs) exposure_actual = ( dataset.exposure.interp_by_coord( { "energy_true": 5 * u.TeV, "skycoord": dataset.counts.geom.center_skydir, } ) * dataset.exposure.unit ) edisp_actual = dataset.edisp.get_edisp_kernel().evaluate( energy_true=5 * u.TeV, energy=5.2 * u.TeV ) aeff_actual = exposure_actual / dataset.exposure.meta["livetime"] assert_quantity_allclose(aeff_actual, results["aeff"], rtol=1e-3) assert_quantity_allclose(edisp_actual, results["edisp"], rtol=1e-3) info = dataset.info_dict() assert info["counts"] == results["n_on"] assert_allclose(info["sqrt_ts"], results["sigma"], rtol=1e-2) assert_allclose(dataset.gti.met_start, obs.gti.met_start) assert_allclose(dataset.gti.met_stop, obs.gti.met_stop) def test_compute_energy_threshold( self, spectrum_dataset_crab_fine, observations_hess_dl3 ): maker = SpectrumDatasetMaker(containment_correction=True) safe_mask_maker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10) obs = observations_hess_dl3[0] dataset = maker.run(spectrum_dataset_crab_fine, obs) dataset = safe_mask_maker.run(dataset, obs) actual = dataset.energy_range[0] assert_quantity_allclose(actual, 0.681292 * u.TeV, rtol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/tests/test_utils.py0000644000175100001770000004534614721316200021255 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import EarthLocation, SkyCoord from astropy.table import Table from astropy.time import Time from regions import PointSkyRegion from gammapy.data import GTI, DataStore, EventList, FixedPointingInfo, Observation from gammapy.irf import ( Background2D, Background3D, EffectiveAreaTable2D, EnergyDispersion2D, ) from gammapy.makers import WobbleRegionsFinder from gammapy.makers.utils import ( _map_spectrum_weight, guess_instrument_fov, make_counts_off_rad_max, make_counts_rad_max, make_edisp_kernel_map, make_effective_livetime_map, make_map_background_irf, make_map_exposure_true_energy, make_observation_time_map, 
make_theta_squared_table, ) from gammapy.maps import HpxGeom, MapAxis, RegionGeom, WcsGeom, WcsNDMap from gammapy.modeling.models import ConstantSpectralModel from gammapy.utils.testing import requires_data from gammapy.utils.time import time_ref_to_dict @pytest.fixture(scope="session") def observations(): """Example observation list for testing.""" datastore = DataStore.from_dir("$GAMMAPY_DATA/magic/rad_max/data") return datastore.get_observations(required_irf="point-like")[0] @pytest.fixture(scope="session") def aeff(): filename = ( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) return EffectiveAreaTable2D.read(filename, hdu="EFFECTIVE AREA") def geom(map_type, ebounds): axis = MapAxis.from_edges(ebounds, name="energy_true", unit="TeV", interp="log") if map_type == "wcs": return WcsGeom.create(npix=(4, 3), binsz=2, axes=[axis]) elif map_type == "hpx": return HpxGeom(256, axes=[axis]) else: raise ValueError() @requires_data() @pytest.mark.parametrize( "pars", [ { "geom": geom(map_type="wcs", ebounds=[0.1, 1, 10]), "shape": (2, 3, 4), "sum": 8.103974e08, }, { "geom": geom(map_type="wcs", ebounds=[0.1, 10]), "shape": (1, 3, 4), "sum": 2.387916e08, }, # TODO: make this work for HPX # 'HpxGeom' object has no attribute 'separation' # { # 'geom': geom(map_type='hpx', ebounds=[0.1, 1, 10]), # 'shape': '???', # 'sum': '???', # }, ], ) def test_make_map_exposure_true_energy(aeff, pars): m = make_map_exposure_true_energy( pointing=SkyCoord(2, 1, unit="deg"), livetime="42 s", aeff=aeff, geom=pars["geom"], ) assert m.data.shape == pars["shape"] assert m.unit == "m2 s" assert_allclose(m.data.sum(), pars["sum"], rtol=1e-5) def test_map_spectrum_weight(): axis = MapAxis.from_edges([0.1, 10, 1000], unit="TeV", name="energy_true") expo_map = WcsNDMap.create(npix=10, binsz=1, axes=[axis], unit="m2 s") expo_map.data += 1 spectrum = ConstantSpectralModel(const="42 cm-2 s-1 TeV-1") weighted_expo = _map_spectrum_weight(expo_map, spectrum) assert weighted_expo.data.shape == (2, 10, 10) assert weighted_expo.unit == "m2 s" assert_allclose(weighted_expo.data.sum(), 100) @pytest.fixture(scope="session") def fixed_pointing_info(): filename = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits" return FixedPointingInfo.read(filename) @pytest.fixture(scope="session") def fixed_pointing_info_aligned(): # Create Fixed Pointing Info aligned between sky and horizon coordinates # (removes rotation in FoV and results in predictable solid angles) time_start = Time("2000-09-21 11:55:00") time_stop = Time("2000-09-12 12:05:00") location = EarthLocation(lat=90 * u.deg, lon=0 * u.deg) fixed_icrs = SkyCoord(0 * u.deg, 0 * u.deg, frame="icrs") return FixedPointingInfo( fixed_icrs=fixed_icrs, location=location, time_start=time_start, time_stop=time_stop, ) @pytest.fixture(scope="session") def bkg_3d(): filename = ( "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits" ) return Background3D.read(filename, hdu="BACKGROUND") @pytest.fixture(scope="session") def bkg_2d(): offset_axis = MapAxis.from_bounds(0, 4, nbin=10, name="offset", unit="deg") energy_axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=20) bkg_2d = Background2D(axes=[energy_axis, offset_axis], unit="s-1 TeV-1 sr-1") coords = bkg_2d.axes.get_coord() value = np.exp(-0.5 * (coords["offset"] / (2 * u.deg)) ** 2) bkg_2d.data = (value * (coords["energy"] / (1 * u.TeV)) ** -2).to_value("") return bkg_2d def bkg_3d_custom(symmetry="constant", fov_align="RADEC"): if symmetry == "constant": data = 
np.ones((2, 3, 3)) elif symmetry == "symmetric": data = np.ones((2, 3, 3)) data[:, 1, 1] *= 2 elif symmetry == "asymmetric": data = np.indices((3, 3))[1] + 1 data = np.stack(2 * [data]) else: raise ValueError(f"Unknown value for symmetry: {symmetry}") energy_axis = MapAxis.from_energy_edges([0.1, 10, 1000] * u.TeV) fov_lon_axis = MapAxis.from_edges([-3, -1, 1, 3] * u.deg, name="fov_lon") fov_lat_axis = MapAxis.from_edges([-3, -1, 1, 3] * u.deg, name="fov_lat") return Background3D( axes=[energy_axis, fov_lon_axis, fov_lat_axis], data=data, unit=u.Unit("s-1 MeV-1 sr-1"), interp_kwargs=dict(bounds_error=False, fill_value=None, values_scale="log"), fov_alignment=fov_align, # allow extrapolation for symmetry tests ) @requires_data() def test_map_background_2d(bkg_2d, fixed_pointing_info): axis = MapAxis.from_edges([0.1, 1, 10], name="energy", unit="TeV", interp="log") obstime = Time("2020-01-01T20:00:00") skydir = fixed_pointing_info.get_icrs(obstime).galactic geom = WcsGeom.create( npix=(3, 3), binsz=4, axes=[axis], skydir=skydir, frame="galactic" ) bkg = make_map_background_irf( pointing=skydir, ontime="42 s", bkg=bkg_2d, geom=geom, ) assert_allclose(bkg.data[:, 1, 1], [1.869025, 0.186903], rtol=1e-5) # Check that function works also passing the FixedPointingInfo bkg_fpi = make_map_background_irf( pointing=fixed_pointing_info, ontime="42 s", bkg=bkg_2d, geom=geom, obstime=obstime, ) assert_allclose(bkg.data, bkg_fpi.data, rtol=1e-5) def make_map_background_irf_with_symmetry(fpi, symmetry="constant"): axis = MapAxis.from_edges([0.1, 1, 10], name="energy", unit="TeV", interp="log") obstime = Time("2020-01-01T20:00:00") return make_map_background_irf( pointing=fpi, ontime="42 s", bkg=bkg_3d_custom(symmetry), geom=WcsGeom.create(npix=(3, 3), binsz=4, axes=[axis], skydir=fpi.fixed_icrs), obstime=obstime, ) def geom(map_type, ebounds, skydir): axis = MapAxis.from_edges(ebounds, name="energy", unit="TeV", interp="log") if map_type == "wcs": return WcsGeom.create(npix=(4, 3), binsz=2, axes=[axis], skydir=skydir) elif map_type == "hpx": return HpxGeom(256, axes=[axis]) else: raise ValueError() @requires_data() @pytest.mark.parametrize( "pars", [ { "map_type": "wcs", "ebounds": [0.1, 1, 10], "shape": (2, 3, 4), "sum": 1051.960299, }, { "map_type": "wcs", "ebounds": [0.1, 10], "shape": (1, 3, 4), "sum": 1051.960299, }, # TODO: make this work for HPX # 'HpxGeom' object has no attribute 'separation' # { # 'geom': geom(map_type='hpx', ebounds=[0.1, 1, 10]), # 'shape': '???', # 'sum': '???', # }, ], ) def test_make_map_background_irf(bkg_3d, pars, fixed_pointing_info): m = make_map_background_irf( pointing=fixed_pointing_info, ontime="42 s", bkg=bkg_3d, geom=geom( map_type=pars["map_type"], ebounds=pars["ebounds"], skydir=fixed_pointing_info.fixed_icrs, ), oversampling=10, obstime=Time("2020-01-01T20:00"), ) assert m.data.shape == pars["shape"] assert m.unit == "" assert_allclose(m.data.sum(), pars["sum"], rtol=1e-5) @requires_data() def test_make_map_background_irf_constant(fixed_pointing_info_aligned): m = make_map_background_irf_with_symmetry( fpi=fixed_pointing_info_aligned, symmetry="constant" ) for d in m.data: assert_allclose(d[1, :], d[1, 0]) # Constant along lon assert_allclose(d[0, 1], d[2, 1]) # Symmetric along lat with pytest.raises(AssertionError): # Not constant along lat due to changes in # solid angle (great circle) assert_allclose(d[:, 1], d[0, 1]) @requires_data() def test_make_map_background_irf_sym(fixed_pointing_info_aligned): m = make_map_background_irf_with_symmetry( 
fpi=fixed_pointing_info_aligned, symmetry="symmetric" ) for d in m.data: assert_allclose(d[1, 0], d[1, 2], rtol=1e-4) # Symmetric along lon assert_allclose(d[0, 1], d[2, 1], rtol=1e-4) # Symmetric along lat @requires_data() def test_make_map_background_irf_asym(fixed_pointing_info_aligned): m = make_map_background_irf_with_symmetry( fpi=fixed_pointing_info_aligned, symmetry="asymmetric" ) for d in m.data: # TODO: # Dimensions of skymap data are [energy, lat, lon] (and is # represented as [lon, lat, energy] in the api, but the bkg irf # dimensions are currently [energy, lon, lat] - Will be changed in # the future (perhaps when IRFs use the skymaps class) assert_allclose(d[1, 0], d[1, 2], rtol=1e-4) # Symmetric along lon with pytest.raises(AssertionError): assert_allclose(d[0, 1], d[2, 1], rtol=1e-4) # Symmetric along lat assert_allclose(d[0, 1] * 9, d[2, 1], rtol=1e-4) # Asymmetric along lat @requires_data() def test_make_map_background_irf_skycoord(fixed_pointing_info_aligned): axis = MapAxis.from_edges([0.1, 1, 10], name="energy", unit="TeV", interp="log") position = fixed_pointing_info_aligned.fixed_icrs with pytest.raises(TypeError): make_map_background_irf( pointing=position, ontime="42 s", bkg=bkg_3d_custom("asymmetric", "ALTAZ"), geom=WcsGeom.create(npix=(3, 3), binsz=4, axes=[axis], skydir=position), ) def test_make_edisp_kernel_map(): migra = MapAxis.from_edges(np.linspace(0.5, 1.5, 50), unit="", name="migra") etrue = MapAxis.from_energy_bounds(0.5, 2, 6, unit="TeV", name="energy_true") offset = MapAxis.from_edges(np.linspace(0.0, 2.0, 3), unit="deg", name="offset") ereco = MapAxis.from_energy_bounds(0.5, 2, 3, unit="TeV", name="energy") edisp = EnergyDispersion2D.from_gauss( energy_axis_true=etrue, migra_axis=migra, bias=0, sigma=0.01, offset_axis=offset ) geom = WcsGeom.create(10, binsz=0.5, axes=[ereco, etrue]) pointing = SkyCoord(0, 0, frame="icrs", unit="deg") edispmap = make_edisp_kernel_map(edisp, pointing, geom) kernel = edispmap.get_edisp_kernel(position=pointing) assert_allclose(kernel.pdf_matrix[:, 0], (1.0, 1.0, 0.0, 0.0, 0.0, 0.0), atol=1e-14) assert_allclose(kernel.pdf_matrix[:, 1], (0.0, 0.0, 1.0, 1.0, 0.0, 0.0), atol=1e-14) assert_allclose(kernel.pdf_matrix[:, 2], (0.0, 0.0, 0.0, 0.0, 1.0, 1.0), atol=1e-14) @requires_data() def test_make_counts_rad_max(observations): pos = SkyCoord(083.6331144560900, +22.0144871383400, unit="deg", frame="icrs") on_region = PointSkyRegion(pos) energy_axis = MapAxis.from_energy_bounds( 0.05, 100, nbin=6, unit="TeV", name="energy" ) geome = RegionGeom.create(region=on_region, axes=[energy_axis]) counts = make_counts_rad_max(geome, observations.rad_max, observations.events) assert_allclose(np.squeeze(counts.data), np.array([547, 188, 52, 8, 0, 0])) @requires_data() def test_make_counts_off_rad_max(observations): pos = SkyCoord(83.6331, +22.0145, unit="deg", frame="icrs") on_region = PointSkyRegion(pos) energy_axis = MapAxis.from_energy_bounds( 0.05, 100, nbin=6, unit="TeV", name="energy" ) region_finder = WobbleRegionsFinder(n_off_regions=3) region_off, wcs = region_finder.run(on_region, pos) geom_off = RegionGeom.from_regions(regions=region_off, axes=[energy_axis], wcs=wcs) counts_off = make_counts_off_rad_max( geom_off=geom_off, rad_max=observations.rad_max, events=observations.events ) assert_allclose(np.squeeze(counts_off.data), np.array([1641, 564, 156, 24, 0, 0])) class TestTheta2Table: def setup_class(self): self.observations = [] for sign in [-1, 1]: events = Table() events["RA"] = [0.0, 0.0, 0.0, 0.0, 10.0] * u.deg 
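# Each pass of this loop builds one of two observations mirrored in
# declination (sign = -1 and +1): the pointing sits at (0, sign * 0.5) deg,
# so the mirror OFF position of the ON test position (0, 0) is at
# (0, sign * 1.0) deg, and the event placed at DEC = sign * 0.9 deg falls
# in the first theta2 bin of the OFF distribution.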
events["DEC"] = sign * ([0.0, 0.05, 0.9, 10.0, 10.0] * u.deg) events["ENERGY"] = [1.0, 1.0, 1.5, 1.5, 10.0] * u.TeV events["OFFSET"] = [0.1, 0.1, 0.5, 1.0, 1.5] * u.deg events["TIME"] = [0.1, 0.2, 0.3, 0.4, 0.5] * u.s obs_info = dict( OBS_ID=0, DEADC=1, GEOLON=16.500222222222224, GEOLAT=-23.271777777777775, ALTITUDE=1834.9999999997833, ) meta = time_ref_to_dict("2010-01-01") obs_info.update(meta) events.meta.update(obs_info) gti = GTI.create( start=[1] * u.s, stop=[3] * u.s, reference_time=Time("2010-01-01", scale="tt"), ) pointing = FixedPointingInfo( fixed_icrs=SkyCoord(0 * u.deg, sign * 0.5 * u.deg), ) self.observations.append( Observation( events=EventList(events), gti=gti, pointing=pointing, ) ) def test_make_theta_squared_table(self): # pointing position: (0,0.5) degree in ra/dec # On theta2 distribution compute from (0,0) in ra/dec. # OFF theta2 distribution from the mirror position at (0,1) in ra/dec. position = SkyCoord(ra=0, dec=0, unit="deg", frame="icrs") axis = MapAxis.from_bounds(0, 0.2, nbin=4, interp="lin", unit="deg2") theta2_table = make_theta_squared_table( observations=[self.observations[0]], position=position, theta_squared_axis=axis, ) theta2_lo = [0, 0.05, 0.1, 0.15] theta2_hi = [0.05, 0.1, 0.15, 0.2] on_counts = [2, 0, 0, 0] off_counts = [1, 0, 0, 0] acceptance = [1, 1, 1, 1] acceptance_off = [1, 1, 1, 1] alpha = [1, 1, 1, 1] assert len(theta2_table) == 4 assert theta2_table["theta2_min"].unit == "deg2" assert_allclose(theta2_table["theta2_min"], theta2_lo) assert_allclose(theta2_table["theta2_max"], theta2_hi) assert_allclose(theta2_table["counts"], on_counts) assert_allclose(theta2_table["counts_off"], off_counts) assert_allclose(theta2_table["acceptance"], acceptance) assert_allclose(theta2_table["acceptance_off"], acceptance_off) assert_allclose(theta2_table["alpha"], alpha) assert_allclose(theta2_table.meta["ON_RA"], 0 * u.deg) assert_allclose(theta2_table.meta["ON_DEC"], 0 * u.deg) # Taking the off position as the on one off_position = position theta2_table2 = make_theta_squared_table( observations=[self.observations[0]], position=position, theta_squared_axis=axis, position_off=off_position, ) assert_allclose(theta2_table2["counts_off"], theta2_table["counts"]) # Test for two observations, here identical theta2_table_two_obs = make_theta_squared_table( observations=self.observations, position=position, theta_squared_axis=axis, ) on_counts_two_obs = [4, 0, 0, 0] off_counts_two_obs = [2, 0, 0, 0] acceptance_two_obs = [2, 2, 2, 2] acceptance_off_two_obs = [2, 2, 2, 2] alpha_two_obs = [1, 1, 1, 1] assert_allclose(theta2_table_two_obs["counts"], on_counts_two_obs) assert_allclose(theta2_table_two_obs["counts_off"], off_counts_two_obs) assert_allclose(theta2_table_two_obs["acceptance"], acceptance_two_obs) assert_allclose(theta2_table_two_obs["acceptance_off"], acceptance_off_two_obs) assert_allclose(theta2_table["alpha"], alpha_two_obs) @requires_data() def test_guess_instrument_fov(observations): with pytest.raises(ValueError): guess_instrument_fov(observations) ds = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") obs_hess = ds.obs(23523) assert_allclose(guess_instrument_fov(obs_hess), 2.5 * u.deg) obs_no_aeff = obs_hess.copy(in_memory=True, aeff=None) with pytest.raises(ValueError): guess_instrument_fov(obs_no_aeff) @requires_data() def test_make_observation_time_map(): ds = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") obs_id = ds.obs_table["OBS_ID"][ds.obs_table["OBJECT"] == "MSH 15-5-02"][:3] observations = ds.get_observations(obs_id) source_pos = 
SkyCoord(228.32, -59.08, unit="deg") geom = WcsGeom.create( skydir=source_pos, binsz=0.02, width=(6, 6), frame="icrs", proj="CAR", ) obs_time = make_observation_time_map(observations, geom, offset_max=2.5 * u.deg) obs_time_center = obs_time.get_by_coord(source_pos) assert_allclose(obs_time_center, 1.2847, rtol=1e-3) assert obs_time.unit == u.hr @requires_data() def test_make_effective_livetime_map(): ds = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1") obs_id = ds.obs_table["OBS_ID"][ds.obs_table["OBJECT"] == "MSH 15-5-02"][:3] observations = ds.get_observations(obs_id) source_pos = SkyCoord(228.32, -59.08, unit="deg") offset_pos = SkyCoord(322.00, 0.1, unit="deg", frame="galactic") energy_axis_true = MapAxis.from_energy_bounds( 10 * u.GeV, 1 * u.TeV, nbin=2, name="energy_true" ) geom = WcsGeom.create( skydir=source_pos, binsz=0.02, width=(6, 6), frame="galactic", proj="CAR", axes=[energy_axis_true], ) obs_time = make_effective_livetime_map(observations, geom, offset_max=2.5 * u.deg) obs_time_center = obs_time.get_by_coord((source_pos, energy_axis_true.center)) assert_allclose(obs_time_center, [0, 1.2847], rtol=1e-3) obs_time_offset = obs_time.get_by_coord((offset_pos, energy_axis_true.center)) assert_allclose(obs_time_offset, [0, 0.242814], rtol=1e-3) assert obs_time.unit == u.hr ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/makers/utils.py0000644000175100001770000005571014721316200017050 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import warnings import numpy as np import astropy.units as u from astropy.coordinates import Angle from astropy.table import Table from gammapy.data import FixedPointingInfo from gammapy.irf import BackgroundIRF, EDispMap, FoVAlignment, PSFMap from gammapy.maps import Map, RegionNDMap from gammapy.maps.utils import broadcast_axis_values_to_geom from gammapy.modeling.models import PowerLawSpectralModel from gammapy.stats import WStatCountsStatistic from gammapy.utils.coordinates import sky_to_fov from gammapy.utils.regions import compound_region_to_regions __all__ = [ "make_counts_rad_max", "make_edisp_kernel_map", "make_edisp_map", "make_map_background_irf", "make_map_exposure_true_energy", "make_psf_map", "make_theta_squared_table", "make_effective_livetime_map", "make_observation_time_map", ] log = logging.getLogger(__name__) def _get_fov_coords(pointing, irf, geom, use_region_center=True, obstime=None): # TODO: create dedicated coordinate handling see #5041 coords = {} if isinstance(pointing, FixedPointingInfo): # for backwards compatibility, obstime should be required if obstime is None: if isinstance(irf, BackgroundIRF): warnings.warn( "Future versions of gammapy will require the obstime keyword for this function", DeprecationWarning, ) obstime = pointing.obstime pointing_icrs = pointing.get_icrs(obstime) else: pointing_icrs = pointing if not use_region_center: region_coord, weights = geom.get_wcs_coord_and_weights() sky_coord = region_coord.skycoord else: image_geom = geom.to_image() map_coord = image_geom.get_coord() sky_coord = map_coord.skycoord if irf.has_offset_axis: coords["offset"] = sky_coord.separation(pointing_icrs) else: if irf.fov_alignment == FoVAlignment.ALTAZ: if not isinstance(pointing, FixedPointingInfo) and isinstance( irf, BackgroundIRF ): raise TypeError( "make_map_background_irf requires FixedPointingInfo if " "BackgroundIRF.fov_alignment is ALTAZ", ) # for backwards compatibility, obstime should be required
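# An ALTAZ-aligned IRF needs an observation time to define the horizontal
# frame used for the FoV rotation below, hence the fallback to the
# pointing's own obstime.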
if obstime is None: warnings.warn( "Future versions of gammapy will require the obstime keyword for this function", DeprecationWarning, ) obstime = pointing.obstime pointing_altaz = pointing.get_altaz(obstime) altaz_coord = sky_coord.transform_to(pointing_altaz.frame) # Compute FOV coordinates of map relative to pointing fov_lon, fov_lat = sky_to_fov( altaz_coord.az, altaz_coord.alt, pointing_altaz.az, pointing_altaz.alt ) elif irf.fov_alignment in [FoVAlignment.RADEC, FoVAlignment.REVERSE_LON_RADEC]: fov_lon, fov_lat = sky_to_fov( sky_coord.icrs.ra, sky_coord.icrs.dec, pointing_icrs.icrs.ra, pointing_icrs.icrs.dec, ) if irf.fov_alignment == FoVAlignment.REVERSE_LON_RADEC: fov_lon = -fov_lon else: raise ValueError( f"Unsupported background coordinate system: {irf.fov_alignment!r}" ) coords["fov_lon"] = fov_lon coords["fov_lat"] = fov_lat return coords def make_map_exposure_true_energy( pointing, livetime, aeff, geom, use_region_center=True ): """Compute exposure map. This map has a true energy axis, the exposure is not combined with energy dispersion. Parameters ---------- pointing : `~astropy.coordinates.SkyCoord` Pointing direction. livetime : `~astropy.units.Quantity` Livetime. aeff : `~gammapy.irf.EffectiveAreaTable2D` Effective area. geom : `~gammapy.maps.WcsGeom` Map geometry (must have an energy axis). use_region_center : bool, optional For geom as a `~gammapy.maps.RegionGeom`. If True, consider the values at the region center. If False, average over the whole region. Default is True. Returns ------- map : `~gammapy.maps.WcsNDMap` Exposure map. """ coords = _get_fov_coords( pointing=pointing, geom=geom, use_region_center=use_region_center, irf=aeff, obstime=None, ) coords["energy_true"] = broadcast_axis_values_to_geom(geom, "energy_true") exposure = aeff.evaluate(**coords) data = (exposure * livetime).to("m2 s") meta = {"livetime": livetime, "is_pointlike": aeff.is_pointlike} if not use_region_center: _, weights = geom.get_wcs_coord_and_weights() data = np.average(data, axis=-1, weights=weights, keepdims=True) return Map.from_geom(geom=geom, data=data.value, unit=data.unit, meta=meta) def _map_spectrum_weight(map, spectrum=None): """Weight a map with a spectrum. This requires map to have an "energy" axis. The weights are normalised so that they sum to 1. The mean and unit of the output image is the same as of the input cube. At the moment this is used to get a weighted exposure image. Parameters ---------- map : `~gammapy.maps.Map` Input map with an "energy" axis. spectrum : `~gammapy.modeling.models.SpectralModel`, optional Spectral model to compute the weights. Default is None, which is a power-law with spectral index of 2. Returns ------- map_weighted : `~gammapy.maps.Map` Weighted image. """ if spectrum is None: spectrum = PowerLawSpectralModel(index=2.0) # Compute weights vector for name in map.geom.axes.names: if "energy" in name: energy_name = name energy_edges = map.geom.axes[energy_name].edges weights = spectrum.integral( energy_min=energy_edges[:-1], energy_max=energy_edges[1:] ) weights /= weights.sum() shape = np.ones(len(map.geom.data_shape)) shape[0] = -1 return map * weights.reshape(shape.astype(int)) def make_map_background_irf( pointing, ontime, bkg, geom, oversampling=None, use_region_center=True, obstime=None, ): """Compute background map from background IRFs. Parameters ---------- pointing : `~gammapy.data.FixedPointingInfo` or `~astropy.coordinates.SkyCoord` Observation pointing. 
- If a `~gammapy.data.FixedPointingInfo` is passed, FOV coordinates are properly computed. - If a `~astropy.coordinates.SkyCoord` is passed, FOV frame rotation is not taken into account. ontime : `~astropy.units.Quantity` Observation ontime. i.e. not corrected for deadtime see https://gamma-astro-data-formats.readthedocs.io/en/latest/irfs/full_enclosure/bkg/index.html#notes) # noqa: E501 bkg : `~gammapy.irf.Background3D` Background rate model. geom : `~gammapy.maps.WcsGeom` Reference geometry. oversampling : int Oversampling factor in energy, used for the background model evaluation. use_region_center : bool, optional For geom as a `~gammapy.maps.RegionGeom`. If True, consider the values at the region center. If False, average over the whole region. Default is True. obstime : `~astropy.time.Time` Observation time to use. Returns ------- background : `~gammapy.maps.WcsNDMap` Background predicted counts sky cube in reconstructed energy. """ # TODO: # This implementation can be improved in two ways: # 1. Create equal time intervals between TSTART and TSTOP and sum up the # background IRF for each interval. This is instead of multiplying by # the total ontime. This then handles the rotation of the FoV. # 2. Use the pointing table (does not currently exist in CTA files) to # obtain the RA DEC and time for each interval. This then considers that # the pointing might change slightly over the observation duration # Get altaz coords for map if oversampling is not None: geom = geom.upsample(factor=oversampling, axis_name="energy") if not use_region_center: image_geom = geom.to_wcs_geom().to_image() region_coord, weights = geom.get_wcs_coord_and_weights() idx = image_geom.coord_to_idx(region_coord) d_omega = image_geom.solid_angle().T[idx] else: image_geom = geom.to_image() d_omega = image_geom.solid_angle() coords = _get_fov_coords( pointing=pointing, irf=bkg, geom=geom, use_region_center=use_region_center, obstime=obstime, ) coords["energy"] = broadcast_axis_values_to_geom(geom, "energy", False) bkg_de = bkg.integrate_log_log(**coords, axis_name="energy") data = (bkg_de * d_omega * ontime).to_value("") if not use_region_center: region_coord, weights = geom.get_wcs_coord_and_weights() data = np.sum(weights * data, axis=2, keepdims=True) bkg_map = Map.from_geom(geom, data=data) if oversampling is not None: bkg_map = bkg_map.downsample(factor=oversampling, axis_name="energy") return bkg_map def make_psf_map(psf, pointing, geom, exposure_map=None): """Make a PSF map for a single observation. Expected axes : rad and true energy in this specific order. The name of the rad MapAxis is expected to be 'rad'. Parameters ---------- psf : `~gammapy.irf.PSF3D` The PSF IRF. pointing : `~astropy.coordinates.SkyCoord` The pointing direction. geom : `~gammapy.maps.Geom` The map geometry to be used. It provides the target geometry. rad and true energy axes should be given in this specific order. exposure_map : `~gammapy.maps.Map`, optional The associated exposure map. Default is None. Returns ------- psfmap : `~gammapy.irf.PSFMap` The resulting PSF map. 
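Examples
--------
A minimal sketch. The IRF file is the CTA 1DC test file used elsewhere in
the test suite; the HDU name and the axis binning are illustrative
assumptions, not values prescribed by this function::

    from astropy.coordinates import SkyCoord
    from gammapy.irf import PSF3D
    from gammapy.makers.utils import make_psf_map
    from gammapy.maps import MapAxis, WcsGeom

    filename = "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits"
    psf = PSF3D.read(filename, hdu="POINT SPREAD FUNCTION")  # HDU name assumed
    pointing = SkyCoord(0, 0, unit="deg", frame="icrs")
    # rad and true energy axes, in this specific order
    rad_axis = MapAxis.from_bounds(0, 0.6, nbin=50, unit="deg", name="rad")
    energy_axis_true = MapAxis.from_energy_bounds(
        "0.1 TeV", "10 TeV", nbin=5, name="energy_true"
    )
    geom = WcsGeom.create(
        npix=(5, 5), binsz=0.25, skydir=pointing, axes=[rad_axis, energy_axis_true]
    )
    psf_map = make_psf_map(psf=psf, pointing=pointing, geom=geom)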
""" coords = _get_fov_coords( pointing=pointing, irf=psf, geom=geom, use_region_center=True, obstime=None, ) coords["energy_true"] = broadcast_axis_values_to_geom(geom, "energy_true") coords["rad"] = broadcast_axis_values_to_geom(geom, "rad") # Compute PSF values data = psf.evaluate(**coords) # Create Map and fill relevant entries psf_map = Map.from_geom(geom, data=data.value, unit=data.unit) psf_map.normalize(axis_name="rad") return PSFMap(psf_map, exposure_map) def make_edisp_map(edisp, pointing, geom, exposure_map=None, use_region_center=True): """Make an edisp map for a single observation. Expected axes : migra and true energy in this specific order. The name of the migra MapAxis is expected to be 'migra'. Parameters ---------- edisp : `~gammapy.irf.EnergyDispersion2D` The 2D energy dispersion IRF. pointing : `~astropy.coordinates.SkyCoord` The pointing direction. geom : `~gammapy.maps.Geom` The map geometry to be used. It provides the target geometry. migra and true energy axes should be given in this specific order. exposure_map : `~gammapy.maps.Map`, optional The associated exposure map. Default is None. use_region_center : bool, optional For geom as a `~gammapy.maps.RegionGeom`. If True, consider the values at the region center. If False, average over the whole region. Default is True. Returns ------- edispmap : `~gammapy.irf.EDispMap` The resulting energy dispersion map. """ coords = _get_fov_coords(pointing, edisp, geom, use_region_center=use_region_center) coords["energy_true"] = broadcast_axis_values_to_geom(geom, "energy_true") coords["migra"] = broadcast_axis_values_to_geom(geom, "migra") # Compute EDisp values data = edisp.evaluate(**coords) if not use_region_center: _, weights = geom.get_wcs_coord_and_weights() data = np.average(data, axis=-1, weights=weights, keepdims=True) # Create Map and fill relevant entries edisp_map = Map.from_geom(geom, data=data.to_value(""), unit="") edisp_map.normalize(axis_name="migra") return EDispMap(edisp_map, exposure_map) def make_edisp_kernel_map( edisp, pointing, geom, exposure_map=None, use_region_center=True ): """Make an edisp kernel map for a single observation. Expected axes : (reco) energy and true energy in this specific order. The name of the reco energy MapAxis is expected to be 'energy'. The name of the true energy MapAxis is expected to be 'energy_true'. Parameters ---------- edisp : `~gammapy.irf.EnergyDispersion2D` The 2D energy dispersion IRF. pointing : `~astropy.coordinates.SkyCoord` The pointing direction. geom : `~gammapy.maps.Geom` The map geometry to be used. It provides the target geometry. energy and true energy axes should be given in this specific order. exposure_map : `~gammapy.maps.Map`, optional The associated exposure map. Default is None. use_region_center : bool, optional For geom as a `~gammapy.maps.RegionGeom`. If True, consider the values at the region center. If False, average over the whole region. Default is True. Returns ------- edispmap : `~gammapy.irf.EDispKernelMap` the resulting EDispKernel map """ coords = _get_fov_coords( pointing=pointing, irf=edisp, geom=geom, use_region_center=use_region_center, ) coords["energy_true"] = geom.axes["energy_true"].edges.reshape((-1, 1, 1, 1)) # Use EnergyDispersion2D migra axis. 
migra_axis = edisp.axes["migra"] # Create temporary EDispMap Geom new_geom = geom.to_image().to_cube([migra_axis, geom.axes["energy_true"]]) edisp_map = make_edisp_map( edisp, pointing, new_geom, exposure_map, use_region_center ) return edisp_map.to_edisp_kernel_map(geom.axes["energy"]) def make_theta_squared_table( observations, theta_squared_axis, position, position_off=None ): """Make theta squared distribution in the same FoV for a list of `Observation` objects. The ON theta2 profile is computed around a given position, ``position``. By default, the OFF theta2 profile is extracted from the mirror position, radially symmetric to ``position`` with respect to the pointing position in the FoV. The ON and OFF regions are assumed to be of the same size, so the normalisation factor between both regions is alpha = 1. Parameters ---------- observations : `~gammapy.data.Observations` List of observations. theta_squared_axis : `~gammapy.maps.geom.MapAxis` Axis of theta2 bin edges used to compute the distribution. position : `~astropy.coordinates.SkyCoord` Position from which the ON theta^2 distribution is computed. position_off : `~astropy.coordinates.SkyCoord` Position from which the OFF theta^2 distribution is computed. Default is the reflected position w.r.t. the pointing position. Returns ------- table : `~astropy.table.Table` Table containing the ON counts, the OFF counts, acceptance, OFF acceptance and alpha for each theta squared bin. """ if not theta_squared_axis.edges.unit.is_equivalent("deg2"): raise ValueError("The theta2 axis should be equivalent to deg2") table = Table() table["theta2_min"] = theta_squared_axis.edges[:-1] table["theta2_max"] = theta_squared_axis.edges[1:] table["counts"] = 0 table["counts_off"] = 0 table["acceptance"] = 0.0 table["acceptance_off"] = 0.0 alpha_tot = np.zeros(len(table)) livetime_tot = 0 create_off = position_off is None for observation in observations: event_position = observation.events.radec pointing = observation.get_pointing_icrs(observation.tmid) separation = position.separation(event_position) counts, _ = np.histogram(separation**2, theta_squared_axis.edges) table["counts"] += counts if create_off: # Estimate the mirror position pos_angle = pointing.position_angle(position) sep_angle = pointing.separation(position) position_off = pointing.directional_offset_by( pos_angle + Angle(np.pi, "rad"), sep_angle ) # Angular distance of the events from the mirror position separation_off = position_off.separation(event_position) # Extract the ON and OFF theta2 distribution from the two positions.
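# Since the ON and OFF regions have the same size, the OFF histogram is a
# direct background estimate in each theta2 bin (acceptance ratio
# alpha = 1, filled further down).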
counts_off, _ = np.histogram(separation_off**2, theta_squared_axis.edges) table["counts_off"] += counts_off # Normalisation between ON and OFF is one acceptance = np.ones(theta_squared_axis.nbin) acceptance_off = np.ones(theta_squared_axis.nbin) table["acceptance"] += acceptance table["acceptance_off"] += acceptance_off alpha = acceptance / acceptance_off alpha_tot += alpha * observation.observation_live_time_duration.to_value("s") livetime_tot += observation.observation_live_time_duration.to_value("s") alpha_tot /= livetime_tot table["alpha"] = alpha_tot stat = WStatCountsStatistic(table["counts"], table["counts_off"], table["alpha"]) table["excess"] = stat.n_sig table["sqrt_ts"] = stat.sqrt_ts table["excess_errn"] = stat.compute_errn() table["excess_errp"] = stat.compute_errp() table.meta["ON_RA"] = position.icrs.ra table.meta["ON_DEC"] = position.icrs.dec return table def make_counts_rad_max(geom, rad_max, events): """Extract the ON counts, using the values in the `RAD_MAX_2D` table to define the ON region size. Parameters ---------- geom : `~gammapy.maps.RegionGeom` Reference map geometry. rad_max : `~gammapy.irf.RadMax2D` The RAD_MAX_2D table IRF. events : `~gammapy.data.EventList` Event list to be used to compute the ON counts. Returns ------- counts : `~gammapy.maps.RegionNDMap` Counts vs estimated energy extracted from the ON region. """ selected_events = events.select_rad_max( rad_max=rad_max, position=geom.region.center ) counts = Map.from_geom(geom=geom) counts.fill_events(selected_events) return counts def make_counts_off_rad_max(geom_off, rad_max, events): """Extract the OFF counts from a list of point regions and given rad max. This method does **not** check for overlap of the regions defined by rad_max. Parameters ---------- geom_off : `~gammapy.maps.RegionGeom` Reference map geometry for the OFF regions. rad_max : `~gammapy.irf.RadMax2D` The RAD_MAX_2D table IRF. events : `~gammapy.data.EventList` Event list to be used to compute the OFF counts. Returns ------- counts_off : `~gammapy.maps.RegionNDMap` OFF counts vs estimated energy extracted from the OFF regions. """ if not geom_off.is_all_point_sky_regions: raise ValueError( f"Only supports PointSkyRegions, got {geom_off.region} instead" ) counts_off = RegionNDMap.from_geom(geom=geom_off) for off_region in compound_region_to_regions(geom_off.region): selected_events = events.select_rad_max( rad_max=rad_max, position=off_region.center ) counts_off.fill_events(selected_events) return counts_off def make_observation_time_map(observations, geom, offset_max=None): """ Compute the total observation time on the target geometry for a list of observations. Parameters ---------- observations : `~gammapy.data.Observations` Observations container containing list of observations. geom : `~gammapy.maps.Geom` Reference geometry. offset_max : `~astropy.units.quantity.Quantity`, optional The maximum offset FoV. Default is None. If None, it will be taken from the IRFs. Returns ------- obs_time : `~gammapy.maps.Map` Total observation time.
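Examples
--------
A sketch following the unit test in ``test_utils.py``, using the public
H.E.S.S. DL3-DR1 test data::

    from astropy import units as u
    from astropy.coordinates import SkyCoord
    from gammapy.data import DataStore
    from gammapy.makers.utils import make_observation_time_map
    from gammapy.maps import WcsGeom

    ds = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
    obs_id = ds.obs_table["OBS_ID"][ds.obs_table["OBJECT"] == "MSH 15-5-02"][:3]
    observations = ds.get_observations(obs_id)
    source_pos = SkyCoord(228.32, -59.08, unit="deg")
    geom = WcsGeom.create(
        skydir=source_pos, binsz=0.02, width=(6, 6), frame="icrs", proj="CAR"
    )
    obs_time = make_observation_time_map(observations, geom, offset_max=2.5 * u.deg)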
""" geom = geom.to_image() stacked = Map.from_geom(geom, unit=u.h) for obs in observations: if offset_max is None: offset_max = guess_instrument_fov(obs) coords = geom.get_coord(sparse=True) offset = coords.skycoord.separation(obs.get_pointing_icrs(obs.tmid)) mask = offset < offset_max c1 = coords.apply_mask(mask) weights = np.ones(c1.shape) * obs.observation_live_time_duration stacked.fill_by_coord(coords=c1, weights=weights) return stacked def make_effective_livetime_map(observations, geom, offset_max=None): """ Compute the acceptance corrected livetime map for a list of observations. Parameters ---------- observations : `~gammapy.data.Observations` Observations container containing list of observations. geom : `~gammapy.maps.Geom` Reference geometry. offset_max : `~astropy.units.quantity.Quantity`, optional The maximum offset FoV. Default is None. Returns ------- exposure : `~gammapy.maps.Map` Effective livetime. """ livetime = Map.from_geom(geom, unit=u.hr) for obs in observations: if offset_max is None: offset_max = guess_instrument_fov(obs) geom_obs = geom.cutout( position=obs.get_pointing_icrs(obs.tmid), width=2.0 * offset_max ) coords = geom_obs.get_coord() offset = coords.skycoord.separation(obs.get_pointing_icrs(obs.tmid)) mask = offset < offset_max exposure = make_map_exposure_true_energy( pointing=geom.center_skydir, livetime=obs.observation_live_time_duration, aeff=obs.aeff, geom=geom_obs, use_region_center=True, ) on_axis = obs.aeff.evaluate( offset=0.0 * u.deg, energy_true=geom.axes["energy_true"].center ) on_axis = on_axis.reshape((on_axis.shape[0], 1, 1)) lv_obs = exposure * mask / on_axis livetime.stack(lv_obs) return livetime def guess_instrument_fov(obs): """ Guess the camera field of view for the given observation from the IRFs. This simply takes the maximum offset of the effective area IRF. TODO: This logic will break for more complex IRF models. A better option would be to compute the offset at which the effective area is above 10% of the maximum. Parameters ---------- obs : `~gammapy.data.Observation` Observation container. Returns ------- offset_max : `~astropy.units.quantity.Quantity` The maximum offset of the effective area IRF. 
""" if "aeff" not in obs.available_irfs: raise ValueError("No Effective area IRF to infer the FoV from") if obs.aeff.is_pointlike: raise ValueError("Cannot guess FoV from pointlike IRFs") if "offset" not in obs.aeff.axes.names: raise ValueError("Offset axis not present!") return obs.aeff.axes["offset"].center[-1] ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.216642 gammapy-1.3/gammapy/maps/0000755000175100001770000000000014721316215015012 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/__init__.py0000644000175100001770000000134614721316200017121 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Sky maps.""" from .axes import LabelMapAxis, MapAxes, MapAxis, TimeMapAxis from .coord import MapCoord from .core import Map from .geom import Geom from .hpx import HpxGeom, HpxMap, HpxNDMap from .maps import Maps from .measure import containment_radius, containment_region from .region import RegionGeom, RegionNDMap from .wcs import WcsGeom, WcsMap, WcsNDMap __all__ = [ "Geom", "HpxGeom", "HpxMap", "HpxNDMap", "LabelMapAxis", "Map", "MapAxes", "MapAxis", "MapCoord", "Maps", "RegionGeom", "RegionNDMap", "TimeMapAxis", "WcsGeom", "WcsMap", "WcsNDMap", "containment_radius", "containment_region", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/axes.py0000644000175100001770000033042114721316200016321 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy import html import inspect import logging from collections.abc import Sequence from enum import Enum import numpy as np import scipy import astropy.units as u from astropy.io import fits from astropy.table import Column, Table, hstack from astropy.time import Time from astropy.utils import lazyproperty import matplotlib.pyplot as plt from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.interpolation import interpolation_scale from gammapy.utils.time import time_ref_from_dict, time_ref_to_dict from .utils import INVALID_INDEX, INVALID_VALUE, edges_from_lo_hi __all__ = ["MapAxes", "MapAxis", "TimeMapAxis", "LabelMapAxis"] log = logging.getLogger(__name__) def flat_if_equal(array): if array.ndim == 2 and np.all(array == array[0]): return array[0] else: return array class BoundaryEnum(str, Enum): monotonic = "monotonic" periodic = "periodic" class AxisCoordInterpolator: """Axis coordinate interpolator.""" def __init__(self, edges, interp="lin"): self.scale = interpolation_scale(interp) self.x = self.scale(edges) self.y = np.arange(len(edges), dtype=float) self.fill_value = "extrapolate" if len(edges) == 1: self.kind = 0 else: self.kind = 1 def coord_to_pix(self, coord): """Transform coordinate to pixel.""" interp_fn = scipy.interpolate.interp1d( x=self.x, y=self.y, kind=self.kind, fill_value=self.fill_value ) return interp_fn(self.scale(coord)) def pix_to_coord(self, pix): """Transform pixel to coordinate.""" interp_fn = scipy.interpolate.interp1d( x=self.y, y=self.x, kind=self.kind, fill_value=self.fill_value ) return self.scale.inverse(interp_fn(pix)) PLOT_AXIS_LABEL = { "energy": "Energy", "energy_true": "True Energy", "offset": "FoV Offset", "rad": "Source Offset", "migra": "Energy / True Energy", "fov_lon": "FoV Lon.", "fov_lat": "FoV Lat.", "time": "Time", } DEFAULT_LABEL_TEMPLATE = "{quantity} [{unit}]" UNIT_STRING_FORMAT = "latex_inline" class MapAxis: 
"""Class representing an axis of a map. Provides methods for transforming to/from axis and pixel coordinates. An axis is defined by a sequence of node values that lie at the center of each bin. The pixel coordinate at each node is equal to its index in the node array (0, 1, ..). Bin edges are offset by 0.5 in pixel coordinates from the nodes such that the lower/upper edge of the first bin is (-0.5,0.5). Parameters ---------- nodes : `~numpy.ndarray` or `~astropy.units.Quantity` Array of node values. These will be interpreted as either bin edges or centers according to ``node_type``. interp : {'lin', 'log', 'sqrt'} Interpolation method used to transform between axis and pixel coordinates. Default is 'lin'. name : str, optional Axis name. Default is "". node_type : str, optional Flag indicating whether coordinate nodes correspond to pixel edges (node_type = 'edges') or pixel centers (node_type = 'center'). 'center' should be used where the map values are defined at a specific coordinate (e.g. differential quantities). 'edges' should be used where map values are defined by an integral over coordinate intervals (e.g. a counts histogram). Default is "edges". unit : str, optional String specifying the data units. Default is "". boundary_type : str, optional Flag indicating boundary condition for the axis. Available options are "monotonic" and "periodic". "Periodic" boundary is only supported for interp = "lin". Default is "monotonic". """ # TODO: Cache an interpolation object? def __init__( self, nodes, interp="lin", name="", node_type="edges", unit="", boundary_type="monotonic", ): if not isinstance(name, str): raise TypeError(f"Name must be a string, got: {type(name)!r}") if len(nodes) != len(np.unique(nodes)): raise ValueError("MapAxis: node values must be unique") if ~(np.all(nodes == np.sort(nodes)) or np.all(nodes[::-1] == np.sort(nodes))): raise ValueError("MapAxis: node values must be sorted") if isinstance(nodes, u.Quantity): unit = nodes.unit if nodes.unit is not None else "" nodes = nodes.value else: nodes = np.array(nodes) if boundary_type not in list(BoundaryEnum): raise ValueError(f"Invalid boundary_type: {boundary_type}") if boundary_type == BoundaryEnum.periodic and interp != "lin": raise ValueError("Periodic Axis only supports linear interpolation") self._name = name self._unit = u.Unit(unit) self._nodes = nodes.astype(float) self._node_type = node_type self._interp = interp self._boundary_type = BoundaryEnum(boundary_type).value if (self._nodes < 0).any() and interp != "lin": raise ValueError( f"Interpolation scaling {interp!r} only support for positive node values." ) # Set pixel coordinate of first node if node_type == "edges": self._pix_offset = -0.5 nbin = len(nodes) - 1 elif node_type == "center": self._pix_offset = 0.0 nbin = len(nodes) else: raise ValueError(f"Invalid node type: {node_type!r}") self._nbin = nbin self._use_center_as_plot_labels = None def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def assert_name(self, required_name): """Assert axis name if a specific one is required. Parameters ---------- required_name : str Required name. """ if self.name != required_name: raise ValueError( "Unexpected axis name," f' expected "{required_name}", got: "{self.name}"' ) def is_aligned(self, other, atol=2e-2): """Check if the other map axis is aligned. Two axes are aligned if their center coordinate values map to integers on the other axes as well and if the interpolation modes are equivalent. Parameters ---------- other : `MapAxis` Other map axis. atol : float, optional Absolute numerical tolerance for the comparison measured in bins. Default is 2e-2. Returns ------- aligned : bool Whether the axes are aligned. """ pix = self.coord_to_pix(other.center) pix_other = other.coord_to_pix(self.center) pix_all = np.append(pix, pix_other) aligned = np.allclose(np.round(pix_all) - pix_all, 0, atol=atol) return aligned and self.interp == other.interp def is_allclose(self, other, **kwargs): """Check if the other map axis is all close. Parameters ---------- other : `MapAxis` Other map axis. **kwargs : dict, optional Keyword arguments passed to `~numpy.allclose`. Returns ------- is_allclose : bool Whether the other axis is allclose. """ if not isinstance(other, self.__class__): return TypeError(f"Cannot compare {type(self)} and {type(other)}") if self.edges.shape != other.edges.shape: return False if not self.unit.is_equivalent(other.unit): return False return ( np.allclose(self.edges, other.edges, **kwargs) and self._node_type == other._node_type and self._interp == other._interp and self.name.upper() == other.name.upper() and self._boundary_type == other._boundary_type ) def __eq__(self, other): if not isinstance(other, self.__class__): return False return self.is_allclose(other, rtol=1e-6, atol=1e-6) def __ne__(self, other): return not self.__eq__(other) def __hash__(self): return id(self) @lazyproperty def _transform(self): """Interpolate coordinates to pixel.""" return AxisCoordInterpolator(edges=self._nodes, interp=self.interp) @property def is_energy_axis(self): """Whether this is an energy axis.""" return self.name in ["energy", "energy_true"] @property def interp(self): """Interpolation scale of the axis.""" return self._interp @property def name(self): """Name of the axis.""" return self._name @lazyproperty def edges(self): """Return an array of bin edges.""" pix = np.arange(self.nbin + 1, dtype=float) - 0.5 return u.Quantity(self.pix_to_coord(pix), self._unit, copy=COPY_IF_NEEDED) @property def edges_min(self): """Return an array of bin edges maximum values.""" return self.edges[:-1] @property def edges_max(self): """Return an array of bin edges minimum values.""" return self.edges[1:] @property def bounds(self): """Bounds of the axis as a `~astropy.units.Quantity`.""" idx = [0, -1] if self.node_type == "edges": return self.edges[idx] else: return self.center[idx] @property def as_plot_xerr(self): """Return a tuple of x-error to be passed to `~matplotlib.pyplot.errorbar`.""" return ( self.center - self.edges_min, self.edges_max - self.center, ) @property def use_center_as_plot_labels(self): """Use center as plot labels.""" if self._use_center_as_plot_labels is not None: return self._use_center_as_plot_labels return self.node_type == "center" @use_center_as_plot_labels.setter def use_center_as_plot_labels(self, value): """Use center as plot labels.""" self._use_center_as_plot_labels = bool(value) @property def as_plot_labels(self): """Return a list of axis plot labels.""" if 
self.use_center_as_plot_labels: labels = [f"{val:.2e}" for val in self.center] else: labels = [ f"{val_min:.2e} - {val_max:.2e}" for val_min, val_max in self.iter_by_edges ] return labels @property def as_plot_edges(self): """Plot edges.""" return self.edges @property def as_plot_center(self): """Plot center.""" return self.center @property def as_plot_scale(self): """Plot axis scale.""" mpl_scale = {"lin": "linear", "sqrt": "linear", "log": "log"} return mpl_scale[self.interp] def to_node_type(self, node_type): """Return a copy of the `MapAxis` instance with a node type set to node_type. Parameters ---------- node_type : str The target node type. It can be either 'center' or 'edges'. Returns ------- axis : `~gammapy.maps.MapAxis` The new MapAxis. """ if node_type == self.node_type: return self else: if node_type == "center": nodes = self.center else: nodes = self.edges return self.__class__( nodes=nodes, interp=self.interp, name=self.name, node_type=node_type, unit=self.unit, ) def rename(self, new_name): """Rename the axis. Return a copy of the `MapAxis` instance with name set to new_name. Parameters ---------- new_name : str The new name for the axis. Returns ------- axis : `~gammapy.maps.MapAxis` The new MapAxis. """ return self.copy(name=new_name) def format_plot_xaxis(self, ax): """Format the x-axis. Parameters ---------- ax : `~matplotlib.pyplot.Axis` Plot axis to format. Returns ------- ax : `~matplotlib.pyplot.Axis` Formatted plot axis. """ ax.set_xscale(self.as_plot_scale) xlabel = DEFAULT_LABEL_TEMPLATE.format( quantity=PLOT_AXIS_LABEL.get(self.name, self.name.capitalize()), unit=ax.xaxis.units.to_string(UNIT_STRING_FORMAT), ) ax.set_xlabel(xlabel) xmin, xmax = self.bounds if not xmin == xmax: ax.set_xlim(self.bounds) return ax def format_plot_yaxis(self, ax): """Format plot y-axis. Parameters ---------- ax : `~matplotlib.pyplot.Axis` Plot axis to format. Returns ------- ax : `~matplotlib.pyplot.Axis` Formatted plot axis. """ ax.set_yscale(self.as_plot_scale) ylabel = DEFAULT_LABEL_TEMPLATE.format( quantity=PLOT_AXIS_LABEL.get(self.name, self.name.capitalize()), unit=ax.yaxis.units.to_string(UNIT_STRING_FORMAT), ) ax.set_ylabel(ylabel) ax.set_ylim(self.bounds) return ax @property def iter_by_edges(self): """Iterate by intervals defined by the edges.""" for value_min, value_max in zip(self.edges[:-1], self.edges[1:]): yield (value_min, value_max) @lazyproperty def center(self): """Return an array of bin centers.""" pix = np.arange(self.nbin, dtype=float) return u.Quantity(self.pix_to_coord(pix), self._unit, copy=COPY_IF_NEEDED) @lazyproperty def bin_width(self): """Array of bin widths.""" return np.diff(self.edges) @property def nbin(self): """Return the number of bins.""" return self._nbin @property def nbin_per_decade(self): """Return the number of bins per decade.""" if self.interp != "log": raise ValueError("Bins per decade can only be computed for log-spaced axes") if self.node_type == "edges": values = self.edges else: values = self.center ndecades = np.log10(values.max() / values.min()) return (self._nbin / ndecades).value @property def node_type(self): """Return node type, either 'center' or 'edges'.""" return self._node_type @property def unit(self): """Return the coordinate axis unit.""" return self._unit @classmethod def from_bounds(cls, lo_bnd, hi_bnd, nbin, **kwargs): """Generate an axis object from a lower/upper bound and number of bins. If node_type = 'edges' then bounds correspond to the lower and upper bound of the first and last bin. 
If node_type = 'center' then bounds correspond to the centers of the first and last bin. Parameters ---------- lo_bnd : float Lower bound of first axis bin. hi_bnd : float Upper bound of last axis bin. nbin : int Number of bins. interp : {'lin', 'log', 'sqrt'} Interpolation method used to transform between axis and pixel coordinates. Default: 'lin'. ***kwargs : dict, optional Keyword arguments passed to `MapAxis`. """ nbin = int(nbin) interp = kwargs.setdefault("interp", "lin") node_type = kwargs.setdefault("node_type", "edges") if node_type == "edges": nnode = nbin + 1 elif node_type == "center": nnode = nbin else: raise ValueError(f"Invalid node type: {node_type!r}") if interp == "lin": nodes = np.linspace(lo_bnd, hi_bnd, nnode) elif interp == "log": nodes = np.geomspace(lo_bnd, hi_bnd, nnode) elif interp == "sqrt": nodes = np.linspace(lo_bnd**0.5, hi_bnd**0.5, nnode) ** 2.0 else: raise ValueError(f"Invalid interp: {interp}") return cls(nodes, **kwargs) @classmethod def from_energy_edges(cls, energy_edges, unit=None, name=None, interp="log"): """Make an energy axis from adjacent edges. Parameters ---------- energy_edges : `~astropy.units.Quantity` or float Energy edges. unit : `~astropy.units.Unit`, optional Energy unit. Default is None. name : str, optional Name of the energy axis, either 'energy' or 'energy_true'. Default is None. interp: str, optional interpolation mode. Default is 'log'. Returns ------- axis : `MapAxis` Axis with name "energy" and interp "log". """ energy_edges = u.Quantity(energy_edges, unit) if not energy_edges.unit.is_equivalent("TeV"): raise ValueError( f"Please provide a valid energy unit, got {energy_edges.unit} instead." ) if name is None: name = "energy" if name not in ["energy", "energy_true"]: raise ValueError("Energy axis can only be named 'energy' or 'energy_true'") return cls.from_edges(energy_edges, unit=unit, interp=interp, name=name) @classmethod def from_energy_bounds( cls, energy_min, energy_max, nbin, unit=None, per_decade=False, name=None, node_type="edges", strict_bounds=True, ): """Make an energy axis from energy bounds. The interpolation is always 'log'. Used frequently also to make energy grids, by making the axis, and then using ``axis.center`` or ``axis.edges``. Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity`, float Energy range. nbin : int Number of bins. unit : `~astropy.units.Unit`, optional Energy unit. Default is None. per_decade : bool, optional Whether `nbin` is given per decade. Default is False. name : str, optional Name of the energy axis, either 'energy' or 'energy_true'. Default is None. node_type : str, optional Node type, either 'edges' or 'center'. Default is 'edges'. strict_bounds : bool, optional Whether to strictly end the binning at 'energy_max' when `per_decade=True`. If True, the number of bins per decade might be slightly increased to match the bounds. If False, 'energy_max' might be reduced so the number of bins per decade is exactly the given input. Default is True. Returns ------- axis : `MapAxis` Create MapAxis from the given input parameters. """ energy_min = u.Quantity(energy_min, unit) energy_max = u.Quantity(energy_max, unit) if unit is None: unit = energy_max.unit energy_min = energy_min.to(unit) if not energy_max.unit.is_equivalent("TeV"): raise ValueError( f"Please provide a valid energy unit, got {energy_max.unit} instead." 
) if per_decade: if strict_bounds: nbin = np.ceil(np.log10(energy_max / energy_min).value * nbin) else: bin_per_decade = nbin nbin = np.floor( np.log10(energy_max / energy_min).value * bin_per_decade ) if np.log10(energy_max / energy_min).value % (1 / bin_per_decade) != 0: energy_max = energy_min * 10 ** (nbin / bin_per_decade) if name is None: name = "energy" if name not in ["energy", "energy_true"]: raise ValueError("Energy axis can only be named 'energy' or 'energy_true'") return cls.from_bounds( energy_min.value, energy_max.value, nbin=nbin, unit=unit, interp="log", name=name, node_type=node_type, ) @classmethod def from_nodes(cls, nodes, **kwargs): # TODO: What to do with interp in docstring but not in signature? """Generate an axis object from a sequence of nodes (bin centers). This will create a sequence of bins with edges half-way between the node values. This method should be used to construct an axis where the bin center should lie at a specific value (e.g. a map of a continuous function). Parameters ---------- nodes : `~numpy.ndarray` Axis nodes (bin center). interp : {'lin', 'log', 'sqrt'} Interpolation method used to transform between axis and pixel coordinates. Default is 'lin'. **kwargs : dict, optional Keyword arguments passed to `MapAxis`. """ if len(nodes) < 1: raise ValueError("Nodes array must have at least one element.") return cls(nodes, node_type="center", **kwargs) @classmethod def from_edges(cls, edges, **kwargs): """Generate an axis object from a sequence of bin edges. This method should be used to construct an axis where the bin edges should lie at specific values (e.g. a histogram). The number of bins will be one less than the number of edges. Parameters ---------- edges : `~numpy.ndarray` Axis bin edges. interp : {'lin', 'log', 'sqrt'} Interpolation method used to transform between axis and pixel coordinates. Default: 'lin'. **kwargs : dict, optional Keyword arguments passed to `MapAxis`. """ if len(edges) < 2: raise ValueError("Edges array must have at least two elements.") return cls(edges, node_type="edges", **kwargs) def concatenate(self, axis): """Concatenate another `MapAxis` to this `MapAxis` into a new `MapAxis` object. Name, interp type and node type must agree between the axes. If the node type is "edges", the edges must be contiguous and non-overlapping. Parameters ---------- axis : `MapAxis` Axis to concatenate with. Returns ------- axis : `MapAxis` Concatenation of the two axis. """ if self.node_type != axis.node_type: raise ValueError( f"Node type must agree, got {self.node_type} and {axis.node_type}" ) if self.name != axis.name: raise ValueError(f"Names must agree, got {self.name} and {axis.name} ") if self.interp != axis.interp: raise ValueError( f"Interp type must agree, got {self.interp} and {axis.interp}" ) if self.node_type == "edges": edges = np.append(self.edges, axis.edges[1:]) return self.from_edges(edges=edges, interp=self.interp, name=self.name) else: nodes = np.append(self.center, axis.center) return self.from_nodes(nodes=nodes, interp=self.interp, name=self.name) def pad(self, pad_width): """Pad the axis by a given number of pixels. Parameters ---------- pad_width : int or tuple of int A single integer pads in both direction of the axis, a tuple specifies which number of bins to pad at the low and high edge of the axis. Returns ------- axis : `MapAxis` Padded axis. 
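Examples
--------
A short sketch; the resulting bin count follows directly from the padding
arithmetic (5 bins + 1 low + 2 high):

>>> from gammapy.maps import MapAxis
>>> axis = MapAxis.from_bounds(1.0, 10.0, 5, interp="log", name="energy", unit="TeV")
>>> axis.pad(pad_width=(1, 2)).nbin
8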
""" if isinstance(pad_width, tuple): pad_low, pad_high = pad_width else: pad_low, pad_high = pad_width, pad_width if self.node_type == "edges": pix = np.arange(-pad_low, self.nbin + pad_high + 1) - 0.5 edges = self.pix_to_coord(pix) return self.from_edges(edges=edges, interp=self.interp, name=self.name) else: pix = np.arange(-pad_low, self.nbin + pad_high) nodes = self.pix_to_coord(pix) return self.from_nodes(nodes=nodes, interp=self.interp, name=self.name) @classmethod def from_stack(cls, axes): """Create a map axis by merging a list of other map axes. If the node type is "edges" the bin edges in the provided axes must be contiguous and non-overlapping. Parameters ---------- axes : list of `MapAxis` List of map axis to merge. Returns ------- axis : `MapAxis` Merged axis. """ ax_stacked = axes[0] for ax in axes[1:]: ax_stacked = ax_stacked.concatenate(ax) return ax_stacked def pix_to_coord(self, pix): """Transform pixel to axis coordinates. Parameters ---------- pix : `~numpy.ndarray` Array of pixel coordinate values. Returns ------- coord : `~numpy.ndarray` Array of axis coordinate values. """ pix = pix - self._pix_offset values = self._transform.pix_to_coord(pix=pix) return u.Quantity(values, unit=self.unit, copy=COPY_IF_NEEDED) def wrap_coord(self, coord): """Wrap coords between axis edges for a periodic boundary condition Parameters ---------- coord : `~numpy.ndarray` Array of axis coordinate values. Returns ------- coord : `~numpy.ndarray` Wrapped array of axis coordinate values. """ m1, m2 = self.edges_min[0], self.edges_max[-1] out_of_range = (coord >= m2) | (coord < m1) return np.where(out_of_range, (coord - m1) % (m2 - m1) + m1, coord) def pix_to_idx(self, pix, clip=False): """Convert pixel to index. Parameters ---------- pix : `~numpy.ndarray` Pixel coordinates. clip : bool, optional Choose whether to clip indices to the valid range of the axis. Default is False. If False, indices for coordinates outside the axis range will be set to -1. Returns ------- idx : `~numpy.ndarray` Pixel indices. """ if clip: idx = np.clip(pix, 0, self.nbin - 1) else: condition = (pix < 0) | (pix >= self.nbin) idx = np.where(condition, -1, pix) return idx def coord_to_pix(self, coord): """Transform axis to pixel coordinates. Parameters ---------- coord : `~numpy.ndarray` Array of axis coordinate values. Returns ------- pix : `~numpy.ndarray` Array of pixel coordinate values. """ if self._boundary_type == BoundaryEnum.periodic: coord = self.wrap_coord(coord) coord = u.Quantity(coord, self.unit, copy=COPY_IF_NEEDED).value pix = self._transform.coord_to_pix(coord=coord) return np.array(pix + self._pix_offset, ndmin=1) def coord_to_idx(self, coord, clip=False): """Transform axis coordinate to bin index. Parameters ---------- coord : `~numpy.ndarray` Array of axis coordinate values. clip : bool, optional Choose whether to clip the index to the valid range of the axis. Default is False. If False, then indices for values outside the axis range will be set to -1. Returns ------- idx : `~numpy.ndarray` Array of bin indices. """ if self._boundary_type == BoundaryEnum.periodic: coord = self.wrap_coord(coord) coord = u.Quantity(coord, self.unit, copy=COPY_IF_NEEDED, ndmin=1).value edges = self.edges.value idx = np.digitize(coord, edges) - 1 if clip: idx = np.clip(idx, 0, self.nbin - 1) else: with np.errstate(invalid="ignore"): idx[coord > edges[-1]] = INVALID_INDEX.int idx[~np.isfinite(coord)] = INVALID_INDEX.int return idx def slice(self, idx): """Create a new axis object by extracting a slice from this axis. 
Parameters ---------- idx : slice Slice object selecting a sub-selection of the axis. Returns ------- axis : `MapAxis` Sliced axis object. Examples -------- >>> from gammapy.maps import MapAxis >>> axis = MapAxis.from_bounds( ... 10.0, 2e3, 6, interp="log", name="energy_true", unit="GeV" ... ) >>> slices = slice(1, 3) >>> sliced = axis.slice(slices) """ center = self.center[idx].value idx = self.coord_to_idx(center) # For edge nodes we need to keep N+1 nodes if self._node_type == "edges": idx = tuple(list(idx) + [1 + idx[-1]]) nodes = self._nodes[(idx,)] return MapAxis( nodes, interp=self._interp, name=self._name, node_type=self._node_type, unit=self._unit, ) def squash(self): """Create a new axis object by squashing the axis into one bin. Returns ------- axis : `~MapAxis` Squashed axis object. """ return MapAxis.from_bounds( lo_bnd=self.edges[0].value, hi_bnd=self.edges[-1].value, nbin=1, interp=self._interp, name=self._name, unit=self._unit, ) def __repr__(self): str_ = self.__class__.__name__ str_ += "\n\n" fmt = "\t{:<10s} : {:<10s}\n" str_ += fmt.format("name", self.name) str_ += fmt.format("unit", "{!r}".format(str(self.unit))) str_ += fmt.format("nbins", str(self.nbin)) str_ += fmt.format("node type", self.node_type) vals = self.edges if self.node_type == "edges" else self.center str_ += fmt.format(f"{self.node_type} min", "{:.1e}".format(vals.min())) str_ += fmt.format(f"{self.node_type} max", "{:.1e}".format(vals.max())) str_ += fmt.format("interp", self._interp) return str_ def _init_copy(self, **kwargs): """Init map axis instance by copying missing init arguments from self.""" argnames = inspect.getfullargspec(self.__init__).args argnames.remove("self") for arg in argnames: value = getattr(self, "_" + arg) if arg not in kwargs: kwargs[arg] = copy.deepcopy(value) return self.__class__(**kwargs) def copy(self, **kwargs): """Copy `MapAxis` instance and overwrite given attributes. Parameters ---------- **kwargs : dict, optional Keyword arguments to overwrite in the map axis constructor. Returns ------- copy : `MapAxis` Copied map axis. """ return self._init_copy(**kwargs) def round(self, coord, clip=False): """Round coordinate to the nearest axis edge. Parameters ---------- coord : `~astropy.units.Quantity` Coordinates to be rounded. clip : bool, optional Choose whether to clip the index to the valid range of the axis. Default is False. If False, then indices for values outside the axis range will be set to -1. Returns ------- coord : `~astropy.units.Quantity` Rounded coordinates. """ edges_pix = self.coord_to_pix(coord) if clip: edges_pix = np.clip(edges_pix, -0.5, self.nbin - 0.5) edges_idx = np.round(edges_pix + 0.5) - 0.5 return self.pix_to_coord(edges_idx) def group_table(self, edges): """Compute bin groups table for the map axis, given coarser bin edges. Parameters ---------- edges : `~astropy.units.Quantity` Group bin edges. Returns ------- groups : `~astropy.table.Table` Map axis group table. 
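Examples
--------
A minimal sketch grouping a six-bin energy axis with coarser edges that are
aligned with the fine binning:

>>> import astropy.units as u
>>> from gammapy.maps import MapAxis
>>> axis = MapAxis.from_energy_bounds("1 TeV", "100 TeV", nbin=6)
>>> groups = axis.group_table(edges=[1, 10, 100] * u.TeV)
>>> len(groups)
2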
""" if self.node_type != "edges": raise ValueError("Only edge based map axis can be grouped") edges_pix = np.clip(self.coord_to_pix(edges), -0.5, self.nbin - 0.5) edges_idx = np.unique(np.round(edges_pix + 0.5) - 0.5) edges_ref = self.pix_to_coord(edges_idx) groups = Table() groups[f"{self.name}_min"] = edges_ref[:-1] groups[f"{self.name}_max"] = edges_ref[1:] groups["idx_min"] = (edges_idx[:-1] + 0.5).astype(int) groups["idx_max"] = (edges_idx[1:] - 0.5).astype(int) if len(groups) == 0: raise ValueError("No overlap between reference and target edges.") groups["bin_type"] = "normal " edge_idx_start, edge_ref_start = edges_idx[0], edges_ref[0] if edge_idx_start > 0: underflow = { "bin_type": "underflow", "idx_min": 0, "idx_max": edge_idx_start, f"{self.name}_min": self.pix_to_coord(-0.5), f"{self.name}_max": edge_ref_start, } groups.insert_row(0, vals=underflow) edge_idx_end, edge_ref_end = edges_idx[-1], edges_ref[-1] if edge_idx_end < (self.nbin - 0.5): overflow = { "bin_type": "overflow", "idx_min": edge_idx_end + 1, "idx_max": self.nbin - 1, f"{self.name}_min": edge_ref_end, f"{self.name}_max": self.pix_to_coord(self.nbin - 0.5), } groups.add_row(vals=overflow) group_idx = Column(np.arange(len(groups))) groups.add_column(group_idx, name="group_idx", index=0) return groups def upsample(self, factor): """Upsample map axis by a given factor. When upsampling for each node specified in the axis, the corresponding number of sub-nodes are introduced and preserving the initial nodes. For node type "edges" this results in nbin * factor new bins. For node type "center" this results in (nbin - 1) * factor + 1 new bins. Parameters ---------- factor : int Upsampling factor. Returns ------- axis : `MapAxis` Upsampled map axis. """ if self.node_type == "edges": pix = self.coord_to_pix(self.edges) nbin = int(self.nbin * factor) + 1 pix_new = np.linspace(pix.min(), pix.max(), nbin) edges = self.pix_to_coord(pix_new) return self.from_edges(edges, name=self.name, interp=self.interp) else: pix = self.coord_to_pix(self.center) nbin = int((self.nbin - 1) * factor) + 1 pix_new = np.linspace(pix.min(), pix.max(), nbin) nodes = self.pix_to_coord(pix_new) return self.from_nodes(nodes, name=self.name, interp=self.interp) def downsample(self, factor, strict=True): """Downsample map axis by a given factor. When downsampling, each n-th (given by the factor) bin is selected from the axis while preserving the axis limits. For node type "edges" this requires nbin to be dividable by the factor, for node type "center" this requires nbin - 1 to be dividable by the factor. Parameters ---------- factor : int Downsampling factor. strict : bool Whether the number of bins is strictly divisible by the factor. If True, ``nbin`` must be divisible by the ``factor``. If False, the reminder bins are put into the last bin of the new axis. Default is True. Returns ------- axis : `MapAxis` Downsampled map axis. 
""" if self.node_type == "edges": edges = self.edges[::factor] if edges[-1] != self.edges[-1]: if strict is True: raise ValueError( f"Number of {self.name} bins ({self.nbin}) is not divisible by {factor}" ) else: edges = np.append(edges, self.edges[-1]) return self.from_edges(edges, name=self.name, interp=self.interp) elif self.node_type == "center": centers = self.center[::factor] if centers[-1] != self.center[-1]: if strict is True: raise ValueError( f"Number of {self.name} bins - 1 ({self.nbin-1}) is not divisible by {factor}" ) else: centers = np.append(centers, self.center[-1]) return self.from_nodes(centers, name=self.name, interp=self.interp) def to_header(self, format="ogip", idx=0): """Create FITS header. Parameters ---------- format : {"ogip"}, optional Format specification. Default is "ogip". idx : int, optional Column index of the axis. Default is 0. Returns ------- header : `~astropy.io.fits.Header` Header to extend. """ header = fits.Header() if format in ["ogip", "ogip-sherpa"]: header["EXTNAME"] = "EBOUNDS", "Name of this binary table extension" header["TELESCOP"] = "DUMMY", "Mission/satellite name" header["INSTRUME"] = "DUMMY", "Instrument/detector" header["FILTER"] = "None", "Filter information" header["CHANTYPE"] = "PHA", "Type of channels (PHA, PI etc)" header["DETCHANS"] = self.nbin, "Total number of detector PHA channels" header["HDUCLASS"] = "OGIP", "Organisation devising file format" header["HDUCLAS1"] = "RESPONSE", "File relates to response of instrument" header["HDUCLAS2"] = "EBOUNDS", "This is an EBOUNDS extension" header["HDUVERS"] = "1.2.0", "Version of file format" elif format in ["gadf", "fgst-ccube", "fgst-template"]: key = f"AXCOLS{idx}" name = self.name.upper() if self.name == "energy" and self.node_type == "edges": header[key] = "E_MIN,E_MAX" elif self.name == "energy" and self.node_type == "center": header[key] = "ENERGY" elif self.node_type == "edges": header[key] = f"{name}_MIN,{name}_MAX" elif self.node_type == "center": header[key] = name else: raise ValueError(f"Invalid node type {self.node_type!r}") key_interp = f"INTERP{idx}" header[key_interp] = self.interp else: raise ValueError(f"Unknown format {format}") return header def to_table(self, format="ogip"): """Convert `~astropy.units.Quantity` to OGIP ``EBOUNDS`` extension. See https://heasarc.gsfc.nasa.gov/docs/heasarc/caldb/docs/memos/cal_gen_92_002/cal_gen_92_002.html#tth_sEc3.2 # noqa: E501 The 'ogip-sherpa' format is equivalent to 'ogip' but uses keV energy units. Parameters ---------- format : {"ogip", "ogip-sherpa", "gadf-dl3", "gtpsf"} Format specification. Default is "ogip". Returns ------- table : `~astropy.table.Table` Table HDU. 
""" table = Table() edges = self.edges if format in ["ogip", "ogip-sherpa"]: self.assert_name("energy") if format == "ogip-sherpa": edges = edges.to("keV") table["CHANNEL"] = np.arange(self.nbin, dtype=np.int16) table["E_MIN"] = edges[:-1] table["E_MAX"] = edges[1:] elif format in ["ogip-arf", "ogip-arf-sherpa"]: self.assert_name("energy_true") if format == "ogip-arf-sherpa": edges = edges.to("keV") table["ENERG_LO"] = edges[:-1] table["ENERG_HI"] = edges[1:] elif format == "gadf-sed": if self.is_energy_axis: table["e_ref"] = self.center table["e_min"] = self.edges_min table["e_max"] = self.edges_max elif format == "gadf-dl3": from gammapy.irf.io import IRF_DL3_AXES_SPECIFICATION if self.name == "energy": column_prefix = "ENERG" else: for column_prefix, spec in IRF_DL3_AXES_SPECIFICATION.items(): if spec["name"] == self.name: break if self.node_type == "edges": edges_hi, edges_lo = edges[:-1], edges[1:] else: edges_hi, edges_lo = self.center, self.center table[f"{column_prefix}_LO"] = edges_hi[np.newaxis] table[f"{column_prefix}_HI"] = edges_lo[np.newaxis] elif format == "gtpsf": if self.name == "energy_true": table["Energy"] = self.center.to("MeV") elif self.name == "rad": table["Theta"] = self.center.to("deg") else: raise ValueError( "Can only convert true energy or rad axis to" f"'gtpsf' format, got {self.name}" ) else: raise ValueError(f"{format} is not a valid format") return table def to_table_hdu(self, format="ogip"): """Convert `~astropy.units.Quantity` to OGIP ``EBOUNDS`` extension. See https://heasarc.gsfc.nasa.gov/docs/heasarc/caldb/docs/memos/cal_gen_92_002/cal_gen_92_002.html#tth_sEc3.2 # noqa: E501 The 'ogip-sherpa' format is equivalent to 'ogip' but uses keV energy units. Parameters ---------- format : {"ogip", "ogip-sherpa", "gtpsf"} Format specification. Default is "ogip". Returns ------- hdu : `~astropy.io.fits.BinTableHDU` Table HDU. """ table = self.to_table(format=format) if format == "gtpsf": name = "THETA" else: name = None hdu = fits.BinTableHDU(table, name=name) if format in ["ogip", "ogip-sherpa"]: hdu.header.update(self.to_header(format=format)) return hdu @classmethod def from_table(cls, table, format="ogip", idx=0, column_prefix=""): """Instantiate MapAxis from a table HDU. Parameters ---------- table : `~astropy.table.Table` Table. format : {"ogip", "ogip-arf", "fgst-ccube", "fgst-template", "gadf", "gadf-dl3"} Format specification. Default is "ogip". idx : int, optional Column index of the axis. Default is 0. column_prefix : str, optional Column name prefix of the axis, used for creating the axis. Default is "". Returns ------- axis : `MapAxis` Map Axis. 
""" if format in ["ogip", "fgst-ccube"]: energy_min = table["E_MIN"].quantity energy_max = table["E_MAX"].quantity energy_edges = ( np.append(energy_min.value, energy_max.value[-1]) * energy_min.unit ) axis = cls.from_edges(energy_edges, name="energy", interp="log") elif format == "ogip-arf": energy_min = table["ENERG_LO"].quantity energy_max = table["ENERG_HI"].quantity energy_edges = ( np.append(energy_min.value, energy_max.value[-1]) * energy_min.unit ) axis = cls.from_edges(energy_edges, name="energy_true", interp="log") elif format in ["fgst-template", "fgst-bexpcube"]: allowed_names = ["Energy", "ENERGY", "energy"] for colname in table.colnames: if colname in allowed_names: tag = colname break nodes = table[tag].data axis = cls.from_nodes( nodes=nodes, name="energy_true", unit="MeV", interp="log" ) elif format == "gadf": axcols = table.meta.get("AXCOLS{}".format(idx + 1)) colnames = axcols.split(",") node_type = "edges" if len(colnames) == 2 else "center" # TODO: check why this extra case is needed if colnames[0] == "E_MIN": name = "energy" else: name = colnames[0].replace("_MIN", "").lower() # this is need for backward compatibility if name == "theta": name = "rad" interp = table.meta.get("INTERP{}".format(idx + 1), "lin") if node_type == "center": nodes = np.unique(table[colnames[0]].quantity) else: edges_min = np.unique(table[colnames[0]].quantity) edges_max = np.unique(table[colnames[1]].quantity) nodes = edges_from_lo_hi(edges_min, edges_max) axis = MapAxis(nodes=nodes, node_type=node_type, interp=interp, name=name) elif format == "gadf-dl3": from gammapy.irf.io import IRF_DL3_AXES_SPECIFICATION spec = IRF_DL3_AXES_SPECIFICATION[column_prefix] name, interp = spec["name"], spec["interp"] # background models are stored in reconstructed energy hduclass = table.meta.get("HDUCLAS2") if hduclass in {"BKG", "RAD_MAX"} and column_prefix == "ENERG": name = "energy" edges_lo = table[f"{column_prefix}_LO"].quantity[0] edges_hi = table[f"{column_prefix}_HI"].quantity[0] # for a single-valued array, it can happen that the value is stored/extracted as a scalar if edges_lo.isscalar: log.warning( f"'{column_prefix}' axis is stored as a scalar -- converting to 1D array." 
) edges_lo = edges_lo[np.newaxis] edges_hi = edges_hi[np.newaxis] if np.allclose(edges_hi, edges_lo): axis = MapAxis.from_nodes(edges_hi, interp=interp, name=name) else: edges = edges_from_lo_hi(edges_lo, edges_hi) axis = MapAxis.from_edges(edges, interp=interp, name=name) elif format == "gtpsf": try: energy = table["Energy"].data * u.MeV axis = MapAxis.from_nodes(energy, name="energy_true", interp="log") except KeyError: rad = table["Theta"].data * u.deg axis = MapAxis.from_nodes(rad, name="rad") elif format == "gadf-sed-energy": if "e_min" in table.colnames and "e_max" in table.colnames: e_min = flat_if_equal(table["e_min"].quantity) e_max = flat_if_equal(table["e_max"].quantity) edges = edges_from_lo_hi(e_min, e_max) axis = MapAxis.from_energy_edges(edges) elif "e_ref" in table.colnames: e_ref = flat_if_equal(table["e_ref"].quantity) axis = MapAxis.from_nodes(e_ref, name="energy", interp="log") else: raise ValueError( "Either 'e_ref', 'e_min' or 'e_max' column " "names are required" ) elif format == "gadf-sed-norm": # TODO: guess interp here nodes = flat_if_equal(table["norm_scan"][0]) axis = MapAxis.from_nodes(nodes, name="norm") elif format == "gadf-sed-counts": if "datasets" in table.colnames: labels = np.unique(table["datasets"]) axis = LabelMapAxis(labels=labels, name="dataset") else: shape = table["counts"].shape edges = np.arange(shape[-1] + 1) - 0.5 axis = MapAxis.from_edges(edges, name="dataset") elif format == "profile": if "datasets" in table.colnames: labels = np.unique(table["datasets"]) axis = LabelMapAxis(labels=labels, name="dataset") else: x_ref = table["x_ref"].quantity axis = MapAxis.from_nodes(x_ref, name="projected-distance") else: raise ValueError(f"Format '{format}' not supported") return axis @classmethod def from_table_hdu(cls, hdu, format="ogip", idx=0): """Instantiate MapAxis from table HDU. Parameters ---------- hdu : `~astropy.io.fits.BinTableHDU` Table HDU. format : {"ogip", "ogip-arf", "fgst-ccube", "fgst-template"} Format specification. Default is "ogip". idx : int, optional Column index of the axis. Default is 0. Returns ------- axis : `MapAxis` Map Axis. """ table = Table.read(hdu) return cls.from_table(table, format=format, idx=idx) class MapAxes(Sequence): """MapAxis container class. Parameters ---------- axes : list of `MapAxis` List of map axis objects. """ def __init__(self, axes, n_spatial_axes=None): unique_names = [] for ax in axes: if ax.name in unique_names: raise ( ValueError(f"Axis names must be unique, got: '{ax.name}' twice.") ) unique_names.append(ax.name) self._axes = axes self._n_spatial_axes = n_spatial_axes def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def primary_axis(self): """Primary extra axis, defined as the longest one. Returns ------- axis : `MapAxis` Map axis. """ # get longest axis idx = np.argmax(self.shape) return self[int(idx)] @property def is_flat(self): """Whether axes is flat.""" shape = np.array(self.shape) return np.all(shape == 1) @property def is_unidimensional(self): """Whether axes is unidimensional.""" shape = np.array(self.shape) non_zero = np.count_nonzero(shape > 1) return self.is_flat or non_zero == 1 @property def reverse(self): """Reverse axes order.""" return MapAxes(self[::-1]) @property def iter_with_reshape(self): # TODO: The name is misleading. Maybe iter_axis_and_shape? """Generator that iterates over axes and their shape.""" for idx, axis in enumerate(self): # Extract values for each axis, default: nodes shape = [1] * len(self) shape[idx] = -1 if self._n_spatial_axes: shape = ( shape[::-1] + [ 1, ] * self._n_spatial_axes ) yield tuple(shape), axis def get_coord(self, mode="center", axis_name=None): """Get axes coordinates. Parameters ---------- mode : {"center", "edges"} Coordinate center or edges. Default is "center". axis_name : str, optional Axis name for which mode='edges' applies. Default is None. Returns ------- coords : dict of `~astropy.units.Quantity`. Map coordinates as a dictionary. """ coords = {} for shape, axis in self.iter_with_reshape: if mode == "edges" and axis.name == axis_name: coord = axis.edges else: coord = axis.center coords[axis.name] = coord.reshape(shape) return coords def bin_volume(self): """Bin axes volume. Returns ------- bin_volume : `~astropy.units.Quantity` Bin volume. """ bin_volume = np.array(1) for shape, axis in self.iter_with_reshape: bin_volume = bin_volume * axis.bin_width.reshape(shape) return bin_volume @property def shape(self): """Shapes of the axes.""" return tuple([ax.nbin for ax in self]) @property def names(self): """Names of the axes.""" return [ax.name for ax in self] def index(self, axis_name): """Get index in list.""" return self.names.index(axis_name) def index_data(self, axis_name): """Get data index of the axes. Parameters ---------- axis_name : str Name of the axis. Returns ------- idx : int Data index. """ idx = self.names.index(axis_name) return len(self) - idx - 1 def __len__(self): return len(self._axes) def __add__(self, other): return self.__class__(list(self) + list(other)) def upsample(self, factor, axis_name): """Upsample axis by a given factor. Parameters ---------- factor : int Upsampling factor. axis_name : str Axis to upsample. Returns ------- axes : `MapAxes` Map axes. """ axes = [] for ax in self: if ax.name == axis_name: ax = ax.upsample(factor=factor) axes.append(ax.copy()) return self.__class__(axes=axes) def replace(self, axis): """Replace a given axis. In order to be replaced, the name of the new axis must match the name of the old axis. Parameters ---------- axis : `MapAxis` Map axis. Returns ------- axes : MapAxes Map axes. """ axes = [] for ax in self: if ax.name == axis.name: ax = axis axes.append(ax) return self.__class__(axes=axes) def resample(self, axis): """Resample axis binning. This method groups the existing bins into a new binning. Parameters ---------- axis : `MapAxis` New map axis. Returns ------- axes : `MapAxes` Axes object with resampled axis. 
""" axis_self = self[axis.name] groups = axis_self.group_table(axis.edges) # Keep only normal bins groups = groups[groups["bin_type"] == "normal "] edges = edges_from_lo_hi( groups[axis.name + "_min"].quantity, groups[axis.name + "_max"].quantity, ) axis_resampled = MapAxis.from_edges( edges=edges, interp=axis.interp, name=axis.name ) axes = [] for ax in self: if ax.name == axis.name: axes.append(axis_resampled) else: axes.append(ax.copy()) return self.__class__(axes=axes) def downsample(self, factor, axis_name): """Downsample axis by a given factor. Parameters ---------- factor : int Downsampling factor. axis_name : str Axis to downsample. Returns ------- axes : `MapAxes` Map axes. """ axes = [] for ax in self: if ax.name == axis_name: ax = ax.downsample(factor=factor) axes.append(ax.copy()) return self.__class__(axes=axes) def squash(self, axis_name): """Squash axis. Parameters ---------- axis_name : str Axis to squash. Returns ------- axes : `MapAxes` Axes with squashed axis. """ axes = [] for ax in self: if ax.name == axis_name: ax = ax.squash() axes.append(ax.copy()) return self.__class__(axes=axes) def pad(self, axis_name, pad_width): """Pad axis. Parameters ---------- axis_name : str Name of the axis to pad. pad_width : int or tuple of int Pad width. Returns ------- axes : `MapAxes` Axes with squashed axis. """ axes = [] for ax in self: if ax.name == axis_name: ax = ax.pad(pad_width=pad_width) axes.append(ax) return self.__class__(axes=axes) def drop(self, axis_name): """Drop an axis. Parameters ---------- axis_name : str Name of the axis to remove. Returns ------- axes : `MapAxes` Axes without the `axis_name`. """ axes = [] for ax in self: if ax.name == axis_name: continue axes.append(ax.copy()) return self.__class__(axes=axes) def __getitem__(self, idx): if isinstance(idx, int): return self._axes[idx] elif isinstance(idx, str): for ax in self._axes: if ax.name == idx: return ax raise KeyError(f"No axes: {idx!r}") elif isinstance(idx, slice): axes = self._axes[idx] return self.__class__(axes=axes) elif isinstance(idx, list): axes = [] for name in idx: axes.append(self[name]) return self.__class__(axes=axes) else: raise TypeError(f"Invalid type: {type(idx)!r}") def coord_to_idx(self, coord, clip=True): """Transform from axis to pixel indices. Parameters ---------- coord : dict of `~numpy.ndarray` or `MapCoord` Array of axis coordinate values. clip : bool, optional Choose whether to clip indices to the valid range of the axis. Default is True. If False, then indices for coordinates outside the axis range will be set to -1. Returns ------- pix : tuple of `~numpy.ndarray` Array of pixel indices values. """ return tuple([ax.coord_to_idx(coord[ax.name], clip=clip) for ax in self]) def coord_to_pix(self, coord): """Transform from axis to pixel coordinates. Parameters ---------- coord : dict of `~numpy.ndarray` Array of axis coordinate values. Returns ------- pix : tuple of `~numpy.ndarray` Array of pixel coordinate values. """ return tuple([ax.coord_to_pix(coord[ax.name]) for ax in self]) def pix_to_coord(self, pix): """Convert pixel coordinates to map coordinates. Parameters ---------- pix : tuple Tuple of pixel coordinates. Returns ------- coords : tuple Tuple of map coordinates. """ return tuple([ax.pix_to_coord(p) for ax, p in zip(self, pix)]) def pix_to_idx(self, pix, clip=False): """Convert pixel to pixel indices. Parameters ---------- pix : tuple of `~numpy.ndarray` Pixel coordinates. clip : bool, optional Choose whether to clip indices to the valid range of the axis. 
Default is False. If False, then indices for coordinates outside the axi range will be set to -1. Returns ------- idx : tuple `~numpy.ndarray` Pixel indices. """ idx = [] for pix_array, ax in zip(pix, self): idx.append(ax.pix_to_idx(pix_array, clip=clip)) return tuple(idx) def slice_by_idx(self, slices): """Create a new geometry by slicing the non-spatial axes. Parameters ---------- slices : dict Dictionary of axes names and integers or `slice` object pairs. Contains one element for each non-spatial dimension. For integer indexing the corresponding axes is dropped from the map. Axes not specified in the dictionary are kept unchanged. Returns ------- axes : `~MapAxes` Sliced axes. Examples -------- >>> import astropy.units as u >>> from astropy.time import Time >>> from gammapy.maps import MapAxis, MapAxes, TimeMapAxis >>> energy_axis = MapAxis.from_energy_bounds(1*u.TeV, 3*u.TeV, 6) >>> time_ref = Time("1999-01-01T00:00:00.123456789") >>> time_axis = TimeMapAxis( ... edges_min=[0, 1, 3] * u.d, ... edges_max=[0.8, 1.9, 5.4] * u.d, ... reference_time=time_ref, ... ) >>> axes = MapAxes([energy_axis, time_axis]) >>> slices = {"energy": slice(0, 3), "time": slice(0, 1)} >>> sliced_axes = axes.slice_by_idx(slices) """ axes = [] for ax in self: ax_slice = slices.get(ax.name, slice(None)) # in the case where isinstance(ax_slice, int) the axes is dropped if isinstance(ax_slice, slice): ax_sliced = ax.slice(ax_slice) axes.append(ax_sliced.copy()) return self.__class__(axes=axes) def to_header(self, format="gadf"): """Convert axes to FITS header. Parameters ---------- format : {"gadf"} Header format. Default is "gadf". Returns ------- header : `~astropy.io.fits.Header` FITS header. """ header = fits.Header() for idx, ax in enumerate(self, start=1): header_ax = ax.to_header(format=format, idx=idx) header.update(header_ax) return header def to_table(self, format="gadf"): """Convert axes to table. Parameters ---------- format : {"gadf", "gadf-dl3", "fgst-ccube", "fgst-template", "ogip", "ogip-sherpa", "ogip-arf", "ogip-arf-sherpa"} # noqa E501 Format to use. Default is "gadf". Returns ------- table : `~astropy.table.Table` Table with axis data. """ if format == "gadf-dl3": tables = [] for ax in self: tables.append(ax.to_table(format=format)) table = hstack(tables) elif format in ["gadf", "fgst-ccube", "fgst-template"]: table = Table() table["CHANNEL"] = np.arange(np.prod(self.shape)) axes_ctr = np.meshgrid(*[ax.center for ax in self]) axes_min = np.meshgrid(*[ax.edges_min for ax in self]) axes_max = np.meshgrid(*[ax.edges_max for ax in self]) for idx, ax in enumerate(self): name = ax.name.upper() if name == "ENERGY": colnames = ["ENERGY", "E_MIN", "E_MAX"] else: colnames = [name, name + "_MIN", name + "_MAX"] for colname, v in zip(colnames, [axes_ctr, axes_min, axes_max]): # do not store edges for label axis if ax.node_type == "label" and colname != name: continue table[colname] = np.ravel(v[idx]) if isinstance(ax, TimeMapAxis): ref_dict = time_ref_to_dict(ax.reference_time) table.meta.update(ref_dict) elif format in ["ogip", "ogip-sherpa", "ogip", "ogip-arf"]: energy_axis = self["energy"] table = energy_axis.to_table(format=format) else: raise ValueError(f"Unsupported format: '{format}'") return table def to_table_hdu(self, format="gadf", hdu_bands=None): """Make FITS table columns for map axes. Parameters ---------- format : {"gadf", "fgst-ccube", "fgst-template"} Format to use. Default is "gadf". hdu_bands : str, optional Name of the bands HDU to use. Default is None. 
Returns ------- hdu : `~astropy.io.fits.BinTableHDU` Bin table HDU. """ # FIXME: Check whether convention is compatible with # dimensionality of geometry and simplify!!! if format in ["fgst-ccube", "ogip", "ogip-sherpa"]: hdu_bands = "EBOUNDS" elif format == "fgst-template": hdu_bands = "ENERGIES" elif format == "gadf" or format is None: if hdu_bands is None: hdu_bands = "BANDS" else: raise ValueError(f"Unknown format {format}") table = self.to_table(format=format) header = self.to_header(format=format) return fits.BinTableHDU(table, name=hdu_bands, header=header) @classmethod def from_table_hdu(cls, hdu, format="gadf"): """Create MapAxes from BinTableHDU. Parameters ---------- hdu : `~astropy.io.fits.BinTableHDU` Bin table HDU. format : {"gadf", "gadf-dl3", "fgst-ccube", "fgst-template", "fgst-bexcube", "ogip-arf"} Format to use. Default is "gadf". Returns ------- axes : `MapAxes` Map axes object. """ if hdu is None: return cls([]) table = Table.read(hdu) return cls.from_table(table, format=format) @classmethod def from_table(cls, table, format="gadf"): """Create MapAxes from table. Parameters ---------- table : `~astropy.table.Table` Bin table HDU. format : {"gadf", "gadf-dl3", "fgst-ccube", "fgst-template", "fgst-bexcube", "ogip-arf"} Format to use. Default is "gadf". Returns ------- axes : `MapAxes` Map axes object. """ from gammapy.irf.io import IRF_DL3_AXES_SPECIFICATION axes = [] # Formats that support only one energy axis if format in [ "fgst-ccube", "fgst-template", "fgst-bexpcube", "ogip", "ogip-arf", ]: axes.append(MapAxis.from_table(table, format=format)) elif format == "gadf": # This limits the max number of axes to 5 for idx in range(5): axcols = table.meta.get("AXCOLS{}".format(idx + 1)) if axcols is None: break # TODO: what is good way to check whether it is a given axis type? try: axis = LabelMapAxis.from_table(table, format=format, idx=idx) except (KeyError, TypeError): try: axis = TimeMapAxis.from_table(table, format=format, idx=idx) except (KeyError, ValueError, IndexError): axis = MapAxis.from_table(table, format=format, idx=idx) axes.append(axis) elif format == "gadf-dl3": for column_prefix in IRF_DL3_AXES_SPECIFICATION: try: axis = MapAxis.from_table( table, format=format, column_prefix=column_prefix ) except KeyError: continue axes.append(axis) elif format == "gadf-sed": for axis_format in ["gadf-sed-norm", "gadf-sed-energy", "gadf-sed-counts"]: try: axis = MapAxis.from_table(table=table, format=axis_format) except KeyError: continue axes.append(axis) elif format == "lightcurve": axes.extend(cls.from_table(table=table, format="gadf-sed")) axes.append(TimeMapAxis.from_table(table, format="lightcurve")) elif format == "profile": axes.extend(cls.from_table(table=table, format="gadf-sed")) axes.append(MapAxis.from_table(table, format="profile")) else: raise ValueError(f"Unsupported format: '{format}'") return cls(axes) @classmethod def from_default(cls, axes, n_spatial_axes=None): """Make a sequence of `~MapAxis` objects. Parameters ---------- axes : list of `~MapAxis` or `~numpy.ndarray` Sequence of axis or edges defining the axes. n_spatial_axes : int, optional Number of spatial axes. Default is None. Returns ------- axes : `MapAxes` Map axes object. 
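Examples
--------
A short sketch; plain arrays are promoted to `MapAxis` objects and unnamed
axes are labelled ``axis0``, ``axis1``, ...:

>>> import numpy as np
>>> from gammapy.maps import MapAxes
>>> axes = MapAxes.from_default([np.array([1.0, 2.0, 3.0])])
>>> axes.names
['axis0']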
""" if axes is None: return cls([]) axes_out = [] for idx, ax in enumerate(axes): if isinstance(ax, np.ndarray): ax = MapAxis(ax) if ax.name == "": ax._name = f"axis{idx}" axes_out.append(ax) return cls(axes_out, n_spatial_axes=n_spatial_axes) def assert_names(self, required_names, allow_extra=False): """Assert required axis names and order. Parameters ---------- required_names : list of str Required names. allow_extra : bool Allow extra axes beyond required ones. """ message = ( "Incorrect axis order or names. Expected axis " f"order: {required_names}, got: {self.names}." ) if not allow_extra and not len(self) == len(required_names): raise ValueError(message) try: for ax, required_name in zip(self[: len(required_names)], required_names): ax.assert_name(required_name) except ValueError: raise ValueError(message) def rename_axes(self, names, new_names): """Rename the axes. Parameters ---------- names : str or list of str Names of the axis. new_names : str or list of str New names of the axes (list must be of same length than `names`). Returns ------- axes : `MapAxes` Renamed Map axes object. """ axes = self.copy() if isinstance(names, str): names = [names] if isinstance(new_names, str): new_names = [new_names] for name, new_name in zip(names, new_names): axes[name]._name = new_name return axes @property def center_coord(self): """Center coordinates.""" return tuple([ax.pix_to_coord((float(ax.nbin) - 1.0) / 2.0) for ax in self]) def is_allclose(self, other, **kwargs): """Check if other map axes are all close. Parameters ---------- other : `MapAxes` Other map axes. **kwargs : dict, optional Keyword arguments forwarded to `~MapAxis.is_allclose` Returns ------- is_allclose : bool Whether other axes are all close. """ if not isinstance(other, self.__class__): return TypeError(f"Cannot compare {type(self)} and {type(other)}") return np.all([ax0.is_allclose(ax1, **kwargs) for ax0, ax1 in zip(other, self)]) def __eq__(self, other): if not isinstance(other, self.__class__): return False return self.is_allclose(other, rtol=1e-6, atol=1e-6) def __ne__(self, other): return not self.__eq__(other) def copy(self): """Initialize a new map axes instance by copying each axis.""" return self.__class__([_.copy() for _ in self]) class TimeMapAxis: """Class representing a time axis. Provides methods for transforming to/from axis and pixel coordinates. A time axis can represent non-contiguous sequences of non-overlapping time intervals. Time intervals must be provided in increasing order. Parameters ---------- edges_min : `~astropy.units.Quantity` Array of edge time values. This is the time delta w.r.t. to the reference time. edges_max : `~astropy.units.Quantity` Array of edge time values. This is the time delta w.r.t. to the reference time. reference_time : `~astropy.time.Time` Reference time to use. name : str, optional Axis name. Default is "time". interp : {'lin'} Interpolation method used to transform between axis and pixel coordinates. For now only 'lin' is supported. Default is 'lin'. 
""" node_type = "intervals" def __init__(self, edges_min, edges_max, reference_time, name="time", interp="lin"): self._name = name self._time_format = "iso" edges_min = u.Quantity(edges_min, ndmin=1) edges_max = u.Quantity(edges_max, ndmin=1) if not edges_min.unit.is_equivalent("s"): raise ValueError( f"Time edges min must have a valid time unit, got {edges_min.unit}" ) if not edges_max.unit.is_equivalent("s"): raise ValueError( f"Time edges max must have a valid time unit, got {edges_max.unit}" ) if not edges_min.shape == edges_max.shape: raise ValueError( "Edges min and edges max must have the same shape," f" got {edges_min.shape} and {edges_max.shape}." ) if not np.all(edges_max > edges_min): raise ValueError("Edges max must all be larger than edge min") if not np.all(edges_min == np.sort(edges_min)): raise ValueError("Time edges min values must be sorted") if not np.all(edges_max == np.sort(edges_max)): raise ValueError("Time edges max values must be sorted") if interp != "lin": raise NotImplementedError( f"Non-linear scaling scheme are not supported yet, got {interp}" ) self._edges_min = edges_min self._edges_max = edges_max self._reference_time = Time(reference_time) self._pix_offset = -0.5 self._interp = interp delta = edges_min[1:] - edges_max[:-1] if np.any(delta < 0 * u.s): raise ValueError("Time intervals must not overlap.") def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def is_contiguous(self): """Whether the axis is contiguous.""" return np.all(self.edges_min[1:] == self.edges_max[:-1]) def to_contiguous(self): """Make the time axis contiguous. Returns ------- axis : `TimeMapAxis` Contiguous time axis. """ edges = np.unique(np.stack([self.edges_min, self.edges_max])) return self.__class__( edges_min=edges[:-1], edges_max=edges[1:], reference_time=self.reference_time, name=self.name, interp=self.interp, ) @property def unit(self): """Axis unit.""" return self.edges_max.unit @property def interp(self): """Interpolation scale of the axis.""" return self._interp @property def reference_time(self): """Return reference time used for the axis.""" return self._reference_time @property def name(self): """Return the axis name.""" return self._name @property def nbin(self): """Return the number of bins in the axis.""" return len(self.edges_min.flatten()) @property def edges_min(self): """Return the array of bin edges maximum values.""" return self._edges_min @property def edges_max(self): """Return the array of bin edges minimum values.""" return self._edges_max @property def edges(self): """Return the array of bin edges values.""" if not self.is_contiguous: raise ValueError("Time axis is not contiguous") return edges_from_lo_hi(self.edges_min, self.edges_max) @property def bounds(self): """Bounds of the axis as a ~astropy.units.Quantity.""" return self.edges_min[0], self.edges_max[-1] @property def time_bounds(self): """Bounds of the axis as a ~astropy.units.Quantity.""" t_min, t_max = self.bounds return t_min + self.reference_time, t_max + self.reference_time @property def time_min(self): """Return axis lower edges as `~astropy.time.Time` object.""" return self._edges_min + self.reference_time @property def time_max(self): """Return axis upper edges as a `~astropy.time.Time` object.""" return self._edges_max + self.reference_time @property def time_delta(self): """Return axis time bin width as a `~astropy.time.TimeDelta` object.""" return self.time_max - self.time_min @property def time_mid(self): """Return time bin center as a `~astropy.time.Time` object.""" return self.time_min + 0.5 * self.time_delta @property def time_edges(self): """Time edges as a `~astropy.time.Time` object.""" return self.reference_time + self.edges @property def time_format(self): """The time format to use for the axis.""" return self._time_format @time_format.setter def time_format(self, val): # inherited docstring if val not in ["iso", "mjd"]: raise ValueError(f"Invalid time_format: {self.time_format}") self._time_format = val @property def as_plot_xerr(self): """Return x errors for plotting.""" xn, xp = self.time_mid - self.time_min, self.time_max - self.time_mid if self.time_format == "iso": x_errn = xn.to_datetime() x_errp = xp.to_datetime() else: x_errn = xn.to("day") x_errp = xp.to("day") return x_errn, x_errp @property def as_plot_labels(self): """Return labels for plotting.""" labels = [] for t_min, t_max in self.iter_by_edges: label = f"{getattr(t_min, self.time_format)} - {getattr(t_max, self.time_format)}" labels.append(label) return labels @property def as_plot_edges(self): """Return edges for plotting.""" if self.time_format == "iso": edges = self.time_edges.to_datetime() else: edges = self.time_edges.mjd * u.day return edges @property def as_plot_center(self): """Return center for plotting.""" if self.time_format == "iso": center = self.time_mid.datetime else: center = self.time_mid.mjd * u.day return center def format_plot_xaxis(self, ax): """Format the 
x-axis. Parameters ---------- ax : `~matplotlib.pyplot.Axis` Plot axis to format. Returns ------- ax : `~matplotlib.pyplot.Axis` Formatted plot axis. """ from matplotlib.dates import DateFormatter xlabel = DEFAULT_LABEL_TEMPLATE.format( quantity=PLOT_AXIS_LABEL.get(self.name, self.name.capitalize()), unit=self.time_format, ) ax.set_xlabel(xlabel) if self.time_format == "iso": ax.xaxis.set_major_formatter(DateFormatter("%Y-%m-%d %H:%M:%S")) else: ax.xaxis.set_major_formatter("{x:,.5f}") plt.setp( ax.xaxis.get_majorticklabels(), rotation=30, ha="right", rotation_mode="anchor", ) return ax def assert_name(self, required_name): """Assert axis name if a specific one is required. Parameters ---------- required_name : str Required name. """ if self.name != required_name: raise ValueError( "Unexpected axis name," f' expected "{required_name}", got: "{self.name}"' ) def is_allclose(self, other, **kwargs): """Check if other time map axis is all close. Parameters ---------- other : `TimeMapAxis` Other time map axis. **kwargs : dict, optional Keyword arguments forwarded to `~numpy.allclose`. Returns ------- is_allclose : bool Whether the other time map axis is allclose. """ if not isinstance(other, self.__class__): raise TypeError(f"Cannot compare {type(self)} and {type(other)}") if self._edges_min.shape != other._edges_min.shape: return False # This will test equality at microsec level. delta_min = self.time_min - other.time_min delta_max = self.time_max - other.time_max return ( np.allclose(delta_min.to_value("s"), 0.0, **kwargs) and np.allclose(delta_max.to_value("s"), 0.0, **kwargs) and self._interp == other._interp and self.name.upper() == other.name.upper() ) def __eq__(self, other): if not isinstance(other, self.__class__): return False return self.is_allclose(other=other, atol=1e-6) def __ne__(self, other): return not self.__eq__(other) def __hash__(self): return id(self) def is_aligned(self, other, atol=2e-2): """Not supported for time axis.""" raise NotImplementedError @property def iter_by_edges(self): """Iterate by intervals defined by the edges.""" for time_min, time_max in zip(self.time_min, self.time_max): yield (time_min, time_max) def coord_to_idx(self, coord, **kwargs): """Transform time axis coordinate to bin index. Indices of time values falling outside time bins will be set to -1. Parameters ---------- coord : `~astropy.time.Time` or `~astropy.units.Quantity` Array of time axis coordinate values. The quantity is assumed to be relative to the reference time. Returns ------- idx : `~numpy.ndarray` Array of bin indices. """ if isinstance(coord, u.Quantity): coord = self.reference_time + coord time = Time(coord[..., np.newaxis]) delta_plus = (time - self.time_min).value > 0.0 delta_minus = (time - self.time_max).value <= 0.0 mask = np.logical_and(delta_plus, delta_minus) idx = np.asanyarray(np.argmax(mask, axis=-1)) idx[~np.any(mask, axis=-1)] = INVALID_INDEX.int return idx def pix_to_coord(self, pix): """Transform from pixel position to time coordinate. Currently, works only for linear interpolation scheme. Parameters ---------- pix : `~numpy.ndarray` Array of pixel positions. Returns ------- coord : `~astropy.time.Time` Array of time axis coordinate values.
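Examples
--------
A minimal illustrative sketch (the two-bin axis below is arbitrary, not taken from real data):

>>> import astropy.units as u
>>> from astropy.time import Time
>>> from gammapy.maps import TimeMapAxis
>>> axis = TimeMapAxis(
...     edges_min=[0, 2] * u.d,
...     edges_max=[1, 3] * u.d,
...     reference_time=Time("2020-03-19"),
... )
>>> coord = axis.pix_to_coord(0.5)  # centre of the first bin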
""" shape = np.shape(pix) pix = np.atleast_1d(pix) coords = np.zeros_like(pix) frac, idx = np.modf(pix) idx1 = idx.astype(int) valid = np.logical_and(idx >= 0, idx < self.nbin, np.isfinite(idx)) idx_valid = np.where(valid) idx_invalid = np.where(~valid) coords[idx_valid] = ( frac[idx_valid] * self.time_delta[idx1[valid]] + self.edges_min[idx1[valid]] ).value coords = coords * self.unit + self.reference_time coords[idx_invalid] = Time(INVALID_VALUE.time, scale=self.reference_time.scale) return coords.reshape(shape) def coord_to_pix(self, coord, **kwargs): """Transform time axis coordinate to pixel position. Pixels of time values falling outside time bins will be set to -1. Parameters ---------- coord : `~astropy.time.Time` Array of time axis coordinate values. Returns ------- pix : `~numpy.ndarray` Array of pixel positions. """ if isinstance(coord, u.Quantity): coord = self.reference_time + coord idx = np.atleast_1d(self.coord_to_idx(coord)) valid_pix = idx != INVALID_INDEX.int pix = np.atleast_1d(idx).astype("float") # TODO: is there the equivalent of np.atleast1d for astropy.time.Time? if coord.shape == (): coord = coord.reshape((1,)) relative_time = coord[valid_pix] - self.reference_time scale = interpolation_scale(self._interp) valid_idx = idx[valid_pix] s_min = scale(self._edges_min[valid_idx]) s_max = scale(self._edges_max[valid_idx]) s_coord = scale(relative_time.to(self._edges_min.unit)) pix[valid_pix] += (s_coord - s_min) / (s_max - s_min) pix[~valid_pix] = INVALID_INDEX.float return pix - 0.5 @staticmethod def pix_to_idx(pix, clip=False): # TODO: Is this useful at all? return pix @property def center(self): """Return interval centers as a `~astropy.units.Quantity`.""" return self.edges_min + 0.5 * self.bin_width @property def bin_width(self): """Return time interval width as a `~astropy.units.Quantity`.""" return self.time_delta.to("h") def __repr__(self): str_ = self.__class__.__name__ + "\n" str_ += "-" * len(self.__class__.__name__) + "\n\n" fmt = "\t{:<14s} : {:<10s}\n" str_ += fmt.format("name", self.name) str_ += fmt.format("nbins", str(self.nbin)) str_ += fmt.format("reference time", self.reference_time.iso) str_ += fmt.format("scale", self.reference_time.scale) str_ += fmt.format("time min.", self.time_min.min().iso) str_ += fmt.format("time max.", self.time_max.max().iso) str_ += fmt.format("total time", np.sum(self.bin_width)) return str_.expandtabs(tabsize=2) def upsample(self): """Not supported for time axis.""" raise NotImplementedError def downsample(self): """Not supported for time axis.""" raise NotImplementedError def _init_copy(self, **kwargs): """Init map axis instance by copying missing init arguments from self.""" argnames = inspect.getfullargspec(self.__init__).args argnames.remove("self") for arg in argnames: value = getattr(self, "_" + arg) if arg not in kwargs: kwargs[arg] = copy.deepcopy(value) return self.__class__(**kwargs) def copy(self, **kwargs): """Copy `TimeMapAxis` instance and overwrite given attributes. Parameters ---------- **kwargs : dict, optional Keyword arguments to overwrite in the map axis constructor. Returns ------- copy : `TimeMapAxis` Copied time map axis. """ return self._init_copy(**kwargs) def slice(self, idx): """Create a new axis object by extracting a slice from this axis. Parameters ---------- idx : `slice` Slice object selecting a sub-selection of the axis. Returns ------- axis : `~TimeMapAxis` Sliced time map axis object. 
Examples -------- >>> from gammapy.maps import TimeMapAxis >>> import astropy.units as u >>> from astropy.time import Time >>> time_map_axis = TimeMapAxis( ... edges_min=[1, 5, 10, 15] * u.day, ... edges_max=[2, 7, 13, 18] * u.day, ... reference_time=Time("2020-03-19"), ... ) >>> slices = slice(1, 3) >>> sliced = time_map_axis.slice(slices) """ return TimeMapAxis( self._edges_min[idx].copy(), self._edges_max[idx].copy(), self.reference_time, interp=self._interp, name=self.name, ) def squash(self): """Create a new axis object by squashing the axis into one bin. Returns ------- axis : `~TimeMapAxis` Squashed time map axis object. """ return TimeMapAxis( self._edges_min[0], self._edges_max[-1], self.reference_time, interp=self._interp, name=self._name, ) # TODO: if we are to allow log or sqrt bins the reference time should always # be strictly lower than all times # Should we define a mechanism to ensure this is always correct? @classmethod def from_time_edges(cls, time_min, time_max, unit="d", interp="lin", name="time"): """Create TimeMapAxis from the time interval edges defined as a `~astropy.time.Time` object. The reference time is defined as the lower edge of the first interval. Parameters ---------- time_min : `~astropy.time.Time` Array of lower edge times. time_max : `~astropy.time.Time` Array of upper edge times. unit : `~astropy.units.Unit` or str, optional The unit to convert the edges to. Default is 'd' (day). interp : {'lin'} Interpolation method used to transform between axis and pixel coordinates. Currently, only 'lin' is supported. Default is 'lin'. name : str, optional Axis name. Default is "time". Returns ------- axis : `TimeMapAxis` Time map axis. """ unit = u.Unit(unit) reference_time = time_min[0] edges_min = time_min - reference_time edges_max = time_max - reference_time return cls( edges_min.to(unit), edges_max.to(unit), reference_time, interp=interp, name=name, ) # TODO: how configurable should that be? column names? @classmethod def from_table(cls, table, format="gadf", idx=0): """Create time map axis from table. Parameters ---------- table : `~astropy.table.Table` Bin table HDU. format : {"gadf", "fermi-fgl", "lightcurve"} Format to use. Default is "gadf". idx : int Axis index. Default is 0. Returns ------- axis : `TimeMapAxis` Time map axis. """ if format == "gadf": axcols = table.meta.get("AXCOLS{}".format(idx + 1)) colnames = axcols.split(",") name = colnames[0].replace("_MIN", "").lower() reference_time = time_ref_from_dict(table.meta) edges_min = np.unique(table[colnames[0]].quantity) edges_max = np.unique(table[colnames[1]].quantity) elif format == "fermi-fgl": meta = table.meta.copy() meta["MJDREFF"] = str(meta["MJDREFF"]).replace("D-4", "e-4") reference_time = time_ref_from_dict(meta=meta) name = "time" edges_min = table["Hist_Start"][:-1] edges_max = table["Hist_Start"][1:] elif format == "lightcurve": # TODO: is this a good format? It just supports mjd...
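# The "lightcurve" branch below rebuilds the reference time from the
# MJDREFF/MJDREFI/TIMESYS/TIMEUNIT metadata keys and interprets the
# "time_min"/"time_max" columns as offsets in TIMEUNIT relative to it.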
name = "time" time_ref_dict = dict( MJDREFF=table.meta.get("MJDREFF", 0), MJDREFI=table.meta.get("MJDREFI", 0), TIMESYS=table.meta.get("TIMESYS", "utc"), TIMEUNIT=table.meta.get("TIMEUNIT", "d"), ) reference_time = time_ref_from_dict(time_ref_dict, format="mjd") time_min = reference_time + table["time_min"].data * u.Unit( time_ref_dict["TIMEUNIT"] ) time_max = reference_time + table["time_max"].data * u.Unit( time_ref_dict["TIMEUNIT"] ) if reference_time.mjd == 0: # change to a more recent reference time reference_time = Time( "2001-01-01T00:00:00", scale=time_ref_dict["TIMESYS"] ) reference_time.format = "mjd" edges_min = (time_min - reference_time).to("s") edges_max = (time_max - reference_time).to("s") else: raise ValueError(f"Not a supported format: {format}") return cls( edges_min=edges_min, edges_max=edges_max, reference_time=reference_time, name=name, ) @classmethod def from_gti(cls, gti, name="time"): """Create a time axis from an input GTI. Parameters ---------- gti : `~gammapy.data.GTI` GTI table. name : str, optional Axis name. Default is "time". Returns ------- axis : `TimeMapAxis` Time map axis. """ tmin = gti.time_start - gti.time_ref tmax = gti.time_stop - gti.time_ref return cls( edges_min=tmin.to("s"), edges_max=tmax.to("s"), reference_time=gti.time_ref, name=name, ) @classmethod def from_gti_bounds(cls, gti, t_delta, name="time"): """Create a time axis from an input GTI. The unit for the axis is taken from the t_delta quantity. Parameters ---------- gti : `~gammapy.data.GTI` GTI table. t_delta : `~astropy.units.Quantity` Time binning. name : str, optional Axis name. Default is "time". Returns ------- axis : `TimeMapAxis` Time map axis. """ time_min = gti.time_start[0] time_max = gti.time_stop[-1] nbin = int(((time_max - time_min) / t_delta).to("")) return TimeMapAxis.from_time_bounds( time_min=time_min, time_max=time_max, nbin=nbin, name=name, unit=t_delta.unit, ) @classmethod def from_time_bounds(cls, time_min, time_max, nbin, unit="d", name="time"): """Create linearly spaced time axis from bounds. Parameters ---------- time_min : `~astropy.time.Time` Lower bound. time_max : `~astropy.time.Time` Upper bound. nbin : int Number of bins. unit : `~astropy.units.Unit` or str, optional The unit to convert the edges to. Default is 'd' (day). name : str, optional Name of the axis. Default is "time". Returns ------- axis : `TimeMapAxis` Time map axis. """ delta = time_max - time_min time_edges = time_min + delta * np.linspace(0, 1, nbin + 1) return cls.from_time_edges( time_min=time_edges[:-1], time_max=time_edges[1:], interp="lin", unit=unit, name=name, ) def to_gti(self): """Convert the axis to a GTI table. Returns ------- gti : `~gammapy.data.GTI` GTI table. """ from gammapy.data import GTI return GTI.create( self.edges_min, self.edges_max, reference_time=self.reference_time ) def to_table(self): """Create table. Returns ------- table : `~astropy.table.Table` Table with axis data. """ t = self.to_gti().table return t def to_header(self, format="gadf", idx=0): """Create FITS header. Parameters ---------- format : {"gadf"} Format specification. Default is "gadf". idx : int, optional Column index of the axis. Default is 0. Returns ------- header : `~astropy.io.fits.Header` Corresponding FITS header.
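Examples
--------
A short sketch of exporting axis metadata (the axis edges are arbitrary illustrative values):

>>> import astropy.units as u
>>> from astropy.time import Time
>>> from gammapy.maps import TimeMapAxis
>>> axis = TimeMapAxis(
...     edges_min=[1, 5] * u.d,
...     edges_max=[2, 7] * u.d,
...     reference_time=Time("2020-03-19"),
... )
>>> header = axis.to_header()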
""" header = fits.Header() if format == "gadf": key = f"AXCOLS{idx}" name = self.name.upper() header[key] = f"{name}_MIN,{name}_MAX" key_interp = f"INTERP{idx}" header[key_interp] = self.interp ref_dict = time_ref_to_dict(self.reference_time) header.update(ref_dict) else: raise ValueError(f"Unknown format {format}") return header def group_table(self, interval_edges): """Compute bin groups table for the TimeMapAxis, given coarser bin edges. Parameters ---------- interval_edges : list of `~astropy.time.Time` or `~astropy.units.Quantity` Start and stop time for each interval to compute the LC. Returns ------- groups : `~astropy.table.Table` Group table. Bin groups are divided in: * "normal" for the bins containing data * "underflow" for the bins falling below the minimum axis threshold * "overflow" for the bins falling above the maximum axis threshold * "outflow" for other states """ for _, edge in enumerate(interval_edges): if not isinstance(edge, Time): interval_edges[_] = self.reference_time + interval_edges[_] time_intervals = list(zip(interval_edges[::2], interval_edges[1::2])) group_table = Table( names=("idx_min", "idx_max", "time_min", "time_max", "bin_type"), dtype=("i8", "i8", "f8", "f8", "S10"), ) for time_interval in time_intervals: mask1 = self.time_min >= time_interval[0] mask2 = self.time_max <= time_interval[1] mask = mask1 & mask2 if np.any(mask): idx_min = np.where(mask)[0][0] idx_max = np.where(mask)[0][-1] bin_type = "normal " else: idx_min = idx_max = -1 if np.any(mask1): bin_type = "overflow" elif np.any(mask2): bin_type = "underflow" else: bin_type = "outflow" time_min = self.time_min[idx_min] time_max = self.time_max[idx_max] group_table.add_row( [idx_min, idx_max, time_min.mjd, time_max.mjd, bin_type] ) return group_table class LabelMapAxis: """Map axis using labels. Parameters ---------- labels : list of str Labels to be used for the axis nodes. name : str, optional Name of the axis. Default is "". """ node_type = "label" def __init__(self, labels, name=""): unique_labels = np.unique(labels) if not len(unique_labels) == len(labels): raise ValueError("Node labels must be unique") self._labels = np.array(labels) self._name = name @property def unit(self): # TODO: should we allow units for label axis? """Unit of the axis.""" return u.Unit("") @property def name(self): """Name of the axis.""" return self._name def assert_name(self, required_name): """Assert axis name if a specific one is required. Parameters ---------- required_name : str Required name. """ if self.name != required_name: raise ValueError( "Unexpected axis name," f' expected "{required_name}", got: "{self.name}"' ) @property def nbin(self): """Number of bins.""" return len(self._labels) def pix_to_coord(self, pix): """Transform pixel to label coordinate. Parameters ---------- pix : `~numpy.ndarray` Array of pixel coordinate values. Returns ------- coord : `~numpy.ndarray` Array of axis coordinate values. """ idx = np.round(pix).astype(int) return self._labels[idx] def coord_to_idx(self, coord, **kwargs): """Transform label coordinate to indices. If the label is not present an error is raised. Parameters ---------- coord : `~astropy.time.Time` Array of axis coordinate values. Returns ------- idx : `~numpy.ndarray` Array of bin indices. 
""" coord = np.array(coord)[..., np.newaxis] is_equal = coord == self._labels if not np.all(np.any(is_equal, axis=-1)): label = coord[~np.any(is_equal, axis=-1)] raise ValueError(f"Not a valid label: {label}") return np.argmax(is_equal, axis=-1) def coord_to_pix(self, coord): """Transform label coordinate to pixel coordinate. Parameters ---------- coord : `~numpy.ndarray` Array of axis label values. Returns ------- pix : `~numpy.ndarray` Array of pixel coordinate values. """ return self.coord_to_idx(coord).astype("float") def pix_to_idx(self, pix, clip=False): """Convert pixel to idx Parameters ---------- pix : tuple of `~numpy.ndarray` Pixel coordinates. clip : bool, optional Choose whether to clip indices to the valid range of the axis. Default is False. If False, then indices for coordinates outside the axis range will be set to -1. Returns ------- idx : tuple `~numpy.ndarray` Pixel indices. """ if clip: idx = np.clip(pix, 0, self.nbin - 1) else: condition = (pix < 0) | (pix >= self.nbin) idx = np.where(condition, -1, pix) return idx @property def center(self): """Center of the label axis.""" return self._labels @property def edges(self): """Edges of the label axis.""" # TODO: Why not return self._labels here? raise ValueError("A LabelMapAxis does not define edges") @property def edges_min(self): """Edges of the label axis.""" return self._labels @property def edges_max(self): """Edges of the label axis.""" return self._labels @property def bin_width(self): """Bin width is unity.""" return np.ones(self.nbin) @property def as_plot_xerr(self): """Return labels for plotting.""" return 0.5 * np.ones(self.nbin) @property def as_plot_labels(self): """Return labels for plotting.""" return self._labels.tolist() @property def as_plot_center(self): """Return labels for plotting.""" return np.arange(self.nbin) @property def as_plot_edges(self): """Return labels for plotting.""" return np.arange(self.nbin + 1) - 0.5 def format_plot_xaxis(self, ax): """Format plot axis. Parameters ---------- ax : `~matplotlib.pyplot.Axis` Plot axis to format. Returns ------- ax : `~matplotlib.pyplot.Axis` Formatted plot axis. """ ax.set_xticks(self.as_plot_center) ax.set_xticklabels( self.as_plot_labels, rotation=30, ha="right", rotation_mode="anchor", ) return ax def to_header(self, format="gadf", idx=0): """Create FITS header. Parameters ---------- format : {"ogip"} Format specification. Default is "gadf". idx : int, optional Column index of the axis. Default is 0. Returns ------- header : `~astropy.io.fits.Header` Header to extend. """ header = fits.Header() if format == "gadf": key = f"AXCOLS{idx}" header[key] = self.name.upper() else: raise ValueError(f"Unknown format {format}") return header # TODO: how configurable should that be? column names? @classmethod def from_table(cls, table, format="gadf", idx=0): """Create time map axis from table. Parameters ---------- table : `~astropy.table.Table` Bin table HDU. format : {"gadf"} Format to use. idx : int Axis index. Default is 0. Returns ------- axis : `TimeMapAxis` Time map axis. 
""" if format == "gadf": colname = table.meta.get("AXCOLS{}".format(idx + 1)) column = table[colname] if not np.issubdtype(column.dtype, np.str_): raise TypeError(f"Not a valid dtype for label axis: '{column.dtype}'") labels = np.unique(column.data) else: raise ValueError(f"Not a supported format: {format}") return cls(labels=labels, name=colname.lower()) def __repr__(self): str_ = self.__class__.__name__ + "\n" str_ += "-" * len(self.__class__.__name__) + "\n\n" fmt = "\t{:<10s} : {:<10s}\n" str_ += fmt.format("name", self.name) str_ += fmt.format("nbins", str(self.nbin)) str_ += fmt.format("node type", self.node_type) str_ += fmt.format("labels", "{0}".format(list(self._labels))) return str_.expandtabs(tabsize=2) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def is_allclose(self, other, **kwargs): """Check if other map axis is all close. Parameters ---------- other : `LabelMapAxis` Other map axis. Returns ------- is_allclose : bool Whether other axis is allclose. """ if not isinstance(other, self.__class__): raise TypeError(f"Cannot compare {type(self)} and {type(other)}") name_equal = self.name.upper() == other.name.upper() labels_equal = np.all(self.center == other.center) return name_equal & labels_equal def __eq__(self, other): if not isinstance(other, self.__class__): return False return self.is_allclose(other=other) def __ne__(self, other): return not self.__eq__(other) # TODO: could create sub-labels here using dashes like "label-1-a", etc. def upsample(self, *args, **kwargs): """Not supported for label axis.""" raise NotImplementedError("Upsampling a LabelMapAxis is not supported") # TODO: could merge labels here like "label-1-label2", etc. def downsample(self, *args, **kwargs): """Not supported for label axis.""" raise NotImplementedError("Downsampling a LabelMapAxis is not supported") # TODO: could merge labels here like "label-1-label2", etc. def resample(self, *args, **kwargs): """Not supported for label axis.""" raise NotImplementedError("Resampling a LabelMapAxis is not supported") # TODO: could create new labels here like "label-10-a" def pad(self, *args, **kwargs): """Not supported for label axis.""" raise NotImplementedError("Padding a LabelMapAxis is not supported") def copy(self): """Copy the axis.""" return copy.deepcopy(self) def slice(self, idx): """Create a new axis object by extracting a slice from this axis. Parameters ---------- idx : slice Slice object selecting a sub-selection of the axis. Returns ------- axis : `~LabelMapAxis` Sliced axis object. Examples -------- >>> from gammapy.maps import LabelMapAxis >>> label_axis = LabelMapAxis( ... labels=["dataset-1", "dataset-2", "dataset-3", "dataset-4"], name="dataset" ... ) >>> slices = slice(2, 4) >>> sliced = label_axis.slice(slices) """ return self.__class__( labels=self._labels[idx], name=self.name, ) @classmethod def from_stack(cls, axes): """Create a label map axis by merging a list of label axes. Parameters ---------- axes : list of `LabelMapAxis` List of label map axes to be merged. Returns ------- axis : `LabelMapAxis` Stacked axis. """ axis_stacked = axes[0] for ax in axes[1:]: axis_stacked = axis_stacked.concatenate(ax) return axis_stacked def concatenate(self, axis): """Concatenate another label map axis to this one into a new instance of `LabelMapAxis`. Names must agree between the axes. Labels must be unique. Parameters ---------- axis : `LabelMapAxis` Axis to concatenate with. Returns ------- axis : `LabelMapAxis` Concatenation of the two axes. """ if not isinstance(axis, LabelMapAxis): raise TypeError( f"axis must be an instance of LabelMapAxis, got {axis.__class__.__name__} instead." ) if self.name != axis.name: raise ValueError(f"Names must agree, got {self.name} and {axis.name}") merged_labels = np.append(self.center, axis.center) return LabelMapAxis(merged_labels, self.name) def squash(self): """Create a new axis object by squashing the axis into one bin. The label of the new axis is given as "first-label...last-label". Returns ------- axis : `LabelMapAxis` Squashed label map axis. """ return LabelMapAxis( labels=[self.center[0] + "..."
+ self.center[-1]], name=self._name ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/coord.py0000644000175100001770000002523014721316200016466 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy import html import numpy as np from astropy import units as u from astropy.coordinates import SkyCoord from gammapy.utils.compat import COPY_IF_NEEDED __all__ = ["MapCoord"] def skycoord_to_lonlat(skycoord, frame=None): """Convert SkyCoord to lon, lat, frame. Returns ------- lon : `~numpy.ndarray` Longitude in degrees. lat : `~numpy.ndarray` Latitude in degrees. """ if frame: skycoord = skycoord.transform_to(frame) return skycoord.data.lon.deg, skycoord.data.lat.deg, skycoord.frame.name class MapCoord: """Represents a sequence of n-dimensional map coordinates. Contains coordinates for 2 spatial dimensions and an arbitrary number of additional non-spatial dimensions. Ensure correct broadcasting of the array to allow for this. For further information and examples see :ref:`mapcoord`. Parameters ---------- data : `dict` of `~numpy.ndarray` Dictionary of coordinate arrays. frame : {None, "icrs", "galactic"} Spatial coordinate system. If None then the coordinate system will be set to the native coordinate system of the geometry. Default is None. match_by_name : bool, optional Match coordinates to axes by name. If False, coordinates will be matched by index. Default is True. """ def __init__(self, data, frame=None, match_by_name=True): if "lon" not in data or "lat" not in data: raise ValueError("data dictionary must contain axes named 'lon' and 'lat'.") self._data = {k: np.atleast_1d(v) for k, v in data.items()} self._frame = frame self._match_by_name = match_by_name def __getitem__(self, key): if isinstance(key, str): return self._data[key] else: return list(self._data.values())[key] def __setitem__(self, key, value): # TODO: check for broadcastability? self._data[key] = value def __iter__(self): return iter(self._data.values()) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def ndim(self): """Number of dimensions.""" return len(self._data) @property def axis_names(self): """Names of axes.""" return list(self._data.keys()) @property def shape(self): """Coordinate array shape.""" arrays = [_ for _ in self._data.values()] return np.broadcast(*arrays).shape @property def size(self): """Product of all shape elements.""" return np.prod(self.shape) @property def lon(self): """Longitude coordinate in degrees.""" return self._data["lon"] @property def lat(self): """Latitude coordinate in degrees.""" return self._data["lat"] @property def theta(self): """Theta co-latitude angle in radians.""" theta = u.Quantity(self.lat, unit="deg", copy=COPY_IF_NEEDED).to_value("rad") return np.pi / 2.0 - theta @property def phi(self): """Phi longitude angle in radians.""" phi = u.Quantity(self.lon, unit="deg", copy=COPY_IF_NEEDED).to_value("rad") return phi @property def frame(self): """Coordinate system as a string.""" return self._frame @property def match_by_name(self): """Boolean flag: axis lookup by name return True or by index return False.""" return self._match_by_name @property def skycoord(self): """Coordinate object as a `~astropy.coordinates.SkyCoord`.""" return SkyCoord(self.lon, self.lat, unit="deg", frame=self.frame) @classmethod def _from_lonlat(cls, coords, frame=None, axis_names=None): """Create a `~MapCoord` from a tuple of coordinate vectors. The first two elements of the tuple should be longitude and latitude in degrees. Parameters ---------- coords : tuple Tuple of `~numpy.ndarray`. frame : {"icrs", "galactic", None} Set the coordinate system for longitude and latitude. If None, longitude and latitude will be assumed to be in the coordinate system native to a given map geometry. Default is None. axis_names : list of str, optional Axis names to use. Returns ------- coord : `~MapCoord` A coordinates object. """ if axis_names is None: axis_names = [f"axis{idx}" for idx in range(len(coords) - 2)] if isinstance(coords, (list, tuple)): coords_dict = {"lon": coords[0], "lat": coords[1]} for name, c in zip(axis_names, coords[2:]): coords_dict[name] = c else: raise ValueError("Unrecognized input type.") return cls(coords_dict, frame=frame, match_by_name=False) @classmethod def _from_tuple(cls, coords, frame=None, axis_names=None): """Create from a tuple of coordinate vectors.""" if isinstance(coords[0], (list, np.ndarray)) or np.isscalar(coords[0]): return cls._from_lonlat(coords, frame=frame, axis_names=axis_names) elif isinstance(coords[0], SkyCoord): lon, lat, frame = skycoord_to_lonlat(coords[0], frame=frame) coords = (lon, lat) + coords[1:] return cls._from_lonlat(coords, frame=frame, axis_names=axis_names) else: raise TypeError(f"Type not supported: {type(coords)!r}") @classmethod def _from_dict(cls, coords, frame=None): """Create from a dictionary of coordinate vectors.""" if "lon" in coords and "lat" in coords: return cls(coords, frame=frame) elif "skycoord" in coords: lon, lat, frame = skycoord_to_lonlat(coords["skycoord"], frame=frame) coords_dict = {"lon": lon, "lat": lat} for k, v in coords.items(): if k == "skycoord": continue coords_dict[k] = v return cls(coords_dict, frame=frame) else: raise ValueError("coords dict must contain 'lon'/'lat' or 'skycoord'.") @classmethod def create(cls, data, frame=None, axis_names=None): """Create a new `~MapCoord` object. This method can be used to create either unnamed (with tuple input) or named (via dict input) axes. 
Parameters ---------- data : tuple, dict, `~gammapy.maps.MapCoord` or `~astropy.coordinates.SkyCoord` Object containing coordinate arrays. frame : {"icrs", "galactic", None} Set the coordinate system for longitude and latitude. If None, longitude and latitude will be assumed to be in the coordinate system native to a given map geometry. Default is None. axis_names : list of str, optional Axis names used if a tuple is provided. Default is None. Examples -------- >>> from astropy.coordinates import SkyCoord >>> from gammapy.maps import MapCoord >>> lon, lat = [1, 2], [2, 3] >>> skycoord = SkyCoord(lon, lat, unit='deg') >>> energy = [1000] >>> c = MapCoord.create((lon,lat)) >>> c = MapCoord.create((skycoord,)) >>> c = MapCoord.create((lon,lat,energy)) >>> c = MapCoord.create(dict(lon=lon,lat=lat)) >>> c = MapCoord.create(dict(lon=lon,lat=lat,energy=energy)) >>> c = MapCoord.create(dict(skycoord=skycoord,energy=energy)) """ if isinstance(data, cls): if data.frame is None or frame == data.frame: return data else: return data.to_frame(frame) elif isinstance(data, dict): return cls._from_dict(data, frame=frame) elif isinstance(data, (list, tuple)): return cls._from_tuple(data, frame=frame, axis_names=axis_names) elif isinstance(data, SkyCoord): return cls._from_tuple((data,), frame=frame, axis_names=axis_names) else: raise TypeError(f"Unsupported input type: {type(data)!r}") def to_frame(self, frame): """Convert to a different coordinate frame. Parameters ---------- frame : {"icrs", "galactic"} Coordinate system, either Galactic ("galactic") or Equatorial ("icrs"). Returns ------- coords : `~MapCoord` A coordinates object. """ if frame == self.frame: return copy.deepcopy(self) else: lon, lat, frame = skycoord_to_lonlat(self.skycoord, frame=frame) data = copy.deepcopy(self._data) if isinstance(self.lon, u.Quantity): lon = u.Quantity(lon, unit="deg", copy=COPY_IF_NEEDED) if isinstance(self.lat, u.Quantity): lat = u.Quantity(lat, unit="deg", copy=COPY_IF_NEEDED) data["lon"] = lon data["lat"] = lat return self.__class__(data, frame, self._match_by_name) def apply_mask(self, mask): """Return a masked copy of this coordinate object. Parameters ---------- mask : `~numpy.ndarray` Boolean mask. Returns ------- coords : `~MapCoord` A coordinates object.
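Examples
--------
A minimal sketch of masking coordinates (the coordinate values are arbitrary):

>>> import numpy as np
>>> from gammapy.maps import MapCoord
>>> coords = MapCoord.create(dict(lon=[0.0, 1.0], lat=[2.0, 3.0]))
>>> masked = coords.apply_mask(np.array([True, False]))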
""" try: data = {k: v[mask] for k, v in self._data.items()} except IndexError: data = {} for name, coord in self._data.items(): if name in ["lon", "lat"]: data[name] = np.squeeze(coord)[mask] else: data[name] = np.squeeze(coord, axis=-1) return self.__class__(data, self.frame, self._match_by_name) @property def flat(self): """Return flattened, valid coordinates.""" coords = self.broadcasted is_finite = np.isfinite(coords[0]) return coords.apply_mask(is_finite) @property def broadcasted(self): """Return broadcasted coordinates.""" vals = np.broadcast_arrays(*self._data.values(), subok=True) data = dict(zip(self._data.keys(), vals)) return self.__class__( data=data, frame=self.frame, match_by_name=self._match_by_name ) def copy(self): """Copy `MapCoord` object.""" return copy.deepcopy(self) def __str__(self): return ( f"{self.__class__.__name__}\n\n" f"\taxes : {list(self._data.keys())}\n" f"\tshape : {self.shape[::-1]}\n" f"\tndim : {self.ndim}\n" f"\tframe : {self.frame}\n" ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/core.py0000644000175100001770000021603214721316200016312 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import copy import html import inspect import json from collections import OrderedDict from itertools import repeat import numpy as np from astropy import units as u from astropy.io import fits import matplotlib.pyplot as plt import gammapy.utils.parallel as parallel from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.random import InverseCDFSampler, get_random_state from gammapy.utils.scripts import make_path from gammapy.utils.types import JsonQuantityDecoder from gammapy.utils.units import energy_unit_format from .axes import MapAxis from .coord import MapCoord from .geom import pix_tuple_to_idx __all__ = ["Map"] class Map(abc.ABC): """Abstract map class. This can represent WCS or HEALPix-based maps with 2 spatial dimensions and N non-spatial dimensions. Parameters ---------- geom : `~gammapy.maps.Geom` Geometry. data : `~numpy.ndarray` or `~astropy.units.Quantity` Data array. meta : `dict`, optional Dictionary to store metadata. Default is None. unit : str or `~astropy.units.Unit`, optional Data unit, ignored if data is a Quantity. Default is ''. """ tag = "map" def __init__(self, geom, data, meta=None, unit=""): self._geom = geom if isinstance(data, u.Quantity): self._unit = u.Unit(unit) self.quantity = data else: self.data = data self._unit = u.Unit(unit) if meta is None: self.meta = {} else: self.meta = meta def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def _init_copy(self, **kwargs): """Init map instance by copying missing init arguments from self.""" argnames = inspect.getfullargspec(self.__init__).args argnames.remove("self") argnames.remove("dtype") for arg in argnames: value = getattr(self, "_" + arg) if arg not in kwargs: kwargs[arg] = copy.deepcopy(value) return self.from_geom(**kwargs) @property def is_mask(self): """Whether map is a mask with boolean data type.""" return self.data.dtype == bool @property def geom(self): """Map geometry as a `~gammapy.maps.Geom` object.""" return self._geom @property def data(self): """Data array as a `~numpy.ndarray` object.""" return self._data @data.setter def data(self, value): """Set data. Parameters ---------- value : array-like Data array. """ if np.isscalar(value): value = value * np.ones(self.geom.data_shape, dtype=type(value)) if isinstance(value, u.Quantity): raise TypeError("Map data must be a Numpy array. Set unit separately") if not value.shape == self.geom.data_shape: try: value = np.broadcast_to(value, self.geom.data_shape, subok=True) except ValueError as exc: raise ValueError( f"Input shape {value.shape} is not compatible with shape from geometry {self.geom.data_shape}" ) from exc self._data = value @property def unit(self): """Map unit as an `~astropy.units.Unit` object.""" return self._unit @property def meta(self): """Map metadata as a `dict`.""" return self._meta @meta.setter def meta(self, val): self._meta = val @property def quantity(self): """Map data as a `~astropy.units.Quantity` object.""" return u.Quantity(self.data, self.unit, copy=COPY_IF_NEEDED) @quantity.setter def quantity(self, val): """Set data and unit. Parameters ---------- val : `~astropy.units.Quantity` Quantity. """ val = u.Quantity(val, copy=COPY_IF_NEEDED) self.data = val.value self._unit = val.unit def rename_axes(self, names, new_names): """Rename the Map axes. Parameters ---------- names : str or list of str Names of the axes. new_names : str or list of str New names of the axes. The list must be of the same length as `names`). Returns ------- geom : `~Map` Map with renamed axes. """ geom = self.geom.rename_axes(names=names, new_names=new_names) return self._init_copy(geom=geom) @staticmethod def create(**kwargs): """Create an empty map object. This method accepts generic options listed below, as well as options for `HpxMap` and `WcsMap` objects. For WCS-specific options, see `WcsMap.create`; for HPX-specific options, see `HpxMap.create`. Parameters ---------- frame : {"icrs", "galactic"} Coordinate system, either Galactic ("galactic") or Equatorial ("icrs"). map_type : {'wcs', 'wcs-sparse', 'hpx', 'hpx-sparse', 'region'} Map type. Selects the class that will be used to instantiate the map. Default is 'wcs'. binsz : float or `~numpy.ndarray` Pixel size in degrees. skydir : `~astropy.coordinates.SkyCoord` Coordinate of the map center. axes : list List of `~MapAxis` objects for each non-spatial dimension. If None then the map will be a 2D image. dtype : str Data type, default is 'float32' unit : str or `~astropy.units.Unit` Data unit. meta : `dict` Dictionary to store metadata. region : `~regions.SkyRegion` Sky region used for the region map. Returns ------- map : `Map` Empty map object. 
""" from .hpx import HpxMap from .region import RegionNDMap from .wcs import WcsMap map_type = kwargs.setdefault("map_type", "wcs") if "wcs" in map_type.lower(): return WcsMap.create(**kwargs) elif "hpx" in map_type.lower(): return HpxMap.create(**kwargs) elif map_type == "region": _ = kwargs.pop("map_type") return RegionNDMap.create(**kwargs) else: raise ValueError(f"Unrecognized map type: {map_type!r}") @staticmethod def read( filename, hdu=None, hdu_bands=None, map_type="auto", format=None, colname=None, checksum=False, ): """Read a map from a FITS file. Parameters ---------- filename : str or `~pathlib.Path` Name of the FITS file. hdu : str, optional Name or index of the HDU with the map data. Default is None. hdu_bands : str, optional Name or index of the HDU with the BANDS table. If not defined this will be inferred from the FITS header of the map HDU. Default is None. map_type : {'auto', 'wcs', 'wcs-sparse', 'hpx', 'hpx-sparse', 'region'} Map type. Selects the class that will be used to instantiate the map. The map type should be consistent with the format of the input file. If map_type is 'auto' then an appropriate map type will be inferred from the input file. Default is 'auto'. colname : str, optional data column name to be used for HEALPix map. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- map_out : `Map` Map object. """ with fits.open( make_path(filename), memmap=False, checksum=checksum ) as hdulist: return Map.from_hdulist( hdulist, hdu, hdu_bands, map_type, format=format, colname=colname ) @staticmethod def from_geom(geom, meta=None, data=None, unit="", dtype="float32"): """Generate an empty map from a `Geom` instance. Parameters ---------- geom : `Geom` Map geometry. meta : `dict`, optional Dictionary to store metadata. Default is None. data : `numpy.ndarray`, optional Data array. Default is None. unit : str or `~astropy.units.Unit` Data unit. dtype : str, optional Data type. Default is 'float32'. Returns ------- map_out : `Map` Map object. """ from .hpx import HpxGeom from .region import RegionGeom from .wcs import WcsGeom if isinstance(geom, HpxGeom): map_type = "hpx" elif isinstance(geom, WcsGeom): map_type = "wcs" elif isinstance(geom, RegionGeom): map_type = "region" else: raise ValueError("Unrecognized geom type.") cls_out = Map._get_map_cls(map_type) return cls_out(geom, data=data, meta=meta, unit=unit, dtype=dtype) @staticmethod def from_hdulist( hdulist, hdu=None, hdu_bands=None, map_type="auto", format=None, colname=None ): """Create from a `astropy.io.fits.HDUList` object. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` HDU list containing HDUs for map data and bands. hdu : str, optional Name or index of the HDU with the map data. Default is None. hdu_bands : str, optional Name or index of the HDU with the BANDS table. Default is None. map_type : {"auto", "wcs", "hpx", "region"} Map type. Default is "auto". format : {'gadf', 'fgst-ccube', 'fgst-template'} FITS format convention. Default is None. colname : str, optional Data column name to be used for HEALPix map. Default is None. Returns ------- map_out : `Map` Map object. 
""" if map_type == "auto": map_type = Map._get_map_type(hdulist, hdu) cls_out = Map._get_map_cls(map_type) if map_type == "hpx": return cls_out.from_hdulist( hdulist, hdu=hdu, hdu_bands=hdu_bands, format=format, colname=colname ) else: return cls_out.from_hdulist( hdulist, hdu=hdu, hdu_bands=hdu_bands, format=format ) @staticmethod def _get_meta_from_header(header): """Load metadata from a FITS header.""" if "META" in header: return json.loads(header["META"], cls=JsonQuantityDecoder) else: return {} @staticmethod def _get_map_type(hdu_list, hdu_name): """Infer map type from a FITS HDU. Only read header, never data, to have good performance. """ if hdu_name is None: # Find the header of the first non-empty HDU header = hdu_list[0].header if header["NAXIS"] == 0: header = hdu_list[1].header else: header = hdu_list[hdu_name].header if ("PIXTYPE" in header) and (header["PIXTYPE"] == "HEALPIX"): return "hpx" elif "CTYPE1" in header: return "wcs" else: return "region" @staticmethod def _get_map_cls(map_type): """Get map class for a given `map_type` string. This should probably be a registry dict so that users can add supported map types to the `gammapy.maps` I/O (see e.g. the Astropy table format I/O registry), but that's non-trivial to implement without avoiding circular imports. """ if map_type == "wcs": from .wcs import WcsNDMap return WcsNDMap elif map_type == "wcs-sparse": raise NotImplementedError() elif map_type == "hpx": from .hpx import HpxNDMap return HpxNDMap elif map_type == "hpx-sparse": raise NotImplementedError() elif map_type == "region": from .region import RegionNDMap return RegionNDMap else: raise ValueError(f"Unrecognized map type: {map_type!r}") def write(self, filename, overwrite=False, **kwargs): """Write to a FITS file. Parameters ---------- filename : str Output file name. overwrite : bool, optional Overwrite existing file. Default is False. hdu : str, optional Set the name of the image extension. By default, this will be set to 'SKYMAP' (for BINTABLE HDU) or 'PRIMARY' (for IMAGE HDU). hdu_bands : str, optional Set the name of the bands table extension. Default is 'BANDS'. format : str, optional FITS format convention. By default, files will be written to the gamma-astro-data-formats (GADF) format. This option can be used to write files that are compliant with format conventions required by specific software (e.g. the Fermi Science Tools). The following formats are supported: - "gadf" (default) - "fgst-ccube" - "fgst-ltcube" - "fgst-bexpcube" - "fgst-srcmap" - "fgst-template" - "fgst-srcmap-sparse" - "galprop" - "galprop2" sparse : bool, optional Sparsify the map by dropping pixels with zero amplitude. This option is only compatible with the 'gadf' format. checksum : bool, optional When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. """ checksum = kwargs.pop("checksum", False) hdulist = self.to_hdulist(**kwargs) hdulist.writeto(make_path(filename), overwrite=overwrite, checksum=checksum) def iter_by_axis(self, axis_name, keepdims=False): """Iterate over a given axis. Yields ------ map : `Map` Map iteration. See also -------- iter_by_image : iterate by image returning a map. """ axis = self.geom.axes[axis_name] for idx in range(axis.nbin): idx_axis = slice(idx, idx + 1) if keepdims else idx slices = {axis_name: idx_axis} yield self.slice_by_idx(slices=slices) def iter_by_image(self, keepdims=False): """Iterate over image planes of a map. Parameters ---------- keepdims : bool, optional Keep dimensions. Default is False. 
Yields ------ map : `Map` Map iteration. See also -------- iter_by_image_data : iterate by image returning data and index. """ for idx in np.ndindex(self.geom.shape_axes): if keepdims: names = self.geom.axes.names slices = {name: slice(_, _ + 1) for name, _ in zip(names, idx)} yield self.slice_by_idx(slices=slices) else: yield self.get_image_by_idx(idx=idx) def iter_by_image_data(self): """Iterate over image planes of the map. The image plane index is in data order, so that the data array can be indexed directly. Yields ------ (data, idx) : tuple Where ``data`` is a `numpy.ndarray` view of the image plane data, and ``idx`` is a tuple of int, the index of the image plane. See also -------- iter_by_image : iterate by image returning a map. """ for idx in np.ndindex(self.geom.shape_axes): yield self.data[idx[::-1]], idx[::-1] def iter_by_image_index(self): """Iterate over image planes of the map. The image plane index is in data order, so that the data array can be indexed directly. Yields ------ idx : tuple ``idx`` is a tuple of int, the index of the image plane. See also -------- iter_by_image : iterate by image returning a map. """ for idx in np.ndindex(self.geom.shape_axes): yield idx[::-1] def coadd(self, map_in, weights=None): """Add the contents of ``map_in`` to this map. This method can be used to combine maps containing integral quantities (e.g. counts) or differential quantities if the maps have the same binning. Parameters ---------- map_in : `Map` Map to add. weights: `Map` or `~numpy.ndarray` The weight factors while adding. Default is None. """ if not self.unit.is_equivalent(map_in.unit): raise ValueError("Incompatible units") # TODO: Check whether geometries are aligned and if so sum the # data vectors directly if weights is not None: map_in = map_in * weights idx = map_in.geom.get_idx() coords = map_in.geom.get_coord() vals = u.Quantity(map_in.get_by_idx(idx), map_in.unit) self.fill_by_coord(coords, vals) def pad(self, pad_width, axis_name=None, mode="constant", cval=0, method="linear"): """Pad the spatial dimensions of the map. Parameters ---------- pad_width : {sequence, array_like, int} Number of pixels padded to the edges of each axis. axis_name : str, optional Which axis to downsample. By default, spatial axes are padded. Default is None. mode : {'constant', 'edge', 'interp'} Padding mode. 'edge' pads with the closest edge value. 'constant' pads with a constant value. 'interp' pads with an extrapolated value. Default is 'constant'. cval : float, optional Padding value when ``mode='consant'``. Default is 0. Returns ------- map : `Map` Padded map. """ if axis_name: if np.isscalar(pad_width): pad_width = (pad_width, pad_width) geom = self.geom.pad(pad_width=pad_width, axis_name=axis_name) idx = self.geom.axes.index_data(axis_name) pad_width_np = [(0, 0)] * self.data.ndim pad_width_np[idx] = pad_width kwargs = {} if mode == "constant": kwargs["constant_values"] = cval data = np.pad(self.data, pad_width=pad_width_np, mode=mode, **kwargs) return self.__class__( geom=geom, data=data, unit=self.unit, meta=self.meta.copy() ) return self._pad_spatial(pad_width, mode="constant", cval=cval) @abc.abstractmethod def _pad_spatial(self, pad_width, mode="constant", cval=0, order=1): pass @abc.abstractmethod def crop(self, crop_width): """Crop the spatial dimensions of the map. Parameters ---------- crop_width : {sequence, array_like, int} Number of pixels cropped from the edges of each axis. Defined analogously to ``pad_with`` from `numpy.pad`. Returns ------- map : `Map` Cropped map. 
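Examples
--------
An illustrative crop of a small test map (sizes chosen arbitrarily):

>>> from gammapy.maps import Map
>>> m = Map.create(map_type="wcs", binsz=0.1, width=10.0)
>>> cropped = m.crop(crop_width=5)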
""" pass @abc.abstractmethod def downsample(self, factor, preserve_counts=True, axis_name=None): """Downsample the spatial dimension by a given factor. Parameters ---------- factor : int Downsampling factor. preserve_counts : bool, optional Preserve the integral over each bin. This should be set to True if the map is an integral quantity (e.g. counts) and False if the map is a differential quantity (e.g. intensity). Default is True. axis_name : str, optional Which axis to downsample. By default, spatial axes are downsampled. Default is None. Returns ------- map : `Map` Downsampled map. """ pass @abc.abstractmethod def upsample(self, factor, order=0, preserve_counts=True, axis_name=None): """Upsample the spatial dimension by a given factor. Parameters ---------- factor : int Upsampling factor. order : int, optional Order of the interpolation used for upsampling. Default is 0. preserve_counts : bool, optional Preserve the integral over each bin. This should be true if the map is an integral quantity (e.g. counts) and false if the map is a differential quantity (e.g. intensity). Default is True. axis_name : str, optional Which axis to upsample. By default, spatial axes are upsampled. Default is None. Returns ------- map : `Map` Upsampled map. """ pass def resample(self, geom, weights=None, preserve_counts=True): """Resample pixels to ``geom`` with given ``weights``. Parameters ---------- geom : `~gammapy.maps.Geom` Target Map geometry. weights : `~numpy.ndarray`, optional Weights vector. It must have same shape as the data of the map. If set to None, weights will be set to 1. Default is None. preserve_counts : bool, optional Preserve the integral over each bin. This should be true if the map is an integral quantity (e.g. counts) and false if the map is a differential quantity (e.g. intensity). Default is True. Returns ------- resampled_map : `Map` Resampled map. """ coords = self.geom.get_coord() idx = geom.coord_to_idx(coords) weights = 1 if weights is None else weights resampled = self._init_copy(data=None, geom=geom) resampled._resample_by_idx( idx, weights=self.data * weights, preserve_counts=preserve_counts ) return resampled @abc.abstractmethod def _resample_by_idx(self, idx, weights=None, preserve_counts=False): """Resample pixels at ``idx`` with given ``weights``. Parameters ---------- idx : tuple Tuple of pixel index arrays for each dimension of the map. Tuple should be ordered as (I_lon, I_lat, I_0, ..., I_n) for WCS maps and (I_hpx, I_0, ..., I_n) for HEALPix maps. weights : `~numpy.ndarray`, optional Weights vector. If set to None, weights will be set to 1. Default is None. preserve_counts : bool, optional Preserve the integral over each bin. This should be true if the map is an integral quantity (e.g. counts) and false if the map is a differential quantity (e.g. intensity). Default is False. """ pass def resample_axis(self, axis, weights=None, ufunc=np.add): """Resample map to a new axis by grouping and reducing smaller bins by a given function `ufunc`. By default, the map content are summed over the smaller bins. Other `numpy.ufunc` can be used, e.g. `numpy.logical_and` or `numpy.logical_or`. Parameters ---------- axis : `MapAxis` New map axis. weights : `Map`, optional Array to be used as weights. The spatial geometry must be equivalent to `other` and additional axes must be broadcastable. If set to None, weights will be set to 1. Default is None. ufunc : `~numpy.ufunc` Universal function to use to resample the axis. Default is `numpy.add`. 
Returns ------- map : `Map` Map with resampled axis. """ from .hpx import HpxGeom geom = self.geom.resample_axis(axis) axis_self = self.geom.axes[axis.name] axis_resampled = geom.axes[axis.name] # We don't use MapAxis.coord_to_idx because is does not behave as needed with boundaries coord = axis_resampled.edges.value edges = axis_self.edges.value indices = np.digitize(coord, edges) - 1 idx = self.geom.axes.index_data(axis.name) weights = 1 if weights is None else weights.data if not isinstance(self.geom, HpxGeom): shape = self.geom._shape[:2] else: shape = (self.geom.data_shape[-1],) shape += tuple([ax.nbin if ax != axis else 1 for ax in self.geom.axes]) padded_array = np.append(self.data * weights, np.zeros(shape[::-1]), axis=idx) slices = tuple([slice(0, _) for _ in geom.data_shape]) data = ufunc.reduceat(padded_array, indices=indices, axis=idx)[slices] return self._init_copy(data=data, geom=geom) def slice_by_idx( self, slices, ): """Slice sub map from map object. Parameters ---------- slices : dict Dictionary of axes names and integers or `slice` object pairs. Contains one element for each non-spatial dimension. For integer indexing the corresponding axes is dropped from the map. Axes not specified in the dictionary are kept unchanged. Returns ------- map_out : `Map` Sliced map object. Examples -------- >>> from gammapy.maps import Map >>> m = Map.read("$GAMMAPY_DATA/fermi_3fhl/gll_iem_v06_cutout.fits") >>> slices = {"energy": slice(0, 5)} >>> sliced = m.slice_by_idx(slices) """ geom = self.geom.slice_by_idx(slices) slices = tuple([slices.get(ax.name, slice(None)) for ax in self.geom.axes]) data = self.data[slices[::-1]] return self.__class__(geom=geom, data=data, unit=self.unit, meta=self.meta) def get_image_by_coord(self, coords): """Return spatial map at the given axis coordinates. Parameters ---------- coords : tuple or dict Tuple should be ordered as (x_0, ..., x_n) where x_i are coordinates for non-spatial dimensions of the map. Dictionary should specify the axis names of the non-spatial axes such as {'axes0': x_0, ..., 'axesn': x_n}. Returns ------- map_out : `Map` Map with spatial dimensions only. See Also -------- get_image_by_idx, get_image_by_pix. Examples -------- :: import numpy as np from gammapy.maps import Map, MapAxis from astropy.coordinates import SkyCoord from astropy import units as u # Define map axes energy_axis = MapAxis.from_edges( np.logspace(-1., 1., 4), unit='TeV', name='energy', ) time_axis = MapAxis.from_edges( np.linspace(0., 10, 20), unit='h', name='time', ) # Define map center skydir = SkyCoord(0, 0, frame='galactic', unit='deg') # Create map m_wcs = Map.create( map_type='wcs', binsz=0.02, skydir=skydir, width=10.0, axes=[energy_axis, time_axis], ) # Get image by coord tuple image = m_wcs.get_image_by_coord(('500 GeV', '1 h')) # Get image by coord dict with strings image = m_wcs.get_image_by_coord({'energy': '500 GeV', 'time': '1 h'}) # Get image by coord dict with quantities image = m_wcs.get_image_by_coord({'energy': 0.5 * u.TeV, 'time': 1 * u.h}) """ if isinstance(coords, tuple): coords = dict(zip(self.geom.axes.names, coords)) idx = self.geom.axes.coord_to_idx(coords) return self.get_image_by_idx(idx) def get_image_by_pix(self, pix): """Return spatial map at the given axis pixel coordinates Parameters ---------- pix : tuple Tuple of scalar pixel coordinates for each non-spatial dimension of the map. Tuple should be ordered as (I_0, ..., I_n). Pixel coordinates can be either float or integer type. 
See Also -------- get_image_by_coord, get_image_by_idx. Returns ------- map_out : `Map` Map with spatial dimensions only. """ idx = self.geom.pix_to_idx(pix) return self.get_image_by_idx(idx) def get_image_by_idx(self, idx): """Return spatial map at the given axis pixel indices. Parameters ---------- idx : tuple Tuple of scalar indices for each non-spatial dimension of the map. Tuple should be ordered as (I_0, ..., I_n). See Also -------- get_image_by_coord, get_image_by_pix. Returns ------- map_out : `Map` Map with spatial dimensions only. """ if len(idx) != len(self.geom.axes): raise ValueError("Tuple length must equal number of non-spatial dimensions") # Only support scalar indices per axis idx = tuple([int(_.item()) for _ in np.array(idx)]) geom = self.geom.to_image() data = self.data[idx[::-1]] return self.__class__(geom=geom, data=data, unit=self.unit, meta=self.meta) def get_by_coord(self, coords, fill_value=np.nan): """Return map values at the given map coordinates. Parameters ---------- coords : tuple or `~gammapy.maps.MapCoord` Coordinate arrays for each dimension of the map. Tuple should be ordered as (lon, lat, x_0, ..., x_n) where x_i are coordinates for non-spatial dimensions of the map. fill_value : float Value which is returned if the position is outside the projection footprint. Default is `numpy.nan`. Returns ------- vals : `~numpy.ndarray` Values of pixels in the map. `numpy.nan` is used to flag coordinates outside the map. """ pix = self.geom.coord_to_pix(coords=coords) vals = self.get_by_pix(pix, fill_value=fill_value) return vals def get_by_pix(self, pix, fill_value=np.nan): """Return map values at the given pixel coordinates. Parameters ---------- pix : tuple Tuple of pixel index arrays for each dimension of the map. Tuple should be ordered as (I_lon, I_lat, I_0, ..., I_n) for WCS maps and (I_hpx, I_0, ..., I_n) for HEALPix maps. Pixel indices can be either float or integer type. fill_value : float Value which is returned if the position is outside the projection footprint. Default is `numpy.nan`. Returns ------- vals : `~numpy.ndarray` Array of pixel values. `numpy.nan` is used to flag coordinates outside the map. """ # FIXME: Support local indexing here? # FIXME: Support slicing? pix = np.broadcast_arrays(*pix) idx = self.geom.pix_to_idx(pix) vals = self.get_by_idx(idx) mask = self.geom.contains_pix(pix) if not mask.all(): vals = vals.astype(type(fill_value)) vals[~mask] = fill_value return vals @abc.abstractmethod def get_by_idx(self, idx): """Return map values at the given pixel indices. Parameters ---------- idx : tuple Tuple of pixel index arrays for each dimension of the map. Tuple should be ordered as (I_lon, I_lat, I_0, ..., I_n) for WCS maps and (I_hpx, I_0, ..., I_n) for HEALPix maps. Returns ------- vals : `~numpy.ndarray` Array of pixel values. `numpy.nan` is used to flag coordinates outside the map. """ pass @abc.abstractmethod def interp_by_coord(self, coords, method="linear", fill_value=None): """Interpolate map values at the given map coordinates. Parameters ---------- coords : tuple or `~gammapy.maps.MapCoord` Coordinate arrays for each dimension of the map. Tuple should be ordered as (lon, lat, x_0, ..., x_n) where x_i are coordinates for non-spatial dimensions of the map. method : {"linear", "nearest"} Method to interpolate data values. Default is "linear". fill_value : float, optional The value to use for points outside the interpolation domain. If None, values outside the domain are extrapolated. Default is None. 
Returns ------- vals : `~numpy.ndarray` Interpolated pixel values. """ pass @abc.abstractmethod def interp_by_pix(self, pix, method="linear", fill_value=None): """Interpolate map values at the given pixel coordinates. Parameters ---------- pix : tuple Tuple of pixel coordinate arrays for each dimension of the map. Tuple should be ordered as (p_lon, p_lat, p_0, ..., p_n) where p_i are pixel coordinates for non-spatial dimensions of the map. method : {"linear", "nearest"} Method to interpolate data values. Default is "linear". fill_value : float, optional The value to use for points outside the interpolation domain. If None, values outside the domain are extrapolated. Default is None. Returns ------- vals : `~numpy.ndarray` Interpolated pixel values. """ pass def interp_to_geom(self, geom, preserve_counts=False, fill_value=0, **kwargs): """Interpolate map to input geometry. Parameters ---------- geom : `~gammapy.maps.Geom` Target Map geometry. preserve_counts : bool, optional Preserve the integral over each bin. This should be True if the map is an integral quantity (e.g. counts) and False if the map is a differential quantity (e.g. intensity). Default is False. fill_value : float, optional The value to use for points outside the interpolation domain. If None, values outside the domain are extrapolated. Default is 0. **kwargs : dict, optional Keyword arguments passed to `Map.interp_by_coord`. Returns ------- interp_map : `Map` Interpolated Map. """ coords = geom.get_coord() map_copy = self.copy() if preserve_counts: if geom.ndim > 2 and geom.axes[0] != self.geom.axes[0]: raise ValueError( f"Energy axes do not match, expected: \n {self.geom.axes[0]}," f" but got: \n {geom.axes[0]}." ) map_copy.data /= map_copy.geom.solid_angle().to_value("deg2") if map_copy.is_mask and fill_value is not None: # TODO: check this NaN handling is needed data = map_copy.get_by_coord(coords) data = np.nan_to_num(data, nan=fill_value).astype(bool) else: data = map_copy.interp_by_coord(coords, fill_value=fill_value, **kwargs) if map_copy.is_mask: data = data.astype(bool) if preserve_counts: data *= geom.solid_angle().to_value("deg2") return Map.from_geom(geom, data=data, unit=self.unit) def reproject_to_geom(self, geom, preserve_counts=False, precision_factor=10): """Reproject map to input geometry. Parameters ---------- geom : `~gammapy.maps.Geom` Target Map geometry. preserve_counts : bool, optional Preserve the integral over each bin. This should be True if the map is an integral quantity (e.g. counts) and False if the map is a differential quantity (e.g. intensity). Default is False. precision_factor : int, optional Minimal factor between the bin size of the output map and the oversampled base map. Used only for the oversampling method. Default is 10. Returns ------- output_map : `Map` Reprojected Map. """ from .hpx import HpxGeom from .region import RegionGeom axes = [ax.copy() for ax in self.geom.axes] geom3d = geom.copy(axes=axes) if not geom.is_image: if geom.axes.names != geom3d.axes.names: raise ValueError("Axis names and order should be the same.") if geom.axes != geom3d.axes and ( isinstance(geom3d, HpxGeom) or isinstance(self.geom, HpxGeom) ): raise TypeError( "Reprojection to 3d geom with non-identical axes is not supported for HpxGeom. " "Reproject to 2d geom first and then use interp_to_geom method."
) if isinstance(geom3d, RegionGeom): base_factor = ( geom3d.to_wcs_geom().pixel_scales.min() / self.geom.pixel_scales.min() ) elif isinstance(self.geom, RegionGeom): base_factor = ( geom3d.pixel_scales.min() / self.geom.to_wcs_geom().pixel_scales.min() ) else: base_factor = geom3d.pixel_scales.min() / self.geom.pixel_scales.min() if base_factor >= precision_factor: input_map = self else: factor = precision_factor / base_factor if isinstance(self.geom, HpxGeom): factor = int(2 ** np.ceil(np.log(factor) / np.log(2))) else: factor = int(np.ceil(factor)) input_map = self.upsample(factor=factor, preserve_counts=preserve_counts) output_map = input_map.resample(geom3d, preserve_counts=preserve_counts) if not geom.is_image and geom.axes != geom3d.axes: for base_ax, target_ax in zip(geom3d.axes, geom.axes): base_factor = base_ax.bin_width.min() / target_ax.bin_width.min() if not base_factor >= precision_factor: factor = precision_factor / base_factor factor = int(np.ceil(factor)) output_map = output_map.upsample( factor=factor, preserve_counts=preserve_counts, axis_name=base_ax.name, ) output_map = output_map.resample(geom, preserve_counts=preserve_counts) return output_map def reproject_by_image( self, geom, preserve_counts=False, precision_factor=10, ): """Reproject each image of a ND map to input 2d geometry. For large maps this method is faster than `reproject_to_geom`. Parameters ---------- geom : `~gammapy.maps.Geom` Target slice geometry in 2D. preserve_counts : bool, optional Preserve the integral over each bin. This should be True if the map is an integral quantity (e.g. counts) and False if the map is a differential quantity (e.g. intensity). Default is False. precision_factor : int, optional Minimal factor between the bin size of the output map and the oversampled base map. Used only for the oversampling method. Default is 10. Returns ------- output_map : `Map` Reprojected Map. """ if not geom.is_image: raise TypeError("This method is only valid for 2d geom") output_map = Map.from_geom(geom.to_cube(self.geom.axes)) maps = parallel.run_multiprocessing( self._reproject_image, zip( self.iter_by_image(), repeat(geom), repeat(preserve_counts), repeat(precision_factor), ), task_name="Reprojection", ) for idx in np.ndindex(self.geom.shape_axes): output_map.data[idx[0]] = maps[idx[0]].data return output_map @staticmethod def _reproject_image(image, geom, preserve_counts, precision_factor): return image.reproject_to_geom( geom, precision_factor=precision_factor, preserve_counts=preserve_counts ) def fill_events(self, events, weights=None): """Fill the map from an `~gammapy.data.EventList` object. Parameters ---------- events : `~gammapy.data.EventList` Events to fill in the map with. weights : `~numpy.ndarray`, optional Weights vector. The weights vector must be of the same length as the events column length. If None, weights are set to 1. Default is None. """ self.fill_by_coord(events.map_coord(self.geom), weights=weights) def fill_by_coord(self, coords, weights=None): """Fill pixels at ``coords`` with given ``weights``. Parameters ---------- coords : tuple or `~gammapy.maps.MapCoord` Coordinate arrays for each dimension of the map. Tuple should be ordered as (lon, lat, x_0, ..., x_n) where x_i are coordinates for non-spatial dimensions of the map. weights : `~numpy.ndarray`, optional Weights vector. If None, weights are set to 1. Default is None. 
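Examples
--------
A minimal sketch of filling two events by coordinate; the axis
binning and the coordinate values here are illustrative, not taken
from a specific dataset::

    import numpy as np
    import astropy.units as u
    from gammapy.maps import Map, MapAxis

    # WCS map with one non-spatial (energy) axis
    energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
    m = Map.create(binsz=0.1, width=5.0, axes=[energy_axis])

    # Coordinates are ordered (lon, lat, energy); weights default to 1
    lon = np.array([0.5, -0.5])
    lat = np.array([0.0, 0.2])
    energy = np.array([2.0, 5.0]) * u.TeV
    m.fill_by_coord((lon, lat, energy))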
""" idx = self.geom.coord_to_idx(coords) self.fill_by_idx(idx, weights=weights) def fill_by_pix(self, pix, weights=None): """Fill pixels at ``pix`` with given ``weights``. Parameters ---------- pix : tuple Tuple of pixel index arrays for each dimension of the map. Tuple should be ordered as (I_lon, I_lat, I_0, ..., I_n) for WCS maps and (I_hpx, I_0, ..., I_n) for HEALPix maps. Pixel indices can be either float or integer type. Float indices will be rounded to the nearest integer. weights : `~numpy.ndarray`, optional Weights vector. If None, weights are set to 1. Default is None. """ idx = pix_tuple_to_idx(pix) return self.fill_by_idx(idx, weights=weights) @abc.abstractmethod def fill_by_idx(self, idx, weights=None): """Fill pixels at ``idx`` with given ``weights``. Parameters ---------- idx : tuple Tuple of pixel index arrays for each dimension of the map. Tuple should be ordered as (I_lon, I_lat, I_0, ..., I_n) for WCS maps and (I_hpx, I_0, ..., I_n) for HEALPix maps. weights : `~numpy.ndarray`, optional Weights vector. If None, weights are set to 1. Default is None. """ pass def set_by_coord(self, coords, vals): """Set pixels at ``coords`` with given ``vals``. Parameters ---------- coords : tuple or `~gammapy.maps.MapCoord` Coordinate arrays for each dimension of the map. Tuple should be ordered as (lon, lat, x_0, ..., x_n) where x_i are coordinates for non-spatial dimensions of the map. vals : `~numpy.ndarray` Values vector. """ idx = self.geom.coord_to_pix(coords) self.set_by_pix(idx, vals) def set_by_pix(self, pix, vals): """Set pixels at ``pix`` with given ``vals``. Parameters ---------- pix : tuple Tuple of pixel index arrays for each dimension of the map. Tuple should be ordered as (I_lon, I_lat, I_0, ..., I_n) for WCS maps and (I_hpx, I_0, ..., I_n) for HEALPix maps. Pixel indices can be either float or integer type. Float indices will be rounded to the nearest integer. vals : `~numpy.ndarray` Values vector. """ idx = pix_tuple_to_idx(pix) return self.set_by_idx(idx, vals) @abc.abstractmethod def set_by_idx(self, idx, vals): """Set pixels at ``idx`` with given ``vals``. Parameters ---------- idx : tuple Tuple of pixel index arrays for each dimension of the map. Tuple should be ordered as (I_lon, I_lat, I_0, ..., I_n) for WCS maps and (I_hpx, I_0, ..., I_n) for HEALPix maps. vals : `~numpy.ndarray` Values vector. """ pass def plot_grid(self, figsize=None, ncols=3, **kwargs): """Plot map as a grid of subplots for non-spatial axes. Parameters ---------- figsize : tuple of int, optional Figsize to plot on. Default is None. ncols : int, optional Number of columns to plot. Default is 3. **kwargs : dict, optional Keyword arguments passed to `WcsNDMap.plot`. Returns ------- axes : `~numpy.ndarray` of `~matplotlib.pyplot.Axes` Axes grid. 
""" if len(self.geom.axes) > 1: raise ValueError("Grid plotting is only supported for one non spatial axis") axis = self.geom.axes[0] cols = min(ncols, axis.nbin) rows = 1 + (axis.nbin - 1) // cols if figsize is None: width = 12 figsize = (width, width * rows / cols) if self.geom.is_hpx: wcs = self.geom.to_wcs_geom().wcs else: wcs = self.geom.wcs fig, axes = plt.subplots( ncols=cols, nrows=rows, subplot_kw={"projection": wcs}, figsize=figsize, gridspec_kw={"hspace": 0.1, "wspace": 0.1}, ) for idx in range(cols * rows): ax = axes.flat[idx] try: image = self.get_image_by_idx((idx,)) except IndexError: ax.set_visible(False) continue if image.geom.is_hpx: image_wcs = image.to_wcs( normalize=False, proj="AIT", oversample=2, ) else: image_wcs = image image_wcs.plot(ax=ax, **kwargs) if axis.node_type == "center": if axis.name == "energy" or axis.name == "energy_true": info = energy_unit_format(axis.center[idx]) else: info = f"{axis.center[idx]:.1f}" elif axis.node_type == "label": info = f"{axis.center[idx]}" else: if axis.name == "energy" or axis.name == "energy_true": info = ( f"{energy_unit_format(axis.edges[idx])} - " f"{energy_unit_format(axis.edges[idx+1])}" ) else: info = f"{axis.edges_min[idx]:.1f} - {axis.edges_max[idx]:.1f} " ax.set_title(f"{axis.name.capitalize()} " + info) lon, lat = ax.coords[0], ax.coords[1] lon.set_ticks_position("b") lat.set_ticks_position("l") row, col = np.unravel_index(idx, shape=(rows, cols)) if col > 0: lat.set_ticklabel_visible(False) lat.set_axislabel("") if row < (rows - 1): lon.set_ticklabel_visible(False) lon.set_axislabel("") return axes def plot_interactive(self, rc_params=None, **kwargs): """ Plot map with interactive widgets to explore the non-spatial axes. Parameters ---------- rc_params : dict, optional Passed to ``matplotlib.rc_context(rc=rc_params)`` to style the plot. Default is None. **kwargs : dict, optional Keyword arguments passed to `WcsNDMap.plot`. Examples -------- You can try this out e.g. 
using a Fermi-LAT diffuse model cube with an energy axis:: from gammapy.maps import Map m = Map.read("$GAMMAPY_DATA/fermi_3fhl/gll_iem_v06_cutout.fits") m.plot_interactive(add_cbar=True, stretch="sqrt") If you would like to adjust the figure size you can use the ``rc_params`` argument:: rc_params = {'figure.figsize': (12, 6), 'font.size': 12} m.plot_interactive(rc_params=rc_params) """ import matplotlib as mpl from ipywidgets import RadioButtons, SelectionSlider from ipywidgets.widgets.interaction import fixed, interact if self.geom.is_image: raise TypeError("Use .plot() for 2D Maps") kwargs.setdefault("interpolation", "nearest") kwargs.setdefault("origin", "lower") kwargs.setdefault("cmap", "afmhot") rc_params = rc_params or {} stretch = kwargs.pop("stretch", "sqrt") interact_kwargs = {} for axis in self.geom.axes: if axis.node_type == "center": if axis.name == "energy" or axis.name == "energy_true": options = energy_unit_format(axis.center) else: options = axis.as_plot_labels elif axis.name == "energy" or axis.name == "energy_true": E = energy_unit_format(axis.edges) options = [f"{E[i]} - {E[i+1]}" for i in range(len(E) - 1)] else: options = axis.as_plot_labels interact_kwargs[axis.name] = SelectionSlider( options=options, description=f"Select {axis.name}:", continuous_update=False, style={"description_width": "initial"}, layout={"width": "50%"}, ) interact_kwargs[axis.name + "_options"] = fixed(options) interact_kwargs["stretch"] = RadioButtons( options=["linear", "sqrt", "log"], value=stretch, description="Select stretch:", style={"description_width": "initial"}, ) @interact(**interact_kwargs) def _plot_interactive(**ikwargs): idx = [ ikwargs[ax.name + "_options"].index(ikwargs[ax.name]) for ax in self.geom.axes ] img = self.get_image_by_idx(idx) stretch = ikwargs["stretch"] with mpl.rc_context(rc=rc_params): img.plot(stretch=stretch, **kwargs) plt.show() def copy(self, **kwargs): """Copy map instance and overwrite given attributes, except for geometry. Parameters ---------- **kwargs : dict, optional Keyword arguments to overwrite in the map constructor. Returns ------- copy : `Map` Copied Map. """ if "geom" in kwargs: geom = kwargs["geom"] if not geom.data_shape == self.geom.data_shape: raise ValueError( "Can't copy and change data size of the map. " f" Current shape {self.geom.data_shape}," f" requested shape {geom.data_shape}" ) return self._init_copy(**kwargs) def mask_nearest_position(self, position): """Given a sky coordinate return nearest valid position in the mask. If the mask contains additional axes, the mask is reduced over those axes. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Test position. Returns ------- position : `~astropy.coordinates.SkyCoord` The nearest position in the mask. """ if not self.geom.is_image: raise ValueError("Method only supported for 2D images") coords = self.geom.to_image().get_coord().skycoord separation = coords.separation(position) separation[~self.data] = np.inf idx = np.argmin(separation) return coords.flatten()[idx] def sum_over_axes(self, axes_names=None, keepdims=True, weights=None): """Sum map values over all non-spatial axes. Parameters ---------- axes_names: list of str Names of the axis to reduce over. If None, all non-spatial axis will be summed over. Default is None. keepdims : bool, optional If this is set to true, the axes which are summed over are left in the map with a single bin. Default is True. weights : `Map`, optional Weights to be applied. The map should have the same geometry as this one. 
Default is None. Returns ------- map_out : `~Map` Map with non-spatial axes summed over. """ return self.reduce_over_axes( func=np.add, axes_names=axes_names, keepdims=keepdims, weights=weights ) def reduce_over_axes( self, func=np.add, keepdims=False, axes_names=None, weights=None ): """Reduce map over non-spatial axes. Parameters ---------- func : `~numpy.ufunc`, optional Function to use for reducing the data. Default is `numpy.add`. keepdims : bool, optional If this is set to true, the axes which are summed over are left in the map with a single bin. Default is False. axes_names: list, optional Names of axis to reduce over. If None, all non-spatial axis will be reduced. weights : `Map`, optional Weights to be applied. The map should have the same geometry as this one. Default is None. Returns ------- map_out : `~Map` Map with non-spatial axes reduced. """ if axes_names is None: axes_names = self.geom.axes.names map_out = self.copy() for axis_name in axes_names: map_out = map_out.reduce( axis_name, func=func, keepdims=keepdims, weights=weights ) return map_out def reduce(self, axis_name, func=np.add, keepdims=False, weights=None): """Reduce map over a single non-spatial axis. Parameters ---------- axis_name: str The name of the axis to reduce over. func : `~numpy.ufunc`, optional Function to use for reducing the data. Default is `numpy.add`. keepdims : bool, optional If this is set to true, the axes which are summed over are left in the map with a single bin. Default is False. weights : `Map` Weights to be applied. The map should have the same geometry as this one. Default is None. Returns ------- map_out : `~Map` Map with the given non-spatial axes reduced. """ if keepdims: geom = self.geom.squash(axis_name=axis_name) else: geom = self.geom.drop(axis_name=axis_name) idx = self.geom.axes.index_data(axis_name) data = self.data if weights is not None: data = data * weights data = func.reduce(data, axis=idx, keepdims=keepdims, where=~np.isnan(data)) return self._init_copy(geom=geom, data=data) def cumsum(self, axis_name): """Compute cumulative sum along a given axis. Parameters ---------- axis_name : str Along which axis to sum. Returns ------- cumsum : `Map` Map with cumulative sum. """ axis = self.geom.axes[axis_name] axis_idx = self.geom.axes.index_data(axis_name) # TODO: the broadcasting should be done by axis.center, axis.bin_width etc. shape = [1] * len(self.geom.data_shape) shape[axis_idx] = -1 values = self.quantity * axis.bin_width.reshape(shape) if axis_name == "rad": # take Jacobian into account values = 2 * np.pi * axis.center.reshape(shape) * values data = np.insert(values.cumsum(axis=axis_idx), 0, 0, axis=axis_idx) axis_shifted = MapAxis.from_nodes( axis.edges, name=axis.name, interp=axis.interp ) axes = self.geom.axes.replace(axis_shifted) geom = self.geom.to_image().to_cube(axes) return self.__class__(geom=geom, data=data.value, unit=data.unit) def integral(self, axis_name, coords, **kwargs): """Compute integral along a given axis. This method uses interpolation of the cumulative sum. Parameters ---------- axis_name : str Along which axis to integrate. coords : dict or `MapCoord` Map coordinates. **kwargs : dict, optional Keyword arguments passed to `Map.interp_by_coord`. Returns ------- array : `~astropy.units.Quantity` 2D array with axes offset. 
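Examples
--------
A minimal sketch for a map ``m`` with an ``energy`` axis; the
coordinate values are illustrative. Each returned value is the
integral from the lower edge of the axis up to the given energy::

    import astropy.units as u
    from astropy.coordinates import SkyCoord

    coords = {
        "skycoord": SkyCoord(0, 0, unit="deg", frame="galactic"),
        "energy": [1, 10] * u.TeV,
    }
    result = m.integral(axis_name="energy", coords=coords)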
""" cumsum = self.cumsum(axis_name=axis_name) cumsum = cumsum.pad(pad_width=1, axis_name=axis_name, mode="edge") return u.Quantity( cumsum.interp_by_coord(coords, **kwargs), cumsum.unit, copy=COPY_IF_NEEDED ) def normalize(self, axis_name=None): """Normalise data in place along a given axis. Parameters ---------- axis_name : str, optional Along which axis to normalise. """ cumsum = self.cumsum(axis_name=axis_name).quantity with np.errstate(invalid="ignore", divide="ignore"): axis = self.geom.axes.index_data(axis_name=axis_name) normed = self.quantity / cumsum.max(axis=axis, keepdims=True) self.quantity = np.nan_to_num(normed) @classmethod def from_stack(cls, maps, axis=None, axis_name=None): """Create Map from a list of images and a non-spatial axis. The image geometries must be aligned, except for the axis that is stacked. Parameters ---------- maps : list of `Map` objects List of maps. axis : `MapAxis`, optional If a `MapAxis` is provided the maps are stacked along the last data axis and the new axis is introduced. Default is None. axis_name : str, optional If an axis name is as string the given the maps are stacked along the given axis name. Returns ------- map : `Map` Map with additional non-spatial axis. """ geom = maps[0].geom if axis_name is None and axis is None: axis_name = geom.axes.names[-1] if axis_name: axis = MapAxis.from_stack(axes=[m.geom.axes[axis_name] for m in maps]) geom = geom.drop(axis_name=axis_name) data = [] for m in maps: if axis_name: m_geom = m.geom.drop(axis_name=axis_name) else: m_geom = m.geom if not m_geom == geom: raise ValueError(f"Image geometries not aligned: {m.geom} and {geom}") data.append(m.quantity.to_value(maps[0].unit)) new_geom = geom.to_cube(axes=[axis]) data = np.concatenate(data).reshape(new_geom.data_shape) return cls.from_geom(data=data, geom=new_geom, unit=maps[0].unit) def split_by_axis(self, axis_name): """Split a Map along an axis into multiple maps. Parameters ---------- axis_name : str Name of the axis to split. Returns ------- maps : list A list of `~gammapy.maps.Map`. """ maps = [] axis = self.geom.axes[axis_name] for idx in range(axis.nbin): m = self.slice_by_idx({axis_name: idx}) maps.append(m) return maps def to_cube(self, axes): """Append non-spatial axes to create a higher-dimensional Map. This will result in a Map with a new geometry with N+M dimensions where N is the number of current dimensions and M is the number of axes in the list. The data is reshaped onto this new geometry. Parameters ---------- axes : list Axes that will be appended to this Map. The axes should have only one bin. Returns ------- map : `~gammapy.maps.WcsNDMap` New map. """ for ax in axes: if ax.nbin > 1: raise ValueError(ax.name, "should have only one bin") geom = self.geom.to_cube(axes) data = self.data.reshape((1,) * len(axes) + self.data.shape) return self.from_geom(data=data, geom=geom, unit=self.unit) def get_spectrum(self, region=None, func=np.nansum, weights=None): """Extract spectrum in a given region. The spectrum can be computed by summing (or, more generally, applying ``func``) along the spatial axes in each energy bin. This occurs only inside the ``region``, which by default is assumed to be the whole spatial extension of the map. Parameters ---------- region: `~regions.Region`, optional Region to extract the spectrum from. Pixel or sky regions are accepted. Default is None. func : `numpy.func`, optional Function to reduce the data. Default is `~numpy.nansum`. For a boolean Map, use `numpy.any` or `numpy.all`. Default is `numpy.nansum`. 
weights : `WcsNDMap`, optional Array to be used as weights. The geometry must be equivalent. Default is None. Returns ------- spectrum : `~gammapy.maps.RegionNDMap` Spectrum in the given region. """ if not self.geom.has_energy_axis: raise ValueError("Energy axis required") return self.to_region_nd_map(region=region, func=func, weights=weights) def to_unit(self, unit): """Convert map to a different unit. Parameters ---------- unit : str or `~astropy.units.Unit` New unit. Returns ------- map : `Map` Map with new unit and converted data. """ data = self.quantity.to_value(unit) return self.from_geom(self.geom, data=data, unit=unit) def is_allclose(self, other, rtol_axes=1e-3, atol_axes=1e-6, **kwargs): """Compare two Maps for close equivalency. Parameters ---------- other : `gammapy.maps.Map` The Map to compare against. rtol_axes : float, optional Relative tolerance for the axes' comparison. Default is 1e-3. atol_axes : float, optional Absolute tolerance for the axes' comparison. Default is 1e-6. **kwargs : dict, optional Keywords passed to `~numpy.allclose`. Returns ------- is_allclose : bool Whether the two maps are all close. """ if not isinstance(other, self.__class__): raise TypeError(f"Cannot compare {type(self)} and {type(other)}") if self.data.shape != other.data.shape: return False axes_eq = self.geom.axes.is_allclose( other.geom.axes, rtol=rtol_axes, atol=atol_axes ) data_eq = np.allclose(self.quantity, other.quantity, **kwargs) return axes_eq and data_eq def __str__(self): geom = self.geom.__class__.__name__ axes = ["skycoord"] if self.geom.is_hpx else ["lon", "lat"] axes = axes + [_.name for _ in self.geom.axes] return ( f"{self.__class__.__name__}\n\n" f"\tgeom : {geom} \n " f"\taxes : {axes}\n" f"\tshape : {self.geom.data_shape[::-1]}\n" f"\tndim : {self.geom.ndim}\n" f"\tunit : {self.unit}\n" f"\tdtype : {self.data.dtype}\n" ) def _arithmetics(self, operator, other, copy): """Perform arithmetic on maps after checking geometry consistency.""" if isinstance(other, Map): if self.geom == other.geom: q = other.quantity else: raise ValueError("Map Arithmetic: Inconsistent geometries.") else: q = u.Quantity(other, copy=COPY_IF_NEEDED) out = self.copy() if copy else self out.quantity = operator(out.quantity, q) return out def _boolean_arithmetics(self, operator, other, copy): """Perform boolean arithmetic on maps after checking geometry consistency.""" if operator == np.logical_not: out = self.copy() out.data = operator(out.data) return out if isinstance(other, Map): if self.geom == other.geom: other = other.data else: raise ValueError("Map Arithmetic: Inconsistent geometries.") out = self.copy() if copy else self out.data = operator(out.data, other) return out def __add__(self, other): return self._arithmetics(np.add, other, copy=True) def __iadd__(self, other): return self._arithmetics(np.add, other, copy=COPY_IF_NEEDED) def __sub__(self, other): return self._arithmetics(np.subtract, other, copy=True) def __isub__(self, other): return self._arithmetics(np.subtract, other, copy=COPY_IF_NEEDED) def __mul__(self, other): return self._arithmetics(np.multiply, other, copy=True) def __imul__(self, other): return self._arithmetics(np.multiply, other, copy=COPY_IF_NEEDED) def __truediv__(self, other): return self._arithmetics(np.true_divide, other, copy=True) def __itruediv__(self, other): return self._arithmetics(np.true_divide, other, copy=COPY_IF_NEEDED) def __le__(self, other): return self._arithmetics(np.less_equal, other, copy=True) def __lt__(self, other): return self._arithmetics(np.less, other,
copy=True) def __ge__(self, other): return self._arithmetics(np.greater_equal, other, copy=True) def __gt__(self, other): return self._arithmetics(np.greater, other, copy=True) def __eq__(self, other): return self._arithmetics(np.equal, other, copy=True) def __ne__(self, other): return self._arithmetics(np.not_equal, other, copy=True) def __and__(self, other): return self._boolean_arithmetics(np.logical_and, other, copy=True) def __or__(self, other): return self._boolean_arithmetics(np.logical_or, other, copy=True) def __invert__(self): return self._boolean_arithmetics(np.logical_not, None, copy=True) def __xor__(self, other): return self._boolean_arithmetics(np.logical_xor, other, copy=True) def __iand__(self, other): return self._boolean_arithmetics(np.logical_and, other, copy=COPY_IF_NEEDED) def __ior__(self, other): return self._boolean_arithmetics(np.logical_or, other, copy=COPY_IF_NEEDED) def __ixor__(self, other): return self._boolean_arithmetics(np.logical_xor, other, copy=COPY_IF_NEEDED) def __array__(self): return self.data def sample_coord(self, n_events, random_state=0): """Sample position and energy of events. Parameters ---------- n_events : int Number of events to sample. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is 0. Returns ------- coords : `~gammapy.maps.MapCoord` object. Sequence of coordinates and energies of the sampled events. """ random_state = get_random_state(random_state) sampler = InverseCDFSampler(pdf=self.data, random_state=random_state) coords_pix = sampler.sample(n_events) coords = self.geom.pix_to_coord(coords_pix[::-1]) # TODO: pix_to_coord should return a MapCoord object cdict = OrderedDict(zip(self.geom.axes_names, coords)) return MapCoord.create(cdict, frame=self.geom.frame) def reorder_axes(self, axes_names): """Return a new map re-ordering the non-spatial axes. Parameters ---------- axes_names : list of str The list of axes names in the required order. Returns ------- map : `~gammapy.maps.Map` The map with axes re-ordered. """ old_axes = self.geom.axes if not set(old_axes.names) == set(axes_names): raise ValueError(f"{old_axes.names} is not compatible with {axes_names}") new_axes = [old_axes[_] for _ in axes_names] new_geom = self.geom.to_image().to_cube(new_axes) old_indices = [old_axes.index_data(ax) for ax in axes_names] new_indices = [new_geom.axes.index_data(ax) for ax in axes_names] data = np.moveaxis(self.data, old_indices, new_indices) return Map.from_geom(new_geom, data=data) def dot(self, other): """Apply dot product with the input map. The input Map has to share a single MapAxis with the current Map. Because it has no spatial dimension, it must be a `~gammapy.maps.RegionNDMap`. Parameters ---------- other : `~gammapy.maps.RegionNDMap` Map to apply the dot product to. It must share a unique non-spatial MapAxis with the current Map. Returns ------- map : `~gammapy.maps.Map` Map with dot product applied. """ from .region import RegionNDMap if not isinstance(other, RegionNDMap): raise TypeError( f"Dot product can be applied to a RegionNDMap. Got {type(other)} instead." ) common_names = list( set(other.geom.axes.names).intersection(self.geom.axes.names) ) if len(common_names) == 0: raise ValueError( "Map geometries have no axis in common. Cannot apply dot product." ) elif len(common_names) > 1: raise ValueError( f"Map geometries have more than one axis in common: {common_names}." 
"Cannot apply dot product." ) axis_name = common_names[0] if self.geom.axes[axis_name] != other.geom.axes[axis_name]: raise ValueError( f"Axes {axis_name} are not equal. Cannot apply dot product." ) loc = self.geom.axes.index_data(axis_name) other_loc = other.geom.axes.index_data(axis_name) # move axes because numpy dot product is performed on last axis of a and second-to-last axis of b data = np.moveaxis(self.data, loc, -1) if len(other.geom.axes) > 1: other_data = np.moveaxis(other.data[..., 0, 0], other_loc, -2) else: other_data = other.data[..., 0, 0] data = np.dot(data, other_data) # prepare new axes with expected shape (i.e. common axis replaced by other's axes) index = self.geom.axes.index(axis_name) axes1 = self.geom.axes.drop(axis_name) inserted_axes = other.geom.axes.drop(axis_name) new_axes = axes1[:index] + inserted_axes + axes1[index:] # reorder axes to get the expected shape remaining_axes = np.arange(len(inserted_axes)) old_axes_pos = -1 - remaining_axes new_axes_pos = loc + remaining_axes[::-1] data = np.moveaxis(data, old_axes_pos, new_axes_pos) geom = self.geom.to_image().to_cube(new_axes) return self._init_copy(geom=geom, data=data) def __matmul__(self, other): """Apply dot product with the input map. The input Map has to share a single MapAxis with the current Map. Because it has no spatial dimension, it must be a `~gammapy.maps.RegionNDMap`. """ return self.dot(other) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/geom.py0000644000175100001770000004467514721316200016325 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import copy import html import inspect import logging import numpy as np from astropy import units as u from astropy.io import fits from .io import find_bands_hdu, find_hdu from .utils import INVALID_INDEX __all__ = ["Geom"] log = logging.getLogger(__name__) def get_shape(param): if param is None: return tuple() if not isinstance(param, tuple): param = [param] return max([np.array(p, ndmin=1).shape for p in param]) def pix_tuple_to_idx(pix): """Convert a tuple of pixel coordinate arrays to a tuple of pixel indices. Pixel coordinates are rounded to the closest integer value. Parameters ---------- pix : tuple Tuple of pixel coordinates with one element for each dimension. Returns ------- idx : `~numpy.ndarray` Array of pixel indices. """ idx = [] for p in pix: p = np.array(p, ndmin=1) if np.issubdtype(p.dtype, np.integer): idx += [p] else: with np.errstate(invalid="ignore"): p_idx = np.floor(p + 0.5).astype(int) p_idx[~np.isfinite(p)] = INVALID_INDEX.int idx += [p_idx] return tuple(idx) class Geom(abc.ABC): """Map geometry base class. See also: `~gammapy.maps.WcsGeom` and `~gammapy.maps.HpxGeom`. """ # workaround for the lru_cache pickle issue # see e.g. https://github.com/cloudpipe/cloudpickle/issues/178 def __getstate__(self): state = self.__dict__.copy() for key, value in state.items(): func = getattr(value, "__wrapped__", None) if func is not None: state[key] = func return state def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property @abc.abstractmethod def data_shape(self): """Shape of the Numpy data array matching this geometry.""" pass def data_nbytes(self, dtype="float32"): """Estimate memory usage in megabytes of the Numpy data array matching this geometry depending on the given type. Parameters ---------- dtype : str, optional The desired data-type for the array. Default is "float32". Returns ------- memory : `~astropy.units.Quantity` Estimated memory usage in megabytes (MB). """ return (np.empty(self.data_shape, dtype).nbytes * u.byte).to("MB") @property @abc.abstractmethod def is_allsky(self): pass @property @abc.abstractmethod def center_coord(self): pass @property @abc.abstractmethod def center_pix(self): pass @property @abc.abstractmethod def center_skydir(self): pass @classmethod def from_hdulist(cls, hdulist, hdu=None, hdu_bands=None): """Load a geometry object from a FITS HDUList. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` HDU list containing HDUs for map data and bands. hdu : str or int, optional Name or index of the HDU with the map data. Default is None. hdu_bands : str, optional Name or index of the HDU with the BANDS table. If not defined this will be inferred from the FITS header of the map HDU. Default is None. Returns ------- geom : `~Geom` Geometry object. """ if hdu is None: hdu = find_hdu(hdulist) else: hdu = hdulist[hdu] if hdu_bands is None: hdu_bands = find_bands_hdu(hdulist, hdu) if hdu_bands is not None: hdu_bands = hdulist[hdu_bands] return cls.from_header(hdu.header, hdu_bands) def to_bands_hdu(self, hdu_bands=None, format="gadf"): table_hdu = self.axes.to_table_hdu(format=format, hdu_bands=hdu_bands) cols = table_hdu.columns.columns cols.extend(self._make_bands_cols()) return fits.BinTableHDU.from_columns( cols, header=table_hdu.header, name=table_hdu.name ) @abc.abstractmethod def _make_bands_cols(self): pass @abc.abstractmethod def get_idx(self, idx=None, local=False, flat=False): """Get tuple of pixel indices for this geometry. Returns all pixels in the geometry by default. Pixel indices for a single image plane can be accessed by setting ``idx`` to the index tuple of a plane. Parameters ---------- idx : tuple, optional A tuple of indices with one index for each non-spatial dimension. If defined only pixels for the image plane with this index will be returned. If none then all pixels will be returned. Default is None. local : bool, optional Flag to return local or global pixel indices. Local indices run from 0 to the number of pixels in a given image plane. Default is False. flat : bool, optional Return a flattened array containing only indices for pixels contained in the geometry. Default is False. Returns ------- idx : tuple Tuple of pixel index vectors with one vector for each dimension. """ pass @abc.abstractmethod def get_coord(self, idx=None, flat=False): """Get the coordinate array for this geometry. Returns a coordinate array with the same shape as the data array. Pixels outside the geometry are set to NaN. Coordinates for a single image plane can be accessed by setting ``idx`` to the index tuple of a plane. Parameters ---------- idx : tuple, optional A tuple of indices with one index for each non-spatial dimension. If defined only coordinates for the image plane with this index will be returned. If none then coordinates for all pixels will be returned. Default is None. flat : bool, optional Return a flattened array containing only coordinates for pixels contained in the geometry. Default is False. 
Returns ------- coords : tuple Tuple of coordinate vectors with one vector for each dimension. """ pass @abc.abstractmethod def coord_to_pix(self, coords): """Convert map coordinates to pixel coordinates. Parameters ---------- coords : tuple Coordinate values in each dimension of the map. This can either be a tuple of numpy arrays or a MapCoord object. If passed as a tuple then the ordering should be (longitude, latitude, c_0, ..., c_N) where c_i is the coordinate vector for axis i. Returns ------- pix : tuple Tuple of pixel coordinates in image and band dimensions. """ pass def coord_to_idx(self, coords, clip=False): """Convert map coordinates to pixel indices. Parameters ---------- coords : tuple or `~MapCoord` Coordinate values in each dimension of the map. This can either be a tuple of numpy arrays or a MapCoord object. If passed as a tuple then the ordering should be (longitude, latitude, c_0, ..., c_N) where c_i is the coordinate vector for axis i. clip : bool Choose whether to clip indices to the valid range of the geometry. If False then indices for coordinates outside the geometry range will be set -1. Default is False. Returns ------- pix : tuple Tuple of pixel indices in image and band dimensions. Elements set to -1 correspond to coordinates outside the map. """ pix = self.coord_to_pix(coords) return self.pix_to_idx(pix, clip=clip) @abc.abstractmethod def pix_to_coord(self, pix): """Convert pixel coordinates to map coordinates. Parameters ---------- pix : tuple Tuple of pixel coordinates. Returns ------- coords : tuple Tuple of map coordinates. """ pass @abc.abstractmethod def pix_to_idx(self, pix, clip=False): """Convert pixel coordinates to pixel indices. Returns -1 for pixel coordinates that lie outside the map. Parameters ---------- pix : tuple Tuple of pixel coordinates. clip : bool Choose whether to clip indices to the valid range of the geometry. If False then indices for coordinates outside the geometry range will be set -1. Default is False. Returns ------- idx : tuple Tuple of pixel indices. """ pass @abc.abstractmethod def contains(self, coords): """Check if a given map coordinate is contained in the geometry. Parameters ---------- coords : tuple or `~gammapy.maps.MapCoord` Tuple of map coordinates. Returns ------- containment : `~numpy.ndarray` Bool array. """ pass def contains_pix(self, pix): """Check if a given pixel coordinate is contained in the geometry. Parameters ---------- pix : tuple Tuple of pixel coordinates. Returns ------- containment : `~numpy.ndarray` Bool array. """ idx = self.pix_to_idx(pix) return np.all(np.stack([t != INVALID_INDEX.int for t in idx]), axis=0) def slice_by_idx(self, slices): """Create a new geometry by slicing the non-spatial axes. Parameters ---------- slices : dict Dictionary of axes names and integers or `slice` object pairs. Contains one element for each non-spatial dimension. For integer indexing the corresponding axes is dropped from the map. Axes not specified in the dict are kept unchanged. Returns ------- geom : `~Geom` Sliced geometry. Examples -------- >>> from gammapy.maps import MapAxis, WcsGeom >>> import astropy.units as u >>> energy_axis = MapAxis.from_energy_bounds(1*u.TeV, 3*u.TeV, 6) >>> geom = WcsGeom.create(skydir=(83.63, 22.01), axes=[energy_axis], width=5, binsz=0.02) >>> slices = {"energy": slice(0, 2)} >>> sliced_geom = geom.slice_by_idx(slices) """ axes = self.axes.slice_by_idx(slices) return self._init_copy(axes=axes) def rename_axes(self, names, new_names): """Rename axes contained in the geometry. 
Parameters ---------- names : list or str Names of the axes. new_names : list or str New names of the axes. The list must be of same length than `names`. Returns ------- geom : `~Geom` Renamed geometry. """ axes = self.axes.rename_axes(names=names, new_names=new_names) return self._init_copy(axes=axes) @property def as_energy_true(self): """If the geom contains an axis named 'energy' rename it to 'energy_true'.""" return self.rename_axes(names="energy", new_names="energy_true") @property def has_energy_axis(self): """Whether geom has an energy axis (either 'energy' or 'energy_true').""" return ("energy" in self.axes.names) ^ ("energy_true" in self.axes.names) @abc.abstractmethod def to_image(self): """Create a 2D image geometry (drop non-spatial dimensions). Returns ------- geom : `~Geom` Image geometry. """ pass @abc.abstractmethod def to_cube(self, axes): """Append non-spatial axes to create a higher-dimensional geometry. This will result in a new geometry with N+M dimensions where N is the number of current dimensions and M is the number of axes in the list. Parameters ---------- axes : list Axes that will be appended to this geometry. Returns ------- geom : `~Geom` Map geometry. """ pass def squash(self, axis_name): """Squash geom axis. Parameters ---------- axis_name : str Axis to squash. Returns ------- geom : `Geom` Geom with squashed axis. """ axes = self.axes.squash(axis_name=axis_name) return self.to_image().to_cube(axes=axes) def drop(self, axis_name): """Drop an axis from the geom. Parameters ---------- axis_name : str Name of the axis to remove. Returns ------- geom : `Geom` New geom with the axis removed. """ axes = self.axes.drop(axis_name=axis_name) return self.to_image().to_cube(axes=axes) def pad(self, pad_width, axis_name): """ Pad the geometry at the edges. Parameters ---------- pad_width : {sequence, array_like, int} Number of values padded to the edges of each axis. axis_name : str Name of the axis to pad. Returns ------- geom : `~Geom` Padded geometry. """ if axis_name is None: return self._pad_spatial(pad_width) else: axes = self.axes.pad(axis_name=axis_name, pad_width=pad_width) return self.to_image().to_cube(axes) @abc.abstractmethod def _pad_spatial(self, pad_width): pass @abc.abstractmethod def crop(self, crop_width): """ Crop the geometry at the edges. Parameters ---------- crop_width : {sequence, array_like, int} Number of values cropped from the edges of each axis. Returns ------- geom : `~Geom` Cropped geometry. """ pass @abc.abstractmethod def downsample(self, factor, axis_name): """Downsample the spatial dimension of the geometry by a given factor. Parameters ---------- factor : int Downsampling factor. axis_name : str Axis to downsample. Returns ------- geom : `~Geom` Downsampled geometry. """ pass @abc.abstractmethod def upsample(self, factor, axis_name=None): """Upsample the spatial dimension of the geometry by a given factor. Parameters ---------- factor : int Upsampling factor. axis_name : str Axis to upsample. Returns ------- geom : `~Geom` Upsampled geometry. """ pass def resample_axis(self, axis): """Resample geom to a new axis binning. This method groups the existing bins into a new binning. Parameters ---------- axis : `MapAxis` New map axis. Returns ------- map : `Geom` Geom with resampled axis. """ axes = self.axes.resample(axis=axis) return self._init_copy(axes=axes) def replace_axis(self, axis): """Replace axis with a new one. Parameters ---------- axis : `MapAxis` New map axis. Returns ------- map : `Geom` Geom with replaced axis. 
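Examples
--------
A small sketch: swap the energy axis of a geometry for a finer
binning with the same name and limits (the binning values are
illustrative)::

    import astropy.units as u
    from gammapy.maps import MapAxis, WcsGeom

    energy_axis = MapAxis.from_energy_bounds(1 * u.TeV, 10 * u.TeV, nbin=3)
    geom = WcsGeom.create(width=5, binsz=0.1, axes=[energy_axis])

    # Replacement axis must carry the same name ("energy" by default)
    energy_axis_fine = MapAxis.from_energy_bounds(1 * u.TeV, 10 * u.TeV, nbin=12)
    geom_fine = geom.replace_axis(energy_axis_fine)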
""" axes = self.axes.replace(axis=axis) return self._init_copy(axes=axes) @abc.abstractmethod def solid_angle(self): """Solid angle as a `~astropy.units.Quantity` object (in ``sr``).""" pass @property def is_image(self): """Whether the geom is an image without extra dimensions.""" if self.axes is None: return True return len(self.axes) == 0 @property def is_flat(self): """Whether the geom non-spatial axes have length 1, equivalent to an image.""" if self.is_image: return True else: valid = True for axis in self.axes: valid = valid and (axis.nbin == 1) return valid def _init_copy(self, **kwargs): """Init map geom instance by copying missing init arguments from self.""" argnames = inspect.getfullargspec(self.__init__).args argnames.remove("self") for arg in argnames: value = getattr(self, "_" + arg) if arg not in kwargs: kwargs[arg] = copy.deepcopy(value) return self.__class__(**kwargs) def copy(self, **kwargs): """Copy and overwrite given attributes. Parameters ---------- **kwargs : dict Keyword arguments to overwrite in the map geometry constructor. Returns ------- copy : `Geom` Copied map geometry. """ return self._init_copy(**kwargs) def energy_mask(self, energy_min=None, energy_max=None, round_to_edge=False): """Create a mask for a given energy range. The energy bin must be fully contained to be included in the mask. Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Energy range. Returns ------- mask : `~gammapy.maps.Map` Map containing the energy mask. The geometry of the map is the same as the geometry of the instance which called this method. """ from . import Map # get energy axes and values energy_axis = self.axes["energy"] if round_to_edge: energy_min, energy_max = energy_axis.round([energy_min, energy_max]) # TODO: make this more general shape = (-1, 1) if self.is_hpx else (-1, 1, 1) energy_edges = energy_axis.edges.reshape(shape) # set default values energy_min = energy_min if energy_min is not None else energy_edges[0] energy_max = energy_max if energy_max is not None else energy_edges[-1] mask = (energy_edges[:-1] >= energy_min) & (energy_edges[1:] <= energy_max) data = np.broadcast_to(mask, shape=self.data_shape) return Map.from_geom(geom=self, data=data, dtype=data.dtype) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2206419 gammapy-1.3/gammapy/maps/hpx/0000755000175100001770000000000014721316215015611 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/hpx/__init__.py0000644000175100001770000000031314721316200017711 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .core import HpxMap from .geom import HpxGeom from .ndmap import HpxNDMap __all__ = [ "HpxGeom", "HpxMap", "HpxNDMap", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/hpx/core.py0000644000175100001770000002632614721316200017116 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import json import numpy as np import astropy.units as u from astropy.io import fits from ..core import Map from ..io import find_bands_hdu, find_bintable_hdu from .geom import HpxGeom from .io import HpxConv __all__ = ["HpxMap"] class HpxMap(Map): """Base class for HEALPIX map classes. Parameters ---------- geom : `~gammapy.maps.HpxGeom` HEALPix geometry object. data : `~numpy.ndarray` Data array. 
meta : dict Dictionary to store metadata. unit : `~astropy.units.Unit` The map unit. """ @classmethod def create( cls, nside=None, binsz=None, nest=True, map_type="hpx", frame="icrs", data=None, skydir=None, width=None, dtype="float32", region=None, axes=None, meta=None, unit="", ): """Factory method to create an empty HEALPix map. Parameters ---------- nside : int or `~numpy.ndarray`, optional HEALPix NSIDE parameter. This parameter sets the size of the spatial pixels in the map. Default is None. binsz : float or `~numpy.ndarray`, optional Approximate pixel size in degrees. An NSIDE will be chosen that corresponds to a pixel size closest to this value. This option is superseded by ``nside``. Default is None. nest : bool, optional Indexing scheme. If True, "NESTED" scheme. If False, "RING" scheme. Default is True. map_type : {'hpx', 'hpx-sparse'}, optional Map type. Selects the class that will be used to instantiate the map. Default is "hpx". frame : {"icrs", "galactic"} Coordinate system, either Galactic ("galactic") or Equatorial ("icrs"). Default is "icrs". data : `~numpy.ndarray`, optional Data array. Default is None. skydir : tuple or `~astropy.coordinates.SkyCoord`, optional Sky position of map center. Can be either a SkyCoord object or a tuple of longitude and latitude in deg in the coordinate system of the map. Default is None. width : float, optional Diameter of the map in degrees. If None then an all-sky geometry will be created. Default is None. dtype : str, optional Data type. Default is "float32". region : str, optional HEALPix region string. Default is None. axes : list, optional List of `~MapAxis` objects for each non-spatial dimension. Default is None. meta : `dict`, optional Dictionary to store the metadata. Default is None. unit : str or `~astropy.units.Unit`, optional The map unit. Default is "". Returns ------- map : `~HpxMap` A HEALPix map object. """ from .ndmap import HpxNDMap hpx = HpxGeom.create( nside=nside, binsz=binsz, nest=nest, frame=frame, region=region, axes=axes, skydir=skydir, width=width, ) if cls.__name__ == "HpxNDMap": return HpxNDMap(hpx, dtype=dtype, meta=meta, unit=unit) elif map_type == "hpx": return HpxNDMap(hpx, dtype=dtype, meta=meta, unit=unit) else: raise ValueError(f"Unrecognized map type: {map_type!r}") @classmethod def from_hdulist( cls, hdu_list, hdu=None, hdu_bands=None, format=None, colname=None ): """Make a HpxMap object from a FITS HDUList. Parameters ---------- hdu_list : `~astropy.io.fits.HDUList` HDU list containing HDUs for map data and bands. hdu : str, optional Name or index of the HDU with the map data. If None then the method will try to load map data from the first BinTableHDU in the file. Default is None. hdu_bands : str, optional Name or index of the HDU with the BANDS table. Default is None. format : str, optional FITS format convention. By default, files will be written to the gamma-astro-data-formats (GADF) format. This option can be used to write files that are compliant with format conventions required by specific software (e.g. the Fermi Science Tools). The following formats are supported: - "gadf" (default) - "fgst-ccube" - "fgst-ltcube" - "fgst-bexpcube" - "fgst-srcmap" - "fgst-template" - "fgst-srcmap-sparse" - "galprop" - "galprop2" Returns ------- hpx_map : `HpxMap` Map object. 
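Examples
--------
A minimal sketch, assuming ``hpx_map.fits`` is a GADF-compliant
HEALPix FITS file (the file name is illustrative)::

    from astropy.io import fits
    from gammapy.maps import HpxMap

    with fits.open("hpx_map.fits") as hdulist:
        m = HpxMap.from_hdulist(hdulist)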
""" if hdu is None: hdu_out = find_bintable_hdu(hdu_list) else: hdu_out = hdu_list[hdu] if hdu_bands is None: hdu_bands = find_bands_hdu(hdu_list, hdu_out) hdu_bands_out = None if hdu_bands is not None: hdu_bands_out = hdu_list[hdu_bands] if format is None: format = HpxConv.identify_hpx_format(hdu_out.header) hpx_map = cls.from_hdu(hdu_out, hdu_bands_out, format=format, colname=colname) # exposure maps have an additional GTI hdu if format == "fgst-bexpcube" and "GTI" in hdu_list: hpx_map._unit = u.Unit("cm2 s") return hpx_map def to_hdulist(self, hdu="SKYMAP", hdu_bands=None, sparse=False, format="gadf"): """Convert to `~astropy.io.fits.HDUList`. Parameters ---------- hdu : str, optional The HDU extension name. Default is "SKYMAP". hdu_bands : str, optional The HDU extension name for BANDS table. Default is None. sparse : bool, optional Set INDXSCHM to SPARSE and sparsify the map by only writing pixels with non-zero amplitude. Default is False. format : str, optional FITS format convention. By default, files will be written to the gamma-astro-data-formats (GADF) format. This option can be used to write files that are compliant with format conventions required by specific software (e.g. the Fermi Science Tools). The following formats are supported: - "gadf" (default) - "fgst-ccube" - "fgst-ltcube" - "fgst-bexpcube" - "fgst-srcmap" - "fgst-template" - "fgst-srcmap-sparse" - "galprop" - "galprop2" Returns ------- hdu_list : `~astropy.io.fits.HDUList` The FITS HDUList. """ if hdu_bands is None: hdu_bands = f"{hdu.upper()}_BANDS" if self.geom.axes: hdu_bands_out = self.geom.to_bands_hdu(hdu_bands=hdu_bands, format=format) hdu_bands = hdu_bands_out.name else: hdu_bands_out = None hdu_bands = None hdu_out = self.to_hdu( hdu=hdu, hdu_bands=hdu_bands, sparse=sparse, format=format ) hdu_out.header["META"] = json.dumps(self.meta) hdu_out.header["BUNIT"] = self.unit.to_string("fits") hdu_list = fits.HDUList([fits.PrimaryHDU(), hdu_out]) if self.geom.axes: hdu_list.append(hdu_bands_out) return hdu_list @abc.abstractmethod def to_wcs( self, sum_bands=False, normalize=True, proj="AIT", oversample=2, width_pix=None, hpx2wcs=None, ): """Make a WCS object and convert HEALPix data into WCS projection. Parameters ---------- sum_bands : bool, optional Sum over non-spatial axes before reprojecting. If False then the WCS map will have the same dimensionality as the HEALPix one. Default is False. normalize : bool, optional Preserve integral by splitting HEALPix values between bins. Default is True. proj : str, optional WCS-projection. Default is "AIT". oversample : float, optional Oversampling factor for WCS map. This will be the approximate ratio of the width of a HEALPix pixel to a WCS pixel. If this parameter is None then the width will be set from ``width_pix``. Default is 2. width_pix : int, optional Width of the WCS geometry in pixels. The pixel size will be set to the number of pixels satisfying ``oversample`` or ``width_pix`` whichever is smaller. If this parameter is None then the width will be set from ``oversample``. Default is None. hpx2wcs : `~HpxToWcsMapping`, optional Set the HEALPix to WCS mapping object that will be used to generate the WCS map. If None then a new mapping will be generated based on ``proj`` and ``oversample`` arguments. Default is None. Returns ------- map_out : `~gammapy.maps.WcsMap` WCS map object. """ pass @abc.abstractmethod def to_swapped(self): """Return a new map with the opposite scheme (ring or nested). Returns ------- map : `~HpxMap` Map object. 
""" pass def to_hdu(self, hdu=None, hdu_bands=None, sparse=False, format=None): """Make a FITS HDU with input data. Parameters ---------- hdu : str, optional The HDU extension name. Default is None. hdu_bands : str, optional The HDU extension name for BANDS table. Default is None. sparse : bool, optional Set INDXSCHM to SPARSE and sparsify the map by only writing pixels with non-zero amplitude. Default is False. format : {None, 'fgst-ccube', 'fgst-template', 'gadf'} FITS format convention. If None this will be set to the default convention of the map. Default is None. Returns ------- hdu_out : `~astropy.io.fits.BinTableHDU` or `~astropy.io.fits.ImageHDU` Output HDU containing map data. """ hpxconv = HpxConv.create(format) hduname = hpxconv.hduname if hdu is None else hdu hduname_bands = hpxconv.bands_hdu if hdu_bands is None else hdu_bands header = self.geom.to_header(format=format) if self.geom.axes: header["BANDSHDU"] = hduname_bands if sparse: header["INDXSCHM"] = "SPARSE" cols = [] if header["INDXSCHM"] == "EXPLICIT": array = self.geom._ipix cols.append(fits.Column("PIX", "J", array=array)) elif header["INDXSCHM"] == "LOCAL": array = np.arange(self.data.shape[-1]) cols.append(fits.Column("PIX", "J", array=array)) cols += self._make_cols(header, hpxconv) return fits.BinTableHDU.from_columns(cols, header=header, name=hduname) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/hpx/geom.py0000644000175100001770000013462714721316200017121 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utilities for dealing with HEALPix projections and mappings.""" import copy import numpy as np from astropy import units as u from astropy.coordinates import SkyCoord from astropy.io import fits from astropy.units import Quantity from gammapy.utils.array import is_power2 from ..axes import MapAxes from ..coord import MapCoord, skycoord_to_lonlat from ..geom import Geom, pix_tuple_to_idx from ..utils import INVALID_INDEX, coordsys_to_frame, frame_to_coordsys from .io import HPX_FITS_CONVENTIONS, HpxConv from .utils import ( coords_to_vec, get_nside_from_pix_size, get_pix_size_from_nside, get_subpixels, get_superpixels, match_hpx_pix, nside_to_order, parse_hpxregion, ravel_hpx_index, unravel_hpx_index, ) # Not sure if we should expose this in the docs or not: # HPX_FITS_CONVENTIONS, HpxConv __all__ = ["HpxGeom"] class HpxGeom(Geom): """Geometry class for HEALPix maps. This class performs mapping between partial-sky indices (pixel number within a HEALPix region) and all-sky indices (pixel number within an all-sky HEALPix map). Multi-band HEALPix geometries use a global indexing scheme that assigns a unique pixel number based on the all-sky index and band index. In the single-band case the global index is the same as the HEALPix index. By default, the constructor will return an all-sky map. Partial-sky maps can be defined with the ``region`` argument. Parameters ---------- nside : `~numpy.ndarray` HEALPix NSIDE parameter, the total number of pixels is 12*nside*nside. For multi-dimensional maps one can pass either a single ``nside`` value or a vector of ``nside`` values defining the pixel size for each image plane. If ``nside`` is not a scalar then its dimensionality should match that of the non-spatial axes. If nest is True, ``nside`` must be a power of 2, less than 2**30. nest : bool Indexing scheme. If True, "NESTED" scheme. If False, "RING" scheme. frame : {"icrs", "galactic"} Coordinate system. 
Default is "icrs". region : str or tuple Spatial geometry for partial-sky maps. If None, the map will encompass the whole sky. String input will be parsed according to HPX_REG header keyword conventions. Tuple input can be used to define an explicit list of pixels encompassed by the geometry. axes : list Axes for non-spatial dimensions. """ is_hpx = True is_region = False def __init__(self, nside, nest=True, frame="icrs", region=None, axes=None): from healpy.pixelfunc import check_nside check_nside(nside, nest=nest) self._nside = np.array(nside, ndmin=1) self._axes = MapAxes.from_default(axes, n_spatial_axes=1) if self.nside.size > 1 and self.nside.shape != self.shape_axes: raise ValueError( "Wrong dimensionality for nside. nside must " "be a scalar or have a dimensionality consistent " "with the axes argument." ) self._frame = frame self._nest = nest self._ipix = None self._region = region self._create_lookup(region) self._npix = self._npix * np.ones(self.shape_axes, dtype=int) def _create_lookup(self, region): """Create local-to-global pixel lookup table.""" if isinstance(region, str): ipix = [ self.get_index_list(nside, self._nest, region) for nside in self._nside.flat ] self._ipix = [ ravel_hpx_index((p, i * np.ones_like(p)), np.ravel(self.npix_max)) for i, p in enumerate(ipix) ] self._region = region self._indxschm = "EXPLICIT" self._npix = np.array([len(t) for t in self._ipix]) if self.nside.ndim > 1: self._npix = self._npix.reshape(self.nside.shape) self._ipix = np.concatenate(self._ipix) elif isinstance(region, tuple): region = [np.asarray(t) for t in region] m = np.any(np.stack([t >= 0 for t in region]), axis=0) region = [t[m] for t in region] self._ipix = ravel_hpx_index(region, self.npix_max) self._ipix = np.unique(self._ipix) region = unravel_hpx_index(self._ipix, self.npix_max) self._region = "explicit" self._indxschm = "EXPLICIT" if len(region) == 1: self._npix = np.array([len(region[0])]) else: self._npix = np.zeros(self.shape_axes, dtype=int) idx = np.ravel_multi_index(region[1:], self.shape_axes) cnt = np.unique(idx, return_counts=True) self._npix.flat[cnt[0]] = cnt[1] elif region is None: self._region = None self._indxschm = "IMPLICIT" self._npix = self.npix_max else: raise ValueError(f"Invalid region string: {region!r}") def local_to_global(self, idx_local): """Compute a global index (all-sky) from a local (partial-sky) index. Parameters ---------- idx_local : tuple A tuple of pixel indices with local HEALPix pixel indices. Returns ------- idx_global : tuple A tuple of pixel index vectors with global HEALPix pixel indices. """ if self._ipix is None: return idx_local if self.nside.size > 1: idx = ravel_hpx_index(idx_local, self._npix) else: idx_tmp = tuple( [idx_local[0]] + [np.zeros(t.shape, dtype=int) for t in idx_local[1:]] ) idx = ravel_hpx_index(idx_tmp, self._npix) idx_global = unravel_hpx_index(self._ipix[idx], self.npix_max) return idx_global[:1] + tuple(idx_local[1:]) def global_to_local(self, idx_global, ravel=False): """Compute global (all-sky) index from a local (partial-sky) index. Parameters ---------- idx_global : tuple A tuple of pixel indices with global HEALPix pixel indices. ravel : bool, optional Return a raveled index. Default is False. Returns ------- idx_local : tuple A tuple of pixel indices with local HEALPix pixel indices. 
""" if ( isinstance(idx_global, int) or (isinstance(idx_global, tuple) and isinstance(idx_global[0], int)) or isinstance(idx_global, np.ndarray) ): idx_global = unravel_hpx_index(np.array(idx_global, ndmin=1), self.npix_max) if self.nside.size == 1: idx = np.array(idx_global[0], ndmin=1) else: idx = ravel_hpx_index(idx_global, self.npix_max) if self._ipix is not None: retval = np.full(idx.size, -1, "i") m = np.isin(idx.flat, self._ipix) retval[m] = np.searchsorted(self._ipix, idx.flat[m]) retval = retval.reshape(idx.shape) else: retval = idx if self.nside.size == 1: idx_local = tuple([retval] + list(idx_global[1:])) else: idx_local = unravel_hpx_index(retval, self._npix) m = np.any(np.stack([t == INVALID_INDEX.int for t in idx_local]), axis=0) for i, t in enumerate(idx_local): idx_local[i][m] = INVALID_INDEX.int if not ravel: return idx_local else: return ravel_hpx_index(idx_local, self.npix) def cutout(self, position, width, **kwargs): """Create a cutout around a given position. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. width : `~astropy.coordinates.Angle` or `~astropy.units.Quantity` Diameter of the circular cutout region. Returns ------- cutout : `~gammapy.maps.WcsNDMap` Cutout map. """ if not self.is_regular: raise ValueError("Can only do a cutout from a regular map.") width = u.Quantity(width, "deg").value return self.create( nside=self.nside, nest=self.nest, width=width, skydir=position, frame=self.frame, axes=self.axes, ) def coord_to_pix(self, coords): import healpy as hp coords = MapCoord.create( coords, frame=self.frame, axis_names=self.axes.names ).broadcasted theta, phi = coords.theta, coords.phi if self.axes: idxs = self.axes.coord_to_idx(coords, clip=True) bins = self.axes.coord_to_pix(coords) # FIXME: Figure out how to handle coordinates out of # bounds of non-spatial dimensions if self.nside.size > 1: nside = self.nside[tuple(idxs)] else: nside = self.nside m = ~np.isfinite(theta) theta[m] = 0.0 phi[m] = 0.0 pix = hp.ang2pix(nside, theta, phi, nest=self.nest) pix = tuple([pix]) + bins if np.any(m): for p in pix: p[m] = INVALID_INDEX.int else: pix = (hp.ang2pix(self.nside, theta, phi, nest=self.nest),) return pix def pix_to_coord(self, pix): import healpy as hp if self.axes: bins = [] vals = [] for i, ax in enumerate(self.axes): bins += [pix[1 + i]] vals += [ax.pix_to_coord(pix[1 + i])] idxs = pix_tuple_to_idx(bins) if self.nside.size > 1: nside = self.nside[idxs] else: nside = self.nside ipix = np.round(pix[0]).astype(int) m = ipix == INVALID_INDEX.int ipix[m] = 0 theta, phi = hp.pix2ang(nside, ipix, nest=self.nest) coords = [np.degrees(phi), np.degrees(np.pi / 2.0 - theta)] coords = tuple(coords + vals) if np.any(m): for c in coords: c[m] = INVALID_INDEX.float else: ipix = np.round(pix[0]).astype(int) theta, phi = hp.pix2ang(self.nside, ipix, nest=self.nest) coords = (np.degrees(phi), np.degrees(np.pi / 2.0 - theta)) return coords def pix_to_idx(self, pix, clip=False): # FIXME: Look for better method to clip HPX indices idx = pix_tuple_to_idx(pix) idx_local = self.global_to_local(idx) for i, _ in enumerate(idx): if clip: if i > 0: np.clip(idx[i], 0, self.axes[i - 1].nbin - 1, out=idx[i]) else: np.clip(idx[i], 0, None, out=idx[i]) else: if i > 0: mask = (idx[i] < 0) | (idx[i] >= self.axes[i - 1].nbin) np.putmask(idx[i], mask, -1) else: mask = (idx_local[i] < 0) | (idx[i] < 0) np.putmask(idx[i], mask, -1) return tuple(idx) @property def axes(self): """List of non-spatial axes.""" return self._axes @property def 
axes_names(self): """All axes names.""" return ["skycoord"] + self.axes.names @property def shape_axes(self): """Shape of non-spatial axes.""" return self.axes.shape @property def data_shape(self): """Shape of the `~numpy.ndarray` matching this geometry.""" npix_shape = tuple([np.max(self.npix)]) return (npix_shape + self.axes.shape)[::-1] @property def data_shape_axes(self): """Shape of data of the non-spatial axes and unit spatial axes.""" return self.axes.shape[::-1] + (1,) @property def ndim(self): """Number of dimensions as an integer.""" return len(self._axes) + 2 @property def ordering(self): """HEALPix ordering ('NESTED' or 'RING').""" return "NESTED" if self.nest else "RING" @property def nside(self): """NSIDE in each band.""" return self._nside @property def order(self): """The order in each band (``NSIDE = 2 ** ORDER``). Set to -1 for bands with NSIDE that is not a power of 2. """ return nside_to_order(self.nside) @property def nest(self): """Whether HEALPix order is nested as a boolean.""" return self._nest @property def npix(self): """Number of pixels in each band. For partial-sky geometries this can be less than the number of pixels for the band NSIDE. """ return self._npix @property def npix_max(self): """Maximum number of pixels.""" maxpix = 12 * self.nside**2 return maxpix * np.ones(self.shape_axes, dtype=int) @property def frame(self): return self._frame @property def projection(self): """Map projection.""" return "HPX" @property def region(self): """Region string.""" return self._region @property def is_allsky(self): """Flag for all-sky maps.""" return self._region is None @property def is_regular(self): """Flag identifying whether this geometry is regular in non-spatial dimensions. False for multi-resolution or irregular geometries. If True, all image planes have the same pixel geometry. """ if self.nside.size > 1 or self.region == "explicit": return False else: return True @property def center_coord(self): """Map coordinates of the center of the geometry as a tuple.""" lon, lat, frame = skycoord_to_lonlat(self.center_skydir) return tuple([lon, lat]) + self.axes.center_coord @property def center_pix(self): """Pixel coordinates of the center of the geometry as a tuple.""" return self.coord_to_pix(self.center_coord) @property def center_skydir(self): """Sky coordinate of the center of the geometry. Returns ------- center : `~astropy.coordinates.SkyCoord` Center position. """ import healpy as hp if self.is_allsky: lon, lat = 0.0, 0.0 elif self.region == "explicit": idx = unravel_hpx_index(self._ipix, self.npix_max) nside = self._get_nside(idx) vec = hp.pix2vec(nside, idx[0], nest=self.nest) lon, lat = hp.vec2ang(np.mean(vec, axis=1), lonlat=True) else: tokens = parse_hpxregion(self.region) if tokens[0] in ["DISK", "DISK_INC"]: lon, lat = float(tokens[1]), float(tokens[2]) elif tokens[0] == "HPX_PIXEL": nside_pix = int(tokens[2]) ipix_pix = int(tokens[3]) if tokens[1] == "NESTED": nest_pix = True elif tokens[1] == "RING": nest_pix = False else: raise ValueError(f"Invalid ordering scheme: {tokens[1]!r}") theta, phi = hp.pix2ang(nside_pix, ipix_pix, nest_pix) lon, lat = np.degrees(phi), np.degrees((np.pi / 2) - theta) return SkyCoord(lon, lat, frame=self.frame, unit="deg") @property def pixel_scales(self): """Pixel scale. Returns ------- angle : `~astropy.coordinates.Angle` """ return get_pix_size_from_nside(self.nside) * u.deg def interp_weights(self, coords, idxs=None): """Get interpolation weights for given coordinates. 
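# --- Example: inspecting geometry properties ---
# Illustrative sketch, not part of the library source; assumes gammapy 1.3 and
# healpy, reading only the properties documented above.
from gammapy.maps import HpxGeom

geom_props = HpxGeom.create(nside=32, frame="galactic")
print(geom_props.ordering)      # "NESTED" (nest=True is the default)
print(geom_props.order)         # 5, since NSIDE = 2**ORDER = 32
print(geom_props.npix_max)      # 12 * 32**2 = 12288 pixels on the full sky
print(geom_props.pixel_scales)  # approximate pixel size in degrees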
Parameters ---------- coords : `MapCoord` or dict Input coordinates. idxs : `~numpy.ndarray`, optional Indices for non-spatial axes. Default is None. Returns ------- weights : `~numpy.ndarray` Interpolation weights. """ import healpy as hp coords = MapCoord.create(coords, frame=self.frame).broadcasted if idxs is None: idxs = self.coord_to_idx(coords, clip=True)[1:] theta, phi = coords.theta, coords.phi m = ~np.isfinite(theta) theta[m] = 0 phi[m] = 0 if not self.is_regular: nside = self.nside[tuple(idxs)] else: nside = self.nside pix, wts = hp.get_interp_weights(nside, theta, phi, nest=self.nest) wts[:, m] = 0 pix[:, m] = INVALID_INDEX.int if not self.is_regular: pix_local = [self.global_to_local([pix] + list(idxs))[0]] else: pix_local = [self.global_to_local(pix, ravel=True)] # If a pixel lies outside of the geometry set its index to the center pixel m = pix_local[0] == INVALID_INDEX.int if m.any(): coords_ctr = [coords.lon, coords.lat] coords_ctr += [ax.pix_to_coord(t) for ax, t in zip(self.axes, idxs)] idx_ctr = self.coord_to_idx(coords_ctr) idx_ctr = self.global_to_local(idx_ctr) pix_local[0][m] = (idx_ctr[0] * np.ones(pix.shape, dtype=int))[m] pix_local += [np.broadcast_to(t, pix_local[0].shape) for t in idxs] return pix_local, wts @property def ipix(self): """HEALPix pixel and band indices for every pixel in the map.""" return self.get_idx() def is_aligned(self, other): """Check if HEALPix geoms and extra axes are aligned. Parameters ---------- other : `HpxGeom` Other geometry. Returns ------- aligned : bool Whether geometries are aligned. """ for axis, otheraxis in zip(self.axes, other.axes): if axis != otheraxis: return False if not self.nside == other.nside: return False elif not self.frame == other.frame: return False elif not self.nest == other.nest: return False else: return True def to_nside(self, nside): """Upgrade or downgrade the resolution to a given NSIDE. Parameters ---------- nside : int HEALPix NSIDE parameter. Returns ------- geom : `~HpxGeom` A HEALPix geometry object. """ if not self.is_regular: raise ValueError("Upgrade and degrade only implemented for standard maps") axes = copy.deepcopy(self.axes) return self.__class__( nside=nside, nest=self.nest, frame=self.frame, region=self.region, axes=axes ) def to_binsz(self, binsz): """Change pixel size of the geometry. Parameters ---------- binsz : float or `~astropy.units.Quantity` New pixel size. A float is assumed to be in degree. Returns ------- geom : `~HpxGeom` Geometry with new pixel size. """ binsz = u.Quantity(binsz, "deg").value if self.is_allsky: return self.create( binsz=binsz, frame=self.frame, axes=copy.deepcopy(self.axes), ) else: return self.create( skydir=self.center_skydir, binsz=binsz, width=self.width.to_value("deg"), frame=self.frame, axes=copy.deepcopy(self.axes), ) def separation(self, center): """Compute sky separation with respect to a given center. Parameters ---------- center : `~astropy.coordinates.SkyCoord` Center position. Returns ------- separation : `~astropy.coordinates.Angle` Separation angle array (1D). """ coord = self.to_image().get_coord() return center.separation(coord.skycoord) def to_swapped(self): """Geometry copy with swapped ORDERING (NEST->RING or vice versa). Returns ------- geom : `~HpxGeom` A HEALPix geometry object. 
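# --- Example: changing resolution and ordering ---
# Illustrative sketch, not part of the library source; assumes gammapy 1.3 and
# healpy, using `to_nside`, `to_binsz` and `to_swapped` documented above.
from gammapy.maps import HpxGeom

geom_res = HpxGeom.create(nside=64, frame="icrs")
geom_coarse = geom_res.to_nside(16)        # downgrade the resolution
geom_binsz = geom_res.to_binsz("0.5 deg")  # pick the NSIDE closest to 0.5 deg pixels
geom_ring = geom_res.to_swapped()          # NESTED -> RING ordering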
""" axes = copy.deepcopy(self.axes) return self.__class__( self.nside, not self.nest, frame=self.frame, region=self.region, axes=axes, ) def to_image(self): return self.__class__( np.max(self.nside), self.nest, frame=self.frame, region=self.region ) def to_cube(self, axes): axes = copy.deepcopy(self.axes) + axes return self.__class__( np.max(self.nside), self.nest, frame=self.frame, region=self.region, axes=axes, ) def _get_neighbors(self, idx): import healpy as hp nside = self._get_nside(idx) idx_nb = (hp.get_all_neighbours(nside, idx[0], nest=self.nest),) idx_nb += tuple([t[None, ...] * np.ones_like(idx_nb[0]) for t in idx[1:]]) return idx_nb def _pad_spatial(self, pad_width): if self.is_allsky: raise ValueError("Cannot pad an all-sky map.") idx = self.get_idx(flat=True) idx_r = ravel_hpx_index(idx, self.npix_max) # TODO: Pre-filter indices to find those close to the edge idx_nb = self._get_neighbors(idx) idx_nb = ravel_hpx_index(idx_nb, self.npix_max) for _ in range(pad_width): mask_edge = np.isin(idx_nb, idx_r, invert=True) idx_edge = idx_nb[mask_edge] idx_edge = np.unique(idx_edge) idx_r = np.sort(np.concatenate((idx_r, idx_edge))) idx_nb = unravel_hpx_index(idx_edge, self.npix_max) idx_nb = self._get_neighbors(idx_nb) idx_nb = ravel_hpx_index(idx_nb, self.npix_max) idx = unravel_hpx_index(idx_r, self.npix_max) return self.__class__( self.nside.copy(), self.nest, frame=self.frame, region=idx, axes=copy.deepcopy(self.axes), ) def crop(self, crop_width): if self.is_allsky: raise ValueError("Cannot crop an all-sky map.") idx = self.get_idx(flat=True) idx_r = ravel_hpx_index(idx, self.npix_max) # TODO: Pre-filter indices to find those close to the edge idx_nb = self._get_neighbors(idx) idx_nb = ravel_hpx_index(idx_nb, self.npix_max) for _ in range(crop_width): # Mask of pixels that have at least one neighbor not # contained in the geometry mask_edge = np.any(np.isin(idx_nb, idx_r, invert=True), axis=0) idx_r = idx_r[~mask_edge] idx_nb = idx_nb[:, ~mask_edge] idx = unravel_hpx_index(idx_r, self.npix_max) return self.__class__( self.nside.copy(), self.nest, frame=self.frame, region=idx, axes=copy.deepcopy(self.axes), ) def upsample(self, factor): if not is_power2(factor): raise ValueError("Upsample factor must be a power of 2.") if self.is_allsky: return self.__class__( self.nside * factor, self.nest, frame=self.frame, region=self.region, axes=copy.deepcopy(self.axes), ) idx = list(self.get_idx(flat=True)) nside = self._get_nside(idx) idx_new = get_subpixels(idx[0], nside, nside * factor, nest=self.nest) for i in range(1, len(idx)): idx[i] = idx[i][..., None] * np.ones(idx_new.shape, dtype=int) idx[0] = idx_new return self.__class__( self.nside * factor, self.nest, frame=self.frame, region=tuple(idx), axes=copy.deepcopy(self.axes), ) def downsample(self, factor, axis_name=None): if not is_power2(factor): raise ValueError("Downsample factor must be a power of 2.") if axis_name is not None: raise ValueError("Currently the only valid axis name is None.") if self.is_allsky: return self.__class__( self.nside // factor, self.nest, frame=self.frame, region=self.region, axes=copy.deepcopy(self.axes), ) idx = list(self.get_idx(flat=True)) nside = self._get_nside(idx) idx_new = get_superpixels(idx[0], nside, nside // factor, nest=self.nest) idx[0] = idx_new return self.__class__( self.nside // factor, self.nest, frame=self.frame, region=tuple(idx), axes=copy.deepcopy(self.axes), ) @classmethod def create( cls, nside=None, binsz=None, nest=True, frame="icrs", region=None, axes=None, skydir=None, 
width=None, ): """Create an HpxGeom object. Parameters ---------- nside : int or `~numpy.ndarray`, optional HEALPix NSIDE parameter. This parameter sets the size of the spatial pixels in the map. If nest is True, ``nside`` must be a power of 2, less than 2**30. Default is None. binsz : float or `~numpy.ndarray`, optional Approximate pixel size in degrees. An ``nside`` will be chosen that corresponds to a pixel size closest to this value. This option is superseded by ``nside``. Default is None. nest : bool, optional Indexing scheme. If True, "NESTED" scheme. If False, "RING" scheme. Default is True. frame : {"icrs", "galactic"} Coordinate system, either Galactic ("galactic") or Equatorial ("icrs"). Default is "icrs". region : str, optional HEALPix region string. Allows for partial-sky maps. Default is None. axes : list, optional List of axes for non-spatial dimensions. Default is None. skydir : tuple or `~astropy.coordinates.SkyCoord`, optional Sky position of map center. Can be either a SkyCoord object or a tuple of longitude and latitude in deg in the coordinate system of the map. Default is None. width : float, optional Diameter of the map in degrees. If set the map will encompass all pixels within a circular region centered on ``skydir``. Default is None. Returns ------- geom : `~HpxGeom` A HEALPix geometry object. Examples -------- >>> from gammapy.maps import HpxGeom, MapAxis >>> axis = MapAxis.from_bounds(0,1,2) >>> geom = HpxGeom.create(nside=16) # doctest: +SKIP >>> geom = HpxGeom.create(binsz=0.1, width=10.0) # doctest: +SKIP >>> geom = HpxGeom.create(nside=64, width=10.0, axes=[axis]) # doctest: +SKIP >>> geom = HpxGeom.create(nside=[32,64], width=10.0, axes=[axis]) # doctest: +SKIP """ if nside is None and binsz is None: raise ValueError("Either nside or binsz must be defined.") if nside is None and binsz is not None: nside = get_nside_from_pix_size(binsz) if skydir is None: lon, lat = (0.0, 0.0) elif isinstance(skydir, tuple): lon, lat = skydir elif isinstance(skydir, SkyCoord): lon, lat, frame = skycoord_to_lonlat(skydir, frame=frame) else: raise ValueError(f"Invalid type for skydir: {type(skydir)!r}") if region is None and width is not None: region = f"DISK({lon},{lat},{width/2})" return cls(nside, nest=nest, frame=frame, region=region, axes=axes) @classmethod def from_header(cls, header, hdu_bands=None, format=None): """Create an HPX object from a FITS header. Parameters ---------- header : `~astropy.io.fits.Header` The FITS header. hdu_bands : `~astropy.io.fits.BinTableHDU`, optional The BANDS table HDU. Default is None. format : str, optional FITS convention. Default is None. If None the format is guessed. The following formats are supported: - "gadf" - "fgst-ccube" - "fgst-ltcube" - "fgst-bexpcube" - "fgst-srcmap" - "fgst-template" - "fgst-srcmap-sparse" - "galprop" - "galprop2" - "healpy" Returns ------- hpx : `~HpxGeom` HEALPix geometry. 
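# --- Example: FITS header round trip ---
# Illustrative sketch, not part of the library source; assumes gammapy 1.3 and
# healpy, using `to_header` / `from_header` documented above. The equality
# check relies on `is_allclose` as implemented further below.
from gammapy.maps import HpxGeom

geom_hdr = HpxGeom.create(nside=32, frame="galactic", region="DISK(0.,0.,10.)")
header_example = geom_hdr.to_header(format="gadf")
geom_roundtrip = HpxGeom.from_header(header_example)
assert geom_roundtrip == geom_hdr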
""" if format is None: format = HpxConv.identify_hpx_format(header) conv = HPX_FITS_CONVENTIONS[format] axes = MapAxes.from_table_hdu(hdu_bands, format=format) if header["PIXTYPE"] != "HEALPIX": raise ValueError( f"Invalid header PIXTYPE: {header['PIXTYPE']} (must be HEALPIX)" ) if header["ORDERING"] == "RING": nest = False elif header["ORDERING"] == "NESTED": nest = True else: raise ValueError( f"Invalid header ORDERING: {header['ORDERING']} (must be RING or NESTED)" ) if hdu_bands is not None and "NSIDE" in hdu_bands.columns.names: nside = hdu_bands.data.field("NSIDE").reshape(axes.shape).astype(int) elif "NSIDE" in header: nside = header["NSIDE"] elif "ORDER" in header: nside = 2 ** header["ORDER"] else: raise ValueError("Failed to extract NSIDE or ORDER.") try: frame = coordsys_to_frame(header[conv.frame]) except KeyError: frame = header.get("COORDSYS", "icrs") try: region = header["HPX_REG"] except KeyError: try: region = header["HPXREGION"] except KeyError: region = None return cls(nside, nest, frame=frame, region=region, axes=axes) @classmethod def from_hdu(cls, hdu, hdu_bands=None): """Create an HPX object from a BinTable HDU. Parameters ---------- hdu : `~astropy.io.fits.BinTableHDU` The FITS HDU. hdu_bands : `~astropy.io.fits.BinTableHDU`, optional The BANDS table HDU. Default is None. Returns ------- hpx : `~HpxGeom` HEALPix geometry. """ # FIXME: Need correct handling of IMPLICIT and EXPLICIT maps # if HPX region is not defined then geometry is defined by # the set of all pixels in the table if "HPX_REG" not in hdu.header: pix = (hdu.data.field("PIX"), hdu.data.field("CHANNEL")) else: pix = None return cls.from_header(hdu.header, hdu_bands=hdu_bands, pix=pix) def to_header(self, format="gadf", **kwargs): """Build and return FITS header for this HEALPix map.""" header = fits.Header() format = kwargs.get("format", HPX_FITS_CONVENTIONS[format]) # FIXME: For some sparse maps we may want to allow EXPLICIT # with an empty region string indxschm = kwargs.get("indxschm", None) if indxschm is None: if self._region is None: indxschm = "IMPLICIT" elif self.is_regular == 1: indxschm = "EXPLICIT" else: indxschm = "LOCAL" if "FGST" in format.convname.upper(): header["TELESCOP"] = "GLAST" header["INSTRUME"] = "LAT" header[format.frame] = frame_to_coordsys(self.frame) header["PIXTYPE"] = "HEALPIX" header["ORDERING"] = self.ordering header["INDXSCHM"] = indxschm header["ORDER"] = np.max(self.order) header["NSIDE"] = np.max(self.nside) header["FIRSTPIX"] = 0 header["LASTPIX"] = np.max(self.npix_max) - 1 header["HPX_CONV"] = format.convname.upper() if self.frame == "icrs": header["EQUINOX"] = (2000.0, "Equinox of RA & DEC specifications") if self.region: header["HPX_REG"] = self._region return header def _make_bands_cols(self): cols = [] if self.nside.size > 1: cols += [fits.Column("NSIDE", "I", array=np.ravel(self.nside))] return cols @staticmethod def get_index_list(nside, nest, region): """Get list of pixels indices for all the pixels in a region. Parameters ---------- nside : int HEALPix NSIDE parameter. nest : bool Indexing scheme. If True, "NESTED" scheme. If False, "RING" scheme. region : str HEALPix region string. Returns ------- ilist : `~numpy.ndarray` List of pixel indices. """ import healpy as hp # TODO: this should return something more friendly than a tuple # e.g. 
a namedtuple or a dict tokens = parse_hpxregion(region) reg_type = tokens[0] if reg_type == "DISK": lon, lat = float(tokens[1]), float(tokens[2]) radius = np.radians(float(tokens[3])) vec = coords_to_vec(lon, lat)[0] ilist = hp.query_disc(nside, vec, radius, inclusive=False, nest=nest) elif reg_type == "DISK_INC": lon, lat = float(tokens[1]), float(tokens[2]) radius = np.radians(float(tokens[3])) vec = coords_to_vec(lon, lat)[0] fact = int(tokens[4]) ilist = hp.query_disc( nside, vec, radius, inclusive=True, nest=nest, fact=fact ) elif reg_type == "HPX_PIXEL": nside_pix = int(tokens[2]) if tokens[1] == "NESTED": ipix_ring = hp.nest2ring(nside_pix, int(tokens[3])) elif tokens[1] == "RING": ipix_ring = int(tokens[3]) else: raise ValueError(f"Invalid ordering scheme: {tokens[1]!r}") ilist = match_hpx_pix(nside, nest, nside_pix, ipix_ring) else: raise ValueError(f"Invalid region type: {reg_type!r}") return ilist @property def width(self): """Width of the map.""" # TODO: simplify import healpy as hp if self.is_allsky: width = 180.0 elif self.region == "explicit": idx = unravel_hpx_index(self._ipix, self.npix_max) nside = self._get_nside(idx) ang = hp.pix2ang(nside, idx[0], nest=self.nest, lonlat=True) dirs = SkyCoord(ang[0], ang[1], unit="deg", frame=self.frame) width = np.max(dirs.separation(self.center_skydir)) else: tokens = parse_hpxregion(self.region) if tokens[0] in {"DISK", "DISK_INC"}: width = float(tokens[3]) elif tokens[0] == "HPX_PIXEL": pix_size = get_pix_size_from_nside(int(tokens[2])) width = 2.0 * pix_size return u.Quantity(width, "deg") def _get_nside(self, idx): if self.nside.size > 1: return self.nside[tuple(idx[1:])] else: return self.nside def to_wcs_geom(self, proj="AIT", oversample=2, width_pix=None): """Make a WCS projection appropriate for this HEALPix pixelization. Parameters ---------- proj : str, optional Projection type of WCS geometry. Default is "AIT". oversample : float, optional Oversampling factor for WCS map. This will be the approximate ratio of the width of a HEALPix pixel to a WCS pixel. If this parameter is None then the width will be set from ``width_pix``. Default is 2. width_pix : int, optional Width of the WCS geometry in pixels. The pixel size will be set to the number of pixels satisfying ``oversample`` or ``width_pix`` whichever is smaller. If this parameter is None then the width will be set from ``oversample``. Default is None. Returns ------- wcs : `~gammapy.maps.WcsGeom` WCS geometry. """ from gammapy.maps import WcsGeom pix_size = get_pix_size_from_nside(self.nside) binsz = np.min(pix_size) / oversample width = 2.0 * self.width.to_value("deg") + np.max(pix_size) if width_pix is not None and int(width / binsz) > width_pix: binsz = width / width_pix if width > 90.0: width = min(360.0, width), min(180.0, width) axes = copy.deepcopy(self.axes) return WcsGeom.create( width=width, binsz=binsz, frame=self.frame, axes=axes, skydir=self.center_skydir, proj=proj, ) def to_wcs_tiles(self, nside_tiles=4, margin="0 deg"): """Create WCS tile geometries from HPX geometry with given nside. The HEALPix geometry is divided into superpixels defined by ``nside_tiles``, which are then represented by a WCS geometry using a tangential projection. The number of WCS tiles is given by the number of pixels for the given ``nside_tiles``. Parameters ---------- nside_tiles : int, optional HEALPix NSIDE parameter for super pixel tiles. Default is 4. margin : `~astropy.units.Quantity`, optional Width margin of the WCS tile. Default is "0 deg". 
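# --- Example: deriving a matching WCS geometry ---
# Illustrative sketch, not part of the library source; assumes gammapy 1.3 and
# healpy, using `to_wcs_geom` documented above.
from gammapy.maps import HpxGeom

geom_hpx = HpxGeom.create(nside=64, frame="galactic", region="DISK(0.,0.,5.)")
geom_wcs = geom_hpx.to_wcs_geom(proj="TAN", oversample=2)
print(geom_wcs.data_shape)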
Returns ------- wcs_tiles : list List of WCS tile geometries. """ import healpy as hp from gammapy.maps import WcsGeom margin = u.Quantity(margin) if nside_tiles >= self.nside: raise ValueError(f"nside_tiles must be < {self.nside}") if not self.is_allsky: raise ValueError("to_wcs_tiles() is only supported for all sky geoms") binsz = np.degrees(hp.nside2resol(self.nside)) * u.deg hpx = self.to_image().to_nside(nside=nside_tiles) wcs_tiles = [] for pix in range(int(hpx.npix[0])): skydir = hpx.pix_to_coord([pix]) vtx = hp.boundaries(nside=hpx.nside.item(), pix=pix, nest=hpx.nest, step=1) lon, lat = hp.vec2ang(vtx.T, lonlat=True) boundaries = SkyCoord(lon * u.deg, lat * u.deg, frame=hpx.frame) # Compute maximum separation between all pairs of boundaries and take it # as width width = boundaries.separation(boundaries[:, np.newaxis]).max() wcs_tile_geom = WcsGeom.create( skydir=(float(skydir[0].item()), float(skydir[1].item())), width=width + margin, binsz=binsz, frame=hpx.frame, proj="TAN", axes=self.axes, ) wcs_tiles.append(wcs_tile_geom) return wcs_tiles def get_idx( self, idx=None, local=False, flat=False, sparse=False, mode="center", axis_name=None, ): # TODO: simplify this!!! if idx is not None and np.any(np.array(idx) >= np.array(self.shape_axes)): raise ValueError(f"Image index out of range: {idx!r}") # Regular all- and partial-sky maps if self.is_regular: pix = [np.arange(np.max(self._npix))] if idx is None: for ax in self.axes: if mode == "edges" and ax.name == axis_name: pix += [np.arange(-0.5, ax.nbin, dtype=float)] else: pix += [np.arange(ax.nbin, dtype=int)] else: pix += [t for t in idx] pix = np.meshgrid(*pix[::-1], indexing="ij", sparse=sparse)[::-1] pix = self.local_to_global(pix) # Non-regular all-sky elif self.is_allsky and not self.is_regular: shape = (np.max(self.npix),) if idx is None: shape = shape + self.shape_axes else: shape = shape + (1,) * len(self.axes) pix = [np.full(shape, -1, dtype=int) for i in range(1 + len(self.axes))] for idx_img in np.ndindex(self.shape_axes): if idx is not None and idx_img != idx: continue npix = self._npix[idx_img] if idx is None: s_img = (slice(0, npix),) + idx_img else: s_img = (slice(0, npix),) + (0,) * len(self.axes) pix[0][s_img] = np.arange(self._npix[idx_img]) for j in range(len(self.axes)): pix[j + 1][s_img] = idx_img[j] pix = [p.T for p in pix] # Explicit pixel indices else: if idx is not None: npix_sum = np.concatenate(([0], np.cumsum(self._npix))) idx_ravel = np.ravel_multi_index(idx, self.shape_axes) s = slice(npix_sum[idx_ravel], npix_sum[idx_ravel + 1]) else: s = slice(None) pix_flat = unravel_hpx_index(self._ipix[s], self.npix_max) shape = (np.max(self.npix),) if idx is None: shape = shape + self.shape_axes else: shape = shape + (1,) * len(self.axes) pix = [np.full(shape, -1, dtype=int) for _ in range(1 + len(self.axes))] for idx_img in np.ndindex(self.shape_axes): if idx is not None and idx_img != idx: continue npix = int(self._npix[idx_img].item()) if idx is None: s_img = (slice(0, npix),) + idx_img else: s_img = (slice(0, npix),) + (0,) * len(self.axes) if self.axes: m = np.all( np.stack([pix_flat[i + 1] == t for i, t in enumerate(idx_img)]), axis=0, ) pix[0][s_img] = pix_flat[0][m] else: pix[0][s_img] = pix_flat[0] for j in range(len(self.axes)): pix[j + 1][s_img] = idx_img[j] pix = [p.T for p in pix] if local: pix = self.global_to_local(pix) if flat: pix = tuple([p[p != INVALID_INDEX.int] for p in pix]) return pix def region_mask(self, regions): """Create a mask from a given list of regions. 
The mask is filled such that a pixel inside the region is filled with "True". To invert the mask, e.g. to create a mask with exclusion regions, the tilde (~) operator can be used (see example below). Parameters ---------- regions : str, `~regions.Region` or list of `~regions.Region` Region or list of regions (pixel or sky regions accepted). A region can be defined as a string in DS9 format as well. See http://ds9.si.edu/doc/ref/region.html for details. Returns ------- mask_map : `~gammapy.maps.HpxNDMap` of boolean type Boolean region mask. """ from gammapy.maps import Map, RegionGeom if not self.is_regular: raise ValueError("Multi-resolution maps not supported yet") # TODO: use spatial coordinates only... geom = RegionGeom.from_regions(regions) coords = self.get_coord() mask = geom.contains(coords) return Map.from_geom(self, data=mask) def get_coord( self, idx=None, flat=False, sparse=False, mode="center", axis_name=None ): if mode == "edges" and axis_name is None: raise ValueError("Mode 'edges' requires axis name to be defined") pix = self.get_idx( idx=idx, flat=flat, sparse=sparse, mode=mode, axis_name=axis_name ) data = self.pix_to_coord(pix) coords = MapCoord.create( data=data, frame=self.frame, axis_names=self.axes.names ) return coords def contains(self, coords): idx = self.coord_to_idx(coords) return np.all(np.stack([t != INVALID_INDEX.int for t in idx]), axis=0) def solid_angle(self): """Solid angle array as a `~astropy.units.Quantity` in ``sr``. The array has the same dimensionality as ``map.nside`` as all pixels have the same solid angle. """ import healpy as hp return Quantity(hp.nside2pixarea(self.nside), "sr") def __str__(self): lon, lat = self.center_skydir.data.lon.deg, self.center_skydir.data.lat.deg return ( f"{self.__class__.__name__}\n\n" f"\taxes : {self.axes_names}\n" f"\tshape : {self.data_shape[::-1]}\n" f"\tndim : {self.ndim}\n" f"\tnside : {self.nside[0]}\n" f"\tnested : {self.nest}\n" f"\tframe : {self.frame}\n" f"\tprojection : {self.projection}\n" f"\tcenter : {lon:.1f} deg, {lat:.1f} deg\n" ) def is_allclose(self, other, rtol_axes=1e-6, atol_axes=1e-6): """Compare two geometries for equivalency. Parameters ---------- other : `HpxGeom` Geometry to compare against. rtol_axes : float, optional Relative tolerance for axes comparison. Default is 1e-6. atol_axes : float, optional Absolute tolerance for axes comparison. Default is 1e-6. Returns ------- is_allclose : bool Whether the geometry is all close. 
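# --- Example: masking pixels with a sky region ---
# Illustrative sketch, not part of the library source; assumes gammapy 1.3,
# healpy and the `regions` package, using `region_mask` documented above.
import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import CircleSkyRegion
from gammapy.maps import HpxGeom

geom_mask = HpxGeom.create(nside=64, frame="icrs", region="DISK(83.63,22.01,5.)")
circle = CircleSkyRegion(center=SkyCoord(83.63, 22.01, unit="deg"), radius=1 * u.deg)
mask_example = geom_mask.region_mask([circle])  # boolean HEALPix map
exclusion_example = ~mask_example               # inverted, as described above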
""" if not isinstance(other, self.__class__): return TypeError(f"Cannot compare {type(self)} and {type(other)}") if self.is_allsky and not other.is_allsky: return False if self.data_shape != other.data_shape: return False axes_eq = self.axes.is_allclose(other.axes, rtol=rtol_axes, atol=atol_axes) hpx_eq = ( self.nside == other.nside and self.frame == other.frame and self.order == other.order and self.nest == other.nest ) return axes_eq and hpx_eq def __eq__(self, other): if not isinstance(other, self.__class__): return False return self.is_allclose(other=other) def __ne__(self, other): return not self.__eq__(other) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/hpx/io.py0000644000175100001770000000714414721316200016572 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy import html class HpxConv: """Data structure to define how a HEALPix map is stored to FITS.""" def __init__(self, convname, **kwargs): self.convname = convname self.colstring = kwargs.get("colstring", "CHANNEL") self.firstcol = kwargs.get("firstcol", 1) self.hduname = kwargs.get("hduname", "SKYMAP") self.bands_hdu = kwargs.get("bands_hdu", "EBOUNDS") self.quantity_type = kwargs.get("quantity_type", "integral") self.frame = kwargs.get("frame", "COORDSYS") def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def colname(self, indx): return f"{self.colstring}{indx}" @classmethod def create(cls, convname="gadf"): return copy.deepcopy(HPX_FITS_CONVENTIONS[convname]) @staticmethod def identify_hpx_format(header): """Identify the convention used to write this file.""" # Hopefully the file contains the HPX_CONV keyword specifying # the convention used if "HPX_CONV" in header: return header["HPX_CONV"].lower() # Try based on the EXTNAME keyword hduname = header.get("EXTNAME", None) if hduname == "HPXEXPOSURES": return "fgst-bexpcube" elif hduname == "SKYMAP2": if "COORDTYPE" in header.keys(): return "galprop" else: return "galprop2" elif hduname == "xtension": return "healpy" # Check the name of the first column colname = header["TTYPE1"] if colname == "PIX": colname = header["TTYPE2"] if colname == "KEY": return "fgst-srcmap-sparse" elif colname == "ENERGY1": return "fgst-template" elif colname == "COSBINS": return "fgst-ltcube" elif colname == "Bin0": return "galprop" elif colname == "CHANNEL1" or colname == "CHANNEL0": if hduname == "SKYMAP": return "fgst-ccube" else: return "fgst-srcmap" else: raise ValueError("Could not identify HEALPIX convention") HPX_FITS_CONVENTIONS = {} """Various conventions for storing HEALPIX maps in FITS files""" HPX_FITS_CONVENTIONS[None] = HpxConv("gadf", bands_hdu="BANDS") HPX_FITS_CONVENTIONS["gadf"] = HpxConv("gadf", bands_hdu="BANDS") HPX_FITS_CONVENTIONS["fgst-ccube"] = HpxConv("fgst-ccube") HPX_FITS_CONVENTIONS["fgst-ltcube"] = HpxConv( "fgst-ltcube", colstring="COSBINS", hduname="EXPOSURE", bands_hdu="CTHETABOUNDS" ) HPX_FITS_CONVENTIONS["fgst-bexpcube"] = HpxConv( "fgst-bexpcube", colstring="ENERGY", hduname="HPXEXPOSURES", bands_hdu="ENERGIES" ) HPX_FITS_CONVENTIONS["fgst-srcmap"] = HpxConv( "fgst-srcmap", hduname=None, quantity_type="differential" ) HPX_FITS_CONVENTIONS["fgst-template"] = HpxConv( "fgst-template", colstring="ENERGY", bands_hdu="ENERGIES" ) HPX_FITS_CONVENTIONS["fgst-srcmap-sparse"] = HpxConv( "fgst-srcmap-sparse", colstring=None, hduname=None, quantity_type="differential" ) HPX_FITS_CONVENTIONS["galprop"] = HpxConv( "galprop", colstring="Bin", hduname="SKYMAP2", bands_hdu="ENERGIES", quantity_type="differential", frame="COORDTYPE", ) HPX_FITS_CONVENTIONS["galprop2"] = HpxConv( "galprop", colstring="Bin", hduname="SKYMAP2", bands_hdu="ENERGIES", quantity_type="differential", ) HPX_FITS_CONVENTIONS["healpy"] = HpxConv("healpy", hduname=None, colstring=None) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/hpx/ndmap.py0000644000175100001770000011720514721316200017262 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord from astropy.io import fits from regions import PointSkyRegion import matplotlib.pyplot as plt from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.units import unit_from_fits_image_hdu from ..coord import MapCoord from ..geom import pix_tuple_to_idx from ..utils import INVALID_INDEX from .core import HpxMap from .geom import HpxGeom from .io import HPX_FITS_CONVENTIONS, HpxConv from .utils import HpxToWcsMapping, get_pix_size_from_nside, get_superpixels __all__ = ["HpxNDMap"] log = logging.getLogger(__name__) class HpxNDMap(HpxMap): """HEALPix map with any number of non-spatial dimensions. This class uses an N+1D numpy array to represent the sequence of HEALPix image planes. 
Following the convention of WCS-based maps this class uses a column-wise ordering for the data array with the spatial dimension being tied to the last index of the array. Parameters ---------- geom : `~gammapy.maps.HpxGeom` HEALPix geometry object. data : `~numpy.ndarray` HEALPix data array. If None, then an empty array will be allocated. meta : `dict` Dictionary to store metadata. unit : str or `~astropy.units.Unit` The map unit. """ def __init__(self, geom, data=None, dtype="float32", meta=None, unit=""): data_shape = geom.data_shape if data is None: data = self._make_default_data(geom, data_shape, dtype) super().__init__(geom, data, meta, unit) @staticmethod def _make_default_data(geom, shape_np, dtype): if geom.npix.size > 1: data = np.full(shape_np, np.nan, dtype=dtype) idx = geom.get_idx(local=True) data[idx[::-1]] = 0 else: data = np.zeros(shape_np, dtype=dtype) return data @classmethod def from_wcs_tiles(cls, wcs_tiles, nest=True): """Create HEALPix map from WCS tiles. Parameters ---------- wcs_tiles : list of `WcsNDMap` WCS map tiles. nest : bool, optional Indexing scheme. If True, "NESTED" scheme. If False, "RING" scheme. Default is True. Returns ------- hpx_map : `HpxNDMap` HEALPix map. """ import healpy as hp geom_wcs = wcs_tiles[0].geom geom_hpx = HpxGeom.create( binsz=geom_wcs.pixel_scales[0], frame=geom_wcs.frame, nest=nest, axes=geom_wcs.axes, ) map_hpx = cls.from_geom(geom=geom_hpx, unit=wcs_tiles[0].unit) coords = map_hpx.geom.get_coord().skycoord nside_superpix = hp.npix2nside(len(wcs_tiles)) hpx_ref = HpxGeom(nside=nside_superpix, nest=nest, frame=geom_wcs.frame) idx = np.arange(map_hpx.geom.to_image().npix.item()) indices = get_superpixels(idx, map_hpx.geom.nside, nside_superpix, nest=nest) for wcs_tile in wcs_tiles: hpx_idx = hpx_ref.coord_to_idx(wcs_tile.geom.center_skydir)[0] hpx_idx = int(hpx_idx.item()) mask = indices == hpx_idx map_hpx.data[mask] = wcs_tile.interp_by_coord(coords[mask]) return map_hpx def to_wcs_tiles( self, nside_tiles=4, margin="0 deg", method="nearest", oversampling_factor=1 ): """Convert HpxNDMap to a list of WCS tiles. Parameters ---------- nside_tiles : int, optional HEALPix NSIDE parameter for super pixel tiles. Default is 4. margin : Angle, optional Width margin of the WCS tile. Default is "0 deg". method : {'nearest', 'linear'} Interpolation method. Default is "nearest". oversampling_factor : int, optional Oversampling factor. Default is 1. Returns ------- wcs_tiles : list of `WcsNDMap` WCS tiles. """ wcs_tiles = [] wcs_geoms = self.geom.to_wcs_tiles(nside_tiles=nside_tiles, margin=margin) for geom in wcs_geoms: if oversampling_factor > 1: geom = geom.upsample(oversampling_factor) wcs_map = self.interp_to_geom(geom=geom, method=method) wcs_tiles.append(wcs_map) return wcs_tiles @classmethod def from_hdu(cls, hdu, hdu_bands=None, format=None, colname=None): """Make a HpxNDMap object from a FITS HDU. Parameters ---------- hdu : `~astropy.io.fits.BinTableHDU` The FITS HDU. hdu_bands : `~astropy.io.fits.BinTableHDU`, optional The BANDS table HDU. Default is None. format : str, optional FITS convention. Default is None. If None the format is guessed. The following formats are supported: - "gadf" - "fgst-ccube" - "fgst-ltcube" - "fgst-bexpcube" - "fgst-srcmap" - "fgst-template" - "fgst-srcmap-sparse" - "galprop" - "galprop2" colname : str, optional Data column name to be used for the HEALPix map. Default is None. Returns ------- map : `HpxMap` HEALPix map. 
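# --- Example: WCS tile round trip ---
# Illustrative sketch, not part of the library source; assumes gammapy 1.3 and
# healpy, using `to_wcs_tiles` / `from_wcs_tiles` documented above. Only
# all-sky maps with nside_tiles < nside are supported.
from gammapy.maps import HpxGeom, HpxNDMap

map_tiles = HpxNDMap.from_geom(HpxGeom.create(nside=8, frame="galactic"))
tiles_example = map_tiles.to_wcs_tiles(nside_tiles=2)  # one TAN tile per superpixel
map_roundtrip = HpxNDMap.from_wcs_tiles(tiles_example)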
""" if format is None: format = HpxConv.identify_hpx_format(hdu.header) geom = HpxGeom.from_header(hdu.header, hdu_bands, format=format) hpx_conv = HPX_FITS_CONVENTIONS[format] shape = geom.axes.shape[::-1] # TODO: Should we support extracting slices? meta = cls._get_meta_from_header(hdu.header) unit = unit_from_fits_image_hdu(hdu.header) map_out = cls(geom, None, meta=meta, unit=unit) colnames = hdu.columns.names cnames = [] if hdu.header.get("INDXSCHM", None) == "SPARSE": pix = hdu.data.field("PIX") vals = hdu.data.field("VALUE") if "CHANNEL" in hdu.data.columns.names: chan = hdu.data.field("CHANNEL") chan = np.unravel_index(chan, shape) idx = chan + (pix,) else: idx = (pix,) map_out.set_by_idx(idx[::-1], vals) else: if colname is not None: cnames.append(colname) else: for c in colnames: if c.find(hpx_conv.colstring) == 0: cnames.append(c) nbin = len(cnames) if nbin == 1: map_out.data = hdu.data.field(cnames[0]).reshape(geom.data_shape) else: for idx, cname in enumerate(cnames): # TODO: check whether reshaping is needed here. Is this tested? idx = np.unravel_index(idx, shape) map_out.data[idx + (slice(None),)] = hdu.data.field(cname) return map_out def to_wcs( self, sum_bands=False, normalize=True, proj="AIT", oversample=2, width_pix=None, hpx2wcs=None, fill_nan=True, ): from gammapy.maps import WcsNDMap if sum_bands and self.geom.nside.size > 1: map_sum = self.sum_over_axes() return map_sum.to_wcs( sum_bands=False, normalize=normalize, proj=proj, oversample=oversample, width_pix=width_pix, ) # FIXME: Check whether the old mapping is still valid and reuse it if hpx2wcs is None: geom_wcs_image = self.geom.to_wcs_geom( proj=proj, oversample=oversample, width_pix=width_pix ).to_image() hpx2wcs = HpxToWcsMapping.create(self.geom, geom_wcs_image) # FIXME: Need a function to extract a valid shape from npix property if sum_bands: axes = np.arange(self.data.ndim - 1) hpx_data = np.apply_over_axes(np.sum, self.data, axes=axes) hpx_data = np.squeeze(hpx_data) wcs_shape = tuple([t.flat[0] for t in hpx2wcs.npix]) wcs_data = np.zeros(wcs_shape).T wcs = hpx2wcs.wcs.to_image() else: hpx_data = self.data wcs_shape = tuple([t.flat[0] for t in hpx2wcs.npix]) + self.geom.shape_axes wcs_data = np.zeros(wcs_shape).T wcs = hpx2wcs.wcs.to_cube(self.geom.axes) # FIXME: Should reimplement instantiating map first and fill data array hpx2wcs.fill_wcs_map_from_hpx_data(hpx_data, wcs_data, normalize, fill_nan) return WcsNDMap(wcs, wcs_data, unit=self.unit) def _pad_spatial(self, pad_width, mode="constant", cval=0): geom = self.geom._pad_spatial(pad_width=pad_width) map_out = self._init_copy(geom=geom, data=None) map_out.coadd(self) coords = geom.get_coord(flat=True) m = self.geom.contains(coords) coords = tuple([c[~m] for c in coords]) if mode == "constant": map_out.set_by_coord(coords, cval) elif mode == "interp": raise ValueError("Method 'interp' not supported for HpxMap") else: raise ValueError(f"Unrecognized pad mode: {mode!r}") return map_out def crop(self, crop_width): geom = self.geom.crop(crop_width) map_out = self._init_copy(geom=geom, data=None) map_out.coadd(self) return map_out def upsample(self, factor, order=0, preserve_counts=True, axis_name=None): if axis_name: raise NotImplementedError( "HpxNDMap.upsample does currently not support upsampling of non-spatial axes." 
) if order != 0: raise ValueError( "HpxNDMap.upsample currently only supports nearest upsampling" ) geom = self.geom.upsample(factor) coords = geom.get_coord() data = self.get_by_coord(coords) if preserve_counts: data /= factor**2 return self._init_copy(geom=geom, data=data) def downsample(self, factor, preserve_counts=True, axis_name=None): if axis_name: raise NotImplementedError( "HpxNDMap does not currently support downsampling of non-spatial axes." ) geom = self.geom.downsample(factor) coords = self.geom.get_coord() vals = self.get_by_coord(coords) map_out = self._init_copy(geom=geom, data=None) map_out.fill_by_coord(coords, vals) if not preserve_counts: map_out.data /= factor**2 return map_out def to_nside(self, nside, preserve_counts=True): """Upsample or downsample the map to a given nside. Parameters ---------- nside : int HEALPix NSIDE parameter. preserve_counts : bool, optional Preserve the integral over each bin. This should be true if the map is an integral quantity (e.g. counts) and false if the map is a differential quantity (e.g. intensity). Default is True. Returns ------- geom : `~HpxNDMap` HEALPix map with new NSIDE. """ if len(self.geom.nside) > 1: raise NotImplementedError( "to_nside() is not supported for an irregular map." ) factor = nside / self.geom.nside.item() if factor > 1: return self.upsample(factor=int(factor), preserve_counts=preserve_counts) elif factor < 1: return self.downsample( factor=int(1 / factor), preserve_counts=preserve_counts ) else: return self.copy() def interp_by_coord(self, coords, method="linear", fill_value=None): # inherited docstring coords = MapCoord.create(coords, frame=self.geom.frame) if method == "linear": return self._interp_by_coord(coords) elif method == "nearest": return self.get_by_coord(coords) else: raise ValueError(f"Invalid interpolation method: {method!r}") def interp_by_pix(self, pix, method=None, fill_value=None): """Interpolate map values at the given pixel coordinates.""" raise NotImplementedError def cutout(self, position, width, *args, **kwargs): """Create a cutout around a given position. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. width : `~astropy.coordinates.Angle` or `~astropy.units.Quantity` Diameter of the circular cutout region. Returns ------- cutout : `~gammapy.maps.HpxNDMap` Cutout map. """ geom = self.geom.cutout(position=position, width=width) if self.geom.is_allsky: idx = geom._ipix else: idx = self.geom.to_image().global_to_local((geom._ipix,))[0] data = self.data[..., idx] return self.__class__(geom=geom, data=data, unit=self.unit, meta=self.meta) def stack(self, other, weights=None, nan_to_num=True): """Stack cutout into map. Parameters ---------- other : `HpxNDMap` Other map to stack. weights : `HpxNDMap`, optional Array to be used as weights. The spatial geometry must be equivalent to `other` and additional axes must be broadcastable. Default is None. nan_to_num: bool, optional Non-finite values are replaced by zero if True. Default is True. """ if self.geom == other.geom: idx = None elif self.geom.is_aligned(other.geom): if self.geom.is_allsky: idx = other.geom._ipix else: idx = self.geom.to_image().global_to_local((other.geom._ipix,))[0] else: raise ValueError( "Can only stack equivalent maps or cutout of the same map." 
) data = other.quantity.to_value(self.unit) if nan_to_num: not_finite = ~np.isfinite(data) if np.any(not_finite): data = data.copy() data[not_finite] = 0 if weights is not None: if not other.geom.to_image() == weights.geom.to_image(): raise ValueError("Incompatible spatial geoms between map and weights") data = data * weights.data if idx is None: self.data += data else: self.data[..., idx] += data def smooth(self, width, kernel="gauss"): """Smooth the map. Iterates over 2D image planes, processing one at a time. Parameters ---------- width : `~astropy.units.Quantity`, str or float Smoothing width given as quantity or float. If a float is given it is interpreted as smoothing width in pixels. If an (angular) quantity is given it is converted to pixels using `~healpy.nside2resol`. It corresponds to the standard deviation in case of a Gaussian kernel, and the radius in case of a disk kernel. kernel : {'gauss', 'disk'}, optional Kernel shape. Default is "gauss". Returns ------- image : `HpxNDMap` Smoothed image (a copy, the original object is unchanged). """ import healpy as hp if len(self.geom.nside) > 1: raise NotImplementedError("smooth is not supported for an irregular map.") nside = self.geom.nside.item() lmax = int(3 * nside - 1) # maximum l of the power spectrum ipix = self.geom._ipix if not self.geom.is_allsky: # stack into an all sky map full_sky_geom = HpxGeom.create( nside=self.geom.nside, nest=self.geom.nest, frame=self.geom.frame, axes=self.geom.axes, ) full_sky_map = HpxNDMap.from_geom(full_sky_geom) for img, idx in self.iter_by_image_data(): full_sky_map.data[idx][ipix] = img else: full_sky_map = self # The smoothing width is expected by healpy in radians if isinstance(width, (u.Quantity, str)): width = u.Quantity(width) width = width.to_value("rad") else: binsz = np.degrees(hp.nside2resol(nside)) width = width * binsz width = np.deg2rad(width) smoothed_data = np.empty(self.data.shape, dtype=float) for img, idx in full_sky_map.iter_by_image_data(): img = img.astype(float) if self.geom.nest: # reorder to ring to do the smoothing img = hp.pixelfunc.reorder(img, n2r=True) if kernel == "gauss": data = hp.sphtfunc.smoothing(img, sigma=width, pol=False, lmax=lmax) elif kernel == "disk": # create the step function in angular space theta = np.linspace(0, width) beam = np.ones(len(theta)) beam[theta > width] = 0 # convert to the spherical harmonics space window_beam = hp.sphtfunc.beam2bl(beam, theta, lmax) # normalize the window beam window_beam = window_beam / window_beam.max() data = hp.sphtfunc.smoothing( img, beam_window=window_beam, pol=False, lmax=lmax ) else: raise ValueError(f"Invalid kernel: {kernel!r}") if self.geom.nest: # reorder back to nest after the smoothing data = hp.pixelfunc.reorder(data, r2n=True) smoothed_data[idx] = data[ipix] return self._init_copy(data=smoothed_data) def convolve(self, kernel, convolution_method="wcs-tan", **kwargs): """Convolve map with a WCS kernel. Project the map into a WCS geometry, convolve with a WCS kernel and project back into the initial HEALPix geometry. If the kernel is two-dimensional, it is applied to all image planes likewise. If the kernel is higher dimensional it must match the map in the number of dimensions and the corresponding kernel is selected for every image plane. Parameters ---------- kernel : `~gammapy.irf.PSFKernel` Convolution kernel. The pixel size must be upsampled by a factor 2 or bigger with respect to the input map to prevent artifacts in the projection. convolution_method : {"wcs-tan", ""} Convolution method. 
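# --- Example: smoothing a HEALPix map ---
# Illustrative sketch, not part of the library source; assumes gammapy 1.3 and
# healpy, using `smooth` documented above.
from gammapy.maps import HpxGeom, HpxNDMap

map_smooth = HpxNDMap.from_geom(HpxGeom.create(nside=64, frame="galactic"))
map_smooth.data[0] = 1.0  # a single bright pixel
smoothed_example = map_smooth.smooth(width="1 deg", kernel="gauss")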
If "wcs-tan", project on WCS geometry and convolve with WCS kernel. See `~gammapy.maps.HpxNDMap.convolve_wcs`. If "", convolve map with a symmetrical WCS kernel. See `~gammapy.maps.HpxNDMap.convolve_full`. Default is "wcs-tan". **kwargs : dict Keyword arguments passed to `~gammapy.maps.WcsNDMap.convolve`. Returns ------- map : `HpxNDMap` Convolved map. """ if convolution_method == "wcs-tan": return self.convolve_wcs(kernel, **kwargs) elif convolution_method == "": return self.convolve_full(kernel) else: raise ValueError( f"Not a valid method for HPX convolution: {convolution_method}" ) def convolve_wcs(self, kernel, **kwargs): """Convolve map with a WCS kernel. Project the map into a WCS geometry, convolve with a WCS kernel and project back into the initial HEALPix geometry. If the kernel is two-dimensional, it is applied to all image planes likewise. If the kernel is higher dimensional should either match the map in the number of dimensions or the map must be an image (no non-spatial axes). In that case, the corresponding kernel is selected and applied to every image plane or to the single input image respectively. Parameters ---------- kernel : `~gammapy.irf.PSFKernel` Convolution kernel. The pixel size must be upsampled by a factor 2 or bigger with respect to the input map to prevent artifacts in the projection. **kwargs : dict Keyword arguments passed to `~gammapy.maps.WcsNDMap.convolve`. Returns ------- map : `HpxNDMap` Convolved map. """ # TODO: maybe go through `.to_wcs_tiles()` to make this work for allsky maps if self.geom.is_allsky: raise ValueError( "Convolution via WCS projection is not supported for allsky maps." ) if self.geom.width > 10 * u.deg: log.warning( "Convolution via WCS projection is not recommended for large " "maps (> 10 deg). Perhaps the method `convolve_full()` is more suited for " "this case." ) geom_kernel = kernel.psf_kernel_map.geom wcs_size = np.max(geom_kernel.to_image().pixel_scales.deg) hpx_size = get_pix_size_from_nside(self.geom.nside[0]) if wcs_size > 0.5 * hpx_size: raise ValueError( f"The kernel pixel size of {wcs_size} has to be smaller by at least" f" a factor 2 than the pixel size of the input map of {hpx_size}" ) geom_wcs = self.geom.to_wcs_geom(proj="TAN").to_image() hpx2wcs = HpxToWcsMapping.create( hpx=self.geom, wcs=geom_wcs.to_binsz(binsz=wcs_size) ) # Project to WCS and convolve wcs_map = self.to_wcs(hpx2wcs=hpx2wcs, fill_nan=False) conv_wcs_map = wcs_map.convolve(kernel=kernel, **kwargs) if self.geom.is_image and geom_kernel.ndim > 2: target_geom = self.geom.to_cube(geom_kernel.axes) else: target_geom = self.geom # and back to hpx data = np.zeros(target_geom.data_shape) data = hpx2wcs.fill_hpx_map_from_wcs_data( wcs_data=conv_wcs_map.data, hpx_data=data ) return HpxNDMap.from_geom(target_geom, data=data) def convolve_full(self, kernel): """Convolve map with a symmetrical WCS kernel. Extract the radial profile of the kernel (assuming radial symmetry) and convolve via `~healpy.sphtfunc.smoothing`. Since no projection is applied, this is suited for full-sky and large maps. If the kernel is two-dimensional, it is applied to all image planes likewise. If the kernel is higher dimensional it must match the map in the number of dimensions and the corresponding kernel is selected for every image plane. Parameters ---------- kernel : `~gammapy.irf.PSFKernel` Convolution kernel. The pixel size must be upsampled by a factor 2 or bigger with respect to the input map to prevent artifacts in the projection. 
Returns ------- map : `HpxNDMap` Convolved map. """ import healpy as hp if len(self.geom.nside) > 1: raise NotImplementedError( "convolve_full() is not supported for an irregular map." ) nside = self.geom.nside.item() lmax = int(3 * nside - 1) # maximum l of the power spectrum nest = self.geom.nest allsky = self.geom.is_allsky ipix = self.geom._ipix if not allsky: # stack into an all sky map full_sky_geom = HpxGeom.create( nside=self.geom.nside, nest=self.geom.nest, frame=self.geom.frame, axes=self.geom.axes, ) full_sky_map = HpxNDMap.from_geom(full_sky_geom) for img, idx in self.iter_by_image_data(): full_sky_map.data[idx][ipix] = img else: full_sky_map = self # Get radial profile from the kernel psf_kernel = kernel.psf_kernel_map center_pix = psf_kernel.geom.center_pix[:2] center = max(center_pix) dim = np.argmax(center_pix) pixels = [0, 0] pixels[dim] = np.linspace( 0, center, int(center + 1) ) # assuming radially symmetric kernel pixels[abs(1 - dim)] = center_pix[abs(1 - dim)] * np.ones(int(center + 1)) coords = psf_kernel.geom.pix_to_coord(pixels) coordinates = SkyCoord(coords[0], coords[1], frame=psf_kernel.geom.frame) angles = coordinates.separation(psf_kernel.geom.center_skydir).rad values = psf_kernel.get_by_pix(pixels) # Do the convolution in each image plane convolved_data = np.empty(self.data.shape, dtype=float) for img, idx in full_sky_map.iter_by_image_data(): img = img.astype(float) if nest: # reorder to ring to do the convolution img = hp.pixelfunc.reorder(img, n2r=True) radial_profile = np.reshape(values[:, idx], (values.shape[0],)) window_beam = hp.sphtfunc.beam2bl( np.flip(radial_profile), np.flip(angles), lmax ) window_beam = window_beam / window_beam.max() data = hp.sphtfunc.smoothing( img, beam_window=window_beam, pol=False, lmax=lmax ) if nest: # reorder back to nest after the convolution data = hp.pixelfunc.reorder(data, r2n=True) convolved_data[idx] = data[ipix] return self._init_copy(data=convolved_data) def get_by_idx(self, idx): # inherited docstring idx = pix_tuple_to_idx(idx) idx = self.geom.global_to_local(idx) return self.data.T[idx] def _interp_by_coord(self, coords): """Linearly interpolate map values.""" pix, wts = self.geom.interp_weights(coords) if self.geom.is_image: return np.sum(self.data.T[tuple(pix)] * wts, axis=0) val = np.zeros(pix[0].shape[1:]) # Loop over function values at corners for i in range(2 ** len(self.geom.axes)): pix_i = [] wt = np.ones(pix[0].shape[1:])[np.newaxis, ...] 
for j, ax in enumerate(self.geom.axes): idx = ax.coord_to_idx(coords[ax.name]) idx = np.clip(idx, 0, len(ax.center) - 2) w = ax.center[idx + 1] - ax.center[idx] c = u.Quantity( coords[ax.name], ax.center.unit, copy=COPY_IF_NEEDED ).value if i & (1 << j): wt *= (c - ax.center[idx].value) / w.value pix_i += [idx + 1] else: wt *= 1.0 - (c - ax.center[idx].value) / w.value pix_i += [idx] if not self.geom.is_regular: pix, wts = self.geom.interp_weights(coords, idxs=pix_i) wts[pix[0] == INVALID_INDEX.int] = 0 wt[~np.isfinite(wt)] = 0 val += np.nansum(wts * wt * self.data.T[tuple(pix[:1] + pix_i)], axis=0) return val def _resample_by_idx(self, idx, weights=None, preserve_counts=False): idx = pix_tuple_to_idx(idx) msk = np.all(np.stack([t != INVALID_INDEX.int for t in idx]), axis=0) if weights is not None: weights = weights[msk] idx = [t[msk] for t in idx] idx_local = list(self.geom.global_to_local(idx)) msk = idx_local[0] >= 0 idx_local = [t[msk] for t in idx_local] if weights is not None: if isinstance(weights, u.Quantity): weights = weights.to_value(self.unit) weights = weights[msk] idx_local = np.ravel_multi_index(idx_local, self.data.T.shape) idx_local, idx_inv = np.unique(idx_local, return_inverse=True) weights = np.bincount(idx_inv, weights=weights) if not preserve_counts: weights /= np.bincount(idx_inv).astype(self.data.dtype) self.data.T.flat[idx_local] += weights def fill_by_idx(self, idx, weights=None): return self._resample_by_idx(idx, weights=weights, preserve_counts=True) def set_by_idx(self, idx, vals): idx = pix_tuple_to_idx(idx) idx_local = self.geom.global_to_local(idx) self.data.T[idx_local] = vals def _make_cols(self, header, conv): shape = self.data.shape cols = [] if header["INDXSCHM"] == "SPARSE": data = self.data.copy() data[~np.isfinite(data)] = 0 nonzero = np.where(data > 0) value = data[nonzero].astype(float) pix = self.geom.local_to_global(nonzero[::-1])[0] if len(shape) == 1: cols.append(fits.Column("PIX", "J", array=pix)) cols.append(fits.Column("VALUE", "E", array=value)) else: channel = np.ravel_multi_index(nonzero[:-1], shape[:-1]) cols.append(fits.Column("PIX", "J", array=pix)) cols.append(fits.Column("CHANNEL", "I", array=channel)) cols.append(fits.Column("VALUE", "E", array=value)) elif len(shape) == 1: name = conv.colname(indx=conv.firstcol) array = self.data.astype(float) cols.append(fits.Column(name, "E", array=array)) else: for i, idx in enumerate(np.ndindex(shape[:-1])): name = conv.colname(indx=i + conv.firstcol) array = self.data[idx].astype(float) cols.append(fits.Column(name, "E", array=array)) return cols def to_swapped(self): import healpy as hp hpx_out = self.geom.to_swapped() map_out = self._init_copy(geom=hpx_out, data=None) idx = self.geom.get_idx(flat=True) vals = self.get_by_idx(idx) if self.geom.nside.size > 1: nside = self.geom.nside[idx[1:]] else: nside = self.geom.nside if self.geom.nest: idx_new = tuple([hp.nest2ring(nside, idx[0])]) + idx[1:] else: idx_new = tuple([hp.ring2nest(nside, idx[0])]) + idx[1:] map_out.set_by_idx(idx_new, vals) return map_out def to_region_nd_map(self, region, func=np.nansum, weights=None, method="nearest"): """Get region ND map in a given region. By default, the whole map region is considered. Parameters ---------- region: `~regions.Region` or `~astropy.coordinates.SkyCoord` Region. func : numpy.func, optional Function to reduce the data. Default is np.nansum. For boolean Map, use np.any or np.all. weights : `WcsNDMap`, optional Array to be used as weights. The geometry must be equivalent. Default is None. 
method : {"nearest", "linear"} How to interpolate if a position is given. Default is "neraest". Returns ------- spectrum : `~gammapy.maps.RegionNDMap` Spectrum in the given region. """ from gammapy.maps import RegionGeom, RegionNDMap if isinstance(region, SkyCoord): region = PointSkyRegion(region) if weights is not None: if not self.geom == weights.geom: raise ValueError("Incompatible spatial geoms between map and weights") geom = RegionGeom(region=region, axes=self.geom.axes) if isinstance(region, PointSkyRegion): coords = geom.get_coord() data = self.interp_by_coord(coords=coords, method=method) if weights is not None: data *= weights.interp_by_coord(coords=coords, method=method) else: cutout = self.cutout(position=geom.center_skydir, width=np.max(geom.width)) if weights is not None: weights_cutout = weights.cutout( position=geom.center_skydir, width=geom.width ) cutout.data *= weights_cutout.data mask = cutout.geom.to_image().region_mask([region]).data data = func(cutout.data[..., mask], axis=-1) return RegionNDMap(geom=geom, data=data, unit=self.unit, meta=self.meta.copy()) def plot( self, method="raster", ax=None, normalize=False, proj="AIT", oversample=2, width_pix=1000, **kwargs, ): """Quickplot method. This will generate a visualization of the map by converting to a rasterized WCS image (method='raster') or drawing polygons for each pixel (method='poly'). Parameters ---------- method : {'raster','poly'} Method for mapping HEALPix pixels to a two-dimensional image. Can be set to 'raster' (rasterization to cartesian image plane) or 'poly' (explicit polygons for each pixel). WARNING: The 'poly' method is much slower than 'raster' and only suitable for maps with less than ~10k pixels. Default is "raster". proj : string, optional Any valid WCS projection type. Default is "AIT". oversample : float, optional Oversampling factor for WCS map. This will be the approximate ratio of the width of a HPX pixel to a WCS pixel. If this parameter is None then the width will be set from ``width_pix``. Default is 2. width_pix : int, optional Width of the WCS geometry in pixels. The pixel size will be set to the number of pixels satisfying ``oversample`` or ``width_pix`` whichever is smaller. If this parameter is None then the width will be set from ``oversample``. Default is 1000. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.imshow`. Returns ------- ax : `~astropy.visualization.wcsaxes.WCSAxes` WCS axes object. """ if method == "raster": m = self.to_wcs( sum_bands=True, normalize=normalize, proj=proj, oversample=oversample, width_pix=width_pix, ) return m.plot(ax, **kwargs) elif method == "poly": return self._plot_poly(proj=proj, ax=ax) else: raise ValueError(f"Invalid method: {method!r}") def _plot_poly(self, proj="AIT", step=1, ax=None): """Plot the map using a collection of polygons. Parameters ---------- proj : string, optional Any valid WCS projection type. Default is "AIT". step : int, optional Set the number vertices that will be computed for each pixel in multiples of 4. Default is 1. ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. 
""" # FIXME: At the moment this only works for all-sky maps if the # projection is centered at (0,0) # FIXME: Figure out how to force a square aspect-ratio like imshow import healpy as hp from matplotlib.collections import PatchCollection from matplotlib.patches import Polygon wcs = self.geom.to_wcs_geom(proj=proj, oversample=1) if ax is None: fig = plt.gcf() ax = fig.add_subplot(111, projection=wcs.wcs, aspect="equal") wcs_lonlat = wcs.center_coord[:2] idx = self.geom.get_idx() vtx = hp.boundaries( self.geom.nside.item(), idx[0], nest=self.geom.nest, step=step ) theta, phi = hp.vec2ang(np.rollaxis(vtx, 2)) theta = theta.reshape((4 * step, -1)).T phi = phi.reshape((4 * step, -1)).T patches = [] data = [] def get_angle(x, t): return 180.0 - (180.0 - x + t) % 360.0 for i, (x, y) in enumerate(zip(phi, theta)): lon, lat = np.degrees(x), np.degrees(np.pi / 2.0 - y) # Add a small offset to avoid vertices wrapping to the # other size of the projection if get_angle(np.median(lon), wcs_lonlat[0].to_value("deg")) > 0: idx = wcs.coord_to_pix((lon - 1e-4, lat)) else: idx = wcs.coord_to_pix((lon + 1e-4, lat)) dist = np.max(np.abs(idx[0][0] - idx[0])) # Split pixels that wrap around the edges of the projection if dist > wcs.npix[0] / 1.5: lon, lat = np.degrees(x), np.degrees(np.pi / 2.0 - y) lon0 = lon - 1e-4 lon1 = lon + 1e-4 pix0 = wcs.coord_to_pix((lon0, lat)) pix1 = wcs.coord_to_pix((lon1, lat)) idx0 = np.argsort(pix0[0]) idx1 = np.argsort(pix1[0]) pix0 = (pix0[0][idx0][:3], pix0[1][idx0][:3]) pix1 = (pix1[0][idx1][1:], pix1[1][idx1][1:]) patches.append(Polygon(np.vstack((pix0[0], pix0[1])).T, closed=True)) patches.append(Polygon(np.vstack((pix1[0], pix1[1])).T, closed=True)) data.append(self.data[i]) data.append(self.data[i]) else: polygon = Polygon(np.vstack((idx[0], idx[1])).T, closed=True) patches.append(polygon) data.append(self.data[i]) p = PatchCollection(patches, linewidths=0, edgecolors="None") p.set_array(np.array(data)) ax.add_collection(p) ax.autoscale_view() ax.coords.grid(color="w", linestyle=":", linewidth=0.5) return ax def plot_mask( self, method="raster", ax=None, proj="AIT", oversample=2, width_pix=1000, **kwargs, ): """Plot the mask as a shaded area. Parameters ---------- method : {'raster','poly'} Method for mapping HEALPix pixels to a two-dimensional image. Can be set to 'raster' (rasterization to cartesian image plane) or 'poly' (explicit polygons for each pixel). WARNING: The 'poly' method is much slower than 'raster' and only suitable for maps with less than ~10k pixels. Default is "raster". proj : string, optional Any valid WCS projection type. Default is "AIT". oversample : float, optional Oversampling factor for WCS map. This will be the approximate ratio of the width of a HPX pixel to a WCS pixel. If this parameter is None then the width will be set from ``width_pix``. Default is 2. width_pix : int, optional Width of the WCS geometry in pixels. The pixel size will be set to the number of pixels satisfying ``oversample`` or ``width_pix`` whichever is smaller. If this parameter is None then the width will be set from ``oversample``. Default is 1000. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.imshow`. Returns ------- ax : `~astropy.visualization.wcsaxes.WCSAxes` WCS axis object. """ if not self.is_mask: raise ValueError( "`.plot_mask()` only supports maps containing boolean values." 
) if method == "raster": m = self.to_wcs( sum_bands=True, normalize=False, proj=proj, oversample=oversample, width_pix=width_pix, ) m.data = np.nan_to_num(m.data).astype(bool) return m.plot_mask(ax=ax, **kwargs) else: raise ValueError(f"Invalid method: {method!r}") def sample_coord(self, n_events, random_state=0): raise NotImplementedError("HpxNDMap.sample_coord is not implemented yet.") ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2206419 gammapy-1.3/gammapy/maps/hpx/tests/0000755000175100001770000000000014721316215016753 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/hpx/tests/__init__.py0000644000175100001770000000010014721316200021055 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/hpx/tests/test_geom.py0000644000175100001770000006474014721316200021310 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import SkyCoord from astropy.io import fits from regions import CircleSkyRegion from gammapy.maps import HpxGeom, MapAxis, MapCoord from gammapy.maps.hpx.utils import ( HpxToWcsMapping, get_pix_size_from_nside, get_subpixels, get_superpixels, nside_to_order, ravel_hpx_index, unravel_hpx_index, ) pytest.importorskip("healpy") hpx_allsky_test_geoms = [ # 2D All-sky (8, False, "galactic", None, None), # 3D All-sky (8, False, "galactic", None, [MapAxis(np.logspace(0.0, 3.0, 4))]), # 3D All-sky w/ variable pixel size ([2, 4, 8], False, "galactic", None, [MapAxis(np.logspace(0.0, 3.0, 4))]), # 4D All-sky ( 8, False, "galactic", None, [ MapAxis(np.logspace(0.0, 3.0, 3), name="axis0"), MapAxis(np.logspace(0.0, 2.0, 4), name="axis1"), ], ), ] hpx_partialsky_test_geoms = [ # 2D Partial-sky (8, False, "galactic", "DISK(110.,75.,10.)", None), # 3D Partial-sky (8, False, "galactic", "DISK(110.,75.,10.)", [MapAxis(np.logspace(0.0, 3.0, 4))]), # 3D Partial-sky w/ variable pixel size ( [8, 16, 32], False, "galactic", "DISK(110.,75.,10.)", [MapAxis(np.logspace(0.0, 3.0, 4))], ), # 4D Partial-sky w/ variable pixel size ( [[8, 16, 32], [8, 8, 16]], False, "galactic", "DISK(110.,75.,10.)", [ MapAxis(np.logspace(0.0, 3.0, 3), name="axis0"), MapAxis(np.logspace(0.0, 2.0, 4), name="axis1"), ], ), ] hpx_test_geoms = hpx_allsky_test_geoms + hpx_partialsky_test_geoms def make_test_coords(geom, lon, lat): coords = [lon, lat] + [ax.center for ax in geom.axes] coords = np.meshgrid(*coords) coords = tuple([np.ravel(t) for t in coords]) return MapCoord.create(coords) def test_unravel_hpx_index(): npix = np.array([2, 7]) assert_allclose( unravel_hpx_index(np.array([0, 4]), npix), (np.array([0, 2]), np.array([0, 1])) ) npix = np.array([[2, 7], [3, 1]]) assert_allclose( unravel_hpx_index(np.array([0, 3, 10]), npix), (np.array([0, 1, 1]), np.array([0, 0, 1]), np.array([0, 1, 0])), ) def test_ravel_hpx_index(): npix = np.array([2, 7]) idx = (np.array([0, 2]), np.array([0, 1])) assert_allclose(ravel_hpx_index(idx, npix), np.array([0, 4])) npix = np.array([[2, 7], [3, 1]]) idx = (np.array([0, 1, 1]), np.array([0, 0, 1]), np.array([0, 1, 0])) assert_allclose(ravel_hpx_index(idx, npix), np.array([0, 3, 10])) def make_test_nside(nside, nside0, nside1): npix = 12 * nside**2
nside_test = np.concatenate( (nside0 * np.ones(npix // 2, dtype=int), nside1 * np.ones(npix // 2, dtype=int)) ) return nside_test @pytest.mark.parametrize( ("nside_subpix", "nside_superpix", "nest"), [ (4, 2, True), (8, 2, True), (8, make_test_nside(8, 4, 2), True), (4, 2, False), (8, 2, False), (8, make_test_nside(8, 4, 2), False), ], ) def test_get_superpixels(nside_subpix, nside_superpix, nest): import healpy as hp npix = 12 * nside_subpix**2 subpix = np.arange(npix) ang_subpix = hp.pix2ang(nside_subpix, subpix, nest=nest) superpix = get_superpixels(subpix, nside_subpix, nside_superpix, nest=nest) pix1 = hp.ang2pix(nside_superpix, *ang_subpix, nest=nest) assert_allclose(superpix, pix1) subpix = subpix.reshape((12, -1)) if not np.isscalar(nside_subpix): nside_subpix = nside_subpix.reshape((12, -1)) if not np.isscalar(nside_superpix): nside_superpix = nside_superpix.reshape((12, -1)) ang_subpix = hp.pix2ang(nside_subpix, subpix, nest=nest) superpix = get_superpixels(subpix, nside_subpix, nside_superpix, nest=nest) pix1 = hp.ang2pix(nside_superpix, *ang_subpix, nest=nest) assert_allclose(superpix, pix1) @pytest.mark.parametrize( ("nside_superpix", "nside_subpix", "nest"), [(2, 4, True), (2, 8, True), (2, 4, False), (2, 8, False)], ) def test_get_subpixels(nside_superpix, nside_subpix, nest): import healpy as hp npix = 12 * nside_superpix**2 superpix = np.arange(npix) subpix = get_subpixels(superpix, nside_superpix, nside_subpix, nest=nest) ang1 = hp.pix2ang(nside_subpix, subpix, nest=nest) pix1 = hp.ang2pix(nside_superpix, *ang1, nest=nest) assert np.all(superpix[..., None] == pix1) superpix = superpix.reshape((12, -1)) subpix = get_subpixels(superpix, nside_superpix, nside_subpix, nest=nest) ang1 = hp.pix2ang(nside_subpix, subpix, nest=nest) pix1 = hp.ang2pix(nside_superpix, *ang1, nest=nest) assert np.all(superpix[..., None] == pix1) pix1 = get_superpixels(subpix, nside_subpix, nside_superpix, nest=nest) assert np.all(superpix[..., None] == pix1) def test_hpx_global_to_local(): ax0 = np.linspace(0.0, 1.0, 3) ax1 = np.linspace(0.0, 1.0, 3) # 2D All-sky hpx = HpxGeom(16, False, "galactic") assert_allclose(hpx.global_to_local(0, ravel=True), np.array([0])) assert_allclose(hpx.global_to_local(633, ravel=True), np.array([633])) assert_allclose(hpx.global_to_local((0, 633), ravel=True), np.array([0, 633])) assert_allclose( hpx.global_to_local(np.array([0, 633]), ravel=True), np.array([0, 633]) ) # 3D All-sky hpx = HpxGeom(16, False, "galactic", axes=[ax0]) assert_allclose( hpx.global_to_local((np.array([177, 177]), np.array([0, 1])), ravel=True), np.array([177, 177 + 3072]), ) # 2D Partial-sky hpx = HpxGeom(64, False, "galactic", region="DISK(110.,75.,2.)") assert_allclose( hpx.global_to_local((0, 633, 706), ravel=True), np.array([-1, 0, 2]) ) # 3D Partial-sky hpx = HpxGeom(64, False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0]) assert_allclose(hpx.global_to_local(633, ravel=True), np.array([0])) assert_allclose(hpx.global_to_local(49859, ravel=True), np.array([19])) assert_allclose( hpx.global_to_local((0, 633, 706, 49859, 49935), ravel=True), np.array([-1, 0, 2, 19, 21]), ) assert_allclose( hpx.global_to_local(np.array([0, 633, 706, 49859, 49935]), ravel=True), np.array([-1, 0, 2, 19, 21]), ) idx_global = (np.array([0, 633, 706, 707, 783]), np.array([0, 0, 0, 1, 1])) assert_allclose(hpx.global_to_local(idx_global, ravel=True), [-1, 0, 2, 19, 21]) # 3D Partial-sky w/ variable bin size hpx = HpxGeom([32, 64], False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0]) 
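# Note: with variable nside per band, flattened global indices offset each band by the all-sky pixel count of its nside, so 12995 = 12 * 32**2 + 707 selects pixel 707 of the nside=64 band, which is local index 6 in the assertions below.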
assert_allclose(hpx.global_to_local(191, ravel=True), [0]) assert_allclose(hpx.global_to_local(12995, ravel=True), [6]) assert_allclose( hpx.global_to_local((0, 191, 233, 12995), ravel=True), [-1, 0, 2, 6] ) idx_global = (np.array([0, 191, 233, 707]), np.array([0, 0, 0, 1])) assert_allclose( hpx.global_to_local(idx_global, ravel=True), np.array([-1, 0, 2, 6]), ) # 4D Partial-sky w/ variable bin size hpx = HpxGeom( [[16, 32], [32, 64]], False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0, ax1], ) assert_allclose(hpx.global_to_local(3263, ravel=True), [1]) assert_allclose(hpx.global_to_local(28356, ravel=True), [11]) idx_global = (np.array([46]), np.array([0]), np.array([0])) assert_allclose(hpx.global_to_local(idx_global, ravel=True), [0]) @pytest.mark.parametrize( ("nside", "nested", "frame", "region", "axes"), hpx_allsky_test_geoms ) def test_hpxgeom_init_with_pix(nside, nested, frame, region, axes): geom = HpxGeom(nside, nested, frame, region=region, axes=axes) idx0 = geom.get_idx(flat=True) idx1 = tuple([t[::10] for t in idx0]) geom = HpxGeom(nside, nested, frame, region=idx0, axes=axes) assert_allclose(idx0, geom.get_idx(flat=True)) assert_allclose(len(idx0[0]), np.sum(geom.npix)) geom = HpxGeom(nside, nested, frame, region=idx1, axes=axes) assert_allclose(idx1, geom.get_idx(flat=True)) assert_allclose(len(idx1[0]), np.sum(geom.npix)) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxgeom_get_pix(nside, nested, frame, region, axes): geom = HpxGeom(nside, nested, frame, region=region, axes=axes) idx = geom.get_idx(local=False, flat=True) idx_local = geom.get_idx(local=True, flat=True) assert_allclose(idx, geom.local_to_global(idx_local)) if axes is not None: idx_img = geom.get_idx(local=False, idx=tuple([1] * len(axes)), flat=True) idx_img_local = geom.get_idx(local=True, idx=tuple([1] * len(axes)), flat=True) assert_allclose(idx_img, geom.local_to_global(idx_img_local)) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxgeom_coord_to_idx(nside, nested, frame, region, axes): import healpy as hp geom = HpxGeom(nside, nested, frame, region=region, axes=axes) lon = np.array([112.5, 135.0, 105.0]) lat = np.array([75.3, 75.3, 74.6]) coords = make_test_coords(geom, lon, lat) zidx = tuple([ax.coord_to_idx(t) for t, ax in zip(coords[2:], geom.axes)]) if geom.nside.size > 1: nside = geom.nside[zidx] else: nside = geom.nside phi, theta = coords.phi, coords.theta idx = geom.coord_to_idx(coords) assert_allclose(hp.ang2pix(nside, theta, phi), idx[0]) for i, z in enumerate(zidx): assert_allclose(z, idx[i + 1]) # Test w/ coords outside the geometry lon = np.array([0.0, 5.0, 10.0]) lat = np.array([75.3, 75.3, 74.6]) coords = make_test_coords(geom, lon, lat) zidx = [ax.coord_to_idx(t) for t, ax in zip(coords[2:], geom.axes)] idx = geom.coord_to_idx(coords) if geom.region is not None: assert_allclose(np.full_like(coords[0], -1, dtype=int), idx[0]) idx = geom.coord_to_idx(coords, clip=True) assert np.all(np.not_equal(np.full_like(coords[0], -1, dtype=int), idx[0])) def test_hpxgeom_coord_to_pix(): lon = np.array([110.25, 114.0, 105.0]) lat = np.array([75.3, 75.3, 74.6]) z0 = np.array([0.5, 1.5, 2.5]) z1 = np.array([3.5, 4.5, 5.5]) ax0 = np.linspace(0.0, 3.0, 4) ax1 = np.linspace(3.0, 6.0, 4) pix64 = np.array([784, 785, 864]) # 2D all-sky coords = (lon, lat) hpx = HpxGeom(64, False, "galactic") assert_allclose(hpx.coord_to_pix(coords)[0], pix64) # 2D partial-sky coords = (lon, lat) hpx = HpxGeom(64, 
False, "galactic", region="DISK(110.,75.,2.)") assert_allclose(hpx.coord_to_pix(coords)[0], pix64) # 3D partial-sky coords = (lon, lat, z0) hpx = HpxGeom(64, False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0]) assert_allclose(hpx.coord_to_pix(coords), (pix64, np.array([0, 1, 2]))) # 3D partial-sky w/ variable bin size coords = (lon, lat, z0) nside = [16, 32, 64] hpx_bins = [ HpxGeom(n, False, "galactic", region="DISK(110.,75.,2.)") for n in nside ] hpx = HpxGeom(nside, False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0]) for i, (x, y, z) in enumerate(np.vstack(coords).T): pix0 = hpx.coord_to_pix((np.array([x]), np.array([y]), np.array([z]))) pix1 = hpx_bins[i].coord_to_pix((np.array([x]), np.array([y]))) assert_allclose(pix0[0], pix1[0]) # 4D partial-sky coords = (lon, lat, z0, z1) hpx = HpxGeom(64, False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0, ax1]) assert_allclose( hpx.coord_to_pix(coords), (pix64, np.array([0, 1, 2]), np.array([0, 1, 2])) ) def test_hpx_nside_to_order(): assert_allclose(nside_to_order(64), np.array([6])) assert_allclose( nside_to_order(np.array([10, 32, 42, 64, 128, 256])), np.array([-1, 5, -1, 6, 7, 8]), ) order = np.linspace(1, 10, 10).astype(int) nside = 2**order assert_allclose(nside_to_order(nside), order) assert_allclose(nside_to_order(nside).reshape((2, 5)), order.reshape((2, 5))) def test_hpx_get_pix_size_from_nside(): assert_allclose( get_pix_size_from_nside(np.array([1, 2, 4])), np.array([32.0, 16.0, 8.0]) ) def test_hpx_get_hpxregion_size(): geom = HpxGeom.create(nside=128, region="DISK(110.,75.,2.)") assert_allclose(geom.width, 2.0 * u.deg) def test_hpxgeom_get_hpxregion_dir(): geom = HpxGeom.create(nside=128, region="DISK(110.,75.,2.)", frame="galactic") refdir = geom.center_skydir assert_allclose(refdir.l.deg, 110.0) assert_allclose(refdir.b.deg, 75.0) geom = HpxGeom.create(nside=128, frame="galactic") refdir = geom.center_skydir assert_allclose(refdir.l.deg, 0.0) assert_allclose(refdir.b.deg, 0.0) def test_hpxgeom_make_wcs(): ax0 = np.linspace(0.0, 3.0, 4) hpx = HpxGeom(64, False, "galactic", region="DISK(110.,75.,2.)") wcs = hpx.to_wcs_geom() assert_allclose(wcs.wcs.wcs.crval, np.array([110.0, 75.0])) hpx = HpxGeom(64, False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0]) wcs = hpx.to_wcs_geom() assert_allclose(wcs.wcs.wcs.crval, np.array([110.0, 75.0])) def test_hpxgeom_get_coord(): ax0 = np.linspace(0.0, 3.0, 4) # 2D all-sky hpx = HpxGeom(16, False, "galactic") c = hpx.get_coord() assert_allclose(c[0][:3], np.array([45.0, 135.0, 225.0])) assert_allclose(c[1][:3], np.array([87.075819, 87.075819, 87.075819])) # 3D all-sky hpx = HpxGeom(16, False, "galactic", axes=[ax0]) c = hpx.get_coord() assert_allclose(c[0][0, :3], np.array([45.0, 135.0, 225.0])) assert_allclose(c[1][0, :3], np.array([87.075819, 87.075819, 87.075819])) assert_allclose(c[2][0, :3], np.array([0.5, 0.5, 0.5])) # 2D partial-sky hpx = HpxGeom(64, False, "galactic", region="DISK(110.,75.,2.)") c = hpx.get_coord() assert_allclose(c[0][:3], np.array([107.5, 112.5, 106.57894737])) assert_allclose(c[1][:3], np.array([76.813533, 76.813533, 76.07742])) # 3D partial-sky hpx = HpxGeom(64, False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0]) c = hpx.get_coord() assert_allclose(c[0][0, :3], np.array([107.5, 112.5, 106.57894737])) assert_allclose(c[1][0, :3], np.array([76.813533, 76.813533, 76.07742])) assert_allclose(c[2][0, :3], np.array([0.5, 0.5, 0.5])) # 3D partial-sky w/ variable bin size hpx = HpxGeom( [16, 32, 64], False, "galactic", 
region="DISK(110.,75.,2.)", axes=[ax0] ) c = hpx.get_coord(flat=True) assert_allclose(c[0][:3], np.array([117.0, 103.5, 112.5])) assert_allclose(c[1][:3], np.array([75.340734, 75.340734, 75.340734])) assert_allclose(c[2][:3], np.array([0.5, 1.5, 1.5])) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxgeom_contains(nside, nested, frame, region, axes): geom = HpxGeom(nside, nested, frame, region=region, axes=axes) coords = geom.get_coord(flat=True) assert_allclose(geom.contains(coords), np.ones_like(coords[0], dtype=bool)) if axes is not None: coords = [c[0] for c in coords[:2]] + [ax.edges[-1] + 1.0 for ax in axes] assert_allclose(geom.contains(coords), np.zeros((1,), dtype=bool)) if geom.region is not None: coords = [0.0, 0.0] + [ax.center[0] for ax in geom.axes] assert_allclose(geom.contains(coords), np.zeros((1,), dtype=bool)) def test_make_hpx_to_wcs_mapping(): ax0 = np.linspace(0.0, 1.0, 3) hpx = HpxGeom(16, False, "galactic", region="DISK(110.,75.,2.)") # FIXME construct explicit WCS projection here wcs = hpx.to_wcs_geom() hpx2wcs = HpxToWcsMapping.create(hpx, wcs) assert_allclose( hpx2wcs.ipix, np.array( [ 67, 46, 46, 46, 46, 29, 67, 67, 46, 46, 46, 46, 67, 67, 67, 46, 46, 46, 67, 67, 67, 28, 28, 28, 45, 45, 45, 45, 28, 28, 66, 45, 45, 45, 45, 28, ] ), ) assert_allclose( hpx2wcs.mult_val, np.array( [ 0.11111111, 0.09090909, 0.09090909, 0.09090909, 0.09090909, 1.0, 0.11111111, 0.11111111, 0.09090909, 0.09090909, 0.09090909, 0.09090909, 0.11111111, 0.11111111, 0.11111111, 0.09090909, 0.09090909, 0.09090909, 0.11111111, 0.11111111, 0.11111111, 0.16666667, 0.16666667, 0.16666667, 0.125, 0.125, 0.125, 0.125, 0.16666667, 0.16666667, 1.0, 0.125, 0.125, 0.125, 0.125, 0.16666667, ] ), ) hpx = HpxGeom([8, 16], False, "galactic", region="DISK(110.,75.,2.)", axes=[ax0]) hpx2wcs = HpxToWcsMapping.create(hpx, wcs) assert_allclose( hpx2wcs.ipix, np.array( [ [ 15, 6, 6, 6, 6, 6, 15, 15, 6, 6, 6, 6, 15, 15, 15, 6, 6, 6, 15, 15, 15, 6, 6, 6, 15, 15, 15, 15, 6, 6, 15, 15, 15, 15, 15, 6, ], [ 67, 46, 46, 46, 46, 29, 67, 67, 46, 46, 46, 46, 67, 67, 67, 46, 46, 46, 67, 67, 67, 28, 28, 28, 45, 45, 45, 45, 28, 28, 66, 45, 45, 45, 45, 28, ], ] ), ) def test_hpxgeom_from_header(): pars = { "HPX_REG": "DISK(110.,75.,2.)", "EXTNAME": "SKYMAP", "NSIDE": 2**6, "ORDER": 6, "PIXTYPE": "HEALPIX", "ORDERING": "RING", "COORDSYS": "CEL", "TTYPE1": "PIX", "TFORM1": "K", "TTYPE2": "CHANNEL1", "TFORM2": "D", "INDXSCHM": "EXPLICIT", } header = fits.Header() header.update(pars) hpx = HpxGeom.from_header(header) assert hpx.frame == "icrs" assert not hpx.nest assert_allclose(hpx.nside, np.array([64])) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxgeom_read_write(tmp_path, nside, nested, frame, region, axes): geom0 = HpxGeom(nside, nested, frame, region=region, axes=axes) hdu_bands = geom0.to_bands_hdu(hdu_bands="TEST_BANDS") hdu_prim = fits.PrimaryHDU() hdu_prim.header.update(geom0.to_header()) hdulist = fits.HDUList([hdu_prim, hdu_bands]) hdulist.writeto(tmp_path / "tmp.fits") with fits.open(tmp_path / "tmp.fits", memmap=False) as hdulist: geom1 = HpxGeom.from_header(hdulist[0].header, hdulist["TEST_BANDS"]) assert_allclose(geom0.nside, geom1.nside) assert_allclose(geom0.npix, geom1.npix) assert_allclose(geom0.nest, geom1.nest) assert geom0.frame == geom1.frame @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxgeom_upsample(nside, nested, frame, region, axes): # NESTED 
geom = HpxGeom(nside, True, frame, region=region, axes=axes) geom_up = geom.upsample(2) assert_allclose(2 * geom.nside, geom_up.nside) assert_allclose(4 * geom.npix, geom_up.npix) coords = geom_up.get_coord(flat=True) assert np.all(geom.contains(coords)) # RING geom = HpxGeom(nside, False, frame, region=region, axes=axes) geom_up = geom.upsample(2) assert_allclose(2 * geom.nside, geom_up.nside) assert_allclose(4 * geom.npix, geom_up.npix) coords = geom_up.get_coord(flat=True) assert np.all(geom.contains(coords)) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxgeom_downsample(nside, nested, frame, region, axes): # NESTED geom = HpxGeom(nside, True, frame, region=region, axes=axes) geom_down = geom.downsample(2) assert_allclose(geom.nside, 2 * geom_down.nside) coords = geom.get_coord(flat=True) assert np.all(geom_down.contains(coords)) # RING geom = HpxGeom(nside, False, frame, region=region, axes=axes) geom_down = geom.downsample(2) assert_allclose(geom.nside, 2 * geom_down.nside) coords = geom.get_coord(flat=True) assert np.all(geom_down.contains(coords)) def test_hpxgeom_solid_angle(): geom = HpxGeom.create( nside=8, frame="galactic", axes=[MapAxis.from_edges([0, 2, 3])] ) solid_angle = geom.solid_angle() assert solid_angle.shape == (1,) assert_allclose(solid_angle.value, 0.016362461737446838) def test_hpxgeom_pixel_scales(): geom = HpxGeom.create( nside=8, frame="galactic", axes=[MapAxis.from_edges([0, 2, 3])] ) pixel_scales = geom.pixel_scales assert_allclose(pixel_scales, [4] * u.deg) def test_hpx_geom_cutout(): geom = HpxGeom.create( nside=8, frame="galactic", axes=[MapAxis.from_edges([0, 2, 3])] ) cutout = geom.cutout(position=SkyCoord("0d", "0d"), width=30 * u.deg) assert cutout.nside == 8 assert cutout.data_shape == (2, 14) assert cutout.data_shape_axes == (2, 1) center = cutout.center_skydir.icrs assert_allclose(center.ra.deg, 0, atol=1e-8) assert_allclose(center.dec.deg, 0, atol=1e-8) def test_hpx_geom_is_aligned(): geom = HpxGeom.create(nside=8, frame="galactic") assert geom.is_aligned(geom) cutout = geom.cutout(position=SkyCoord("0d", "0d"), width=30 * u.deg) assert cutout.is_aligned(geom) geom_other = HpxGeom.create(nside=4, frame="galactic") assert not geom.is_aligned(geom_other) geom_other = HpxGeom.create(nside=8, frame="galactic", nest=False) assert not geom.is_aligned(geom_other) geom_other = HpxGeom.create(nside=8, frame="icrs") assert not geom.is_aligned(geom_other) def test_hpx_geom_to_wcs_tiles(): geom = HpxGeom.create( nside=8, frame="galactic", axes=[MapAxis.from_edges([0, 2, 3])] ) tiles = geom.to_wcs_tiles(nside_tiles=2) assert len(tiles) == 48 assert tiles[0].projection == "TAN" assert_allclose(tiles[0].width, [[43.974226], [43.974226]] * u.deg) tiles = geom.to_wcs_tiles(nside_tiles=4) assert len(tiles) == 192 assert tiles[0].projection == "TAN" assert_allclose(tiles[0].width, [[21.987113], [21.987113]] * u.deg) def test_geom_str(): geom = HpxGeom(nside=8) assert geom.__class__.__name__ in str(geom) assert "nside" in str(geom) hpx_equality_test_geoms = [ (16, False, "galactic", None, True), (16, True, "galactic", None, False), (8, False, "galactic", None, False), (16, False, "icrs", None, False), ] @pytest.mark.parametrize( ("nside", "nested", "frame", "region", "result"), hpx_equality_test_geoms ) def test_hpxgeom_equal(nside, nested, frame, region, result): geom0 = HpxGeom(16, False, "galactic", region=None) geom1 = HpxGeom(nside, nested, frame, region=region) assert (geom0 == geom1) is result assert (geom0 
!= geom1) is not result def test_hpx_geom_to_binsz(): geom = HpxGeom.create(nside=32, frame="galactic", nest=True) geom_new = geom.to_binsz(1 * u.deg) assert geom_new.nside[0] == 64 assert geom_new.frame == "galactic" assert geom_new.nest geom = HpxGeom.create( nside=32, frame="galactic", nest=True, region="DISK(110.,75.,10.)" ) geom_new = geom.to_binsz(1 * u.deg) assert geom_new.nside[0] == 64 center = geom_new.center_skydir.galactic assert_allclose(center.l.deg, 110) assert_allclose(center.b.deg, 75) def test_hpx_geom_region_mask(): geom = HpxGeom.create(nside=256, region="DISK(0.,0.,5.)") circle = CircleSkyRegion(center=SkyCoord("0d", "0d"), radius=3 * u.deg) mask = geom.region_mask(circle) assert_allclose(mask.data.sum(), 534) assert mask.geom.nside == 256 solid_angle = (mask.data * geom.solid_angle()).sum() assert_allclose(solid_angle, 2 * np.pi * (1 - np.cos(3 * u.deg)) * u.sr, rtol=0.01) def test_hpx_geom_separation(): geom = HpxGeom.create(binsz=0.1, frame="galactic", nest=True) position = SkyCoord(0, 0, unit="deg", frame="galactic") separation = geom.separation(position) assert separation.unit == "deg" assert_allclose(separation.value[0], 45.000049) # Make sure it also works for 2D maps as input separation = geom.to_image().separation(position) assert separation.unit == "deg" assert_allclose(separation.value[0], 45.000049) # make sure works for partial geometry geom = HpxGeom.create(binsz=0.1, frame="galactic", nest=True, region="DISK(0,0,10)") separation = geom.separation(position) assert separation.unit == "deg" assert_allclose(separation.value[0], 9.978725) def test_check_nside(): with pytest.raises(ValueError): HpxGeom.create(nside=3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/hpx/tests/test_ndmap.py0000644000175100001770000005127614721316200021470 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import SkyCoord from astropy.io import fits from regions import CircleSkyRegion from gammapy.irf import PSFKernel, PSFMap from gammapy.maps import HpxGeom, HpxMap, HpxNDMap, Map, MapAxis, WcsGeom from gammapy.maps.hpx.io import HpxConv from gammapy.maps.io import find_bintable_hdu from gammapy.utils.testing import mpl_plot_check, requires_data pytest.importorskip("healpy") axes1 = [MapAxis(np.logspace(0.0, 3.0, 3), interp="log")] hpx_test_allsky_geoms = [ (8, False, "galactic", None, None), (8, False, "galactic", None, axes1), ([4, 8], False, "galactic", None, axes1), ] hpx_test_partialsky_geoms = [ ([4, 8], False, "galactic", "DISK(110.,75.,30.)", axes1), (8, False, "galactic", "DISK(110.,75.,10.)", [MapAxis(np.logspace(0.0, 3.0, 4))]), ( 8, False, "galactic", "DISK(110.,75.,10.)", [ MapAxis(np.logspace(0.0, 3.0, 4), name="axis0"), MapAxis(np.logspace(0.0, 2.0, 3), name="axis1"), ], ), ] hpx_test_geoms = hpx_test_allsky_geoms + hpx_test_partialsky_geoms def create_map(nside, nested, frame, region, axes): return HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes) ) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_init(nside, nested, frame, region, axes): geom = HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes) shape = [int(np.max(geom.npix))] if axes: shape += [ax.nbin for ax in axes] shape = shape[::-1] data = np.random.uniform(0, 1, 
shape) m = HpxNDMap(geom) assert m.data.shape == data.shape m = HpxNDMap(geom, data) assert_allclose(m.data, data) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_create(nside, nested, frame, region, axes): create_map(nside, nested, frame, region, axes) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_read_write(tmp_path, nside, nested, frame, region, axes): path = tmp_path / "tmp.fits" m = create_map(nside, nested, frame, region, axes) m.write(path, sparse=True, overwrite=True) m2 = HpxNDMap.read(path) m4 = Map.read(path, map_type="hpx") msk = np.ones_like(m2.data[...], dtype=bool) assert_allclose(m.data[...][msk], m2.data[...][msk]) assert_allclose(m.data[...][msk], m4.data[...][msk]) m.write(path, overwrite=True) m2 = HpxNDMap.read(path) m3 = HpxMap.read(path, map_type="hpx") m4 = Map.read(path, map_type="hpx") assert_allclose(m.data[...][msk], m2.data[...][msk]) assert_allclose(m.data[...][msk], m3.data[...][msk]) assert_allclose(m.data[...][msk], m4.data[...][msk]) # Specify alternate HDU name for IMAGE and BANDS table m.write(path, sparse=True, hdu="IMAGE", hdu_bands="TEST", overwrite=True) m2 = HpxNDMap.read(path) m3 = Map.read(path) m4 = Map.read(path, map_type="hpx") def test_hpxmap_read_write_fgst(tmp_path): path = tmp_path / "tmp.fits" axis = MapAxis.from_bounds( 100.0, 1000.0, 4, name="energy", unit="MeV", interp="log" ) # Test Counts Cube m = create_map(8, False, "galactic", None, [axis]) m.write(path, format="fgst-ccube", overwrite=True) with fits.open(path, memmap=False) as hdulist: assert "SKYMAP" in hdulist assert "EBOUNDS" in hdulist assert hdulist["SKYMAP"].header["HPX_CONV"] == "FGST-CCUBE" assert hdulist["SKYMAP"].header["TTYPE1"] == "CHANNEL1" m2 = Map.read(path) assert m2 is not None assert m.geom.axes.is_allclose(m2.geom.axes) assert m.geom.is_allclose(m2.geom) assert m.is_allclose(m2) # Test Model Cube m.write(path, format="fgst-template", overwrite=True) with fits.open(path, memmap=False) as hdulist: assert "SKYMAP" in hdulist assert "ENERGIES" in hdulist assert hdulist["SKYMAP"].header["HPX_CONV"] == "FGST-TEMPLATE" assert hdulist["SKYMAP"].header["TTYPE1"] == "ENERGY1" m2 = Map.read(path) assert m2 is not None assert_allclose(m2.geom.axes["energy_true"].edges, axis.edges, atol=1e-3) assert_allclose(m.data, m2.data) @requires_data() def test_read_fgst_exposure(): exposure = Map.read("$GAMMAPY_DATA/fermi_3fhl/fermi_3fhl_exposure_cube_hpx.fits.gz") energy_axis = exposure.geom.axes["energy_true"] assert energy_axis.node_type == "center" assert exposure.unit == "cm2 s" @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_set_get_by_pix(nside, nested, frame, region, axes): m = create_map(nside, nested, frame, region, axes) coords = m.geom.get_coord(flat=True) idx = m.geom.get_idx(flat=True) m.set_by_pix(idx, coords[0]) assert_allclose(coords[0], m.get_by_pix(idx)) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_set_get_by_coord(nside, nested, frame, region, axes): m = create_map(nside, nested, frame, region, axes) coords = m.geom.get_coord(flat=True) m.set_by_coord(coords, coords[0]) assert_allclose(coords[0], m.get_by_coord(coords)) # Test with SkyCoords m = create_map(nside, nested, frame, region, axes) coords = m.geom.get_coord(flat=True) skydir = SkyCoord(coords[0], coords[1], unit="deg", frame=m.geom.frame) skydir_cel = 
skydir.transform_to("icrs") skydir_gal = skydir.transform_to("galactic") m.set_by_coord((skydir_gal,) + tuple(coords[2:]), coords[0]) assert_allclose(coords[0], m.get_by_coord(coords)) assert_allclose( m.get_by_coord((skydir_cel,) + tuple(coords[2:])), m.get_by_coord((skydir_gal,) + tuple(coords[2:])), ) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_interp_by_coord(nside, nested, frame, region, axes): m = HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes) ) coords = m.geom.get_coord(flat=True) m.set_by_coord(coords, coords[1]) assert_allclose(m.get_by_coord(coords), m.interp_by_coord(coords, method="linear")) def test_hpxmap_interp_by_coord_quantities(): ax = MapAxis(np.logspace(0.0, 3.0, 3), interp="log", name="energy", unit="TeV") geom = HpxGeom(nside=1, axes=[ax]) m = HpxNDMap(geom=geom) coords_dict = {"lon": 99, "lat": 42, "energy": 1000 * u.GeV} coords = m.geom.get_coord(flat=True) m.set_by_coord(coords, coords["lat"]) coords_dict["energy"] = 1 * u.TeV val = m.interp_by_coord(coords_dict) assert_allclose(val, 42, rtol=1e-2) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_fill_by_coord(nside, nested, frame, region, axes): m = create_map(nside, nested, frame, region, axes) coords = m.geom.get_coord(flat=True) m.fill_by_coord(coords, coords[1]) m.fill_by_coord(coords, coords[1]) assert_allclose(m.get_by_coord(coords), 2.0 * coords[1]) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_to_wcs(nside, nested, frame, region, axes): m = HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes) ) m.to_wcs(sum_bands=False, oversample=2, normalize=False) m.to_wcs(sum_bands=True, oversample=2, normalize=False) @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_swap_scheme(nside, nested, frame, region, axes): m = HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes) ) m.data = np.arange(m.data.size).reshape(m.geom.data_shape) m2 = m.to_swapped() coords = m.geom.get_coord(flat=True) assert_allclose(m.get_by_coord(coords), m2.get_by_coord(coords)) @pytest.mark.parametrize( ("nside", "nested", "frame", "region", "axes"), hpx_test_partialsky_geoms ) def test_hpxmap_pad(nside, nested, frame, region, axes): m = HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes) ) m.set_by_pix(m.geom.get_idx(flat=True), 1.0) cval = 2.2 m_pad = m.pad(1, mode="constant", cval=cval) coords_pad = m_pad.geom.get_coord(flat=True) msk = m.geom.contains(coords_pad) coords_out = tuple([c[~msk] for c in coords_pad]) assert_allclose(m_pad.get_by_coord(coords_out), cval * np.ones_like(coords_out[0])) coords_in = tuple([c[msk] for c in coords_pad]) assert_allclose(m_pad.get_by_coord(coords_in), np.ones_like(coords_in[0])) def test_hpx_nd_map_pad_axis(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=2) m = HpxNDMap.create(nside=2, frame="galactic", axes=[axis]) m.data += [[1], [2]] m_pad = m.pad(axis_name="energy", pad_width=(1, 1), mode="constant", cval=3) assert_allclose(m_pad.data[:, 0], [3, 1, 2, 3]) @pytest.mark.parametrize( ("nside", "nested", "frame", "region", "axes"), hpx_test_partialsky_geoms ) def test_hpxmap_crop(nside, nested, frame, region, axes): m = HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes) ) m.crop(1) 
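# A minimal usage sketch of the to_region_nd_map API (illustrative only, not an
# upstream test; the underscore prefix keeps pytest from collecting it). It
# assumes an all-sky nside=64 map with one energy axis; the region, radius and
# values are arbitrary choices for the example.
def _example_to_region_nd_map_sketch():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
    m = HpxNDMap.create(nside=64, frame="galactic", axes=[axis])
    m.data += 1
    circle = CircleSkyRegion(
        center=SkyCoord("0d", "0d", frame="galactic"), radius=5 * u.deg
    )
    # Reducing a constant map with np.mean gives 1 in every energy bin
    spec = m.to_region_nd_map(region=circle, func=np.mean)
    assert_allclose(spec.data, 1)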
@pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_upsample(nside, nested, frame, region, axes): m = HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes), unit="m2", ) m.set_by_pix(m.geom.get_idx(flat=True), 1.0) m_up = m.upsample(2, preserve_counts=True) assert_allclose(np.nansum(m.data), np.nansum(m_up.data)) m_up = m.upsample(2, preserve_counts=False) assert_allclose(4.0 * np.nansum(m.data), np.nansum(m_up.data)) assert m.unit == m_up.unit @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_downsample(nside, nested, frame, region, axes): m = HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes), unit="m2", ) m.set_by_pix(m.geom.get_idx(flat=True), 1.0) m_down = m.downsample(2, preserve_counts=True) assert_allclose(np.nansum(m.data), np.nansum(m_down.data)) assert m.unit == m_down.unit @pytest.mark.parametrize(("nside", "nested", "frame", "region", "axes"), hpx_test_geoms) def test_hpxmap_sum_over_axes(nside, nested, frame, region, axes): m = HpxNDMap( HpxGeom(nside=nside, nest=nested, frame=frame, region=region, axes=axes) ) coords = m.geom.get_coord(flat=True) m.fill_by_coord(coords, coords[0]) msum = m.sum_over_axes() if m.geom.is_regular: assert_allclose(np.nansum(m.data), np.nansum(msum.data)) def test_coadd_unit(): geom = HpxGeom.create(nside=128) m1 = HpxNDMap(geom, unit="m2") m2 = HpxNDMap(geom, unit="cm2") idx = geom.get_idx() weights = u.Quantity(np.ones_like(idx[0]), unit="cm2") m1.fill_by_idx(idx, weights=weights) assert_allclose(m1.data, 0.0001) weights = u.Quantity(np.ones_like(idx[0]), unit="m2") m1.fill_by_idx(idx, weights=weights) m1.coadd(m2) assert_allclose(m1.data, 1.0001) def test_plot(): m = HpxNDMap.create(binsz=10) with mpl_plot_check(): m.plot() def test_plot_grid(): axis = MapAxis([0, 1, 2], node_type="edges") m = HpxNDMap.create(binsz=0.1 * u.deg, width=1, axes=[axis]) with mpl_plot_check(): m.plot_grid() def test_plot_poly(): m = HpxNDMap.create(binsz=10) with mpl_plot_check(): m.plot(method="poly") def test_hpxndmap_resample_axis(): axis_1 = MapAxis.from_edges([1, 2, 3, 4, 5], name="test-1") axis_2 = MapAxis.from_edges([1, 2, 3, 4], name="test-2") geom = HpxGeom.create(nside=16, axes=[axis_1, axis_2]) m = HpxNDMap(geom, unit="m2") m.data += 1 new_axis = MapAxis.from_edges([2, 3, 5], name="test-1") m2 = m.resample_axis(axis=new_axis) assert m2.data.shape == (3, 2, 3072) assert_allclose(m2.data[0, :, 0], [1, 2]) # Test without all interval covered new_axis = MapAxis.from_edges([1.7, 4], name="test-1") m3 = m.resample_axis(axis=new_axis) assert m3.data.shape == (3, 1, 3072) assert_allclose(m3.data, 2) def test_hpx_nd_map_to_nside(): axis = MapAxis.from_edges([1, 2, 3], name="test-1") geom = HpxGeom.create(nside=64, axes=[axis]) m = HpxNDMap(geom, unit="m2") m.data += 1 m2 = m.to_nside(nside=32) assert_allclose(m2.data, 4) m3 = m.to_nside(nside=128) assert_allclose(m3.data, 0.25) def test_hpx_nd_map_to_wcs_tiles(): m = HpxNDMap.create(nside=8, frame="galactic") m.data += 1 tiles = m.to_wcs_tiles(nside_tiles=4) assert_allclose(tiles[0].data, 1) assert_allclose(tiles[32].data, 1) axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) m = HpxNDMap.create(nside=8, frame="galactic", axes=[axis]) m.data += 1 tiles = m.to_wcs_tiles(nside_tiles=4) assert_allclose(tiles[0].data, 1) assert_allclose(tiles[32].data, 1) def test_from_wcs_tiles(): geom = HpxGeom.create(nside=8) wcs_geoms = 
geom.to_wcs_tiles(nside_tiles=4) wcs_tiles = [Map.from_geom(geom, data=1) for geom in wcs_geoms] m = HpxNDMap.from_wcs_tiles(wcs_tiles=wcs_tiles) assert_allclose(m.data, 1) def test_hpx_map_cutout(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) m = HpxNDMap.create(nside=32, frame="galactic", axes=[axis]) m.data += np.arange(12288) cutout = m.cutout(SkyCoord("0d", "0d"), width=10 * u.deg) assert cutout.data.shape == (1, 25) assert_allclose(cutout.data.sum(), 239021) assert_allclose(cutout.data[0, 0], 8452) assert_allclose(cutout.data[0, -1], 9768) def test_partial_hpx_map_cutout(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) m = HpxNDMap.create( nside=32, frame="galactic", axes=[axis], region="DISK(110.,75.,10.)" ) m.data += np.arange(90) cutout = m.cutout(SkyCoord("0d", "0d"), width=10 * u.deg) assert cutout.data.shape == (1, 25) assert_allclose(cutout.data.sum(), 2225) assert_allclose(cutout.data[0, 0], 89) assert_allclose(cutout.data[0, -1], 89) def test_hpx_map_stack(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) m = HpxNDMap.create( nside=32, frame="galactic", axes=[axis], region="DISK(110.,75.,10.)" ) m.data += np.arange(90) m_allsky = HpxNDMap.create(nside=32, frame="galactic", axes=[axis]) m_allsky.stack(m) assert_allclose(m_allsky.data.sum(), (90 * 89) / 2) value = m_allsky.get_by_coord( {"skycoord": SkyCoord("110d", "75d", frame="galactic"), "energy": 3 * u.TeV} ) assert_allclose(value, 69) with pytest.raises(ValueError): m_allsky = HpxNDMap.create(nside=16, frame="galactic", axes=[axis]) m_allsky.stack(m) def test_hpx_map_weights_stack(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) m = HpxNDMap.create( nside=32, frame="galactic", axes=[axis], region="DISK(110.,75.,10.)" ) m.data += np.arange(90) + 1 weights = m.copy() weights.data = 1 / (np.arange(90) + 1) m_allsky = HpxNDMap.create(nside=32, frame="galactic", axes=[axis]) m_allsky.stack(m, weights=weights) assert_allclose(m_allsky.data.sum(), 90) def test_partial_hpx_map_stack(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1) m_1 = HpxNDMap.create( nside=128, frame="galactic", axes=[axis], region="DISK(110.,75.,20.)" ) m_1.data += 1 m_2 = HpxNDMap.create( nside=128, frame="galactic", axes=[axis], region="DISK(130.,75.,20.)" ) m_2.stack(m_1) assert_allclose(m_1.data.sum(), 5933) assert_allclose(m_2.data.sum(), 4968) def test_hpx_map_to_region_nd_map(): axis = MapAxis.from_energy_bounds("10 GeV", "2 TeV", nbin=10) m = HpxNDMap.create(nside=128, axes=[axis]) m.data += 1 circle = CircleSkyRegion(center=SkyCoord("0d", "0d"), radius=10 * u.deg) spec = m.to_region_nd_map(region=circle) assert_allclose(spec.data.sum(), 14660) spec_mean = m.to_region_nd_map(region=circle, func=np.mean) assert_allclose(spec_mean.data, 1) spec_interp = m.to_region_nd_map(region=circle.center, func=np.mean) assert_allclose(spec_interp.data, 1) @pytest.mark.parametrize("kernel", ["gauss", "disk"]) def test_smooth(kernel): axes = [ MapAxis(np.logspace(0.0, 3.0, 3), interp="log"), MapAxis(np.logspace(1.0, 3.0, 4), interp="lin"), ] geom_nest = HpxGeom.create(nside=256, nest=True, frame="galactic", axes=axes) geom_ring = HpxGeom.create(nside=256, nest=False, frame="galactic", axes=axes) m_nest = HpxNDMap(geom_nest, data=np.ones(geom_nest.data_shape), unit="m2") m_ring = HpxNDMap(geom_ring, data=np.ones(geom_ring.data_shape), unit="m2") desired_nest = m_nest.data.sum() desired_ring = m_ring.data.sum() smoothed_nest = m_nest.smooth(0.2 * u.deg, kernel) smoothed_ring =
m_ring.smooth(0.2 * u.deg, kernel) actual_nest = smoothed_nest.data.sum() assert_allclose(actual_nest, desired_nest) assert smoothed_nest.data.dtype == float actual_ring = smoothed_ring.data.sum() assert_allclose(actual_ring, desired_ring) assert smoothed_ring.data.dtype == float # with pytest.raises(NotImplementedError): cutout = m_nest.cutout(position=(0, 0), width=15 * u.deg) smoothed_cutout = cutout.smooth(0.1 * u.deg, kernel) actual_cutout = cutout.data.sum() desired_cutout = smoothed_cutout.data.sum() assert_allclose(actual_cutout, desired_cutout, rtol=0.01) with pytest.raises(ValueError): m_nest.smooth(0.2 * u.deg, "box") @pytest.mark.parametrize("nest", [True, False]) def test_convolve_wcs(nest): energy = MapAxis.from_bounds(1, 100, unit="TeV", nbin=2, name="energy") nside = 256 hpx_geom = HpxGeom.create( nside=nside, axes=[energy], region="DISK(0,0,2.5)", nest=nest ) hpx_map = Map.from_geom(hpx_geom) hpx_map.set_by_coord((0, 0, [2, 90]), 1) wcs_geom = WcsGeom.create(width=5, binsz=0.04, axes=[energy]) kernel = PSFKernel.from_gauss(wcs_geom, 0.4 * u.deg) convolved_map = hpx_map.convolve_wcs(kernel) assert_allclose(convolved_map.data.sum(), 2, rtol=0.001) @pytest.mark.parametrize("region", [None, "DISK(0,0,70)"]) def test_convolve_full(region): energy = MapAxis.from_bounds(1, 100, unit="TeV", nbin=2, name="energy_true") nside = 256 all_sky_geom = HpxGeom( nside=nside, axes=[energy], region=region, nest=False, frame="icrs" ) all_sky_map = Map.from_geom(all_sky_geom) all_sky_map.set_by_coord((0, 0, [2, 90]), 1) all_sky_map.set_by_coord((10, 10, [2, 90]), 1) all_sky_map.set_by_coord((30, 30, [2, 90]), 1) all_sky_map.set_by_coord((-40, -40, [2, 90]), 1) all_sky_map.set_by_coord((60, 0, [2, 90]), 1) all_sky_map.set_by_coord((-45, 30, [2, 90]), 1) all_sky_map.set_by_coord((30, -45, [2, 90]), 1) wcs_geom = WcsGeom.create(width=5, binsz=0.05, axes=[energy]) psf = PSFMap.from_gauss(energy_axis_true=energy, sigma=[0.5, 0.6] * u.deg) kernel = psf.get_psf_kernel(geom=wcs_geom, max_radius=1 * u.deg) convolved_map = all_sky_map.convolve_full(kernel) assert_allclose(convolved_map.data.sum(), 14.0, rtol=2e-5) def test_hpxmap_read_healpy(tmp_path): import healpy as hp path = tmp_path / "tmp.fits" npix = 12 * 1024 * 1024 m = [np.arange(npix), np.arange(npix) - 1, np.arange(npix) - 2] hp.write_map(filename=path, m=m, nest=False, overwrite=True) with fits.open(path, memmap=False) as hdulist: hdu_out = find_bintable_hdu(hdulist) header = hdu_out.header assert header["PIXTYPE"] == "HEALPIX" assert header["ORDERING"] == "RING" assert header["EXTNAME"] == "xtension" assert header["NSIDE"] == 1024 format = HpxConv.identify_hpx_format(header) assert format == "healpy" # first column "TEMPERATURE" m1 = Map.read(path, colname="TEMPERATURE") assert m1.data.shape[0] == npix diff = np.sum(m[0] - m1.data) assert_allclose(diff, 0.0) # specifying the colname by default for healpy it is "Q_POLARISATION" m2 = Map.read(path, colname="Q_POLARISATION") assert m2.data.shape[0] == npix diff = np.sum(m[1] - m2.data) assert_allclose(diff, 0.0) def test_map_plot_mask(): geom = HpxGeom.create(nside=16) region = CircleSkyRegion( center=SkyCoord("0d", "0d", frame="galactic"), radius=20 * u.deg ) mask = geom.region_mask([region]) with mpl_plot_check(): mask.plot_mask() def test_hpx_map_sampling(): hpxmap = HpxNDMap.create(nside=16) with pytest.raises(NotImplementedError): hpxmap.sample_coord(2) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 
gammapy-1.3/gammapy/maps/hpx/utils.py0000644000175100001770000003425014721316200017321 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import re import numpy as np from astropy.utils import lazyproperty from gammapy.utils.array import is_power2 from ..utils import INVALID_INDEX # Approximation of the size of HEALPix pixels (in degrees) for a particular order. # Used to convert from HEALPix to WCS-based projections. HPX_ORDER_TO_PIXSIZE = np.array( [32.0, 16.0, 8.0, 4.0, 2.0, 1.0, 0.50, 0.25, 0.1, 0.05, 0.025, 0.01, 0.005, 0.002] ) def unravel_hpx_index(idx, npix): """Convert flattened global map index to an index tuple. Parameters ---------- idx : `~numpy.ndarray` Flat index. npix : `~numpy.ndarray` Number of pixels in each band. Returns ------- idx : tuple of `~numpy.ndarray` Index array for each dimension of the map. """ if npix.size == 1: return tuple([idx]) dpix = np.zeros(npix.size, dtype="i") dpix[1:] = np.cumsum(npix.flat[:-1]) bidx = np.searchsorted(np.cumsum(npix.flat), idx + 1) pix = idx - dpix[bidx] return tuple([pix] + list(np.unravel_index(bidx, npix.shape))) def ravel_hpx_index(idx, npix): """Convert map index tuple to a flattened index. Parameters ---------- idx : tuple of `~numpy.ndarray` Index array. npix : `~numpy.ndarray` Number of pixels. Returns ------- idx : `~numpy.ndarray` Index array. """ if len(idx) == 1: return idx[0] # TODO: raise exception for indices that are out of bounds idx0 = idx[0] idx1 = np.ravel_multi_index(idx[1:], npix.shape, mode="clip") npix = np.concatenate((np.array([0]), npix.flat[:-1])) return idx0 + np.cumsum(npix)[idx1] def coords_to_vec(lon, lat): """Convert longitude and latitude coordinates to a unit 3-vector. Returns ------- array(3,n) with v_x[i],v_y[i],v_z[i] = directional cosines """ phi = np.radians(lon) theta = (np.pi / 2) - np.radians(lat) sin_t = np.sin(theta) cos_t = np.cos(theta) x = sin_t * np.cos(phi) y = sin_t * np.sin(phi) z = cos_t # Stack them into the output array out = np.vstack((x, y, z)).swapaxes(0, 1) return out def get_nside_from_pix_size(pixsz): """Get the NSIDE that is closest to the given pixel size. Parameters ---------- pixsz : `~numpy.ndarray` Pixel size in degrees. Returns ------- nside : `~numpy.ndarray` HEALPix NSIDE parameter. """ import healpy as hp pixsz = np.array(pixsz, ndmin=1) nside = 2 ** np.linspace(1, 14, 14, dtype=int) nside_pixsz = np.degrees(hp.nside2resol(nside)) return nside[np.argmin(np.abs(nside_pixsz - pixsz[..., None]), axis=-1)] def get_pix_size_from_nside(nside): """Estimate of the pixel size from the HEALPix nside coordinate. This just uses a lookup table to provide a nice round number for each HEALPix order. """ order = nside_to_order(nside) if np.any(order < 0) or np.any(order > 13): raise ValueError(f"HEALPix order must be 0 to 13. Got: {order!r}") return HPX_ORDER_TO_PIXSIZE[order] def match_hpx_pix(nside, nest, nside_pix, ipix_ring): """ Match to the HEALPix pixel number. Parameters ---------- nside : int or `~numpy.ndarray` HEALPix NSIDE parameter, must be a power of 2. nest : bool Indexing scheme. If True, "NESTED" scheme. If False, "RING" scheme. nside_pix : int or `~numpy.ndarray` HEALPix NSIDE parameter of subpixel. ipix_ring : int or `~numpy.ndarray` HEALPix pixel number. 
""" import healpy as hp ipix_in = np.arange(12 * nside * nside) vecs = hp.pix2vec(nside, ipix_in, nest) pix_match = hp.vec2pix(nside_pix, vecs[0], vecs[1], vecs[2]) == ipix_ring return ipix_in[pix_match] def parse_hpxregion(region): """Parse the ``HPX_REG`` header keyword into a list of tokens.""" m = re.match(r"([A-Za-z\_]*?)\((.*?)\)", region) if m is None: raise ValueError(f"Failed to parse hpx region string: {region!r}") if not m.group(1): return re.split(",", m.group(2)) else: return [m.group(1)] + re.split(",", m.group(2)) def nside_to_order(nside): """Compute the ORDER given NSIDE. Returns -1 for NSIDE values that are not a power of 2. """ nside = np.array(nside, ndmin=1) order = -1 * np.ones_like(nside) m = is_power2(nside) order[m] = np.log2(nside[m]).astype(int) return order def get_superpixels(idx, nside_subpix, nside_superpix, nest=True): """Compute the indices of superpixels that contain a subpixel. Parameters ---------- idx : `~numpy.ndarray` Array of HEALPix pixel indices for subpixels of NSIDE ``nside_subpix``. nside_subpix : int or `~numpy.ndarray` HEALPix NSIDE parameter of subpixel. nside_superpix : int or `~numpy.ndarray` HEALPix NSIDE parameter of superpixel. nest : bool, optional Indexing scheme. If True, "NESTED" scheme. If False, "RING" scheme. Default is True. Returns ------- idx_super : `~numpy.ndarray` Indices of HEALPix pixels of nside ``nside_superpix`` that contain pixel indices ``idx`` of nside ``nside_subpix``. """ import healpy as hp idx = np.array(idx) nside_superpix = np.asarray(nside_superpix) nside_subpix = np.asarray(nside_subpix) if not nest: idx = hp.ring2nest(nside_subpix, idx) if np.any(~is_power2(nside_superpix)) or np.any(~is_power2(nside_subpix)): raise ValueError("NSIDE must be a power of 2.") ratio = np.array((nside_subpix // nside_superpix) ** 2, ndmin=1) idx //= ratio if not nest: m = idx == INVALID_INDEX.int idx[m] = 0 idx = hp.nest2ring(nside_superpix, idx) idx[m] = INVALID_INDEX.int return idx def get_subpixels(idx, nside_superpix, nside_subpix, nest=True): """Compute the indices of subpixels contained within superpixels. This function returns an output array with one additional dimension of size N for subpixel indices where N is the maximum number of subpixels for any pair of ``nside_superpix`` and ``nside_subpix``. If the number of subpixels is less than N the remaining subpixel indices will be set to -1. Parameters ---------- idx : `~numpy.ndarray` Array of HEALPix pixel indices for superpixels of NSIDE ``nside_superpix``. nside_superpix : int or `~numpy.ndarray` HEALPix NSIDE parameter of superpixel. nside_subpix : int or `~numpy.ndarray` HEALPix NSIDE parameter of subpixel. nest : bool, optional Indexing scheme. If True, "NESTED" scheme. If False, "RING" scheme. Default is True. Returns ------- idx_sub : `~numpy.ndarray` Indices of HEALPix pixels of nside ``nside_subpix`` contained within pixel indices ``idx`` of nside ``nside_superpix``. 
""" import healpy as hp if not nest: idx = hp.ring2nest(nside_superpix, idx) idx = np.asarray(idx) nside_superpix = np.asarray(nside_superpix) nside_subpix = np.asarray(nside_subpix) if np.any(~is_power2(nside_superpix)) or np.any(~is_power2(nside_subpix)): raise ValueError("NSIDE must be a power of 2.") # number of subpixels in each superpixel npix = np.array((nside_subpix // nside_superpix) ** 2, ndmin=1) x = np.arange(np.max(npix), dtype=int) idx = idx * npix if not np.all(npix[0] == npix): x = np.broadcast_to(x, idx.shape + x.shape) idx = idx[..., None] + x idx[x >= np.broadcast_to(npix[..., None], x.shape)] = INVALID_INDEX.int else: idx = idx[..., None] + x if not nest: m = idx == INVALID_INDEX.int idx[m] = 0 idx = hp.nest2ring(nside_subpix[..., None], idx) idx[m] = INVALID_INDEX.int return idx class HpxToWcsMapping: """Stores the indices need to convert from HEALPix to WCS. Parameters ---------- hpx : `~HpxGeom` HEALPix geometry object. wcs : `~gammapy.maps.WcsGeom` WCS geometry object. """ def __init__(self, hpx, wcs, ipix, mult_val, npix): self._hpx = hpx self._wcs = wcs self._ipix = ipix self._mult_val = mult_val self._npix = npix def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
{html.escape(str(self))}
" @property def hpx(self): """HEALPix projection.""" return self._hpx @property def wcs(self): """WCS projection.""" return self._wcs @property def ipix(self): """An array(nx,ny) of the global HEALPix pixel indices for each WCS pixel.""" return self._ipix @property def mult_val(self): """An array(nx,ny) of 1/number of WCS pixels pointing at each HEALPix pixel.""" return self._mult_val @property def npix(self): """A tuple(nx,ny) of the shape of the WCS grid.""" return self._npix @lazyproperty def lmap(self): """Array ``(nx, ny)`` mapping local HEALPix pixel indices for each WCS pixel.""" return self.hpx.global_to_local(self.ipix, ravel=True) @property def valid(self): """Array ``(nx, ny)`` of bool: which WCS pixel in inside the HEALPix region.""" return self.lmap >= 0 @classmethod def create(cls, hpx, wcs): """Create HEALPix to WCS geometry pixel mapping. Parameters ---------- hpx : `~HpxGeom` HEALPix geometry object. wcs : `~gammapy.maps.WcsGeom` WCS geometry object. Returns ------- hpx2wcs : `~HpxToWcsMapping` Mapping. """ import healpy as hp npix = wcs.npix # FIXME: Calculation of WCS pixel centers should be moved into a # method of WcsGeom pix_crds = np.dstack(np.meshgrid(np.arange(npix[0][0]), np.arange(npix[1][0]))) pix_crds = pix_crds.swapaxes(0, 1).reshape((-1, 2)) sky_crds = wcs.wcs.wcs_pix2world(pix_crds, 0) sky_crds *= np.radians(1.0) sky_crds[0:, 1] = (np.pi / 2) - sky_crds[0:, 1] mask = ~np.any(np.isnan(sky_crds), axis=1) ipix = -1 * np.ones((len(hpx.nside), int(npix[0][0] * npix[1][0])), int) m = mask[None, :] * np.ones_like(ipix, dtype=bool) ipix[m] = hp.ang2pix( hpx.nside[..., None], sky_crds[:, 1][mask][None, ...], sky_crds[:, 0][mask][None, ...], hpx.nest, ).flatten() # Here we are counting the number of HEALPix pixels each WCS pixel # points to and getting a multiplicative factor that tells use how # to split up the counts in each HEALPix pixel (by dividing the # corresponding WCS pixels by the number of associated HEALPix # pixels). mult_val = np.ones_like(ipix, dtype=float) for i, t in enumerate(ipix): count = np.unique(t, return_counts=True) idx = np.searchsorted(count[0], t) mult_val[i, ...] = 1.0 / count[1][idx] if hpx.nside.size == 1: ipix = np.squeeze(ipix, axis=0) mult_val = np.squeeze(mult_val, axis=0) return cls(hpx, wcs, ipix, mult_val, npix) def fill_wcs_map_from_hpx_data( self, hpx_data, wcs_data, normalize=True, fill_nan=True ): """Fill the WCS map from the HEALPix data using the pre-calculated mappings. Parameters ---------- hpx_data : `~numpy.ndarray` The input HEALPix data. wcs_data : `~numpy.ndarray` The data array being filled. normalize : bool, optional If True, preserve integral by splitting HEALPix values between bins. Default is True. fill_nan : bool, optional Fill pixels outside the HEALPix geometry with NaN. Default is True. Returns ------- wcs_data : `~numpy.ndarray` The WCS data array. """ # FIXME: Do we want to flatten mapping arrays? 
shape = tuple([t.flat[0] for t in self._npix]) if self.valid.ndim != 1: shape = hpx_data.shape[:-1] + shape valid = np.where(self.valid.reshape(shape)) lmap = self.lmap[self.valid] mult_val = self._mult_val[self.valid] wcs_slice = [slice(None) for _ in range(wcs_data.ndim - 2)] wcs_slice = tuple(wcs_slice + list(valid)[::-1][:2]) hpx_slice = [slice(None) for _ in range(wcs_data.ndim - 2)] hpx_slice = tuple(hpx_slice + [lmap]) if normalize: wcs_data[wcs_slice] = mult_val * hpx_data[hpx_slice] else: wcs_data[wcs_slice] = hpx_data[hpx_slice] if fill_nan: valid = np.swapaxes(self.valid.reshape(shape), -1, -2) valid = valid * np.ones_like(wcs_data, dtype=bool) wcs_data[~valid] = np.nan return wcs_data def fill_hpx_map_from_wcs_data(self, wcs_data, hpx_data, normalize=True): """Fill the HEALPix map from the WCS data using the pre-calculated mappings. Parameters ---------- wcs_data : `~numpy.ndarray` The input WCS data. hpx_data : `~numpy.ndarray` The data array being filled. normalize : bool, optional If True, preserve integral by splitting HEALPix values between bins. Default is True. Returns ------- hpx_data : `~numpy.ndarray` The HEALPix data array. """ shape = tuple([t.flat[0] for t in self._npix]) if self.valid.ndim != 1: shape = hpx_data.shape[:-1] + shape valid = np.where(self.valid.reshape(shape)) lmap = self.lmap[self.valid] mult_val = self._mult_val[self.valid] wcs_slice = [slice(None) for _ in range(wcs_data.ndim - 2)] wcs_slice = tuple(wcs_slice + list(valid)[::-1][:2]) hpx_slice = [slice(None) for _ in range(wcs_data.ndim - 2)] hpx_slice = tuple(hpx_slice + [lmap]) if normalize: hpx_data[hpx_slice] = 1 / mult_val * wcs_data[wcs_slice] else: hpx_data[hpx_slice] = wcs_data[wcs_slice] return hpx_data ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/io.py0000644000175100001770000000302114721316200015761 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from astropy.io import fits def find_bands_hdu(hdu_list, hdu): """Discover the extension name of the BANDS HDU. Parameters ---------- hdu_list : `~astropy.io.fits.HDUList` The FITS HDU list. hdu : `~astropy.io.fits.BinTableHDU` or `~astropy.io.fits.ImageHDU` The HDU to check. Returns ------- hduname : str Extension name of the BANDS HDU. Returns None if no BANDS HDU was found. 
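Examples
--------
A minimal sketch with a hypothetical HDU that declares its BANDS
extension explicitly through the ``BANDSHDU`` header keyword:

>>> from astropy.io import fits
>>> hdu = fits.ImageHDU(name="COUNTS")
>>> hdu.header["BANDSHDU"] = "EBOUNDS"
>>> find_bands_hdu(fits.HDUList([fits.PrimaryHDU(), hdu]), hdu)
'EBOUNDS'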
""" if "BANDSHDU" in hdu.header: return hdu.header["BANDSHDU"] has_cube_data = False if ( isinstance(hdu, (fits.ImageHDU, fits.PrimaryHDU)) and hdu.header.get("NAXIS", None) == 3 ): has_cube_data = True elif isinstance(hdu, fits.BinTableHDU): if ( hdu.header.get("INDXSCHM", "") in ["EXPLICIT", "IMPLICIT", ""] and len(hdu.columns) > 1 ): has_cube_data = True if has_cube_data: if "EBOUNDS" in hdu_list: return "EBOUNDS" elif "ENERGIES" in hdu_list: return "ENERGIES" return None def find_hdu(hdulist): """Find the first non-empty HDU.""" for hdu in hdulist: if hdu.data is not None: return hdu raise AttributeError("No Image or BinTable HDU found.") def find_bintable_hdu(hdulist): for hdu in hdulist: if hdu.data is not None and isinstance(hdu, fits.BinTableHDU): return hdu raise AttributeError("No BinTable HDU found.") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/maps.py0000644000175100001770000001306114721316200016317 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html from collections.abc import MutableMapping from astropy.io import fits from gammapy.maps import Map from gammapy.utils.scripts import make_path __all__ = ["Maps"] class Maps(MutableMapping): """A Dictionary containing Map objects sharing the same geometry. This class simplifies handling and I/O of maps collections. For maps with different geometries, use a regular dict. """ def __init__(self, **kwargs): self._geom = None self._data = {} for key, value in kwargs.items(): self.__setitem__(key, value) @property def geom(self): """Map geometry as a `~gammapy.maps.Geom` object.""" return self._geom def __setitem__(self, key, value): if value is not None and not isinstance(value, Map): raise ValueError( f"MapDict can only contain Map objects, got {type(value)} instead." ) # TODO: which loosers criterion to apply? broadcastability? else: self._geom = value.geom self._data[key] = value def __getitem__(self, key): return self._data[key] def __len__(self): """Returns the length of MapDict.""" return len(self._data) def __delitem__(self, key): del self._data[key] def __iter__(self): return iter(self._data) def __repr__(self): return f"{type(self).__name__}({self._data})" def __str__(self): str_ = f"{self.__class__.__name__}\n" str_ += "-" * len(self.__class__.__name__) + "\n" str_ += "\n" str_ += self._geom.__repr__() str_ += "\n" for name, value in self.items(): str_ += f"{name} \n" str_ += f"\t unit\t : {value.unit} \n" str_ += f"\t dtype\t : {value.data.dtype}\n" str_ += "\n" return str_ def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def to_hdulist(self, hdu_bands="BANDS"): """Convert map dictionary to list of HDUs. Parameters ---------- hdu_bands : str, optional Name of the HDU with the BANDS table. If set to None, each map will have its own hdu_band. Default is 'BANDS'. Returns ------- hdulist : `~astropy.io.fits.HDUList` Map dataset list of HDUs. """ exclude_primary = slice(1, None) hdu_primary = fits.PrimaryHDU() hdulist = fits.HDUList([hdu_primary]) for key, m in self.items(): hdulist += m.to_hdulist(hdu=key, hdu_bands=hdu_bands)[exclude_primary] return hdulist @classmethod def from_hdulist(cls, hdulist, hdu_bands="BANDS"): """Create map dictionary from a HDU list. Because FITS keywords are case-insensitive, all key names will return as lower-case. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` List of HDUs. hdu_bands : str, optional Name of the HDU with the BANDS table. If set to None, each map should have its own hdu_band. Default is 'BANDS'. Returns ------- maps : `~gammapy.maps.Maps` Maps object. """ maps = cls() for hdu in hdulist: if hdu.is_image and hdu.data is not None: map_name = hdu.name.lower() maps[map_name] = Map.from_hdulist( hdulist, hdu=map_name, hdu_bands=hdu_bands ) return maps @classmethod def read(cls, filename, checksum=False): """Read map dictionary from file. Because FITS keywords are case-insensitive, all key names will return as lower-case. Parameters ---------- filename : str Filename to read from. Returns ------- maps : `~gammapy.maps.Maps` Maps object. """ with fits.open( str(make_path(filename)), memmap=False, checksum=checksum ) as hdulist: return cls.from_hdulist(hdulist) def write(self, filename, overwrite=False, checksum=False): """Write map dictionary to file. Parameters ---------- filename : str Filename to write to. overwrite : bool, optional Overwrite existing file. Default is False. checksum : bool, optional When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. """ filename = make_path(filename) hdulist = self.to_hdulist() hdulist.writeto(filename, overwrite=overwrite, checksum=checksum) @classmethod def from_geom(cls, geom, names, kwargs_list=None): """Create map dictionary from a geometry. Parameters ---------- geom : `~gammapy.maps.Geom` The input geometry that will be used by all maps. names : list of str The list of all map names. kwargs_list : list of dict the list of arguments to be passed to `~gammapy.maps.Map.from_geom()`. Returns ------- maps : `~gammapy.maps.Maps` Maps object. """ mapdict = {} if kwargs_list is None: kwargs_list = [{}] * len(names) for name, kwargs in zip(names, kwargs_list): mapdict[name] = Map.from_geom(geom, **kwargs) return cls(**mapdict) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/measure.py0000644000175100001770000001000614721316200017014 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy.coordinates import SkyCoord from regions import PolygonSkyRegion import matplotlib as mpl import matplotlib.pyplot as plt def containment_region(map_, fraction=0.393, apply_union=True): """Find the iso-contours region corresponding to a given containment for a map of integral quantities with a flat geometry. Parameters ---------- map_ : `~gammapy.maps.WcsNDMap` Map of integral quantities. fraction : float, optional Containment fraction. Default is 0.393. apply_union : bool, optional It True return a compound region otherwise return a list of polygon regions. 
Default is True. Note that compound regions cannot be written in ds9 format but can always be saved with numpy.savez. Returns ------- regions : list of `~regions.PolygonSkyRegion` or `~regions.CompoundSkyRegion` Regions from iso-contours matching containment fraction. """ from . import WcsNDMap if not isinstance(map_, WcsNDMap): raise TypeError( f"This function is only supported for WcsNDMap, got {type(map_)} instead." ) fmax = np.nanmax(map_.data) if fmax > 0.0: ordered = np.sort(map_.data.flatten())[::-1] cumsum = np.nancumsum(ordered) ind = np.argmin(np.abs(cumsum / cumsum.max() - fraction)) fval = ordered[ind] plt.ioff() fig = plt.figure() cs = plt.contour(map_.data.squeeze(), [fval]) plt.close(fig) plt.ion() regions_pieces = [] try: paths_all = cs.get_paths()[0] starts = np.where(paths_all.codes == 1)[0] stops = np.where(paths_all.codes == 79)[0] + 1 paths = [] for start, stop in zip(starts, stops): paths.append( mpl.path.Path( paths_all.vertices[start:stop], codes=paths_all.codes[start:stop], ) ) except AttributeError: paths = cs.collections[0].get_paths() for pp in paths: vertices = [] for v in pp.vertices: v_coord = map_.geom.pix_to_coord(v) vertices.append([v_coord[0], v_coord[1]]) vertices = SkyCoord(vertices, frame=map_.geom.frame) regions_pieces.append(PolygonSkyRegion(vertices)) if apply_union: regions_union = regions_pieces[0] for region in regions_pieces[1:]: regions_union = regions_union.union(region) return regions_union else: return regions_pieces else: raise ValueError("No positive values in the map.") def containment_radius(map_, fraction=0.393, position=None): """Compute containment radius from a position in a map with integral quantities and flat geometry. Parameters ---------- map_ : `~gammapy.maps.WcsNDMap` Map of integral quantities. fraction : float Containment fraction. Default is 0.393. position : `~astropy.coordinates.SkyCoord` Position from where the containment is computed. Default is the center of the Map. Returns ------- radius : `~astropy.coordinates.Angle` Minimal radius required to reach the given containement fraction. """ from . import WcsNDMap if not isinstance(map_, WcsNDMap): raise TypeError( f"This function is only supported for WcsNDMap, got {type(map_)} instead." 
) if position is None: position = map_.geom.center_skydir fmax = np.nanmax(map_.data) if fmax > 0.0: sep = map_.geom.separation(position).flatten() order = np.argsort(sep) ordered = map_.data.flatten()[order] cumsum = np.nancumsum(ordered) ind = np.argmin(np.abs(cumsum / cumsum.max() - fraction)) else: raise ValueError("No positive values in the map.") return sep[order][ind] ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2206419 gammapy-1.3/gammapy/maps/region/0000755000175100001770000000000014721316215016275 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/region/__init__.py0000644000175100001770000000026014721316200020376 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .geom import RegionGeom from .ndmap import RegionNDMap __all__ = [ "RegionGeom", "RegionNDMap", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/region/geom.py0000644000175100001770000006723414721316200017604 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy import logging from functools import lru_cache import numpy as np from astropy import units as u from astropy.coordinates import Angle, SkyCoord from astropy.io import fits from astropy.table import QTable, Table from astropy.utils import lazyproperty from astropy.visualization.wcsaxes import WCSAxes from astropy.wcs.utils import ( proj_plane_pixel_area, proj_plane_pixel_scales, wcs_to_celestial_frame, ) from regions import ( CompoundSkyRegion, PixCoord, PointSkyRegion, RectanglePixelRegion, Regions, SkyRegion, ) import matplotlib.pyplot as plt from gammapy.utils.regions import ( compound_region_center, compound_region_to_regions, regions_to_compound_region, ) from gammapy.visualization.utils import ARTIST_TO_LINE_PROPERTIES from ..axes import MapAxes from ..coord import MapCoord from ..core import Map from ..geom import Geom, pix_tuple_to_idx from ..utils import _check_width from ..wcs import WcsGeom __all__ = ["RegionGeom"] class RegionGeom(Geom): """Map geometry representing a region on the sky. The spatial component of the geometry is made up of a single pixel with an arbitrary shape and size. It can also have any number of non-spatial dimensions. This class represents the geometry for the `RegionNDMap` maps. Parameters ---------- region : `~regions.SkyRegion` Region object. axes : list of `MapAxis` Non-spatial data axes. wcs : `~astropy.wcs.WCS` Optional WCS object to project the region if needed. binsz_wcs : `~astropy.units.Quantity` Angular bin size of the underlying `~WcsGeom` used to evaluate quantities in the region. Default is "0.1 deg". This default value is adequate for the majority of use cases. If a WCS object is provided, the input of ``binsz_wcs`` is overridden. 
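Examples
--------
A minimal usage sketch, mirroring the circular test region used in the
unit tests below:

>>> import astropy.units as u
>>> from astropy.coordinates import SkyCoord
>>> from regions import CircleSkyRegion
>>> from gammapy.maps import RegionGeom
>>> center = SkyCoord("0 deg", "0 deg", frame="galactic")
>>> geom = RegionGeom.create(CircleSkyRegion(center=center, radius=1 * u.deg))
>>> geom.frame
'galactic'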
""" is_regular = True is_allsky = False is_hpx = False is_region = True _slice_spatial_axes = slice(0, 2) _slice_non_spatial_axes = slice(2, None) projection = "TAN" def __init__(self, region, axes=None, wcs=None, binsz_wcs="0.1 deg"): self._region = region self._axes = MapAxes.from_default(axes, n_spatial_axes=2) self._binsz_wcs = u.Quantity(binsz_wcs) if wcs is None and region is not None: if isinstance(region, CompoundSkyRegion): self._center = compound_region_center(region) else: self._center = region.center wcs = WcsGeom.create( binsz=binsz_wcs, skydir=self._center, proj=self.projection, frame=self._center.frame.name, ).wcs self._wcs = wcs # TODO : can we get the width before defining the wcs ? wcs = WcsGeom.create( binsz=binsz_wcs, width=tuple(self.width), skydir=self._center, proj=self.projection, frame=self._center.frame.name, ).wcs self._wcs = wcs self.ndim = len(self.data_shape) # define cached methods self.get_wcs_coord_and_weights = lru_cache()(self.get_wcs_coord_and_weights) def __setstate__(self, state): for key, value in state.items(): if key in ["get_wcs_coord_and_weights"]: state[key] = lru_cache()(value) self.__dict__ = state @property def frame(self): """Coordinate system, either Galactic ("galactic") or Equatorial ("icrs").""" if self.region is None: return "icrs" try: return self.region.center.frame.name except AttributeError: return wcs_to_celestial_frame(self.wcs).name @property def binsz_wcs(self): """Angular bin size of the underlying `~WcsGeom`. Returns ------- binsz_wcs: `~astropy.coordinates.Angle` Angular bin size. """ return Angle(proj_plane_pixel_scales(self.wcs), unit="deg") @lazyproperty def _rectangle_bbox(self): if self.region is None: raise ValueError("Region definition required.") regions = compound_region_to_regions(self.region) regions_pix = [_.to_pixel(self.wcs) for _ in regions] bbox = regions_pix[0].bounding_box for region_pix in regions_pix[1:]: bbox = bbox.union(region_pix.bounding_box) try: rectangle_pix = bbox.to_region() except ValueError: rectangle_pix = RectanglePixelRegion( center=PixCoord(*bbox.center[::-1]), width=1, height=1 ) return rectangle_pix.to_sky(self.wcs) @property def width(self): """Width of bounding box of the region. Returns ------- width : `~astropy.units.Quantity` Dimensions of the region in both spatial dimensions. 
Units: ``deg`` """ rectangle = self._rectangle_bbox return u.Quantity([rectangle.width.to("deg"), rectangle.height.to("deg")]) @property def region(self): """The spatial component of the region geometry as a `~regions.SkyRegion`.""" return self._region @property def is_all_point_sky_regions(self): """Whether regions are all point regions.""" regions = compound_region_to_regions(self.region) return np.all([isinstance(_, PointSkyRegion) for _ in regions]) @property def axes(self): """List of non-spatial axes.""" return self._axes @property def axes_names(self): """All axes names.""" return ["lon", "lat"] + self.axes.names @property def wcs(self): """WCS projection object.""" return self._wcs @property def center_coord(self): """Center coordinate of the region as a `astropy.coordinates.SkyCoord`.""" return self.pix_to_coord(self.center_pix) @property def center_pix(self): """Pixel values corresponding to the center of the region.""" return tuple((np.array(self.data_shape) - 1.0) / 2)[::-1] @lazyproperty def center_skydir(self): """Sky coordinate of the center of the region.""" if self.region is None: return SkyCoord(np.nan * u.deg, np.nan * u.deg) return self._rectangle_bbox.center @property def npix(self): """Number of spatial pixels.""" return ([1], [1]) def contains(self, coords): """Check if a given map coordinate is contained in the region. Requires the `.region` attribute to be set. For `PointSkyRegion` the method always returns True. Parameters ---------- coords : tuple, dict, `MapCoord` or `~astropy.coordinates.SkyCoord` Object containing coordinate arrays we wish to check for inclusion in the region. Returns ------- mask : `~numpy.ndarray` Boolean array with the same shape as the input that indicates which coordinates are inside the region. """ if self.region is None: raise ValueError("Region definition required.") coords = MapCoord.create(coords, frame=self.frame, axis_names=self.axes.names) if self.is_all_point_sky_regions: return np.ones(coords.skycoord.shape, dtype=bool) return self.region.contains(coords.skycoord, self.wcs) def contains_wcs_pix(self, pix): """Check if a given WCS pixel coordinate is contained in the region. For `PointSkyRegion` the method always returns True. Parameters ---------- pix : tuple Tuple of pixel coordinates. Returns ------- containment : `~numpy.ndarray` Boolean array. """ if self.is_all_point_sky_regions: return np.ones(pix[0].shape, dtype=bool) region_pix = self.region.to_pixel(self.wcs) return region_pix.contains(PixCoord(pix[0], pix[1])) def separation(self, position): """Angular distance between the center of the region and the given position. Parameters ---------- position : `astropy.coordinates.SkyCoord` Sky coordinate we want the angular distance to. Returns ------- sep : `~astropy.coordinates.Angle` The on-sky separation between the given coordinate and the region center. """ return self.center_skydir.separation(position) @property def data_shape(self): """Shape of the `~numpy.ndarray` matching this geometry.""" return self._shape[::-1] @property def data_shape_axes(self): """Shape of data of the non-spatial axes and unit spatial axes.""" return self.axes.shape[::-1] + (1, 1) @property def _shape(self): """Number of bins in each dimension. The spatial dimension is always (1, 1), as a `RegionGeom` is not pixelized further. """ return tuple((1, 1) + self.axes.shape) def get_coord(self, mode="center", frame=None, sparse=False, axis_name=None): """Get map coordinates from the geometry. 
Parameters ---------- mode : {'center', 'edges'}, optional Get center or edge coordinates for the non-spatial axes. Default is "center". frame : str or `~astropy.coordinates.Frame`, optional Coordinate frame. Default is None. sparse : bool, optional Compute sparse coordinates. Default is False. axis_name : str, optional If mode = "edges", the edges will be returned for this axis only. Default is None. Returns ------- coord : `~MapCoord` Map coordinate object. """ if mode == "edges" and axis_name is None: raise ValueError("Mode 'edges' requires axis name") coords = self.axes.get_coord(mode=mode, axis_name=axis_name) coords["skycoord"] = self.center_skydir.reshape((1, 1)) if frame is None: frame = self.frame coords = MapCoord.create(coords, frame=self.frame).to_frame(frame) if not sparse: coords = coords.broadcasted return coords def _pad_spatial(self, pad_width): raise NotImplementedError("Spatial padding of `RegionGeom` not supported") def crop(self): raise NotImplementedError("Cropping of `RegionGeom` not supported") def solid_angle(self): """Get solid angle of the region. Returns ------- angle : `~astropy.units.Quantity` Solid angle of the region in steradians. Units: ``sr`` """ if self.region is None: raise ValueError("Region definition required.") # compound regions do not implement area() # so we use the mask representation and estimate the area # from the pixels in the mask using oversampling if isinstance(self.region, CompoundSkyRegion): # oversample by a factor of ten oversampling = 10.0 wcs = self.to_binsz_wcs(self.binsz_wcs / oversampling).wcs pixel_region = self.region.to_pixel(wcs) mask = pixel_region.to_mask() area = np.count_nonzero(mask) / oversampling**2 else: # all other types of regions should implement area area = self.region.to_pixel(self.wcs).area solid_angle = area * proj_plane_pixel_area(self.wcs) * u.deg**2 return solid_angle.to("sr") def bin_volume(self): """If the `RegionGeom` has a non-spatial axis, it returns the volume of the region. If not, it returns the solid angle size. Returns ------- volume : `~astropy.units.Quantity` Volume of the region. """ bin_volume = self.solid_angle() * np.ones(self.data_shape) for idx, ax in enumerate(self.axes): shape = self.ndim * [1] shape[-(idx + 3)] = -1 bin_volume = bin_volume * ax.bin_width.reshape(tuple(shape)) return bin_volume def to_wcs_geom(self, width_min=None): """Get the minimal equivalent geometry which contains the region. Parameters ---------- width_min : `~astropy.quantity.Quantity`, optional Minimum width for the resulting geometry. Can be a single number or two, for different minimum widths in each spatial dimension. Default is None. Returns ------- wcs_geom : `~WcsGeom` A WCS geometry object. """ if width_min is not None: width = np.max( [self.width.to_value("deg"), _check_width(width_min)], axis=0 ) else: width = self.width wcs_geom_region = WcsGeom(wcs=self.wcs) wcs_geom = wcs_geom_region.cutout(position=self.center_skydir, width=width) wcs_geom = wcs_geom.to_cube(self.axes) return wcs_geom def to_binsz_wcs(self, binsz): """Change the bin size of the underlying WCS geometry. Parameters ---------- binsz : float, str or `~astropy.quantity.Quantity` Bin size. Returns ------- region : `~RegionGeom` Region geometry with the same axes and region as the input, but different WCS pixelization. """ new_geom = RegionGeom(self.region, axes=self.axes, binsz_wcs=binsz) return new_geom def get_wcs_coord_and_weights(self, factor=10): """Get the array of spatial coordinates and corresponding weights. 
The coordinates are the center of a pixel that intersects the region and the weights that represent which fraction of the pixel is contained in the region. Parameters ---------- factor : int, optional Oversampling factor to compute the weights. Default is 10. Returns ------- region_coord : `~MapCoord` MapCoord object with the coordinates inside the region. weights : `~numpy.ndarray` Weights representing the fraction of each pixel contained in the region. """ wcs_geom = self.to_wcs_geom() weights = wcs_geom.to_image().region_weights( regions=[self.region], oversampling_factor=factor ) mask = weights.data > 0 weights = weights.data[mask] # Get coordinates coords = wcs_geom.get_coord(sparse=True).apply_mask(mask) return coords, weights def to_binsz(self, binsz): """Return self.""" return self def to_cube(self, axes): """Append non-spatial axes to create a higher-dimensional geometry. Returns ------- region : `~RegionGeom` Region geometry with the added axes. """ axes = copy.deepcopy(self.axes) + axes return self._init_copy(region=self.region, wcs=self.wcs, axes=axes) def to_image(self): """Remove non-spatial axes to create a 2D region. Returns ------- region : `~RegionGeom` Region geometry without any non-spatial axes. """ return self._init_copy(region=self.region, wcs=self.wcs, axes=None) def upsample(self, factor, axis_name=None): """Upsample a non-spatial dimension of the region by a given factor. Returns ------- region : `~RegionGeom` Region geometry with the upsampled axis. """ axes = self.axes.upsample(factor=factor, axis_name=axis_name) return self._init_copy(region=self.region, wcs=self.wcs, axes=axes) def downsample(self, factor, axis_name): """Downsample a non-spatial dimension of the region by a given factor. Returns ------- region : `~RegionGeom` Region geometry with the downsampled axis. 
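Examples
--------
A short sketch downsampling an assumed energy axis by a factor of two:

>>> from gammapy.maps import MapAxis, RegionGeom
>>> axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=4)
>>> geom = RegionGeom.create("galactic;circle(0,0,1)", axes=[axis])
>>> geom.downsample(factor=2, axis_name="energy").axes["energy"].nbin
2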
""" axes = self.axes.downsample(factor=factor, axis_name=axis_name) return self._init_copy(region=self.region, wcs=self.wcs, axes=axes) def pix_to_coord(self, pix): lon = np.where( (-0.5 < pix[0]) & (pix[0] < 0.5), self.center_skydir.data.lon, np.nan * u.deg, ) lat = np.where( (-0.5 < pix[1]) & (pix[1] < 0.5), self.center_skydir.data.lat, np.nan * u.deg, ) coords = (lon, lat) for p, ax in zip(pix[self._slice_non_spatial_axes], self.axes): coords += (ax.pix_to_coord(p),) return coords def pix_to_idx(self, pix, clip=False): idxs = list(pix_tuple_to_idx(pix)) for i, idx in enumerate(idxs[self._slice_non_spatial_axes]): if clip: np.clip(idx, 0, self.axes[i].nbin - 1, out=idx) else: np.putmask(idx, (idx < 0) | (idx >= self.axes[i].nbin), -1) return tuple(idxs) def coord_to_pix(self, coords): # inherited docstring if isinstance(coords, tuple) and len(coords) == len(self.axes): skydir = self.center_skydir.transform_to(self.frame) coords = (skydir.data.lon, skydir.data.lat) + coords elif isinstance(coords, dict): valid_keys = ["lon", "lat", "skycoord"] if not any([_ in coords for _ in valid_keys]): coords.setdefault("skycoord", self.center_skydir) coords = MapCoord.create(coords, frame=self.frame, axis_names=self.axes.names) if self.region is None: pix = (0, 0) else: in_region = self.contains(coords.skycoord) x = np.zeros(coords.skycoord.shape) x[~in_region] = np.nan y = np.zeros(coords.skycoord.shape) y[~in_region] = np.nan pix = (x, y) pix += self.axes.coord_to_pix(coords) return pix def get_idx(self): idxs = [np.arange(n, dtype=float) for n in self.data_shape[::-1]] return np.meshgrid(*idxs[::-1], indexing="ij")[::-1] def _make_bands_cols(self): return [] @classmethod def create(cls, region, **kwargs): """Create region geometry. The input region can be passed in the form of a ds9 string and will be parsed internally by `~regions.Regions.parse`. See: * https://astropy-regions.readthedocs.io/en/stable/region_io.html * http://ds9.si.edu/doc/ref/region.html Parameters ---------- region : str or `~regions.SkyRegion` Region definition. **kwargs : dict Keyword arguments passed to `RegionGeom.__init__`. Returns ------- geom : `RegionGeom` Region geometry. """ return cls.from_regions(regions=region, **kwargs) def __str__(self): axes = ["lon", "lat"] + [_.name for _ in self.axes] try: frame = self.center_skydir.frame.name lon = self.center_skydir.data.lon.deg lat = self.center_skydir.data.lat.deg except AttributeError: frame, lon, lat = "", np.nan, np.nan return ( f"{self.__class__.__name__}\n\n" f"\tregion : {self.region.__class__.__name__}\n" f"\taxes : {axes}\n" f"\tshape : {self.data_shape[::-1]}\n" f"\tndim : {self.ndim}\n" f"\tframe : {frame}\n" f"\tcenter : {lon:.1f} deg, {lat:.1f} deg\n" ) def is_allclose(self, other, rtol_axes=1e-6, atol_axes=1e-6): """Compare two data IRFs for equivalency. Parameters ---------- other : `RegionGeom` Region geometry to compare against. rtol_axes : float, optional Relative tolerance for the axes comparison. Default is 1e-6. atol_axes : float, optional Relative tolerance for the axes comparison. Default is 1e-6. Returns ------- is_allclose : bool Whether the geometry is all close. """ if not isinstance(other, self.__class__): return TypeError(f"Cannot compare {type(self)} and {type(other)}") if self.data_shape != other.data_shape: return False axes_eq = self.axes.is_allclose(other.axes, rtol=rtol_axes, atol=atol_axes) # TODO: compare regions based on masks... 
regions_eq = True return axes_eq and regions_eq def __eq__(self, other): if not isinstance(other, self.__class__): return False return self.is_allclose(other=other) def _to_region_table(self): """Export region to a FITS region table.""" if self.region is None: raise ValueError("Region definition required.") region_list = compound_region_to_regions(self.region) pixel_region_list = [] for reg in region_list: pixel_region_list.append(reg.to_pixel(self.wcs)) table = Regions(pixel_region_list).serialize(format="fits") header = WcsGeom(wcs=self.wcs).to_header() table.meta.update(header) return table def to_hdulist(self, format="ogip", hdu_bands=None, hdu_region=None): """Convert geometry to HDU list. Parameters ---------- format : {"ogip", "gadf", "ogip-sherpa"} HDU format. Default is "ogip". hdu_bands : str, optional Name or index of the HDU with the BANDS table. Default is None. hdu_region : str, optional Name or index of the HDU with the region table. Not used for the "gadf" format. Default is None. Returns ------- hdulist : `~astropy.io.fits.HDUList` HDU list. """ if hdu_bands is None: hdu_bands = "HDU_BANDS" if hdu_region is None: hdu_region = "HDU_REGION" if format != "gadf": hdu_region = "REGION" hdulist = fits.HDUList() hdulist.append(self.axes.to_table_hdu(hdu_bands=hdu_bands, format=format)) # region HDU if self.region: region_table = self._to_region_table() region_hdu = fits.BinTableHDU(region_table, name=hdu_region) hdulist.append(region_hdu) return hdulist @classmethod def from_regions(cls, regions, **kwargs): """Create region geometry from list of regions. The regions are combined with union to a compound region. Parameters ---------- regions : list of `~regions.SkyRegion` or str Regions. **kwargs: dict Keyword arguments forwarded to `RegionGeom`. Returns ------- geom : `RegionGeom` Region map geometry. """ if isinstance(regions, str): regions = Regions.parse(data=regions, format="ds9") elif isinstance(regions, SkyRegion): regions = [regions] elif isinstance(regions, SkyCoord): regions = [PointSkyRegion(center=regions)] elif isinstance(regions, list) and len(regions) == 0: regions = None if regions: regions = regions_to_compound_region(regions) return cls(region=regions, **kwargs) @classmethod def from_hdulist(cls, hdulist, format="ogip", hdu=None): """Read region table and convert it to region list. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` HDU list. format : {"ogip", "ogip-arf", "gadf"} HDU format. Default is "ogip". hdu : str, optional Name of the HDU. Default is None. Returns ------- geom : `RegionGeom` Region map geometry. 
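Examples
--------
A round-trip sketch through an in-memory HDU list; the "gadf" format and
the "HDU" stem (expanded to ``HDU_BANDS``/``HDU_REGION``) are assumptions
matching the defaults of `to_hdulist`:

>>> from gammapy.maps import MapAxis, RegionGeom
>>> axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
>>> geom = RegionGeom.create("galactic;circle(0,0,1)", axes=[axis])
>>> hdulist = geom.to_hdulist(format="gadf")
>>> geom2 = RegionGeom.from_hdulist(hdulist, format="gadf", hdu="HDU")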
""" region_hdu = "REGION" if format == "gadf" and hdu: region_hdu = f"{hdu}_{region_hdu}" if region_hdu in hdulist: try: region_table = QTable.read(hdulist[region_hdu]) regions_pix = Regions.parse(data=region_table, format="fits") except TypeError: region_table = Table.read(hdulist[region_hdu]) regions_pix = Regions.parse(data=region_table, format="fits") wcs = WcsGeom.from_header(region_table.meta).wcs regions = [] for region_pix in regions_pix: # TODO: remove workaround once regions issue with fits serialization is sorted out # see https://github.com/astropy/regions/issues/400 # requires update regions to >=v0.6 region_pix.meta["include"] = True regions.append(region_pix.to_sky(wcs)) region = regions_to_compound_region(regions) else: region, wcs = None, None if format == "ogip": hdu_bands = "EBOUNDS" elif format == "ogip-arf": hdu_bands = "SPECRESP" elif format == "gadf": hdu_bands = hdu + "_BANDS" else: raise ValueError(f"Unknown format {format}") axes = MapAxes.from_table_hdu(hdulist[hdu_bands], format=format) return cls(region=region, wcs=wcs, axes=axes) def union(self, other): """Stack a `RegionGeom` by making the union.""" if not self == other: raise ValueError("Can only make union if extra axes are equivalent.") if other.region: if self.region: self._region = self.region.union(other.region) else: self._region = other.region def plot_region(self, ax=None, kwargs_point=None, path_effect=None, **kwargs): """Plot region in the sky. Parameters ---------- ax : `~astropy.visualization.WCSAxes`, optional Axes to plot on. If no axes are given, the region is shown using the minimal equivalent WCS geometry. Default is None. kwargs_point : dict, optional Keyword arguments passed to `~matplotlib.lines.Line2D` for plotting of point sources. Default is None. path_effect : `~matplotlib.patheffects.PathEffect`, optional Path effect applied to artists and lines. Default is None. **kwargs : dict Keyword arguments forwarded to `~regions.PixelRegion.as_artist`. Returns ------- ax : `~astropy.visualization.WCSAxes` Axes to plot on. 
""" if self.region: kwargs_point = kwargs_point or {} if ax is None: ax = plt.gca() if not isinstance(ax, WCSAxes): ax.remove() wcs_geom = self.to_wcs_geom() m = Map.from_geom(geom=wcs_geom.to_image()) ax = m.plot(add_cbar=False, vmin=-1, vmax=0) kwargs.setdefault("facecolor", "None") kwargs.setdefault("edgecolor", "tab:blue") kwargs_point.setdefault("marker", "*") for key, value in kwargs.items(): key_point = ARTIST_TO_LINE_PROPERTIES.get(key, None) if key_point: kwargs_point[key_point] = value for region in compound_region_to_regions(self.region): region_pix = region.to_pixel(wcs=ax.wcs) if isinstance(region, PointSkyRegion): artist = region_pix.as_artist(**kwargs_point) else: artist = region_pix.as_artist(**kwargs) if path_effect: artist.add_path_effect(path_effect) ax.add_artist(artist) return ax else: logging.info("Region definition required.") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/region/ndmap.py0000644000175100001770000006712714721316200017755 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from itertools import product import numpy as np from scipy.ndimage import label as ndi_label from astropy import units as u from astropy.io import fits from astropy.nddata import block_reduce from astropy.table import Table from astropy.visualization import quantity_support import matplotlib.pyplot as plt from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.utils.interpolation import ScaledRegularGridInterpolator, StatProfileScale from gammapy.utils.scripts import make_path from ..axes import MapAxes from ..core import Map from ..geom import pix_tuple_to_idx from ..region import RegionGeom from ..utils import INVALID_INDEX __all__ = ["RegionNDMap"] class RegionNDMap(Map): """N-dimensional region map. A `~RegionNDMap` owns a `~RegionGeom` instance as well as a data array containing the values associated to that region in the sky along the non-spatial axis, usually an energy axis. The spatial dimensions of a `~RegionNDMap` are reduced to a single spatial bin with an arbitrary shape, and any extra dimensions are described by an arbitrary number of non-spatial axes. Parameters ---------- geom : `~gammapy.maps.RegionGeom` Region geometry object. data : `~numpy.ndarray` Data array. If None then an empty array will be allocated. dtype : str, optional Data type. Default is "float32". meta : `dict`, optional Dictionary to store metadata. Default is None. unit : str or `~astropy.units.Unit`, optional The map unit. Default is "". """ def __init__(self, geom, data=None, dtype="float32", meta=None, unit=""): if data is None: data = np.zeros(geom.data_shape, dtype=dtype) if meta is None: meta = {} self._geom = geom self.data = data self.meta = meta self._unit = u.Unit(unit) @property def data(self): return self._data @data.setter def data(self, value): """Set data. Parameters ---------- value : array-like Data array. """ if np.isscalar(value): value = value * np.ones(self.geom.data_shape, dtype=type(value)) if isinstance(value, u.Quantity): raise TypeError("Map data must be a Numpy array. 
Set unit separately") input_shape = value.shape if len(self.geom.data_shape) == 2 + len(value.shape): if self.geom.data_shape[: len(value.shape)] == value.shape: value = np.expand_dims(value, (-2, -1)) if self.geom.data_shape != value.shape: raise ValueError( f"Input shape {input_shape} is not compatible with shape from geometry {self.geom.data_shape}" ) self._data = value def plot(self, ax=None, axis_name=None, **kwargs): """Plot the data contained in region map along the non-spatial axis. Parameters ---------- ax : `~matplotlib.pyplot.Axis`, optional Axis used for plotting. Default is None. axis_name : str, optional Which axis to plot on the x-axis. Extra axes will be plotted as additional lines. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.errorbar`. Returns ------- ax : `~matplotlib.pyplot.Axis` Axis used for plotting. """ ax = ax or plt.gca() if axis_name is None: if self.geom.axes.is_unidimensional: axis_name = self.geom.axes.primary_axis.name else: raise ValueError( "Plotting a region map with multiple extra axes requires " "specifying the 'axis_name' keyword." ) axis = self.geom.axes[axis_name] kwargs.setdefault("marker", "o") kwargs.setdefault("markersize", 4) kwargs.setdefault("ls", "None") kwargs.setdefault("xerr", axis.as_plot_xerr) yerr_nd, yerr = kwargs.pop("yerr", None), None uplims_nd, uplims = kwargs.pop("uplims", None), None label_default = kwargs.pop("label", None) labels = product( *[ax.as_plot_labels for ax in self.geom.axes if ax.name != axis.name] ) for label_axis, (idx, quantity) in zip( labels, self.iter_by_axis_data(axis_name=axis.name) ): if isinstance(yerr_nd, tuple): yerr = yerr_nd[0][idx], yerr_nd[1][idx] elif isinstance(yerr_nd, np.ndarray): yerr = yerr_nd[idx] if uplims_nd is not None: uplims = uplims_nd[idx] label = " ".join(label_axis) if label_default is None else label_default with quantity_support(): ax.errorbar( x=axis.as_plot_center, y=quantity, yerr=yerr, uplims=uplims, label=label, **kwargs, ) axis.format_plot_xaxis(ax=ax) if "energy" in axis_name: ax.set_yscale("log", nonpositive="clip") return ax def plot_hist(self, ax=None, **kwargs): """Plot as histogram. Parameters ---------- ax : `~matplotlib.axis`, optional Axis instance to be used for the plot. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.hist`. Returns ------- ax : `~matplotlib.pyplot.Axis` Axis used for plotting. """ ax = plt.gca() if ax is None else ax kwargs.setdefault("histtype", "step") kwargs.setdefault("lw", 1) if not self.geom.axes.is_unidimensional: raise ValueError("Plotting is only supported for unidimensional maps") axis = self.geom.axes[0] with quantity_support(): weights = self.data[:, 0, 0] ax.hist( axis.as_plot_center, bins=axis.as_plot_edges, weights=weights, **kwargs ) if not self.unit.is_unity(): ax.set_ylabel(f"Data [{self.unit.to_string(UNIT_STRING_FORMAT)}]") axis.format_plot_xaxis(ax=ax) ax.set_yscale("log") return ax def plot_interactive(self): raise NotImplementedError( "Interactive plotting currently not support for RegionNDMap" ) def plot_region(self, ax=None, **kwargs): """Plot region. Parameters ---------- ax : `~astropy.visualization.WCSAxes`, optional Axes to plot on. If no axes are given, the region is shown using the minimal equivalent WCS geometry. Default is None. **kwargs : dict Keyword arguments forwarded to `~regions.PixelRegion.as_artist`. 
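Returns
-------
ax : `~astropy.visualization.WCSAxes`
    Axes used for plotting.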
""" ax = self.geom.plot_region(ax, **kwargs) return ax def plot_mask(self, ax=None, **kwargs): """Plot the mask as a shaded area in a xmin-xmax range. Parameters ---------- ax : `~matplotlib.axis`, optional Axis instance to be used for the plot. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.axvspan`. Returns ------- ax : `~matplotlib.pyplot.Axis` Axis used for plotting. """ if not self.is_mask: raise ValueError("This is not a mask and cannot be plotted") kwargs.setdefault("color", "k") kwargs.setdefault("alpha", 0.05) kwargs.setdefault("label", "mask") ax = plt.gca() if ax is None else ax edges = self.geom.axes["energy"].edges.reshape((-1, 1, 1)) labels, nlabels = ndi_label(self.data) for idx in range(1, nlabels + 1): mask = labels == idx xmin = edges[:-1][mask].min().value xmax = edges[1:][mask].max().value ax.axvspan(xmin, xmax, **kwargs) return ax @classmethod def create( cls, region, axes=None, dtype="float32", meta=None, unit="", wcs=None, binsz_wcs="0.1 deg", data=None, ): """Create an empty region map object. Parameters ---------- region : str or `~regions.SkyRegion` Region specification. axes : list of `MapAxis`, optional Non-spatial axes. Default is None. dtype : str, optional Data type. Default is 'float32'. unit : str or `~astropy.units.Unit`, optional Data unit. Default is "". meta : `dict`, optional Dictionary to store metadata. Default is None. wcs : `~astropy.wcs.WCS`, optional WCS projection to use for local projections of the region. Default is None. binsz_wcs: `~astropy.units.Quantity` or str, optional Bin size used for the default WCS, if ``wcs=None``. Default is "0.1 deg". data : `~numpy.ndarray`, optional Data array. Default is None. Returns ------- map : `RegionNDMap` Region map. """ geom = RegionGeom.create(region=region, axes=axes, wcs=wcs, binsz_wcs=binsz_wcs) return cls(geom=geom, dtype=dtype, unit=unit, meta=meta, data=data) def downsample(self, factor, preserve_counts=True, axis_name=None, weights=None): """Downsample the non-spatial dimension by a given factor. By default, the first axes is downsampled. Parameters ---------- factor : int Downsampling factor. preserve_counts : bool, optional Preserve the integral over each bin. This should be true if the map is an integral quantity (e.g. counts) and false if the map is a differential quantity (e.g. intensity). Default is True. axis_name : str, optional Which axis to downsample. Default is "energy". weights : `RegionNDMap`, optional Contains the weights to apply to the axis to reduce. Default weights of one. Returns ------- map : `RegionNDMap` Downsampled region map. """ if axis_name is None: axis_name = self.geom.axes[0].name geom = self.geom.downsample(factor=factor, axis_name=axis_name) block_size = [1] * self.data.ndim idx = self.geom.axes.index_data(axis_name) block_size[idx] = factor if weights is None: weights = 1 else: weights = weights.data func = np.nansum if preserve_counts else np.nanmean if self.is_mask: func = np.all data = block_reduce(self.data * weights, tuple(block_size), func=func) return self._init_copy(geom=geom, data=data) def upsample(self, factor, order=0, preserve_counts=True, axis_name=None): """Upsample the non-spatial dimension by a given factor. By default, the first axes is upsampled. Parameters ---------- factor : int Upsampling factor. order : int, optional Order of the interpolation used for upsampling. Default is 0. preserve_counts : bool, optional Preserve the integral over each bin. 
This should be true if the RegionNDMap is an integral quantity (e.g. counts) and false if the RegionNDMap is a differential quantity (e.g. intensity). Default is True. axis_name : str, optional Which axis to upsample. Default is "energy". Returns ------- map : `RegionNDMap` Upsampled region map. """ if axis_name is None: axis_name = self.geom.axes[0].name geom = self.geom.upsample(factor=factor, axis_name=axis_name) data = self.interp_by_coord(geom.get_coord()) if preserve_counts: data /= factor return self._init_copy(geom=geom, data=data) def iter_by_axis_data(self, axis_name): """Iterate data by axis. Parameters ---------- axis_name : str Axis name. Returns ------- idx, data : tuple, `~astropy.units.Quantity` Data and index. """ idx_axis = self.geom.axes.index_data(axis_name) shape = list(self.data.shape) shape[idx_axis] = 1 for idx in np.ndindex(*shape): idx = list(idx) idx[idx_axis] = slice(None) yield tuple(idx), self.quantity[tuple(idx)] def _resample_by_idx(self, idx, weights=None, preserve_counts=False): # inherited docstring # TODO: too complex, simplify! idx = pix_tuple_to_idx(idx) msk = np.all(np.stack([t != INVALID_INDEX.int for t in idx]), axis=0) idx = [t[msk] for t in idx] if weights is not None: if isinstance(weights, u.Quantity): weights = weights.to_value(self.unit) weights = weights[msk] idx = np.ravel_multi_index(idx, self.data.T.shape) idx, idx_inv = np.unique(idx, return_inverse=True) weights = np.bincount(idx_inv, weights=weights).astype(self.data.dtype) if not preserve_counts: weights /= np.bincount(idx_inv).astype(self.data.dtype) self.data.T.flat[idx] += weights def fill_by_idx(self, idx, weights=None): return self._resample_by_idx(idx, weights=weights, preserve_counts=True) def get_by_idx(self, idxs): # inherited docstring return self.data[idxs[::-1]] def interp_by_coord(self, coords, **kwargs): """Interpolate map values at the given map coordinates. Parameters ---------- coords : tuple, dict or `~gammapy.maps.MapCoord` Coordinate arrays for each dimension of the map. Tuple should be ordered as (lon, lat, x_0, ..., x_n) where x_i are coordinates for non-spatial dimensions of the map. "lon" and "lat" are optional and will be taken at the center of the region by default. method : {"linear", "nearest"} Method to interpolate data values. Default is "linear". fill_value : None or float value The value to use for points outside the interpolation domain. If None, values outside the domain are extrapolated. values_scale : {"lin", "log", "sqrt"} Optional value scaling. Returns ------- vals : `~numpy.ndarray` Interpolated pixel values. """ pix = self.geom.coord_to_pix(coords=coords) return self.interp_by_pix(pix, **kwargs) def interp_by_pix(self, pix, **kwargs): # inherited docstring grid_pix = [np.arange(n, dtype=float) for n in self.data.shape[::-1]] if np.any(np.isfinite(self.data)): data = self.data.copy().T data[~np.isfinite(data)] = 0.0 else: data = self.data.T scale = kwargs.get("values_scale", "lin") if scale == "stat-profile": axis = 2 + self.geom.axes.index("norm") kwargs["values_scale"] = StatProfileScale(axis=axis) fn = ScaledRegularGridInterpolator(grid_pix, data, **kwargs) return fn(tuple(pix), clip=False) def set_by_idx(self, idx, value): # inherited docstring self.data[idx[::-1]] = value @classmethod def read(cls, filename, format="gadf", ogip_column=None, hdu=None, checksum=False): """Read from file. Parameters ---------- filename : `pathlib.Path` or str Filename. format : {"gadf", "ogip", "ogip-arf"} Which format to use. Default is "gadf". 
ogip_column : {None, "COUNTS", "QUALITY", "BACKSCAL"}, optional If format 'ogip' is chosen which table hdu column to read. Default is None. hdu : str, optional Name or index of the HDU with the map data. Default is None. checksum : bool If True checks both DATASUM and CHECKSUM cards in the file headers. Default is False. Returns ------- region_map : `RegionNDMap` Region map. """ filename = make_path(filename) with fits.open(filename, memmap=False, checksum=checksum) as hdulist: return cls.from_hdulist( hdulist, format=format, ogip_column=ogip_column, hdu=hdu ) def write( self, filename, overwrite=False, format="gadf", hdu="SKYMAP", checksum=False ): """Write map to file. Parameters ---------- filename : `pathlib.Path` or str Filename. overwrite : bool, optional Overwrite existing file. Default is False. format : {"gadf", "ogip", "ogip-sherpa", "ogip-arf", "ogip-arf-sherpa"} Which format to use. Default is "gadf". hdu : str, optional Name of the HDU. Default is "SKYMAP". checksum : bool, optional When True adds both DATASUM and CHECKSUM cards to the headers written to the file. Default is False. """ filename = make_path(filename) self.to_hdulist(format=format, hdu=hdu).writeto( filename, overwrite=overwrite, checksum=checksum ) def to_hdulist(self, format="gadf", hdu="SKYMAP", hdu_bands=None, hdu_region=None): """Convert to `~astropy.io.fits.HDUList`. Parameters ---------- format : {"gadf", "ogip", "ogip-sherpa", "ogip-arf", "ogip-arf-sherpa"} Format specification. Default is "gadf". hdu : str, optional Name of the HDU with the map data, used for "gadf" format. Default is "SKYMAP". hdu_bands : str, optional Name or index of the HDU with the BANDS table, used for "gadf" format. Default is None. hdu_region : str, optional Name or index of the HDU with the region table. Default is None. Returns ------- hdulist : `~astropy.fits.HDUList` HDU list. """ hdulist = fits.HDUList() table = self.to_table(format=format) if hdu_bands is None: hdu_bands = f"{hdu.upper()}_BANDS" if hdu_region is None: hdu_region = f"{hdu.upper()}_REGION" if format in ["ogip", "ogip-sherpa", "ogip-arf", "ogip-arf-sherpa"]: hdulist.append(fits.BinTableHDU(table)) elif format == "gadf": table.meta.update(self.geom.axes.to_header()) hdulist.append(fits.BinTableHDU(table, name=hdu)) else: raise ValueError(f"Unsupported format '{format}'") if format in ["ogip", "ogip-sherpa", "gadf"]: hdulist_geom = self.geom.to_hdulist( format=format, hdu_bands=hdu_bands, hdu_region=hdu_region ) hdulist.extend(hdulist_geom[1:]) return hdulist @classmethod def from_table(cls, table, format="", colname=None): """Create region map from table. Parameters ---------- table : `~astropy.table.Table` Table with input data. format : {"gadf-sed", "lightcurve", "profile"} Format to use. colname : str Column name to take the data from. Returns ------- region_map : `RegionNDMap` Region map. """ if format == "gadf-sed": if colname is None: raise ValueError("Column name required") axes = MapAxes.from_table(table=table, format=format) if colname == "stat_scan": names = ["norm", "energy"] # TODO: this is not officially supported by GADF... elif colname in ["counts", "npred", "npred_excess"]: names = ["dataset", "energy"] else: names = ["energy"] axes = axes[names] data = table[colname].data unit = table[colname].unit or "" elif format == "lightcurve": axes = MapAxes.from_table(table=table, format=format) if colname == "stat_scan": names = ["norm", "energy", "time"] # TODO: this is not officially supported by GADF... 
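# A "stat_scan" column carries the extra "norm" axis, hence the
# additional leading axis name above.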
elif colname in ["counts", "npred", "npred_excess"]: names = ["dataset", "energy", "time"] else: names = ["energy", "time"] axes = axes[names] data = table[colname].data unit = table[colname].unit or "" elif format == "profile": axes = MapAxes.from_table(table=table, format=format) if colname == "stat_scan": names = ["norm", "energy", "projected-distance"] # TODO: this is not officially supported by GADF... elif colname in ["counts", "npred", "npred_excess"]: names = ["dataset", "energy", "projected-distance"] else: names = ["energy", "projected-distance"] axes = axes[names] data = table[colname].data unit = table[colname].unit or "" else: raise ValueError(f"Format not supported {format}") geom = RegionGeom.create(region=None, axes=axes) data = data.reshape(geom.data_shape) return cls(geom=geom, data=data, unit=unit, meta=table.meta, dtype=data.dtype) @classmethod def from_hdulist(cls, hdulist, format="gadf", ogip_column=None, hdu=None, **kwargs): """Create from `~astropy.io.fits.HDUList`. Parameters ---------- hdulist : `~astropy.io.fits.HDUList` HDU list. format : {"gadf", "ogip", "ogip-arf"} Format specification. Default is "gadf". ogip_column : {"COUNTS", "QUALITY", "BACKSCAL"}, optional If format 'ogip' is chosen which table HDU column to read. Default is None. hdu : str, optional Name or index of the HDU with the map data. Default is None. Returns ------- region_nd_map : `RegionNDMap` Region map. """ defaults = { "ogip": {"hdu": "SPECTRUM", "column": "COUNTS"}, "ogip-arf": {"hdu": "SPECRESP", "column": "SPECRESP"}, "gadf": {"hdu": "SKYMAP", "column": "DATA"}, } if hdu is None: hdu = defaults[format]["hdu"] if ogip_column is None: ogip_column = defaults[format]["column"] geom = RegionGeom.from_hdulist(hdulist, format=format, hdu=hdu) table = Table.read(hdulist[hdu]) quantity = table[ogip_column].quantity.reshape(geom.data_shape) if ogip_column == "QUALITY": data, unit = np.logical_not(quantity.value.astype(bool)), "" else: data, unit = quantity.value, quantity.unit return cls(geom=geom, data=data, meta=table.meta, unit=unit, dtype=data.dtype) def _pad_spatial(self, *args, **kwargs): raise NotImplementedError("Spatial padding is not supported by RegionNDMap") def crop(self): raise NotImplementedError("Crop is not supported by RegionNDMap") def stack(self, other, weights=None, nan_to_num=True): """Stack other region map into map. Parameters ---------- other : `RegionNDMap` Other map to stack. weights : `RegionNDMap`, optional Array to be used as weights. The spatial geometry must be equivalent to `other` and additional axes must be broadcastable. Default is None. nan_to_num: bool, optional Non-finite values are replaced by zero if True. Default is True. """ data = other.quantity.to_value(self.unit).astype(self.data.dtype) # TODO: re-think stacking of regions. Is making the union reasonable? # self.geom.union(other.geom) if nan_to_num: not_finite = ~np.isfinite(data) if np.any(not_finite): data = data.copy() data[not_finite] = 0 if weights is not None: if not other.geom.to_image() == weights.geom.to_image(): raise ValueError("Incompatible geoms between map and weights") data = data * weights.data self.data += data def to_table(self, format="gadf"): """Convert to `~astropy.table.Table`. Data format specification: :ref:`gadf:ogip-pha`. Parameters ---------- format : {"gadf", "ogip", "ogip-arf", "ogip-arf-sherpa"} Format specification. Default is "gadf". Returns ------- table : `~astropy.table.Table` Table. 
""" data = np.nan_to_num(self.quantity[:, 0, 0]) if format == "ogip": if len(self.geom.axes) > 1: raise ValueError( f"Writing to format '{format}' only supports a " f"single energy axis. Got {self.geom.axes.names}" ) energy_axis = self.geom.axes[0] energy_axis.assert_name("energy") table = Table() table["CHANNEL"] = np.arange(energy_axis.nbin, dtype=np.int16) table["COUNTS"] = np.array(data, dtype=np.int32) # see https://heasarc.gsfc.nasa.gov/docs/heasarc/ofwg/docs/spectra/ogip_92_007/node6.html # noqa: E501 table.meta = { "EXTNAME": "SPECTRUM", "telescop": "unknown", "instrume": "unknown", "filter": "None", "exposure": 0, "corrfile": "", "corrscal": "", "ancrfile": "", "hduclass": "OGIP", "hduclas1": "SPECTRUM", "hduvers": "1.2.1", "poisserr": True, "chantype": "PHA", "detchans": energy_axis.nbin, "quality": 0, "backscal": 0, "grouping": 0, "areascal": 1, } elif format in ["ogip-arf", "ogip-arf-sherpa"]: if len(self.geom.axes) > 1: raise ValueError( f"Writing to format '{format}' only supports a " f"single energy axis. Got {self.geom.axes.names}" ) energy_axis = self.geom.axes[0] table = energy_axis.to_table(format=format) table.meta = { "EXTNAME": "SPECRESP", "telescop": "unknown", "instrume": "unknown", "filter": "None", "hduclass": "OGIP", "hduclas1": "RESPONSE", "hduclas2": "SPECRESP", "hduvers": "1.1.0", } if format == "ogip-arf-sherpa": data = data.to("cm2") table["SPECRESP"] = data elif format == "gadf": table = Table() data = self.quantity table["CHANNEL"] = np.arange(len(data), dtype=np.int16) table["DATA"] = data else: raise ValueError(f"Unsupported format: '{format}'") meta = {k: self.meta.get(k, v) for k, v in table.meta.items()} table.meta.update(meta) return table def get_spectrum(self, *args, **kwargs): """Return self.""" return self def to_region_nd_map(self, *args, **kwargs): """Return self.""" return self def cutout(self, *args, **kwargs): """Return self.""" return self ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2206419 gammapy-1.3/gammapy/maps/region/tests/0000755000175100001770000000000014721316215017437 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/region/tests/__init__.py0000644000175100001770000000010014721316200021531 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/region/tests/test_geom.py0000644000175100001770000003246214721316200022000 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion, CompoundSkyRegion, RectangleSkyRegion import matplotlib.pyplot as plt from gammapy.maps import MapAxis, RegionGeom, WcsGeom from gammapy.utils.testing import mpl_plot_check @pytest.fixture() def region(): center = SkyCoord("0 deg", "0 deg", frame="galactic") return CircleSkyRegion(center=center, radius=1 * u.deg) @pytest.fixture() def energy_axis(): return MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) @pytest.fixture() def test_axis(): return MapAxis.from_nodes([1, 2], unit="", name="test") def test_create(region): geom = RegionGeom.create(region) assert geom.frame == "galactic" assert geom.projection == "TAN" assert geom.is_image assert not geom.is_allsky def 

gammapy-1.3/gammapy/maps/region/tests/__init__.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst

gammapy-1.3/gammapy/maps/region/tests/test_geom.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
import numpy as np
from numpy.testing import assert_allclose
import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import CircleSkyRegion, CompoundSkyRegion, RectangleSkyRegion
import matplotlib.pyplot as plt
from gammapy.maps import MapAxis, RegionGeom, WcsGeom
from gammapy.utils.testing import mpl_plot_check


@pytest.fixture()
def region():
    center = SkyCoord("0 deg", "0 deg", frame="galactic")
    return CircleSkyRegion(center=center, radius=1 * u.deg)


@pytest.fixture()
def energy_axis():
    return MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)


@pytest.fixture()
def test_axis():
    return MapAxis.from_nodes([1, 2], unit="", name="test")


def test_create(region):
    geom = RegionGeom.create(region)
    assert geom.frame == "galactic"
    assert geom.projection == "TAN"
    assert geom.is_image
    assert not geom.is_allsky


def test_from_regions(region):
    RegionGeom.from_regions(region)

    geom = RegionGeom.from_regions("galactic;circle(10,20,3)")
    assert geom.region.radius.value == 3

    geom = RegionGeom.from_regions(region.center)
    assert geom.region.center == region.center

    geom = RegionGeom.from_regions([])
    assert geom.region is None


def test_binsz(region):
    geom = RegionGeom.create(region, binsz_wcs=0.05)
    wcs_geom = geom.to_wcs_geom()
    assert geom.binsz_wcs[0].deg == 0.05
    assert_allclose(wcs_geom.pixel_scales, geom.binsz_wcs)


def test_defined_wcs(region):
    wcs = WcsGeom.create(
        skydir=(0, 0), frame="galactic", width="1.5deg", binsz="0.1deg"
    ).wcs
    geom = RegionGeom.create(region, wcs=wcs)
    assert geom.binsz_wcs[0].deg == 0.1


def test_to_binsz_wcs(region):
    binsz = 0.05 * u.deg
    geom = RegionGeom.create(region, binsz_wcs=0.01)
    new_geom = geom.to_binsz_wcs(binsz)
    assert geom.binsz_wcs[0].deg == 0.01
    assert new_geom.binsz_wcs[0].deg == binsz.value


def test_centers(region):
    geom = RegionGeom.create(region)
    assert_allclose(geom.center_skydir.l.deg, 0, atol=1e-30)
    assert_allclose(geom.center_skydir.b.deg, 0, atol=1e-30)
    assert_allclose(geom.center_pix, (0, 0))

    values = [_.value for _ in geom.center_coord]
    assert_allclose(values, (0, 0), atol=1e-30)


def test_width(region):
    geom = RegionGeom.create(region, binsz_wcs=0.01)
    assert_allclose(geom.width.value, [2.02, 2.02])


def test_create_axis(region, energy_axis, test_axis):
    geom = RegionGeom.create(region, axes=[energy_axis])
    assert geom.ndim == 3
    assert len(geom.axes) == 1
    assert geom.data_shape == (3, 1, 1)
    assert geom.data_shape_axes == (3, 1, 1)

    geom = RegionGeom.create(region, axes=[energy_axis, test_axis])
    assert geom.ndim == 4
    assert len(geom.axes) == 2
    assert geom.data_shape == (2, 3, 1, 1)


def test_get_coord(region, energy_axis, test_axis):
    geom = RegionGeom.create(region, axes=[energy_axis])
    coords = geom.get_coord()
    assert_allclose(coords.lon, 0, atol=1e-30)
    assert_allclose(coords.lat, 0, atol=1e-30)
    assert_allclose(
        coords["energy"].value.squeeze(), [1.467799, 3.162278, 6.812921], rtol=1e-5
    )

    geom = RegionGeom.create(region, axes=[energy_axis, test_axis])
    coords = geom.get_coord(sparse=True)
    assert coords["lon"].shape == (1, 1)
    assert coords["test"].shape == (2, 1, 1, 1)
    assert coords["energy"].shape == (1, 3, 1, 1)
    assert_allclose(
        coords["energy"].value[0, :, 0, 0], [1.467799, 3.162278, 6.812921], rtol=1e-5
    )
    assert_allclose(coords["test"].value[:, 0, 0, 0].squeeze(), [1, 2], rtol=1e-5)

    coords = geom.get_coord(sparse=False)
    assert coords["lon"].shape == (2, 3, 1, 1)
    assert coords["test"].shape == (2, 3, 1, 1)
    assert coords["energy"].shape == (2, 3, 1, 1)
    assert_allclose(coords["test"].value[:, 2, 0, 0].squeeze(), [1, 2], rtol=1e-5)


def test_get_idx(region, energy_axis, test_axis):
    geom = RegionGeom.create(region, axes=[energy_axis])
    pix = geom.get_idx()
    assert_allclose(pix[0], 0)
    assert_allclose(pix[1], 0)
    assert_allclose(pix[2].squeeze(), [0, 1, 2])

    geom = RegionGeom.create(region, axes=[energy_axis, test_axis])
    pix = geom.get_idx()
    assert pix[0].shape == (2, 3, 1, 1)
    assert_allclose(pix[0], 0)
    assert_allclose(pix[1], 0)
    assert_allclose(pix[2][0].squeeze(), [0, 1, 2])


def test_coord_to_pix(region, energy_axis, test_axis):
    geom = RegionGeom.create(region, axes=[energy_axis])
    position = SkyCoord(0, 0, frame="galactic", unit="deg")
    coords = {"skycoord": position, "energy": 1 * u.TeV}
    coords_pix = geom.coord_to_pix(coords)
    assert_allclose(coords_pix[0], 0)
    assert_allclose(coords_pix[1], 0)
    assert_allclose(coords_pix[2], -0.5)

    geom = RegionGeom.create(region, axes=[energy_axis, test_axis])
    coords["test"] = 2
    coords_pix = geom.coord_to_pix(coords)
    assert_allclose(coords_pix[0], 0)
    assert_allclose(coords_pix[1], 0)
    assert_allclose(coords_pix[2], -0.5)
    assert_allclose(coords_pix[3], 1)


def test_pix_to_coord(region, energy_axis):
    geom = RegionGeom.create(region, axes=[energy_axis])
    pix = (0, 0, 0)
    coords = geom.pix_to_coord(pix)
    assert_allclose(coords[0].value, 0, atol=1e-30)
    assert_allclose(coords[1].value, 0, atol=1e-30)
    assert_allclose(coords[2].value, 1.467799, rtol=1e-5)

    pix = (1, 1, 1)
    coords = geom.pix_to_coord(pix)
    assert_allclose(coords[0].value, np.nan)
    assert_allclose(coords[1].value, np.nan)
    assert_allclose(coords[2].value, 3.162278, rtol=1e-5)

    pix = (1, 1, 3)
    coords = geom.pix_to_coord(pix)
    assert_allclose(coords[2].value, 14.677993, rtol=1e-5)


def test_pix_to_coord_2axes(region, energy_axis, test_axis):
    geom = RegionGeom.create(region, axes=[energy_axis, test_axis])
    pix = (0, 0, 0, 0)
    coords = geom.pix_to_coord(pix)
    assert_allclose(coords[0].value, 0, atol=1e-30)
    assert_allclose(coords[1].value, 0, atol=1e-30)
    assert_allclose(coords[2].value, 1.467799, rtol=1e-5)
    assert_allclose(coords[3].value, 1)

    pix = (0, 0, 0, 2)
    coords = geom.pix_to_coord(pix)
    assert_allclose(coords[3].value, 3)


def test_contains(region):
    geom = RegionGeom.create(region)
    position = SkyCoord([0, 0], [0, 1.1], frame="galactic", unit="deg")
    contains = geom.contains(coords={"skycoord": position})
    assert_allclose(contains, [1, 0])


def test_solid_angle(region):
    geom = RegionGeom.create(region)
    omega = geom.solid_angle()
    assert omega.unit == "sr"
    reference = 2 * np.pi * (1 - np.cos(region.radius))
    assert_allclose(omega.value, reference.value, rtol=1e-3)


def test_solid_angle_compound():
    center1 = SkyCoord(ra=0 * u.deg, dec=0 * u.deg)
    center2 = SkyCoord(ra=3 * u.deg, dec=0 * u.deg)
    region1 = CircleSkyRegion(center1, radius=1 * u.deg)
    region2 = CircleSkyRegion(center2, radius=1 * u.deg)

    # regions don't overlap, so expected area is the sum of both
    region = region1 | region2
    expected = sum(RegionGeom.create(r).solid_angle() for r in [region1, region2])
    geom = RegionGeom.create(region)
    omega = geom.solid_angle()
    assert u.isclose(omega, expected, rtol=2e-3)

    region1 = CircleSkyRegion(center1, radius=5 * u.deg)
    region2 = CircleSkyRegion(center2, radius=1 * u.deg)

    # fully overlapping regions, expect only area of the larger one
    expected = RegionGeom.create(region1).solid_angle()
    region = region1 | region2
    assert isinstance(region, CompoundSkyRegion)
    geom = RegionGeom.create(region)
    omega = geom.solid_angle()
    assert u.isclose(omega, expected, rtol=2e-3)


def test_bin_volume(region):
    axis = MapAxis.from_edges([1, 3] * u.TeV, name="energy", interp="log")
    geom = RegionGeom.create(region, axes=[axis])
    volume = geom.bin_volume()
    assert volume.unit == "sr TeV"
    reference = 2 * 2 * np.pi * (1 - np.cos(region.radius))
    assert_allclose(volume.value, reference.value, rtol=1e-3)


def test_separation(region):
    geom = RegionGeom.create(region)
    position = SkyCoord([0, 0], [0, 1.1], frame="galactic", unit="deg")
    separation = geom.separation(position)
    assert_allclose(separation.deg, [0, 1.1], atol=1e-30)


def test_upsample(region):
    axis = MapAxis.from_edges([1, 10] * u.TeV, name="energy", interp="log")
    geom = RegionGeom.create(region, axes=[axis])
    geom_up = geom.upsample(factor=2, axis_name="energy")
    assert_allclose(geom_up.axes[0].edges.value, [1.0, 3.162278, 10.0], rtol=1e-5)


def test_downsample(region):
    axis = MapAxis.from_edges([1, 3.162278, 10] * u.TeV, name="energy", interp="log")
    geom = RegionGeom.create(region, axes=[axis])
    geom_down = geom.downsample(factor=2, axis_name="energy")
    assert_allclose(geom_down.axes[0].edges.value, [1.0, 10.0], rtol=1e-5)


def test_str(region):
    axis = MapAxis.from_edges([1, 3.162278, 10] * u.TeV, name="energy", interp="log")
    geom = RegionGeom.create(region, axes=[axis])
    assert "RegionGeom" in str(geom)
    assert "CircleSkyRegion" in str(geom)


def test_eq(region):
    axis = MapAxis.from_edges([1, 10] * u.TeV, name="energy", interp="log")
    geom_1 = RegionGeom.create(region, axes=[axis])
    geom_2 = RegionGeom.create(region, axes=[axis])
    assert geom_1 == geom_2

    axis = MapAxis.from_edges([1, 100] * u.TeV, name="energy", interp="log")
    geom_3 = RegionGeom.create(region, axes=[axis])
    assert not geom_2 == geom_3


def test_to_cube_to_image(region):
    axis = MapAxis.from_edges([1, 10] * u.TeV, name="energy", interp="log")
    geom = RegionGeom.create(region)

    geom_cube = geom.to_cube([axis])
    assert geom_cube.ndim == 3

    geom = geom_cube.to_image()
    assert geom.ndim == 2


def test_squash(region):
    axis1 = MapAxis.from_edges([1, 10, 100] * u.TeV, name="energy", interp="log")
    axis2 = MapAxis.from_edges([1, 2, 3, 4] * u.deg, name="angle", interp="lin")
    geom = RegionGeom(region, axes=[axis1, axis2])

    geom_squashed = geom.squash("energy")
    assert len(geom_squashed.axes) == 2
    assert geom_squashed.axes[1] == axis2
    assert_allclose(geom_squashed.axes[0].edges.to_value("TeV"), (1, 100))


def test_pad(region):
    axis1 = MapAxis.from_edges([1, 10] * u.TeV, name="energy", interp="log")
    axis2 = MapAxis.from_nodes([1, 2, 3, 4] * u.deg, name="angle", interp="lin")
    geom = RegionGeom(region, axes=[axis1, axis2])

    geom_pad = geom.pad(axis_name="energy", pad_width=1)
    assert_allclose(geom_pad.axes["energy"].nbin, 3)

    geom_pad = geom.pad(axis_name="angle", pad_width=1)
    assert_allclose(geom_pad.axes["angle"].nbin, 6)


def test_to_wcs_geom(region):
    geom = RegionGeom(region)
    wcs_geom = geom.to_wcs_geom()
    assert_allclose(wcs_geom.center_coord[1].value, 0, rtol=0.001, atol=0)
    assert_allclose(wcs_geom.width[0], 360 * u.deg, rtol=1, atol=0)
    assert wcs_geom.wcs.wcs.ctype[1] == "GLAT-TAN"

    # test with an extra axis
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10)
    geom_cube = geom.to_cube([axis])
    wcs_geom_cube = geom_cube.to_wcs_geom()
    assert wcs_geom_cube.to_image() == wcs_geom
    assert wcs_geom_cube.axes[0] == axis

    # test with minimum widths
    width_min = 3 * u.deg
    wcs_geom = geom.to_wcs_geom(width_min=width_min)
    assert_allclose(wcs_geom.center_coord[1].value, 0, rtol=0.001, atol=0)
    assert_allclose(wcs_geom.width, [[3], [3]] * u.deg, rtol=1, atol=0)

    width_min = [1, 3] * u.deg
    wcs_geom = geom.to_wcs_geom(width_min=width_min)
    assert_allclose(wcs_geom.center_coord[1].value, 0, rtol=0.001, atol=0)
    assert_allclose(wcs_geom.width, [[2], [3]] * u.deg, rtol=1, atol=0)


def test_get_wcs_coord_and_weights(region):
    # test on circular region
    geom = RegionGeom(region)
    region_coord, weights = geom.get_wcs_coord_and_weights()

    wcs_geom = geom.to_wcs_geom()
    solid_angles = wcs_geom.solid_angle().T[wcs_geom.coord_to_idx(region_coord)]
    area = (weights * solid_angles).sum()
    assert_allclose(area.value, geom.solid_angle().value, rtol=1e-3)
    assert region_coord.shape == weights.shape

    # test on rectangular region (asymmetric)
    center = SkyCoord("0 deg", "0 deg", frame="galactic")
    region = RectangleSkyRegion(
        center=center, width=1 * u.deg, height=2 * u.deg, angle=15 * u.deg
    )
    geom = RegionGeom(region)
    wcs_geom = geom.to_wcs_geom()

    region_coord, weights = geom.get_wcs_coord_and_weights()
    solid_angles = wcs_geom.solid_angle().T[wcs_geom.coord_to_idx(region_coord)]
    area = (weights * solid_angles).sum()
    assert_allclose(area.value, geom.solid_angle().value, rtol=1e-3)
    assert region_coord.shape == weights.shape


def test_region_nd_map_plot(region):
    geom = RegionGeom(region)

    fig = plt.figure()
    ax = fig.add_subplot(projection=geom.wcs)

    with mpl_plot_check():
        geom.plot_region(ax=ax)


def test_region_geom_to_from_hdu(region):
    axis1 = MapAxis.from_edges([1, 10] * u.TeV, name="energy", interp="log")
    geom = RegionGeom.create(region, axes=[axis1])
    hdulist = geom.to_hdulist(format="ogip")
    new_geom = RegionGeom.from_hdulist(hdulist, format="ogip")

    assert new_geom == geom
    assert new_geom.region.meta["include"]


def test_contains_point_sky_region():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
    geom = RegionGeom.create(
        region="galactic;point(0, 0)", axes=[axis], binsz_wcs=0.01 * u.deg
    )
    assert all(geom.contains(geom.center_skydir))
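
# Illustrative aside (not part of the upstream test suite): the solid angle of
# a circular RegionGeom matches the analytic spherical-cap formula
# 2 * pi * (1 - cos(radius)), which is what test_solid_angle above verifies.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import CircleSkyRegion
from gammapy.maps import RegionGeom

circle = CircleSkyRegion(
    center=SkyCoord(0, 0, unit="deg", frame="galactic"), radius=1 * u.deg
)
geom = RegionGeom.create(circle)
omega = geom.solid_angle()  # integrated on the internal WCS grid
expected = 2 * np.pi * (1 - np.cos(1 * u.deg)) * u.sr
assert u.isclose(omega, expected, rtol=1e-3)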

gammapy-1.3/gammapy/maps/region/tests/test_ndmap.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
import numpy as np
from numpy.testing import assert_allclose
from astropy import units as u
from astropy.time import Time
from regions import CircleSkyRegion
import matplotlib.pyplot as plt
from gammapy.data import EventList
from gammapy.maps import (
    LabelMapAxis,
    Map,
    MapAxis,
    RegionGeom,
    RegionNDMap,
    TimeMapAxis,
)
from gammapy.utils.testing import mpl_plot_check, requires_data


@pytest.fixture
def region_map():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=6, name="energy")
    m = Map.create(
        region="icrs;circle(83.63, 21.51, 1)",
        map_type="region",
        axes=[axis],
        unit="1/TeV",
    )
    m.data = np.arange(m.data.size, dtype=float).reshape(m.geom.data_shape)
    return m


@pytest.fixture
def region_map_no_region():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=6, name="energy")
    m = Map.create(
        region=None,
        map_type="region",
        axes=[axis],
        unit="1/TeV",
    )
    m.data = np.arange(m.data.size, dtype=float).reshape(m.geom.data_shape)
    return m


@pytest.fixture
def point_region_map():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=6, name="energy")
    m = Map.create(
        region="icrs;point(83.63, 21.51)",
        map_type="region",
        axes=[axis],
        unit="1/TeV",
    )
    m.data = np.arange(m.data.size, dtype=float).reshape(m.geom.data_shape)
    return m


def test_region_nd_map(region_map):
    assert_allclose(region_map.data.sum(), 15)
    assert region_map.geom.frame == "icrs"
    assert region_map.unit == "TeV-1"
    assert region_map.data.dtype == float
    assert "RegionNDMap" in str(region_map)
    assert "1 / TeV" in str(region_map)


def test_region_nd_map_sum_over_axes(region_map):
    region_map_summed = region_map.sum_over_axes()
    weights = RegionNDMap.from_geom(region_map.geom, data=1.0)
    weights.data[5, :, :] = 0
    region_map_summed_weights = region_map.sum_over_axes(weights=weights)

    assert_allclose(region_map_summed.data, 15)
    assert_allclose(region_map_summed.data.shape, (1, 1, 1))
    assert_allclose(region_map_summed_weights.data, 10)


def test_region_nd_map_plot(region_map):
    with mpl_plot_check():
        region_map.plot()

    fig = plt.figure()
    ax = fig.add_subplot(1, 1, 1, projection=region_map.geom.wcs)
    with mpl_plot_check():
        region_map.plot_region(ax=ax)


def test_region_nd_map_plot_two_axes():
    energy_axis = MapAxis.from_energy_edges([1, 3, 10] * u.TeV)

    time_ref = Time("1999-01-01T00:00:00.123456789")
    time_axis = TimeMapAxis(
        edges_min=[0, 1, 3] * u.d,
        edges_max=[0.8, 1.9, 5.4] * u.d,
        reference_time=time_ref,
    )

    m = RegionNDMap.create("icrs;circle(0, 0, 1)", axes=[energy_axis, time_axis])
    m.data = 10 + np.random.random(m.data.shape)

    with mpl_plot_check():
        m.plot(axis_name="energy")

    with mpl_plot_check():
        m.plot(axis_name="time")

    with pytest.raises(ValueError):
        m.plot()


def test_region_nd_map_plot_label_axis():
    energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=5)
    label_axis = LabelMapAxis(labels=["dataset-1", "dataset-2"], name="dataset")

    m = RegionNDMap.create(region=None, axes=[energy_axis, label_axis])

    with mpl_plot_check():
        m.plot(axis_name="energy")

    with mpl_plot_check():
        m.plot(axis_name="dataset")


def test_label_axis_io(tmpdir):
    energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=5)
    label_axis = LabelMapAxis(labels=["dataset-1", "dataset-2"], name="dataset")

    m = RegionNDMap.create(region=None, axes=[energy_axis, label_axis])
    m.data = np.arange(m.data.size).reshape((2, 5))

    filename = tmpdir / "test.fits"
    m.write(filename, format="gadf")

    m_new = RegionNDMap.read(filename, format="gadf")
    assert m.geom.axes["dataset"] == m_new.geom.axes["dataset"]
    assert m.geom.axes["energy"] == m_new.geom.axes["energy"]


def test_region_plot_mask(region_map):
    mask = region_map.geom.energy_mask(2.5 * u.TeV, 6 * u.TeV)
    with mpl_plot_check():
        mask.plot_mask()


def test_region_nd_map_misc(region_map):
    assert_allclose(region_map.sum_over_axes(), 15)

    stacked = region_map.copy()
    stacked.stack(region_map)
    assert_allclose(stacked.data.sum(), 30)

    stacked = region_map.copy()
    weights = Map.from_geom(region_map.geom, dtype=np.int_)
    stacked.stack(region_map, weights=weights)
    assert_allclose(stacked.data.sum(), 15)


def test_stack_different_unit():
    region = "icrs;circle(0, 0, 1)"
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
    region_map = RegionNDMap.create(axes=[axis], unit="m2 s", region=region)
    region_map.data += 1

    region_map_other = RegionNDMap.create(axes=[axis], unit="cm2 s", region=region)
    region_map_other.data += 1

    region_map.stack(region_map_other)
    assert_allclose(region_map.data, 1.0001)


def test_region_nd_map_sample(region_map):
    upsampled = region_map.upsample(factor=2)
    assert_allclose(upsampled.data.sum(), 15)
    assert upsampled.data.shape == (12, 1, 1)

    upsampled = region_map.upsample(factor=2, preserve_counts=False)
    assert_allclose(upsampled.data[3, 0, 0], 1.25)
    assert upsampled.data.shape == (12, 1, 1)

    downsampled = region_map.downsample(factor=2)
    assert_allclose(downsampled.data.sum(), 15)
    assert_allclose(downsampled.data[2, 0, 0], 9)
    assert downsampled.data.shape == (3, 1, 1)

    downsampled = region_map.downsample(factor=2, preserve_counts=False)
    assert_allclose(downsampled.data.sum(), 7.5)
    assert_allclose(downsampled.data[2, 0, 0], 4.5)
    assert downsampled.data.shape == (3, 1, 1)


def test_region_nd_map_get(region_map):
    values = region_map.get_by_idx((0, 0, [2, 3]))
    assert_allclose(values.squeeze(), [2, 3])

    values = region_map.get_by_pix((0, 0, [2.3, 3.7]))
    assert_allclose(values.squeeze(), [2, 4])

    energies = region_map.geom.axes[0].center
    values = region_map.get_by_coord((83.63, 21.51, energies[[0, -1]]))
    assert_allclose(values.squeeze(), [0, 5])


def test_region_nd_map_get_no_region(region_map_no_region):
    energies = region_map_no_region.geom.axes[0].center
    values = region_map_no_region.get_by_coord((energies[[0, -1]],))
    assert_allclose(values.squeeze(), [0, 5])

    values = region_map_no_region.get_by_coord({"energy": energies[[0, -1]]})
    assert_allclose(values.squeeze(), [0, 5])


def test_region_nd_map_set(region_map):
    region_map = region_map.copy()
    region_map.set_by_idx((0, 0, [2, 3]), [42, 42])
    assert_allclose(region_map.data[[2, 3]], 42)

    region_map = region_map.copy()
    region_map.set_by_pix((0, 0, [2.3, 3.7]), [42, 42])
    assert_allclose(region_map.data[[2, 3]], 42)

    region_map = region_map.copy()
    energies = region_map.geom.axes[0].center
    region_map.set_by_coord((83.63, 21.51, energies[[0, -1]]), [42, 42])
    assert_allclose(region_map.data[[0, -1]], 42)


@requires_data()
def test_region_nd_map_fill_events(region_map):
    filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz"
    events = EventList.read(filename)
    region_map = Map.from_geom(region_map.geom)
    region_map.fill_events(events)

    weights = np.linspace(0, 1, len(events.time))
    region_map2 = Map.from_geom(region_map.geom)
    region_map2.fill_events(events, weights=weights)

    assert_allclose(region_map.data.sum(), 665)
    assert_allclose(region_map2.data.sum(), 328.33487)


@requires_data()
def test_region_nd_map_fill_events_point_sky_region(point_region_map):
    filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz"
    events = EventList.read(filename).select_offset([70.0, 71] * u.deg)

    region_map = Map.from_geom(point_region_map.geom)
    region_map.fill_events(events)
    assert_allclose(region_map.data.sum(), 0)

    weights = np.linspace(0, 1, len(events.time))
    region_map = Map.from_geom(point_region_map.geom)
    region_map.fill_events(events, weights=weights)
    assert_allclose(region_map.data.sum(), 0)


def test_region_nd_map_resample_axis():
    axis_1 = MapAxis.from_edges([1, 2, 3, 4, 5], name="test-1")
    axis_2 = MapAxis.from_edges([1, 2, 3, 4], name="test-2")

    geom = RegionGeom.create(
        region="icrs;circle(83.63, 21.51, 1)", axes=[axis_1, axis_2]
    )
    m = RegionNDMap(geom, unit="m2")
    m.data += 1

    new_axis = MapAxis.from_edges([2, 3, 5], name="test-1")
    m2 = m.resample_axis(axis=new_axis)
    assert m2.data.shape == (3, 2, 1, 1)
    assert_allclose(m2.data[0, :, 0, 0], [1, 2])

    # Test without all interval covered
    new_axis = MapAxis.from_edges([1.7, 4], name="test-1")
    m3 = m.resample_axis(axis=new_axis)
    assert m3.data.shape == (3, 1, 1, 1)
    assert_allclose(m3.data, 2)


def test_region_nd_io_ogip(tmpdir):
    energy_axis = MapAxis.from_energy_bounds(0.1, 10, 12, unit="TeV")
    m = RegionNDMap.create(
        "icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis], binsz_wcs="0.01deg"
    )
    m.write(tmpdir / "test.fits", format="ogip")
    m_new = RegionNDMap.read(tmpdir / "test.fits", format="ogip")

    assert isinstance(m_new.geom.region, CircleSkyRegion)
    geom = m_new.geom.to_wcs_geom()
    assert geom.data_shape == (12, 102, 102)

    with pytest.raises(ValueError):
        m.write(tmpdir / "test.fits", format="ogip-arf")


def test_region_nd_io_ogip_arf(tmpdir):
    energy_axis = MapAxis.from_energy_bounds(
        0.1, 10, 12, unit="TeV", name="energy_true"
    )
    m = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis])
    m.write(tmpdir / "test.fits", format="ogip-arf")
    m_new = RegionNDMap.read(tmpdir / "test.fits", format="ogip-arf")

    assert m_new.geom.region is None

    with pytest.raises(ValueError):
        m.write(tmpdir / "test.fits", format="ogip")


def test_region_nd_io_gadf(tmpdir):
    energy_axis = MapAxis.from_edges([1, 3, 10] * u.TeV, name="energy")
    m = RegionNDMap.create("icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis])
    m.write(tmpdir / "test.fits", format="gadf")
    m_new = RegionNDMap.read(tmpdir / "test.fits", format="gadf")

    assert isinstance(m_new.geom.region, CircleSkyRegion)
    assert m_new.geom.axes[0].name == "energy"
    assert m_new.data.shape == (2, 1, 1)
    assert_allclose(m_new.geom.axes["energy"].edges, [1, 3, 10] * u.TeV)


def test_region_nd_io_gadf_no_region(tmpdir):
    energy_axis = MapAxis.from_edges([1, 3, 10] * u.TeV, name="energy")
    m = RegionNDMap.create(region=None, axes=[energy_axis])
    m.write(tmpdir / "test.fits", format="gadf", hdu="TEST")
    m_new = RegionNDMap.read(tmpdir / "test.fits", format="gadf", hdu="TEST")

    assert m_new.geom.region is None
    assert m_new.geom.axes[0].name == "energy"
    assert m_new.data.shape == (2, 1, 1)
    assert_allclose(m_new.geom.axes["energy"].edges, [1, 3, 10] * u.TeV)


def test_region_nd_io_gadf_rad_axis(tmpdir):
    energy_axis = MapAxis.from_edges([1, 3, 10] * u.TeV, name="energy")
    rad_axis = MapAxis.from_nodes([0, 0.1, 0.2] * u.deg, name="rad")
    m = RegionNDMap.create(
        "icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis, rad_axis], unit="sr-1"
    )
    m.data = np.arange(np.prod(m.data.shape)).reshape(m.data.shape)
    m.write(tmpdir / "test.fits", format="gadf")
    m_new = RegionNDMap.read(tmpdir / "test.fits", format="gadf")

    assert isinstance(m_new.geom.region, CircleSkyRegion)
    assert m_new.geom.axes.names == ["energy", "rad"]
    assert m_new.unit == "sr-1"

    # check that the data is not re-shuffled
    assert_allclose(m_new.data, m.data)
    assert m_new.data.shape == (3, 2, 1, 1)


def test_region_nd_hdulist():
    energy_axis = MapAxis.from_edges([1, 3, 10] * u.TeV, name="energy")
    m = RegionNDMap.create(region="icrs;circle(83.63, 22.01, 0.5)", axes=[energy_axis])

    hdulist = m.to_hdulist()
    assert hdulist[0].name == "PRIMARY"
    assert hdulist[1].name == "SKYMAP"
    assert hdulist[2].name == "SKYMAP_BANDS"
    assert hdulist[3].name == "SKYMAP_REGION"


def test_region_nd_map_interp_no_region():
    energy_axis = MapAxis.from_energy_edges([1, 3, 10] * u.TeV)

    time_ref = Time("1999-01-01T00:00:00.123456789")
    time_axis = TimeMapAxis(
        edges_min=[0, 1, 3] * u.d,
        edges_max=[0.8, 1.9, 5.4] * u.d,
        reference_time=time_ref,
    )

    m = RegionNDMap.create(region=None, axes=[energy_axis, time_axis])
    m.data = np.arange(6).reshape((time_axis.nbin, energy_axis.nbin))

    energy = [2, 6] * u.TeV
    time = time_ref + [[0.4], [1.5], [4.2]] * u.d

    value = m.interp_by_coord({"energy": energy, "time": time})
    reference = np.array(
        [
            [0.13093, 1.075717],
            [2.242041, 3.186828],
            [4.13093, 5.075717],
        ]
    )
    assert_allclose(value, reference, rtol=1e-5)

    value = m.interp_by_coord((energy, time))
    assert_allclose(value, reference, rtol=1e-5)


def test_region_map_sampling(region_map):
    energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=6, name="energy")
    edges = np.linspace(0.0, 30.0, 3) * u.min
    time_axis = MapAxis.from_edges(edges, name="time")

    npred_map = Map.create(
        region="icrs;circle(83.63, 21.51, 1)",
        map_type="region",
        axes=[time_axis, energy_axis],
    )
    npred_map.data[...] = 5

    coords = npred_map.sample_coord(n_events=2, random_state=0)

    assert len(coords["lon"]) == 2
    assert_allclose(coords["lon"].data, [83.63, 83.63], rtol=1e-5)
    assert_allclose(coords["lat"].data, [21.51, 21.51], rtol=1e-5)
    assert_allclose(coords["energy"].data, [3.985296, 5.721113], rtol=1e-5)
    assert_allclose(coords["time"].data, [6.354822, 9.688412], rtol=1e-5)

    assert coords["lon"].unit == "deg"
    assert coords["lat"].unit == "deg"
    assert coords["energy"].unit == "TeV"
    assert coords["time"].unit == "min"
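
# Illustrative aside (not part of the upstream test suite): RegionNDMap.stack()
# converts the other map's data to the target unit before adding, so stacking a
# map filled with 1 cm2 s onto a map filled with 1 m2 s adds 1e-4 per bin,
# exactly as test_stack_different_unit above checks.
from numpy.testing import assert_allclose
from gammapy.maps import MapAxis, RegionNDMap

axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
target = RegionNDMap.create("icrs;circle(0, 0, 1)", axes=[axis], unit="m2 s")
target.data += 1
other = RegionNDMap.create("icrs;circle(0, 0, 1)", axes=[axis], unit="cm2 s")
other.data += 1

target.stack(other)  # 1 cm2 s == 1e-4 m2 s
assert_allclose(target.data, 1.0001)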

gammapy-1.3/gammapy/maps/tests/__init__.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst

gammapy-1.3/gammapy/maps/tests/test_axes.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
import numpy as np
from numpy.testing import assert_allclose, assert_equal
import astropy.units as u
from astropy.table import Table
from astropy.time import Time
from astropy.visualization import quantity_support
import matplotlib.pyplot as plt
from gammapy.data import GTI
from gammapy.maps import LabelMapAxis, MapAxes, MapAxis, RegionNDMap, TimeMapAxis
from gammapy.utils.scripts import make_path
from gammapy.utils.testing import assert_time_allclose, mpl_plot_check, requires_data
from gammapy.utils.time import time_ref_to_dict

MAP_AXIS_INTERP = [
    (np.array([0.25, 0.75, 1.0, 2.0]), "lin"),
    (np.array([0.25, 0.75, 1.0, 2.0]), "log"),
    (np.array([0.25, 0.75, 1.0, 2.0]), "sqrt"),
]

MAP_AXIS_NODE_TYPES = [
    ([0.25, 0.75, 1.0, 2.0], "lin", "edges"),
    ([0.25, 0.75, 1.0, 2.0], "log", "edges"),
    ([0.25, 0.75, 1.0, 2.0], "sqrt", "edges"),
    ([0.25, 0.75, 1.0, 2.0], "lin", "center"),
    ([0.25, 0.75, 1.0, 2.0], "log", "center"),
    ([0.25, 0.75, 1.0, 2.0], "sqrt", "center"),
]

nodes_array = np.array([0.25, 0.75, 1.0, 2.0])

MAP_AXIS_NODE_TYPE_UNIT = [
    (nodes_array, "lin", "edges", "s", "TEST", True),
    (nodes_array, "log", "edges", "s", "test", False),
    (nodes_array, "lin", "edges", "TeV", "TEST", False),
    (nodes_array, "sqrt", "edges", "s", "test", False),
    (nodes_array, "lin", "center", "s", "test", False),
    (nodes_array + 1e-9, "lin", "edges", "s", "test", True),
    (nodes_array + 1e-3, "lin", "edges", "s", "test", False),
    (nodes_array / 3600.0, "lin", "edges", "hr", "TEST", True),
]


@pytest.fixture
def time_intervals():
    t0 = Time("2020-03-19")
    t_min = np.linspace(0, 10, 20) * u.d
    t_max = t_min + 1 * u.h
    return {"t_min": t_min, "t_max": t_max, "t_ref": t0}


@pytest.fixture
def time_interval():
    t0 = Time("2020-03-19")
    t_min = 1 * u.d
    t_max = 11 * u.d
    return {"t_min": t_min, "t_max": t_max, "t_ref": t0}


@pytest.fixture(scope="session")
def energy_axis_ref():
    edges = np.arange(1, 11) * u.TeV
    return MapAxis.from_edges(edges, name="energy")


def test_mapaxis_repr():
    axis = MapAxis([1, 2, 3], name="test")
    assert "MapAxis" in repr(axis)


def test_mapaxis_invalid_name():
    with pytest.raises(TypeError):
        MapAxis([1, 2, 3], name=1)


@pytest.mark.parametrize(
    ("nodes", "interp", "node_type", "unit", "name", "result"),
    MAP_AXIS_NODE_TYPE_UNIT,
)
def test_mapaxis_equal(nodes, interp, node_type, unit, name, result):
    axis1 = MapAxis(
        nodes=[0.25, 0.75, 1.0, 2.0],
        name="test",
        unit="s",
        interp="lin",
        node_type="edges",
    )
    axis2 = MapAxis(nodes, name=name, unit=unit, interp=interp, node_type=node_type)
    assert (axis1 == axis2) is result
    assert (axis1 != axis2) is not result


def test_squash():
    axis = MapAxis(
        nodes=[0, 1, 2, 3], unit="TeV", name="energy", node_type="edges", interp="lin"
    )

    ax_sq = axis.squash()
    assert_allclose(ax_sq.nbin, 1)
    assert_allclose(axis.edges[0], ax_sq.edges[0])
    assert_allclose(axis.edges[-1], ax_sq.edges[1])
    assert_allclose(ax_sq.center, 1.5 * u.TeV)


def test_upsample():
    axis = MapAxis(
        nodes=[0, 1, 2, 3], unit="TeV", name="energy", node_type="edges", interp="lin"
    )

    axis_up = axis.upsample(10)
    assert_allclose(axis_up.nbin, 10 * axis.nbin)
    assert_allclose(axis_up.edges[0], axis.edges[0])
    assert_allclose(axis_up.edges[-1], axis.edges[-1])
    assert axis_up.node_type == axis.node_type


def test_downsample():
    axis = MapAxis(
        nodes=[0, 1, 2, 3, 4, 5, 6, 7, 8],
        unit="TeV",
        name="energy",
        node_type="edges",
        interp="lin",
    )

    # Node_type edge, divisible, strict=True
    axis_down = axis.downsample(2)
    assert_allclose(axis_down.nbin, 0.5 * axis.nbin)
    assert_allclose(axis_down.edges[0], axis.edges[0])
    assert_allclose(axis_down.edges[-1], axis.edges[-1])
    assert axis_down.node_type == axis.node_type

    # Node_type edge, divisible, strict=False
    axis_down1 = axis.downsample(2, strict=False)
    assert axis_down == axis_down1

    axis = MapAxis(
        nodes=[0, 1, 2, 3, 4, 5, 6, 7],
        unit="TeV",
        name="energy",
        node_type="edges",
        interp="lin",
    )

    # Node_type edge, not divisible, strict=True
    with pytest.raises(ValueError) as exc_info:
        axis.downsample(2, strict=True)
    assert str(exc_info.value) == "Number of energy bins (7) is not divisible by 2"

    # Node_type edge, not divisible, strict=False
    axis_down = axis.downsample(2, strict=False)
    assert_allclose(axis_down.nbin, np.ceil(0.5 * axis.nbin))
    assert_allclose(axis_down.edges[0], axis.edges[0])
    assert_allclose(axis_down.edges[-1], axis.edges[-1])
    assert axis_down.node_type == axis.node_type

    axis = MapAxis(
        nodes=[0, 1, 2, 3, 4, 5],
        unit="TeV",
        name="energy",
        node_type="center",
        interp="lin",
    )

    # Node_type center, not divisible, strict=True
    with pytest.raises(ValueError) as exc_info:
        axis.downsample(2, strict=True)
    assert str(exc_info.value) == "Number of energy bins - 1 (5) is not divisible by 2"

    # Node_type center, not divisible, strict=False
    axis_down = axis.downsample(2, strict=False)
    assert_allclose(axis_down.nbin, 4)
    assert_allclose(axis_down.center, [0.0, 2.0, 4.0, 5] * u.TeV)
    assert axis_down.node_type == "center"

    axis = MapAxis(
        nodes=[0, 1, 2, 3, 4],
        unit="TeV",
        name="energy",
        node_type="center",
        interp="lin",
    )

    # Node_type center, divisible, strict=True
    axis_down = axis.downsample(2, strict=True)
    assert_allclose(axis_down.nbin, 3)
    assert_allclose(axis_down.center, [0.0, 2.0, 4.0] * u.TeV)

    # Node_type center, divisible, strict=False
    axis_down1 = axis.downsample(2, strict=False)
    assert axis_down == axis_down1


def test_upsample_non_regular():
    axis = MapAxis.from_edges([0, 1, 3, 7], name="test", interp="lin")

    axis_up = axis.upsample(2)
    assert_allclose(axis_up.nbin, 2 * axis.nbin)
    assert_allclose(axis_up.edges[0], axis.edges[0])
    assert_allclose(axis_up.edges[-1], axis.edges[-1])
    assert axis_up.node_type == axis.node_type


def test_upsample_non_regular_nodes():
    axis = MapAxis.from_nodes([0, 1, 3, 7], name="test", interp="lin")

    axis_up = axis.upsample(2)
    assert_allclose(axis_up.nbin, 2 * axis.nbin - 1)
    assert_allclose(axis_up.center[0], axis.center[0])
    assert_allclose(axis_up.center[-1], axis.center[-1])
    assert axis_up.node_type == axis.node_type


def test_downsample_non_regular():
    axis = MapAxis.from_edges([0, 1, 3, 7, 13], name="test", interp="lin")

    axis_down = axis.downsample(2)
    assert_allclose(axis_down.nbin, 0.5 * axis.nbin)
    assert_allclose(axis_down.edges[0], axis.edges[0])
    assert_allclose(axis_down.edges[-1], axis.edges[-1])
    assert axis_down.node_type == axis.node_type


def test_downsample_non_regular_nodes():
    axis = MapAxis.from_edges([0, 1, 3, 7, 9], name="test", interp="lin")

    axis_down = axis.downsample(2)
    assert_allclose(axis_down.nbin, 0.5 * axis.nbin)
    assert_allclose(axis_down.edges[0], axis.edges[0])
    assert_allclose(axis_down.edges[-1], axis.edges[-1])
    assert axis_down.node_type == axis.node_type


@pytest.mark.parametrize("factor", [1, 3, 5, 7, 11])
def test_up_downsample_consistency(factor):
    axis = MapAxis.from_edges([0, 1, 3, 7, 13], name="test", interp="lin")

    axis_new = axis.upsample(factor).downsample(factor)
    assert_allclose(axis.edges, axis_new.edges)


def test_one_bin_nodes():
    axis = MapAxis.from_nodes([1], name="test", unit="deg")

    assert_allclose(axis.center, 1 * u.deg)
    assert_allclose(axis.coord_to_pix(1 * u.deg), 0)
    assert_allclose(axis.coord_to_pix(2 * u.deg), 0)
    assert_allclose(axis.pix_to_coord(0), 1 * u.deg)


def test_table():
    axis = MapAxis(
        nodes=[0, 1, 2, 3, 4, 5, 6, 7, 8],
        unit="TeV",
        name="energy",
        node_type="edges",
        interp="lin",
    )
    table = axis.to_table()
    assert table.colnames == ["CHANNEL", "E_MIN", "E_MAX"]
    assert len(table["CHANNEL"]) == axis.nbin
    assert table["E_MIN"].unit == table["E_MAX"].unit == u.TeV
    assert (table["E_MIN"].data == axis.edges_min.value).all()

    table_kev = axis.to_table("ogip-sherpa")
    assert table_kev["E_MIN"].unit == table_kev["E_MAX"].unit == u.keV


def test_group_table_basic(energy_axis_ref):
    energy_edges = [1, 2, 10] * u.TeV

    groups = energy_axis_ref.group_table(energy_edges)

    assert_allclose(groups["group_idx"], [0, 1])
    assert_allclose(groups["idx_min"], [0, 1])
    assert_allclose(groups["idx_max"], [0, 8])
    assert_allclose(groups["energy_min"], [1, 2])
    assert_allclose(groups["energy_max"], [2, 10])

    bin_type = [_.strip() for _ in groups["bin_type"]]
    assert_equal(bin_type, ["normal", "normal"])


@pytest.mark.parametrize(
    "energy_edges",
    [[1.8, 4.8, 7.2] * u.TeV, [2, 5, 7] * u.TeV, [2000, 5000, 7000] * u.GeV],
)
def test_group_tablenergy_edges(energy_axis_ref, energy_edges):
    groups = energy_axis_ref.group_table(energy_edges)

    assert_allclose(groups["group_idx"], [0, 1, 2, 3])
    assert_allclose(groups["idx_min"], [0, 1, 4, 6])
    assert_allclose(groups["idx_max"], [0, 3, 5, 8])
    assert_allclose(groups["energy_min"].quantity.to_value("TeV"), [1, 2, 5, 7])
    assert_allclose(groups["energy_max"].quantity.to_value("TeV"), [2, 5, 7, 10])

    bin_type = [_.strip() for _ in groups["bin_type"]]
    assert_equal(bin_type, ["underflow", "normal", "normal", "overflow"])


def test_group_table_below_range(energy_axis_ref):
    energy_edges = [0.7, 0.8, 1, 4] * u.TeV
    groups = energy_axis_ref.group_table(energy_edges)

    assert_allclose(groups["group_idx"], [0, 1])
    assert_allclose(groups["idx_min"], [0, 3])
    assert_allclose(groups["idx_max"], [2, 8])
    assert_allclose(groups["energy_min"], [1, 4])
    assert_allclose(groups["energy_max"], [4, 10])

    bin_type = [_.strip() for _ in groups["bin_type"]]
    assert_equal(bin_type, ["normal", "overflow"])


def test_group_table_above_range(energy_axis_ref):
    energy_edges = [5, 7, 11, 13] * u.TeV
    groups = energy_axis_ref.group_table(energy_edges)

    assert_allclose(groups["group_idx"], [0, 1, 2])
    assert_allclose(groups["idx_min"], [0, 4, 6])
    assert_allclose(groups["idx_max"], [3, 5, 8])
    assert_allclose(groups["energy_min"], [1, 5, 7])
    assert_allclose(groups["energy_max"], [5, 7, 10])

    bin_type = [_.strip() for _ in groups["bin_type"]]
    assert_equal(bin_type, ["underflow", "normal", "normal"])


def test_group_table_outside_range(energy_axis_ref):
    energy_edges = [20, 30, 40] * u.TeV

    with pytest.raises(ValueError):
        energy_axis_ref.group_table(energy_edges)


def test_map_axis_aligned():
    ax1 = MapAxis([1, 2, 3], interp="lin", node_type="edges")
    ax2 = MapAxis([1.5, 2.5], interp="log", node_type="center")
    assert not ax1.is_aligned(ax2)


def test_map_axis_pad():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1)

    padded = axis.pad(pad_width=(0, 1))
    assert_allclose(padded.edges, [1, 10, 100] * u.TeV)

    padded = axis.pad(pad_width=(1, 0))
    assert_allclose(padded.edges, [0.1, 1, 10] * u.TeV)

    padded = axis.pad(pad_width=1)
    assert_allclose(padded.edges, [0.1, 1, 10, 100] * u.TeV)


def test_map_axes_pad():
    axis_1 = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1)
    axis_2 = MapAxis.from_bounds(0, 1, nbin=2, unit="deg", name="rad")

    axes = MapAxes([axis_1, axis_2])
    axes = axes.pad(axis_name="energy", pad_width=1)
    assert_allclose(axes["energy"].edges, [0.1, 1, 10, 100] * u.TeV)


def test_rename():
    axis_1 = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=1)
    axis = axis_1.rename("energy_true")

    assert axis_1.name == "energy"
    assert axis.name == "energy_true"

    axis_2 = MapAxis.from_bounds(0, 1, nbin=2, unit="deg", name="rad")
    axes = MapAxes([axis_1, axis_2])
    axes = axes.rename_axes(["energy", "rad"], ["energy_true", "angle"])

    assert axes.names == ["energy_true", "angle"]


@pytest.mark.parametrize(("edges", "interp"), MAP_AXIS_INTERP)
def test_mapaxis_init_from_edges(edges, interp):
    axis = MapAxis(edges, interp=interp)
    assert_allclose(axis.edges, edges)
    assert_allclose(axis.nbin, len(edges) - 1)

    with pytest.raises(ValueError):
        MapAxis.from_edges([1])
        MapAxis.from_edges([0, 1, 1, 2])
        MapAxis.from_edges([0, 1, 3, 2])


@pytest.mark.parametrize(("nodes", "interp"), MAP_AXIS_INTERP)
def test_mapaxis_from_nodes(nodes, interp):
    axis = MapAxis.from_nodes(nodes, interp=interp)
    assert_allclose(axis.center, nodes)
    assert_allclose(axis.nbin, len(nodes))

    with pytest.raises(ValueError):
        MapAxis.from_nodes([])
        MapAxis.from_nodes([0, 1, 1, 2])
        MapAxis.from_nodes([0, 1, 3, 2])


@pytest.mark.parametrize(("nodes", "interp"), MAP_AXIS_INTERP)
def test_mapaxis_from_bounds(nodes, interp):
    axis = MapAxis.from_bounds(nodes[0], nodes[-1], 3, interp=interp)
    assert_allclose(axis.edges[0], nodes[0])
    assert_allclose(axis.edges[-1], nodes[-1])
    assert_allclose(axis.nbin, 3)

    with pytest.raises(ValueError):
        MapAxis.from_bounds(1, 1, 1)


def test_map_axis_from_energy_units():
    with pytest.raises(ValueError):
        _ = MapAxis.from_energy_bounds(0.1, 10, 2, unit="deg")

    with pytest.raises(ValueError):
        _ = MapAxis.from_energy_edges([0.1, 1, 10] * u.deg)


@pytest.mark.parametrize(("nodes", "interp", "node_type"), MAP_AXIS_NODE_TYPES)
def test_mapaxis_pix_to_coord(nodes, interp, node_type):
    axis = MapAxis(nodes, interp=interp, node_type=node_type)
    assert_allclose(axis.center, axis.pix_to_coord(np.arange(axis.nbin, dtype=float)))
    assert_allclose(
        np.arange(axis.nbin + 1, dtype=float) - 0.5, axis.coord_to_pix(axis.edges)
    )


@pytest.mark.parametrize(("nodes", "interp", "node_type"), MAP_AXIS_NODE_TYPES)
def test_mapaxis_coord_to_idx(nodes, interp, node_type):
    axis = MapAxis(nodes, interp=interp, node_type=node_type)
    assert_allclose(np.arange(axis.nbin, dtype=int), axis.coord_to_idx(axis.center))


@pytest.mark.parametrize(("nodes", "interp", "node_type"), MAP_AXIS_NODE_TYPES)
def test_mapaxis_slice(nodes, interp, node_type):
    axis = MapAxis(nodes, interp=interp, node_type=node_type)
    saxis = axis.slice(slice(1, 3))
    assert_allclose(saxis.nbin, 2)
    assert_allclose(saxis.center, axis.center[slice(1, 3)])

    axis = MapAxis(nodes, interp=interp, node_type=node_type)
    saxis = axis.slice(slice(1, None))
    assert_allclose(saxis.nbin, axis.nbin - 1)
    assert_allclose(saxis.center, axis.center[slice(1, None)])

    axis = MapAxis(nodes, interp=interp, node_type=node_type)
    saxis = axis.slice(slice(None, 2))
    assert_allclose(saxis.nbin, 2)
    assert_allclose(saxis.center, axis.center[slice(None, 2)])

    axis = MapAxis(nodes, interp=interp, node_type=node_type)
    saxis = axis.slice(slice(None, -1))
    assert_allclose(saxis.nbin, axis.nbin - 1)
    assert_allclose(saxis.center, axis.center[slice(None, -1)])


def test_map_axis_plot_helpers():
    axis = MapAxis.from_nodes([0, 1, 2], unit="deg", name="offset")
    labels = axis.as_plot_labels

    assert labels[0] == "0.00e+00 deg"
    assert_allclose(axis.center, axis.as_plot_center)
    assert_allclose(axis.edges, axis.as_plot_edges)


def test_map_axis_concatenate():
    axis_1 = MapAxis.from_bounds(0, 10, 10, name="axis")
    axis_2 = MapAxis.from_bounds(10, 20, 10, name="axis")
    axis_2_other_name = MapAxis.from_bounds(10, 20, 10, name="other_axis")

    axis_12 = axis_1.concatenate(axis_2)
    assert_equal(axis_12.edges, np.linspace(0, 20, 21))

    with pytest.raises(ValueError):
        axis_1.concatenate(axis_2_other_name)


def test_time_axis(time_intervals):
    axis = TimeMapAxis(
        time_intervals["t_min"], time_intervals["t_max"], time_intervals["t_ref"]
    )

    axis_copy = axis.copy()

    assert axis.nbin == 20
    assert axis.name == "time"
    assert axis.node_type == "intervals"

    assert_allclose(axis.time_delta.to_value("min"), 60)
    assert_allclose(axis.time_mid[0].mjd, 58927.020833333336)

    assert "time" in axis.__str__()
    assert "20" in axis.__str__()

    with pytest.raises(ValueError):
        axis.assert_name("bad")

    assert axis_copy == axis
    assert not axis.is_contiguous

    ax_cont = axis.to_contiguous()
    assert_allclose(ax_cont.nbin, 39)


def test_single_interval_time_axis(time_interval):
    axis = TimeMapAxis(
        edges_min=time_interval["t_min"],
        edges_max=time_interval["t_max"],
        reference_time=time_interval["t_ref"],
    )

    coord = Time(58933, format="mjd") + u.Quantity([1.5, 3.5, 10], unit="d")
    pix = axis.coord_to_pix(coord)

    assert axis.nbin == 1
    assert_allclose(axis.time_delta.to_value("d"), 10)
    assert_allclose(axis.time_mid[0].mjd, 58933)

    pix_min = axis.coord_to_pix(time_interval["t_min"] + 0.001 * u.s)
    assert_allclose(pix_min, -0.5)

    pix_max = axis.coord_to_pix(time_interval["t_max"] - 0.001 * u.s)
    assert_allclose(pix_max, 0.5)

    assert_allclose(pix, [0.15, 0.35, np.nan])


def test_slice_squash_time_axis(time_intervals):
    axis = TimeMapAxis(
        time_intervals["t_min"], time_intervals["t_max"], time_intervals["t_ref"]
    )
    axis_squash = axis.squash()
    axis_slice = axis.slice(slice(1, 5))

    assert axis_squash.nbin == 1
    assert_allclose(axis_squash.time_min[0].mjd, 58927)
    assert_allclose(axis_squash.time_delta.to_value("d"), 10.04166666)
    assert axis_slice.nbin == 4
    assert_allclose(axis_slice.time_delta.to_value("d")[0], 0.04166666666)
    assert axis_squash != axis_slice


def test_from_time_edges_time_axis():
    t0 = Time("2020-03-19")
    t_min = t0 + np.linspace(0, 10, 20) * u.d
    t_max = t_min + 1 * u.h

    axis = TimeMapAxis.from_time_edges(t_min, t_max)
    axis_h = TimeMapAxis.from_time_edges(t_min, t_max, unit="h")

    assert axis.nbin == 20
    assert axis.name == "time"
    assert_time_allclose(axis.reference_time, t0)
    assert_allclose(axis.time_delta.to_value("min"), 60)
    assert_allclose(axis.time_mid[0].mjd, 58927.020833333336)
    assert_allclose(axis_h.time_delta.to_value("h"), 1)
    assert_allclose(axis_h.time_mid[0].mjd, 58927.020833333336)
    assert axis == axis_h


def test_incorrect_time_axis():
    tmin = np.linspace(0, 10, 20) * u.h
    tmax = np.linspace(1, 11, 20) * u.h

    # incorrect reference time
    with pytest.raises(ValueError):
        TimeMapAxis(tmin, tmax, reference_time=51000 * u.d, name="time")

    # overlapping time intervals
    with pytest.raises(ValueError):
        TimeMapAxis(tmin, tmax, reference_time=Time.now(), name="time")


def test_bad_length_sort_time_axis(time_intervals):
    tref = time_intervals["t_ref"]
    tmin = time_intervals["t_min"]
    tmax_reverse = time_intervals["t_max"][::-1]
    tmax_short = time_intervals["t_max"][:-1]

    with pytest.raises(ValueError):
        TimeMapAxis(tmin, tmax_reverse, tref, name="time")

    with pytest.raises(ValueError):
        TimeMapAxis(tmin, tmax_short, tref, name="time")


def test_coord_to_idx_time_axis(time_intervals):
    tmin = time_intervals["t_min"]
    tmax = time_intervals["t_max"]
    tref = time_intervals["t_ref"]
    axis = TimeMapAxis(tmin, tmax, tref, name="time")

    time = Time(58927.020833333336, format="mjd")

    times = axis.time_mid
    times[::2] += 1 * u.h
    times = times.insert(0, tref - [1, 2] * u.yr)

    idx = axis.coord_to_idx(time)
    indices = axis.coord_to_idx(times)

    pix = axis.coord_to_pix(time)
    pixels = axis.coord_to_pix(times)

    assert idx == 0
    assert_allclose(indices[1::2], [-1, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19])
    assert_allclose(indices[::2], -1)
    assert_allclose(pix, 0, atol=1e-10)
    assert_allclose(pixels[1::2], [np.nan, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19])


def test_timeaxis_table(time_intervals):
    tmin = time_intervals["t_min"]
    tmax = time_intervals["t_max"]
    tref = time_intervals["t_ref"]
    axis = TimeMapAxis(tmin, tmax, tref, name="time")
    table = axis.to_table()

    assert table.colnames == ["START", "STOP"]
    assert len(table["START"]) == axis.nbin
    assert (table["START"] == axis.time_min).all()


def test_pix_to_coord_time_axis(time_intervals):
    tmin = time_intervals["t_min"]
    tmax = time_intervals["t_max"]
    tref = time_intervals["t_ref"]
    axis = TimeMapAxis(tmin, tmax, tref, name="time")

    pixels = [1.3, 3.2, 5.4, 7, 15.33, 17.21, 19.11]
    coords = axis.pix_to_coord(pixels)
    assert_allclose(
        coords[0:3].mjd, [58927.538816, 58928.587281, 58929.648246], rtol=1e-5
    )

    # test with nan indices
    pixels.append(np.nan)
    coords = axis.pix_to_coord(pixels)
    assert_allclose(
        coords[-3:].mjd,
        [58935.9561, 58937.0046, -3.72500000e-04],
        rtol=1e-5,
    )

    # assert with invalid pixels & multidim array
    coords = axis.pix_to_coord([[-1.2, 0.6], [1.5, 24.7]])
    assert_allclose(
        coords.mjd,
        [[-3.72500000e-04, 5.89270250e04], [5.89275471e04, -3.72500000e-04]],
        rtol=1e-5,
    )

    # test with one value
    coords = axis.pix_to_coord(3)
    assert_allclose(coords.mjd, axis.time_min[3].mjd, rtol=1e-5)


def test_slice_time_axis(time_intervals):
    axis = TimeMapAxis(
        time_intervals["t_min"], time_intervals["t_max"], time_intervals["t_ref"]
    )

    new_axis = axis.slice([2, 6, 9])
    squashed = axis.squash()

    assert new_axis.nbin == 3
    assert_allclose(squashed.time_max[0].mjd, 58937.041667)
    assert squashed.nbin == 1
    assert_allclose(squashed.time_max[0].mjd, 58937.041667)


def test_time_map_axis_from_time_bounds():
    t_min = Time("2006-02-12", scale="utc")
    t_max = t_min + 12 * u.h

    axis = TimeMapAxis.from_time_bounds(time_min=t_min, time_max=t_max, nbin=3)
    assert_allclose(axis.center, [0.083333, 0.25, 0.416667] * u.d, rtol=1e-5)


def test_from_table_time_axis():
    t0 = Time("2006-02-12", scale="utc")
    t_min = np.linspace(0, 10, 10) * u.d
    t_max = t_min + 12 * u.h

    table = Table()
    table["TIME_MIN"] = t_min
    table["TIME_MAX"] = t_max
    table.meta.update(time_ref_to_dict(t0))
    table.meta["AXCOLS1"] = "TIME_MIN,TIME_MAX"

    axis = TimeMapAxis.from_table(table, format="gadf")

    assert axis.nbin == 10
    assert_allclose(axis.time_mid[0].mjd, 53778.25)


def test_from_table_time_axis_lightcurve_format():
    t0 = Time("2006-02-12", scale="tt")
    t_min = np.linspace(0, 10, 10) * u.d
    t_max = t_min + 12 * u.h

    table = Table()
    table["time_min"] = t_min.to_value("h")
    table["time_max"] = t_max.to_value("h")
    table.meta.update(time_ref_to_dict(t0))
    table.meta["TIMEUNIT"] = "h"

    axis = TimeMapAxis.from_table(table, format="lightcurve")
    assert axis.nbin == 10
    assert_allclose(axis.time_mid[0].mjd, 53778.25)
    assert axis.time_mid.scale == "tt"

    t0.format = "mjd"
    assert_time_allclose(axis.reference_time, t0)


@requires_data()
def test_from_gti_time_axis():
    filename = "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_020136.fits.gz"
    filename = make_path(filename)
    gti = GTI.read(filename)

    axis = TimeMapAxis.from_gti(gti)
    expected = Time(53090.123451203704, format="mjd", scale="tt")
    assert_time_allclose(axis.time_min[0], expected)
    assert axis.nbin == 1


def test_from_gti_bounds():
    start = u.Quantity([1, 2], "min")
    stop = u.Quantity([1.5, 2.5], "min")
    time_ref = Time("2010-01-01 00:00:00.0")
    gti = GTI.create(start, stop, time_ref)

    axis = TimeMapAxis.from_gti_bounds(gti=gti, t_delta=10 * u.s)
    assert axis.nbin == 8

    expected = Time("2010-01-01 00:01:00.0")
    # GTI.create() changes the reference time format
    expected.format = "mjd"
    assert_time_allclose(axis.time_min[0], expected)

    expected = Time("2010-01-01 00:02:30.0")
    expected.format = "mjd"
    assert_time_allclose(axis.time_max[-1], expected)


def test_to_gti(time_intervals):
    time_axis = TimeMapAxis(
        time_intervals["t_min"], time_intervals["t_max"], time_intervals["t_ref"]
    )
    gti = time_axis.to_gti()

    assert len(gti.table) == 20
    assert TimeMapAxis.from_gti(gti) == time_axis


def test_map_with_time_axis(time_intervals):
    time_axis = TimeMapAxis(
        time_intervals["t_min"], time_intervals["t_max"], time_intervals["t_ref"]
    )
    energy_axis = MapAxis.from_energy_bounds(0.1, 10, 2, unit="TeV")

    region_map = RegionNDMap.create(
        region="fk5; circle(0,0,0.1)", axes=[energy_axis, time_axis]
    )

    assert region_map.geom.data_shape == (20, 2, 1, 1)


def test_time_axis_plot_helpers():
    time_ref = Time("1999-01-01T00:00:00.123456789")
    time_axis = TimeMapAxis(
        edges_min=[0, 1, 3] * u.d,
        edges_max=[0.8, 1.9, 5.4] * u.d,
        reference_time=time_ref,
    )

    labels = time_axis.as_plot_labels
    assert labels[0] == "1999-01-01 00:00:00.123 - 1999-01-01 19:12:00.123"

    center = time_axis.as_plot_center
    assert center[0].year == 1999

    edges = time_axis.to_contiguous().as_plot_edges
    assert edges[0].year == 1999


def test_axes_basics():
    energy_axis = MapAxis.from_energy_edges([1, 3] * u.TeV)

    time_ref = Time("1999-01-01T00:00:00.123456789")
    time_axis = TimeMapAxis(
        edges_min=[0, 1, 3] * u.d,
        edges_max=[0.8, 1.9, 5.4] * u.d,
        reference_time=time_ref,
    )

    axes = MapAxes([energy_axis, time_axis])

    assert axes.shape == (1, 3)
    assert axes.is_unidimensional
    assert not axes.is_flat

    assert axes.primary_axis.name == "time"

    new_axes = axes.copy()
    assert new_axes[0] == new_axes[0]
    assert new_axes[1] == new_axes[1]
    assert new_axes == axes

    energy_axis = MapAxis.from_energy_edges([1, 4] * u.TeV)
    new_axes = MapAxes([energy_axis, time_axis.copy()])
    assert new_axes != axes


def test_map_axes_assert_names():
    axis_a = MapAxis([1, 2, 3], name="axis_a")
    axis_b = MapAxis([1, 2, 3, 4], name="axis_b")
    axis_c = LabelMapAxis(["a", "b"], name="axis_c")
    axes = MapAxes([axis_a, axis_b, axis_c])

    axes.assert_names(["axis_a", "axis_b", "axis_c"])
    axes.assert_names(["axis_a", "axis_b"], allow_extra=True)

    with pytest.raises(ValueError):
        axes.assert_names(["axis_a", "axis_b"])

    with pytest.raises(ValueError):
        axes.assert_names(["axis_c", "axis_b", "axis_a"])

    with pytest.raises(ValueError):
        axes.assert_names(["axis_b"], allow_extra=True)


def test_axes_getitem():
    axis1 = MapAxis.from_bounds(1, 4, 3, name="a1")
    axis2 = axis1.copy(name="a2")
    axis3 = axis1.copy(name="a3")
    axes = MapAxes([axis1, axis2, axis3])

    assert isinstance(axes[0], MapAxis)
    assert axes[-1].name == "a3"
    assert isinstance(axes[1:], MapAxes)
    assert len(axes[1:]) == 2
    assert isinstance(axes[0:1], MapAxes)
    assert len(axes[0:1]) == 1
    assert isinstance(axes[["a3", "a1"]], MapAxes)
    assert axes[["a3", "a1"]][0].name == "a3"


def test_label_map_axis_basics():
    axis = LabelMapAxis(labels=["label-1", "label-2"], name="label-axis")

    axis_str = str(axis)
    assert "node type" in axis_str
    assert "labels" in axis_str
    assert "label-2" in axis_str

    with pytest.raises(ValueError):
        axis.assert_name("time")

    assert axis.nbin == 2
    assert axis.node_type == "label"

    assert_allclose(axis.bin_width, 1)

    assert axis.name == "label-axis"

    with pytest.raises(ValueError):
        axis.edges

    axis_copy = axis.copy()
    assert axis_copy.name == "label-axis"

    # Ensure order is correct
    labels = np.array([1, 4, 5])
    assert_allclose(labels, [1, 4, 5])
    assert_allclose(LabelMapAxis(labels).center, labels)
    assert_allclose(LabelMapAxis(labels)._labels, labels)

    # Test case with non unique labels
    with pytest.raises(ValueError) as exc_info:
        LabelMapAxis([1, 1, 1])
    assert str(exc_info.value) == "Node labels must be unique"

    # Ensure non alphabetical ordering
    labels = np.array(["b", "a", "d", "c"])
    assert_equal(LabelMapAxis(labels).center, labels)


def test_label_map_axis_coord_to_idx():
    axis = LabelMapAxis(labels=["label-1", "label-2", "label-3"], name="label-axis")

    labels = "label-1"
    idx = axis.coord_to_idx(coord=labels)
    assert_allclose(idx, 0)

    labels = ["label-1", "label-3"]
    idx = axis.coord_to_idx(coord=labels)
    assert_allclose(idx, [0, 2])

    labels = [["label-1"], ["label-2"]]
    idx = axis.coord_to_idx(coord=labels)
    assert_allclose(idx, [[0], [1]])

    with pytest.raises(ValueError):
        labels = [["bad-label"], ["label-2"]]
        _ = axis.coord_to_idx(coord=labels)


def test_mixed_axes():
    label_axis = LabelMapAxis(labels=["label-1", "label-2", "label-3"], name="label")

    time_axis = TimeMapAxis(
        edges_min=[1, 10] * u.day,
        edges_max=[2, 13] * u.day,
        reference_time=Time("2020-03-19"),
    )

    energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=4)

    axes = MapAxes(axes=[energy_axis, time_axis, label_axis])

    coords = axes.get_coord()

    assert coords["label"].shape == (1, 1, 3)
    assert coords["energy"].shape == (4, 1, 1)
    assert coords["time"].shape == (1, 2, 1)

    idx = axes.coord_to_idx(coords)

    assert_allclose(idx[0], np.arange(4).reshape((4, 1, 1)))
    assert_allclose(idx[1], np.arange(2).reshape((1, 2, 1)))
    assert_allclose(idx[2], np.arange(3).reshape((1, 1, 3)))

    hdu = axes.to_table_hdu(format="gadf")

    table = Table.read(hdu)

    assert table["LABEL"].dtype == np.dtype("U7")
    assert len(table) == 24
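
# Illustrative aside (not part of the upstream test suite): a TimeMapAxis is
# made of disjoint intervals, so coord_to_pix() returns a fractional pixel for
# times inside an interval and NaN for times that fall into a gap.
import numpy as np
import astropy.units as u
from astropy.time import Time
from gammapy.maps import TimeMapAxis

t_ref = Time("2020-03-19")
axis = TimeMapAxis(edges_min=[0, 2] * u.d, edges_max=[1, 3] * u.d, reference_time=t_ref)
pix = axis.coord_to_pix(t_ref + [0.5, 1.5] * u.d)
assert not np.isnan(pix[0])  # 0.5 d lies inside the first interval
assert np.isnan(pix[1])      # 1.5 d falls in the gap between intervals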
def test_map_axis_format_plot_xaxis():
    axis = MapAxis.from_energy_bounds(
        "0.03 TeV", "300 TeV", nbin=20, per_decade=True, name="energy_true"
    )

    with mpl_plot_check():
        ax = plt.gca()
        with quantity_support():
            ax.plot(axis.center, np.ones_like(axis.center))

    ax1 = axis.format_plot_xaxis(ax=ax)
    assert ax1.xaxis.units == u.Unit("TeV")
    assert " ".join(ax1.axes.axes.get_xlabel().split()[:2]) == "True Energy"


def test_time_format(time_intervals):
    axis = TimeMapAxis(
        time_intervals["t_min"],
        time_intervals["t_max"],
        time_intervals["t_ref"],
        name="time",
    )
    with pytest.raises(ValueError):
        axis.time_format = "null"


def test_time_map_axis_format_plot_xaxis(time_intervals):
    axis = TimeMapAxis(
        time_intervals["t_min"],
        time_intervals["t_max"],
        time_intervals["t_ref"],
        name="time",
    )

    with mpl_plot_check():
        ax = plt.gca()
        with quantity_support():
            ax.plot(axis.as_plot_center, np.ones_like(axis.center))

    ax1 = axis.format_plot_xaxis(ax=ax)
    assert ax1.axes.axes.get_xlabel().split()[0] == "Time"
    assert ax1.axes.axes.get_xlabel().split()[1] == "[iso]"

    axis.time_format = "mjd"
    with mpl_plot_check():
        ax = plt.gca()
        with quantity_support():
            ax.plot(axis.as_plot_center, np.ones_like(axis.center))

    ax2 = axis.format_plot_xaxis(ax=ax)
    assert ax2.axes.axes.get_xlabel().split()[1] == "[mjd]"


def test_time_group_table(time_intervals):
    axis = TimeMapAxis(
        time_intervals["t_min"],
        time_intervals["t_max"],
        time_intervals["t_ref"],
        name="time",
    )

    groups = axis.group_table(interval_edges=[5 * u.d - 1 * u.h, 6 * u.d + 2 * u.h])
    assert_allclose(groups["idx_min"], [10])
    assert_allclose(groups["idx_max"], [11])
    assert_allclose(groups["time_min"], [58932.263158])
    assert_allclose(groups["time_max"], [58932.83114])

    groups_overflow = axis.group_table(interval_edges=[11 * u.d, 12 * u.d])
    assert_allclose(groups_overflow["idx_min"], [-1])

    groups_timeedges = axis.group_table(
        interval_edges=[
            Time("2020-03-19") + 5 * u.d - 1 * u.h,
            Time("2020-03-19") + 6 * u.d + 2 * u.h,
        ]
    )
    assert_allclose(groups_timeedges["time_min"], [58932.263158])
    assert_allclose(groups_timeedges["time_max"], [58932.83114])

    groups_exactedges = axis.group_table(interval_edges=[3 * u.d, 7 * u.d + 1 * u.h])
    assert_allclose(groups_exactedges["idx_min"], [6])
    assert_allclose(groups_exactedges["idx_max"], [13])


def test_single_valued_axis():
    # this will be interpreted as a scalar value
    # that is against the specifications, but we allow it nevertheless
    theta_values = np.array([0.5]) * u.deg
    table = Table(data=[theta_values, theta_values], names=["THETA_LO", "THETA_HI"])
    _ = MapAxis.from_table(table, format="gadf-dl3", column_prefix="THETA")

    # this is a proper array-like axis with just a single value
    theta_values = np.array([[0.5]]) * u.deg
    table = Table(data=[theta_values, theta_values], names=["THETA_LO", "THETA_HI"])
    _ = MapAxis.from_table(table, format="gadf-dl3", column_prefix="THETA")


def test_label_map_axis_concatenate():
    label1 = LabelMapAxis(["aa", "bb"], name="letters")
    label2 = LabelMapAxis(["cc", "dd"], name="letters")
    label3 = LabelMapAxis(["ee", "ff"], name="other_letters")

    label_append12 = label1.concatenate(label2)
    assert_equal(label_append12.center, np.array(["aa", "bb", "cc", "dd"], dtype="<U2"))

# [... content missing from the source archive: the remainder of this test
# module and the beginning of the following test module were lost in
# extraction; the dump resumes mid-way through a map comparison test ...]

    gt_m2 = m2 > 15000 * u.cm**2
    assert_allclose(gt_m2, False)

    ge_m2 = m2 >= m2
    assert_allclose(ge_m2, True)

    eq_m2 = m2 == 500 * u.cm**2
    assert_allclose(eq_m2, False)

    ne_m2 = m2 != 500 * u.cm**2
    assert_allclose(ne_m2, True)


def test_boolean_arithmetics():
    m_1 = Map.create(binsz=1, width=2)
    m_1.data = True

    m_2 = Map.create(binsz=1, width=2)
    m_2.data = False

    m_and = m_1 & m_2
    assert not np.any(m_and.data)

    m_or = m_1 | m_2
    assert np.all(m_or.data)

    m_not = ~m_2
    assert np.all(m_not.data)

    m_xor = m_1 ^ m_1
    assert not np.any(m_xor.data)
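
# Illustrative aside (not part of the upstream test suite): boolean maps
# combine element-wise with the &, |, ^ and ~ operators, which is convenient
# for building and merging masks.
import numpy as np
from gammapy.maps import Map

mask_all = Map.create(binsz=1, width=2)
mask_all.data = True
mask_none = ~mask_all  # invert: everything False

assert not np.any((mask_all & mask_none).data)
assert np.all((mask_all | mask_none).data)
assert not np.any((mask_all ^ mask_all).data)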
def test_arithmetics_inconsistent_geom():
    m_wcs = Map.create(binsz=0.1, width=1.0)
    m_wcs_incorrect = Map.create(binsz=0.1, width=2.0)

    with pytest.raises(ValueError):
        m_wcs += m_wcs_incorrect

    m_hpx = Map.create(binsz=0.1, width=1.0, map_type="hpx")
    with pytest.raises(ValueError):
        m_wcs += m_hpx


map_serialization_args = [("log"), ("lin")]


@pytest.mark.parametrize(("interp"), map_serialization_args)
def test_arithmetics_after_serialization(tmp_path, interp):
    axis = MapAxis.from_bounds(
        1.0, 10.0, 3, interp=interp, name="energy", node_type="center", unit="TeV"
    )
    m_wcs = Map.create(binsz=0.1, width=1.0, map_type="wcs", skydir=(0, 0), axes=[axis])
    m_wcs += 1

    m_wcs.write(tmp_path / "tmp.fits")
    m_wcs_serialized = Map.read(tmp_path / "tmp.fits")

    m_wcs += m_wcs_serialized

    assert_allclose(m_wcs.data, 2.0)


def test_set_scalar():
    m = Map.create(width=1)

    m.data = 1
    assert m.data.shape == (10, 10)
    assert_allclose(m.data, 1)


def test_interp_to_geom():
    energy = MapAxis.from_energy_bounds("1 TeV", "300 TeV", nbin=5, name="energy")
    energy_target = MapAxis.from_energy_bounds(
        "1 TeV", "300 TeV", nbin=7, name="energy"
    )
    value = 30
    coords = {"skycoord": SkyCoord("0 deg", "0 deg"), "energy": energy_target.center[3]}

    # WcsNDMap
    geom_wcs = WcsGeom.create(
        npix=(5, 3), proj="CAR", binsz=60, axes=[energy], skydir=(0, 0)
    )
    wcs_map = Map.from_geom(geom_wcs, unit="")
    wcs_map.data = value * np.ones(wcs_map.data.shape)

    wcs_geom_target = WcsGeom.create(
        skydir=(0, 0), width=(10, 10), binsz=0.1 * u.deg, axes=[energy_target]
    )
    interp_wcs_map = wcs_map.interp_to_geom(wcs_geom_target, method="linear")

    assert_allclose(interp_wcs_map.get_by_coord(coords)[0], value, atol=1e-7)
    assert isinstance(interp_wcs_map, WcsNDMap)
    assert interp_wcs_map.geom == wcs_geom_target

    # WcsNDMap is_mask
    geom_wcs = WcsGeom.create(
        npix=(5, 3), proj="CAR", binsz=0.1, axes=[energy], skydir=(0, 0)
    )
    wcs_map = Map.from_geom(geom_wcs, unit="", data=True)
    wcs_geom_target = WcsGeom.create(
        skydir=(30, 30),
        width=(10, 10),
        binsz=0.1 * u.deg,
        axes=[energy_target],
        frame="galactic",
    )
    interp_wcs_map = wcs_map.interp_to_geom(
        wcs_geom_target, method="linear", fill_value=None
    )
    assert np.all(interp_wcs_map.data)

    # HpxNDMap
    geom_hpx = HpxGeom.create(binsz=60, axes=[energy], skydir=(0, 0))
    hpx_map = Map.from_geom(geom_hpx, unit="")
    hpx_map.data = value * np.ones(hpx_map.data.shape)

    hpx_geom_target = HpxGeom.create(
        skydir=(0, 0), width=10, binsz=0.1 * u.deg, axes=[energy_target]
    )
    interp_hpx_map = hpx_map.interp_to_geom(hpx_geom_target)

    assert_allclose(interp_hpx_map.get_by_coord(coords)[0], value, atol=1e-7)
    assert isinstance(interp_hpx_map, HpxNDMap)
    assert interp_hpx_map.geom == hpx_geom_target

    # Preserving the counts
    binsz = 0.2 * u.deg
    geom_initial = WcsGeom.create(
        skydir=(20, 20),
        width=(1, 1),
        binsz=0.2 * u.deg,
    )

    factor = 2
    test_map = Map.from_geom(geom_initial, unit="")
    test_map.data = value * np.eye(test_map.data.shape[0])
    geom_target = WcsGeom.create(
        skydir=(20, 20),
        width=(1.5, 1.5),
        binsz=binsz / factor,
    )
    new_map = test_map.interp_to_geom(
        geom_target, fill_value=0.0, method="nearest", preserve_counts=True
    )
    assert_allclose(new_map.data[8, 8], test_map.data[4, 4] / factor**2, rtol=1e-4)
    assert_allclose(new_map.data[0, 8], 0.0, rtol=1e-4)


def test_map_plot_mask():
    from regions import CircleSkyRegion

    skydir = SkyCoord(0, 0, frame="galactic", unit="deg")

    m_wcs = Map.create(
        map_type="wcs",
        binsz=0.02,
        skydir=skydir,
        width=2.0,
    )

    exclusion_region = CircleSkyRegion(
        center=SkyCoord(0.0, 0.0, unit="deg", frame="galactic"), radius=0.6 * u.deg
    )

    mask = ~m_wcs.geom.region_mask([exclusion_region])

    with mpl_plot_check():
        mask.plot_mask()
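
# Illustrative aside (not part of the upstream test suite): interpolating a
# constant-valued map onto a finer target geometry preserves the value at any
# coordinate, while preserve_counts=True instead redistributes bin contents.
import numpy as np
from numpy.testing import assert_allclose
import astropy.units as u
from astropy.coordinates import SkyCoord
from gammapy.maps import Map, WcsGeom

geom = WcsGeom.create(npix=(5, 3), proj="CAR", binsz=60, skydir=(0, 0))
m = Map.from_geom(geom)
m.data += 30.0

target = WcsGeom.create(skydir=(0, 0), width=(10, 10), binsz=0.1 * u.deg)
m_interp = m.interp_to_geom(target, method="linear")

value = m_interp.get_by_coord({"skycoord": SkyCoord("0 deg", "0 deg")})
assert_allclose(value, 30.0, atol=1e-7)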
mpl_plot_check(): mask.plot_mask() def test_reproject_2d(): npix1 = 3 geom1 = WcsGeom.create(npix=npix1, frame="icrs") geom1_large = WcsGeom.create(npix=npix1 + 5, frame="icrs") map1 = Map.from_geom(geom1, data=np.eye(npix1), unit="s") factor = 10 binsz = 0.5 / factor npix2 = 7 * factor geom2 = WcsGeom.create( skydir=SkyCoord(0.1, 0.1, unit=u.deg), binsz=binsz, npix=npix2, frame="galactic" ) map1_repro = map1.reproject_to_geom(geom2, preserve_counts=True) assert map1_repro.unit == map1.unit assert_allclose(np.sum(map1_repro), np.sum(map1), rtol=1e-5) map1_new = map1_repro.reproject_to_geom(geom1_large, preserve_counts=True) assert_allclose(np.sum(map1_repro), np.sum(map1_new), rtol=1e-5) map1_repro = map1.reproject_to_geom(geom2, preserve_counts=False) map1_new = map1_repro.reproject_to_geom(geom1_large, preserve_counts=False) assert_allclose( np.sum(map1_repro * geom2.solid_angle()), np.sum(map1_new * geom1_large.solid_angle()), rtol=1e-3, ) factor = 0.5 binsz = 0.5 / factor npix = 7 geom2 = WcsGeom.create( skydir=SkyCoord(0.1, 0.1, unit=u.deg), binsz=binsz, npix=npix, frame="galactic" ) geom1_large = WcsGeom.create(npix=npix1 + 5, frame="icrs") map1_repro = map1.reproject_to_geom(geom2, preserve_counts=True) assert_allclose(np.sum(map1_repro), np.sum(map1), rtol=1e-5) map1_new = map1_repro.reproject_to_geom(geom1_large, preserve_counts=True) assert_allclose(np.sum(map1_repro), np.sum(map1_new), rtol=1e-3) map1_repro = map1.reproject_to_geom(geom2, preserve_counts=False) map1_new = map1_repro.reproject_to_geom(geom1_large, preserve_counts=False) assert_allclose( np.sum(map1_repro * geom2.solid_angle()), np.sum(map1_new * geom1_large.solid_angle()), rtol=1e-3, ) def test_resample_wcs_Wcs(): npix1 = 3 geom1 = WcsGeom.create(npix=npix1, frame="icrs") map1 = Map.from_geom(geom1, data=np.eye(npix1), unit="1 / (GeV m2 s sr)") geom2 = WcsGeom.create( skydir=SkyCoord(0.0, 0.0, unit=u.deg), binsz=0.5, npix=7, frame="galactic" ) map2 = map1.resample(geom2, preserve_counts=False) assert_allclose( np.sum(map2 * geom2.solid_angle()), np.sum(map1 * geom1.solid_angle()), rtol=1e-3, ) assert map2.unit == map1.unit def test_resample_weights(): npix1 = 3 geom1 = WcsGeom.create(npix=npix1, frame="icrs") map1 = Map.from_geom(geom1, data=np.eye(npix1)) geom2 = WcsGeom.create( skydir=SkyCoord(0.0, 0.0, unit=u.deg), binsz=0.5, npix=7, frame="galactic" ) map2 = map1.resample(geom2, weights=np.zeros(npix1), preserve_counts=False) assert np.sum(map2.data) == 0.0 def test_resample_downsample_wcs(): geom_fine = WcsGeom.create(npix=(15, 15), frame="icrs") map_fine = Map.from_geom(geom_fine) map_fine.data = np.arange(225).reshape((15, 15)) map_downsampled = map_fine.downsample(3) map_resampled = map_fine.resample(geom_fine.downsample(3), preserve_counts=True) assert_allclose(map_downsampled, map_resampled, rtol=1e-5) def test_resample_downsample_axis(): axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=12) geom_fine = WcsGeom.create(npix=(3, 3), frame="icrs", axes=[axis]) map_fine = Map.from_geom(geom_fine) map_fine.data = np.arange(108).reshape((12, 3, 3)) map_downsampled = map_fine.downsample(3, axis_name="energy", preserve_counts=False) map_resampled = map_fine.resample( geom_fine.downsample(3, axis_name="energy"), preserve_counts=False ) assert_allclose(map_downsampled, map_resampled, rtol=1e-5) def test_resample_wcs_hpx(): geom1 = HpxGeom.create(nside=32, frame="icrs") map1 = Map.from_geom(geom1, data=1.0, unit="1 / (GeV m2 s sr)") geom2 = HpxGeom.create( skydir=SkyCoord(0.0, 0.0, unit=u.deg), nside=8, 
frame="galactic" ) map2 = map1.resample(geom2, weights=map1.quantity, preserve_counts=False) assert_allclose( np.sum(map2 * geom2.solid_angle()), np.sum(map1 * geom1.solid_angle()), rtol=1e-3, ) assert map2.unit == map1.unit def test_map_reproject_wcs_to_hpx(): axis = MapAxis.from_bounds( 1.0, 10.0, 3, interp="log", name="energy", node_type="center" ) axis2 = MapAxis.from_bounds( 1.0, 10.0, 8, interp="log", name="energy", node_type="center" ) geom_wcs = WcsGeom.create( skydir=(0, 0), npix=(11, 11), binsz=10, axes=[axis], frame="galactic" ) geom_hpx = HpxGeom.create(binsz=10, frame="galactic", axes=[axis2]) data = np.arange(11 * 11 * 3).reshape(geom_wcs.data_shape) m = WcsNDMap(geom_wcs, data=data) m_r = m.reproject_to_geom(geom_hpx.to_image()) actual = m_r.get_by_coord({"lon": 0, "lat": 0, "energy": [1.0, 3.16227766, 10.0]}) assert_allclose(actual, [65.0, 186.0, 307.0], rtol=1e-3) with pytest.raises(TypeError): m.reproject_to_geom(geom_hpx) def test_map_reproject_hpx_to_wcs(): axis = MapAxis.from_bounds( 1.0, 10.0, 3, interp="log", name="energy", node_type="center" ) axis2 = MapAxis.from_bounds( 1.0, 10.0, 8, interp="log", name="energy", node_type="center" ) geom_wcs = WcsGeom.create( skydir=(0, 0), npix=(11, 11), binsz=10, axes=[axis], frame="galactic" ) geom_wcs2 = WcsGeom.create( skydir=(0, 0), npix=(11, 11), binsz=10, axes=[axis2], frame="galactic" ) geom_hpx = HpxGeom.create(binsz=10, frame="galactic", axes=[axis]) data = np.arange(3 * 768).reshape(geom_hpx.data_shape) m = HpxNDMap(geom_hpx, data=data) m_r = m.reproject_to_geom(geom_wcs.to_image()) actual = m_r.get_by_coord({"lon": 0, "lat": 0, "energy": [1.0, 3.16227766, 10.0]}) assert_allclose(actual, [287.5, 1055.5, 1823.5], rtol=1e-3) m_r = m.reproject_to_geom(geom_wcs) actual = m_r.get_by_coord({"lon": 0, "lat": 0, "energy": [1.0, 3.16227766, 10.0]}) assert_allclose(actual, [287.5, 1055.5, 1823.5], rtol=1e-3) with pytest.raises(TypeError): m.reproject_to_geom(geom_wcs2) def test_map_reproject_wcs_to_wcs_with_axes(): energy_nodes = np.arange(4) - 0.5 time_nodes = np.arange(5) - 0.5 axis1 = MapAxis(energy_nodes, interp="lin", name="energy", node_type="edges") axis2 = MapAxis(time_nodes, interp="lin", name="time", node_type="edges") geom_wcs_1 = WcsGeom.create( skydir=(266.405, -28.936), npix=(11, 11), binsz=0.1, axes=[axis1, axis2], frame="icrs", ) geom_wcs_2 = WcsGeom.create( skydir=(0, 0), npix=(11, 11), binsz=0.1, frame="galactic" ) spatial_data = np.zeros((11, 11)) energy_data = axis1.center.reshape(3, 1, 1) time_data = axis2.center.reshape(4, 1, 1, 1) data = spatial_data + energy_data + 0.5 * time_data m = WcsNDMap(geom_wcs_1, data=data) m_r = m.reproject_to_geom(geom_wcs_2) assert m.data.shape == m_r.data.shape geom_wcs_3 = geom_wcs_1.copy(axes=[axis1, axis2]) m_r_3 = m.reproject_to_geom(geom_wcs_3, preserve_counts=False) for data, idx in m_r_3.iter_by_image_data(): ref = idx[1] + 0.5 * idx[0] if np.any(data > 0): assert_allclose(np.nanmean(data[data > 0]), ref) factor = 2 axis1_up = axis1.upsample(factor=factor) axis2_up = axis2.upsample(factor=factor) geom_wcs_4 = geom_wcs_1.copy(axes=[axis1_up, axis2_up]) m_r_4 = m.reproject_to_geom(geom_wcs_4, preserve_counts=False) for data, idx in m_r_4.iter_by_image_data(): ref = int(idx[1] / factor) + 0.5 * int(idx[0] / factor) if np.any(data > 0): assert_allclose(np.nanmean(data[data > 0]), ref) def test_map_reproject_by_image(): axis = MapAxis.from_bounds( 1.0, 10.0, 3, interp="log", name="energy_true", node_type="center" ) axis2 = MapAxis.from_bounds( 1.0, 10.0, 8, interp="log", 
name="energy", node_type="center" ) geom_wcs = WcsGeom.create(skydir=(0, 0), npix=(11, 11), binsz=10, frame="galactic") geom_hpx = HpxGeom.create(binsz=10, frame="galactic", axes=[axis]) geom_hpx_wrong = HpxGeom.create(binsz=10, frame="galactic", axes=[axis, axis2]) m = HpxNDMap(geom_hpx_wrong) m.reproject_by_image(geom_wcs) assert len(m.geom.axes) == 2 data = np.arange(3 * 768).reshape(geom_hpx.data_shape) m = HpxNDMap(geom_hpx, data=data) geom_wcs_wrong = WcsGeom.create( skydir=(0, 0), npix=(11, 11), binsz=10, axes=[axis], frame="galactic" ) with pytest.raises(TypeError): m.reproject_by_image(geom_wcs_wrong) m_r = m.reproject_by_image(geom_wcs) actual = m_r.get_by_coord( {"lon": 0, "lat": 0, "energy_true": [1.0, 3.16227766, 10.0]} ) assert_allclose(actual, [287.5, 1055.5, 1823.5], rtol=1e-3) def test_wcsndmap_reproject_allsky_car(): geom = WcsGeom.create(binsz=10.0, proj="CAR", frame="icrs") m = WcsNDMap(geom) mask = m.geom.get_coord()[0] > 180 * u.deg m.data = mask.astype(float) geom0 = WcsGeom.create( binsz=20, proj="CAR", frame="icrs", skydir=(180.0, 0.0), width=20 ) expected = np.mean([m.data[0, 0], m.data[0, -1]]) # wrap at 180deg m0 = m.reproject_to_geom(geom0) assert_allclose(m0.data[0], expected) geom1 = HpxGeom.create(binsz=5.0, frame="icrs") m1 = m.reproject_to_geom(geom1) mask = np.abs(m1.geom.get_coord()[0] - 180) <= 5 assert_allclose(np.unique(m1.data[mask])[1], expected) def test_iter_by_image(): time_axis = MapAxis.from_bounds( 0, 3, nbin=3, unit="hour", name="time", interp="lin" ) energy_axis = MapAxis.from_bounds( 1, 100, nbin=4, unit="TeV", name="energy", interp="log" ) m_4d = Map.create( binsz=0.2, width=(1, 1), frame="galactic", axes=[energy_axis, time_axis] ) for m in m_4d.iter_by_image(keepdims=True): assert m.data.ndim == 4 assert m.geom.axes.names == ["energy", "time"] for m in m_4d.iter_by_image(keepdims=False): assert m.data.ndim == 2 assert m.geom.axes.names == [] def test_iter_by_axis(): time_axis = MapAxis.from_bounds( 0, 3, nbin=3, unit="hour", name="time", interp="lin" ) energy_axis = MapAxis.from_bounds( 1, 100, nbin=4, unit="TeV", name="energy", interp="log" ) m_4d = Map.create( binsz=0.2, width=(1, 1), frame="galactic", axes=[energy_axis, time_axis] ) for m in m_4d.iter_by_axis(axis_name="energy", keepdims=False): assert m.data.ndim == 3 assert m.geom.axes.names == ["time"] def test_split_by_axis(): time_axis = MapAxis.from_bounds( 0, 24, nbin=3, unit="hour", name="time", ) energy_axis = MapAxis.from_bounds( 1, 100, nbin=2, unit="TeV", name="energy", interp="log" ) m_4d = Map.create( binsz=1.0, width=(10, 5), frame="galactic", axes=[energy_axis, time_axis] ) maps = m_4d.split_by_axis("time") assert len(maps) == 3 def test_rename_axes(): time_axis = MapAxis.from_bounds( 0, 24, nbin=3, unit="hour", name="time", ) energy_axis = MapAxis.from_bounds( 1, 100, nbin=2, unit="TeV", name="energy", interp="log" ) m_4d = Map.create( binsz=1.0, width=(10, 5), frame="galactic", axes=[energy_axis, time_axis] ) geom = m_4d.geom.rename_axes("energy", "energy_true") assert m_4d.geom.axes.names == ["energy", "time"] assert geom.axes.names == ["energy_true", "time"] new_map = m_4d.rename_axes("energy", "energy_true") assert m_4d.geom.axes.names == ["energy", "time"] assert new_map.geom.axes.names == ["energy_true", "time"] def test_reorder_axes_fail(): axis1 = MapAxis.from_edges((0, 1, 3), name="axis1") axis2 = MapAxis.from_edges((0, 1, 2, 3, 4), name="axis2") some_map = RegionNDMap.create(region=None, axes=[axis1, axis2]) with pytest.raises(ValueError): 
some_map.reorder_axes(["axis3", "axis1"]) with pytest.raises(ValueError): some_map.reorder_axes("axis3") def test_reorder_axes(): axis1 = MapAxis.from_edges((0, 1, 3), name="axis1") axis2 = MapAxis.from_edges((0, 1, 2, 3, 4), name="axis2") axis3 = MapAxis.from_edges((0, 1, 2, 3), name="axis3") some_map = RegionNDMap.create(region=None, axes=[axis1, axis2, axis3]) some_map.data[:, 1, :] = 1 new_map = some_map.reorder_axes(["axis2", "axis1", "axis3"]) assert new_map.geom.axes.names == ["axis2", "axis1", "axis3"] assert new_map.geom.data_shape == (3, 2, 4, 1, 1) assert_allclose(new_map.data[:, :, 1], 1) def test_map_dot_product_fail(): axis1 = MapAxis.from_edges((0, 1, 2, 3), name="axis1") axis2 = MapAxis.from_edges((0, 1, 2, 3, 4), name="axis2") axis3 = MapAxis.from_edges((0, 1, 2), name="axis1") map1 = WcsNDMap.create(npix=5, axes=[axis1]) map2 = RegionNDMap.create(region=None, axes=[axis2, axis3]) map3 = RegionNDMap.create(region=None, axes=[axis2, axis3]) with pytest.raises(TypeError): map1.dot(map1) with pytest.raises(ValueError): map1.dot(map2) with pytest.raises(ValueError): map1.dot(map3) def test_map_dot_product(): axis1 = MapAxis.from_edges((0, 1, 3), name="axis1") axis2 = MapAxis.from_edges((0, 1, 2, 3, 4), name="axis2") map1 = WcsNDMap.create(npix=(5, 6), axes=[axis1]) map2 = RegionNDMap.create(region=None, axes=[axis1, axis2]) map1.data[0, ...] = 1 map1.data[1, ...] = 2 map2.data[0, 0, ...] = 1 map2.data[1, 1, ...] = 2 dot_map = map1 @ map2 assert dot_map.geom.axes.names == ["axis2"] assert_allclose(dot_map.data[:, 0, 0], [1, 4, 0, 0]) map3 = RegionNDMap.create(region=None, axes=[axis1]) map3.data[1, 0, 0] = 1 dot_map = map1.dot(map3) assert dot_map.geom.axes.names == [] assert_allclose(dot_map.data[0, 0], 2) axis3 = MapAxis.from_edges((0, 1, 2, 3), name="axis3") axis4 = MapAxis.from_edges((1, 2, 3, 4, 5), name="axis4") map4 = RegionNDMap.create(region=None, axes=[axis2, axis3, axis4]) map4.data[...] 
= 1 dot_map = map4.dot(map2) assert dot_map.geom.axes.names == ["axis1", "axis3", "axis4"] assert dot_map.data.shape == (4, 3, 2, 1, 1) dot_map = map2.dot(map4) assert dot_map.geom.axes.names == ["axis1", "axis3", "axis4"] assert dot_map.data.shape == (4, 3, 2, 1, 1) assert_allclose(dot_map.data[0, 0, :, 0, 0], [1, 2]) map5 = RegionNDMap.create(region=None, axes=[axis3, axis2, axis4]) map5.data[:, 0, :, :, :] = 1 dot_map = map5.dot(map2) assert dot_map.geom.axes.names == ["axis3", "axis1", "axis4"] assert dot_map.data.shape == (4, 2, 3, 1, 1) assert_allclose(dot_map.data[0, :, 0, 0, 0], [1, 0]) def test_data_shape_broadcast(): axis1 = MapAxis.from_edges((0, 1, 3), name="axis1") map1 = WcsNDMap.create(npix=(5, 6), axes=[axis1]) with pytest.raises(ValueError): map1.data = np.zeros((1, 2, 6, 5)) with pytest.raises(ValueError): map1.data = np.zeros((2, 6, 5, 1)) map2: RegionNDMap = RegionNDMap.create(region=None, axes=[axis1]) map2.data = np.ones((2,)) assert_allclose(map2.data, 1) assert map2.data.shape == (2, 1, 1) map2.data = np.zeros((2, 1, 1)) assert_allclose(map2.data, 0) with pytest.raises(ValueError): map2.data = np.ones((2, 1)) with pytest.raises(ValueError): map2.data = np.ones((1, 1, 2)) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/tests/test_counts.py0000644000175100001770000000441414721316200021075 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.table import Table from gammapy.data import EventList from gammapy.maps import HpxGeom, Map, MapAxis, WcsNDMap from gammapy.utils.testing import requires_dependency @pytest.fixture() def events(): t = Table() t["EVENT_ID"] = np.array([1, 5], dtype=np.uint16) t["RA"] = [5, 11] * u.deg t["DEC"] = [0, 0] * u.deg t["ENERGY"] = [10, 12] * u.TeV t["TIME"] = [3, 4] * u.s return EventList(t) def test_map_fill_events_wcs(events): # 2D map m = Map.create(npix=(2, 1), binsz=10) m.fill_events(events) assert_allclose(m.data, [[1, 0]]) # 3D with energy axis axis = MapAxis.from_edges([9, 11, 13], name="energy", unit="TeV") m = Map.create(npix=(2, 1), binsz=10, axes=[axis]) m.fill_events(events) assert m.data.sum() == 1 assert_allclose(m.data[0, 0, 0], 1) weights = np.array([0.5, 1]) m = Map.create(npix=(2, 1), binsz=10, axes=[axis]) m.fill_events(events, weights=weights) assert_allclose(m.data.sum(), 0.5) @requires_dependency("healpy") def test_map_fill_events_hpx(events): # 2D map m = Map.from_geom(HpxGeom(1)) m.fill_events(events) assert m.data[4] == 2 # 3D with energy axis axis = MapAxis.from_edges([9, 11, 13], name="energy", unit="TeV") m = Map.from_geom(HpxGeom(1, axes=[axis])) m.fill_events(events) assert m.data[0, 4] == 1 assert m.data[1, 4] == 1 weights = np.array([0.5, 1]) m = Map.from_geom(HpxGeom(1, axes=[axis])) m.fill_events(events, weights=weights) assert_allclose(m.data.sum(), 1.5) def test_map_fill_events_keyerror(events): axis = MapAxis([0, 1, 2], name="nokey") m = WcsNDMap.create(binsz=0.1, npix=10, axes=[axis]) with pytest.raises(KeyError): m.fill_events(events) def test_map_fill_events_uint_column(events): # Check that unsigned int column works. 
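# (The events fixture above stores EVENT_ID as np.uint16, i.e. an unsigned integer column.)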
# Regression test for https://github.com/gammapy/gammapy/issues/1620 axis = MapAxis.from_edges([0, 3, 6], name="event_id") m = Map.create(npix=(2, 1), binsz=10, axes=[axis]) m.fill_events(events) assert m.data.sum() == 1 assert_allclose(m.data[0, 0, 0], 1) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/tests/test_maps.py0000644000175100001770000000500714721316200020521 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from gammapy.maps import Map, MapAxis, Maps, RegionNDMap, WcsGeom from gammapy.utils.testing import assert_allclose @pytest.fixture() def map_dictionary(): mapdict = {} axis = MapAxis.from_edges([1, 2, 3, 4], name="axis", unit="cm") geom = WcsGeom.create(npix=10, axes=[axis]) mapdict["map1"] = Map.from_geom(geom, data=1) mapdict["map2"] = Map.from_geom(geom, data=2) return mapdict def test_maps(map_dictionary): maps = Maps(**map_dictionary) maps["map3"] = maps["map1"].copy() assert maps.geom.npix[0] == 10 assert len(maps) == 3 assert_allclose(maps["map1"].data, 1) assert_allclose(maps["map2"].data, 2) assert_allclose(maps["map3"].data, 1) assert "map3" in maps.__str__() @pytest.mark.xfail def test_maps_wrong_addition(map_dictionary): maps = Maps(**map_dictionary) # Test pop method some_map = maps.pop("map2") assert len(maps) == 1 assert_allclose(some_map.data, 2) # Test incorrect map addition with pytest.raises(ValueError): maps["map3"] = maps["map1"].sum_over_axes() def test_maps_read_write(map_dictionary): maps = Maps(**map_dictionary) maps.write("test.fits", overwrite=True) new_maps = Maps.read("test.fits") assert new_maps.geom == maps.geom assert len(new_maps) == 2 assert_allclose(new_maps["map1"].data, 1) assert_allclose(new_maps["map2"].data, 2) def test_maps_region(): axis = MapAxis.from_edges([1, 2, 3, 4], name="axis", unit="cm") map1 = RegionNDMap.create(region=None, axes=[axis]) map1.data = 1 map2 = RegionNDMap.create(region=None, axes=[axis]) maps = Maps(map1=map1, map2=map2) assert len(maps) == 2 assert_allclose(maps["map1"], 1) def test_maps_from_geom(): geom = WcsGeom.create(npix=5) names = ["map1", "map2", "map3"] kwargs_list = [ {"unit": "cm2s", "dtype": "float64"}, {"dtype": "bool"}, {"data": np.arange(25).reshape(5, 5)}, ] maps = Maps.from_geom(geom, names) maps_kwargs = Maps.from_geom(geom, names, kwargs_list=kwargs_list) assert len(maps) == 3 assert maps["map1"].geom == geom assert maps["map2"].unit == "" assert maps["map3"].data.dtype == np.float32 assert len(maps_kwargs) == 3 assert maps_kwargs["map1"].unit == "cm2s" assert maps_kwargs["map1"].data.dtype == np.float64 assert maps_kwargs["map2"].data.dtype == bool assert maps_kwargs["map3"].data[2, 2] == 12 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/tests/test_measure.py0000644000175100001770000000316414721316200021224 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose from regions import CompoundSkyRegion, PolygonSkyRegion from gammapy.maps import HpxGeom, HpxNDMap, containment_radius, containment_region from gammapy.modeling.models import GaussianSpatialModel from gammapy.utils.testing import requires_dependency def test_containment(): model = GaussianSpatialModel(sigma="0.15 deg") geom = model._get_plot_map(None).geom.upsample(factor=3) model_map = model.integrate_geom(geom) regions = 
containment_region(model_map, fraction=0.393, apply_union=True) assert isinstance(regions, PolygonSkyRegion) # because there is only one assert_allclose( regions.vertices.separation(geom.center_skydir), model.sigma.quantity, rtol=1e-2 ) radius = containment_radius(model_map, fraction=0.393) assert_allclose(radius, model.sigma.quantity, rtol=1e-2) model2 = GaussianSpatialModel(lon_0="-0.5deg", sigma="0.15 deg") model_map2 = model_map + model2.integrate_geom(geom) regions = containment_region(model_map2, fraction=0.1, apply_union=True) assert isinstance(regions, CompoundSkyRegion) regions = containment_region(model_map2, fraction=0.1, apply_union=False) assert isinstance(regions, list) @requires_dependency("healpy") def test_containment_fail_hpx(): geom_hpx = HpxGeom.create(binsz=10, frame="galactic") m = HpxNDMap(geom_hpx) with pytest.raises(TypeError): containment_region(m, fraction=0.393, apply_union=True) with pytest.raises(TypeError): containment_radius(m, fraction=0.393) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/utils.py0000644000175100001770000000711114721316200016516 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy import units as u from astropy.coordinates import Angle from astropy.time import Time def _check_width(width): """Check and normalise width argument. Always returns tuple (lon, lat) as float in degrees. """ if isinstance(width, tuple): lon = Angle(width[0], "deg").deg lat = Angle(width[1], "deg").deg return lon, lat else: angle = Angle(width, "deg").deg if np.isscalar(angle): return angle, angle else: return tuple(angle) def _check_binsz(binsz): """Check and normalise bin size argument. Always returns an object with the same shape as the input where the spatial coordinates are a float in degrees. 
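For illustration, a minimal sketch of the normalisation (the values follow directly from `~astropy.coordinates.Angle` conversion)::

    _check_binsz(0.1)          # 0.1
    _check_binsz((0.1, 0.05))  # (0.1, 0.05)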
""" if isinstance(binsz, tuple): lon_sz = Angle(binsz[0], "deg").deg lat_sz = Angle(binsz[1], "deg").deg return lon_sz, lat_sz elif isinstance(binsz, list): binsz[:2] = Angle(binsz[:2], unit="deg").deg return binsz return Angle(binsz, unit="deg").deg def coordsys_to_frame(coordsys): if coordsys in ["CEL", "C"]: return "icrs" elif coordsys in ["GAL", "G"]: return "galactic" else: raise ValueError(f"Unknown coordinate system: '{coordsys}'") def frame_to_coordsys(frame): if frame in ["icrs", "fk5", "fk4"]: return "CEL" elif frame in ["galactic"]: return "GAL" else: raise ValueError(f"Unknown coordinate frame '{frame}'") class InvalidValue: """Class to define placeholder for invalid array values.""" float = np.nan int = np.nan bool = np.nan time = Time(0.0, format="mjd", scale="tt") def __getitem__(self, dtype): if np.issubdtype(dtype, np.integer): return self.int elif np.issubdtype(dtype, np.floating): return self.float elif np.issubdtype(dtype, np.dtype(bool).type): return self.bool elif np.issubdtype(dtype, Time): return self.time else: raise ValueError(f"No invalid value placeholder defined for {dtype}") class InvalidIndex: """Class to define placeholder for invalid array indices.""" float = np.nan int = -1 bool = False INVALID_VALUE = InvalidValue() INVALID_INDEX = InvalidIndex() def edges_from_lo_hi(edges_lo, edges_hi): if np.isscalar(edges_lo.value) and np.isscalar(edges_hi.value): return u.Quantity([edges_lo, edges_hi]) edges = edges_lo.copy() try: edges = edges.insert(len(edges), edges_hi[-1]) except AttributeError: edges = np.insert(edges, len(edges), edges_hi[-1]) return edges def broadcast_axis_values_to_geom(geom, axis_name, use_center=True): """Reshape axis center array to the expected shape for broadcasting. Parameters ---------- geom : `~gammapy.maps.Geom` the input Geom. axis_name : str input axis name. Must be part of the Geom non-spatial axes. use_centers : bool reshape center array if True, else use edges. Returns ------- array : `~numpy.array` or `~astropy.Quantity` reshaped array. 
""" if axis_name not in geom.axes.names: raise ValueError(f"Can't find axis {axis_name} in input Geom.") broadcast_shape = len(geom.data_shape) * [ 1, ] broadcast_shape[geom.axes.index_data(axis_name)] = -1 if use_center: return geom.axes[axis_name].center.reshape(broadcast_shape) else: return geom.axes[axis_name].edges.reshape(broadcast_shape) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.224642 gammapy-1.3/gammapy/maps/wcs/0000755000175100001770000000000014721316215015606 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/wcs/__init__.py0000644000175100001770000000031314721316200017706 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .core import WcsMap from .geom import WcsGeom from .ndmap import WcsNDMap __all__ = [ "WcsGeom", "WcsMap", "WcsNDMap", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/wcs/core.py0000644000175100001770000002513514721316200017110 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import json import numpy as np import astropy.units as u from astropy.io import fits from gammapy.utils.types import JsonQuantityEncoder from ..core import Map from ..io import find_bands_hdu, find_hdu from .geom import WcsGeom from .io import identify_wcs_format __all__ = ["WcsMap"] class WcsMap(Map): """Base class for WCS map classes. Parameters ---------- geom : `~gammapy.maps.WcsGeom` A WCS geometry object. data : `~numpy.ndarray` Data array. """ @classmethod def create( cls, map_type="wcs", npix=None, binsz=0.1, width=None, proj="CAR", frame="icrs", refpix=None, axes=None, skydir=None, dtype="float32", meta=None, unit="", ): """Factory method to create an empty WCS map. Parameters ---------- map_type : {'wcs', 'wcs-sparse'}, optional Map type. Selects the class that will be used to instantiate the map. Default is "wcs". npix : int or tuple or list, optional Width of the map in pixels. A tuple will be interpreted as parameters for longitude and latitude axes. For maps with non-spatial dimensions, list input can be used to define a different map width in each image plane. This option supersedes width. Default is None. binsz : float or tuple or list, optional Map pixel size in degrees. A tuple will be interpreted as parameters for longitude and latitude axes. For maps with non-spatial dimensions, list input can be used to define a different bin size in each image plane. Default is 0.1. width : float or tuple or list, optional Width of the map in degrees. A tuple will be interpreted as parameters for longitude and latitude axes. For maps with non-spatial dimensions, list input can be used to define a different map width in each image plane. Default is None. proj : string, optional Any valid WCS projection type. Default is 'CAR' (cartesian). frame : {"icrs", "galactic"}, optional Coordinate system, either Galactic ("galactic") or Equatorial ("icrs"). Default is "icrs". refpix : tuple, optional Reference pixel of the projection. If None then this will be chosen to be center of the map. Default is None. axes : list, optional List of non-spatial axes. Default is None. skydir : tuple or `~astropy.coordinates.SkyCoord`, optional Sky position of map center. Can be either a SkyCoord object or a tuple of longitude and latitude in degrees in the coordinate system of the map. dtype : str, optional Data type. 
Default is "float32". meta : `dict`, optional Dictionary to store metadata. Default is None. unit : str or `~astropy.units.Unit`, optional The unit of the map. Default is "". Returns ------- map : `~WcsMap` A WCS map object. """ from .ndmap import WcsNDMap geom = WcsGeom.create( npix=npix, binsz=binsz, width=width, proj=proj, skydir=skydir, frame=frame, refpix=refpix, axes=axes, ) if map_type == "wcs": return WcsNDMap(geom, dtype=dtype, meta=meta, unit=unit) elif map_type == "wcs-sparse": raise NotImplementedError else: raise ValueError(f"Invalid map type: {map_type!r}") @classmethod def from_hdulist(cls, hdu_list, hdu=None, hdu_bands=None, format="gadf"): """Make a WcsMap object from a FITS HDUList. Parameters ---------- hdu_list : `~astropy.io.fits.HDUList` HDU list containing HDUs for map data and bands. hdu : str, optional Name or index of the HDU with the map data. Default is None. hdu_bands : str, optional Name or index of the HDU with the BANDS table. Default is None. format : {'gadf', 'fgst-ccube', 'fgst-template'}, optional FITS format convention. Default is "gadf". Returns ------- wcs_map : `WcsMap` Map object. """ if hdu is None: hdu = find_hdu(hdu_list) else: hdu = hdu_list[hdu] if hdu_bands is None: hdu_bands = find_bands_hdu(hdu_list, hdu) if hdu_bands is not None: hdu_bands = hdu_list[hdu_bands] format = identify_wcs_format(hdu_bands) wcs_map = cls.from_hdu(hdu, hdu_bands, format=format) if wcs_map.unit.is_equivalent(""): if format == "fgst-template": if "GTI" in hdu_list: # exposure maps have an additional GTI hdu wcs_map._unit = u.Unit("cm2 s") else: wcs_map._unit = u.Unit("cm-2 s-1 MeV-1 sr-1") return wcs_map def to_hdulist(self, hdu=None, hdu_bands=None, sparse=False, format="gadf"): """Convert to `~astropy.io.fits.HDUList`. Parameters ---------- hdu : str, optional Name or index of the HDU with the map data. Default is None. hdu_bands : str, optional Name or index of the HDU with the BANDS table. Default is None. sparse : bool, optional Sparsify the map by only writing pixels with non-zero amplitude. Default is False. format : {'gadf', 'fgst-ccube','fgst-template'}, optional FITS format convention. Default is "gadf". Returns ------- hdu_list : `~astropy.io.fits.HDUList` HDU list. """ if sparse: hdu = "SKYMAP" if hdu is None else hdu.upper() else: hdu = "PRIMARY" if hdu is None else hdu.upper() if sparse and hdu == "PRIMARY": raise ValueError("Sparse maps cannot be written to the PRIMARY HDU.") if format in ["fgst-ccube", "fgst-template"]: if self.geom.axes[0].name != "energy" or len(self.geom.axes) > 1: raise ValueError( "All 'fgst' formats don't support extra axes except for energy." ) if hdu_bands is None: hdu_bands = f"{hdu.upper()}_BANDS" if self.geom.axes: hdu_bands_out = self.geom.to_bands_hdu(hdu_bands=hdu_bands, format=format) hdu_bands = hdu_bands_out.name else: hdu_bands = None hdu_out = self.to_hdu(hdu=hdu, hdu_bands=hdu_bands, sparse=sparse) hdu_out.header["META"] = json.dumps(self.meta, cls=JsonQuantityEncoder) hdu_out.header["BUNIT"] = self.unit.to_string("fits") if hdu == "PRIMARY": hdulist = [hdu_out] else: hdulist = [fits.PrimaryHDU(), hdu_out] if self.geom.axes: hdulist += [hdu_bands_out] return fits.HDUList(hdulist) def to_hdu(self, hdu="SKYMAP", hdu_bands=None, sparse=False): """Make a FITS HDU from this map. Parameters ---------- hdu : str, optional The HDU extension name. Default is "SKYMAP". hdu_bands : str, optional The HDU extension name for BANDS table. Default is None. 
sparse : bool, optional Set INDXSCHM to SPARSE and sparsify the map by only writing pixels with non-zero amplitude. Default is False. Returns ------- hdu : `~astropy.io.fits.BinTableHDU` or `~astropy.io.fits.ImageHDU` HDU containing the map data. """ header = self.geom.to_header() if self.is_mask: data = self.data.astype(int) else: data = self.data if hdu_bands is not None: header["BANDSHDU"] = hdu_bands if sparse: hdu_out = self._make_hdu_sparse(data, self.geom.npix, hdu, header) elif hdu == "PRIMARY": hdu_out = fits.PrimaryHDU(data, header=header) else: hdu_out = fits.ImageHDU(data, header=header, name=hdu) return hdu_out @staticmethod def _make_hdu_sparse(data, npix, hdu, header): shape = data.shape # We make a copy, because below we modify `data` to handle non-finite entries # TODO: The code below could probably be simplified to use expressions # that create new arrays instead of in-place modifications # But first: do we want / need the non-finite entry handling at all and # always cast to 64-bit float? data = data.copy() if len(shape) == 2: data_flat = np.ravel(data) non_zero = np.where(~(data_flat == 0)) value = data_flat[non_zero].astype(float) cols = [ fits.Column("PIX", "J", array=non_zero[0]), fits.Column("VALUE", "E", array=value), ] elif npix[0].size == 1: shape_flat = shape[:-2] + (shape[-1] * shape[-2],) data_flat = np.ravel(data).reshape(shape_flat) nonzero = np.where(~(data_flat == 0)) channel = np.ravel_multi_index(nonzero[:-1], shape[:-2]) value = data_flat[nonzero].astype(float) cols = [ fits.Column("PIX", "J", array=nonzero[-1]), fits.Column("CHANNEL", "I", array=channel), fits.Column("VALUE", "E", array=value), ] else: data_flat = [] channel = [] pix = [] for i, _ in np.ndenumerate(npix[0]): data_i = np.ravel(data[i[::-1]]) pix_i = np.where(~(data_i == 0)) data_i = data_i[pix_i] data_flat += [data_i] pix += pix_i channel += [ np.ones(data_i.size, dtype=int) * np.ravel_multi_index(i[::-1], shape[:-2]) ] pix = np.concatenate(pix) channel = np.concatenate(channel) value = np.concatenate(data_flat).astype(float) cols = [ fits.Column("PIX", "J", array=pix), fits.Column("CHANNEL", "I", array=channel), fits.Column("VALUE", "E", array=value), ] return fits.BinTableHDU.from_columns(cols, header=header, name=hdu) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/wcs/geom.py0000644000175100001770000012303014721316200017100 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import copy from functools import lru_cache import numpy as np import astropy.units as u from astropy.convolution import Tophat2DKernel from astropy.coordinates import Angle, SkyCoord from astropy.io import fits from astropy.nddata import Cutout2D from astropy.nddata.utils import overlap_slices from astropy.utils import lazyproperty from astropy.wcs import WCS from astropy.wcs.utils import ( celestial_frame_to_wcs, proj_plane_pixel_scales, wcs_to_celestial_frame, ) from regions import RectangleSkyRegion from gammapy.utils.array import round_up_to_even, round_up_to_odd from gammapy.utils.compat import COPY_IF_NEEDED from ..axes import MapAxes from ..coord import MapCoord, skycoord_to_lonlat from ..geom import Geom, get_shape, pix_tuple_to_idx from ..utils import INVALID_INDEX, _check_binsz, _check_width __all__ = ["WcsGeom"] def cast_to_shape(param, shape, dtype): """Cast a tuple of parameter arrays to a given shape.""" if not isinstance(param, tuple): param = [param] param = [np.array(p, ndmin=1, dtype=dtype) 
for p in param] if len(param) == 1: param = [param[0].copy(), param[0].copy()] for i, p in enumerate(param): if p.size > 1 and p.shape != shape: raise ValueError if p.shape == shape: continue param[i] = p * np.ones(shape, dtype=dtype) return tuple(param) def get_resampled_wcs(wcs, factor, downsampled): """Get resampled WCS object.""" wcs = wcs.deepcopy() if not downsampled: factor = 1.0 / factor wcs.wcs.cdelt *= factor wcs.wcs.crpix = (wcs.wcs.crpix - 0.5) / factor + 0.5 return wcs class WcsGeom(Geom): """Geometry class for WCS maps. This class encapsulates both the WCS transformation object and the image extent (number of pixels in each dimension). Provides methods for accessing the properties of the WCS object and performing transformations between pixel and world coordinates. Parameters ---------- wcs : `~astropy.wcs.WCS` WCS projection object. npix : tuple Number of pixels in each spatial dimension. cdelt : tuple Pixel size in each image plane. If None then a constant pixel size will be used. crpix : tuple Reference pixel coordinate in each image plane. axes : list Axes for non-spatial dimensions. """ _slice_spatial_axes = slice(0, 2) _slice_non_spatial_axes = slice(2, None) is_hpx = False is_region = False def __init__(self, wcs, npix=None, cdelt=None, crpix=None, axes=None): self._wcs = wcs self._frame = wcs_to_celestial_frame(wcs).name self._projection = wcs.wcs.ctype[0][5:] self._axes = MapAxes.from_default(axes, n_spatial_axes=2) if npix is None: if wcs.array_shape is not None: npix = wcs.array_shape[::-1] else: npix = (0, 0) if cdelt is None: cdelt = tuple(np.abs(self.wcs.wcs.cdelt)) # Shape to use for WCS transformations wcs_shape = max([get_shape(t) for t in [npix, cdelt]]) self._npix = cast_to_shape(npix, wcs_shape, int) self._cdelt = cast_to_shape(cdelt, wcs_shape, float) # By convention CRPIX is indexed from 1 if crpix is None: crpix = tuple(1.0 + (np.array(self._npix) - 1.0) / 2.0) self._crpix = crpix # define cached methods self.get_coord = lru_cache()(self.get_coord) self.get_pix = lru_cache()(self.get_pix) def __setstate__(self, state): for key, value in state.items(): if key in ["get_coord", "get_pix"]: state[key] = lru_cache()(value) self.__dict__ = state @property def data_shape(self): """Shape of the `~numpy.ndarray` matching this geometry.""" return self._shape[::-1] @property def axes_names(self): """All axes names.""" return ["lon", "lat"] + self.axes.names @property def data_shape_axes(self): """Shape of data of the non-spatial axes and unit spatial axes.""" return self.axes.shape[::-1] + (1, 1) @property def data_shape_image(self): """Shape of data of the spatial axes and unit non-spatial axes.""" return (1,) * len(self.axes) + self.data_shape[len(self.axes) :] @property def _shape(self): npix_shape = tuple([np.max(self.npix[0]), np.max(self.npix[1])]) return npix_shape + self.axes.shape @property def _shape_edges(self): npix_shape = tuple([np.max(self.npix[0]) + 1, np.max(self.npix[1]) + 1]) return npix_shape + self.axes.shape @property def shape_axes(self): """Shape of non-spatial axes.""" return self._shape[self._slice_non_spatial_axes] @property def wcs(self): """WCS projection object.""" return self._wcs @property def frame(self): """Coordinate system of the projection. Galactic ("galactic") or Equatorial ("icrs"). """ return self._frame def cutout_slices(self, geom, mode="partial"): """Compute cutout slices. Parameters ---------- geom : `WcsGeom` Parent geometry. mode : {"trim", "partial", "strict"}, optional Cutout slices mode. Default is "partial". 
Returns ------- slices : dict Dictionary containing "parent-slices" and "cutout-slices". """ position = geom.to_image().coord_to_pix(self.center_skydir) slices = overlap_slices( large_array_shape=geom.data_shape[-2:], small_array_shape=self.data_shape[-2:], position=[_[0] for _ in position[::-1]], mode=mode, ) return { "parent-slices": slices[0], "cutout-slices": slices[1], } @property def projection(self): """Map projection.""" return self._projection @property def is_allsky(self): """Flag for all-sky maps.""" if np.all(np.isclose(self._npix[0] * self._cdelt[0], 360.0)) and np.all( np.isclose(self._npix[1] * self._cdelt[1], 180.0) ): return True else: return False @property def is_regular(self): """If geometry is regular in non-spatial dimensions as a boolean. - False for multi-resolution or irregular geometries. - True if all image planes have the same pixel geometry. """ if self.npix[0].size > 1: return False else: return True @property def width(self): """Tuple with image dimension in degrees in longitude and latitude.""" dlon = self._cdelt[0] * self._npix[0] dlat = self._cdelt[1] * self._npix[1] return (dlon, dlat) * u.deg @property def pixel_area(self): """Pixel area in deg^2.""" # FIXME: Correctly compute solid angle for projection return self._cdelt[0] * self._cdelt[1] @property def npix(self): """Tuple with image dimension in pixels in longitude and latitude.""" return self._npix @property def axes(self): """List of non-spatial axes.""" return self._axes @property def ndim(self): return len(self.data_shape) @property def center_coord(self): """Map coordinate of the center of the geometry. Returns ------- coord : tuple """ return self.pix_to_coord(self.center_pix) @property def center_pix(self): """Pixel coordinate of the center of the geometry. Returns ------- pix : tuple """ return tuple((np.array(self.data_shape) - 1.0) / 2)[::-1] @property def center_skydir(self): """Sky coordinate of the center of the geometry. Returns ------- pix : `~astropy.coordinates.SkyCoord` """ return SkyCoord.from_pixel(self.center_pix[0], self.center_pix[1], self.wcs) @property def pixel_scales(self): """ Pixel scale. Returns angles along each axis of the image at the CRPIX location once it is projected onto the plane of intermediate world coordinates. Returns ------- angle: `~astropy.coordinates.Angle` """ return Angle(proj_plane_pixel_scales(self.wcs), "deg") @classmethod def create( cls, npix=None, binsz=0.5, proj="CAR", frame="icrs", refpix=None, axes=None, skydir=None, width=None, ): """Create a WCS geometry object. Pixelization of the map is set with ``binsz`` and one of either ``npix`` or ``width`` arguments. For maps with non-spatial dimensions a different pixelization can be used for each image plane by passing a list or array argument for any of the pixelization parameters. If both npix and width are None then an all-sky geometry will be created. Parameters ---------- npix : int or tuple or list, optional Width of the map in pixels. A tuple will be interpreted as parameters for longitude and latitude axes. For maps with non-spatial dimensions, list input can be used to define a different map width in each image plane. This option supersedes width. Default is None. binsz : float or tuple or list, optional Map pixel size in degrees. A tuple will be interpreted as parameters for longitude and latitude axes. For maps with non-spatial dimensions, list input can be used to define a different bin size in each image plane. Default is 0.5 proj : string, optional Any valid WCS projection type. 
Default is 'CAR' (Plate-Carrée projection). See `WCS supported projections `__ # noqa: E501 frame : {"icrs", "galactic"}, optional Coordinate system, either Galactic ("galactic") or Equatorial ("icrs"). Default is "icrs". refpix : tuple, optional Reference pixel of the projection. If None this will be set to the center of the map. Default is None. axes : list, optional List of non-spatial axes. skydir : tuple or `~astropy.coordinates.SkyCoord`, optional Sky position of map center. Can be either a SkyCoord object or a tuple of longitude and latitude in deg in the coordinate system of the map. Default is None. width : float or tuple or list or string, optional Width of the map in degrees. A tuple will be interpreted as parameters for longitude and latitude axes. For maps with non-spatial dimensions, list input can be used to define a different map width in each image plane. Default is None. Returns ------- geom : `~WcsGeom` A WCS geometry object. Examples -------- >>> from gammapy.maps import WcsGeom >>> from gammapy.maps import MapAxis >>> axis = MapAxis.from_bounds(0,1,2) >>> geom = WcsGeom.create(npix=(100,100), binsz=0.1) >>> geom = WcsGeom.create(npix=(100,100), binsz="0.1deg") >>> geom = WcsGeom.create(npix=[100,200], binsz=[0.1,0.05], axes=[axis]) >>> geom = WcsGeom.create(npix=[100,200], binsz=["0.1deg","0.05deg"], axes=[axis]) >>> geom = WcsGeom.create(width=[5.0,8.0], binsz=[0.1,0.05], axes=[axis]) >>> geom = WcsGeom.create(npix=([100,200],[100,200]), binsz=0.1, axes=[axis]) """ if skydir is None: crval = (0.0, 0.0) elif isinstance(skydir, tuple): crval = skydir elif isinstance(skydir, SkyCoord): xref, yref, frame = skycoord_to_lonlat(skydir, frame=frame) crval = (xref, yref) else: raise ValueError(f"Invalid type for skydir: {type(skydir)!r}") if width is not None: width = _check_width(width) binsz = _check_binsz(binsz) shape = max([get_shape(t) for t in [npix, binsz, width]]) binsz = cast_to_shape(binsz, shape, float) # If both npix and width are None then create an all-sky geometry if npix is None and width is None: width = (360.0, 180.0) if npix is None: width = cast_to_shape(width, shape, float) npix = ( np.rint(width[0] / binsz[0]).astype(int), np.rint(width[1] / binsz[1]).astype(int), ) else: npix = cast_to_shape(npix, shape, int) if refpix is None: nxpix = int(npix[0].flat[0]) nypix = int(npix[1].flat[0]) refpix = ((nxpix + 1) / 2.0, (nypix + 1) / 2.0) # get frame class frame = SkyCoord(np.nan, np.nan, frame=frame, unit="deg").frame wcs = celestial_frame_to_wcs(frame, projection=proj) wcs.wcs.crpix = refpix wcs.wcs.crval = crval cdelt = (-binsz[0].flat[0], binsz[1].flat[0]) wcs.wcs.cdelt = cdelt wcs.array_shape = npix[1].flat[0], npix[0].flat[0] wcs.wcs.datfix() return cls(wcs, npix, cdelt=binsz, axes=axes) @property def footprint(self): """Footprint of the geometry as a `~astropy.coordinates.SkyCoord`.""" coords = self.wcs.calc_footprint() return SkyCoord(coords, frame=self.frame, unit="deg") @property def footprint_rectangle_sky_region(self): """Footprint of the geometry as a `~regions.RectangleSkyRegion`.""" width, height = self.width return RectangleSkyRegion( center=self.center_skydir, width=width[0], height=height[0] ) @classmethod def from_aligned(cls, geom, skydir, width): """Create an aligned geometry from an existing one. Parameters ---------- geom : `~WcsGeom` A reference WCS geometry object. skydir : tuple or `~astropy.coordinates.SkyCoord` Sky position of map center. 
Can be either a SkyCoord object or a tuple of longitude and latitude in degrees in the coordinate system of the map. width : float or tuple or list or string Width of the map in degrees. A tuple will be interpreted as parameters for longitude and latitude axes. For maps with non-spatial dimensions, list input can be used to define a different map width in each image plane. Returns ------- geom : `~WcsGeom` An aligned WCS geometry object with specified size and center. """ width = _check_width(width) * u.deg npix = tuple(np.round(width / geom.pixel_scales).astype(int)) xref, yref = geom.to_image().coord_to_pix(skydir) xref = int(np.floor(-xref[0] + npix[0] / 2.0)) + geom.wcs.wcs.crpix[0] yref = int(np.floor(-yref[0] + npix[1] / 2.0)) + geom.wcs.wcs.crpix[1] return cls.create( skydir=tuple(geom.wcs.wcs.crval), npix=npix, refpix=(xref, yref), frame=geom.frame, binsz=tuple(geom.pixel_scales.deg), axes=geom.axes, proj=geom.projection, ) @classmethod def from_header(cls, header, hdu_bands=None, format="gadf"): """Create a WCS geometry object from a FITS header. Parameters ---------- header : `~astropy.io.fits.Header` The FITS header. hdu_bands : `~astropy.io.fits.BinTableHDU`, optional The BANDS table HDU. Default is None. format : {'gadf', 'fgst-ccube','fgst-template'}, optional FITS format convention. Default is "gadf". Returns ------- wcs : `~WcsGeom` WCS geometry object. """ wcs = WCS(header, naxis=2).sub(2) axes = MapAxes.from_table_hdu(hdu_bands, format=format) shape = axes.shape if hdu_bands is not None and "NPIX" in hdu_bands.columns.names: npix = hdu_bands.data.field("NPIX").reshape(shape + (2,)) npix = (npix[..., 0], npix[..., 1]) cdelt = hdu_bands.data.field("CDELT").reshape(shape + (2,)) cdelt = (cdelt[..., 0], cdelt[..., 1]) elif "WCSSHAPE" in header: wcs_shape = eval(header["WCSSHAPE"]) npix = (wcs_shape[0], wcs_shape[1]) cdelt = None wcs.array_shape = npix else: npix = (header["NAXIS1"], header["NAXIS2"]) cdelt = None return cls(wcs, npix, cdelt=cdelt, axes=axes) def _make_bands_cols(self): cols = [] if not self.is_regular: cols += [ fits.Column( "NPIX", "2I", dim="(2)", array=np.vstack((np.ravel(self.npix[0]), np.ravel(self.npix[1]))).T, ) ] cols += [ fits.Column( "CDELT", "2E", dim="(2)", array=np.vstack( (np.ravel(self._cdelt[0]), np.ravel(self._cdelt[1])) ).T, ) ] cols += [ fits.Column( "CRPIX", "2E", dim="(2)", array=np.vstack( (np.ravel(self._crpix[0]), np.ravel(self._crpix[1])) ).T, ) ] return cols def to_header(self): header = self.wcs.to_header() header.update(self.axes.to_header()) shape = "{},{}".format(np.max(self.npix[0]), np.max(self.npix[1])) for ax in self.axes: shape += f",{ax.nbin}" header["WCSSHAPE"] = f"({shape})" return header def get_idx(self, idx=None, flat=False): pix = self.get_pix(idx=idx, mode="center") if flat: pix = tuple([p[np.isfinite(p)] for p in pix]) return pix_tuple_to_idx(pix) def _get_pix_all( self, idx=None, mode="center", sparse=False, axis_name=("lon", "lat") ): """Get index coordinate array without footprint of the projection applied.""" pix_all = [] for name, nbin in zip(self.axes_names, self._shape): if mode == "edges" and name in axis_name: pix = np.arange(-0.5, nbin, dtype=float) else: pix = np.arange(nbin, dtype=float) pix_all.append(pix) # TODO: improve varying bin size coordinate handling if idx is not None: pix_all = pix_all[self._slice_spatial_axes] + [float(t) for t in idx] return np.meshgrid(*pix_all[::-1], indexing="ij", sparse=sparse)[::-1] def get_pix(self, idx=None, mode="center"): """Get map pixel coordinates from the 
geometry. Parameters ---------- mode : {'center', 'edges'}, optional Get center or edge pix coordinates for the spatial axes. Default is "center". Returns ------- coord : tuple Map pixel coordinate tuple. """ pix = self._get_pix_all(idx=idx, mode=mode) coords = self.pix_to_coord(pix) m = np.isfinite(coords[0]) for _ in pix: _[~m] = INVALID_INDEX.float return pix def get_coord( self, idx=None, mode="center", frame=None, sparse=False, axis_name=None ): """Get map coordinates from the geometry. Parameters ---------- mode : {'center', 'edges'}, optional Get center or edge coordinates for the spatial axes. Default is "center". frame : str or `~astropy.coordinates.Frame`, optional Coordinate frame. Default is None. sparse : bool, optional Compute sparse coordinates. Default is False. axis_name : str, optional If mode = "edges", the edges will be returned for this axis. Default is None. Returns ------- coord : `~MapCoord` Map coordinate object. """ if axis_name is None: axis_name = ("lon", "lat") if frame is None: frame = self.frame pix = self._get_pix_all(idx=idx, mode=mode, sparse=sparse, axis_name=axis_name) data = self.pix_to_coord(pix) coords = MapCoord.create( data=data, frame=self.frame, axis_names=self.axes.names ) return coords.to_frame(frame) def coord_to_pix(self, coords): coords = MapCoord.create(coords, frame=self.frame, axis_names=self.axes.names) if coords.size == 0: return tuple([np.array([]) for i in range(coords.ndim)]) # Variable Bin Size if not self.is_regular: idxs = self.axes.coord_to_idx(coords, clip=True) crpix = [t[idxs] for t in self._crpix] cdelt = [t[idxs] for t in self._cdelt] pix = world2pix(self.wcs, cdelt, crpix, (coords.lon, coords.lat)) pix = list(pix) else: pix = self._wcs.wcs_world2pix(coords.lon, coords.lat, 0) pix += self.axes.coord_to_pix(coords) return tuple(pix) def pix_to_coord(self, pix): # Variable Bin Size if not self.is_regular: idxs = pix_tuple_to_idx(pix[self._slice_non_spatial_axes]) crpix = [t[idxs] for t in self._crpix] cdelt = [t[idxs] for t in self._cdelt] coords = pix2world(self.wcs, cdelt, crpix, pix[self._slice_spatial_axes]) else: coords = self._wcs.wcs_pix2world(pix[0], pix[1], 0) coords = ( u.Quantity(coords[0], unit="deg", copy=COPY_IF_NEEDED), u.Quantity(coords[1], unit="deg", copy=COPY_IF_NEEDED), ) coords += self.axes.pix_to_coord(pix[self._slice_non_spatial_axes]) return coords def pix_to_idx(self, pix, clip=False): pix = pix_tuple_to_idx(pix) idx_non_spatial = self.axes.pix_to_idx( pix[self._slice_non_spatial_axes], clip=clip ) if not self.is_regular: npix = (self.npix[0][idx_non_spatial], self.npix[1][idx_non_spatial]) else: npix = self.npix idx_spatial = [] for idx, npix_ in zip(pix[self._slice_spatial_axes], npix): if clip: idx = np.clip(idx, 0, npix_) else: idx = np.where((idx < 0) | (idx >= npix_), -1, idx) idx_spatial.append(idx) return tuple(idx_spatial) + idx_non_spatial def contains(self, coords): idx = self.coord_to_idx(coords) return np.all(np.stack([t != INVALID_INDEX.int for t in idx]), axis=0) def to_image(self): return self._image_geom @lazyproperty def _image_geom(self): npix = (np.max(self._npix[0]), np.max(self._npix[1])) cdelt = (np.max(self._cdelt[0]), np.max(self._cdelt[1])) return self.__class__(self._wcs, npix, cdelt=cdelt) def to_cube(self, axes): npix = (np.max(self._npix[0]), np.max(self._npix[1])) cdelt = (np.max(self._cdelt[0]), np.max(self._cdelt[1])) axes = copy.deepcopy(self.axes) + axes return self.__class__( self._wcs.deepcopy(), npix, cdelt=cdelt, axes=axes, ) def _pad_spatial(self, pad_width): if 
np.isscalar(pad_width): pad_width = (pad_width, pad_width) npix = (self.npix[0] + 2 * pad_width[0], self.npix[1] + 2 * pad_width[1]) wcs = self._wcs.deepcopy() wcs.wcs.crpix += np.array(pad_width) cdelt = copy.deepcopy(self._cdelt) return self.__class__(wcs, npix, cdelt=cdelt, axes=copy.deepcopy(self.axes)) def crop(self, crop_width): if np.isscalar(crop_width): crop_width = (crop_width, crop_width) npix = (self.npix[0] - 2 * crop_width[0], self.npix[1] - 2 * crop_width[1]) wcs = self._wcs.deepcopy() wcs.wcs.crpix -= np.array(crop_width) cdelt = copy.deepcopy(self._cdelt) return self.__class__(wcs, npix, cdelt=cdelt, axes=copy.deepcopy(self.axes)) def downsample(self, factor, axis_name=None): if axis_name is None: if np.any(np.mod(self.npix, factor) > 0): raise ValueError( f"Spatial shape not divisible by factor {factor!r} in all axes." f" You need to pad prior to calling downsample." ) npix = (self.npix[0] / factor, self.npix[1] / factor) cdelt = (self._cdelt[0] * factor, self._cdelt[1] * factor) wcs = get_resampled_wcs(self.wcs, factor, True) return self._init_copy(wcs=wcs, npix=npix, cdelt=cdelt) else: if not self.is_regular: raise NotImplementedError( "Downsampling in non-spatial axes not supported for irregular geometries" ) axes = self.axes.downsample(factor=factor, axis_name=axis_name) return self._init_copy(axes=axes) def upsample(self, factor, axis_name=None): if axis_name is None: npix = (self.npix[0] * factor, self.npix[1] * factor) cdelt = (self._cdelt[0] / factor, self._cdelt[1] / factor) wcs = get_resampled_wcs(self.wcs, factor, False) return self._init_copy(wcs=wcs, npix=npix, cdelt=cdelt) else: if not self.is_regular: raise NotImplementedError( "Upsampling in non-spatial axes not supported for irregular geometries" ) axes = self.axes.upsample(factor=factor, axis_name=axis_name) return self._init_copy(axes=axes) def to_binsz(self, binsz): """Change pixel size of the geometry. Parameters ---------- binsz : float or tuple or list New pixel size in degree. Returns ------- geom : `WcsGeom` Geometry with new pixel size. """ return self.create( skydir=self.center_skydir, binsz=binsz, width=self.width, proj=self.projection, frame=self.frame, axes=copy.deepcopy(self.axes), ) def solid_angle(self): """Solid angle array as a `~astropy.units.Quantity` in ``sr``. The array has the same dimension as the WcsGeom object if the spatial shape is not unique along the extra axis, otherwise the array shape matches the spatial dimensions.
To return solid angles for the spatial dimensions only use:: WcsGeom.to_image().solid_angle() """ return self._solid_angle @lazyproperty def _solid_angle(self): if self.is_regular: coord = self.to_image().get_coord(mode="edges").skycoord else: coord = self.get_coord(mode="edges").skycoord # define pixel corners low_left = coord[..., :-1, :-1] low_right = coord[..., 1:, :-1] up_left = coord[..., :-1, 1:] up_right = coord[..., 1:, 1:] # compute side lengths low = low_left.separation(low_right) left = low_left.separation(up_left) up = up_left.separation(up_right) right = low_right.separation(up_right) # compute enclosed angles angle_low_right = low_right.position_angle(up_right) - low_right.position_angle( low_left ) angle_up_left = up_left.position_angle(up_right) - low_left.position_angle( up_left ) # compute area assuming a planar triangle area_low_right = 0.5 * low * right * np.sin(angle_low_right) area_up_left = 0.5 * up * left * np.sin(angle_up_left) # TODO: for non-negative cdelt a negative solid angle is returned # find out why and fix properly value = np.abs( u.Quantity(area_low_right + area_up_left, "sr", copy=COPY_IF_NEEDED) ) if self.is_regular: value = value.reshape(self.data_shape_image) return value def bin_volume(self): """Bin volume as a `~astropy.units.Quantity`.""" return self._bin_volume @lazyproperty def _bin_volume(self): """Cached property of bin volume.""" value = self.to_image().solid_angle() if not self.is_image: value = value * self.axes.bin_volume() return value def separation(self, center): """Compute sky separation with respect to a given center. Parameters ---------- center : `~astropy.coordinates.SkyCoord` Center position. Returns ------- separation : `~astropy.coordinates.Angle` Separation angle array (2D). """ coord = self.to_image().get_coord() return center.separation(coord.skycoord) def cutout(self, position, width, mode="trim", odd_npix=False, min_npix=1): """ Create a cutout around a given position. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. width : tuple of `~astropy.coordinates.Angle` Angular sizes of the region in (lon, lat) in that specific order. If only one value is passed, a square region is extracted. mode : {'trim', 'partial', 'strict'}, optional Mode option for Cutout2D, for details see `~astropy.nddata.utils.Cutout2D`. Default is "trim". odd_npix : bool, optional Force width to odd number of pixels. Default is False. min_npix : int, optional Force width to a minimum number of pixels. Default is 1. Returns ------- cutout : `~gammapy.maps.WcsGeom` Cutout geometry. """ width = _check_width(width) * u.deg binsz = self.pixel_scales width_npix = np.clip((width / binsz).to_value(""), min_npix, None) if odd_npix: width_npix = round_up_to_odd(width_npix) dummy_data = np.empty(self.to_image().data_shape, dtype=bool) c2d = Cutout2D( data=dummy_data, wcs=self.wcs, position=position, # Cutout2D takes size with order (lat, lon) size=width_npix[::-1], mode=mode, ) return self._init_copy(wcs=c2d.wcs, npix=c2d.shape[::-1]) def boundary_mask(self, width): """Create a mask applying binary erosion with a given width from geometry edges. Parameters ---------- width : tuple of `~astropy.units.Quantity` Angular sizes of the margin in (lon, lat) in that specific order. If only one value is passed, the same margin is applied in (lon, lat). Returns ------- mask_map : `~gammapy.maps.WcsNDMap` of boolean type Boundary mask.
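Examples
--------
A minimal usage sketch (assuming a 10 deg x 10 deg image with 1 deg pixels; pixels within roughly one pixel of the map edge end up False)::

    from gammapy.maps import WcsGeom
    geom = WcsGeom.create(npix=(10, 10), binsz=1)
    mask = geom.boundary_mask(width="1 deg")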
""" from .ndmap import WcsNDMap data = np.ones(self.data_shape, dtype=bool) return WcsNDMap.from_geom(self, data=data).binary_erode( width=2 * u.Quantity(width), kernel="box" ) def region_mask(self, regions, inside=True): """Create a mask from a given list of regions. The mask is filled such that a pixel inside the region is filled with "True". To invert the mask, e.g. to create a mask with exclusion regions the tilde (~) operator can be used (see example below). Parameters ---------- regions : str, `~regions.Region` or list of `~regions.Region` Region or list of regions (pixel or sky regions accepted). A region can be defined as a string ind DS9 format as well. See http://ds9.si.edu/doc/ref/region.html for details. inside : bool, optional For ``inside=True``, set pixels in the region to True. For ``inside=False``, set pixels in the region to False. Default is True. Returns ------- mask_map : `~gammapy.maps.WcsNDMap` of boolean type Boolean region mask. Examples -------- Make an exclusion mask for a circular region:: from regions import CircleSkyRegion from astropy.coordinates import SkyCoord, Angle from gammapy.maps import WcsNDMap, WcsGeom pos = SkyCoord(0, 0, unit='deg') geom = WcsGeom.create(skydir=pos, npix=100, binsz=0.1) region = CircleSkyRegion( SkyCoord(3, 2, unit='deg'), Angle(1, 'deg'), ) # the Gammapy convention for exclusion regions is to take the inverse mask = ~geom.region_mask([region]) Note how we made a list with a single region, since this method expects a list of regions. """ from gammapy.maps import Map, RegionGeom if not self.is_regular: raise ValueError("Multi-resolution maps not supported yet") geom = RegionGeom.from_regions(regions, wcs=self.wcs) idx = self.get_idx() mask = geom.contains_wcs_pix(idx) if not inside: np.logical_not(mask, out=mask) return Map.from_geom(self, data=mask) def region_weights(self, regions, oversampling_factor=10): """Compute regions weights. Parameters ---------- regions : str, `~regions.Region` or list of `~regions.Region` Region or list of regions (pixel or sky regions accepted). A region can be defined as a string ind DS9 format as well. See http://ds9.si.edu/doc/ref/region.html for details. oversampling_factor : int, optional Over-sampling factor to compute the region weights. Default is 10. Returns ------- map : `~gammapy.maps.WcsNDMap` of boolean type Weights region mask. """ geom = self.upsample(factor=oversampling_factor) m = geom.region_mask(regions=regions) m.data = m.data.astype(float) return m.downsample(factor=oversampling_factor, preserve_counts=False) def binary_structure(self, width, kernel="disk"): """Get binary structure. Parameters ---------- width : `~astropy.units.Quantity`, str or float If a float is given it interpreted as width in pixels. If an (angular) quantity is given it converted to pixels using ``geom.wcs.wcs.cdelt``. The width corresponds to radius in case of a disk kernel, and the side length in case of a box kernel. kernel : {'disk', 'box'}, optional Kernel shape. Default is "disk". Returns ------- structure : `~numpy.ndarray` Binary structure. 
""" width = u.Quantity(width) if width.unit.is_equivalent("deg"): width = width / self.pixel_scales width = round_up_to_odd(width.to_value("")) if kernel == "disk": disk = Tophat2DKernel(width[0]) disk.normalize("peak") structure = disk.array elif kernel == "box": structure = np.ones(width) else: raise ValueError(f"Invalid kernel: {kernel!r}") shape = (1,) * len(self.axes) + structure.shape return structure.reshape(shape) def __str__(self): lon = self.center_skydir.data.lon.deg lat = self.center_skydir.data.lat.deg lon_ref, lat_ref = self.wcs.wcs.crval return ( f"{self.__class__.__name__}\n\n" f"\taxes : {self.axes_names}\n" f"\tshape : {self.data_shape[::-1]}\n" f"\tndim : {self.ndim}\n" f"\tframe : {self.frame}\n" f"\tprojection : {self.projection}\n" f"\tcenter : {lon:.1f} deg, {lat:.1f} deg\n" f"\twidth : {self.width[0][0]:.1f} x {self.width[1][0]:.1f}\n" f"\twcs ref : {lon_ref:.1f} deg, {lat_ref:.1f} deg\n" ) def to_odd_npix(self, max_radius=None): """Create a new geometry object with an odd number of pixels and a maximum size. This is useful for PSF kernel creation. Parameters ---------- max_radius : `~astropy.units.Quantity`, optional Maximum radius of the geometry (half the width). Default is None. Returns ------- geom : `WcsGeom` Geometry with odd number of pixels. """ if max_radius is None: width = self.width.max() else: width = 2 * u.Quantity(max_radius) binsz = self.pixel_scales.max() width_npix = (width / binsz).to_value("") npix = round_up_to_odd(width_npix) return WcsGeom.create( skydir=self.center_skydir, binsz=binsz, npix=npix, proj=self.projection, frame=self.frame, axes=self.axes, ) def to_even_npix(self): """Create a new geometry object with an even number of pixels and a maximum size. Returns ------- geom : `WcsGeom` Geometry with odd number of pixels. """ width = self.width.max() binsz = self.pixel_scales.max() width_npix = (width / binsz).to_value("") npix = round_up_to_even(width_npix) return WcsGeom.create( skydir=self.center_skydir, binsz=binsz, npix=npix, proj=self.projection, frame=self.frame, axes=self.axes, ) def is_aligned(self, other, tolerance=1e-6): """Check if WCS and extra axes are aligned. Parameters ---------- other : `WcsGeom` Other geometry. tolerance : float, optional Tolerance for the comparison. Default is 1e-6. Returns ------- aligned : bool Whether geometries are aligned. """ for axis, otheraxis in zip(self.axes, other.axes): if axis != otheraxis: return False # check WCS consistency with a priori tolerance of 1e-6 return self.wcs.wcs.compare(other.wcs.wcs, cmp=2, tolerance=tolerance) def is_allclose(self, other, rtol_axes=1e-6, atol_axes=1e-6, rtol_wcs=1e-6): """Compare two data IRFs for equivalency. Parameters ---------- other : `WcsGeom` Geom to compare against. rtol_axes : float, optional Relative tolerance for the axes comparison. Default is 1e-6. atol_axes : float, optional Relative tolerance for the axes comparison. Default is 1e-6. rtol_wcs : float, optional Relative tolerance for the WCS comparison. Default is 1e-6. Returns ------- is_allclose : bool Whether the geometry is all close. 
""" if not isinstance(other, self.__class__): return TypeError(f"Cannot compare {type(self)} and {type(other)}") if self.data_shape != other.data_shape: return False axes_eq = self.axes.is_allclose(other.axes, rtol=rtol_axes, atol=atol_axes) # check WCS consistency with a priori tolerance of 1e-6 # cmp=1 parameter ensures no comparison with ancillary information # see https://github.com/astropy/astropy/pull/4522/files wcs_eq = self.wcs.wcs.compare(other.wcs.wcs, cmp=1, tolerance=rtol_wcs) return axes_eq and wcs_eq def __eq__(self, other): if not isinstance(other, self.__class__): return False if not (self.is_regular and other.is_regular): raise NotImplementedError( "Geom comparison is not possible for irregular geometries." ) return self.is_allclose(other=other, rtol_wcs=1e-6, rtol_axes=1e-6) def __ne__(self, other): return not self.__eq__(other) def __hash__(self): return id(self) def pix2world(wcs, cdelt, crpix, pix): """Perform pixel to world coordinate transformation. For a WCS projection with a given pixel size (CDELT) and reference pixel (CRPIX). This method can be used to perform WCS transformations for projections with different pixelizations but the same reference coordinate (CRVAL), projection type, and coordinate system. Parameters ---------- wcs : `~astropy.wcs.WCS` WCS transform object. cdelt : tuple Tuple of X/Y pixel size in deg. Each element should have the same length as ``pix``. crpix : tuple Tuple of reference pixel parameters in X and Y dimensions. Each element should have the same length as ``pix``. pix : tuple Tuple of pixel coordinates. """ pix_ratio = [ np.abs(wcs.wcs.cdelt[0] / cdelt[0]), np.abs(wcs.wcs.cdelt[1] / cdelt[1]), ] pix = ( (pix[0] - (crpix[0] - 1.0)) / pix_ratio[0] + wcs.wcs.crpix[0] - 1.0, (pix[1] - (crpix[1] - 1.0)) / pix_ratio[1] + wcs.wcs.crpix[1] - 1.0, ) return wcs.wcs_pix2world(pix[0], pix[1], 0) def world2pix(wcs, cdelt, crpix, coord): pix_ratio = [ np.abs(wcs.wcs.cdelt[0] / cdelt[0]), np.abs(wcs.wcs.cdelt[1] / cdelt[1]), ] pix = wcs.wcs_world2pix(coord[0], coord[1], 0) return ( (pix[0] - (wcs.wcs.crpix[0] - 1.0)) * pix_ratio[0] + crpix[0] - 1.0, (pix[1] - (wcs.wcs.crpix[1] - 1.0)) * pix_ratio[1] + crpix[1] - 1.0, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/wcs/io.py0000644000175100001770000000044614721316200016565 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst def identify_wcs_format(hdu): if hdu is None: return "gadf" elif hdu.name == "ENERGIES": return "fgst-template" elif hdu.name == "EBOUNDS": return "fgst-ccube" else: return "gadf" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/wcs/ndmap.py0000644000175100001770000011335214721316200017256 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging from itertools import repeat import numpy as np import scipy.interpolate import scipy.ndimage as ndi import scipy.signal import astropy.units as u from astropy.convolution import Tophat2DKernel from astropy.coordinates import SkyCoord from astropy.io import fits from astropy.nddata import block_reduce from regions import PixCoord, PointPixelRegion, PointSkyRegion, SkyRegion import matplotlib.colors as mpcolors import matplotlib.pyplot as plt import gammapy.utils.parallel as parallel from gammapy.utils.interpolation import ScaledRegularGridInterpolator from gammapy.utils.units import unit_from_fits_image_hdu from 
gammapy.visualization.utils import add_colorbar from ..geom import pix_tuple_to_idx from ..utils import INVALID_INDEX from .core import WcsMap from .geom import WcsGeom __all__ = ["WcsNDMap"] log = logging.getLogger(__name__) C_MAP_MASK = mpcolors.ListedColormap(["black", "white"], name="mask") class WcsNDMap(WcsMap): """WCS map with any number of non-spatial dimensions. This class uses an ND numpy array to store map values. For maps with non-spatial dimensions and variable pixel size it will allocate an array with dimensions commensurate with the largest image plane. Parameters ---------- geom : `~gammapy.maps.WcsGeom` WCS geometry object. data : `~numpy.ndarray`, optional Data array. If None, an empty array will be allocated. dtype : str, optional Data type. Default is "float32". meta : `dict`, optional Dictionary to store metadata. unit : str or `~astropy.units.Unit`, optional The map unit. """ def __init__(self, geom, data=None, dtype="float32", meta=None, unit=""): # TODO: Figure out how to mask pixels for integer data types data_shape = geom.data_shape if data is None: data = self._make_default_data(geom, data_shape, dtype) super().__init__(geom, data, meta, unit) @staticmethod def _make_default_data(geom, shape_np, dtype): # Check whether corners of each image plane are valid data = np.zeros(shape_np, dtype=dtype) if not geom.is_regular or geom.is_allsky: coords = geom.get_coord() is_nan = np.isnan(coords.lon) data[is_nan] = np.nan return data @classmethod def from_hdu(cls, hdu, hdu_bands=None, format=None): """Make a WcsNDMap object from a FITS HDU. Parameters ---------- hdu : `~astropy.io.fits.BinTableHDU` or `~astropy.io.fits.ImageHDU` The map FITS HDU. hdu_bands : `~astropy.io.fits.BinTableHDU` The BANDS table HDU. format : {'gadf', 'fgst-ccube', 'fgst-template'} FITS format convention. Returns ------- map : `WcsNDMap` WCS map. """ geom = WcsGeom.from_header(hdu.header, hdu_bands, format=format) shape = geom.axes.shape shape_wcs = tuple([np.max(geom.npix[0]), np.max(geom.npix[1])]) meta = cls._get_meta_from_header(hdu.header) unit = unit_from_fits_image_hdu(hdu.header) # TODO: Should we support extracting slices? if isinstance(hdu, fits.BinTableHDU): map_out = cls(geom, meta=meta, unit=unit) pix = hdu.data.field("PIX") pix = np.unravel_index(pix, shape_wcs[::-1]) vals = hdu.data.field("VALUE") if "CHANNEL" in hdu.data.columns.names and shape: chan = hdu.data.field("CHANNEL") chan = np.unravel_index(chan, shape[::-1]) idx = chan + pix else: idx = pix map_out.set_by_idx(idx[::-1], vals) else: if any(x in hdu.name.lower() for x in ["mask", "is_ul", "success"]): data = hdu.data.astype(bool) else: data = hdu.data map_out = cls(geom=geom, meta=meta, data=data, unit=unit) return map_out def get_by_idx(self, idx): idx = pix_tuple_to_idx(idx) return self.data.T[idx] def interp_by_coord( self, coords, method="linear", fill_value=None, values_scale="lin" ): """Interpolate map values at the given map coordinates. Parameters ---------- coords : tuple, dict or `~gammapy.maps.MapCoord` Coordinate arrays for each dimension of the map. Tuple should be ordered as (lon, lat, x_0, ..., x_n) where x_i are coordinates for non-spatial dimensions of the map. method : {"linear", "nearest"} Method to interpolate data values. By default linear interpolation is performed. fill_value : None or float value The value to use for points outside of the interpolation domain. If None, values outside the domain are extrapolated.
values_scale : {"lin", "log", "sqrt"} Optional value scaling. Default is "lin". Returns ------- vals : `~numpy.ndarray` Interpolated pixel values. """ if self.geom.is_regular: pix = self.geom.coord_to_pix(coords) return self.interp_by_pix( pix, method=method, fill_value=fill_value, values_scale=values_scale ) else: return self._interp_by_coord_griddata(coords, method=method) def interp_by_pix(self, pix, method="linear", fill_value=None, values_scale="lin"): if not self.geom.is_regular: raise ValueError("interp_by_pix only supported for regular geom.") grid_pix = [np.arange(n, dtype=float) for n in self.data.shape[::-1]] if np.any(np.isfinite(self.data)): data = self.data.copy().T data[~np.isfinite(data)] = 0.0 else: data = self.data.T fn = ScaledRegularGridInterpolator( grid_pix, data, fill_value=None, bounds_error=False, method=method, values_scale=values_scale, ) interp_data = fn(tuple(pix), clip=False) if fill_value is not None: idxs = self.geom.pix_to_idx(pix, clip=False) invalid = np.broadcast_arrays(*[idx == -1 for idx in idxs]) mask = np.any(invalid, axis=0) if not interp_data.shape: mask = mask.squeeze() interp_data[mask] = fill_value interp_data[~np.isfinite(interp_data)] = fill_value return interp_data def _interp_by_coord_griddata(self, coords, method="linear"): grid_coords = self.geom.get_coord() data = self.data[np.isfinite(self.data)] vals = scipy.interpolate.griddata( tuple(grid_coords.flat), data, tuple(coords), method=method ) m = ~np.isfinite(vals) if np.any(m): vals_fill = scipy.interpolate.griddata( tuple(grid_coords.flat), data, tuple([c[m] for c in coords]), method="nearest", ) vals[m] = vals_fill return vals def _resample_by_idx(self, idx, weights=None, preserve_counts=False): idx = pix_tuple_to_idx(idx) msk = np.all(np.stack([t != INVALID_INDEX.int for t in idx]), axis=0) idx = [t[msk] for t in idx] if weights is not None: if isinstance(weights, u.Quantity): weights = weights.to_value(self.unit) weights = weights[msk] idx = np.ravel_multi_index(idx, self.data.T.shape) idx, idx_inv = np.unique(idx, return_inverse=True) weights = np.bincount(idx_inv, weights=weights).astype(self.data.dtype) if not preserve_counts: weights /= np.bincount(idx_inv).astype(self.data.dtype) self.data.T.flat[idx] += weights def fill_by_idx(self, idx, weights=None): return self._resample_by_idx(idx, weights=weights, preserve_counts=True) def set_by_idx(self, idx, vals): idx = pix_tuple_to_idx(idx) self.data.T[idx] = vals def _pad_spatial( self, pad_width, axis_name=None, mode="constant", cval=0, method="linear" ): if axis_name is None: if np.isscalar(pad_width): pad_width = (pad_width, pad_width) if len(pad_width) == 2: pad_width += (0,) * (self.geom.ndim - 2) geom = self.geom._pad_spatial(pad_width[:2]) if self.geom.is_regular and mode != "interp": return self._pad_np(geom, pad_width, mode, cval) else: return self._pad_coadd(geom, pad_width, mode, cval, method) def _pad_np(self, geom, pad_width, mode, cval): """Pad a map using `~numpy.pad`. This method only works for regular geometries but should be more efficient when working with large maps. 
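Note that the padding is symmetric, i.e. ``pad_width`` pixels are added on each side of the corresponding spatial axis.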
""" kwargs = {} if mode == "constant": kwargs["constant_values"] = cval pad_width = [(t, t) for t in pad_width] data = np.pad(self.data, pad_width[::-1], mode, **kwargs) return self._init_copy(geom=geom, data=data) def _pad_coadd(self, geom, pad_width, mode, cval, method): """Pad a map manually by co-adding the original map with the new map.""" idx_in = self.geom.get_idx(flat=True) idx_in = tuple([t + w for t, w in zip(idx_in, pad_width)])[::-1] idx_out = geom.get_idx(flat=True)[::-1] map_out = self._init_copy(geom=geom, data=None) map_out.coadd(self) if mode == "constant": pad_msk = np.zeros_like(map_out.data, dtype=bool) pad_msk[idx_out] = True pad_msk[idx_in] = False map_out.data[pad_msk] = cval elif mode == "interp": coords = geom.pix_to_coord(idx_out[::-1]) m = self.geom.contains(coords) coords = tuple([c[~m] for c in coords]) vals = self.interp_by_coord(coords, method=method) map_out.set_by_coord(coords, vals) else: raise ValueError(f"Invalid mode: {mode!r}") return map_out def crop(self, crop_width): if np.isscalar(crop_width): crop_width = (crop_width, crop_width) geom = self.geom.crop(crop_width) if self.geom.is_regular: slices = [slice(None)] * len(self.geom.axes) slices += [ slice(crop_width[1], int(self.geom.npix[1][0] - crop_width[1])), slice(crop_width[0], int(self.geom.npix[0][0] - crop_width[0])), ] data = self.data[tuple(slices)] map_out = self._init_copy(geom=geom, data=data) else: # FIXME: This could be done more efficiently by # constructing the appropriate slices for each image plane map_out = self._init_copy(geom=geom, data=None) map_out.coadd(self) return map_out def upsample(self, factor, order=0, preserve_counts=True, axis_name=None): if factor == 1 or factor is None: return self geom = self.geom.upsample(factor, axis_name=axis_name) idx = geom.get_idx() if axis_name is None: pix = ( (idx[0] - 0.5 * (factor - 1)) / factor, (idx[1] - 0.5 * (factor - 1)) / factor, ) + idx[2:] else: pix = list(idx) idx_ax = self.geom.axes_names.index(axis_name) pix[idx_ax] = (pix[idx_ax] - 0.5 * (factor - 1)) / factor if preserve_counts: data = self.data / self.geom.bin_volume().value else: data = self.data data = ndi.map_coordinates(data.T, tuple(pix), order=order, mode="nearest") if preserve_counts: data *= geom.bin_volume().value return self._init_copy(geom=geom, data=data.astype(self.data.dtype)) def downsample(self, factor, preserve_counts=True, axis_name=None, weights=None): if factor == 1 or factor is None: return self geom = self.geom.downsample(factor, axis_name=axis_name) if axis_name is None: block_size = (1,) * len(self.geom.axes) + (factor, factor) else: block_size = [1] * self.data.ndim idx = self.geom.axes.index_data(axis_name) block_size[idx] = factor func = np.nansum if preserve_counts else np.nanmean if weights is None: weights = 1 else: weights = weights.data data = block_reduce(self.data * weights, tuple(block_size), func=func) return self._init_copy(geom=geom, data=data.astype(self.data.dtype)) def plot( self, ax=None, fig=None, add_cbar=False, stretch="linear", axes_loc=None, kwargs_colorbar=None, **kwargs, ): """ Plot image on matplotlib WCS axes. Parameters ---------- ax : `~astropy.visualization.wcsaxes.WCSAxes`, optional WCS axis object to plot on. Default is None. fig : `~matplotlib.figure.Figure`, optional Figure object. Default is None. add_cbar : bool, optional Add color bar. Default is False. stretch : str, optional Passed to `astropy.visualization.simple_norm`. Default is "linear". 
axes_loc : dict, optional Keyword arguments passed to `~mpl_toolkits.axes_grid1.axes_divider.AxesDivider.append_axes`. Default is None. kwargs_colorbar : dict, optional Keyword arguments passed to `~matplotlib.pyplot.colorbar`. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.imshow`. Returns ------- ax : `~astropy.visualization.wcsaxes.WCSAxes` WCS axes object. """ from astropy.visualization import simple_norm if not self.geom.is_flat: raise TypeError("Use .plot_interactive() for Map dimension > 2") ax = self._plot_default_axes(ax=ax) if fig is None: fig = plt.gcf() if self.geom.is_image: data = self.data.astype(float) else: axis = tuple(np.arange(len(self.geom.axes))) data = np.squeeze(self.data, axis=axis).astype(float) kwargs.setdefault("interpolation", "nearest") kwargs.setdefault("origin", "lower") kwargs.setdefault("cmap", "afmhot") kwargs_colorbar = kwargs_colorbar or {} mask = np.isfinite(data) if self.is_mask: kwargs.setdefault("vmin", 0) kwargs.setdefault("vmax", 1) kwargs["cmap"] = C_MAP_MASK if mask.any(): min_cut, max_cut = kwargs.pop("vmin", None), kwargs.pop("vmax", None) try: norm = simple_norm(data[mask], stretch, vmin=min_cut, vmax=max_cut) except TypeError: # astropy <6.1 norm = simple_norm( data[mask], stretch, min_cut=min_cut, max_cut=max_cut ) kwargs.setdefault("norm", norm) im = ax.imshow(data, **kwargs) if add_cbar: label = str(self.unit) kwargs_colorbar.setdefault("label", label) add_colorbar(im, ax=ax, axes_loc=axes_loc, **kwargs_colorbar) if self.geom.is_allsky: ax = self._plot_format_allsky(ax) else: ax = self._plot_format(ax) # without this the axis limits are changed when calling scatter ax.autoscale(enable=False) return ax def plot_mask(self, ax=None, **kwargs): """Plot the mask as a shaded area. Parameters ---------- ax : `~astropy.visualization.wcsaxes.WCSAxes`, optional WCS axis object to plot on. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.contourf`. Returns ------- ax : `~astropy.visualization.wcsaxes.WCSAxes` WCS axes object. """ if not self.geom.is_flat: raise TypeError("Use .plot_interactive() for Map dimension > 2") if not self.is_mask: raise ValueError( "`.plot_mask()` only supports maps containing boolean values." ) ax = self._plot_default_axes(ax=ax) kwargs.setdefault("alpha", 0.5) kwargs.setdefault("colors", "w") data = np.squeeze(self.data).astype(float) ax.contourf(data, levels=[0, 0.5], **kwargs) if self.geom.is_allsky: ax = self._plot_format_allsky(ax) else: ax = self._plot_format(ax) # without this the axis limits are changed when calling scatter ax.autoscale(enable=False) return ax def _plot_default_axes(self, ax): from astropy.visualization.wcsaxes.frame import EllipticalFrame if ax is None: fig = plt.gcf() if self.geom.projection in ["AIT"]: ax = fig.add_subplot( 1, 1, 1, projection=self.geom.wcs, frame_class=EllipticalFrame ) else: ax = fig.add_subplot(1, 1, 1, projection=self.geom.wcs) return ax @staticmethod def _plot_format(ax): try: ax.coords["glon"].set_axislabel("Galactic Longitude") ax.coords["glat"].set_axislabel("Galactic Latitude") except KeyError: ax.coords["ra"].set_axislabel("Right Ascension") ax.coords["dec"].set_axislabel("Declination") except AttributeError: log.info("Can't set coordinate axes.
No WCS information available.") return ax def _plot_format_allsky(self, ax): # Remove frame ax.coords.frame.set_linewidth(0) # Set plot axis limits xmin, _ = self.geom.to_image().coord_to_pix({"lon": 180, "lat": 0}) xmax, _ = self.geom.to_image().coord_to_pix({"lon": -180, "lat": 0}) _, ymin = self.geom.to_image().coord_to_pix({"lon": 0, "lat": -90}) _, ymax = self.geom.to_image().coord_to_pix({"lon": 0, "lat": 90}) ax.set_xlim(xmin[0], xmax[0]) ax.set_ylim(ymin[0], ymax[0]) ax.text(0, ymax[0], self.geom.frame + " coords") # Grid and ticks glon_spacing, glat_spacing = 45, 15 lon, lat = ax.coords lon.set_ticks(spacing=glon_spacing * u.deg, color="w", alpha=0.8) lat.set_ticks(spacing=glat_spacing * u.deg) lon.set_ticks_visible(False) lon.set_major_formatter("d") lat.set_major_formatter("d") lon.set_ticklabel(color="w", alpha=0.8) lon.grid(alpha=0.2, linestyle="solid", color="w") lat.grid(alpha=0.2, linestyle="solid", color="w") return ax def cutout_and_mask_region(self, region=None): """Compute cutout and mask for a given region of the map. The function estimates the minimal cutout size that encloses the region. Parameters ---------- region : `~regions.Region`, optional Extended region. Default is None. Returns ------- cutout, mask : tuple of `WcsNDMap` Cutout and mask map. """ from gammapy.maps import RegionGeom if region is None: region = self.geom.footprint_rectangle_sky_region geom = RegionGeom.from_regions(regions=region, wcs=self.geom.wcs) cutout = self.cutout(position=geom.center_skydir, width=geom.width) mask = cutout.geom.to_image().region_mask([region]) return self.__class__(data=cutout.data, geom=cutout.geom, unit=self.unit), mask def to_region_nd_map( self, region=None, func=np.nansum, weights=None, method="nearest" ): """Get region ND map in a given region. By default, the whole map region is considered. Parameters ---------- region : `~regions.Region` or `~astropy.coordinates.SkyCoord`, optional Region. Default is None. func : numpy.func, optional Function to reduce the data. Default is np.nansum. For boolean Map, use np.any or np.all. weights : `WcsNDMap`, optional Array to be used as weights. The geometry must be equivalent. Default is None. method : {"nearest", "linear"}, optional How to interpolate if a position is given. Default is "nearest". Returns ------- spectrum : `~gammapy.maps.RegionNDMap` Spectrum in the given region. """ from gammapy.maps import RegionGeom, RegionNDMap if region is None: region = self.geom.footprint_rectangle_sky_region if weights is not None: if not self.geom == weights.geom: raise ValueError("Incompatible spatial geoms between map and weights") geom = RegionGeom.from_regions( regions=region, axes=self.geom.axes, wcs=self.geom.wcs ) if geom.is_all_point_sky_regions: coords = geom.get_coord() data = self.interp_by_coord(coords=coords, method=method) if weights is not None: data *= weights.interp_by_coord(coords=coords, method=method) # Casting needed as interp_by_coord transforms boolean data = data.astype(self.data.dtype) else: cutout, mask = self.cutout_and_mask_region(region=region) if weights is not None: weights_cutout = weights.cutout( position=geom.center_skydir, width=geom.width ) cutout.data *= weights_cutout.data idx_y, idx_x = np.where(mask) data = func(cutout.data[..., idx_y, idx_x], axis=-1) return RegionNDMap(geom=geom, data=data, unit=self.unit, meta=self.meta.copy()) def to_region_nd_map_histogram( self, region=None, bins_axis=None, nbin=100, density=False ): """Convert map into region map by histogramming.
By default, it creates a linearly spaced axis with 100 bins between (-max(abs(data)), max(abs(data))) within the given region. Parameters ---------- region : `~regions.Region`, optional Region to histogram over. Default is None. bins_axis : `MapAxis`, optional Binning of the histogram. Default is None. nbin : int, optional Number of bins to use if no bins_axis is given. Default is 100. density : bool, optional Normalize integral of the histogram to 1. Default is False. Returns ------- region_map : `RegionNDMap` Region map with histogram. Examples -------- This is how to use the method to create energy-dependent histograms:: from gammapy.maps import MapAxis, Map import numpy as np random_state = np.random.RandomState(seed=0) energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) data = Map.create(axes=[energy_axis], width=10, unit="cm2 s-1", binsz=0.02) data.data = random_state.normal( size=data.data.shape, loc=0, scale=np.array([1.0, 2.0, 3.0]).reshape((-1, 1, 1)) ) hist = data.to_region_nd_map_histogram() hist.plot(axis_name="bins") """ from gammapy.maps import MapAxis, RegionGeom, RegionNDMap if isinstance(region, (PointSkyRegion, SkyCoord)): raise ValueError("Histogram method not supported for point regions") cutout, mask = self.cutout_and_mask_region(region=region) idx_y, idx_x = np.where(mask) quantity = cutout.quantity[..., idx_y, idx_x] value = np.abs(quantity).max() if bins_axis is None: bins_axis = MapAxis.from_bounds( -value, value, nbin=nbin, interp="lin", unit=self.unit, name="bins", ) if not bins_axis.unit.is_equivalent(self.unit): raise ValueError("Unit of bins_axis must be equivalent to unit of map.") axes = [bins_axis] + list(self.geom.axes) geom_hist = RegionGeom(region=region, axes=axes, wcs=self.geom.wcs) # This is likely not the most efficient way to do this data = np.apply_along_axis( lambda a: np.histogram(a, bins=bins_axis.edges.value, density=density)[0], axis=-1, arr=quantity.to_value(bins_axis.unit), ) unit = 1.0 / bins_axis.unit if density else "" return RegionNDMap.from_geom(geom=geom_hist, data=data, unit=unit) def mask_contains_region(self, region): """Check if input region is contained in a boolean mask map. Parameters ---------- region : `~regions.SkyRegion` or `~regions.PixelRegion` Region or list of regions (pixel or sky regions accepted). Returns ------- contained : bool Whether region is contained in the mask. """ if not self.is_mask: raise ValueError("mask_contains_region is only supported for boolean masks") if not self.geom.is_image: raise ValueError("Method only supported for 2D images") if isinstance(region, SkyRegion): region = region.to_pixel(self.geom.wcs) if isinstance(region, PointPixelRegion): lon, lat = region.center.x, region.center.y contains = self.get_by_pix((lon, lat)) else: idx = self.geom.get_idx() coords_pix = PixCoord(idx[0][self.data], idx[1][self.data]) contains = region.contains(coords_pix) return np.any(contains) def binary_erode(self, width, kernel="disk", use_fft=True): """Binary erosion of boolean mask removing a given margin. Parameters ---------- width : `~astropy.units.Quantity`, str or float If a float is given it is interpreted as width in pixels. If an (angular) quantity is given it is converted to pixels using ``geom.wcs.wcs.cdelt``. The width corresponds to radius in case of a disk kernel, and the side length in case of a box kernel. kernel : {'disk', 'box'}, optional Kernel shape. Default is "disk". use_fft : bool, optional Use `scipy.signal.fftconvolve` if True. Otherwise, use `scipy.ndimage.binary_erosion`.
Default is True. Returns ------- map : `WcsNDMap` Eroded mask map. """ if not self.is_mask: raise ValueError("Binary operations only supported for boolean masks") structure = self.geom.binary_structure(width=width, kernel=kernel) if use_fft: return self.convolve(structure.squeeze(), method="fft") > ( structure.sum() - 1 ) data = ndi.binary_erosion(self.data, structure=structure) return self._init_copy(data=data) def binary_dilate(self, width, kernel="disk", use_fft=True): """Binary dilation of boolean mask adding a given margin. Parameters ---------- width : `~astropy.units.Quantity`, str or float If a float is given it is interpreted as width in pixels. If an (angular) quantity is given it is converted to pixels using ``geom.wcs.wcs.cdelt``. The width corresponds to radius in case of a disk kernel, and the side length in case of a box kernel. kernel : {'disk', 'box'}, optional Kernel shape. Default is "disk". use_fft : bool, optional Use `scipy.signal.fftconvolve` if True. Otherwise, use `scipy.ndimage.binary_dilation`. Default is True. Returns ------- map : `WcsNDMap` Dilated mask map. """ if not self.is_mask: raise ValueError("Binary operations only supported for boolean masks") structure = self.geom.binary_structure(width=width, kernel=kernel) if use_fft: return self.convolve(structure.squeeze(), method="fft") > 1 data = ndi.binary_dilation(self.data, structure=structure) return self._init_copy(data=data) def convolve(self, kernel, method="fft", mode="same"): """Convolve map with a kernel. If the kernel is two-dimensional, it is applied to all image planes in the same way. If the kernel is higher dimensional, it should either match the map in the number of dimensions or the map must be an image (no non-spatial axes). In that case, the corresponding kernel is selected and applied to every image plane or to the single input image, respectively. Parameters ---------- kernel : `~gammapy.irf.PSFKernel` or `numpy.ndarray` Convolution kernel. method : str, optional The method used by `~scipy.signal.convolve`. Default is 'fft'. mode : str, optional The convolution mode used by `~scipy.signal.convolve`. Default is 'same'. Returns ------- map : `WcsNDMap` Convolved map. """ from gammapy.irf import PSFKernel if self.geom.is_image and not isinstance(kernel, PSFKernel): if kernel.ndim > 2: raise ValueError( "Image convolution with 3D kernel requires a PSFKernel object" ) geom = self.geom.copy() if isinstance(kernel, PSFKernel): kmap = kernel.psf_kernel_map if not np.allclose( self.geom.pixel_scales.deg, kmap.geom.pixel_scales.deg, rtol=1e-5 ): raise ValueError("Pixel size of kernel and map not compatible.") kernel = kmap.data.astype(np.float32) if self.geom.is_image: geom = geom.to_cube(kmap.geom.axes) if mode == "full": pad_width = [0.5 * (width - 1) for width in kernel.shape[-2:]] geom = geom.pad(pad_width, axis_name=None) elif mode == "valid": raise NotImplementedError( "WcsNDMap.convolve: mode='valid' is not supported."
) shape_axes_kernel = kernel.shape[slice(0, -2)] if len(shape_axes_kernel) > 0: if not geom.shape_axes == shape_axes_kernel: raise ValueError( f"Incompatible shape between data {geom.shape_axes}" f" and kernel {shape_axes_kernel}" ) if self.geom.is_image and kernel.ndim == 3: indexes = range(kernel.shape[0]) images = repeat(self.data.astype(np.float32)) else: indexes = list(self.iter_by_image_index()) images = (self.data[idx] for idx in indexes) kernels = ( kernel[Ellipsis] if kernel.ndim == 2 else kernel[idx] for idx in indexes ) convolved = parallel.run_multiprocessing( self._convolve, zip( images, kernels, repeat(method), repeat(mode), ), task_name="Convolution", ) data = np.empty(geom.data_shape, dtype=np.float32) for idx_res, idx in enumerate(indexes): data[idx] = convolved[idx_res] return self._init_copy(data=data, geom=geom) @staticmethod def _convolve(image, kernel, method, mode): """Convolve using `~scipy.signal.convolve` without kwargs for parallel evaluation.""" return scipy.signal.convolve(image, kernel, method=method, mode=mode) def smooth(self, width, kernel="gauss", **kwargs): """Smooth the map. Iterates over 2D image planes, processing one at a time. Parameters ---------- width : `~astropy.units.Quantity`, str or float Smoothing width given as quantity or float. If a float is given it is interpreted as smoothing width in pixels. If an (angular) quantity is given it is converted to pixels using ``geom.wcs.wcs.cdelt``. It corresponds to the standard deviation in case of a Gaussian kernel, the radius in case of a disk kernel, and the side length in case of a box kernel. kernel : {'gauss', 'disk', 'box'}, optional Kernel shape. Default is "gauss". kwargs : dict Keyword arguments passed to `~scipy.ndimage.uniform_filter` ('box'), `~scipy.ndimage.gaussian_filter` ('gauss') or `~scipy.ndimage.convolve` ('disk'). Returns ------- image : `WcsNDMap` Smoothed image (a copy, the original object is unchanged). """ if isinstance(width, (u.Quantity, str)): width = u.Quantity(width) / self.geom.pixel_scales.mean() width = width.to_value("") smoothed_data = np.empty(self.data.shape, dtype=float) for img, idx in self.iter_by_image_data(): img = img.astype(float) if kernel == "gauss": data = ndi.gaussian_filter(img, width, **kwargs) elif kernel == "disk": disk = Tophat2DKernel(width) disk.normalize("integral") data = ndi.convolve(img, disk.array, **kwargs) elif kernel == "box": data = ndi.uniform_filter(img, width, **kwargs) else: raise ValueError(f"Invalid kernel: {kernel!r}") smoothed_data[idx] = data return self._init_copy(data=smoothed_data) def cutout(self, position, width, mode="trim", odd_npix=False, min_npix=1): """ Create a cutout around a given position. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. width : tuple of `~astropy.coordinates.Angle` Angular sizes of the region in (lon, lat) in that specific order. If only one value is passed, a square region is extracted. mode : {'trim', 'partial', 'strict'}, optional Mode option for Cutout2D, for details see `~astropy.nddata.utils.Cutout2D`. Default is "trim". odd_npix : bool, optional Force width to odd number of pixels. Default is False. min_npix : int, optional Force width to a minimum number of pixels. Default is 1. Returns ------- cutout : `~gammapy.maps.WcsNDMap` Cutout map.
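Examples -------- A minimal sketch; position and width are illustrative:: from astropy.coordinates import SkyCoord from gammapy.maps import WcsNDMap m = WcsNDMap.create(skydir=(0, 0), npix=10, binsz=0.5) cutout = m.cutout(position=SkyCoord(0, 0, unit="deg"), width="2 deg")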
""" geom_cutout = self.geom.cutout( position=position, width=width, mode=mode, odd_npix=odd_npix, min_npix=min_npix, ) cutout_info = geom_cutout.cutout_slices(self.geom, mode=mode) slices = cutout_info["parent-slices"] parent_slices = Ellipsis, slices[0], slices[1] slices = cutout_info["cutout-slices"] cutout_slices = Ellipsis, slices[0], slices[1] data = np.zeros(shape=geom_cutout.data_shape, dtype=self.data.dtype) data[cutout_slices] = self.data[parent_slices] return self._init_copy(geom=geom_cutout, data=data) def _cutout_view(self, position, width, odd_npix=False): """ Create a cutout around a given position without copy of the data. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. width : tuple of `~astropy.coordinates.Angle` Angular sizes of the region in (lon, lat) in that specific order. If only one value is passed, a square region is extracted. odd_npix : bool, optional Force width to odd number of pixels. Default is False. Returns ------- cutout : `~gammapy.maps.WcsNDMap` Cutout map. """ geom_cutout = self.geom.cutout( position=position, width=width, mode="trim", odd_npix=odd_npix ) cutout_info = geom_cutout.cutout_slices(self.geom, mode="trim") slices = cutout_info["parent-slices"] parent_slices = Ellipsis, slices[0], slices[1] return self.__class__.from_geom( geom=geom_cutout, data=self.quantity[parent_slices] ) def stack(self, other, weights=None, nan_to_num=True): """Stack cutout into map. Parameters ---------- other : `WcsNDMap` Other map to stack. weights : `WcsNDMap`, optional Array to be used as weights. The spatial geometry must be equivalent to `other` and additional axes must be broadcastable. Default is None. nan_to_num: bool, optional Non-finite values are replaced by zero if True. Default is True. """ if self.geom == other.geom: parent_slices, cutout_slices = None, None elif self.geom.is_aligned(other.geom): cutout_slices = other.geom.cutout_slices(self.geom) slices = cutout_slices["parent-slices"] parent_slices = Ellipsis, slices[0], slices[1] slices = cutout_slices["cutout-slices"] cutout_slices = Ellipsis, slices[0], slices[1] else: raise ValueError( "Can only stack equivalent maps or cutout of the same map." 
) data = other.quantity[cutout_slices].to_value(self.unit) if nan_to_num: not_finite = ~np.isfinite(data) if np.any(not_finite): data = data.copy() data[not_finite] = 0 if weights is not None: if not other.geom.to_image() == weights.geom.to_image(): raise ValueError("Incompatible spatial geoms between map and weights") data = data * weights.data[cutout_slices] self.data[parent_slices] += data ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.224642 gammapy-1.3/gammapy/maps/wcs/tests/0000755000175100001770000000000014721316215016750 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/wcs/tests/__init__.py0000644000175100001770000000010014721316200021042 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/wcs/tests/test_geom.py0000644000175100001770000005422314721316200021310 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal import astropy.units as u from astropy.coordinates import Angle, SkyCoord from astropy.io import fits from astropy.time import Time from regions import CircleSkyRegion from gammapy.maps import Map, MapAxis, TimeMapAxis, WcsGeom from gammapy.maps.utils import _check_binsz, _check_width from gammapy.utils.scripts import make_path from gammapy.utils.testing import requires_data axes1 = [MapAxis(np.logspace(0.0, 3.0, 3), interp="log", name="energy")] axes2 = [ MapAxis(np.logspace(0.0, 3.0, 3), interp="log", name="energy"), MapAxis(np.logspace(1.0, 3.0, 4), interp="lin"), ] skydir = SkyCoord(110.0, 75.0, unit="deg", frame="icrs") wcs_allsky_test_geoms = [ (None, 10.0, "galactic", "AIT", skydir, None), (None, 10.0, "galactic", "AIT", skydir, axes1), (None, [10.0, 20.0], "galactic", "AIT", skydir, axes1), (None, 10.0, "galactic", "AIT", skydir, axes2), (None, [[10.0, 20.0, 30.0], [10.0, 20.0, 30.0]], "galactic", "AIT", skydir, axes2), ] wcs_partialsky_test_geoms = [ (10, 0.1, "galactic", "AIT", skydir, None), (10, 0.1, "galactic", "AIT", skydir, axes1), (10, [0.1, 0.2], "galactic", "AIT", skydir, axes1), ] wcs_test_geoms = wcs_allsky_test_geoms + wcs_partialsky_test_geoms @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsgeom_init(npix, binsz, frame, proj, skydir, axes): WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsgeom_get_pix(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) pix = geom.get_idx() if axes is not None: idx = tuple([1] * len(axes)) pix_img = geom.get_idx(idx=idx) m = np.all(np.stack([x == y for x, y in zip(idx, pix[2:])]), axis=0) m2 = pix_img[0] != -1 assert_allclose(pix[0][m], np.ravel(pix_img[0][m2])) assert_allclose(pix[1][m], np.ravel(pix_img[1][m2])) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsgeom_test_pix_to_coord(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) assert_allclose(geom.get_coord()[0], 
geom.pix_to_coord(geom.get_idx())[0]) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsgeom_test_coord_to_idx(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) assert_allclose(geom.get_idx()[0], geom.coord_to_idx(geom.get_coord())[0]) if not geom.is_allsky: coords = geom.center_coord[:2] + tuple([ax.center[0] for ax in geom.axes]) coords[0][...] += 2.0 * np.max(geom.width[0]) idx = geom.coord_to_idx(coords) assert_allclose(np.full_like(coords[0].value, -1, dtype=int), idx[0]) idx = geom.coord_to_idx(coords, clip=True) assert np.all( np.not_equal(np.full_like(coords[0].value, -1, dtype=int), idx[0]) ) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsgeom_read_write(tmp_path, npix, binsz, frame, proj, skydir, axes): geom0 = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) hdu_bands = geom0.to_bands_hdu(hdu_bands="TEST_BANDS") hdu_prim = fits.PrimaryHDU() hdu_prim.header.update(geom0.to_header()) hdulist = fits.HDUList([hdu_prim, hdu_bands]) hdulist.writeto(tmp_path / "tmp.fits") with fits.open(tmp_path / "tmp.fits", memmap=False) as hdulist: geom1 = WcsGeom.from_header(hdulist[0].header, hdulist["TEST_BANDS"]) assert_allclose(geom0.npix, geom1.npix) assert geom0.frame == geom1.frame @requires_data() def test_wcsgeom_from_header(): file_path = make_path("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") with fits.open(file_path, memmap=False) as hdulist: geom2 = WcsGeom.from_header(hdulist[1].header, hdulist["COUNTS_BANDS"]) assert geom2.wcs.naxis == 2 assert_equal(geom2.wcs._naxis, (240, 320)) def test_wcsgeom_to_hdulist(): npix, binsz, frame, proj, skydir, axes = wcs_test_geoms[3] geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) hdu = geom.to_bands_hdu(hdu_bands="TEST") assert hdu.header["AXCOLS1"] == "E_MIN,E_MAX" assert hdu.header["AXCOLS2"] == "AXIS1_MIN,AXIS1_MAX" @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsgeom_contains(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) coords = geom.get_coord() m = np.isfinite(coords[0]) coords = [c[m] for c in coords] assert_allclose(geom.contains(coords), np.ones(coords[0].shape, dtype=bool)) if axes is not None: coords = [c[0] for c in coords[:2]] + [ax.edges[-1] + 1.0 for ax in axes] assert_allclose(geom.contains(coords), np.zeros((1,), dtype=bool)) if not geom.is_allsky: coords = [0.0, 0.0] + [ax.center[0] for ax in geom.axes] assert_allclose(geom.contains(coords), np.zeros((1,), dtype=bool)) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcs_geom_from_aligned(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=(0, 0), proj=proj, frame=frame, axes=axes ) aligned_geom = WcsGeom.from_aligned(geom=geom, skydir=(2, 3), width="90 deg") assert aligned_geom.is_aligned(geom) def test_from_aligned_vs_cutout(): skydir = SkyCoord(0.12, -0.34, unit="deg", frame="galactic") geom = WcsGeom.create(binsz=0.1, skydir=skydir, proj="AIT", frame="galactic") position = SkyCoord("2.23d", "3.102d", frame="galactic") width = ("89 deg", "79 deg") aligned_geom = WcsGeom.from_aligned(geom=geom, skydir=position, width=width) geom_cutout = geom.cutout(position=position, width=width) 
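# The cutout and the aligned geometry are built on the same parent pixelization, so the two geometries should compare equal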
assert geom_cutout == aligned_geom def test_from_aligned_vs_cutout_tan(): skydir = SkyCoord(0, 0, unit="deg", frame="galactic") geom = WcsGeom.create( binsz=1, skydir=skydir, proj="TAN", frame="galactic", width=("180d", "90d") ) position = SkyCoord("53.23d", "22.102d", frame="galactic") width = ("17 deg", "15 deg") geom_cutout = geom.cutout(position=position, width=width, mode="partial") aligned_geom = WcsGeom.from_aligned(geom=geom, skydir=position, width=width) assert aligned_geom == geom_cutout def test_wcsgeom_solid_angle(): # Test using a CAR projection map with an extra axis binsz = 1.0 * u.deg npix = 10 geom = WcsGeom.create( skydir=(0, 0), npix=(npix, npix), binsz=binsz, frame="galactic", proj="CAR", axes=[MapAxis.from_edges([0, 2, 3])], ) solid_angle = geom.solid_angle() # Check array size assert solid_angle.shape == (1, npix, npix) # Test at b = 0 deg assert solid_angle.unit == "sr" assert_allclose(solid_angle.value[0, 5, 5], 0.0003046, rtol=1e-3) # Test at b = 5 deg assert_allclose(solid_angle.value[0, 9, 5], 0.0003038, rtol=1e-3) def test_wcsgeom_solid_angle_symmetry(): geom = WcsGeom.create( skydir=(0, 0), frame="galactic", npix=(3, 3), binsz=20.0 * u.deg ) sa = geom.solid_angle() assert_allclose(sa[1, :], sa[1, 0]) # Constant along lon assert_allclose(sa[0, 1], sa[2, 1]) # Symmetric along lat with pytest.raises(AssertionError): # Not constant along lat due to changes in solid angle (great circle) assert_allclose(sa[:, 1], sa[0, 1]) def test_wcsgeom_solid_angle_ait(): # Pixels that don't correspond to locations on the sky # should have solid angles set to NaN ait_geom = WcsGeom.create( skydir=(0, 0), width=(360, 180), binsz=20, frame="galactic", proj="AIT" ) solid_angle = ait_geom.solid_angle().to_value("deg2") assert_allclose(solid_angle[4, 1], 397.04838) assert_allclose(solid_angle[4, 16], 397.751841) assert_allclose(solid_angle[1, 8], 381.556269) assert_allclose(solid_angle[7, 8], 398.34725) assert np.isnan(solid_angle[0, 0]) def test_wcsgeom_separation(): geom = WcsGeom.create( skydir=(0, 0), npix=10, binsz=0.1, frame="galactic", proj="CAR", axes=[MapAxis.from_edges([0, 2, 3])], ) position = SkyCoord(1, 0, unit="deg", frame="galactic").icrs separation = geom.separation(position) assert separation.unit == "deg" assert separation.shape == (10, 10) assert_allclose(separation.value[0, 0], 0.7106291438079875) # Make sure it also works for 2D maps as input separation = geom.to_image().separation(position) assert separation.unit == "deg" assert separation.shape == (10, 10) assert_allclose(separation.value[0, 0], 0.7106291438079875) def test_cutout(): geom = WcsGeom.create( skydir=(0, 0), npix=10, binsz=0.1, frame="galactic", proj="CAR", axes=[MapAxis.from_edges([0, 2, 3])], ) position = SkyCoord(0.1, 0.2, unit="deg", frame="galactic") cutout_geom = geom.cutout(position=position, width=2 * 0.3 * u.deg, mode="trim") center_coord = cutout_geom.center_coord assert_allclose(center_coord[0].value, 0.1) assert_allclose(center_coord[1].value, 0.2) assert_allclose(center_coord[2].value, 2.0) assert cutout_geom.data_shape == (2, 6, 6) assert cutout_geom.data_shape_axes == (2, 1, 1) def test_cutout_info(): geom = WcsGeom.create(skydir=(0, 0), npix=10) position = SkyCoord(0, 0, unit="deg") cutout_geom = geom.cutout(position=position, width="2 deg") cutout_info = cutout_geom.cutout_slices(geom) assert cutout_info["parent-slices"][0].start == 3 assert cutout_info["parent-slices"][1].start == 3 assert cutout_info["cutout-slices"][0].start == 0 assert cutout_info["cutout-slices"][1].start == 0 
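# min_npix clips the requested cutout width from below (see WcsGeom.cutout), so a width smaller than one pixel still yields at least min_npix pixels along that axis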
def test_cutout_min_size(): geom = WcsGeom.create(skydir=(0, 0), npix=10, binsz=0.5) position = SkyCoord(0, 0, unit="deg") cutout_geom = geom.cutout(position=position, width=["2 deg", "0.1 deg"]) assert cutout_geom.data_shape == (1, 4) cutout_geom = geom.cutout(position=position, width=["2 deg", "0.1 deg"], min_npix=3) assert cutout_geom.data_shape == (3, 4) geom = WcsGeom.create(skydir=(0, 0), npix=(10, 1), binsz=0.5) position = SkyCoord(0, 0, unit="deg") cutout_geom = geom.cutout(position=position, width=["1 deg", "0.1 deg"]) assert cutout_geom.data_shape == (1, 2) cutout_geom = geom.cutout(position=position, width=["1 deg", "0.1 deg"], min_npix=3) assert cutout_geom.data_shape == (1, 3) def test_wcs_geom_get_coord(): geom = WcsGeom.create( skydir=(0, 0), npix=(4, 3), binsz=1, frame="galactic", proj="CAR" ) coord = geom.get_coord(mode="edges") assert_allclose(coord.lon[0, 0].value, 2) assert coord.lon[0, 0].unit == "deg" assert_allclose(coord.lat[0, 0].value, -1.5) assert coord.lat[0, 0].unit == "deg" def test_wcs_geom_instance_cache(): geom_1 = WcsGeom.create(npix=(3, 3)) geom_2 = WcsGeom.create(npix=(3, 3)) coord_1, coord_2 = geom_1.get_coord(), geom_2.get_coord() assert geom_1.get_coord.cache_info().misses == 1 assert geom_2.get_coord.cache_info().misses == 1 coord_1_cached, coord_2_cached = geom_1.get_coord(), geom_2.get_coord() assert geom_1.get_coord.cache_info().hits == 1 assert geom_2.get_coord.cache_info().hits == 1 assert geom_1.get_coord.cache_info().currsize == 1 assert geom_2.get_coord.cache_info().currsize == 1 assert id(coord_1) == id(coord_1_cached) assert id(coord_2) == id(coord_2_cached) def test_wcs_geom_squash(): axis = MapAxis.from_nodes([1, 2, 3], name="test-axis") geom = WcsGeom.create(npix=(3, 3), axes=[axis]) geom_squashed = geom.squash(axis_name="test-axis") assert geom_squashed.data_shape == (1, 3, 3) def test_wcs_geom_drop(): ax1 = MapAxis.from_nodes([1, 2, 3], name="ax1") ax2 = MapAxis.from_nodes([1, 2], name="ax2") ax3 = MapAxis.from_nodes([1, 2, 3, 4], name="ax3") geom = WcsGeom.create(npix=(3, 3), axes=[ax1, ax2, ax3]) geom_drop = geom.drop(axis_name="ax1") assert geom_drop.data_shape == (4, 2, 3, 3) def test_wcs_geom_resample_overflows(): ax1 = MapAxis.from_edges([1, 2, 3, 4, 5], name="ax1") ax2 = MapAxis.from_nodes([1, 2, 3], name="ax2") geom = WcsGeom.create(npix=(3, 3), axes=[ax1, ax2]) new_axis = MapAxis.from_edges([-1.0, 1, 2.3, 4.8, 6], name="ax1") geom_resample = geom.resample_axis(axis=new_axis) assert geom_resample.data_shape == (3, 2, 3, 3) assert geom_resample.data_shape_axes == (3, 2, 1, 1) assert_allclose(geom_resample.axes[0].edges, [1, 2, 5]) def test_wcs_geom_get_pix_coords(): geom = WcsGeom.create( skydir=(0, 0), npix=(4, 3), binsz=1, frame="galactic", proj="CAR", axes=axes1 ) idx_center = geom.get_pix(mode="center") for idx in idx_center: assert idx.shape == (2, 3, 4) assert_allclose(idx[0, 0, 0], 0) idx_edge = geom.get_pix(mode="edges") for idx, desired in zip(idx_edge, [-0.5, -0.5, 0]): assert idx.shape == (2, 4, 5) assert_allclose(idx[0, 0, 0], desired) def test_geom_str(): geom = WcsGeom.create( skydir=(0, 0), npix=(10, 4), binsz=50, frame="galactic", proj="AIT" ) str_info = str(geom) assert geom.__class__.__name__ in str_info assert "wcs ref" in str_info assert "center" in str_info assert "projection" in str_info assert "axes" in str_info assert "shape" in str_info assert "ndim" in str_info assert "width" in str_info def test_geom_refpix(): refpix = (400, 300) geom = WcsGeom.create( skydir=(0, 0), npix=(800, 600), refpix=refpix, 
binsz=0.1, frame="galactic" ) assert_allclose(geom.wcs.wcs.crpix, refpix) def test_region_mask(): geom = WcsGeom.create(npix=(3, 3), binsz=2, proj="CAR") r1 = CircleSkyRegion(SkyCoord(0, 0, unit="deg"), 1 * u.deg) r2 = CircleSkyRegion(SkyCoord(20, 20, unit="deg"), 1 * u.deg) regions = [r1, r2] mask = geom.region_mask(regions) assert mask.data.dtype == bool assert np.sum(mask.data) == 1 mask = geom.region_mask(regions, inside=False) assert np.sum(mask.data) == 8 def test_energy_mask(): energy_axis = MapAxis.from_nodes( [1, 10, 100], interp="log", name="energy", unit="TeV" ) geom = WcsGeom.create(npix=(1, 1), binsz=1, proj="CAR", axes=[energy_axis]) mask = geom.energy_mask(energy_min=3 * u.TeV).data assert not mask[0, 0, 0] assert mask[1, 0, 0] assert mask[2, 0, 0] mask = geom.energy_mask(energy_max=30 * u.TeV).data assert mask[0, 0, 0] assert not mask[1, 0, 0] assert not mask[2, 0, 0] mask = geom.energy_mask(energy_min=3 * u.TeV, energy_max=40 * u.TeV).data assert not mask[0, 0, 0] assert not mask[2, 0, 0] assert mask[1, 0, 0] def test_boundary_mask(): axis = MapAxis.from_edges([1, 10, 100]) geom = WcsGeom.create( skydir=(0, 0), binsz=0.02, width=(2, 2), axes=[axis], ) mask = geom.boundary_mask(width=(0.3 * u.deg, 0.1 * u.deg)) assert np.sum(mask.data[0, :, :]) == 6300 assert np.sum(mask.data[1, :, :]) == 6300 @pytest.mark.parametrize( ("width", "out"), [ (10, (10, 10)), ((10 * u.deg).to("rad"), (10, 10)), ((10, 5), (10, 5)), (("10 deg", "5 deg"), (10, 5)), (Angle([10, 5], "deg"), (10, 5)), ((10 * u.deg, 5 * u.deg), (10, 5)), ((10, 5) * u.deg, (10, 5)), ([10, 5], (10, 5)), (["10 deg", "5 deg"], (10, 5)), (np.array([10, 5]), (10, 5)), ], ) def test_check_width(width, out): width = _check_width(width) assert isinstance(width, tuple) assert isinstance(width[0], float) assert isinstance(width[1], float) assert width == out geom = WcsGeom.create(width=width, binsz=1.0) assert tuple(geom.npix) == out def test_check_binsz(): # float binsz = _check_binsz(0.1) assert isinstance(binsz, float) # string and other units binsz = _check_binsz("0.1deg") assert isinstance(binsz, float) binsz = _check_binsz("3.141592653589793 rad") assert_allclose(binsz, 180) # tuple binsz = _check_binsz(("0.1deg", "0.2deg")) assert isinstance(binsz, tuple) assert isinstance(binsz[0], float) assert isinstance(binsz[1], float) # list binsz = _check_binsz(["0.1deg", "0.2deg"]) assert isinstance(binsz, list) assert isinstance(binsz[0], float) assert isinstance(binsz[1], float) def test_check_width_bad_input(): with pytest.raises(IndexError): _check_width(width=(10,)) def test_get_axis_index_by_name(): e_axis = MapAxis.from_edges([1, 5], name="energy") geom = WcsGeom.create(width=5, binsz=1.0, axes=[e_axis]) assert geom.axes.index("energy") == 0 with pytest.raises(ValueError): geom.axes.index("time") test_axis1 = [MapAxis(nodes=(1, 2, 3, 4), unit="TeV", node_type="center")] test_axis2 = [ MapAxis(nodes=(1, 2, 3, 4), unit="TeV", node_type="center"), MapAxis(nodes=(1, 2, 3), unit="TeV", node_type="center"), ] skydir2 = SkyCoord(110.0, 75.0 + 1e-8, unit="deg", frame="icrs") skydir3 = SkyCoord(110.0, 75.0 + 1e-3, unit="deg", frame="icrs") compatibility_test_geoms = [ (10, 0.1, "galactic", "CAR", skydir, test_axis1, True), (10, 0.1, "galactic", "CAR", skydir2, test_axis1, True), (10, 0.1, "galactic", "CAR", skydir3, test_axis1, False), (10, 0.1, "galactic", "TAN", skydir, test_axis1, False), (8, 0.1, "galactic", "CAR", skydir, test_axis1, False), (10, 0.1, "galactic", "CAR", skydir, test_axis2, False), (10, 0.1, "galactic", "CAR", 
skydir.galactic, test_axis1, True), ] @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skypos", "axes", "result"), compatibility_test_geoms, ) def test_wcs_geom_equal(npix, binsz, frame, proj, skypos, axes, result): geom0 = WcsGeom.create( skydir=skydir, npix=10, binsz=0.1, proj="CAR", frame="galactic", axes=test_axis1 ) geom1 = WcsGeom.create( skydir=skypos, npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes ) assert (geom0 == geom1) is result assert (geom0 != geom1) is not result def test_irregular_geom_equality(): axis = MapAxis.from_bounds(1, 3, 10, name="axis", unit="") geom0 = WcsGeom.create(skydir=(0, 0), npix=10, binsz=0.1, axes=[axis]) binsizes = np.ones((10)) * 0.1 geom1 = WcsGeom.create(skydir=(0, 0), npix=10, binsz=binsizes, axes=[axis]) with pytest.raises(NotImplementedError): geom0 == geom1 def test_wcs_geom_pad(): axis = MapAxis.from_bounds(0, 1, nbin=1, name="axis", unit="") geom = WcsGeom.create(skydir=(0, 0), npix=10, binsz=0.1, axes=[axis]) geom_pad = geom.pad(axis_name="axis", pad_width=1) assert_allclose(geom_pad.axes["axis"].edges, [-1, 0, 1, 2]) @pytest.mark.parametrize("node_type", ["edges", "center"]) @pytest.mark.parametrize("interp", ["log", "lin", "sqrt"]) def test_read_write(tmp_path, node_type, interp): # Regression test for MapAxis interp and node_type FITS serialization # https://github.com/gammapy/gammapy/issues/1887 e_ax = MapAxis([1, 2], interp, "energy", node_type, "TeV") t_ax = MapAxis([3, 4], interp, "time", node_type, "s") m = Map.create(binsz=1, npix=10, axes=[e_ax, t_ax], unit="m2") # Check what Gammapy writes in the FITS header header = m.to_hdu().header assert header["INTERP1"] == interp assert header["INTERP2"] == interp # Check that all MapAxis properties are preserved on FITS I/O m.write(tmp_path / "tmp.fits", overwrite=True) m2 = Map.read(tmp_path / "tmp.fits") assert m2.geom == m.geom @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skypos", "axes", "result"), compatibility_test_geoms, ) def test_wcs_geom_to_binsz(npix, binsz, frame, proj, skypos, axes, result): geom = WcsGeom.create( skydir=skydir, npix=10, binsz=0.1, proj="CAR", frame="galactic", axes=test_axis1 ) geom_new = geom.to_binsz(binsz=0.5) assert_allclose(geom_new.pixel_scales.value, 0.5) def test_non_equal_binsz(): geom = WcsGeom.create( width=(360, 180), binsz=(360, 60), frame="icrs", skydir=(0, 0), proj="CAR" ) coords = geom.get_coord() assert_allclose(coords["lon"].to_value("deg"), 0) assert_allclose(coords["lat"].to_value("deg"), [[-60], [0], [60]]) def test_wcs_geom_non_equal_dim(): geom = WcsGeom.create(skydir=(0, 0), binsz=1, width=(3, 5)) assert geom.data_shape == (5, 3) assert geom.data_shape == geom.wcs.array_shape def test_wcs_geom_to_even_npix(): geom = WcsGeom.create(skydir=(0, 0), binsz=1, width=(3, 3)) geom_even = geom.to_even_npix() assert geom_even.data_shape == (4, 4) def test_wcs_geom_no_zero_shape_cut(): geom = WcsGeom.create(skydir=(0, 2.5), binsz=(360, 5), width=(360, 180)) skydir = SkyCoord(110.0, 75.0, unit="deg", frame="icrs") geom_cut = geom.cutout(skydir, width=5) assert geom_cut.data_shape != (1, 0) assert geom_cut.data_shape == (1, 1) def test_wcs_geom_with_timeaxis(): center = SkyCoord("0d", "0d", frame="icrs") t_ref = Time(55555, format="mjd") energy_axis = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=3, name="energy_true" ) time_min = t_ref + [1, 3, 5, 7] * u.day time_max = t_ref + [2, 4, 6, 8] * u.day time_axis = TimeMapAxis.from_time_edges(time_min=time_min, time_max=time_max) wcs_geom = WcsGeom.create( 
width=[1, 1.2], binsz=0.2, skydir=center, axes=[energy_axis, time_axis] ) coords = wcs_geom.get_coord() assert coords.shape == (4, 3, 6, 5) assert_allclose( coords[0][0][0][0:-1:5].value, [[4.000e-01, 2.000e-01, 0.000e00, 3.598e02, 3.596e02]], rtol=1e-5, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/maps/wcs/tests/test_ndmap.py0000644000175100001770000010610314721316200021453 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose, assert_equal import astropy.units as u from astropy.convolution import Box2DKernel, Gaussian2DKernel from astropy.coordinates import SkyCoord from astropy.io import fits from astropy.time import Time from regions import CircleSkyRegion, PointSkyRegion, RectangleSkyRegion from gammapy.datasets.map import MapEvaluator from gammapy.irf import PSFKernel, PSFMap from gammapy.maps import ( LabelMapAxis, Map, MapAxis, MapCoord, TimeMapAxis, WcsGeom, WcsNDMap, ) from gammapy.modeling.models import ( GaussianSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.testing import mpl_plot_check, requires_data axes1 = [MapAxis(np.logspace(0.0, 3.0, 3), interp="log", name="spam")] axes2 = [ MapAxis(np.logspace(0.0, 3.0, 3), interp="log"), MapAxis(np.logspace(1.0, 3.0, 4), interp="lin"), ] skydir = SkyCoord(110.0, 75.0, unit="deg", frame="icrs") wcs_allsky_test_geoms = [ (None, 10.0, "galactic", "AIT", skydir, None), (None, 10.0, "galactic", "AIT", skydir, axes1), (None, [10.0, 20.0], "galactic", "AIT", skydir, axes1), (None, 10.0, "galactic", "AIT", skydir, axes2), (None, [[10.0, 20.0, 30.0], [10.0, 20.0, 30.0]], "galactic", "AIT", skydir, axes2), ] wcs_partialsky_test_geoms = [ (10, 1.0, "galactic", "AIT", skydir, None), (10, 1.0, "galactic", "AIT", skydir, axes1), (10, [1.0, 2.0], "galactic", "AIT", skydir, axes1), (10, 1.0, "galactic", "AIT", skydir, axes2), (10, [[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]], "galactic", "AIT", skydir, axes2), ] wcs_test_geoms = wcs_allsky_test_geoms + wcs_partialsky_test_geoms @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_init(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) m0 = WcsNDMap(geom) coords = m0.geom.get_coord() m0.set_by_coord(coords, coords[1]) m1 = WcsNDMap(geom, m0.data) assert_allclose(m0.data, m1.data) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_read_write(tmp_path, npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) path = tmp_path / "tmp.fits" m0 = WcsNDMap(geom) m0.write(path, overwrite=True) m1 = WcsNDMap.read(path) m2 = Map.read(path) m3 = Map.read(path, map_type="wcs") assert_allclose(m0.data, m1.data) assert_allclose(m0.data, m2.data) assert_allclose(m0.data, m3.data) m0.write(path, sparse=True, overwrite=True) m1 = WcsNDMap.read(path) m2 = Map.read(path) m3 = Map.read(path, map_type="wcs") assert_allclose(m0.data, m1.data) assert_allclose(m0.data, m2.data) assert_allclose(m0.data, m3.data) # Specify alternate HDU name for IMAGE and BANDS table m0.write(path, hdu="IMAGE", hdu_bands="TEST", overwrite=True) m1 = WcsNDMap.read(path) m2 = Map.read(path) m3 = Map.read(path, map_type="wcs") @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), 
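# --- A minimal sketch (not from the source tree) of the FITS round-trip the
# parametrized tests above exercise: data and geometry survive write/read, and
# the generic Map.read dispatches to the right map type from the header.
# The file name "roundtrip.fits" is just a placeholder.
import numpy as np
from gammapy.maps import Map, MapAxis, WcsNDMap

axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=2, name="energy")
m = WcsNDMap.create(npix=5, binsz=0.5, axes=[axis])
m.data += np.arange(m.data.size).reshape(m.data.shape)
m.write("roundtrip.fits", overwrite=True)
m2 = Map.read("roundtrip.fits")
assert m2.geom == m.geom
assert np.allclose(m2.data, m.data)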
wcs_test_geoms ) def test_wcsndmap_write_checksum(tmp_path, npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) path = tmp_path / "tmp.fits" m0 = WcsNDMap(geom) m0.write(path, overwrite=True, checksum=True) hdul = fits.open(path) for hdu in hdul: assert "CHECKSUM" in hdu.header assert "DATASUM" in hdu.header def test_wcsndmap_read_write_fgst(tmp_path): path = tmp_path / "tmp.fits" axis = MapAxis.from_bounds(100.0, 1000.0, 4, name="energy", unit="MeV") geom = WcsGeom.create(npix=10, binsz=1.0, proj="AIT", frame="galactic", axes=[axis]) # Test Counts Cube m = WcsNDMap(geom) m.write(path, format="fgst-ccube", overwrite=True) with fits.open(path, memmap=False) as hdulist: assert "EBOUNDS" in hdulist m2 = Map.read(path) assert m2.geom.axes[0].name == "energy" # Test Model Cube m.write(path, format="fgst-template", overwrite=True) with fits.open(path, memmap=False) as hdulist: assert "ENERGIES" in hdulist @requires_data() def test_wcsndmap_read_ccube(): counts = Map.read("$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-counts-cube.fits.gz") energy_axis = counts.geom.axes["energy"] # for the 3FGL data the lower energy threshold should be at 10 GeV assert_allclose(energy_axis.edges.min().to_value("GeV"), 10, rtol=1e-3) @requires_data() def test_wcsndmap_read_exposure(): exposure = Map.read( "$GAMMAPY_DATA/fermi-3fhl-gc/fermi-3fhl-gc-exposure-cube.fits.gz" ) energy_axis = exposure.geom.axes["energy_true"] assert energy_axis.node_type == "center" assert exposure.unit == "cm2 s" def test_wcs_nd_map_data_transpose_issue(tmp_path): # Regression test for https://github.com/gammapy/gammapy/issues/1346 # Our test case: a little map with WCS shape (3, 2), i.e. numpy array shape (2, 3) data = np.array([[0, 1, 2], [np.nan, np.inf, -np.inf]]) geom = WcsGeom.create(npix=(3, 2)) # Data should be unmodified after init m = WcsNDMap(data=data, geom=geom) assert_equal(m.data, data) # Data should be unmodified if initialised like this m = WcsNDMap(geom=geom) # and then filled via an in-place Numpy array operation m.data += data assert_equal(m.data, data) # Data should be unmodified after write / read to normal image format m.write(tmp_path / "normal.fits.gz") m2 = Map.read(tmp_path / "normal.fits.gz") assert_equal(m2.data, data) # Data should be unmodified after write / read to sparse image format m.write(tmp_path / "sparse.fits.gz", sparse=True) m2 = Map.read(tmp_path / "sparse.fits.gz") assert_equal(m2.data, data) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_set_get_by_pix(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) m = WcsNDMap(geom) coords = m.geom.get_coord() pix = m.geom.get_idx() m.set_by_pix(pix, coords[0]) assert_allclose(coords[0].value, m.get_by_pix(pix)) def test_get_by_coord_bool_int(): mask = WcsNDMap.create(width=2, dtype="bool") coords = {"lon": [0, 3], "lat": [0, 3]} vals = mask.get_by_coord(coords) assert_allclose(vals, [0, np.nan]) mask = WcsNDMap.create(width=2, dtype="int") coords = {"lon": [0, 3], "lat": [0, 3]} vals = mask.get_by_coord(coords) assert_allclose(vals, [0, np.nan]) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_set_get_by_coord(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) m = WcsNDMap(geom) coords 
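# --- A minimal sketch (not from the source tree): unit-aware coordinate access
# as in test_set_get_by_coord_quantities. Quantities on non-spatial axes are
# converted internally, so 2000 GeV and 2 TeV address the same bin.
import astropy.units as u
from gammapy.maps import MapAxis, WcsNDMap

ax = MapAxis.from_energy_bounds("1 TeV", "100 TeV", nbin=3, name="energy")
m = WcsNDMap.create(binsz=0.1, npix=(3, 4), axes=[ax])
m.set_by_coord({"lon": 0, "lat": 0, "energy": 2000 * u.GeV}, 42)
assert m.get_by_coord({"lon": 0, "lat": 0, "energy": 2 * u.TeV}) == 42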
= m.geom.get_coord() m.set_by_coord(coords, coords[0]) assert_allclose(coords[0].value, m.get_by_coord(coords)) # Test with SkyCoords m = WcsNDMap(geom) coords = m.geom.get_coord() skydir = coords.skycoord skydir_cel = skydir.transform_to("icrs") skydir_gal = skydir.transform_to("galactic") m.set_by_coord((skydir_gal,) + tuple(coords[2:]), coords[0]) assert_allclose(coords[0].value, m.get_by_coord(coords)) assert_allclose( m.get_by_coord((skydir_cel,) + tuple(coords[2:])), m.get_by_coord((skydir_gal,) + tuple(coords[2:])), ) # Test with MapCoord m = WcsNDMap(geom) coords = m.geom.get_coord() coords_dict = dict(lon=coords[0], lat=coords[1]) if axes: for i, ax in enumerate(axes): coords_dict[ax.name] = coords[i + 2] map_coords = MapCoord.create(coords_dict, frame=frame) m.set_by_coord(map_coords, coords[0]) assert_allclose(coords[0].value, m.get_by_coord(map_coords)) def test_set_get_by_coord_quantities(): ax = MapAxis(np.logspace(0.0, 3.0, 3), interp="log", name="energy", unit="TeV") geom = WcsGeom.create(binsz=0.1, npix=(3, 4), axes=[ax]) m = WcsNDMap(geom) coords_dict = {"lon": 0, "lat": 0, "energy": 1000 * u.GeV} m.set_by_coord(coords_dict, 42) coords_dict["energy"] = 1 * u.TeV assert_allclose(42, m.get_by_coord(coords_dict)) def qconcatenate(q_1, q_2): """Concatenate quantity""" return u.Quantity(np.concatenate((q_1.value, q_2.value)), unit=q_1.unit) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_fill_by_coord(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) m = WcsNDMap(geom) coords = m.geom.get_coord() fill_coords = tuple([qconcatenate(t, t) for t in coords]) fill_vals = fill_coords[1] m.fill_by_coord(fill_coords, fill_vals.value) assert_allclose(m.get_by_coord(coords), 2.0 * coords[1].value) # Test with SkyCoords m = WcsNDMap(geom) coords = m.geom.get_coord() skydir = coords.skycoord skydir_cel = skydir.transform_to("icrs") skydir_gal = skydir.transform_to("galactic") fill_coords_cel = (skydir_cel,) + tuple(coords[2:]) fill_coords_gal = (skydir_gal,) + tuple(coords[2:]) m.fill_by_coord(fill_coords_cel, coords[1].value) m.fill_by_coord(fill_coords_gal, coords[1].value) assert_allclose(m.get_by_coord(coords), 2.0 * coords[1].value) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_coadd(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) m0 = WcsNDMap(geom) m1 = WcsNDMap(geom.upsample(2)) coords = m0.geom.get_coord() m1.fill_by_coord( tuple([qconcatenate(t, t) for t in coords]), qconcatenate(coords[1], coords[1]).value, ) m0.coadd(m1) assert_allclose(np.nansum(m0.data), np.nansum(m1.data), rtol=1e-4) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_interp_by_coord(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create( npix=npix, binsz=binsz, skydir=skydir, proj=proj, frame=frame, axes=axes ) m = WcsNDMap(geom) coords = m.geom.get_coord().flat m.set_by_coord(coords, coords[1].value) assert_allclose(coords[1].value, m.interp_by_coord(coords, method="nearest")) def test_interp_by_coord_quantities(): ax = MapAxis( np.logspace(0.0, 3.0, 3), interp="log", name="energy", unit="TeV", node_type="center", ) geom = WcsGeom.create(binsz=0.1, npix=(3, 3), axes=[ax]) m = WcsNDMap(geom) coords_dict = {"lon": 0, "lat": 0, 
"energy": 1000 * u.GeV} m.set_by_coord(coords_dict, 42) coords_dict["energy"] = 1 * u.TeV assert_allclose(42.0, m.interp_by_coord(coords_dict, method="nearest")) def test_interp_methods(): m = Map.create(npix=(3, 3)) m.data += np.arange(9).reshape((3, 3)) actual = m.interp_by_coord({"lon": 0.07, "lat": 0.03}, method="linear") assert_allclose(actual, 4.2) actual = m.interp_by_coord({"lon": 0.07, "lat": 0.03}, method="nearest") assert_allclose(actual, 3.0) def test_wcsndmap_interp_by_coord_fill_value(): # Introduced in https://github.com/gammapy/gammapy/pull/1559/files m = Map.create(npix=(20, 10)) m.data += 42 # With `fill_value` one should be able to control what gets filled assert_allclose(m.interp_by_coord((99, 0), fill_value=99), 99) # Default is to extrapolate assert_allclose(m.interp_by_coord((99, 0)), 42) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) @pytest.mark.parametrize("keepdims", [True, False]) def test_wcsndmap_sum_over_axes(npix, binsz, frame, proj, skydir, axes, keepdims): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) m = WcsNDMap(geom) coords = m.geom.get_coord() m.fill_by_coord(coords, coords[0].value) msum = m.sum_over_axes(keepdims=keepdims) if m.geom.is_regular: assert_allclose(np.nansum(m.data), np.nansum(msum.data)) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_pad(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) m = WcsNDMap(geom) m2 = m.pad(1, mode="constant", cval=2.2) if not geom.is_allsky: coords = m2.geom.get_coord() msk = m2.geom.contains(coords) coords = tuple([c[~msk] for c in coords]) assert_allclose(m2.get_by_coord(coords), 2.2) m.pad(1, mode="interp", method="nearest") m.pad(1, mode="interp") def test_wcsndmap_pad_cval(): geom = WcsGeom.create(npix=(5, 5)) m = WcsNDMap.from_geom(geom) cval = 1.1 m_padded = m.pad(1, mode="constant", cval=cval) assert_allclose(m_padded.data[0, 0], cval) def test_wcs_nd_map_pad_axis(): axis = MapAxis.from_nodes([0, 1], unit="deg", name="axis") m = WcsNDMap.create(npix=3, axes=[axis]) m.data += np.array([1, 2]).reshape((-1, 1, 1)) m_pad = m.pad(axis_name="axis", pad_width=1, mode="edge") m_pad.data assert_allclose(m_pad.data[:, 1, 1], [1, 1, 2, 2]) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_crop(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) m = WcsNDMap(geom) m.crop(1) @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_downsample(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) m = WcsNDMap(geom, unit="m2") # Check whether we can downsample if np.all(np.mod(geom.npix[0], 2) == 0) and np.all(np.mod(geom.npix[1], 2) == 0): m2 = m.downsample(2, preserve_counts=True) assert_allclose(np.nansum(m.data), np.nansum(m2.data)) assert m.unit == m2.unit @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsndmap_upsample(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame, axes=axes) m = WcsNDMap(geom, unit="m2") m2 = m.upsample(2, preserve_counts=True) assert_allclose(np.nansum(m.data), np.nansum(m2.data)) assert m.unit == 
m2.unit @pytest.mark.parametrize( ("npix", "binsz", "frame", "proj", "skydir", "axes"), wcs_test_geoms ) def test_wcsmap_upsample_downsample_wcs(npix, binsz, frame, proj, skydir, axes): geom = WcsGeom.create(npix=npix, binsz=binsz, proj=proj, frame=frame) if geom.is_regular: m = WcsNDMap(geom, unit="m2") pix_scale = np.max(geom.pixel_scales.to_value("deg")) factor = 11 # alignment check fails for larger odd value due to precision error on the position # seems ok for even values m_up = m.upsample(factor, preserve_counts=True) m_down = m_up.downsample(factor, preserve_counts=True) m.stack(m_down) assert_allclose(m.geom.center_skydir.l, m_down.geom.center_skydir.l, atol=1e-10) assert_allclose(m.geom.center_skydir.b, m_down.geom.center_skydir.b, atol=1e-10) assert_allclose(m.geom.data_shape, m_down.geom.data_shape) assert m.geom.is_aligned(m_down.geom) mcut = m.cutout(m.geom.center_skydir, 2 * pix_scale) assert m.geom.is_aligned(mcut.geom) factor = 49 mcut_up = mcut.upsample(factor, preserve_counts=True) mcut_down = mcut_up.downsample(factor, preserve_counts=True) m.stack(mcut_down) assert_allclose( m.geom.center_skydir.l, mcut_down.geom.center_skydir.l, atol=1e-10 ) assert_allclose( m.geom.center_skydir.b, mcut_down.geom.center_skydir.b, atol=1e-10 ) assert_allclose(mcut.geom.data_shape, mcut_down.geom.data_shape) assert m.geom.is_aligned(mcut_down.geom) def test_wcsndmap_upsample_axis(): axis = MapAxis.from_edges([1, 2, 3, 4], name="test") geom = WcsGeom.create(npix=(2, 2), axes=[axis]) test_nodes = np.arange(3) test_data = test_nodes.reshape(3, 1, 1) spatial_data = np.zeros((2, 2)) data = spatial_data + 0.5 * test_data m = WcsNDMap(geom, unit="m2", data=data) m2 = m.upsample(2, preserve_counts=True, axis_name="test") assert m2.data.shape == (6, 2, 2) assert_allclose(m.data.sum(), m2.data.sum()) assert_allclose(m2.data[:, 0, 0], [0, 0, 0.25, 0.25, 0.5, 0.5]) def test_wcsndmap_downsample_axis(): axis = MapAxis.from_edges([1, 2, 3, 4, 5], name="test") geom = WcsGeom.create(npix=(4, 4), axes=[axis]) m = WcsNDMap(geom, unit="m2") m.data += 1 m2 = m.downsample(2, preserve_counts=True, axis_name="test") assert m2.data.shape == (2, 4, 4) def test_wcsndmap_resample_axis(): axis_1 = MapAxis.from_edges([1, 2, 3, 4, 5], name="test-1") axis_2 = MapAxis.from_edges([1, 2, 3, 4], name="test-2") geom = WcsGeom.create(npix=(7, 6), axes=[axis_1, axis_2]) m = WcsNDMap(geom, unit="m2") m.data += 1 new_axis = MapAxis.from_edges([1, 3, 5], name="test-1") m2 = m.resample_axis(axis=new_axis) assert m2.data.shape == (3, 2, 6, 7) assert_allclose(m2.data, 2) # Test without all interval covered new_axis = MapAxis.from_edges([2, 3], name="test-1") m3 = m.resample_axis(axis=new_axis) assert m3.data.shape == (3, 1, 6, 7) assert_allclose(m3.data, 1) def test_wcsndmap_resample_axis_logical_and(): axis_1 = MapAxis.from_edges([1, 2, 3, 4, 5], name="test-1") geom = WcsGeom.create(npix=(2, 2), axes=[axis_1]) m = WcsNDMap(geom, dtype=bool) m.data[:, :, :] = True m.data[0, 0, 0] = False new_axis = MapAxis.from_edges([1, 3, 5], name="test-1") m2 = m.resample_axis(axis=new_axis, ufunc=np.logical_and) assert_allclose(m2.data[0, 0, 0], False) assert_allclose(m2.data[1, 0, 0], True) def test_coadd_unit(): geom = WcsGeom.create(npix=(10, 10), binsz=1, proj="CAR", frame="galactic") m1 = WcsNDMap(geom, data=np.ones((10, 10)), unit="m2") m2 = WcsNDMap(geom, data=np.ones((10, 10)), unit="cm2") m1.coadd(m2) assert_allclose(m1.data, 1.0001) @pytest.mark.parametrize("kernel", ["gauss", "box", "disk"]) def test_smooth(kernel): axes = [ 
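# --- A minimal sketch (not from the source tree): rebinning a non-spatial axis
# with resample_axis, as in test_wcsndmap_resample_axis. The new edges must
# align with the old ones; overlapping bins are summed by default.
import numpy as np
from gammapy.maps import MapAxis, WcsGeom, WcsNDMap

axis = MapAxis.from_edges([1, 2, 3, 4, 5], name="test")
m = WcsNDMap(WcsGeom.create(npix=(3, 3), axes=[axis]), unit="m2")
m.data += 1.0
coarse = MapAxis.from_edges([1, 3, 5], name="test")
m2 = m.resample_axis(axis=coarse)
assert m2.data.shape == (2, 3, 3)
assert np.allclose(m2.data, 2.0)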
MapAxis(np.logspace(0.0, 3.0, 3), interp="log"), MapAxis(np.logspace(1.0, 3.0, 4), interp="lin"), ] geom = WcsGeom.create( npix=(10, 10), binsz=1, proj="CAR", frame="galactic", axes=axes ) m = WcsNDMap(geom, data=np.ones(geom.data_shape), unit="m2") desired = m.data.sum() smoothed = m.smooth(0.2 * u.deg, kernel) actual = smoothed.data.sum() assert_allclose(actual, desired) assert smoothed.data.dtype == float @pytest.mark.parametrize("mode", ["partial", "strict", "trim"]) def test_make_cutout(mode): pos = SkyCoord(0, 0, unit="deg", frame="galactic") geom = WcsGeom.create( npix=(10, 10), binsz=1, skydir=pos, proj="CAR", frame="galactic", axes=axes2 ) m = WcsNDMap(geom, data=np.ones((3, 2, 10, 10)), unit="m2") cutout = m.cutout(position=pos, width=(2.0, 3.0) * u.deg, mode=mode) actual = cutout.data.sum() assert_allclose(actual, 36.0) assert_allclose(cutout.geom.shape_axes, m.geom.shape_axes) assert_allclose(cutout.geom.width.to_value("deg"), [[2.0], [3.0]]) assert_allclose(cutout.geom.data_shape, (3, 2, 3, 2)) cutout = m.cutout(position=pos, width=(2.0, 3.0) * u.deg, mode=mode, min_npix=3) assert_allclose(cutout.geom.data_shape, (3, 2, 3, 3)) def test_convolve_vs_smooth(): axes = [ MapAxis(np.logspace(0.0, 3.0, 3), interp="log"), MapAxis(np.logspace(1.0, 3.0, 4), interp="lin"), ] binsz = 0.05 * u.deg m = WcsNDMap.create(binsz=binsz, width=1.05 * u.deg, axes=axes) m.data[:, :, 10, 10] = 1.0 desired = m.smooth(kernel="gauss", width=0.5 * u.deg, mode="constant") gauss = Gaussian2DKernel(10).array actual = m.convolve(kernel=gauss) assert_allclose(actual.data, desired.data, rtol=1e-3) @requires_data() def test_convolve_nd(): energy_axis = MapAxis.from_edges( np.logspace(-1.0, 1.0, 4), unit="TeV", name="energy_true" ) geom = WcsGeom.create(binsz=0.02 * u.deg, width=4.0 * u.deg, axes=[energy_axis]) m = Map.from_geom(geom) m.fill_by_coord([[0.2, 0.4], [-0.1, 0.6], [0.5, 3.6]]) psf = PSFMap.from_gauss(energy_axis, sigma=[0.1, 0.2, 0.3] * u.deg) psf_kernel = psf.get_psf_kernel(geom=geom, max_radius=1 * u.deg) assert psf_kernel.psf_kernel_map.data.shape == (3, 101, 101) mc = m.convolve(psf_kernel) assert_allclose(mc.data.sum(axis=(1, 2)), [0, 1, 1], atol=1e-5) kernel_2d = Box2DKernel(3, mode="center") kernel_2d.normalize("peak") mc = m.convolve(kernel_2d.array) assert_allclose(mc.data[0, :, :].sum(), 0, atol=1e-5) assert_allclose(mc.data[1, :, :].sum(), 9, atol=1e-5) kernel_2d = Gaussian2DKernel(15, mode="center") kernel_2d.normalize("peak") mc_full = m.convolve(kernel_2d.array, mode="full") mc_same = m.convolve(kernel_2d.array, mode="same") coords = [ [0.2, 0.1, 0.4, 0.44, -1.3], [-0.1, -0.13, 0.6, 0.57, 0.91], [0.5, 0.5, 3.6, 3.6, 0.5], ] values_full = mc_full.get_by_coord(coords) values_same = mc_same.get_by_coord(coords) assert mc_same.data.shape == (3, 200, 200) assert mc_full.data.shape == (3, 320, 320) assert_allclose(values_full, values_same, rtol=1e-5) def test_convolve_pixel_scale_error(): m = WcsNDMap.create(binsz=0.05 * u.deg, width=5 * u.deg) kgeom = WcsGeom.create(binsz=0.04 * u.deg, width=0.5 * u.deg) kernel = PSFKernel.from_gauss(kgeom, sigma=0.1 * u.deg, max_radius=1.5 * u.deg) with pytest.raises(ValueError): m.convolve(kernel) def test_convolve_kernel_size_error(): axis_1 = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=2) axis_2 = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) m = WcsNDMap.create(binsz=0.05 * u.deg, width=5 * u.deg, axes=[axis_1]) kgeom = WcsGeom.create(binsz=0.05 * u.deg, width=0.5 * u.deg, axes=[axis_2]) kernel = PSFKernel.from_gauss(kgeom, sigma=0.1 * 
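# --- A minimal sketch (not from the source tree): 2D convolution with a
# peak-normalized kernel, mirroring the Box2DKernel assertions in
# test_convolve_nd. A single filled pixel becomes a 3x3 patch summing to 9.
import numpy as np
from astropy.convolution import Box2DKernel
from gammapy.maps import WcsNDMap

m = WcsNDMap.create(binsz=0.02, width=2.0)
m.data[50, 50] = 1.0
kernel = Box2DKernel(3, mode="center")
kernel.normalize("peak")  # every kernel entry becomes 1
mc = m.convolve(kernel.array)
assert np.allclose(mc.data.sum(), 9.0, atol=1e-5)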
u.deg, max_radius=1.5 * u.deg) with pytest.raises(ValueError): m.convolve(kernel) def test_plot(): axis = MapAxis([0, 1], node_type="edges") m = WcsNDMap.create(binsz=0.1 * u.deg, width=1 * u.deg, axes=[axis]) with mpl_plot_check(): m.plot(add_cbar=True) def test_plot_grid(): axis = MapAxis([0, 1, 2], node_type="edges") m = WcsNDMap.create(binsz=0.1 * u.deg, width=1 * u.deg, axes=[axis]) with mpl_plot_check(): m.plot_grid() def test_plot_grid_label_axis(): axis = LabelMapAxis(labels=["d1", "d2"], name="dataset") m = WcsNDMap.create(width="5 deg", axes=[axis]) with mpl_plot_check(): m.plot_grid() def test_plot_grid_time_axis(): time_axis = TimeMapAxis( edges_min=[1, 10] * u.day, edges_max=[2, 13] * u.day, reference_time=Time("2020-03-19"), ) m = WcsNDMap.create(width="5 deg", axes=[time_axis]) with mpl_plot_check(): m.plot_grid() def test_plot_allsky(): axis = MapAxis([0, 1], node_type="edges") m = WcsNDMap.create(binsz=10 * u.deg, axes=[axis]) with mpl_plot_check(): m.plot() def test_plot_nan(): m = Map.create(width=10, binsz=1) m.data += np.nan with mpl_plot_check(): m.plot(add_cbar=False) def test_get_spectrum(): axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy") geom = WcsGeom.create( skydir=(0, 0), width=(2.5, 2.5), binsz=0.5, axes=[axis], frame="galactic" ) m = Map.from_geom(geom) m.data += 1 center = SkyCoord(0, 0, frame="galactic", unit="deg") region = CircleSkyRegion(center=center, radius=1 * u.deg) spec = m.get_spectrum(region=region) assert_allclose(spec.data.squeeze(), [13.0, 13.0, 13.0]) spec = m.get_spectrum(region=region, func=np.mean) assert_allclose(spec.data.squeeze(), [1.0, 1.0, 1.0]) spec = m.get_spectrum() assert isinstance(spec.geom.region, RectangleSkyRegion) region = PointSkyRegion(center) spec = m.get_spectrum(region=region) assert_allclose(spec.data.squeeze(), [1.0, 1.0, 1.0]) def test_get_spectrum_type(): axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy") geom = WcsGeom.create( skydir=(0, 0), width=(2.5, 2.5), binsz=0.5, axes=[axis], frame="galactic" ) m_int = Map.from_geom(geom, dtype="int") m_int.data += 1 m_bool = Map.from_geom(geom, dtype="bool") m_bool.data += True center = SkyCoord(0, 0, frame="galactic", unit="deg") region = CircleSkyRegion(center=center, radius=1 * u.deg) spec_int = m_int.get_spectrum(region=region) assert spec_int.data.dtype == np.dtype("int") assert_allclose(spec_int.data.squeeze(), [13, 13, 13]) spec_bool = m_bool.get_spectrum(region=region, func=np.any) assert spec_bool.data.dtype == np.dtype("bool") assert_allclose(spec_bool.data.squeeze(), [1, 1, 1]) def test_get_spectrum_weights(): axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy") geom = WcsGeom.create( skydir=(0, 0), width=(2.5, 2.5), binsz=0.5, axes=[axis], frame="galactic" ) m_int = Map.from_geom(geom, dtype="int") m_int.data += 1 weights = Map.from_geom(geom, dtype="bool") weights.data[:, 2, 2] = True bad_weights = Map.from_geom(geom.to_image(), dtype="bool") center = SkyCoord(0, 0, frame="galactic", unit="deg") region = CircleSkyRegion(center=center, radius=1 * u.deg) spec_int = m_int.get_spectrum(region=region, weights=weights) assert spec_int.data.dtype == np.dtype("int") assert_allclose(spec_int.data.squeeze(), [1, 1, 1]) with pytest.raises(ValueError): m_int.get_spectrum(region=region, weights=bad_weights) def get_npred_map(): position = SkyCoord(0.0, 0.0, frame="galactic", unit="deg") energy_axis = MapAxis.from_bounds( 1, 100, nbin=30, unit="TeV", name="energy_true", interp="log" ) exposure = Map.create( binsz=0.02, 
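# --- A minimal sketch (not from the source tree): spectrum extraction with
# get_spectrum, as in test_get_spectrum. By default the values inside the
# region are summed per energy bin; func=np.mean averages them instead.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import CircleSkyRegion
from gammapy.maps import Map, MapAxis, WcsGeom

axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy")
geom = WcsGeom.create(
    skydir=(0, 0), width=(2.5, 2.5), binsz=0.5, axes=[axis], frame="galactic"
)
m = Map.from_geom(geom)
m.data += 1.0
region = CircleSkyRegion(SkyCoord(0, 0, frame="galactic", unit="deg"), 1 * u.deg)
spec_sum = m.get_spectrum(region=region)  # 13 pixels fall inside the circle
spec_mean = m.get_spectrum(region=region, func=np.mean)
assert np.allclose(spec_sum.data.squeeze(), 13.0)
assert np.allclose(spec_mean.data.squeeze(), 1.0)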
map_type="wcs", skydir=position, width="2 deg", axes=[energy_axis], frame="galactic", unit="cm2 s", ) spatial_model = GaussianSpatialModel( lon_0="0.015 deg", lat_0="-0.037 deg", sigma="0.2 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") skymodel = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) exposure.data = 1e14 * np.ones(exposure.data.shape) evaluator = MapEvaluator(model=skymodel, exposure=exposure) npred = evaluator.compute_npred() return evaluator, npred def test_map_sampling(): eval, npred = get_npred_map() nmap = WcsNDMap(geom=eval.geom, data=npred.data) coords = nmap.sample_coord(n_events=2, random_state=0) assert len(coords["lon"]) == 2 assert_allclose(coords.skycoord.icrs.ra.deg, [266.204197, 266.451241], rtol=1e-5) assert_allclose(coords.skycoord.icrs.dec.deg, [-28.862369, -29.075469], rtol=1e-5) assert_allclose(coords["energy_true"].data, [2.363293, 2.342388], rtol=1e-5) assert coords["lon"].unit == "deg" assert coords["lat"].unit == "deg" assert coords["energy_true"].unit == "TeV" def test_map_interp_one_bin(): m = WcsNDMap.create(npix=(2, 1)) m.data = np.array([[1, 2]]) coords = {"lon": 0, "lat": [0, 0]} data = m.interp_by_coord(coords) assert data.shape == (2,) assert_allclose(data, 1.5) def test_sum_over_axes(): # Check summing over a specific axis ax1 = MapAxis.from_nodes([1, 2, 3, 4], name="ax1") ax2 = MapAxis.from_nodes([5, 6, 7], name="ax2") ax3 = MapAxis.from_nodes([8, 9], name="ax3") geom = WcsGeom.create(npix=(5, 5), axes=[ax1, ax2, ax3]) m1 = Map.from_geom(geom=geom) m1.data = np.ones(m1.data.shape) m2 = m1.sum_over_axes(axes_names=["ax1", "ax3"], keepdims=True) assert_allclose(m2.geom.data_shape, (1, 3, 1, 5, 5)) assert_allclose(m2.data[0][0][0][0][0], 8.0) m3 = m1.sum_over_axes(axes_names=["ax3", "ax2"], keepdims=False) assert_allclose(m3.geom.data_shape, (4, 5, 5)) assert_allclose(m3.data[0][0][0], 6.0) def test_reduce(): # Check summing over a specific axis ax1 = MapAxis.from_nodes([1, 2, 3, 4], name="ax1") ax2 = MapAxis.from_nodes([5, 6, 7], name="ax2") ax3 = MapAxis.from_nodes([8, 9], name="ax3") geom = WcsGeom.create(npix=(5, 5), axes=[ax1, ax2, ax3]) m1 = Map.from_geom(geom=geom) m1.data = np.ones(m1.data.shape) m2 = m1.reduce(axis_name="ax1", keepdims=True) assert_allclose(m2.geom.data_shape, (2, 3, 1, 5, 5)) assert_allclose(m2.data[0][0][0][0][0], 4.0) m3 = m1.reduce(axis_name="ax1", keepdims=False) assert_allclose(m3.geom.data_shape, (2, 3, 5, 5)) assert_allclose(m3.data[0][0][0][0], 4.0) def test_to_cube(): ax1 = MapAxis.from_nodes([1, 2, 3, 4], name="ax1") ax2 = MapAxis.from_edges([5, 6], name="ax2") ax3 = MapAxis.from_edges([8, 9], name="ax3") geom = WcsGeom.create(npix=(5, 5), axes=[ax1]) m1 = Map.from_geom(geom=geom, data=np.ones(geom.data_shape)) m2 = m1.to_cube([ax2, ax3]) assert_allclose(m2.geom.data_shape, (1, 1, 4, 5, 5)) # test that more than one bin fails ax4 = MapAxis.from_edges([8, 9, 10], name="ax4") with pytest.raises(ValueError): m1.to_cube([ax4]) def test_stack_unit_handling(): m = WcsNDMap.create(npix=(3, 3), unit="m2 s") m.data += 1 m_other = WcsNDMap.create(npix=(3, 3), unit="cm2 s") m_other.data += 1 m.stack(m_other) assert_allclose(m.data, 1.0001) def test_binary_erode(): geom = WcsGeom.create(binsz=0.02, width=2 * u.deg) mask = geom.region_mask("icrs;circle(0, 0, 1)") mask = mask.binary_erode(width=0.2 * u.deg, kernel="disk", use_fft=False) assert_allclose(mask.data.sum(), 4832) mask = mask.binary_erode(width=0.2 * u.deg, kernel="box", use_fft=True) # Due to 
fft noise the result is not exact here. # See https://github.com/gammapy/gammapy/issues/3662 assert_allclose(mask.data.sum(), 3372, atol=20) def test_binary_dilate(): geom = WcsGeom.create(binsz=0.02, width=2 * u.deg) mask = geom.region_mask("icrs;circle(0, 0, 0.8)") mask = mask.binary_dilate(width=0.2 * u.deg, kernel="disk", use_fft=False) assert_allclose(mask.data.sum(), 8048) mask = mask.binary_dilate(width=(10, 10), kernel="box") # Due to fft noise the result is not exact here. # See https://github.com/gammapy/gammapy/issues/3662 assert_allclose(mask.data.sum(), 9203, atol=20) def test_binary_dilate_erode_3d(): axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=2) geom = WcsGeom.create( binsz=0.02, width=(2, 2), frame="icrs", axes=[axis], ) mask = Map.from_geom(geom=geom, dtype=bool) mask.data |= True mask_fit = mask.binary_erode(width=(0.3 * u.deg, 0.1 * u.deg)) assert np.sum(mask_fit.data) == 9800 mask = geom.boundary_mask(width=(0.3 * u.deg, 0.1 * u.deg)) mask = mask.binary_dilate(width=(0.6 * u.deg, 0.2 * u.deg)) assert np.sum(mask.data) == np.prod(mask.data.shape) def test_memory_usage(): geom = WcsGeom.create() assert geom.data_nbytes().unit == u.MB assert_allclose(geom.data_nbytes(dtype="float32").value, 1.0368) assert_allclose(geom.data_nbytes(dtype="b").value, 0.2592) def test_double_cutout(): # regression test for https://github.com/gammapy/gammapy/issues/3368 m = Map.create(width="10 deg") m.data = np.arange(10_000, dtype="float").reshape((100, 100)) position = SkyCoord("1d", "1d") m_c = m.cutout(position=position, width="3 deg") m_cc = m_c.cutout(position=position, width="2 deg") m_new = Map.create(width="10 deg") m_new.stack(m_cc) m_c_new = m_new.cutout(position=position, width="2 deg") np.testing.assert_allclose(m_c_new.data, m_cc.data) def test_to_region_nd_map_histogram_basic(): random_state = np.random.RandomState(seed=0) energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) data = Map.create(axes=[energy_axis], width=10, unit="cm2 s-1", binsz=0.02) data.data = random_state.normal( size=data.data.shape, loc=0, scale=np.array([1.0, 2.0, 3.0]).reshape((-1, 1, 1)) ) hist = data.to_region_nd_map_histogram() assert hist.data.shape == (3, 100, 1, 1) assert hist.unit == "" assert hist.geom.axes[0].name == "bins" assert_allclose(hist.data[:, 50, 0, 0], [29019, 14625, 9794]) hist = data.to_region_nd_map_histogram(density=True, nbin=50) assert hist.data.shape == (3, 50, 1, 1) assert hist.unit == 1 / (u.cm**2 / u.s) assert hist.geom.axes[0].name == "bins" assert_allclose(hist.data[:, 25, 0, 0], [0.378215, 0.196005, 0.131782], atol=1e-5) def test_to_region_nd_map_histogram_advanced(): random_state = np.random.RandomState(seed=0) energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3) label_axis = LabelMapAxis(["a", "b"], name="label") data = Map.create(axes=[energy_axis, label_axis], width=10, binsz=0.02) data.data = random_state.normal( size=data.data.shape, loc=0, scale=np.array([[1.0, 2.0, 3.0]]).reshape((-1, 1, 1)), ) region = CircleSkyRegion(center=SkyCoord(0, 0, unit="deg"), radius=3 * u.deg) bins_axis = MapAxis.from_edges([-2, -1, 0, 1, 2], name="my-bins") hist = data.to_region_nd_map_histogram(region=region, bins_axis=bins_axis) assert hist.data.shape == (2, 3, 4, 1, 1) assert hist.unit == "" assert hist.geom.axes[0].name == "my-bins" assert_allclose( hist.data[:, :, 2, 0, 0], [[24189, 13587, 9212], [24053, 13410, 9186]] ) def test_plot_mask(): axis = MapAxis.from_energy_bounds("0.1 TeV", "10 TeV", nbin=2) geom = WcsGeom.create( 
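# --- A minimal sketch (not from the source tree): per-plane value histograms
# with to_region_nd_map_histogram, as tested above. Each energy plane is
# histogrammed (here over the full map) into a new "bins" axis.
import numpy as np
from gammapy.maps import Map, MapAxis

rng = np.random.RandomState(seed=0)
energy_axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
data = Map.create(axes=[energy_axis], width=10, binsz=0.02)
data.data = rng.normal(size=data.data.shape)
hist = data.to_region_nd_map_histogram(nbin=50)
assert hist.data.shape == (3, 50, 1, 1)
assert hist.geom.axes[0].name == "bins"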
binsz=0.02, width=(2, 2), frame="icrs", axes=[axis], ) mask = Map.from_geom(geom=geom, dtype=bool) mask.data[1] |= True with mpl_plot_check(): mask.plot_grid() def test_histogram_center_value(): evt_energy = np.arange(100.0, 105, 0.5) * u.GeV N_evt = evt_energy.shape[0] center = SkyCoord(0, 0, unit="deg", frame="icrs") radec = SkyCoord( [center.ra] * N_evt, [center.dec] * N_evt, unit="deg" ) # fake position list energy_axis = MapAxis.from_edges(np.arange(100, 106, 1) * u.GeV, name="energy") map1 = WcsNDMap.create( skydir=center, binsz=0.1 * u.deg, width=0.1 * u.deg, axes=[energy_axis] ) map1.fill_by_coord({"skycoord": radec, "energy": evt_energy}) assert_allclose(map1.data[0:2], [[[2.0]], [[2.0]]]) map1 = WcsNDMap.create( skydir=center, binsz=1 * u.deg, width=3 * u.deg, ) map1.fill_by_pix([[0.0, 0.5, 1.0, 1.5], [0.0, 0.5, 1.0, 1.5]]) assert_allclose(map1.data, [[1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 1.0]]) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.228642 gammapy-1.3/gammapy/modeling/0000755000175100001770000000000014721316215015650 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/__init__.py0000644000175100001770000000112614721316200017753 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Models and fitting.""" from .covariance import Covariance from .fit import CovarianceResult, Fit, FitResult, OptimizeResult from .parameter import Parameter, Parameters, PriorParameter, PriorParameters from .scipy import stat_profile_ul_scipy from .selection import select_nested_models __all__ = [ "Covariance", "Fit", "FitResult", "OptimizeResult", "CovarianceResult", "Parameter", "Parameters", "select_nested_models", "PriorParameter", "PriorParameters", "stat_profile_ul_scipy", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/covariance.py0000644000175100001770000001536014721316200020333 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Covariance class.""" import numpy as np import scipy import matplotlib.pyplot as plt from gammapy.utils.parallel import is_ray_initialized from .parameter import Parameters __all__ = ["Covariance"] class Covariance: """Parameter covariance class. Parameters ---------- parameters : `~gammapy.modeling.Parameters` Parameter list. data : `~numpy.ndarray` Covariance data array. 
""" def __init__(self, parameters, data=None): self.parameters = parameters if data is None: data = np.diag([p.error**2 for p in self.parameters]) self._data = np.asanyarray(data, dtype=float) @property def shape(self): """Covariance shape.""" npars = len(self.parameters) return npars, npars @property def data(self): """Covariance data as a `~numpy.ndarray`.""" return self._data @data.setter def data(self, value): value = np.asanyarray(value) npars = len(self.parameters) shape = (npars, npars) if value.shape != shape: raise ValueError( f"Invalid covariance shape: {value.shape}, expected {shape}" ) self._data = value @staticmethod def _expand_factor_matrix(matrix, parameters): """Expand covariance matrix with zeros for frozen parameters.""" npars = len(parameters) matrix_expanded = np.zeros((npars, npars)) mask_frozen = [par.frozen for par in parameters] pars_index = [np.where(np.array(parameters) == p)[0][0] for p in parameters] mask_duplicate = [pars_idx != idx for idx, pars_idx in enumerate(pars_index)] mask = np.array(mask_frozen) | np.array(mask_duplicate) free_parameters = ~(mask | mask[:, np.newaxis]) matrix_expanded[free_parameters] = matrix.ravel() return matrix_expanded @classmethod def from_factor_matrix(cls, parameters, matrix): """Set covariance from factor covariance matrix. Used in the optimizer interface. """ npars = len(parameters) if not matrix.shape == (npars, npars): matrix = cls._expand_factor_matrix(matrix, parameters) scales = [par.scale for par in parameters] scale_matrix = np.outer(scales, scales) data = scale_matrix * matrix return cls(parameters, data=data) @classmethod def from_stack(cls, covar_list): """Stack sub-covariance matrices from list. Parameters ---------- covar_list : list of `Covariance` List of sub-covariances. Returns ------- covar : `Covariance` Stacked covariance. """ parameters = Parameters.from_stack([_.parameters for _ in covar_list]) covar = cls(parameters) for subcovar in covar_list: covar.set_subcovariance(subcovar) return covar def get_subcovariance(self, parameters): """Get sub-covariance matrix. Parameters ---------- parameters : `Parameters` Sub list of parameters. Returns ------- covariance : `~numpy.ndarray` Sub-covariance. """ idx = [self.parameters.index(par) for par in parameters] data = self._data[np.ix_(idx, idx)] return self.__class__(parameters=parameters, data=data) def set_subcovariance(self, covar): """Set sub-covariance matrix. Parameters ---------- covar : `Covariance` Sub-covariance. """ if is_ray_initialized(): # This copy is required to make the covariance setting work with ray self._data = self._data.copy() idx = [self.parameters.index(par) for par in covar.parameters] if not np.allclose(self.data[np.ix_(idx, idx)], covar.data): self.data[idx, :] = 0 self.data[:, idx] = 0 self._data[np.ix_(idx, idx)] = covar.data def plot_correlation(self, figsize=None, **kwargs): """Plot correlation matrix. Parameters ---------- figsize : tuple, optional Figure size. Default is None, which takes (number_params*0.9, number_params*0.7). **kwargs : dict Keyword arguments passed to `~gammapy.visualization.plot_heatmap`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. 
""" from gammapy.visualization import annotate_heatmap, plot_heatmap npars = len(self.parameters) figsize = (npars * 0.9, npars * 0.7) if figsize is None else figsize plt.figure(figsize=figsize) ax = plt.gca() kwargs.setdefault("cmap", "coolwarm") names = self.parameters.names im, cbar = plot_heatmap( data=self.correlation, col_labels=names, row_labels=names, ax=ax, vmin=-1, vmax=1, cbarlabel="Correlation", **kwargs, ) annotate_heatmap(im=im) return ax @property def correlation(self): r"""Correlation matrix as a `numpy.ndarray`. Correlation :math:`C` is related to covariance :math:`\Sigma` via: .. math:: C_{ij} = \frac{ \Sigma_{ij} }{ \sqrt{\Sigma_{ii} \Sigma_{jj}} } """ err = np.sqrt(np.diag(self.data)) with np.errstate(invalid="ignore", divide="ignore"): correlation = self.data / np.outer(err, err) return np.nan_to_num(correlation) @property def scipy_mvn(self): return scipy.stats.multivariate_normal( self.parameters.value, self.data, allow_singular=True ) def __str__(self): return str(self.data) def __array__(self): return self.data class CovarianceMixin: """Mixin class for covariance property on multi-components models""" def _check_covariance(self): if not self.parameters == self._covariance.parameters: self._covariance = Covariance.from_stack( [model.covariance for model in self._models] ) @property def covariance(self): """Covariance as a `~gammapy.modeling.Covariance` object.""" self._check_covariance() for model in self._models: self._covariance.set_subcovariance(model.covariance) return self._covariance @covariance.setter def covariance(self, covariance): self._check_covariance() self._covariance.data = covariance for model in self._models: subcovar = self._covariance.get_subcovariance(model.covariance.parameters) model.covariance = subcovar ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/fit.py0000644000175100001770000006773614721316200017021 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import itertools import logging import numpy as np from astropy.table import Table from gammapy.utils.pbar import progress_bar from .covariance import Covariance from .iminuit import ( confidence_iminuit, contour_iminuit, covariance_iminuit, optimize_iminuit, ) from .scipy import confidence_scipy, optimize_scipy from .sherpa import optimize_sherpa __all__ = ["Fit", "FitResult", "OptimizeResult", "CovarianceResult"] log = logging.getLogger(__name__) class Registry: """Registry of available backends for given tasks. Gives users the power to extend from their scripts. Used by `Fit` below. Not sure if we should call it "backend" or "method" or something else. Probably we will code up some methods, e.g. for profile analysis ourselves, using scipy or even just Python / Numpy? """ register = { "optimize": { "minuit": optimize_iminuit, "sherpa": optimize_sherpa, "scipy": optimize_scipy, }, "covariance": { "minuit": covariance_iminuit, # "sherpa": covariance_sherpa, # "scipy": covariance_scipy, }, "confidence": { "minuit": confidence_iminuit, # "sherpa": confidence_sherpa, "scipy": confidence_scipy, }, } @classmethod def get(cls, task, backend): if task not in cls.register: raise ValueError(f"Unknown task {task!r}") backend_options = cls.register[task] if backend not in backend_options: raise ValueError(f"Unknown backend {backend!r} for task {task!r}") return backend_options[backend] registry = Registry() class Fit: """Fit class. 
The fit class provides a uniform interface to multiple fitting backends. Currently available: "minuit", "sherpa" and "scipy". Parameters ---------- backend : {"minuit", "scipy", "sherpa"} Global backend used for fitting. Default is "minuit". optimize_opts : dict Keyword arguments passed to the optimizer. For the `"minuit"` backend see https://iminuit.readthedocs.io/en/stable/reference.html#iminuit.Minuit for a detailed description of the available options. If there is an entry 'migrad_opts', those options will be passed to `iminuit.Minuit.migrad()`. For the `"sherpa"` backend you can choose from the options: * `"simplex"` * `"levmar"` * `"moncar"` * `"gridsearch"` Those methods are described and compared in detail on http://cxc.cfa.harvard.edu/sherpa/methods/index.html. The available options of the optimization methods are described on the following pages in detail: * http://cxc.cfa.harvard.edu/sherpa/ahelp/neldermead.html * http://cxc.cfa.harvard.edu/sherpa/ahelp/montecarlo.html * http://cxc.cfa.harvard.edu/sherpa/ahelp/gridsearch.html * http://cxc.cfa.harvard.edu/sherpa/ahelp/levmar.html For the `"scipy"` backend the available options are described in detail here: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html covariance_opts : dict Covariance options passed to the given backend. confidence_opts : dict Extra arguments passed to the backend. E.g. `iminuit.Minuit.minos` supports a ``maxcall`` option. For the scipy backend ``confidence_opts`` are forwarded to `~scipy.optimize.brentq`. If the confidence estimation fails, the bracketing interval can be adapted by modifying the upper bound of the interval (``b``) value. store_trace : bool Whether to store the trace of the fit. """ def __init__( self, backend="minuit", optimize_opts=None, covariance_opts=None, confidence_opts=None, store_trace=False, ): self.store_trace = store_trace self.backend = backend if optimize_opts is None: optimize_opts = {"backend": backend} if covariance_opts is None: covariance_opts = {"backend": backend} if confidence_opts is None: confidence_opts = {"backend": backend} self.optimize_opts = optimize_opts self.covariance_opts = covariance_opts self.confidence_opts = confidence_opts self._minuit = None def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @staticmethod def _parse_datasets(datasets): from gammapy.datasets import Dataset, Datasets if isinstance(datasets, (list, Dataset)): datasets = Datasets(datasets) return datasets, datasets.parameters def run(self, datasets): """Run all fitting steps. Parameters ---------- datasets : `Datasets` or list of `Dataset` Datasets to optimize. Returns ------- fit_result : `FitResult` Fit result. """ datasets, parameters = self._parse_datasets(datasets=datasets) optimize_result = self.optimize(datasets=datasets) if self.backend not in registry.register["covariance"]: log.warning("No covariance estimate - not supported by this backend.") return FitResult(optimize_result=optimize_result) covariance_result = self.covariance( datasets=datasets, optimize_result=optimize_result ) optimize_result.models.covariance = Covariance( optimize_result.models.parameters, covariance_result.matrix ) datasets._covariance = Covariance(parameters, covariance_result.matrix) return FitResult( optimize_result=optimize_result, covariance_result=covariance_result, ) def optimize(self, datasets): """Run the optimization. Parameters ---------- datasets : `Datasets` or list of `Dataset` Datasets to optimize. Returns ------- optimize_result : `OptimizeResult` Optimization result. """ datasets, parameters = self._parse_datasets(datasets=datasets) datasets.parameters.check_limits() if len(parameters.free_parameters.names) == 0: raise ValueError("No free parameters for fitting") parameters.autoscale() kwargs = self.optimize_opts.copy() backend = kwargs.pop("backend", self.backend) compute = registry.get("optimize", backend) # TODO: change this calling interface! # probably should pass a fit statistic, which has a model, which has parameters # and return something simpler, not a tuple of three things factors, info, optimizer = compute( parameters=parameters, function=datasets.stat_sum, store_trace=self.store_trace, **kwargs, ) if backend == "minuit": self._minuit = optimizer kwargs["method"] = "migrad" trace = Table(info.pop("trace")) if self.store_trace: idx = [ parameters.index(par) for par in parameters.unique_parameters.free_parameters ] unique_names = np.array(datasets.models.parameters_unique_names)[idx] trace.rename_columns(trace.colnames[1:], list(unique_names)) # Copy final results into the parameters object parameters.set_parameter_factors(factors) parameters.check_limits() return OptimizeResult( models=datasets.models.copy(), total_stat=datasets.stat_sum(), backend=backend, method=kwargs.get("method", backend), trace=trace, minuit=optimizer, **info, ) def covariance(self, datasets, optimize_result=None): """Estimate the covariance matrix. Assumes that the model parameters are already optimised. Parameters ---------- datasets : `Datasets` or list of `Dataset` Datasets to optimize. optimize_result : `OptimizeResult`, optional Optimization result. Can be optionally used to pass the state of the IMinuit object to the covariance estimation. This might save computation time in certain cases. Default is None. Returns ------- result : `CovarianceResult` Results. 
""" datasets, unique_pars = self._parse_datasets(datasets=datasets) parameters = datasets.models.parameters kwargs = self.covariance_opts.copy() if optimize_result is not None and optimize_result.backend == "minuit": kwargs["minuit"] = optimize_result.minuit backend = kwargs.pop("backend", self.backend) compute = registry.get("covariance", backend) with unique_pars.restore_status(): if self.backend == "minuit": method = "hesse" else: method = "" factor_matrix, info = compute( parameters=unique_pars, function=datasets.stat_sum, **kwargs ) matrix = Covariance.from_factor_matrix( parameters=parameters, matrix=factor_matrix ) datasets.models.covariance = matrix if optimize_result: optimize_result.models.covariance = matrix.data.copy() return CovarianceResult( backend=backend, method=method, success=info["success"], message=info["message"], matrix=matrix.data, ) def confidence(self, datasets, parameter, sigma=1, reoptimize=True): """Estimate confidence interval. Extra ``kwargs`` are passed to the backend. E.g. `iminuit.Minuit.minos` supports a ``maxcall`` option. For the scipy backend ``kwargs`` are forwarded to `~scipy.optimize.brentq`. If the confidence estimation fails, the bracketing interval can be adapted by modifying the upper bound of the interval (``b``) value. Parameters ---------- datasets : `Datasets` or list of `Dataset` Datasets to optimize. parameter : `~gammapy.modeling.Parameter` Parameter of interest. sigma : float, optional Number of standard deviations for the confidence level. Default is 1. reoptimize : bool, optional Re-optimize other parameters, when computing the confidence region. Default is True. Returns ------- result : dict Dictionary with keys "errp", 'errn", "success" and "nfev". """ datasets, parameters = self._parse_datasets(datasets=datasets) kwargs = self.confidence_opts.copy() backend = kwargs.pop("backend", self.backend) compute = registry.get("confidence", backend) parameter = parameters[parameter] with parameters.restore_status(): result = compute( parameters=parameters, parameter=parameter, function=datasets.stat_sum, sigma=sigma, reoptimize=reoptimize, **kwargs, ) result["errp"] *= parameter.scale result["errn"] *= parameter.scale return result def stat_profile(self, datasets, parameter, reoptimize=False): """Compute fit statistic profile. The method used is to vary one parameter, keeping all others fixed. So this is taking a "slice" or "scan" of the fit statistic. Notes ----- The progress bar can be displayed for this function. Parameters ---------- datasets : `Datasets` or list of `Dataset` Datasets to optimize. parameter : `~gammapy.modeling.Parameter` Parameter of interest. The specification for the scan, such as bounds and number of values is taken from the parameter object. reoptimize : bool, optional Re-optimize other parameters, when computing the confidence region. Default is False. Returns ------- results : dict Dictionary with keys "parameter_name_scan", "stat_scan" and "fit_results". The latter contains an empty list, if `reoptimize` is set to False. Examples -------- >>> from gammapy.datasets import Datasets, SpectrumDatasetOnOff >>> from gammapy.modeling.models import SkyModel, LogParabolaSpectralModel >>> from gammapy.modeling import Fit >>> datasets = Datasets() >>> for obs_id in [23523, 23526]: ... dataset = SpectrumDatasetOnOff.read( ... f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits" ... ) ... 
datasets.append(dataset) >>> datasets = datasets.stack_reduce(name="HESS") >>> model = SkyModel(spectral_model=LogParabolaSpectralModel(), name="crab") >>> datasets.models = model >>> fit = Fit() >>> result = fit.run(datasets) >>> parameter = datasets.models.parameters['amplitude'] >>> stat_profile = fit.stat_profile(datasets=datasets, parameter=parameter) """ datasets, parameters = self._parse_datasets(datasets=datasets) parameter = parameters[parameter] values = parameter.scan_values stats = [] fit_results = [] with parameters.restore_status(): for value in progress_bar(values, desc="Scan values"): parameter.value = value if reoptimize: parameter.frozen = True result = self.optimize(datasets=datasets) stat = result.total_stat fit_results.append(result) else: stat = datasets.stat_sum() stats.append(stat) idx = datasets.parameters.index(parameter) name = datasets.models.parameters_unique_names[idx] return { f"{name}_scan": values, "stat_scan": np.array(stats), "fit_results": fit_results, } def stat_surface(self, datasets, x, y, reoptimize=False): """Compute fit statistic surface. The method used is to vary two parameters, keeping all others fixed. So this is taking a "slice" or "scan" of the fit statistic. Caveat: This method can be very computationally intensive and slow See also: `Fit.stat_contour`. Notes ----- The progress bar can be displayed for this function. Parameters ---------- datasets : `Datasets` or list of `Dataset` Datasets to optimize. x, y : `~gammapy.modeling.Parameter` Parameters of interest. reoptimize : bool, optional Re-optimize other parameters, when computing the confidence region. Default is False. Returns ------- results : dict Dictionary with keys "x_values", "y_values", "stat" and "fit_results". The latter contains an empty list, if `reoptimize` is set to False. Examples -------- >>> from gammapy.datasets import Datasets, SpectrumDatasetOnOff >>> from gammapy.modeling.models import SkyModel, LogParabolaSpectralModel >>> from gammapy.modeling import Fit >>> import numpy as np >>> datasets = Datasets() >>> for obs_id in [23523, 23526]: ... dataset = SpectrumDatasetOnOff.read( ... f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits" ... ) ... datasets.append(dataset) >>> datasets = datasets.stack_reduce(name="HESS") >>> model = SkyModel(spectral_model=LogParabolaSpectralModel(), name="crab") >>> datasets.models = model >>> par_alpha = datasets.models.parameters["alpha"] >>> par_beta = datasets.models.parameters["beta"] >>> par_alpha.scan_values = np.linspace(1.55, 2.7, 20) >>> par_beta.scan_values = np.linspace(-0.05, 0.55, 20) >>> fit = Fit() >>> stat_surface = fit.stat_surface( ... datasets=datasets, ... x=par_alpha, ... y=par_beta, ... reoptimize=False, ... 
) """ datasets, parameters = self._parse_datasets(datasets=datasets) x = parameters[x] y = parameters[y] stats = [] fit_results = [] with parameters.restore_status(): for x_value, y_value in progress_bar( itertools.product(x.scan_values, y.scan_values), desc="Trial values" ): x.value, y.value = x_value, y_value if reoptimize: x.frozen, y.frozen = True, True result = self.optimize(datasets=datasets) stat = result.total_stat fit_results.append(result) else: stat = datasets.stat_sum() stats.append(stat) shape = (len(x.scan_values), len(y.scan_values)) stats = np.array(stats).reshape(shape) if reoptimize: fit_results = np.array(fit_results).reshape(shape) i1, i2 = datasets.parameters.index(x), datasets.parameters.index(y) name_x = datasets.models.parameters_unique_names[i1] name_y = datasets.models.parameters_unique_names[i2] return { f"{name_x}_scan": x.scan_values, f"{name_y}_scan": y.scan_values, "stat_scan": stats, "fit_results": fit_results, } def stat_contour(self, datasets, x, y, numpoints=10, sigma=1): """Compute stat contour. Calls ``iminuit.Minuit.mncontour``. This is a contouring algorithm for a 2D function which is not simply the fit statistic function. That 2D function is given at each point ``(par_1, par_2)`` by re-optimising all other free parameters, and taking the fit statistic at that point. Very compute-intensive and slow. Parameters ---------- datasets : `Datasets` or list of `Dataset` Datasets to optimize. x, y : `~gammapy.modeling.Parameter` Parameters of interest. numpoints : int, optional Number of contour points. Default is 10. sigma : float, optional Number of standard deviations for the confidence level. Default is 1. Returns ------- result : dict Dictionary containing the parameter values defining the contour, with the boolean flag "success" and the information objects from ``mncontour``. Examples -------- >>> from gammapy.datasets import Datasets, SpectrumDatasetOnOff >>> from gammapy.modeling.models import SkyModel, LogParabolaSpectralModel >>> from gammapy.modeling import Fit >>> datasets = Datasets() >>> for obs_id in [23523, 23526]: ... dataset = SpectrumDatasetOnOff.read( ... f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits" ... ) ... datasets.append(dataset) >>> datasets = datasets.stack_reduce(name="HESS") >>> model = SkyModel(spectral_model=LogParabolaSpectralModel(), name="crab") >>> datasets.models = model >>> fit = Fit(backend='minuit') >>> optimize = fit.optimize(datasets) >>> stat_contour = fit.stat_contour( ... datasets=datasets, ... x=model.spectral_model.alpha, ... y=model.spectral_model.amplitude, ... 
) """ datasets, parameters = self._parse_datasets(datasets=datasets) x = parameters[x] y = parameters[y] i1, i2 = datasets.parameters.index(x), datasets.parameters.index(y) name_x = datasets.models.parameters_unique_names[i1] name_y = datasets.models.parameters_unique_names[i2] with parameters.restore_status(): result = contour_iminuit( parameters=parameters, function=datasets.stat_sum, x=x, y=y, numpoints=numpoints, sigma=sigma, ) x = result["x"] * x.scale y = result["y"] * y.scale return { name_x: x, name_y: y, "success": result["success"], } class FitStepResult: """Fit result base class.""" def __init__(self, backend, method, success, message): self._success = success self._message = message self._backend = backend self._method = method @property def backend(self): """Optimizer backend used for the fit.""" return self._backend @property def method(self): """Optimizer method used for the fit.""" return self._method @property def success(self): """Fit success status flag.""" return self._success @property def message(self): """Optimizer status message.""" return self._message def __str__(self): return ( f"{self.__class__.__name__}\n\n" f"\tbackend : {self.backend}\n" f"\tmethod : {self.method}\n" f"\tsuccess : {self.success}\n" f"\tmessage : {self.message}\n" ) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def to_dict(self): """Convert to dictionary.""" return { self.__class__.__name__: { "backend": self.backend, "method": self.method, "success": self.success, "message": self.message, } } class CovarianceResult(FitStepResult): """Covariance result object. Parameters ---------- matrix : `~numpy.ndarray`, optional The covariance matrix. Default is None. kwargs : dict Extra ``kwargs`` are passed to the backend. """ def __init__(self, matrix=None, **kwargs): self._matrix = matrix super().__init__(**kwargs) @property def matrix(self): """Covariance matrix as a `~numpy.ndarray`.""" return self._matrix class OptimizeResult(FitStepResult): """Optimize result object. Parameters ---------- models : `~gammapy.modeling.models.DatasetModels` Best fit models. nfev : int Number of function evaluations. total_stat : float Value of the fit statistic at minimum. trace : `~astropy.table.Table` Parameter trace from the optimisation. minuit : `~iminuit.minuit.Minuit`, optional Minuit object. Default is None. kwargs : dict Extra ``kwargs`` are passed to the backend. """ def __init__(self, models, nfev, total_stat, trace, minuit=None, **kwargs): self._models = models self._nfev = nfev self._total_stat = total_stat self._trace = trace self._minuit = minuit super().__init__(**kwargs) @property def minuit(self): """Minuit object.""" return self._minuit @property def parameters(self): """Best fit parameters.""" return self.models.parameters @property def models(self): """Best fit models.""" return self._models @property def trace(self): """Parameter trace from the optimisation.""" return self._trace @property def nfev(self): """Number of function evaluations.""" return self._nfev @property def total_stat(self): """Value of the fit statistic at minimum.""" return self._total_stat def __str__(self): string = super().__str__() string += f"\tnfev : {self.nfev}\n" string += f"\ttotal stat : {self.total_stat:.2f}\n\n" return string def to_dict(self): """Convert to dictionary.""" output = super().to_dict() output[self.__class__.__name__]["nfev"] = self.nfev output[self.__class__.__name__]["total_stat"] = float(self._total_stat) return output class FitResult: """Fit result class. The fit result class provides the results from the optimisation and covariance of the fit. Parameters ---------- optimize_result : `~OptimizeResult` Result of the optimization step. covariance_result : `~CovarianceResult` Result of the covariance step. 
""" def __init__(self, optimize_result=None, covariance_result=None): self._optimize_result = optimize_result self._covariance_result = covariance_result @property def minuit(self): """Minuit object.""" return self.optimize_result.minuit @property def parameters(self): """Best fit parameters of the optimization step.""" return self.optimize_result.parameters @property def models(self): """Best fit parameters of the optimization step.""" return self.optimize_result.models @property def total_stat(self): """Total stat of the optimization step.""" return self.optimize_result.total_stat @property def trace(self): """Parameter trace of the optimisation step.""" return self.optimize_result.trace @property def nfev(self): """Number of function evaluations of the optimisation step.""" return self.optimize_result.nfev @property def backend(self): """Optimizer backend used for the fit.""" return self.optimize_result.backend @property def method(self): """Optimizer method used for the fit.""" return self.optimize_result.method @property def message(self): """Optimizer status message.""" return self.optimize_result.message @property def success(self): """Total success flag.""" success = self.optimize_result.success if self.covariance_result: success &= self.covariance_result.success return success @property def optimize_result(self): """Optimize result.""" return self._optimize_result @property def covariance_result(self): """Optimize result.""" return self._covariance_result def write( self, path, overwrite=False, full_output=True, overwrite_templates=False, write_covariance=True, checksum=False, ): """Write to file. Parameters ---------- path : `pathlib.Path` or str Path to write files. overwrite : bool, optional Overwrite existing file. Default is False. full_output : bool, optional Store full parameter output. Default is True. overwrite_templates : bool, optional Overwrite templates FITS files. Default is False. checksum : bool, optional When True adds a CHECKSUM entry to the file. Default is False. """ from gammapy.modeling.models.core import _write_models output = {} if self.optimize_result is not None: output.update(self.optimize_result.to_dict()) if self.covariance_result is not None: output.update(self.covariance_result.to_dict()) _write_models( self.models, path, overwrite, full_output, overwrite_templates, write_covariance, extra_dict=output, ) def __str__(self): string = "" if self.optimize_result: string += str(self.optimize_result) if self.covariance_result: string += str(self.covariance_result) return string def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/iminuit.py0000644000175100001770000001530014721316200017671 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """iminuit fitting functions.""" import logging import numpy as np from scipy.stats import chi2, norm from iminuit import Minuit from .likelihood import Likelihood __all__ = [ "confidence_iminuit", "contour_iminuit", "covariance_iminuit", "optimize_iminuit", ] log = logging.getLogger(__name__) class MinuitLikelihood(Likelihood): """Likelihood function interface for iminuit.""" def fcn(self, *factors): self.parameters.set_parameter_factors(factors) total_stat = self.function() if self.store_trace: self.store_trace_iteration(total_stat) return total_stat def setup_iminuit(parameters, function, store_trace=False, **kwargs): minuit_func = MinuitLikelihood(function, parameters, store_trace=store_trace) pars, errors, limits = make_minuit_par_kwargs(parameters) minuit = Minuit(minuit_func.fcn, name=list(pars.keys()), **pars) minuit.tol = kwargs.pop("tol", 0.1) minuit.errordef = kwargs.pop("errordef", 1) minuit.print_level = kwargs.pop("print_level", 0) minuit.strategy = kwargs.pop("strategy", 1) for name, error in errors.items(): minuit.errors[name] = error for name, limit in limits.items(): minuit.limits[name] = limit return minuit, minuit_func def optimize_iminuit(parameters, function, store_trace=False, **kwargs): """iminuit optimization. Parameters ---------- parameters : `~gammapy.modeling.Parameters` Parameters with starting values. function : callable Likelihood function. store_trace : bool, optional Store trace of the fit. Default is False. **kwargs : dict Options passed to `iminuit.Minuit` constructor. If there is an entry 'migrad_opts', those options will be passed to `iminuit.Minuit.migrad()`. Returns ------- result : (factors, info, optimizer) Tuple containing the best fit factors, some information and the optimizer instance. """ migrad_opts = kwargs.pop("migrad_opts", {}) minuit, minuit_func = setup_iminuit( parameters=parameters, function=function, store_trace=store_trace, **kwargs ) minuit.migrad(**migrad_opts) factors = minuit.values info = { "success": minuit.valid, "nfev": minuit.nfcn, "message": _get_message(minuit, parameters), "trace": minuit_func.trace, } optimizer = minuit return factors, info, optimizer def covariance_iminuit(parameters, function, **kwargs): minuit = kwargs.get("minuit") if minuit is None: minuit, _ = setup_iminuit( parameters=parameters, function=function, store_trace=False, **kwargs ) minuit.hesse() message, success = "Hesse terminated successfully.", True try: covariance_factors = np.array(minuit.covariance) except (TypeError, RuntimeError): N = len(minuit.values) covariance_factors = np.nan * np.ones((N, N)) message, success = "Hesse failed", False return covariance_factors, {"success": success, "message": message} def confidence_iminuit(parameters, function, parameter, reoptimize, sigma, **kwargs): # TODO: this is ugly - design something better for translating to MINUIT parameter names. if not reoptimize: log.warning("Reoptimize = False ignored for iminuit backend") minuit, minuit_func = setup_iminuit( parameters=parameters, function=function, store_trace=False, **kwargs ) migrad_opts = kwargs.get("migrad_opts", {}) minuit.migrad(**migrad_opts) # Maybe a wrapper class MinuitParameters? 
parameter = parameters[parameter] idx = parameters.free_parameters.index(parameter) var = _make_parname(idx, parameter) message = "Minos terminated" cl = 2 * norm.cdf(sigma) - 1 try: minuit.minos(var, cl=cl, ncall=None) info = minuit.merrors[var] except (AttributeError, RuntimeError) as error: return { "success": False, "message": str(error), "errp": np.nan, "errn": np.nan, "nfev": 0, } if info.is_valid: message += " successfully." else: message += ", but result is invalid." return { "success": info.is_valid, "message": message, "errp": info.upper, "errn": -info.lower, "nfev": info.nfcn, } def contour_iminuit(parameters, function, x, y, numpoints, sigma, **kwargs): minuit, minuit_func = setup_iminuit( parameters=parameters, function=function, store_trace=False, **kwargs ) minuit.migrad() par_x = parameters[x] idx_x = parameters.free_parameters.index(par_x) x = _make_parname(idx_x, par_x) par_y = parameters[y] idx_y = parameters.free_parameters.index(par_y) y = _make_parname(idx_y, par_y) cl = chi2(2).cdf(sigma**2) contour = minuit.mncontour(x=x, y=y, size=numpoints, cl=cl) # TODO: add try and except to get the success return { "success": True, "x": contour[:, 0], "y": contour[:, 1], } # This code is copied from https://github.com/scikit-hep/iminuit/blob/v2.21.0/src/iminuit/minimize.py#L124-L136 def _get_message(m, parameters): success = m.valid success &= np.all(np.isfinite([par.value for par in parameters])) if success: message = "Optimization terminated successfully" if m.accurate: message += "." else: message += ", but uncertainties are unreliable." else: message = "Optimization failed." fmin = m.fmin if fmin.has_reached_call_limit: message += " Call limit was reached." if fmin.is_above_max_edm: message += " Estimated distance to minimum too large." return message def _make_parnames(parameters): return [_make_parname(idx, par) for idx, par in enumerate(parameters)] def _make_parname(idx, par): return f"par_{idx:03d}_{par.name}" def make_minuit_par_kwargs(parameters): """Create *Parameter Keyword Arguments* for the `Minuit` constructor. See: http://iminuit.readthedocs.io/en/latest/api.html#iminuit.Minuit """ names = _make_parnames(parameters.free_parameters) pars, errors, limits = {}, {}, {} for name, par in zip(names, parameters.free_parameters): pars[name] = par.factor min_ = None if np.isnan(par.factor_min) else par.factor_min max_ = None if np.isnan(par.factor_max) else par.factor_max limits[name] = (min_, max_) if par.error == 0 or np.isnan(par.error): error = 1 else: error = par.error / par.scale errors[name] = error return pars, errors, limits ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/likelihood.py0000644000175100001770000000314114721316200020336 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html __all__ = ["Likelihood"] # TODO: get rid of this wrapper class? Or use it in a better way? class Likelihood: """Wrapper of the likelihood function used by the optimiser. This might become superfluous if we introduce a generic ``Likelihood`` interface and use that directly, or change the ``Fit`` class to work with ``Model`` and ``Likelihood`` objects. For now, this class does the translation of parameter values and the parameter factors the optimiser sees. Parameters ---------- parameters : `~gammapy.modeling.Parameters` Parameters with starting values. function : callable Likelihood function. 
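Examples
--------
Internal usage sketch; ``stat_fn`` and ``pars`` are placeholders for a fit
statistic callable and a `~gammapy.modeling.Parameters` instance::

    from gammapy.modeling.likelihood import Likelihood

    likelihood = Likelihood(function=stat_fn, parameters=pars, store_trace=True)
    factors = [par.factor for par in pars.free_parameters]
    total_stat = likelihood.fcn(factors)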
""" def __init__(self, function, parameters, store_trace): self.function = function self.parameters = parameters self.trace = [] self.store_trace = store_trace def store_trace_iteration(self, total_stat): row = {"total_stat": total_stat} pars = self.parameters.free_parameters names = [f"par-{idx}" for idx in range(len(pars))] vals = dict(zip(names, pars.value)) row.update(vals) self.trace.append(row) def fcn(self, factors): self.parameters.set_parameter_factors(factors) total_stat = self.function() if self.store_trace: self.store_trace_iteration(total_stat) return total_stat def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.228642 gammapy-1.3/gammapy/modeling/models/0000755000175100001770000000000014721316215017133 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/__init__.py0000644000175100001770000001354714721316200021250 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Built-in models in Gammapy.""" from gammapy.utils.registry import Registry from .core import DatasetModels, Model, ModelBase, Models from .cube import ( FoVBackgroundModel, SkyModel, TemplateNPredModel, create_fermi_isotropic_diffuse_model, ) from .prior import GaussianPrior, Prior, UniformPrior from .spatial import ( ConstantFluxSpatialModel, ConstantSpatialModel, DiskSpatialModel, GaussianSpatialModel, GeneralizedGaussianSpatialModel, PiecewiseNormSpatialModel, PointSpatialModel, Shell2SpatialModel, ShellSpatialModel, SpatialModel, TemplateNDSpatialModel, TemplateSpatialModel, ) from .spectral import ( EBL_DATA_BUILTIN, BrokenPowerLawSpectralModel, CompoundSpectralModel, ConstantSpectralModel, EBLAbsorptionNormSpectralModel, ExpCutoffPowerLaw3FGLSpectralModel, ExpCutoffPowerLawNormSpectralModel, ExpCutoffPowerLawSpectralModel, GaussianSpectralModel, LogParabolaNormSpectralModel, LogParabolaSpectralModel, NaimaSpectralModel, PiecewiseNormSpectralModel, PowerLaw2SpectralModel, PowerLawNormSpectralModel, PowerLawSpectralModel, ScaleSpectralModel, SmoothBrokenPowerLawSpectralModel, SpectralModel, SuperExpCutoffPowerLaw3FGLSpectralModel, SuperExpCutoffPowerLaw4FGLDR3SpectralModel, SuperExpCutoffPowerLaw4FGLSpectralModel, TemplateNDSpectralModel, TemplateSpectralModel, integrate_spectrum, scale_plot_flux, ) from .spectral_cosmic_ray import create_cosmic_ray_spectral_model from .spectral_crab import MeyerCrabSpectralModel, create_crab_spectral_model from .temporal import ( ConstantTemporalModel, ExpDecayTemporalModel, GaussianTemporalModel, GeneralizedGaussianTemporalModel, LightCurveTemplateTemporalModel, LinearTemporalModel, PowerLawTemporalModel, SineTemporalModel, TemplatePhaseCurveTemporalModel, TemporalModel, ) from .utils import read_hermes_cube __all__ = [ "BrokenPowerLawSpectralModel", "CompoundSpectralModel", "ConstantFluxSpatialModel", "ConstantSpatialModel", "ConstantSpectralModel", "ConstantTemporalModel", "create_cosmic_ray_spectral_model", "create_crab_spectral_model", "create_fermi_isotropic_diffuse_model", "DatasetModels", "DiskSpatialModel", "EBLAbsorptionNormSpectralModel", "ExpCutoffPowerLaw3FGLSpectralModel", "ExpCutoffPowerLawNormSpectralModel", "ExpCutoffPowerLawSpectralModel", "ExpDecayTemporalModel", "FoVBackgroundModel", "GaussianSpatialModel", "GaussianSpectralModel", "GaussianTemporalModel", "GeneralizedGaussianSpatialModel", "GeneralizedGaussianTemporalModel", "integrate_spectrum", "LightCurveTemplateTemporalModel", "LinearTemporalModel", "LogParabolaNormSpectralModel", "LogParabolaSpectralModel", "MeyerCrabSpectralModel", "Model", "Models", "ModelBase", "MODEL_REGISTRY", "NaimaSpectralModel", "PiecewiseNormSpectralModel", "PiecewiseNormSpatialModel", "PointSpatialModel", "PowerLaw2SpectralModel", "PowerLawNormSpectralModel", "PowerLawSpectralModel", "PowerLawTemporalModel", "Prior", "GaussianPrior", "UniformPrior", "scale_plot_flux", "ScaleSpectralModel", "Shell2SpatialModel", "ShellSpatialModel", "SineTemporalModel", "SkyModel", "SmoothBrokenPowerLawSpectralModel", 
"SPATIAL_MODEL_REGISTRY", "SpatialModel", "SPECTRAL_MODEL_REGISTRY", "SpectralModel", "SuperExpCutoffPowerLaw3FGLSpectralModel", "SuperExpCutoffPowerLaw4FGLDR3SpectralModel", "SuperExpCutoffPowerLaw4FGLSpectralModel", "TemplatePhaseCurveTemporalModel", "TemplateSpatialModel", "TemplateSpectralModel", "TemplateNDSpatialModel", "TemplateNDSpectralModel", "TemplateNPredModel", "TEMPORAL_MODEL_REGISTRY", "TemporalModel", "EBL_DATA_BUILTIN", "read_hermes_cube", ] SPATIAL_MODEL_REGISTRY = Registry( [ ConstantSpatialModel, TemplateSpatialModel, TemplateNDSpatialModel, DiskSpatialModel, GaussianSpatialModel, GeneralizedGaussianSpatialModel, PiecewiseNormSpatialModel, PointSpatialModel, ShellSpatialModel, Shell2SpatialModel, ] ) """Registry of spatial model classes.""" SPECTRAL_MODEL_REGISTRY = Registry( [ ConstantSpectralModel, CompoundSpectralModel, PowerLawSpectralModel, PowerLaw2SpectralModel, BrokenPowerLawSpectralModel, SmoothBrokenPowerLawSpectralModel, PiecewiseNormSpectralModel, ExpCutoffPowerLawSpectralModel, ExpCutoffPowerLaw3FGLSpectralModel, SuperExpCutoffPowerLaw3FGLSpectralModel, SuperExpCutoffPowerLaw4FGLDR3SpectralModel, SuperExpCutoffPowerLaw4FGLSpectralModel, LogParabolaSpectralModel, TemplateSpectralModel, TemplateNDSpectralModel, GaussianSpectralModel, EBLAbsorptionNormSpectralModel, NaimaSpectralModel, ScaleSpectralModel, PowerLawNormSpectralModel, LogParabolaNormSpectralModel, ExpCutoffPowerLawNormSpectralModel, ] ) """Registry of spectral model classes.""" TEMPORAL_MODEL_REGISTRY = Registry( [ ConstantTemporalModel, LinearTemporalModel, LightCurveTemplateTemporalModel, ExpDecayTemporalModel, GaussianTemporalModel, GeneralizedGaussianTemporalModel, PowerLawTemporalModel, SineTemporalModel, TemplatePhaseCurveTemporalModel, ] ) """Registry of temporal models classes.""" PRIOR_REGISTRY = Registry( [ UniformPrior, GaussianPrior, ] ) """Registry of prior classes.""" MODEL_REGISTRY = Registry([SkyModel, FoVBackgroundModel, TemplateNPredModel]) """Registry of model classes""" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/core.py0000644000175100001770000011627514721316200020443 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import collections.abc import copy import html import logging from os.path import split import numpy as np import astropy.units as u from astropy.coordinates import SkyCoord from astropy.table import Table import matplotlib.pyplot as plt from gammapy.maps import Map, RegionGeom from gammapy.modeling import Covariance, Parameter, Parameters from gammapy.modeling.covariance import CovarianceMixin from gammapy.utils.scripts import from_yaml, make_name, make_path, to_yaml, write_yaml __all__ = ["Model", "Models", "DatasetModels", "ModelBase"] log = logging.getLogger(__name__) def _set_link(shared_register, model): for param in model.parameters: name = param.name link_label = param._link_label_io if link_label is not None: if link_label in shared_register: new_param = shared_register[link_label] setattr(model, name, new_param) else: shared_register[link_label] = param return shared_register def _get_model_class_from_dict(data): """Get a model class from a dictionary.""" from . 
import ( MODEL_REGISTRY, PRIOR_REGISTRY, SPATIAL_MODEL_REGISTRY, SPECTRAL_MODEL_REGISTRY, TEMPORAL_MODEL_REGISTRY, ) if "type" in data: cls = MODEL_REGISTRY.get_cls(data["type"]) elif "spatial" in data: cls = SPATIAL_MODEL_REGISTRY.get_cls(data["spatial"]["type"]) elif "spectral" in data: cls = SPECTRAL_MODEL_REGISTRY.get_cls(data["spectral"]["type"]) elif "temporal" in data: cls = TEMPORAL_MODEL_REGISTRY.get_cls(data["temporal"]["type"]) elif "prior" in data: cls = PRIOR_REGISTRY.get_cls(data["prior"]["type"]) return cls def _build_parameters_from_dict(data, default_parameters): """Build a `~gammapy.modeling.Parameters` object from input dictionary and default parameter values.""" par_data = [] input_names = [_["name"] for _ in data] for par in default_parameters: par_dict = par.to_dict() try: index = input_names.index(par_dict["name"]) par_dict.update(data[index]) except ValueError: log.warning( f"Parameter '{par_dict['name']}' not defined in YAML file." f" Using default value: {par_dict['value']} {par_dict['unit']}" ) par_data.append(par_dict) return Parameters.from_dict(par_data) def _check_name_unique(model, names): """Check if a model is not duplicated""" if model.name in names: raise ( ValueError( f"Model names must be unique. Models named '{model.name}' are duplicated." ) ) return def _write_models( models, path, overwrite=False, full_output=False, overwrite_templates=False, write_covariance=True, checksum=False, extra_dict=None, ): """Write models to YAML file with additionnal informations using an `extra_dict`""" base_path, _ = split(path) path = make_path(path) base_path = make_path(base_path) if path.exists() and not overwrite: raise IOError(f"File exists already: {path}") if ( write_covariance and models.covariance is not None and len(models.parameters) != 0 ): filecovar = path.stem + "_covariance.dat" kwargs = dict(format="ascii.fixed_width", delimiter="|", overwrite=overwrite) models.write_covariance(base_path / filecovar, **kwargs) models._covar_file = filecovar yaml_str = "" if extra_dict is not None: yaml_str += to_yaml(extra_dict) yaml_str += models.to_yaml(full_output, overwrite_templates) write_yaml(yaml_str, path, overwrite=overwrite, checksum=checksum) class ModelBase: """Model base class.""" _type = None def __init__(self, **kwargs): # Copy default parameters from the class to the instance default_parameters = self.default_parameters.copy() for par in default_parameters: value = kwargs.get(par.name, par) if not isinstance(value, Parameter): par.quantity = u.Quantity(value) else: par = value setattr(self, par.name, par) self._covariance = Covariance(self.parameters) covariance_data = kwargs.get("covariance_data", None) if covariance_data is not None: self.covariance = covariance_data def __getattribute__(self, name): value = object.__getattribute__(self, name) if isinstance(value, Parameter): return value.__get__(self, None) return value @property def type(self): return self._type def __init_subclass__(cls, **kwargs): # Add parameters list on the model sub-class (not instances) cls.default_parameters = Parameters( [_ for _ in cls.__dict__.values() if isinstance(_, Parameter)] ) @classmethod def from_parameters(cls, parameters, **kwargs): """Create model from parameter list. Parameters ---------- parameters : `Parameters` Parameters for init. Returns ------- model : `Model` Model instance. 
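Examples
--------
Round-trip sketch with a built-in spectral model::

    from gammapy.modeling.models import PowerLawSpectralModel

    pars = PowerLawSpectralModel().parameters
    model = PowerLawSpectralModel.from_parameters(pars)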
""" for par in parameters: kwargs[par.name] = par return cls(**kwargs) def _check_covariance(self): if not self.parameters == self._covariance.parameters: self._covariance = Covariance(self.parameters) @property def covariance(self): self._check_covariance() for par in self.parameters: pars = Parameters([par]) error = np.nan_to_num(par.error**2, nan=1) covar = Covariance(pars, data=[[error]]) self._covariance.set_subcovariance(covar) return self._covariance @covariance.setter def covariance(self, covariance): self._check_covariance() self._covariance.data = covariance for par in self.parameters: pars = Parameters([par]) variance = self._covariance.get_subcovariance(pars).data par.error = np.sqrt(variance[0][0]) @property def parameters(self): """Parameters as a `~gammapy.modeling.Parameters` object.""" return Parameters( [getattr(self, name) for name in self.default_parameters.names] ) @property def parameters_unique_names(self): return self.parameters.unique_parameters.names def copy(self, **kwargs): """Deep copy.""" return copy.deepcopy(self) def to_dict(self, full_output=False): """Create dictionary for YAML serialisation.""" tag = self.tag[0] if isinstance(self.tag, list) else self.tag params = self.parameters.to_dict() if not full_output: for par, par_default in zip(params, self.default_parameters): init = par_default.to_dict() for item in [ "min", "max", "error", "interp", "scale_method", ]: default = init[item] if par[item] == default or ( np.isnan(par[item]) and np.isnan(default) ): del par[item] if par["frozen"] == init["frozen"]: del par["frozen"] if init["unit"] == "": del par["unit"] data = {"type": tag, "parameters": params} if self.type is None: return data else: return {self.type: data} @classmethod def from_dict(cls, data, **kwargs): key0 = next(iter(data)) if key0 in ["spatial", "temporal", "spectral"]: data = data[key0] if data["type"] not in cls.tag: raise ValueError( f"Invalid model type {data['type']} for class {cls.__name__}" ) parameters = _build_parameters_from_dict( data["parameters"], cls.default_parameters ) return cls.from_parameters(parameters, **kwargs) def __str__(self): string = f"{self.__class__.__name__}\n" if len(self.parameters) > 0: string += f"\n{self.parameters.to_table()}" return string def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def frozen(self): """Frozen status of a model, True if all parameters are frozen.""" return np.all([p.frozen for p in self.parameters]) def freeze(self): """Freeze all parameters.""" self.parameters.freeze_all() def unfreeze(self): """Restore parameters frozen status to default.""" for p, default in zip(self.parameters, self.default_parameters): p.frozen = default.frozen def reassign(self, datasets_names, new_datasets_names): """Reassign a model from one dataset to another. Parameters ---------- datasets_names : str or list Name of the datasets where the model is currently defined. new_datasets_names : str or list Name of the datasets where the model should be defined instead. If multiple names are given the two list must have the save length, as the reassignment is element-wise. Returns ------- model : `Model` Reassigned model. """ model = self.copy(name=self.name) if not isinstance(datasets_names, list): datasets_names = [datasets_names] if not isinstance(new_datasets_names, list): new_datasets_names = [new_datasets_names] if isinstance(model.datasets_names, str): model.datasets_names = [model.datasets_names] if getattr(model, "datasets_names", None): for name, name_new in zip(datasets_names, new_datasets_names): model.datasets_names = [ _.replace(name, name_new) for _ in model.datasets_names ] return model class Model: """Model class that contains only methods to create a model listed in the registries.""" @staticmethod def create(tag, model_type=None, *args, **kwargs): """Create a model instance. Examples -------- >>> from gammapy.modeling.models import Model >>> spectral_model = Model.create( ... "pl-2", model_type="spectral", amplitude="1e-10 cm-2 s-1", index=3 ... ) >>> type(spectral_model) """ data = {"type": tag} if model_type is not None: data = {model_type: data} cls = _get_model_class_from_dict(data) return cls(*args, **kwargs) @staticmethod def from_dict(data): """Create a model instance from a dictionary.""" cls = _get_model_class_from_dict(data) return cls.from_dict(data) class DatasetModels(collections.abc.Sequence, CovarianceMixin): """Immutable models container. Parameters ---------- models : `SkyModel`, list of `SkyModel` or `Models` Sky models. covariance_data : `~numpy.ndarray` Covariance data. """ def __init__(self, models=None, covariance_data=None): if models is None: models = [] if isinstance(models, (Models, DatasetModels)): if covariance_data is None and models.covariance is not None: covariance_data = models.covariance.data models = models._models elif isinstance(models, ModelBase): models = [models] elif not isinstance(models, list): raise TypeError(f"Invalid type: {models!r}") unique_names = [] for model in models: _check_name_unique(model, names=unique_names) unique_names.append(model.name) self._models = models self._covar_file = None self._covariance = Covariance(self.parameters) # Set separately because this triggers the update mechanism on the sub-models if covariance_data is not None: self.covariance = covariance_data @property def parameters(self): """Parameters as a `~gammapy.modeling.Parameters` object.""" return Parameters.from_stack([_.parameters for _ in self._models]) @property def parameters_unique_names(self): """List of unique parameter names. 
Return formatted as model_name.par_type.par_name.""" names = [] for model in self._models: for par_name in model.parameters_unique_names: components = [model.name, par_name] name = ".".join(components) names.append(name) return names @property def names(self): """List of model names.""" return [m.name for m in self._models] @classmethod def read(cls, filename, checksum=False): """Read from YAML file. Parameters ---------- filename : str input filename checksum : bool, optional Whether to perform checksum verification. Default is False. """ yaml_str = make_path(filename).read_text() path, filename = split(filename) return cls.from_yaml(yaml_str, path=path, checksum=checksum) @classmethod def from_yaml(cls, yaml_str, path="", checksum=False): """Create from YAML string. Parameters ---------- yaml_str : str yaml str path : str base path of model files checksum : bool, optional Whether to perform checksum verification. Default is False. """ data = from_yaml(yaml_str, checksum=checksum) # TODO : for now metadata are not kept. Add proper metadata creation. data.pop("metadata", None) return cls.from_dict(data, path=path) @classmethod def from_dict(cls, data, path=""): """Create from dictionary.""" from . import MODEL_REGISTRY, SkyModel models = [] for component in data["components"]: model_cls = MODEL_REGISTRY.get_cls(component["type"]) model = model_cls.from_dict(component) models.append(model) models = cls(models) if "covariance" in data: filename = data["covariance"] path = make_path(path) if not (path / filename).exists(): path, filename = split(filename) models.read_covariance(path, filename, format="ascii.fixed_width") shared_register = {} for model in models: if isinstance(model, SkyModel): submodels = [ model.spectral_model, model.spatial_model, model.temporal_model, ] for submodel in submodels: if submodel is not None: shared_register = _set_link(shared_register, submodel) else: shared_register = _set_link(shared_register, model) return models def write( self, path, overwrite=False, full_output=False, overwrite_templates=False, write_covariance=True, checksum=False, ): """Write to YAML file. Parameters ---------- path : `pathlib.Path` or str Path to write files. overwrite : bool, optional Overwrite existing file. Default is False. full_output : bool, optional Store full parameter output. Default is False. overwrite_templates : bool, optional Overwrite templates FITS files. Default is False. write_covariance : bool, optional Whether to save the covariance. Default is True. checksum : bool, optional When True adds a CHECKSUM entry to the file. Default is False. """ _write_models( self, path, overwrite, full_output, overwrite_templates, write_covariance, checksum, ) def to_yaml(self, full_output=False, overwrite_templates=False): """Convert to YAML string. Parameters ---------- full_output : bool, optional Store full parameter output. Default is False. overwrite_templates : bool, optional Overwrite templates FITS files. Default is False. 
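Examples
--------
Serialisation sketch; ``models`` stands for an existing `Models` container::

    yaml_str = models.to_yaml(full_output=False)
    print(yaml_str)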
""" data = self.to_dict(full_output, overwrite_templates) return to_yaml(data) def update_link_label(self): """Update linked parameters labels used for serialisation and print.""" params_list = [] params_shared = [] for param in self.parameters: if param not in params_list: params_list.append(param) params_list.append(param) elif param not in params_shared: params_shared.append(param) for param in params_shared: param._link_label_io = param.name + "@" + make_name() def to_dict(self, full_output=False, overwrite_templates=False): """Convert to dictionary.""" self.update_link_label() models_data = [] for model in self._models: model_data = model.to_dict(full_output) models_data.append(model_data) if ( hasattr(model, "spatial_model") and model.spatial_model is not None and "template" in model.spatial_model.tag ): model.spatial_model.write(overwrite=overwrite_templates) if model.tag == "TemplateNPredModel": model.write(overwrite=overwrite_templates) if self._covar_file is not None: return { "components": models_data, "covariance": str(self._covar_file), } else: return {"components": models_data} def to_parameters_table(self): """Convert model parameters to a `~astropy.table.Table`.""" table = self.parameters.to_table() # Warning: splitting of parameters will break is source name has a "." in its name. model_name = [name.split(".")[0] for name in self.parameters_unique_names] table.add_column(model_name, name="model", index=0) return table def update_parameters_from_table(self, t): """Update models from a `~astropy.table.Table`.""" parameters_dict = [dict(zip(t.colnames, row)) for row in t] for k, data in enumerate(parameters_dict): self.parameters[k].update_from_dict(data) def read_covariance(self, path, filename="_covariance.dat", **kwargs): """Read covariance data from file. Parameters ---------- path : str or `Path` Base path. filename : str Filename. **kwargs : dict Keyword arguments passed to `~astropy.table.Table.read`. """ path = make_path(path) filepath = str(path / filename) t = Table.read(filepath, **kwargs) t.remove_column("Parameters") arr = np.array(t) data = arr.view(float).reshape(arr.shape + (-1,)) self.covariance = data self._covar_file = filename def write_covariance(self, filename, **kwargs): """Write covariance to file. Parameters ---------- filename : str Filename. **kwargs : dict Keyword arguments passed to `~astropy.table.Table.write`. """ names = self.parameters_unique_names table = Table() table["Parameters"] = names for idx, name in enumerate(names): values = self.covariance.data[idx] table[str(idx)] = values table.write(make_path(filename), **kwargs) def __str__(self): self.update_link_label() str_ = f"{self.__class__.__name__}\n\n" for idx, model in enumerate(self): str_ += f"Component {idx}: " str_ += str(model) return str_.expandtabs(tabsize=2) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def __add__(self, other): if isinstance(other, (Models, list)): return Models([*self, *other]) elif isinstance(other, ModelBase): _check_name_unique(other, self.names) return Models([*self, other]) else: raise TypeError(f"Invalid type: {other!r}") def __getitem__(self, key): if isinstance(key, np.ndarray) and key.dtype == bool: return self.__class__(list(np.array(self._models)[key])) else: return self._models[self.index(key)] def index(self, key): if isinstance(key, (int, slice)): return key elif isinstance(key, str): return self.names.index(key) elif isinstance(key, ModelBase): return self._models.index(key) else: raise TypeError(f"Invalid type: {type(key)!r}") def __len__(self): return len(self._models) def _ipython_key_completions_(self): return self.names def copy(self, copy_data=False): """Deep copy. Parameters ---------- copy_data : bool Whether to copy data attached to template models. Returns ------- models: `Models` Copied models. """ models = [] for model in self: model_copy = model.copy(name=model.name, copy_data=copy_data) models.append(model_copy) return self.__class__( models=models, covariance_data=self.covariance.data.copy() ) def _slice_by_energy(self, energy_min, energy_max, sum_over_energy_groups=False): """Copy models and slice TemplateNPredModel in energy range. Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Energy bounds of the slice sum_over_energy_groups : bool Whether to sum over the energy groups or not. Default is False. Returns ------- models : `Models` Sliced models. """ models_sliced = Models(self.copy()) for k, m in enumerate(models_sliced): if m.tag == "TemplateNPredModel": m_sliced = m.slice_by_energy(energy_min, energy_max) if sum_over_energy_groups: m_sliced.map = m_sliced.map.sum_over_axes(keepdims=True) models_sliced[k] = m_sliced return models_sliced def select( self, name_substring=None, datasets_names=None, tag=None, model_type=None, frozen=None, ): """Select models that meet all specified conditions. Parameters ---------- name_substring : str, optional Substring contained in the model name. Default is None. datasets_names : str or list, optional Name of the dataset. Default is None. tag : str or list, optional Model tag. Default is None. model_type : {'None', 'spatial', 'spectral'} Type of model, used together with "tag", if the tag is not unique. Default is None. frozen : bool, optional If True, select models with all parameters frozen; if False, exclude them. Default is None. Returns ------- models : `DatasetModels` Selected models. """ mask = self.selection_mask( name_substring, datasets_names, tag, model_type, frozen ) return self[mask] def selection_mask( self, name_substring=None, datasets_names=None, tag=None, model_type=None, frozen=None, ): """Create a mask of models, that meet all specified conditions. Parameters ---------- name_substring : str, optional Substring contained in the model name. Default is None. datasets_names : str or list of str, optional Name of the dataset. Default is None. tag : str or list of str, optional Model tag. Default is None. model_type : {'None', 'spatial', 'spectral'} Type of model, used together with "tag", if the tag is not unique. Default is None. frozen : bool, optional Select models with all parameters frozen if True, exclude them if False. Default is None. Returns ------- mask : `numpy.array` Boolean mask, True for selected models. 
""" selection = np.ones(len(self), dtype=bool) if tag and not isinstance(tag, list): tag = [tag] if datasets_names and not isinstance(datasets_names, list): datasets_names = [datasets_names] for idx, model in enumerate(self): if name_substring: selection[idx] &= name_substring in model.name if datasets_names: selection[idx] &= model.datasets_names is None or np.any( [name in model.datasets_names for name in datasets_names] ) if tag: if model_type is None: sub_model = model else: sub_model = getattr(model, f"{model_type}_model", None) if sub_model: selection[idx] &= np.any([t in sub_model.tag for t in tag]) else: selection[idx] &= False if frozen is not None: if frozen: selection[idx] &= model.frozen else: selection[idx] &= ~model.frozen return np.array(selection, dtype=bool) def select_mask(self, mask, margin="0 deg", use_evaluation_region=True): """Check if sky models contribute within a mask map. Parameters ---------- mask : `~gammapy.maps.WcsNDMap` of boolean type Map containing a boolean mask. margin : `~astropy.unit.Quantity`, optional Add a margin in degree to the source evaluation radius. Used to take into account PSF width. Default is "0 deg". use_evaluation_region : bool, optional Account for the extension of the model or not. Default is True. Returns ------- models : `DatasetModels` Selected models contributing inside the region where mask==True. """ models = [] if not mask.geom.is_image: mask = mask.reduce_over_axes(func=np.logical_or) for model in self.select(tag="sky-model"): if use_evaluation_region: contributes = model.contributes(mask=mask, margin=margin) else: contributes = mask.get_by_coord(model.position, fill_value=0) if np.any(contributes): models.append(model) return self.__class__(models=models) def select_from_geom(self, geom, **kwargs): """Select models that fall inside a given geometry. Parameters ---------- geom : `~gammapy.maps.Geom` Geometry to select models from. **kwargs : dict Keyword arguments passed to `~gammapy.modeling.models.DatasetModels.select_mask`. Returns ------- models : `DatasetModels` Selected models. """ mask = Map.from_geom(geom=geom, data=True, dtype=bool) return self.select_mask(mask=mask, **kwargs) def select_region(self, regions, wcs=None): """Select sky models with center position contained within a given region. Parameters ---------- regions : str, `~regions.Region` or list of `~regions.Region` Region or list of regions (pixel or sky regions accepted). A region can be defined as a string ind DS9 format as well. See http://ds9.si.edu/doc/ref/region.html for details. wcs : `~astropy.wcs.WCS`, optional World coordinate system transformation. Default is None. Returns ------- models : `DatasetModels` Selected models. """ geom = RegionGeom.from_regions(regions, wcs=wcs) models = [] for model in self.select(tag="sky-model"): if geom.contains(model.position): models.append(model) return self.__class__(models=models) def restore_status(self, restore_values=True): """Context manager to restore status. A copy of the values is made on enter, and those values are restored on exit. Parameters ---------- restore_values : bool, optional Restore values if True, otherwise restore only frozen status and covariance matrix. Default is True. """ return restore_models_status(self, restore_values) def set_parameters_bounds( self, tag, model_type, parameters_names=None, min=None, max=None, value=None ): """Set bounds for the selected models types and parameters names. Parameters ---------- tag : str or list Tag of the models. 
model_type : {"spatial", "spectral", "temporal"} Type of model. parameters_names : str or list, optional Parameters names. Default is None. min : float, optional Minimum value. Default is None. max : float, optional Maximum value. Default is None. value : float, optional Initial value. Default is None. """ models = self.select(tag=tag, model_type=model_type) parameters = models.parameters.select(name=parameters_names, type=model_type) n = len(parameters) if min is not None: parameters.min = np.ones(n) * min if max is not None: parameters.max = np.ones(n) * max if value is not None: parameters.value = np.ones(n) * value def freeze(self, model_type=None): """Freeze parameters depending on model type. Parameters ---------- model_type : {None, "spatial", "spectral"} Freeze all parameters or only spatial or only spectral. Default is None. """ for m in self: m.freeze(model_type) def unfreeze(self, model_type=None): """Restore parameters frozen status to default depending on model type. Parameters ---------- model_type : {None, "spatial", "spectral"} Restore frozen status to default for all parameters or only spatial or only spectral. Default is None. """ for m in self: m.unfreeze(model_type) @property def frozen(self): """Boolean mask, True if all parameters of a given model are frozen.""" return np.all([m.frozen for m in self]) def reassign(self, dataset_name, new_dataset_name): """Reassign a model from one dataset to another. Parameters ---------- dataset_name : str or list Name of the datasets where the model is currently defined. new_dataset_name : str or list Name of the datasets where the model should be defined instead. If multiple names are given the two list must have the save length, as the reassignment is element-wise. """ models = [m.reassign(dataset_name, new_dataset_name) for m in self] return self.__class__(models) def to_template_sky_model(self, geom, spectral_model=None, name=None): """Merge a list of models into a single `~gammapy.modeling.models.SkyModel`. Parameters ---------- geom : `Geom` Map geometry of the result template model. spectral_model : `~gammapy.modeling.models.SpectralModel`, optional One of the NormSpectralModel. Default is None. name : str, optional Name of the new model. Default is None. Returns ------- model : `SkyModel` Template sky model. """ from . import PowerLawNormSpectralModel, SkyModel, TemplateSpatialModel unit = u.Unit("1 / (cm2 s sr TeV)") map_ = Map.from_geom(geom, unit=unit) for m in self: map_ += m.evaluate_geom(geom).to(unit) spatial_model = TemplateSpatialModel(map_, normalize=False) if spectral_model is None: spectral_model = PowerLawNormSpectralModel() return SkyModel( spectral_model=spectral_model, spatial_model=spatial_model, name=name ) def to_template_spectral_model(self, geom, mask=None): """Merge a list of models into a single `~gammapy.modeling.models.TemplateSpectralModel`. For each model the spatial component is integrated over the given geometry where the mask is true and multiplied by the spectral component value in each energy bin. Parameters ---------- geom : `~gammapy.maps.Geom` Map geometry on which the template model is computed. mask : `~gammapy.maps.Map` with bool dtype. Evaluate the model only where the mask is True. Returns ------- model : `~gammapy.modeling.models.TemplateSpectralModel` Template spectral model. """ from . 
import TemplateSpectralModel energy = geom.axes[0].center if mask is None: mask = Map.from_geom(geom, data=True) elif mask.geom != geom: mask = mask.interp_to_geom(geom, method="nearest") values = 0 for m in self: dnde = m.spectral_model(energy) spatial_integ = m.spatial_model.integrate_geom(geom) masked_integ = spatial_integ.data * np.isfinite(spatial_integ.data) * mask spatial_integ.data[~np.isfinite(spatial_integ.data)] = np.nan dnde *= masked_integ.sum(axis=(1, 2)) * spatial_integ.unit values += dnde return TemplateSpectralModel(energy, values) @property def positions(self): """Positions of the models as a `~astropy.coordinates.SkyCoord`.""" positions = [] for model in self.select(tag="sky-model"): if model.position: positions.append(model.position.icrs) else: log.warning( f"Skipping model {model.name} - no spatial component present" ) return SkyCoord(positions) def to_regions(self): """Return a list of the regions for the spatial models. Returns ------- regions: list of `~regions.SkyRegion` Regions. """ regions = [] for model in self.select(tag="sky-model"): try: region = model.spatial_model.to_region() regions.append(region) except AttributeError: log.warning( f"Skipping model {model.name} - no spatial component present" ) return regions @property def wcs_geom(self): """Minimum WCS geom in which all the models are contained.""" regions = self.to_regions() try: return RegionGeom.from_regions(regions).to_wcs_geom() except IndexError: log.error("No spatial component in any model. Geom not defined") def plot_regions(self, ax=None, kwargs_point=None, path_effect=None, **kwargs): """Plot extent of the spatial models on a given WCS axis. Parameters ---------- ax : `~astropy.visualization.WCSAxes`, optional Axes to plot on. If no axes are given, an all-sky WCS is chosen using a CAR projection. Default is None. kwargs_point : dict, optional Keyword arguments passed to `~matplotlib.lines.Line2D` for plotting of point sources. Default is None. path_effect : `~matplotlib.patheffects.PathEffect`, optional Path effect applied to artists and lines. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.artists.Artist`. Returns ------- ax : `~astropy.visualization.WcsAxes` WCS axes. """ regions = self.to_regions() geom = RegionGeom.from_regions(regions=regions) return geom.plot_region( ax=ax, kwargs_point=kwargs_point, path_effect=path_effect, **kwargs ) def plot_positions(self, ax=None, **kwargs): """Plot the centers of the spatial models on a given WCS axis. Parameters ---------- ax : `~astropy.visualization.WCSAxes`, optional Axes to plot on. If no axes are given, an all-sky WCS is chosen using a CAR projection. Default is None. **kwargs : dict Keyword arguments passed to `~matplotlib.pyplot.scatter`. Returns ------- ax : `~astropy.visualization.WcsAxes` WCS axes. """ from astropy.visualization.wcsaxes import WCSAxes if ax is None or not isinstance(ax, WCSAxes): ax = Map.from_geom(self.wcs_geom).plot() kwargs.setdefault("marker", "*") kwargs.setdefault("color", "tab:blue") path_effects = kwargs.get("path_effects", None) xp, yp = self.positions.to_pixel(ax.wcs) p = ax.scatter(xp, yp, **kwargs) if path_effects: plt.setp(p, path_effects=path_effects) return ax class Models(DatasetModels, collections.abc.MutableSequence): """Sky model collection. Parameters ---------- models : `SkyModel`, list of `SkyModel` or `Models` Sky models. 
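Examples
--------
Construct a container with a single, illustrative sky model:

>>> from gammapy.modeling.models import Models, PowerLawSpectralModel, SkyModel
>>> models = Models([SkyModel(spectral_model=PowerLawSpectralModel(), name="src-a")])
>>> models.names
['src-a']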
""" def __delitem__(self, key): del self._models[self.index(key)] def __setitem__(self, key, model): from gammapy.modeling.models import ( FoVBackgroundModel, SkyModel, TemplateNPredModel, ) if isinstance(model, (SkyModel, FoVBackgroundModel, TemplateNPredModel)): self._models[self.index(key)] = model else: raise TypeError(f"Invalid type: {model!r}") def insert(self, idx, model): _check_name_unique(model, self.names) self._models.insert(idx, model) def set_prior(self, parameters, priors): for parameter, prior in zip(parameters, priors): parameter.prior = prior class restore_models_status: def __init__(self, models, restore_values=True): self.restore_values = restore_values self.models = models self.values = [_.value for _ in models.parameters] self.frozen = [_.frozen for _ in models.parameters] self.covariance_data = models.covariance.data def __enter__(self): pass def __exit__(self, type, value, traceback): for value, par, frozen in zip(self.values, self.models.parameters, self.frozen): if self.restore_values: par.value = value par.frozen = frozen self.models.covariance = self.covariance_data ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/cube.py0000644000175100001770000012055214721316200020422 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Cube models (axes: lon, lat, energy).""" import logging import warnings import os import numpy as np import astropy.units as u from astropy.nddata import NoOverlapError from astropy.time import Time from gammapy.maps import Map, MapAxis, WcsGeom from gammapy.modeling import Parameters from gammapy.modeling.covariance import CovarianceMixin from gammapy.modeling.parameter import _get_parameters_str from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.fits import LazyFitsData from gammapy.utils.scripts import make_name, make_path from gammapy.utils.deprecation import GammapyDeprecationWarning from .core import Model, ModelBase, Models from .spatial import ConstantSpatialModel, SpatialModel from .spectral import PowerLawNormSpectralModel, SpectralModel, TemplateSpectralModel from .temporal import TemporalModel log = logging.getLogger(__name__) __all__ = [ "create_fermi_isotropic_diffuse_model", "FoVBackgroundModel", "SkyModel", "TemplateNPredModel", ] class SkyModel(CovarianceMixin, ModelBase): """Sky model component. This model represents a factorised sky model. It has `~gammapy.modeling.Parameters` combining the spatial and spectral parameters. Parameters ---------- spectral_model : `~gammapy.modeling.models.SpectralModel` Spectral model. spatial_model : `~gammapy.modeling.models.SpatialModel` Spatial model (must be normalised to integrate to 1). temporal_model : `~gammapy.modeling.models.TemporalModel` Temporal model. name : str Model identifier. apply_irf : dict Dictionary declaring which IRFs should be applied to this model. Options are {"exposure": True, "psf": True, "edisp": True}. datasets_names : list of str Which datasets this model is applied to. 
""" tag = ["SkyModel", "sky-model"] _apply_irf_default = {"exposure": True, "psf": True, "edisp": True} def __init__( self, spectral_model, spatial_model=None, temporal_model=None, name=None, apply_irf=None, datasets_names=None, covariance_data=None, ): self.spatial_model = spatial_model self.spectral_model = spectral_model self.temporal_model = temporal_model self._name = make_name(name) if apply_irf is None: apply_irf = self._apply_irf_default.copy() self.apply_irf = apply_irf self.datasets_names = datasets_names self._check_unit() super().__init__(covariance_data=covariance_data) @property def _models(self): models = self.spectral_model, self.spatial_model, self.temporal_model return [model for model in models if model is not None] def _check_unit(self): axis = MapAxis.from_energy_bounds( "0.1 TeV", "10 TeV", nbin=1, name="energy_true" ) geom = WcsGeom.create(skydir=self.position, npix=(2, 2), axes=[axis]) time = Time(55555, format="mjd") if self.apply_irf["exposure"]: ref_unit = u.Unit("cm-2 s-1 MeV-1") else: ref_unit = u.Unit("") obt_unit = self.spectral_model(axis.center).unit if self.spatial_model: obt_unit = obt_unit * self.spatial_model.evaluate_geom(geom).unit ref_unit = ref_unit / u.sr if self.temporal_model: if u.Quantity(self.temporal_model(time)).unit.is_equivalent( self.spectral_model(axis.center).unit ): obt_unit = ( ( self.temporal_model(time) * self.spatial_model.evaluate_geom(geom).unit ) .to(obt_unit.to_string()) .unit ) else: obt_unit = obt_unit * u.Quantity(self.temporal_model(time)).unit if not obt_unit.is_equivalent(ref_unit): raise ValueError( f"SkyModel unit {obt_unit} is not equivalent to {ref_unit}" ) @property def name(self): return self._name @property def parameters(self): parameters = [] parameters.append(self.spectral_model.parameters) if self.spatial_model is not None: parameters.append(self.spatial_model.parameters) if self.temporal_model is not None: parameters.append(self.temporal_model.parameters) return Parameters.from_stack(parameters) @property def parameters_unique_names(self): """List of unique parameter names. 
Return formatted as par_type.par_name.""" names = [] for model in self._models: for par_name in model.parameters_unique_names: components = [model.type, par_name] name = ".".join(components) names.append(name) return names @property def spatial_model(self): """Spatial model as a `~gammapy.modeling.models.SpatialModel` object.""" return self._spatial_model @spatial_model.setter def spatial_model(self, model): if not (model is None or isinstance(model, SpatialModel)): raise TypeError(f"Invalid type: {model!r}") self._spatial_model = model @property def spectral_model(self): """Spectral model as a `~gammapy.modeling.models.SpectralModel` object.""" return self._spectral_model @spectral_model.setter def spectral_model(self, model): if not (model is None or isinstance(model, SpectralModel)): raise TypeError(f"Invalid type: {model!r}") self._spectral_model = model @property def temporal_model(self): """Temporal model as a `~gammapy.modeling.models.TemporalModel` object.""" return self._temporal_model @temporal_model.setter def temporal_model(self, model): if not (model is None or isinstance(model, TemporalModel)): raise TypeError(f"Invalid type: {model!r}") self._temporal_model = model @property def position(self): """Position as a `~astropy.coordinates.SkyCoord`.""" return getattr(self.spatial_model, "position", None) @property def position_lonlat(self): """Spatial model center position `(lon, lat)` in radians and frame of the model.""" return getattr(self.spatial_model, "position_lonlat", None) @property def evaluation_bin_size_min(self): """Minimal spatial bin size for spatial model evaluation.""" if ( self.spatial_model is not None and self.spatial_model.evaluation_bin_size_min is not None ): return self.spatial_model.evaluation_bin_size_min else: return None @property def evaluation_radius(self): """Evaluation radius as an `~astropy.coordinates.Angle`.""" return self.spatial_model.evaluation_radius @property def evaluation_region(self): """Evaluation region as an `~astropy.coordinates.Angle`.""" return self.spatial_model.evaluation_region @property def frame(self): return self.spatial_model.frame def __add__(self, other): if isinstance(other, (Models, list)): return Models([self, *other]) elif isinstance(other, (SkyModel, TemplateNPredModel)): return Models([self, other]) else: raise TypeError(f"Invalid type: {other!r}") def __radd__(self, model): return self.__add__(model) def __call__(self, lon, lat, energy, time=None): return self.evaluate(lon, lat, energy, time) def __repr__(self): return ( f"{self.__class__.__name__}(" f"spatial_model={self.spatial_model!r}, " f"spectral_model={self.spectral_model!r})" f"temporal_model={self.temporal_model!r})" ) def contributes(self, mask, margin="0 deg"): """Check if a sky model contributes within a mask map. Parameters ---------- mask : `~gammapy.maps.WcsNDMap` of boolean type Map containing a boolean mask. margin : `~astropy.units.Quantity` Add a margin in degree to the source evaluation radius. Used to take into account PSF width. Returns ------- models : `DatasetModels` Selected models contributing inside the region where mask is True. 
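Examples
--------
Containment-check sketch; ``mask_map`` stands for a boolean
`~gammapy.maps.WcsNDMap` and ``model`` for a `SkyModel` instance. The result
is True when the model evaluation region overlaps the masked area::

    inside = model.contributes(mask=mask_map, margin="0.2 deg")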
""" from gammapy.datasets.evaluator import CUTOUT_MARGIN margin = u.Quantity(margin) if not mask.geom.is_image: mask = mask.reduce_over_axes(func=np.logical_or) if mask.geom.is_region and mask.geom.region is not None: if mask.geom.is_all_point_sky_regions: return True geom = mask.geom.to_wcs_geom() mask = geom.region_mask([mask.geom.region]) try: mask_cutout = mask.cutout( position=self.position, width=(2 * self.evaluation_radius + CUTOUT_MARGIN + margin), ) contributes = np.any(mask_cutout.data) except (NoOverlapError, ValueError): contributes = False return contributes def evaluate(self, lon, lat, energy, time=None): """Evaluate the model at given points. The model evaluation follows numpy broadcasting rules. Return differential surface brightness cube. At the moment in units: ``cm-2 s-1 TeV-1 deg-2``. Parameters ---------- lon, lat : `~astropy.units.Quantity` Spatial coordinates. energy : `~astropy.units.Quantity` Energy coordinate. time: `~astropy.time.Time`, optional Time coordinate. Default is None. Returns ------- value : `~astropy.units.Quantity` Model value at the given point. """ value = self.spectral_model(energy) # pylint:disable=not-callable # TODO: case if self.temporal_model is not None, introduce time in arguments ? if self.spatial_model is not None: if self.spatial_model.is_energy_dependent: spatial = self.spatial_model(lon, lat, energy) else: spatial = self.spatial_model(lon, lat) value = value * spatial # pylint:disable=not-callable if (self.temporal_model is not None) and (time is not None): value = value * self.temporal_model(time) return value def evaluate_geom(self, geom, gti=None): """Evaluate model on `~gammapy.maps.Geom`.""" coords = geom.get_coord(sparse=True) value = self.spectral_model(coords["energy_true"]) additional_axes = set(coords.axis_names) - { "lon", "lat", "energy_true", "time", } for axis in additional_axes: value = value * np.ones_like(coords[axis]) if self.spatial_model: value = value * self.spatial_model.evaluate_geom(geom) if self.temporal_model: value = self._compute_time_integral(value, geom, gti) return value def integrate_geom(self, geom, gti=None, oversampling_factor=None): """Integrate model on `~gammapy.maps.Geom`. See `~gammapy.modeling.models.SpatialModel.integrate_geom` and `~gammapy.modeling.models.SpectralModel.integral`. Parameters ---------- geom : `Geom` or `~gammapy.maps.RegionGeom` Map geometry. gti : `GTI`, optional GIT table. Default is None. oversampling_factor : int, optional The oversampling factor to use for spatial integration. Default is None: the factor is estimated from the model minimal bin size. Returns ------- flux : `Map` Predicted flux map. """ energy = geom.axes["energy_true"].edges shape = len(geom.data_shape) * [ 1, ] shape[geom.axes.index_data("energy_true")] = -1 value = self.spectral_model.integral( energy[:-1], energy[1:], ).reshape(shape) if self.spatial_model: value = ( value * self.spatial_model.integrate_geom( geom, oversampling_factor=oversampling_factor ).quantity ) if self.temporal_model: value = self._compute_time_integral(value, geom, gti) value = value * np.ones(geom.data_shape) return Map.from_geom(geom=geom, data=value.value, unit=value.unit) def _compute_time_integral(self, value, geom, gti): """Multiply input value with time integral for the given geometry and GTI.""" if "time" in geom.axes.names: if geom.axes.names[-1] != "time": raise ValueError( "Incorrect axis order. 
The time axis must be the last axis" ) time_axis = geom.axes["time"] temp_eval = np.ones(time_axis.nbin) for idx in range(time_axis.nbin): if gti is None: t1, t2 = time_axis.time_min[idx], time_axis.time_max[idx] else: gti_in_bin = gti.select_time( time_interval=[ time_axis.time_min[idx], time_axis.time_max[idx], ] ) t1, t2 = gti_in_bin.time_start, gti_in_bin.time_stop integral = self.temporal_model.integral(t1, t2) temp_eval[idx] = np.sum(integral) value = (value.T * temp_eval).T else: if gti is not None: integral = self.temporal_model.integral(gti.time_start, gti.time_stop) value = value * np.sum(integral) return value def copy(self, name=None, copy_data=False, **kwargs): """Copy sky model. Parameters ---------- name : str, optional Assign a new name to the copied model. Default is None. copy_data : bool, optional Copy the data arrays attached to models. Default is False. **kwargs : dict Keyword arguments forwarded to `SkyModel`. Returns ------- model : `SkyModel` Copied sky model. """ if self.spatial_model is not None: spatial_model = self.spatial_model.copy(copy_data=copy_data) else: spatial_model = None if self.temporal_model is not None: temporal_model = self.temporal_model.copy() else: temporal_model = None kwargs.setdefault("name", make_name(name)) kwargs.setdefault("spectral_model", self.spectral_model.copy()) kwargs.setdefault("spatial_model", spatial_model) kwargs.setdefault("temporal_model", temporal_model) kwargs.setdefault("apply_irf", self.apply_irf.copy()) kwargs.setdefault("datasets_names", self.datasets_names) kwargs.setdefault("covariance_data", self.covariance.data.copy()) return self.__class__(**kwargs) def to_dict(self, full_output=False): """Create dictionary for YAML serialisation.""" data = {} data["name"] = self.name data["type"] = self.tag[0] if self.apply_irf != self._apply_irf_default: data["apply_irf"] = self.apply_irf if self.datasets_names is not None: data["datasets_names"] = self.datasets_names data.update(self.spectral_model.to_dict(full_output)) if self.spatial_model is not None: data.update(self.spatial_model.to_dict(full_output)) if self.temporal_model is not None: data.update(self.temporal_model.to_dict(full_output)) return data @classmethod def from_dict(cls, data, **kwargs): """Create SkyModel from dictionary.""" from gammapy.modeling.models import ( SPATIAL_MODEL_REGISTRY, SPECTRAL_MODEL_REGISTRY, TEMPORAL_MODEL_REGISTRY, ) model_class = SPECTRAL_MODEL_REGISTRY.get_cls(data["spectral"]["type"]) spectral_model = model_class.from_dict({"spectral": data["spectral"]}) spatial_data = data.get("spatial") if spatial_data is not None: model_class = SPATIAL_MODEL_REGISTRY.get_cls(spatial_data["type"]) spatial_model = model_class.from_dict({"spatial": spatial_data}) else: spatial_model = None temporal_data = data.get("temporal") if temporal_data is not None: model_class = TEMPORAL_MODEL_REGISTRY.get_cls(temporal_data["type"]) temporal_model = model_class.from_dict({"temporal": temporal_data}) else: temporal_model = None return cls( name=data["name"], spatial_model=spatial_model, spectral_model=spectral_model, temporal_model=temporal_model, apply_irf=data.get("apply_irf", cls._apply_irf_default), datasets_names=data.get("datasets_names"), ) def __str__(self): str_ = f"{self.__class__.__name__}\n\n" str_ += "\t{:26}: {}\n".format("Name", self.name) str_ += "\t{:26}: {}\n".format("Datasets names", self.datasets_names) str_ += "\t{:26}: {}\n".format( "Spectral model type", self.spectral_model.__class__.__name__ ) if self.spatial_model is not None: spatial_type = 
self.spatial_model.__class__.__name__ else: spatial_type = "" str_ += "\t{:26}: {}\n".format("Spatial model type", spatial_type) if self.temporal_model is not None: temporal_type = self.temporal_model.__class__.__name__ else: temporal_type = "" str_ += "\t{:26}: {}\n".format("Temporal model type", temporal_type) str_ += "\tParameters:\n" info = _get_parameters_str(self.parameters) lines = info.split("\n") str_ += "\t" + "\n\t".join(lines[:-1]) str_ += "\n\n" return str_.expandtabs(tabsize=2) @classmethod def create(cls, spectral_model, spatial_model=None, temporal_model=None, **kwargs): """Create a model instance. Parameters ---------- spectral_model : str Tag to create spectral model. spatial_model : str, optional Tag to create spatial model. Default is None. temporal_model : str, optional Tag to create temporal model. Default is None. **kwargs : dict Keyword arguments passed to `SkyModel`. Returns ------- model : SkyModel Sky model. """ spectral_model = Model.create(spectral_model, model_type="spectral") if spatial_model: spatial_model = Model.create(spatial_model, model_type="spatial") if temporal_model: temporal_model = Model.create(temporal_model, model_type="temporal") return cls( spectral_model=spectral_model, spatial_model=spatial_model, temporal_model=temporal_model, **kwargs, ) def freeze(self, model_type=None): """Freeze parameters depending on model type. Parameters ---------- model_type : {None, "spatial", "spectral", "temporal"} Freeze all parameters or only spatial/spectral/temporal. Default is None, such that all parameters are frozen. """ if model_type is None: self.parameters.freeze_all() else: model = getattr(self, f"{model_type}_model") model.freeze() def unfreeze(self, model_type=None): """Restore parameters frozen status to default depending on model type. Parameters ---------- model_type : {None, "spatial", "spectral", "temporal"} Restore frozen status to default for all parameters or only spatial/spectral/temporal. Default is None, such that all parameters are restored to default frozen status. """ if model_type is None: for model_type in ["spectral", "spatial", "temporal"]: self.unfreeze(model_type) else: model = getattr(self, f"{model_type}_model") if model: model.unfreeze() class FoVBackgroundModel(ModelBase): """Field of view background model. The background model holds the correction parameters applied to the instrumental background attached to a `MapDataset` or `SpectrumDataset`. Parameters ---------- dataset_name : str Dataset name. spectral_model : `~gammapy.modeling.models.SpectralModel`, Optional Normalized spectral model. Default is `~gammapy.modeling.models.PowerLawNormSpectralModel` spatial_model : `~gammapy.modeling.models.SpatialModel`, Optional Unitless Spatial model (unit is dropped on evaluation if defined). Default is None. 
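Examples
--------
A minimal construction sketch; the dataset name is an arbitrary placeholder
and must match the ``name`` of the dataset the model is assigned to::

    from gammapy.modeling.models import (
        FoVBackgroundModel,
        PowerLawNormSpectralModel,
    )

    bkg = FoVBackgroundModel(
        dataset_name="my-dataset",
        spectral_model=PowerLawNormSpectralModel(),
    )
    print(bkg.name)  # "my-dataset-bkg"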
""" tag = ["FoVBackgroundModel", "fov-bkg"] def __init__( self, dataset_name, spectral_model=None, spatial_model=None, covariance_data=None, ): # TODO: remove this in v2.0 if isinstance(dataset_name, SpectralModel): warnings.warn( "dataset_name has been made first argument since v1.3.", GammapyDeprecationWarning, stacklevel=2, ) buf = dataset_name dataset_name = spectral_model spectral_model = buf self.datasets_names = [dataset_name] if spectral_model is None: spectral_model = PowerLawNormSpectralModel() if not spectral_model.is_norm_spectral_model: raise ValueError("A norm spectral model is required.") self._spatial_model = spatial_model self._spectral_model = spectral_model super().__init__(covariance_data=covariance_data) @staticmethod def contributes(*args, **kwargs): """FoV background models always contribute.""" return True @property def spectral_model(self): """Spectral norm model.""" return self._spectral_model @property def spatial_model(self): """Spatial norm model.""" return self._spatial_model @property def _models(self): models = self.spectral_model, self.spatial_model return [model for model in models if model is not None] @property def name(self): """Model name.""" return self.datasets_names[0] + "-bkg" @property def parameters(self): """Model parameters.""" parameters = [] parameters.append(self.spectral_model.parameters) if self.spatial_model is not None: parameters.append(self.spatial_model.parameters) return Parameters.from_stack(parameters) @property def parameters_unique_names(self): """List of unique parameter names. Return formatted as par_type.par_name.""" names = [] for model in self._models: for par_name in model.parameters_unique_names: components = [model.type, par_name] name = ".".join(components) names.append(name) return names def __str__(self): str_ = f"{self.__class__.__name__}\n\n" str_ += "\t{:26}: {}\n".format("Name", self.name) str_ += "\t{:26}: {}\n".format("Datasets names", self.datasets_names) str_ += "\t{:26}: {}\n".format( "Spectral model type", self.spectral_model.__class__.__name__ ) if self.spatial_model is not None: str_ += "\t{:26}: {}\n".format( "Spatial model type", self.spatial_model.__class__.__name__ ) str_ += "\tParameters:\n" info = _get_parameters_str(self.parameters) lines = info.split("\n") str_ += "\t" + "\n\t".join(lines[:-1]) str_ += "\n\n" return str_.expandtabs(tabsize=2) def evaluate_geom(self, geom): """Evaluate map.""" coords = geom.get_coord(sparse=True) return self.evaluate(**coords._data) def evaluate(self, energy, lon=None, lat=None): """Evaluate model.""" value = self.spectral_model(energy) if self.spatial_model is not None: if lon is not None and lat is not None: if self.spatial_model.is_energy_dependent: return self.spatial_model(lon, lat, energy).value * value else: return self.spatial_model(lon, lat).value * value else: raise ValueError( "lon and lat are required if a spatial model is defined" ) else: return value def copy(self, name=None, copy_data=False, **kwargs): """Copy the `FoVBackgroundModel` instance. Parameters ---------- name : str, optional Ignored, present for API compatibility. Default is None. copy_data : bool, optional Ignored, present for API compatibility. Default is False. **kwargs : dict Keyword arguments forwarded to `FoVBackgroundModel`. Returns ------- model : `FoVBackgroundModel` Copied FoV background model. 
""" kwargs.setdefault("spectral_model", self.spectral_model.copy()) kwargs.setdefault("dataset_name", self.datasets_names[0]) kwargs.setdefault("covariance_data", self.covariance.data.copy()) if self.spatial_model is not None: kwargs.setdefault("spatial_model", self.spatial_model.copy()) return self.__class__(**kwargs) def to_dict(self, full_output=False): data = {} data["type"] = self.tag[0] data["datasets_names"] = self.datasets_names data.update(self.spectral_model.to_dict(full_output=full_output)) if self.spatial_model is not None: data.update(self.spatial_model.to_dict(full_output)) return data @classmethod def from_dict(cls, data, **kwargs): """Create model from dictionary. Parameters ---------- data : dict Data dictionary. """ from gammapy.modeling.models import ( SPATIAL_MODEL_REGISTRY, SPECTRAL_MODEL_REGISTRY, ) spectral_data = data.get("spectral") if spectral_data is not None: model_class = SPECTRAL_MODEL_REGISTRY.get_cls(spectral_data["type"]) spectral_model = model_class.from_dict({"spectral": spectral_data}) else: spectral_model = None spatial_data = data.get("spatial") if spatial_data is not None: model_class = SPATIAL_MODEL_REGISTRY.get_cls(spatial_data["type"]) spatial_model = model_class.from_dict({"spatial": spatial_data}) else: spatial_model = None datasets_names = data.get("datasets_names") if datasets_names is None: raise ValueError("FoVBackgroundModel must define a dataset name") if len(datasets_names) > 1: raise ValueError("FoVBackgroundModel can only be assigned to one dataset") return cls( spatial_model=spatial_model, spectral_model=spectral_model, dataset_name=datasets_names[0], ) def reset_to_default(self): """Reset parameter values to default.""" values = self.spectral_model.default_parameters.value self.spectral_model.parameters.value = values def freeze(self, model_type="spectral"): """Freeze model parameters.""" if model_type is None or model_type == "spectral": self._spectral_model.freeze() def unfreeze(self, model_type="spectral"): """Restore parameters frozen status to default.""" if model_type is None or model_type == "spectral": self._spectral_model.unfreeze() class TemplateNPredModel(ModelBase): """Background model. Create a new map by a tilt and normalization on the available map. Parameters ---------- map : `~gammapy.maps.Map` Background model map. spectral_model : `~gammapy.modeling.models.SpectralModel` Normalized spectral model. Default is `~gammapy.modeling.models.PowerLawNormSpectralModel`. copy_data : bool Create a deepcopy of the map data or directly use the original. Default is True. Use False to save memory in case of large maps. spatial_model : `~gammapy.modeling.models.SpatialModel` Unitless Spatial model (unit is dropped on evaluation if defined). Default is None. 
""" tag = "TemplateNPredModel" map = LazyFitsData(cache=True) def __init__( self, map, spectral_model=None, name=None, filename=None, datasets_names=None, copy_data=True, spatial_model=None, covariance_data=None, ): if isinstance(map, Map): axis = map.geom.axes["energy"] if axis.node_type != "edges": raise ValueError( 'Need an integrated map, energy axis node_type="edges"' ) if copy_data: self.map = map.copy() else: self.map = map self._name = make_name(name) self.filename = filename if spectral_model is None: spectral_model = PowerLawNormSpectralModel() spectral_model.tilt.frozen = True self.spatial_model = spatial_model self.spectral_model = spectral_model if isinstance(datasets_names, str): datasets_names = [datasets_names] if isinstance(datasets_names, list): if len(datasets_names) != 1: raise ValueError( "Currently background models can only be assigned to one dataset." ) self.datasets_names = datasets_names super().__init__(covariance_data=covariance_data) def copy(self, name=None, copy_data=False, **kwargs): """Copy template npred model. Parameters ---------- name : str, optional Assign a new name to the copied model. Default is None. copy_data : bool, optional Copy the data arrays attached to models. Default is False. **kwargs : dict Keyword arguments forwarded to `TemplateNPredModel`. Returns ------- model : `TemplateNPredModel` Copied template npred model. """ name = make_name(name) kwargs.setdefault("map", self.map) kwargs.setdefault("spectral_model", self.spectral_model.copy()) kwargs.setdefault("filename", self.filename) kwargs.setdefault("datasets_names", self.datasets_names) kwargs.setdefault("covariance_data", self.covariance.data.copy()) return self.__class__(name=name, copy_data=copy_data, **kwargs) @property def name(self): return self._name @property def energy_center(self): """True energy axis bin centers as a `~astropy.units.Quantity`.""" energy_axis = self.map.geom.axes["energy"] energy = energy_axis.center return energy[:, np.newaxis, np.newaxis] @property def spectral_model(self): """Spectral model as a `~gammapy.modeling.models.SpectralModel` object.""" return self._spectral_model @spectral_model.setter def spectral_model(self, model): if not (model is None or isinstance(model, SpectralModel)): raise TypeError(f"Invalid type: {model!r}") self._spectral_model = model @property def _models(self): models = self.spectral_model, self.spatial_model return [model for model in models if model is not None] @property def parameters(self): parameters = [] parameters.append(self.spectral_model.parameters) return Parameters.from_stack(parameters) @property def parameters_unique_names(self): """List of unique parameter names. Return formatted as par_type.par_name.""" names = [] for model in self._models: for par_name in model.parameters_unique_names: components = [model.type, par_name] name = ".".join(components) names.append(name) return names def evaluate(self): """Evaluate background model. Returns ------- background_map : `~gammapy.maps.Map` Background evaluated on the Map. 
""" value = self.spectral_model(self.energy_center).value back_values = self.map.data * value if self.spatial_model is not None: value = self.spatial_model.evaluate_geom(self.map.geom).value back_values *= value return self.map.copy(data=back_values) def to_dict(self, full_output=False): data = {} data["name"] = self.name data["type"] = self.tag if self.spatial_model is not None: data["spatial"] = self.spatial_model.to_dict(full_output)["spatial"] data["spectral"] = self.spectral_model.to_dict(full_output)["spectral"] if self.filename is not None: data["filename"] = self.filename if self.datasets_names is not None: data["datasets_names"] = self.datasets_names return data def write(self, overwrite=False): """ Write the map. Parameters ---------- overwrite: bool, optional Overwrite existing file. Default is False, which will raise a warning if the template file exists already. """ if self.filename is None: raise IOError("Missing filename") elif os.path.isfile(make_path(self.filename)) and not overwrite: log.warning("Template file already exits, and overwrite is False") else: self.map.write(self.filename, overwrite=overwrite) @classmethod def from_dict(cls, data, **kwargs): from gammapy.modeling.models import ( SPATIAL_MODEL_REGISTRY, SPECTRAL_MODEL_REGISTRY, ) spectral_data = data.get("spectral") if spectral_data is not None: model_class = SPECTRAL_MODEL_REGISTRY.get_cls(spectral_data["type"]) spectral_model = model_class.from_dict({"spectral": spectral_data}) else: spectral_model = None spatial_data = data.get("spatial") if spatial_data is not None: model_class = SPATIAL_MODEL_REGISTRY.get_cls(spatial_data["type"]) spatial_model = model_class.from_dict({"spatial": spatial_data}) else: spatial_model = None if "filename" in data: bkg_map = Map.read(data["filename"]) else: raise IOError("Missing filename") return cls( map=bkg_map, spatial_model=spatial_model, spectral_model=spectral_model, name=data["name"], datasets_names=data.get("datasets_names"), filename=data.get("filename"), ) def cutout(self, position, width, mode="trim", name=None): """Cutout background model. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Center position of the cutout region. width : tuple of `~astropy.coordinates.Angle` Angular sizes of the region in (lon, lat) in that specific order. If only one value is passed, a square region is extracted. mode : {'trim', 'partial', 'strict'} Mode option for Cutout2D, for details see `~astropy.nddata.utils.Cutout2D`. Default is "trim". name : str, optional Name of the returned background model. Default is None. Returns ------- cutout : `TemplateNPredModel` Cutout background model. """ cutout_kwargs = {"position": position, "width": width, "mode": mode} bkg_map = self.map.cutout(**cutout_kwargs) spectral_model = self.spectral_model.copy() return self.__class__(bkg_map, spectral_model=spectral_model, name=name) def stack(self, other, weights=None, nan_to_num=True): """Stack background model in place. Stacking the background model resets the current parameters values. Parameters ---------- other : `TemplateNPredModel` Other background model. weights : float, optional Weights. Default is None. nan_to_num: bool, optional Non-finite values are replaced by zero if True. Default is True. 
""" bkg = self.evaluate() if nan_to_num: bkg.data[~np.isfinite(bkg.data)] = 0 other_bkg = other.evaluate() bkg.stack(other_bkg, weights=weights, nan_to_num=nan_to_num) self.map = bkg # reset parameter values self.spectral_model.norm.value = 1 self.spectral_model.tilt.value = 0 def slice_by_energy(self, energy_min=None, energy_max=None, name=None): """Select and slice model template in energy range Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Energy bounds of the slice. Default is None. name : str Name of the sliced model. Default is None. Returns ------- model : `TemplateNpredModel` Sliced Model. """ name = make_name(name) energy_axis = self.map._geom.axes["energy"] if energy_min is None: energy_min = energy_axis.bounds[0] if energy_max is None: energy_max = energy_axis.bounds[1] if name is None: name = self.name energy_min, energy_max = u.Quantity(energy_min), u.Quantity(energy_max) group = energy_axis.group_table(edges=[energy_min, energy_max]) is_normal = group["bin_type"] == "normal " group = group[is_normal] slices = { "energy": slice(int(group["idx_min"][0]), int(group["idx_max"][0]) + 1) } model = self.copy(name=name) model.map = model.map.slice_by_idx(slices=slices) return model def __str__(self): str_ = self.__class__.__name__ + "\n\n" str_ += "\t{:26}: {}\n".format("Name", self.name) str_ += "\t{:26}: {}\n".format("Datasets names", self.datasets_names) str_ += "\tParameters:\n" info = _get_parameters_str(self.parameters) lines = info.split("\n") str_ += "\t" + "\n\t".join(lines[:-1]) str_ += "\n\n" return str_.expandtabs(tabsize=2) @property def position(self): """Position as a `~astropy.coordinates.SkyCoord`.""" return self.map.geom.center_skydir @property def evaluation_radius(self): """Evaluation radius as a `~astropy.coordinates.Angle`.""" return np.max(self.map.geom.width) / 2.0 def freeze(self, model_type="spectral"): """Freeze model parameters.""" if model_type is None or model_type == "spectral": self._spectral_model.freeze() def unfreeze(self, model_type="spectral"): """Restore parameters frozen status to default.""" if model_type is None or model_type == "spectral": self._spectral_model.unfreeze() def create_fermi_isotropic_diffuse_model(filename, **kwargs): """Read Fermi isotropic diffuse model. See `LAT Background models `__. Parameters ---------- filename : str Filename. kwargs : dict Keyword arguments forwarded to `TemplateSpectralModel`. Returns ------- diffuse_model : `SkyModel` Fermi isotropic diffuse sky model. 
""" vals = np.loadtxt(make_path(filename)) energy = u.Quantity(vals[:, 0], "MeV", copy=COPY_IF_NEEDED) values = u.Quantity(vals[:, 1], "MeV-1 s-1 cm-2", copy=COPY_IF_NEEDED) kwargs.setdefault("interp_kwargs", {"fill_value": None}) spatial_model = ConstantSpatialModel() spectral_model = ( TemplateSpectralModel(energy=energy, values=values, **kwargs) * PowerLawNormSpectralModel() ) return SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="fermi-diffuse-iso", apply_irf={"psf": False, "exposure": True, "edisp": True}, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/prior.py0000644000175100001770000001244614721316200020641 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Priors for Gammapy.""" import logging import numpy as np import astropy.units as u from gammapy.modeling import PriorParameter, PriorParameters from .core import ModelBase __all__ = ["GaussianPrior", "UniformPrior", "Prior"] log = logging.getLogger(__name__) def _build_priorparameters_from_dict(data, default_parameters): """Build a `~gammapy.modeling.PriorParameters` object from input dictionary and default prior parameter values.""" par_data = [] input_names = [_["name"] for _ in data] for par in default_parameters: par_dict = par.to_dict() try: index = input_names.index(par_dict["name"]) par_dict.update(data[index]) except ValueError: log.warning( f"PriorParameter '{par_dict['name']}' not defined in YAML file." f" Using default value: {par_dict['value']} {par_dict['unit']}" ) par_data.append(par_dict) return PriorParameters.from_dict(par_data) class Prior(ModelBase): """Prior base class.""" _unit = "" def __init__(self, **kwargs): # Copy default parameters from the class to the instance default_parameters = self.default_parameters.copy() for par in default_parameters: value = kwargs.get(par.name, par) if not isinstance(value, PriorParameter): par.quantity = u.Quantity(value) else: par = value setattr(self, par.name, par) _weight = kwargs.get("weight", None) if _weight is not None: self._weight = _weight else: self._weight = 1 @property def parameters(self): """Prior parameters as a `~gammapy.modeling.PriorParameters` object.""" return PriorParameters( [getattr(self, name) for name in self.default_parameters.names] ) def __init_subclass__(cls, **kwargs): # Add priorparameters list on the model sub-class (not instances) cls.default_parameters = PriorParameters( [_ for _ in cls.__dict__.values() if isinstance(_, PriorParameter)] ) @property def weight(self): """Weight mulitplied to the prior when evaluated.""" return self._weight @weight.setter def weight(self, value): self._weight = value def __call__(self, value): """Call evaluate method.""" # assuming the same unit as the PriorParamater here kwargs = {par.name: par.value for par in self.parameters} return self.weight * self.evaluate(value.value, **kwargs) def to_dict(self, full_output=False): """Create dictionary for YAML serialisation.""" tag = self.tag[0] if isinstance(self.tag, list) else self.tag params = self.parameters.to_dict() if not full_output: for par, par_default in zip(params, self.default_parameters): init = par_default.to_dict() for item in [ "min", "max", "error", ]: default = init[item] if par[item] == default or ( np.isnan(par[item]) and np.isnan(default) ): del par[item] data = {"type": tag, "parameters": params, "weight": self.weight} if self.type is None: return data else: return {self.type: data} @classmethod 
def from_dict(cls, data, **kwargs): """Create prior from dictionary.""" kwargs = {} key0 = next(iter(data)) if key0 in ["prior"]: data = data[key0] if data["type"] not in cls.tag: raise ValueError( f"Invalid model type {data['type']} for class {cls.__name__}" ) priorparameters = _build_priorparameters_from_dict( data["parameters"], cls.default_parameters ) kwargs["weight"] = data["weight"] return cls.from_parameters(priorparameters, **kwargs) class GaussianPrior(Prior): """One-dimensional Gaussian Prior. Parameters ---------- mu : float Mean of the Gaussian distribution. Default is 0. sigma : float Standard deviation of the Gaussian distribution. Default is 1. """ tag = ["GaussianPrior"] _type = "prior" mu = PriorParameter(name="mu", value=0) sigma = PriorParameter(name="sigma", value=1) @staticmethod def evaluate(value, mu, sigma): """Evaluate the Gaussian prior.""" return ((value - mu) / sigma) ** 2 class UniformPrior(Prior): """Uniform Prior. Returns 0 if the parameter value is within (min, max), and 1 otherwise. Parameters ---------- min : float Minimum value. Default is -inf. max : float Maximum value. Default is inf. """ tag = ["UniformPrior"] _type = "prior" min = PriorParameter(name="min", value=-np.inf, unit="") max = PriorParameter(name="max", value=np.inf, unit="") @staticmethod def evaluate(value, min, max): """Evaluate the uniform prior.""" if min < value < max: return 0.0 else: return 1.0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/spatial.py0000644000175100001770000016437314721316200021152 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Spatial models.""" import logging import os import numpy as np import scipy.integrate import scipy.special from scipy.interpolate import griddata import astropy.units as u from astropy.coordinates import Angle, SkyCoord, angular_separation, position_angle from astropy.nddata import NoOverlapError from astropy.utils import lazyproperty from regions import ( CircleAnnulusSkyRegion, CircleSkyRegion, EllipseAnnulusSkyRegion, EllipseSkyRegion, PointSkyRegion, RectangleSkyRegion, ) import matplotlib.pyplot as plt from gammapy.maps import HpxNDMap, Map, MapCoord, WcsGeom, WcsNDMap from gammapy.modeling import Parameter, Parameters from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.gauss import Gauss2DPDF from gammapy.utils.interpolation import interpolation_scale from gammapy.utils.regions import region_circle_to_ellipse, region_to_frame from gammapy.utils.scripts import make_path from .core import ModelBase, _build_parameters_from_dict __all__ = [ "ConstantFluxSpatialModel", "ConstantSpatialModel", "DiskSpatialModel", "GaussianSpatialModel", "GeneralizedGaussianSpatialModel", "PointSpatialModel", "Shell2SpatialModel", "ShellSpatialModel", "SpatialModel", "TemplateSpatialModel", "TemplateNDSpatialModel", "PiecewiseNormSpatialModel", ] log = logging.getLogger(__name__) MAX_OVERSAMPLING = 200 def compute_sigma_eff(lon_0, lat_0, lon, lat, phi, major_axis, e): """Effective radius, used for the evaluation of elongated models.""" phi_0 = position_angle(lon_0, lat_0, lon, lat) d_phi = phi - phi_0 minor_axis = Angle(major_axis * np.sqrt(1 - e**2)) a2 = (major_axis * np.sin(d_phi)) ** 2 b2 = (minor_axis * np.cos(d_phi)) ** 2 denominator = np.sqrt(a2 + b2) sigma_eff = major_axis * minor_axis / denominator return minor_axis, sigma_eff class SpatialModel(ModelBase): """Spatial model base class.""" _type = "spatial" 
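# Illustrative sketch (not part of the class): a subclass declares
# ``Parameter`` attributes and a static ``evaluate``; ``__call__`` below
# gathers the parameter quantities by name and forwards them, e.g.::
#
#     class MyDiskModel(SpatialModel):
#         tag = ["MyDiskModel"]
#         lon_0 = Parameter("lon_0", "0 deg")
#         lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90)
#         r_0 = Parameter("r_0", "1 deg")
#
#         @staticmethod
#         def evaluate(lon, lat, lon_0, lat_0, r_0):
#             sep = angular_separation(lon, lat, lon_0, lat_0)
#             # uniform brightness, normalized over the disk solid angle
#             solid_angle = 2 * np.pi * (1 - np.cos(r_0)) * u.sr
#             return (sep < r_0).astype(float) / solid_angle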
def __init__(self, **kwargs): frame = kwargs.pop("frame", "icrs") super().__init__(**kwargs) if not hasattr(self, "frame"): self.frame = frame def __call__(self, lon, lat, energy=None): """Call evaluate method.""" kwargs = {par.name: par.quantity for par in self.parameters} if energy is None and self.is_energy_dependent: raise ValueError("Missing energy value for evaluation") if energy is not None: kwargs["energy"] = energy return self.evaluate(lon, lat, **kwargs) @property def evaluation_bin_size_min(self): return None # TODO: make this a hard-coded class attribute? @lazyproperty def is_energy_dependent(self): varnames = self.evaluate.__code__.co_varnames return "energy" in varnames @property def position(self): """Spatial model center position as a `~astropy.coordinates.SkyCoord`.""" lon = self.lon_0.quantity lat = self.lat_0.quantity return SkyCoord(lon, lat, frame=self.frame) @position.setter def position(self, skycoord): """Spatial model center position.""" coord = skycoord.transform_to(self.frame) self.lon_0.quantity = coord.data.lon self.lat_0.quantity = coord.data.lat @property def position_lonlat(self): """Spatial model center position `(lon, lat)` in radians and frame of the model.""" lon = self.lon_0.quantity.to_value(u.rad) lat = self.lat_0.quantity.to_value(u.rad) return lon, lat # TODO: get rid of this! _phi_0 = 0.0 @property def phi_0(self): return self._phi_0 @phi_0.setter def phi_0(self, phi_0=0.0): self._phi_0 = phi_0 @property def position_error(self): """Get 95% containment position error as `~regions.EllipseSkyRegion`.""" if self.covariance is None: raise ValueError("No position error information available.") pars = self.parameters sub_covar = self.covariance.get_subcovariance(["lon_0", "lat_0"]).data.copy() cos_lat = np.cos(self.lat_0.quantity.to_value("rad")) sub_covar[0, 0] *= cos_lat**2.0 sub_covar[0, 1] *= cos_lat sub_covar[1, 0] *= cos_lat eig_vals, eig_vecs = np.linalg.eig(sub_covar) lon_err, lat_err = np.sqrt(eig_vals) y_vec = eig_vecs[:, 0] phi = (np.arctan2(y_vec[1], y_vec[0]) * u.rad).to("deg") + self.phi_0 err = np.sort([lon_err, lat_err]) scale_r95 = Gauss2DPDF(sigma=1).containment_radius(0.95) err *= scale_r95 if err[1] == lon_err * scale_r95: phi += 90 * u.deg height = 2 * err[1] * pars["lon_0"].unit width = 2 * err[0] * pars["lat_0"].unit else: height = 2 * err[1] * pars["lat_0"].unit width = 2 * err[0] * pars["lon_0"].unit return EllipseSkyRegion( center=self.position, height=height, width=width, angle=phi ) def evaluate_geom(self, geom): """Evaluate model on `~gammapy.maps.Geom`. Parameters ---------- geom : `~gammapy.maps.WcsGeom` Map geometry. Returns ------- map : `~gammapy.maps.Map` Map containing the value in each spatial bin. """ coords = geom.get_coord(frame=self.frame, sparse=True) if self.is_energy_dependent: return self(coords.lon, coords.lat, energy=coords["energy_true"]) else: return self(coords.lon, coords.lat) def integrate_geom(self, geom, oversampling_factor=None): """Integrate model on `~gammapy.maps.Geom` or `~gammapy.maps.RegionGeom`. Integration is performed by simple rectangle approximation, the pixel center model value is multiplied by the pixel solid angle. An oversampling factor can be used for precision. By default, this parameter is set to None and an oversampling factor is automatically estimated based on the model estimation maximal bin width. For a RegionGeom, the model is integrated on a tangent WCS projection in the region. 
Parameters ---------- geom : `~gammapy.maps.WcsGeom` or `~gammapy.maps.RegionGeom` The geometry on which the integration is performed. oversampling_factor : int or None The oversampling factor to use for integration. Default is None: the factor is estimated from the model minimal bin size. Returns ------- map : `~gammapy.maps.Map` or `gammapy.maps.RegionNDMap` Map containing the integral value in each spatial bin. """ wcs_geom = geom mask = None if geom.is_region: wcs_geom = geom.to_wcs_geom().to_image() result = Map.from_geom(geom=wcs_geom) integrated = Map.from_geom(wcs_geom) pix_scale = np.max(wcs_geom.pixel_scales.to_value("deg")) if self.evaluation_radius is not None: try: width = 2 * np.maximum( self.evaluation_radius.to_value("deg"), pix_scale ) wcs_geom_cut = wcs_geom.cutout(self.position, width) integrated = Map.from_geom(wcs_geom_cut) except (NoOverlapError, ValueError): oversampling_factor = 1 if oversampling_factor is None: if self.evaluation_bin_size_min is not None: res_scale = self.evaluation_bin_size_min.to_value("deg") if res_scale > 0: oversampling_factor = np.minimum( int(np.ceil(pix_scale / res_scale)), MAX_OVERSAMPLING ) else: oversampling_factor = MAX_OVERSAMPLING else: oversampling_factor = 1 if oversampling_factor > 1: upsampled_geom = integrated.geom.upsample( oversampling_factor, axis_name=None ) # assume the upsampled solid angles are approximately factor**2 smaller values = self.evaluate_geom(upsampled_geom) / oversampling_factor**2 upsampled = Map.from_geom(upsampled_geom, unit=values.unit) upsampled += values if geom.is_region: mask = geom.contains(upsampled_geom.get_coord()).astype("int") integrated.quantity = upsampled.downsample( oversampling_factor, preserve_counts=True, weights=mask ).quantity # Finally stack result result._unit = integrated.unit result.stack(integrated) else: values = self.evaluate_geom(wcs_geom) result._unit = values.unit result += values result *= result.geom.solid_angle() if geom.is_region: mask = result.geom.region_mask([geom.region]) result = Map.from_geom( geom, data=np.sum(result.data[mask]), unit=result.unit ) return result def to_dict(self, full_output=False): """Create dictionary for YAML serialisation.""" data = super().to_dict(full_output) data["spatial"]["frame"] = self.frame data["spatial"]["parameters"] = data["spatial"].pop("parameters") return data @classmethod def from_dict(cls, data, **kwargs): """Create a spatial model from a dictionary. Parameters ---------- data : dict Dictionary containing model parameters. kwargs : dict Keyword arguments passed to `~SpatialModel.from_parameters`. """ kwargs = kwargs or {} spatial_data = data.get("spatial", data) if "frame" in spatial_data: kwargs["frame"] = spatial_data["frame"] return super().from_dict(data, **kwargs) @property def _evaluation_geom(self): if isinstance(self, TemplateSpatialModel): geom = self.map.geom else: width = 2 * max(self.evaluation_radius, 0.1 * u.deg) geom = WcsGeom.create( skydir=self.position, frame=self.frame, width=width, binsz=0.02 ) return geom def _get_plot_map(self, geom): if self.evaluation_radius is None and geom is None: raise ValueError( f"{self.__class__.__name__} requires geom to be defined for plotting." ) if geom is None: geom = self._evaluation_geom data = self.evaluate_geom(geom) return Map.from_geom(geom, data=data.value, unit=data.unit) def plot(self, ax=None, geom=None, **kwargs): """Plot spatial model. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. 
geom : `~gammapy.maps.WcsGeom`, optional Geometry to use for plotting. Default is None. **kwargs : dict Keyword arguments passed to `~gammapy.maps.WcsMap.plot()`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. """ m = self._get_plot_map(geom) if not m.geom.is_flat: raise TypeError( "Use .plot_interactive() or .plot_grid() for Map dimension > 2" ) return m.plot(ax=ax, **kwargs) def plot_interactive(self, ax=None, geom=None, **kwargs): """Plot spatial model. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. geom : `~gammapy.maps.WcsGeom`, optional Geom to use for plotting. Default is None. **kwargs : dict Keyword arguments passed to `~gammapy.maps.WcsMap.plot()`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. """ m = self._get_plot_map(geom) if m.geom.is_image: raise TypeError("Use .plot() for 2D Maps") m.plot_interactive(ax=ax, **kwargs) def plot_position_error(self, ax=None, **kwargs): """Plot position error. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes to plot the position error on. Default is None. **kwargs : dict Keyword arguments passed to `~gammapy.maps.WcsMap.plot()`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. """ # plot center position lon, lat = self.lon_0.value, self.lat_0.value ax = plt.gca() if ax is None else ax kwargs.setdefault("marker", "x") kwargs.setdefault("color", "red") kwargs.setdefault("label", "position") ax.scatter(lon, lat, transform=ax.get_transform(self.frame), **kwargs) # plot position error if not np.all(self.covariance.data == 0): region = self.position_error.to_pixel(ax.wcs) artist = region.as_artist(facecolor="none", edgecolor=kwargs["color"]) ax.add_artist(artist) return ax def _to_region_error(self): pass def plot_error( self, ax=None, which="position", kwargs_position=None, kwargs_extension=None ): """Plot the errors of the spatial model. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes to plot the errors on. Default is None. which: list of str Which errors to plot. Available options are: * "all": all the optional steps are plotted * "position": plot the position error of the spatial model * "extension": plot the extension error of the spatial model kwargs_position : dict, optional Keyword arguments passed to `~SpatialModel.plot_position_error`. Default is None. kwargs_extension : dict, optional Keyword arguments passed to `~SpatialModel.plot_extension_error`. Default is None. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. """ kwargs_position = kwargs_position or {} kwargs_extension = kwargs_extension or {} ax = plt.gca() if ax is None else ax kwargs_extension.setdefault("edgecolor", "red") kwargs_extension.setdefault("facecolor", "red") kwargs_extension.setdefault("alpha", 0.15) kwargs_extension.setdefault("fill", True) if "all" in which: self.plot_position_error(ax, **kwargs_position) region = self._to_region_error() if region is not None: artist = region.to_pixel(ax.wcs).as_artist(**kwargs_extension) ax.add_artist(artist) if "position" in which: self.plot_position_error(ax, **kwargs_position) if "extension" in which: region = self._to_region_error() if region is not None: artist = region.to_pixel(ax.wcs).as_artist(**kwargs_extension) ax.add_artist(artist) def plot_grid(self, geom=None, **kwargs): """Plot spatial model energy slices in a grid. Parameters ---------- geom : `~gammapy.maps.WcsGeom`, optional Geometry to use for plotting. 
Default is None. **kwargs : dict Keyword arguments passed to `~gammapy.maps.WcsMap.plot()`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. """ m = self._get_plot_map(geom) if (m.geom is None) or m.geom.is_image: raise TypeError("Use .plot() for 2D Maps") m.plot_grid(**kwargs) @classmethod def from_position(cls, position, **kwargs): """Define the position of the model using a `~astropy.coordinates.SkyCoord`. The model will be created in the frame of the `~astropy.coordinates.SkyCoord`. Parameters ---------- position : `~astropy.coordinates.SkyCoord` Position. Returns ------- model : `SpatialModel` Spatial model. """ lon_0, lat_0 = position.data.lon, position.data.lat return cls(lon_0=lon_0, lat_0=lat_0, frame=position.frame.name, **kwargs) @property def evaluation_radius(self): """Evaluation radius.""" return None @property def evaluation_region(self): """Evaluation region.""" if hasattr(self, "to_region"): return self.to_region() elif self.evaluation_radius is not None: return CircleSkyRegion( center=self.position, radius=self.evaluation_radius, ) else: return None class PointSpatialModel(SpatialModel): """Point Source. For more information see :ref:`point-spatial-model`. Parameters ---------- lon_0, lat_0 : `~astropy.coordinates.Angle` Center position. Default is "0 deg", "0 deg". frame : {"icrs", "galactic"} Center position coordinate frame. """ tag = ["PointSpatialModel", "point"] lon_0 = Parameter("lon_0", "0 deg") lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90) @property def evaluation_bin_size_min(self): """Minimal evaluation bin size as an `~astropy.coordinates.Angle`.""" return 0 * u.deg @property def evaluation_radius(self): """Evaluation radius as an `~astropy.coordinates.Angle`. Set as zero degrees. """ return 0 * u.deg @staticmethod def _grid_weights(x, y, x0, y0): """Compute 4-pixel weights such that centroid is preserved.""" dx = np.abs(x - x0) dx = np.where(dx < 1, 1 - dx, 0) dy = np.abs(y - y0) dy = np.where(dy < 1, 1 - dy, 0) return dx * dy @property def is_energy_dependent(self): return False def evaluate_geom(self, geom): """Evaluate model on `~gammapy.maps.Geom`.""" values = self.integrate_geom(geom).data return values / geom.solid_angle() def integrate_geom(self, geom, oversampling_factor=None): """Integrate model on `~gammapy.maps.Geom`. Parameters ---------- geom : `Geom` Map geometry. Returns ------- flux : `Map` Predicted flux map. """ geom_image = geom.to_image() if geom.is_hpx: idx, weights = geom_image.interp_weights({"skycoord": self.position}) data = np.zeros(geom_image.data_shape) data[tuple(idx)] = weights else: x, y = geom_image.get_pix() x0, y0 = self.position.to_pixel(geom.wcs) data = self._grid_weights(x, y, x0, y0) return Map.from_geom(geom=geom_image, data=data, unit="") def to_region(self, **kwargs): """Model outline as a `~regions.PointSkyRegion`.""" return PointSkyRegion(center=self.position, **kwargs) class GaussianSpatialModel(SpatialModel): r"""Two-dimensional Gaussian model. For more information see :ref:`gaussian-spatial-model`. Parameters ---------- lon_0, lat_0 : `~astropy.coordinates.Angle` Center position. Default is "0 deg", "0 deg". sigma : `~astropy.coordinates.Angle` Length of the major semiaxis of the Gaussian, in angular units. Default is 1 deg. e : `float` Eccentricity of the Gaussian (:math:`0< e< 1`). Default is 0. phi : `~astropy.coordinates.Angle` Rotation angle :math:`\phi`: of the major semiaxis. Increases counter-clockwise from the North direction. Default is 0 deg. 
frame : {"icrs", "galactic"} Center position coordinate frame. """ tag = ["GaussianSpatialModel", "gauss"] lon_0 = Parameter("lon_0", "0 deg") lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90) sigma = Parameter("sigma", "1 deg", min=0) e = Parameter("e", 0, min=0, max=1, frozen=True) phi = Parameter("phi", "0 deg", frozen=True) @property def evaluation_bin_size_min(self): """Minimal evaluation bin size as an `~astropy.coordinates.Angle`, chosen as sigma/3.""" return self.parameters["sigma"].quantity / 3.0 @property def evaluation_radius(self): r"""Evaluation radius as an `~astropy.coordinates.Angle`. Set as :math:`5\sigma`. """ return 5 * self.parameters["sigma"].quantity @staticmethod def evaluate(lon, lat, lon_0, lat_0, sigma, e, phi): """Evaluate model.""" sep = angular_separation(lon, lat, lon_0, lat_0) if e == 0: a = 1.0 - np.cos(sigma) norm = (1 / (4 * np.pi * a * (1.0 - np.exp(-1.0 / a)))).value else: minor_axis, sigma_eff = compute_sigma_eff( lon_0, lat_0, lon, lat, phi, sigma, e ) a = 1.0 - np.cos(sigma_eff) norm = (1 / (2 * np.pi * sigma * minor_axis)).to_value("sr-1") exponent = -0.5 * ((1 - np.cos(sep)) / a) return u.Quantity(norm * np.exp(exponent).value, "sr-1", copy=COPY_IF_NEEDED) def to_region(self, x_sigma=1.5, **kwargs): r"""Model outline at a given number of :math:`\sigma`. Parameters ---------- x_sigma : float Number of :math:`\sigma Default is :math:`1.5\sigma` which corresponds to about 68% containment for a 2D symmetric Gaussian. Returns ------- region : `~regions.EllipseSkyRegion` Model outline. """ minor_axis = Angle(self.sigma.quantity * np.sqrt(1 - self.e.quantity**2)) return EllipseSkyRegion( center=self.position, height=2 * x_sigma * self.sigma.quantity, width=2 * x_sigma * minor_axis, angle=self.phi.quantity, **kwargs, ) @property def evaluation_region(self): """Evaluation region consistent with evaluation radius.""" return self.to_region(x_sigma=5) def _to_region_error(self, x_sigma=1.5): r"""Plot model error at a given number of :math:`\sigma`. Parameters ---------- x_sigma : float Number of :math:`\sigma Default is :math:`1.5\sigma` which corresponds to about 68% containment for a 2D symmetric Gaussian. Returns ------- region : `~regions.EllipseSkyRegion` Model error region. """ sigma_hi = self.sigma.quantity + (self.sigma.error * self.sigma.unit) sigma_lo = self.sigma.quantity - (self.sigma.error * self.sigma.unit) minor_axis_hi = Angle( sigma_hi * np.sqrt(1 - (self.e.quantity + self.e.error) ** 2) ) minor_axis_lo = Angle( sigma_lo * np.sqrt(1 - (self.e.quantity - self.e.error) ** 2) ) return EllipseAnnulusSkyRegion( center=self.position, inner_height=2 * x_sigma * sigma_lo, outer_height=2 * x_sigma * sigma_hi, inner_width=2 * x_sigma * minor_axis_lo, outer_width=2 * x_sigma * minor_axis_hi, angle=self.phi.quantity, ) class GeneralizedGaussianSpatialModel(SpatialModel): r"""Two-dimensional Generalized Gaussian model. For more information see :ref:`generalized-gaussian-spatial-model`. Parameters ---------- lon_0, lat_0 : `~astropy.coordinates.Angle` Center position. Default is "0 deg", "0 deg". r_0 : `~astropy.coordinates.Angle` Length of the major semiaxis, in angular units. Default is 1 deg. eta : `float` Shape parameter within (0, 1). Special cases for disk: ->0, Gaussian: 0.5, Laplace:1 Default is 0.5. e : `float` Eccentricity (:math:`0< e< 1`). Default is 0. phi : `~astropy.coordinates.Angle` Rotation angle :math:`\phi`: of the major semiaxis. Increases counter-clockwise from the North direction. Default is 0 deg. 
frame : {"icrs", "galactic"} Center position coordinate frame. """ tag = ["GeneralizedGaussianSpatialModel", "gauss-general"] lon_0 = Parameter("lon_0", "0 deg") lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90) r_0 = Parameter("r_0", "1 deg") eta = Parameter("eta", 0.5, min=0.01, max=1.0) e = Parameter("e", 0.0, min=0.0, max=1.0, frozen=True) phi = Parameter("phi", "0 deg", frozen=True) @staticmethod def evaluate(lon, lat, lon_0, lat_0, r_0, eta, e, phi): sep = angular_separation(lon, lat, lon_0, lat_0) if isinstance(eta, u.Quantity): eta = eta.value # gamma function does not allow quantities minor_axis, r_eff = compute_sigma_eff(lon_0, lat_0, lon, lat, phi, r_0, e) z = sep / r_eff norm = 1 / (2 * np.pi * minor_axis * r_0 * eta * scipy.special.gamma(2 * eta)) return (norm * np.exp(-(z ** (1 / eta)))).to("sr-1") @property def evaluation_bin_size_min(self): """Minimal evaluation bin size as an `~astropy.coordinates.Angle`. The bin min size is defined as r_0/(3+8*eta)/(e+1). """ return self.r_0.quantity / (3 + 8 * self.eta.value) / (self.e.value + 1) @property def evaluation_radius(self): r"""Evaluation radius as an `~astropy.coordinates.Angle`. The evaluation radius is defined as r_eval = r_0*(1+8*eta) so it verifies: r_eval -> r_0 if eta -> 0 r_eval = 5*r_0 > 5*sigma_gauss = 5*r_0/sqrt(2) ~ 3.5*r_0 if eta=0.5 r_eval = 9*r_0 > 5*sigma_laplace = 5*sqrt(2)*r_0 ~ 7*r_0 if eta = 1 r_eval -> inf if eta -> inf """ return self.r_0.quantity * (1 + 8 * self.eta.value) def to_region(self, x_r_0=1, **kwargs): r"""Model outline at a given number of :math:`r_0`. Parameters ---------- x_r_0 : float, optional Number of :math:`r_0`. Default is 1. Returns ------- region : `~regions.EllipseSkyRegion` Model outline. """ minor_axis = Angle(self.r_0.quantity * np.sqrt(1 - self.e.quantity**2)) return EllipseSkyRegion( center=self.position, height=2 * x_r_0 * self.r_0.quantity, width=2 * x_r_0 * minor_axis, angle=self.phi.quantity, **kwargs, ) @property def evaluation_region(self): """Evaluation region consistent with evaluation radius.""" scale = self.evaluation_radius / self.r_0.quantity return self.to_region(x_r_0=scale) def _to_region_error(self, x_r_0=1): r"""Model error at a given number of :math:`r_0`. Parameters ---------- x_r_0 : float, optional Number of :math:`r_0`. Default is 1. Returns ------- region : `~regions.EllipseSkyRegion` Model error region. """ r_0_lo = self.r_0.quantity - self.r_0.error * self.r_0.unit r_0_hi = self.r_0.quantity + self.r_0.error * self.r_0.unit minor_axis_hi = Angle( r_0_hi * np.sqrt(1 - (self.e.quantity + self.e.error) ** 2) ) minor_axis_lo = Angle( r_0_lo * np.sqrt(1 - (self.e.quantity - self.e.error) ** 2) ) return EllipseAnnulusSkyRegion( center=self.position, inner_height=2 * x_r_0 * r_0_lo, outer_height=2 * x_r_0 * r_0_hi, inner_width=2 * x_r_0 * minor_axis_lo, outer_width=2 * x_r_0 * minor_axis_hi, angle=self.phi.quantity, ) class DiskSpatialModel(SpatialModel): r"""Constant disk model. For more information see :ref:`disk-spatial-model`. Parameters ---------- lon_0, lat_0 : `~astropy.coordinates.Angle` Center position. Default is "0 deg", "0 deg". r_0 : `~astropy.coordinates.Angle` :math:`a`: length of the major semiaxis, in angular units. Default is 1 deg. e : `float` Eccentricity of the ellipse (:math:`0< e< 1`). Default is 0. phi : `~astropy.coordinates.Angle` Rotation angle :math:`\phi`: of the major semiaxis. Increases counter-clockwise from the North direction. Default is 0 deg. edge_width : float Width of the edge. 
The width is defined as the range within which the smooth edge of the model drops from 95% to 5% of its amplitude. It is given as a fraction of r_0. Default is 0.01. frame : {"icrs", "galactic"} Center position coordinate frame. """ tag = ["DiskSpatialModel", "disk"] lon_0 = Parameter("lon_0", "0 deg") lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90) r_0 = Parameter("r_0", "1 deg", min=0) e = Parameter("e", 0, min=0, max=1, frozen=True) phi = Parameter("phi", "0 deg", frozen=True) edge_width = Parameter("edge_width", value=0.01, min=0, max=1, frozen=True) @property def evaluation_bin_size_min(self): """Minimal evaluation bin size as an `~astropy.coordinates.Angle`. The bin min size is defined as r_0*(1-edge_width)/10. """ return self.r_0.quantity * (1 - self.edge_width.quantity) / 10.0 @property def evaluation_radius(self): """Evaluation radius as an `~astropy.coordinates.Angle`. Set to 1.1 times the length of the semi-major axis, including the edge width. """ return 1.1 * self.r_0.quantity * (1 + self.edge_width.quantity) @staticmethod def _evaluate_norm_factor(r_0, e): """Compute the normalization factor.""" semi_minor = r_0 * np.sqrt(1 - e**2) def integral_fcn(x, a, b): A = 1 / np.sin(a) ** 2 B = 1 / np.sin(b) ** 2 C = A - B cs2 = np.cos(x) ** 2 return 1 - np.sqrt(1 - 1 / (B + C * cs2)) return ( 2 * scipy.integrate.quad( lambda x: integral_fcn(x, r_0, semi_minor), 0, np.pi )[0] ) ** -1 @staticmethod def _evaluate_smooth_edge(x, width): value = (x / width).to_value("") edge_width_95 = 2.326174307353347 return 0.5 * (1 - scipy.special.erf(value * edge_width_95)) @staticmethod def evaluate(lon, lat, lon_0, lat_0, r_0, e, phi, edge_width): """Evaluate model.""" sep = angular_separation(lon, lat, lon_0, lat_0) if e == 0: sigma_eff = r_0 else: sigma_eff = compute_sigma_eff(lon_0, lat_0, lon, lat, phi, r_0, e)[1] norm = DiskSpatialModel._evaluate_norm_factor(r_0, e) in_ellipse = DiskSpatialModel._evaluate_smooth_edge( sep - sigma_eff, sigma_eff * edge_width ) return u.Quantity(norm * in_ellipse, "sr-1", copy=COPY_IF_NEEDED) def to_region(self, **kwargs): """Model outline as a `~regions.EllipseSkyRegion`.""" minor_axis = Angle(self.r_0.quantity * np.sqrt(1 - self.e.quantity**2)) return EllipseSkyRegion( center=self.position, height=2 * self.r_0.quantity, width=2 * minor_axis, angle=self.phi.quantity, **kwargs, ) @classmethod def from_region(cls, region, **kwargs): """Create a `DiskSpatialModel` from a `~regions.EllipseSkyRegion`. Parameters ---------- region : `~regions.EllipseSkyRegion` or `~regions.CircleSkyRegion` Region to create model from. kwargs : dict Keyword arguments passed to `~gammapy.modeling.models.DiskSpatialModel`. Returns ------- spatial_model : `~gammapy.modeling.models.DiskSpatialModel` Spatial model. """ if isinstance(region, CircleSkyRegion): region = region_circle_to_ellipse(region) if not isinstance(region, EllipseSkyRegion): raise ValueError( f"Please provide a `CircleSkyRegion` " f"or `EllipseSkyRegion`, got {type(region)} instead." ) frame = kwargs.pop("frame", region.center.frame) region = region_to_frame(region, frame=frame) if region.height > region.width: major_axis, minor_axis = region.height, region.width phi = region.angle else: minor_axis, major_axis = region.height, region.width phi = 90 * u.deg + region.angle kwargs.setdefault("phi", phi) kwargs.setdefault("e", np.sqrt(1.0 - np.power(minor_axis / major_axis, 2))) kwargs.setdefault("r_0", major_axis / 2.0) return cls.from_position(region.center, **kwargs) def _to_region_error(self): """Model error. 
Returns ------- region : `~regions.EllipseSkyRegion` Model error region. """ r_0_lo = self.r_0.quantity - self.r_0.error * self.r_0.unit r_0_hi = self.r_0.quantity + self.r_0.error * self.r_0.unit minor_axis_hi = Angle( r_0_hi * np.sqrt(1 - (self.e.quantity + self.e.error) ** 2) ) minor_axis_lo = Angle( r_0_lo * np.sqrt(1 - (self.e.quantity - self.e.error) ** 2) ) return EllipseAnnulusSkyRegion( center=self.position, inner_height=2 * r_0_lo, outer_height=2 * r_0_hi, inner_width=2 * minor_axis_lo, outer_width=2 * minor_axis_hi, angle=self.phi.quantity, ) class ShellSpatialModel(SpatialModel): r"""Shell model. For more information see :ref:`shell-spatial-model`. Parameters ---------- lon_0, lat_0 : `~astropy.coordinates.Angle` Center position. Default is "0 deg", "0 deg". radius : `~astropy.coordinates.Angle` Inner radius, :math:`r_{in}`. Default is 1 deg. width : `~astropy.coordinates.Angle` Shell width. Default is 0.2 deg. frame : {"icrs", "galactic"} Center position coordinate frame. See Also -------- Shell2SpatialModel """ tag = ["ShellSpatialModel", "shell"] lon_0 = Parameter("lon_0", "0 deg") lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90) radius = Parameter("radius", "1 deg") width = Parameter("width", "0.2 deg") @property def evaluation_bin_size_min(self): """Minimal evaluation bin size as an `~astropy.coordinates.Angle`. The bin min size is defined as the shell width. """ return self.width.quantity @property def evaluation_radius(self): r"""Evaluation radius as an `~astropy.coordinates.Angle`. Set to :math:`r_\text{out}`. """ return self.radius.quantity + self.width.quantity @staticmethod def evaluate(lon, lat, lon_0, lat_0, radius, width): """Evaluate model.""" sep = angular_separation(lon, lat, lon_0, lat_0) radius_out = radius + width norm = 3 / (2 * np.pi * (radius_out**3 - radius**3)) with np.errstate(invalid="ignore"): # np.where and np.select do not work with quantities, so we use the # workaround with indexing value = np.sqrt(radius_out**2 - sep**2) mask = sep < radius value[mask] = (value - np.sqrt(radius**2 - sep**2))[mask] value[sep > radius_out] = 0 return norm * value def to_region(self, **kwargs): """Model outline as a `~regions.CircleAnnulusSkyRegion`.""" return CircleAnnulusSkyRegion( center=self.position, inner_radius=self.radius.quantity, outer_radius=self.radius.quantity + self.width.quantity, **kwargs, ) class Shell2SpatialModel(SpatialModel): r"""Shell model with outer radius and relative width parametrization. For more information see :ref:`shell2-spatial-model`. Parameters ---------- lon_0, lat_0 : `~astropy.coordinates.Angle` Center position. Default is "0 deg", "0 deg". r_0 : `~astropy.coordinates.Angle` Outer radius, :math:`r_{out}`. Default is 1 deg. eta : float Shell width relative to outer radius, r_0, should be within (0,1). Default is 0.2. frame : {"icrs", "galactic"} Center position coordinate frame. See Also -------- ShellSpatialModel """ tag = ["Shell2SpatialModel", "shell2"] lon_0 = Parameter("lon_0", "0 deg") lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90) r_0 = Parameter("r_0", "1 deg") eta = Parameter("eta", 0.2, min=0.02, max=1) @property def evaluation_bin_size_min(self): """Minimal evaluation bin size as an `~astropy.coordinates.Angle`. The bin min size is defined as r_0*eta. """ return self.eta.value * self.r_0.quantity @property def evaluation_radius(self): r"""Evaluation radius as an `~astropy.coordinates.Angle`. Set to :math:`r_\text{out}`. 
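Examples
--------
Sketch relating the two shell parametrisations (illustrative values), where
this property returns :math:`r_0 = r_\text{out}`::

    from gammapy.modeling.models import Shell2SpatialModel, ShellSpatialModel

    m1 = Shell2SpatialModel(r_0="1 deg", eta=0.2)
    m2 = ShellSpatialModel(radius="0.8 deg", width="0.2 deg")
    # both describe the same shell: r_in = 0.8 deg, r_out = 1.0 deg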
""" return self.r_0.quantity @property def r_in(self): return (1 - self.eta.quantity) * self.r_0.quantity @staticmethod def evaluate(lon, lat, lon_0, lat_0, r_0, eta): """Evaluate model.""" sep = angular_separation(lon, lat, lon_0, lat_0) r_in = (1 - eta) * r_0 norm = 3 / (2 * np.pi * (r_0**3 - r_in**3)) with np.errstate(invalid="ignore"): # np.where and np.select do not work with quantities, so we use the # workaround with indexing value = np.sqrt(r_0**2 - sep**2) mask = sep < r_in value[mask] = (value - np.sqrt(r_in**2 - sep**2))[mask] value[sep > r_0] = 0 return norm * value def to_region(self, **kwargs): """Model outline as a `~regions.CircleAnnulusSkyRegion`.""" return CircleAnnulusSkyRegion( center=self.position, inner_radius=self.r_in, outer_radius=self.r_0.quantity, **kwargs, ) class ConstantSpatialModel(SpatialModel): """Spatially constant (isotropic) spatial model. For more information see :ref:`constant-spatial-model`. Parameters ---------- value : `~astropy.units.Quantity` Value. Default is 1 sr-1. """ tag = ["ConstantSpatialModel", "const"] value = Parameter("value", "1 sr-1", frozen=True) frame = "icrs" evaluation_radius = None position = None def to_dict(self, full_output=False): """Create dictionary for YAML serilisation.""" # redefined to ignore frame attribute from parent class data = super().to_dict(full_output) data["spatial"].pop("frame") data["spatial"]["parameters"] = [] return data @staticmethod def evaluate(lon, lat, value): """Evaluate model.""" return value def to_region(self, **kwargs): """Model outline as a `~regions.RectangleSkyRegion`.""" return RectangleSkyRegion( center=SkyCoord(0 * u.deg, 0 * u.deg, frame=self.frame), height=180 * u.deg, width=360 * u.deg, **kwargs, ) class ConstantFluxSpatialModel(SpatialModel): """Spatially constant flux spatial model. For more information see :ref:`constant-spatial-model`. """ tag = ["ConstantFluxSpatialModel", "const-flux"] frame = "icrs" evaluation_radius = None position = None def to_dict(self, full_output=False): """Create dictionary for YAML serilisation.""" # redefined to ignore frame attribute from parent class data = super().to_dict(full_output) data["spatial"].pop("frame") return data @staticmethod def evaluate(lon, lat): """Evaluate model.""" return 1 / u.sr @staticmethod def evaluate_geom(geom): """Evaluate model.""" return 1 / geom.solid_angle() @staticmethod def integrate_geom(geom, oversampling_factor=None): """Evaluate model.""" return Map.from_geom(geom=geom, data=1) def to_region(self, **kwargs): """Model outline as a `~regions.RectangleSkyRegion`.""" return RectangleSkyRegion( center=SkyCoord(0 * u.deg, 0 * u.deg, frame=self.frame), height=180 * u.deg, width=360 * u.deg, **kwargs, ) class TemplateSpatialModel(SpatialModel): """Spatial sky map template model. For more information see :ref:`template-spatial-model`. By default, the position of the model is fixed at the center of the map. The position can be fitted by unfreezing the `lon_0` and `lat_0` parameters. In that case, the coordinate of every pixel is shifted in lon and lat in the frame of the map. NOTE: planar distances are calculated, so the results are correct only when the fitted position is close to the map center. Parameters ---------- map : `~gammapy.maps.Map` Map template. meta : dict, optional Meta information, meta['filename'] will be used for serialisation. normalize : bool Normalize the input map so that it integrates to unity. interp_kwargs : dict Interpolation keyword arguments passed to `gammapy.maps.Map.interp_by_coord`. 
Default arguments are {'method': 'linear', 'fill_value': 0, "values_scale": "log"}. filename : str Name of the map file. copy_data : bool Create a deepcopy of the map data or directly use the original. Default is True. Use False to save memory in case of large maps. **kwargs : dict Keyword arguments forwarded to `SpatialModel.__init__`. """ tag = ["TemplateSpatialModel", "template"] lon_0 = Parameter("lon_0", np.nan, unit="deg", frozen=True) lat_0 = Parameter("lat_0", np.nan, unit="deg", min=-90, max=90, frozen=True) def __init__( self, map, meta=None, normalize=True, interp_kwargs=None, filename=None, copy_data=True, **kwargs, ): if (map.data < 0).any(): log.warning("Map has negative values. Check and fix this!") if (map.data == 0.0).all(): log.warning("Map values are all zeros. Check and fix this!") if np.isnan(map.data).any(): log.warning("Map has NaN values. Check and fix this!") if not map.geom.is_image and (map.data.sum(axis=(1, 2)) == 0).any(): log.warning( "Map values are all zeros in at least one energy bin. Check and fix this!" ) if filename is not None: filename = str(make_path(filename)) self.normalize = normalize if normalize: # Normalize the diffuse map model so that it integrates to unity if map.geom.is_image: data_sum = map.data.sum() else: # Normalize in each energy bin data_sum = map.data.sum(axis=(1, 2)).reshape((-1, 1, 1)) data = np.divide( map.data.astype(float), data_sum, out=np.zeros_like(map.data, dtype=float), where=data_sum != 0, ) data /= map.geom.solid_angle().to_value("sr") map = map.copy(data=data, unit="sr-1") if map.unit.is_equivalent(""): map = map.copy(data=map.data, unit="sr-1") log.warning("Missing spatial template unit, assuming sr^-1") if copy_data: self._map = map.copy() else: self._map = map.copy(data=map.data) self.meta = {} if meta is None else meta interp_kwargs = {} if interp_kwargs is None else interp_kwargs interp_kwargs.setdefault("method", "linear") interp_kwargs.setdefault("fill_value", 0) interp_kwargs.setdefault("values_scale", "log") self._interp_kwargs = interp_kwargs self.filename = filename kwargs["frame"] = self.map.geom.frame if "lon_0" not in kwargs or ( isinstance(kwargs["lon_0"], Parameter) and np.isnan(kwargs["lon_0"].value) ): kwargs["lon_0"] = self.map_center.data.lon if "lat_0" not in kwargs or ( isinstance(kwargs["lat_0"], Parameter) and np.isnan(kwargs["lat_0"].value) ): kwargs["lat_0"] = self.map_center.data.lat super().__init__(**kwargs) def __str__(self): width = self.map.geom.width data_min = np.nanmin(self.map.data) data_max = np.nanmax(self.map.data) prnt = ( f"{self.__class__.__name__} model summary:\n " f"Model center: {self.position} \n " f"Map center: {self.map_center} \n " f"Map width: {width} \n " f"Data min: {data_min} \n" f"Data max: {data_max} \n" f"Data unit: {self.map.unit}" ) if self.is_energy_dependent: energy_min = self.map.geom.axes["energy_true"].center[0] energy_max = self.map.geom.axes["energy_true"].center[-1] prnt1 = f"Energy min: {energy_min} \n" f"Energy max: {energy_max} \n" prnt = prnt + prnt1 return prnt def copy(self, copy_data=False, **kwargs): """Copy model. Parameters ---------- copy_data : bool Whether to copy the data. Default is False. **kwargs : dict Keyword arguments forwarded to `TemplateSpatialModel`. Returns ------- model : `TemplateSpatialModel` Copied template spatial model. 
""" kwargs.setdefault("map", self.map) kwargs.setdefault("meta", self.meta.copy()) kwargs.setdefault("normalize", self.normalize) kwargs.setdefault("interp_kwargs", self._interp_kwargs) kwargs.setdefault("filename", self.filename) kwargs.setdefault("lon_0", self.parameters["lon_0"].copy()) kwargs.setdefault("lat_0", self.parameters["lat_0"].copy()) return self.__class__(copy_data=copy_data, **kwargs) @property def map(self): """Template map as a `~gammapy.maps.Map` object.""" return self._map @property def is_energy_dependent(self): return "energy_true" in self.map.geom.axes.names @property def evaluation_radius(self): """Evaluation radius as an `~astropy.coordinates.Angle`. Set to half of the maximal dimension of the map. """ return np.max(self.map.geom.width) / 2.0 @property def map_center(self): return self.map.geom.center_skydir @classmethod def read(cls, filename, normalize=True, **kwargs): """Read spatial template model from FITS image. If unit is not given in the FITS header the default is ``sr-1``. Parameters ---------- filename : str FITS image filename. normalize : bool Normalize the input map so that it integrates to unity. kwargs : dict Keyword arguments passed to `Map.read()`. """ m = Map.read(filename, **kwargs) return cls(m, normalize=normalize, filename=filename) def evaluate(self, lon, lat, energy=None, lon_0=None, lat_0=None): """Evaluate the model at given coordinates. Note that, if the map data assume negative values, these are clipped to zero. """ offset_lon = 0.0 * u.deg if lon_0 is None else lon_0 - self.map_center.data.lon offset_lat = 0.0 * u.deg if lat_0 is None else lat_0 - self.map_center.data.lat coord = { "lon": (lon - offset_lon).to_value("deg"), "lat": (lat - offset_lat).to_value("deg"), } if energy is not None: coord["energy_true"] = energy val = self.map.interp_by_coord(coord, **self._interp_kwargs) val = np.clip(val, 0, a_max=None) return u.Quantity(val, self.map.unit, copy=COPY_IF_NEEDED) @property def position_lonlat(self): """Spatial model center position `(lon, lat)` in radians and frame of the model.""" lon = self.position.data.lon.rad lat = self.position.data.lat.rad return lon, lat @classmethod def from_dict(cls, data): data = data["spatial"] filename = data["filename"] normalize = data.get("normalize", True) m = Map.read(filename) pars = data.get("parameters") if pars is not None: parameters = _build_parameters_from_dict(pars, cls.default_parameters) kwargs = {par.name: par for par in parameters} else: kwargs = {} return cls(m, normalize=normalize, filename=filename, **kwargs) def to_dict(self, full_output=False): """Create dictionary for YAML serilisation.""" data = super().to_dict(full_output) data["spatial"]["filename"] = self.filename data["spatial"]["normalize"] = self.normalize data["spatial"]["unit"] = str(self.map.unit) return data def write(self, overwrite=False, filename=None): """ Write the map. Parameters ---------- overwrite: bool, optional Overwrite existing file. Default is False, which will raise a warning if the template file exists already. filename: str, optional Filename of the template model. By default, the template model will be saved with the `TemplateSpatialModel.filename` attribute, if `filename` is provided this attribute will be updated. 
""" if filename is not None: self.filename = filename if self.filename is None: raise IOError("Missing filename") if os.path.isfile(make_path(self.filename)) and not overwrite: log.warning("Template file already exits, and overwrite is False") else: self.map.write(self.filename, overwrite=overwrite) def to_region(self, **kwargs): """Model outline from template map boundary as a `~regions.RectangleSkyRegion`.""" return RectangleSkyRegion( center=self.map.geom.center_skydir, width=self.map.geom.width[0][0], height=self.map.geom.width[1][0], **kwargs, ) def plot(self, ax=None, geom=None, **kwargs): if geom is None: geom = self.map.geom super().plot(ax=ax, geom=geom, **kwargs) def plot_interactive(self, ax=None, geom=None, **kwargs): if geom is None: geom = self.map.geom super().plot_interactive(ax=ax, geom=geom, **kwargs) class TemplateNDSpatialModel(SpatialModel): """A model generated from a ND map where extra dimensions define the parameter space. Parameters ---------- map : `~gammapy.maps.WcsNDMap` or `~gammapy.maps.HpxNDMap` Map template. meta : dict, optional Meta information, meta['filename'] will be used for serialisation. interp_kwargs : dict Interpolation keyword arguments passed to `gammapy.maps.Map.interp_by_pix`. Default arguments are {'method': 'linear', 'fill_value': 0, "values_scale": "log"}. copy_data : bool Create a deepcopy of the map data or directly use the original. Default is True. Use False to save memory in case of large maps. """ tag = ["TemplateNDSpatialModel", "templateND"] def __init__( self, map, interp_kwargs=None, meta=None, filename=None, copy_data=True, ): if not isinstance(map, (HpxNDMap, WcsNDMap)): raise TypeError("Map should be a HpxNDMap or WcsNDMap") if copy_data: self._map = map.copy() else: self._map = map.copy(data=map.data) self.meta = dict() if meta is None else meta if filename is not None: filename = str(make_path(filename)) self.filename = filename parameters = [] for axis in map.geom.axes: if axis.name not in ["energy_true", "energy"]: center = (axis.bounds[1] + axis.bounds[0]) / 2 parameter = Parameter( name=axis.name, value=center, unit=axis.unit, scale_method="scale10", min=axis.bounds[0], max=axis.bounds[-1], interp=axis.interp, ) parameters.append(parameter) self.default_parameters = Parameters(parameters) interp_kwargs = interp_kwargs or {} interp_kwargs.setdefault("method", "linear") interp_kwargs.setdefault("fill_value", 0) interp_kwargs.setdefault("values_scale", "log") self._interp_kwargs = interp_kwargs super().__init__() @property def map(self): """Template map as a `~gammapy.maps.WcsNDMap` or `~gammapy.maps.HpxNDMap`.""" return self._map @property def is_energy_dependent(self): return "energy_true" in self.map.geom.axes.names def evaluate(self, lon, lat, energy=None, **kwargs): coord = { "lon": lon.to_value("deg"), "lat": lat.to_value("deg"), } if energy is not None: coord["energy_true"] = energy coord.update(kwargs) val = self.map.interp_by_coord(coord, **self._interp_kwargs) val = np.clip(val, 0, a_max=None) return u.Quantity(val, self.map.unit, copy=COPY_IF_NEEDED) def write(self, overwrite=False): """ Write the map. Parameters ---------- overwrite: bool, optional Overwrite existing file. Default is False, which will raise a warning if the template file exists already. 
""" if self.filename is None: raise IOError("Missing filename") elif os.path.isfile(self.filename) and not overwrite: log.warning("Template file already exits, and overwrite is False") else: self.map.write(self.filename) @classmethod def from_dict(cls, data): data = data["spatial"] filename = data["filename"] m = Map.read(filename) model = cls(m, filename=filename) for idx, p in enumerate(model.parameters): par = p.to_dict() par.update(data["parameters"][idx]) setattr(model, p.name, Parameter(**par)) return model def to_dict(self, full_output=False): """Create dictionary for YAML serialisation.""" data = super().to_dict(full_output) data["spatial"]["filename"] = self.filename data["spatial"]["unit"] = str(self.map.unit) return data class PiecewiseNormSpatialModel(SpatialModel): """Piecewise spatial correction with a free normalization at each fixed nodes. For more information see :ref:`piecewise-norm-spatial`. Parameters ---------- coord : `gammapy.maps.MapCoord` Flat coordinates list at which the model values are given (nodes). norms : `~numpy.ndarray` or list of `Parameter` Array with the initial norms of the model at energies ``energy``. Normalisation parameters are created for each value. Default is one at each node. interp : {"lin", "log"} Interpolation scaling. Default is "lin". """ tag = ["PiecewiseNormSpatialModel", "piecewise-norm"] def __init__(self, coords, norms=None, interp="lin", **kwargs): self._coords = coords.copy() lon = Angle(coords.lon).wrap_at(0 * u.deg) self._wrap_angle = (lon.max() - lon.min()) / 2 self._coords["lon"] = Angle(lon).wrap_at(self._wrap_angle) self._interp = interp if norms is None: norms = np.ones(coords.shape) if len(norms) != coords.shape[0]: raise ValueError("dimension mismatch") if len(norms) < 4: raise ValueError("Input arrays must contain at least 4 elements") if self.is_energy_dependent: raise ValueError("Energy dependent nodes are not supported") if not isinstance(norms[0], Parameter): parameters = Parameters( [Parameter(f"norm_{k}", norm) for k, norm in enumerate(norms)] ) else: parameters = Parameters(norms) self.default_parameters = parameters super().__init__(**kwargs) @property def coords(self): """Energy nodes.""" return self._coords @property def norms(self): """Norm values.""" return u.Quantity([p.quantity for p in self.parameters]) @property def is_energy_dependent(self): keys = self.coords._data.keys() return "energy" in keys or "energy_true" in keys def evaluate(self, lon, lat, energy=None, **norms): """Evaluate the model at given coordinates.""" scale = interpolation_scale(scale=self._interp) v_nodes = scale(self.norms.value) coords = [value.value for value in self.coords._data.values()] # TODO: apply axes scaling in this loop coords = list(zip(*coords)) lon = Angle(lon).wrap_at(0 * u.deg) lon = Angle(lon).wrap_at(self._wrap_angle) # by default rely on CloughTocher2DInterpolator # (Piecewise cubic, C1 smooth, curvature-minimizing interpolant) interpolated = griddata(coords, v_nodes, (lon, lat), method="cubic") return scale.inverse(interpolated) * self.norms.unit def evaluate_geom(self, geom): """Evaluate model on `~gammapy.maps.Geom`. Parameters ---------- geom : `~gammapy.maps.WcsGeom` Map geometry. Returns ------- map : `~gammapy.maps.Map` Map containing the value in each spatial bin. 
""" coords = geom.get_coord(frame=self.frame, sparse=True) return self(coords.lon, coords.lat) def to_dict(self, full_output=False): data = super().to_dict(full_output=full_output) for key, value in self.coords._data.items(): data["spatial"][key] = { "data": value.data.tolist(), "unit": str(value.unit), } return data @classmethod def from_dict(cls, data): """Create model from dictionary.""" data = data["spatial"] lon = u.Quantity(data["lon"]["data"], data["lon"]["unit"]) lat = u.Quantity(data["lat"]["data"], data["lat"]["unit"]) coords = MapCoord.create((lon, lat)) parameters = Parameters.from_dict(data["parameters"]) return cls.from_parameters(parameters, coords=coords, frame=data["frame"]) @classmethod def from_parameters(cls, parameters, **kwargs): """Create model from parameters.""" return cls(norms=parameters, **kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/spectral.py0000644000175100001770000024351414721316200021325 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Spectral models for Gammapy.""" import logging import operator import os import warnings from pathlib import Path import numpy as np import scipy.optimize import scipy.special import astropy.units as u from astropy import constants as const from astropy.table import Table from astropy.units import Quantity from astropy.utils.decorators import classproperty from astropy.visualization import quantity_support import matplotlib.pyplot as plt from gammapy.maps import MapAxis, RegionNDMap from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.modeling import Parameter, Parameters from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.deprecation import GammapyDeprecationWarning from gammapy.utils.integrate import trapz_loglog from gammapy.utils.interpolation import ( ScaledRegularGridInterpolator, interpolation_scale, ) from gammapy.utils.roots import find_roots from gammapy.utils.scripts import make_path from ..covariance import CovarianceMixin from .core import ModelBase log = logging.getLogger(__name__) __all__ = [ "BrokenPowerLawSpectralModel", "CompoundSpectralModel", "ConstantSpectralModel", "EBLAbsorptionNormSpectralModel", "ExpCutoffPowerLaw3FGLSpectralModel", "ExpCutoffPowerLawNormSpectralModel", "ExpCutoffPowerLawSpectralModel", "GaussianSpectralModel", "integrate_spectrum", "LogParabolaNormSpectralModel", "LogParabolaSpectralModel", "NaimaSpectralModel", "PiecewiseNormSpectralModel", "PowerLaw2SpectralModel", "PowerLawNormSpectralModel", "PowerLawSpectralModel", "scale_plot_flux", "ScaleSpectralModel", "SmoothBrokenPowerLawSpectralModel", "SpectralModel", "SuperExpCutoffPowerLaw3FGLSpectralModel", "SuperExpCutoffPowerLaw4FGLDR3SpectralModel", "SuperExpCutoffPowerLaw4FGLSpectralModel", "TemplateSpectralModel", "TemplateNDSpectralModel", "EBL_DATA_BUILTIN", ] EBL_DATA_BUILTIN = { "franceschini": "$GAMMAPY_DATA/ebl/ebl_franceschini.fits.gz", "dominguez": "$GAMMAPY_DATA/ebl/ebl_dominguez11.fits.gz", "finke": "$GAMMAPY_DATA/ebl/frd_abs.fits.gz", "franceschini17": "$GAMMAPY_DATA/ebl/ebl_franceschini_2017.fits.gz", "saldana-lopez21": "$GAMMAPY_DATA/ebl/ebl_saldana-lopez_2021.fits.gz", } def scale_plot_flux(flux, energy_power=0): """Scale flux to plot. Parameters ---------- flux : `Map` Flux map. energy_power : int, optional Power of energy to multiply flux axis with. Default is 0. Returns ------- flux : `Map` Scaled flux map. 
""" energy = flux.geom.get_coord(sparse=True)["energy"] try: eunit = [_ for _ in flux.unit.bases if _.physical_type == "energy"][0] except IndexError: eunit = energy.unit y = flux * np.power(energy, energy_power) return y.to_unit(flux.unit * eunit**energy_power) def integrate_spectrum(func, energy_min, energy_max, ndecade=100): """Integrate one-dimensional function using the log-log trapezoidal rule. Internally an oversampling of the energy bins to "ndecade" is used. Parameters ---------- func : callable Function to integrate. energy_min : `~astropy.units.Quantity` Integration range minimum. energy_max : `~astropy.units.Quantity` Integration range minimum. ndecade : int, optional Number of grid points per decade used for the integration. Default is 100. """ # Here we impose to duplicate the number num = np.maximum(np.max(ndecade * np.log10(energy_max / energy_min)), 2) energy = np.geomspace(energy_min, energy_max, num=int(num), axis=-1) integral = trapz_loglog(func(energy), energy, axis=-1) return integral.sum(axis=0) class SpectralModel(ModelBase): """Spectral model base class.""" _type = "spectral" def __call__(self, energy): kwargs = {par.name: par.quantity for par in self.parameters} kwargs = self._convert_evaluate_unit(kwargs, energy) return self.evaluate(energy, **kwargs) @classproperty def is_norm_spectral_model(cls): """Whether model is a norm spectral model.""" return "Norm" in cls.__name__ @staticmethod def _convert_evaluate_unit(kwargs_ref, energy): kwargs = {} for name, quantity in kwargs_ref.items(): if quantity.unit.physical_type == "energy": quantity = quantity.to(energy.unit) kwargs[name] = quantity return kwargs def __add__(self, model): if not isinstance(model, SpectralModel): model = ConstantSpectralModel(const=model) return CompoundSpectralModel(self, model, operator.add) def __mul__(self, other): if isinstance(other, SpectralModel): return CompoundSpectralModel(self, other, operator.mul) else: raise TypeError(f"Multiplication invalid for type {other!r}") def __radd__(self, model): return self.__add__(model) def __sub__(self, model): if not isinstance(model, SpectralModel): model = ConstantSpectralModel(const=model) return CompoundSpectralModel(self, model, operator.sub) def __rsub__(self, model): return self.__sub__(model) def _propagate_error(self, epsilon, fct, **kwargs): """Evaluate error for a given function with uncertainty propagation. Parameters ---------- fct : `~astropy.units.Quantity` Function to estimate the error. epsilon : float Step size of the gradient evaluation. Given as a fraction of the parameter error. **kwargs : dict Keyword arguments. Returns ------- f_cov : `~astropy.units.Quantity` Error of the given function. """ eps = np.sqrt(np.diag(self.covariance)) * epsilon n, f_0 = len(self.parameters), fct(**kwargs) shape = (n, len(np.atleast_1d(f_0))) df_dp = np.zeros(shape) for idx, parameter in enumerate(self.parameters): if parameter.frozen or eps[idx] == 0: continue parameter.value += eps[idx] df = fct(**kwargs) - f_0 df_dp[idx] = df.value / eps[idx] parameter.value -= eps[idx] f_cov = df_dp.T @ self.covariance @ df_dp f_err = np.sqrt(np.diagonal(f_cov)) return u.Quantity([np.atleast_1d(f_0.value), f_err], unit=f_0.unit).squeeze() def evaluate_error(self, energy, epsilon=1e-4): """Evaluate spectral model with error propagation. Parameters ---------- energy : `~astropy.units.Quantity` Energy at which to evaluate. epsilon : float, optional Step size of the gradient evaluation. Given as a fraction of the parameter error. Default is 1e-4. 
Returns ------- dnde, dnde_error : tuple of `~astropy.units.Quantity` Tuple of flux and flux error. """ return self._propagate_error(epsilon=epsilon, fct=self, energy=energy) @property def pivot_energy(self): """Pivot or decorrelation energy for a given spectral model, calculated numerically. It is defined as the energy at which the correlation between the spectral parameters is minimized. Returns ------- pivot energy : `~astropy.units.Quantity` The energy at which the statistical error in the computed flux is smallest. If no minimum is found, NaN will be returned. """ x_unit = self.reference.unit def min_func(x): """Function to minimise.""" x = np.exp(x) dnde, dnde_error = self.evaluate_error(x * x_unit) return dnde_error / dnde bounds = [np.log(self.reference.value) - 3, np.log(self.reference.value) + 3] std = np.std(min_func(x=np.linspace(bounds[0], bounds[1], 100))) if std < 1e-5: log.warning( "The relative error on the flux does not depend on energy. No pivot energy found." ) return np.nan * x_unit minimizer = scipy.optimize.minimize_scalar(min_func, bounds=bounds) if not minimizer.success: log.warning( "No minima found in the relative error on the flux. Pivot energy computation failed." ) return np.nan * x_unit else: return np.exp(minimizer.x) * x_unit def integral(self, energy_min, energy_max, **kwargs): r"""Integrate spectral model numerically if no analytical solution is defined. .. math:: F(E_{min}, E_{max}) = \int_{E_{min}}^{E_{max}} \phi(E) dE Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range. **kwargs : dict Keyword arguments passed to :func:`~gammapy.modeling.models.spectral.integrate_spectrum`. """ if hasattr(self, "evaluate_integral"): kwargs = {par.name: par.quantity for par in self.parameters} kwargs = self._convert_evaluate_unit(kwargs, energy_min) return self.evaluate_integral(energy_min, energy_max, **kwargs) else: return integrate_spectrum(self, energy_min, energy_max, **kwargs) def integral_error(self, energy_min, energy_max, epsilon=1e-4, **kwargs): """Evaluate the error of the integral flux of a given spectrum in a given energy range. Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range. epsilon : float, optional Step size of the gradient evaluation. Given as a fraction of the parameter error. Default is 1e-4. Returns ------- flux, flux_err : tuple of `~astropy.units.Quantity` Integral flux and flux error between energy_min and energy_max. """ return self._propagate_error( epsilon=epsilon, fct=self.integral, energy_min=energy_min, energy_max=energy_max, **kwargs, ) def energy_flux(self, energy_min, energy_max, **kwargs): r"""Compute energy flux in given energy range. .. math:: G(E_{min}, E_{max}) = \int_{E_{min}}^{E_{max}} E \phi(E) dE Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range. **kwargs : dict Keyword arguments passed to :func:`~gammapy.modeling.models.spectral.integrate_spectrum`. """ def f(x): return x * self(x) if hasattr(self, "evaluate_energy_flux"): kwargs = {par.name: par.quantity for par in self.parameters} kwargs = self._convert_evaluate_unit(kwargs, energy_min) return self.evaluate_energy_flux(energy_min, energy_max, **kwargs) else: return integrate_spectrum(f, energy_min, energy_max, **kwargs) def energy_flux_error(self, energy_min, energy_max, epsilon=1e-4, **kwargs): """Evaluate the error of the energy flux of a given spectrum in a given energy range.
Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range. epsilon : float, optional Step size of the gradient evaluation. Given as a fraction of the parameter error. Default is 1e-4. Returns ------- energy_flux, energy_flux_err : tuple of `~astropy.units.Quantity` Energy flux and energy flux error between energy_min and energy_max. """ return self._propagate_error( epsilon=epsilon, fct=self.energy_flux, energy_min=energy_min, energy_max=energy_max, **kwargs, ) def reference_fluxes(self, energy_axis): """Get reference fluxes for a given energy axis. Parameters ---------- energy_axis : `MapAxis` Energy axis. Returns ------- fluxes : dict of `~astropy.units.Quantity` Reference fluxes. """ energy = energy_axis.center energy_min, energy_max = energy_axis.edges_min, energy_axis.edges_max return { "e_ref": energy, "e_min": energy_min, "e_max": energy_max, "ref_dnde": self(energy), "ref_flux": self.integral(energy_min, energy_max), "ref_eflux": self.energy_flux(energy_min, energy_max), "ref_e2dnde": self(energy) * energy**2, } def _get_plot_flux(self, energy, sed_type): flux = RegionNDMap.create(region=None, axes=[energy]) flux_err = RegionNDMap.create(region=None, axes=[energy]) if sed_type in ["dnde", "norm"]: flux.quantity, flux_err.quantity = self.evaluate_error(energy.center) elif sed_type == "e2dnde": flux.quantity, flux_err.quantity = energy.center**2 * self.evaluate_error( energy.center ) elif sed_type == "flux": flux.quantity, flux_err.quantity = self.integral_error( energy.edges_min, energy.edges_max ) elif sed_type == "eflux": flux.quantity, flux_err.quantity = self.energy_flux_error( energy.edges_min, energy.edges_max ) else: raise ValueError(f"Not a valid SED type: '{sed_type}'") return flux, flux_err def plot( self, energy_bounds, ax=None, sed_type="dnde", energy_power=0, n_points=100, **kwargs, ): """Plot spectral model curve. By default a log-log scaling of the axes is used; if you want to change the y-axis scaling to linear you can use:: >>> from gammapy.modeling.models import ExpCutoffPowerLawSpectralModel >>> from astropy import units as u >>> pwl = ExpCutoffPowerLawSpectralModel() >>> ax = pwl.plot(energy_bounds=(0.1, 100) * u.TeV) >>> ax.set_yscale('linear') Parameters ---------- energy_bounds : `~astropy.units.Quantity`, list of `~astropy.units.Quantity` or `~gammapy.maps.MapAxis` Energy bounds between which the model is to be plotted, or an axis defining those energy bounds. ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. sed_type : {"dnde", "flux", "eflux", "e2dnde"} Evaluation method of the model. Default is "dnde". energy_power : int, optional Power of energy to multiply flux axis with. Default is 0. n_points : int, optional Number of evaluation nodes. Default is 100. **kwargs : dict Keyword arguments forwarded to `~matplotlib.pyplot.plot`. Returns ------- ax : `~matplotlib.axes.Axes` Matplotlib axes. Notes ----- If ``energy_bounds`` is supplied as a list, tuple, or Quantity, an ``energy_axis`` is created internally with ``n_points`` bins between the given bounds.
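Examples
--------
A minimal sketch plotting a power law in E^2 dN/dE; the model and the
bounds are illustrative:

>>> import astropy.units as u
>>> from gammapy.modeling.models import PowerLawSpectralModel
>>> pwl = PowerLawSpectralModel(index=2.3)
>>> ax = pwl.plot(energy_bounds=[0.1, 100] * u.TeV, sed_type="e2dnde")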
""" from gammapy.estimators.map.core import DEFAULT_UNIT if self.is_norm_spectral_model: sed_type = "norm" if isinstance(energy_bounds, (tuple, list, Quantity)): energy_min, energy_max = energy_bounds energy = MapAxis.from_energy_bounds( energy_min, energy_max, n_points, ) elif isinstance(energy_bounds, MapAxis): energy = energy_bounds ax = plt.gca() if ax is None else ax if ax.yaxis.units is None: ax.yaxis.set_units(DEFAULT_UNIT[sed_type] * energy.unit**energy_power) flux, _ = self._get_plot_flux(sed_type=sed_type, energy=energy) flux = scale_plot_flux(flux, energy_power=energy_power) with quantity_support(): ax.plot(energy.center, flux.quantity[:, 0, 0], **kwargs) self._plot_format_ax(ax, energy_power, sed_type) return ax def plot_error( self, energy_bounds, ax=None, sed_type="dnde", energy_power=0, n_points=100, **kwargs, ): """Plot spectral model error band. .. note:: This method calls ``ax.set_yscale("log", nonpositive='clip')`` and ``ax.set_xscale("log", nonposx='clip')`` to create a log-log representation. The additional argument ``nonposx='clip'`` avoids artefacts in the plot, when the error band extends to negative values (see also https://github.com/matplotlib/matplotlib/issues/8623). When you call ``plt.loglog()`` or ``plt.semilogy()`` explicitly in your plotting code and the error band extends to negative values, it is not shown correctly. To circumvent this issue also use ``plt.loglog(nonposx='clip', nonpositive='clip')`` or ``plt.semilogy(nonpositive='clip')``. Parameters ---------- energy_bounds : `~astropy.units.Quantity`, list of `~astropy.units.Quantity` or `~gammapy.maps.MapAxis` Energy bounds between which the model is to be plotted. Or an axis defining the energy bounds between which the model is to be plotted. ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Default is None. sed_type : {"dnde", "flux", "eflux", "e2dnde"} Evaluation methods of the model. Default is "dnde". energy_power : int, optional Power of energy to multiply flux axis with. Default is 0. n_points : int, optional Number of evaluation nodes. Default is 100. **kwargs : dict Keyword arguments forwarded to `matplotlib.pyplot.fill_between`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. Notes ----- If ``energy_bounds`` is supplied as a list, tuple, or Quantity, an ``energy_axis`` is created internally with ``n_points`` bins between the given bounds. 
""" from gammapy.estimators.map.core import DEFAULT_UNIT if self.is_norm_spectral_model: sed_type = "norm" if isinstance(energy_bounds, (tuple, list, Quantity)): energy_min, energy_max = energy_bounds energy = MapAxis.from_energy_bounds( energy_min, energy_max, n_points, ) elif isinstance(energy_bounds, MapAxis): energy = energy_bounds ax = plt.gca() if ax is None else ax kwargs.setdefault("facecolor", "black") kwargs.setdefault("alpha", 0.2) kwargs.setdefault("linewidth", 0) if ax.yaxis.units is None: ax.yaxis.set_units(DEFAULT_UNIT[sed_type] * energy.unit**energy_power) flux, flux_err = self._get_plot_flux(sed_type=sed_type, energy=energy) y_lo = scale_plot_flux(flux - flux_err, energy_power).quantity[:, 0, 0] y_hi = scale_plot_flux(flux + flux_err, energy_power).quantity[:, 0, 0] with quantity_support(): ax.fill_between(energy.center, y_lo, y_hi, **kwargs) self._plot_format_ax(ax, energy_power, sed_type) return ax @staticmethod def _plot_format_ax(ax, energy_power, sed_type): ax.set_xlabel(f"Energy [{ax.xaxis.units.to_string(UNIT_STRING_FORMAT)}]") if energy_power > 0: ax.set_ylabel( f"e{energy_power} * {sed_type} [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) else: ax.set_ylabel( f"{sed_type} [{ax.yaxis.units.to_string(UNIT_STRING_FORMAT)}]" ) ax.set_xscale("log", nonpositive="clip") ax.set_yscale("log", nonpositive="clip") def spectral_index(self, energy, epsilon=1e-5): """Compute spectral index at given energy. Parameters ---------- energy : `~astropy.units.Quantity` Energy at which to estimate the index. epsilon : float, optional Fractional energy increment to use for determining the spectral index. Default is 1e-5. Returns ------- index : float Estimated spectral index. """ f1 = self(energy) f2 = self(energy * (1 + epsilon)) return np.log(f1 / f2) / np.log(1 + epsilon) def spectral_index_error(self, energy, epsilon=1e-5): """Evaluate the error on spectral index at the given energy. Parameters ---------- energy : `~astropy.units.Quantity` Energy at which to estimate the index. epsilon : float, optional Fractional energy increment to use for determining the spectral index. Default is 1e-5. Returns ------- index, index_error : tuple of float Estimated spectral index and its error. """ return self._propagate_error( epsilon=epsilon, fct=self.spectral_index, energy=energy ) def inverse(self, value, energy_min=0.1 * u.TeV, energy_max=100 * u.TeV): """Return energy for a given function value of the spectral model. Calls the `scipy.optimize.brentq` numerical root finding method. Parameters ---------- value : `~astropy.units.Quantity` Function value of the spectral model. energy_min : `~astropy.units.Quantity`, optional Lower energy bound of the roots finding. Default is 0.1 TeV. energy_max : `~astropy.units.Quantity`, optional Upper energy bound of the roots finding. Default is 100 TeV. Returns ------- energy : `~astropy.units.Quantity` Energies at which the model has the given ``value``. """ eunit = "TeV" energy_min = energy_min.to(eunit) energy_max = energy_max.to(eunit) def f(x): # scale by 1e12 to achieve better precision energy = u.Quantity(x, eunit, copy=COPY_IF_NEEDED) y = self(energy).to_value(value.unit) return 1e12 * (y - value.value) roots, res = find_roots(f, energy_min, energy_max, points_scale="log") return roots def inverse_all(self, values, energy_min=0.1 * u.TeV, energy_max=100 * u.TeV): """Return energies for multiple function values of the spectral model. Calls the `scipy.optimize.brentq` numerical root finding method. 
Parameters ---------- values : `~astropy.units.Quantity` Function values of the spectral model. energy_min : `~astropy.units.Quantity`, optional Lower energy bound of the roots finding. Default is 0.1 TeV. energy_max : `~astropy.units.Quantity`, optional Upper energy bound of the roots finding. Default is 100 TeV. Returns ------- energy : list of `~astropy.units.Quantity` Each element contains the energies at which the model has corresponding value of ``values``. """ energies = [] for val in np.atleast_1d(values): res = self.inverse(val, energy_min, energy_max) energies.append(res) return energies class ConstantSpectralModel(SpectralModel): r"""Constant model. For more information see :ref:`constant-spectral-model`. Parameters ---------- const : `~astropy.units.Quantity` :math:`k`. Default is 1e-12 cm-2 s-1 TeV-1. """ tag = ["ConstantSpectralModel", "const"] const = Parameter("const", "1e-12 cm-2 s-1 TeV-1") @staticmethod def evaluate(energy, const): """Evaluate the model (static function).""" return np.ones(np.atleast_1d(energy).shape) * const class CompoundSpectralModel(CovarianceMixin, SpectralModel): """Arithmetic combination of two spectral models. For more information see :ref:`compound-spectral-model`. """ tag = ["CompoundSpectralModel", "compound"] def __init__(self, model1, model2, operator): self.model1 = model1 self.model2 = model2 self.operator = operator super().__init__() @property def _models(self): return [self.model1, self.model2] @property def parameters(self): return self.model1.parameters + self.model2.parameters @property def parameters_unique_names(self): names = [] for idx, model in enumerate(self._models): for par_name in model.parameters_unique_names: components = [f"model{idx+1}", par_name] name = ".".join(components) names.append(name) return names def __str__(self): return ( f"{self.__class__.__name__}\n" f" Component 1 : {self.model1}\n" f" Component 2 : {self.model2}\n" f" Operator : {self.operator.__name__}\n" ) def __call__(self, energy): val1 = self.model1(energy) val2 = self.model2(energy) return self.operator(val1, val2) def to_dict(self, full_output=False): dict1 = self.model1.to_dict(full_output) dict2 = self.model2.to_dict(full_output) return { self._type: { "type": self.tag[0], "model1": dict1["spectral"], # for cleaner output "model2": dict2["spectral"], "operator": self.operator.__name__, } } def evaluate(self, energy, *args): args1 = args[: len(self.model1.parameters)] args2 = args[len(self.model1.parameters) :] val1 = self.model1.evaluate(energy, *args1) val2 = self.model2.evaluate(energy, *args2) return self.operator(val1, val2) @classmethod def from_dict(cls, data, **kwargs): from gammapy.modeling.models import SPECTRAL_MODEL_REGISTRY data = data["spectral"] model1_cls = SPECTRAL_MODEL_REGISTRY.get_cls(data["model1"]["type"]) model1 = model1_cls.from_dict({"spectral": data["model1"]}) model2_cls = SPECTRAL_MODEL_REGISTRY.get_cls(data["model2"]["type"]) model2 = model2_cls.from_dict({"spectral": data["model2"]}) op = getattr(operator, data["operator"]) return cls(model1, model2, op) class PowerLawSpectralModel(SpectralModel): r"""Spectral power-law model. For more information see :ref:`powerlaw-spectral-model`. Parameters ---------- index : `~astropy.units.Quantity` :math:`\Gamma`. Default is 2.0. amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. 
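Examples
--------
A minimal usage sketch; the parameter values are illustrative:

>>> import astropy.units as u
>>> from gammapy.modeling.models import PowerLawSpectralModel
>>> pwl = PowerLawSpectralModel(index=2.3, amplitude="1e-12 cm-2 s-1 TeV-1")
>>> dnde = pwl(1 * u.TeV)  # differential flux at 1 TeV
>>> flux = pwl.integral(1 * u.TeV, 10 * u.TeV)  # analytical integral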
See Also -------- PowerLaw2SpectralModel, PowerLawNormSpectralModel """ tag = ["PowerLawSpectralModel", "pl"] index = Parameter("index", 2.0) amplitude = Parameter( "amplitude", "1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) reference = Parameter("reference", "1 TeV", frozen=True) @staticmethod def evaluate(energy, index, amplitude, reference): """Evaluate the model (static function).""" return amplitude * np.power((energy / reference), -index) @staticmethod def evaluate_integral(energy_min, energy_max, index, amplitude, reference): r"""Integrate power law analytically (static function). .. math:: F(E_{min}, E_{max}) = \int_{E_{min}}^{E_{max}}\phi(E)dE = \left. \phi_0 \frac{E_0}{-\Gamma + 1} \left( \frac{E}{E_0} \right)^{-\Gamma + 1} \right \vert _{E_{min}}^{E_{max}} Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range. """ val = -1 * index + 1 prefactor = amplitude * reference / val upper = np.power((energy_max / reference), val) lower = np.power((energy_min / reference), val) integral = prefactor * (upper - lower) mask = np.isclose(val, 0) if mask.any(): integral[mask] = (amplitude * reference * np.log(energy_max / energy_min))[ mask ] return integral @staticmethod def evaluate_energy_flux(energy_min, energy_max, index, amplitude, reference): r"""Compute energy flux in given energy range analytically (static function). .. math:: G(E_{min}, E_{max}) = \int_{E_{min}}^{E_{max}}E \phi(E)dE = \left. \phi_0 \frac{E_0^2}{-\Gamma + 2} \left( \frac{E}{E_0} \right)^{-\Gamma + 2} \right \vert _{E_{min}}^{E_{max}} Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range. """ val = -1 * index + 2 prefactor = amplitude * reference**2 / val upper = (energy_max / reference) ** val lower = (energy_min / reference) ** val energy_flux = prefactor * (upper - lower) mask = np.isclose(val, 0) if mask.any(): # see https://www.wolframalpha.com/input/?i=a+*+x+*+(x%2Fb)+%5E+(-2) # for reference energy_flux[mask] = ( amplitude * reference**2 * np.log(energy_max / energy_min)[mask] ) return energy_flux def inverse(self, value, *args): """Return energy for a given function value of the spectral model. Parameters ---------- value : `~astropy.units.Quantity` Function value of the spectral model. """ base = value / self.amplitude.quantity return self.reference.quantity * np.power(base, -1.0 / self.index.value) @property def pivot_energy(self): r"""The pivot or decorrelation energy is defined as: .. math:: E_D = E_0 * \exp{cov(\phi_0, \Gamma) / (\phi_0 \Delta \Gamma^2)} Formula (1) in https://arxiv.org/pdf/0910.4881.pdf Returns ------- pivot energy : `~astropy.units.Quantity` If no minimum is found, NaN will be returned. """ index_err = self.index.error reference = self.reference.quantity amplitude = self.amplitude.quantity cov_index_ampl = self.covariance.data[0, 1] * amplitude.unit return reference * np.exp(cov_index_ampl / (amplitude * index_err**2)) class PowerLawNormSpectralModel(SpectralModel): r"""Spectral power-law model with normalized amplitude parameter. Parameters ---------- tilt : `~astropy.units.Quantity` :math:`\Gamma`. Default is 0. norm : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. 
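Examples
--------
A norm model is meant to multiply another spectral model; a minimal
sketch with illustrative values:

>>> from gammapy.modeling.models import PowerLawSpectralModel, PowerLawNormSpectralModel
>>> pwl = PowerLawSpectralModel()
>>> norm = PowerLawNormSpectralModel(norm=2, tilt=0.1)
>>> compound = pwl * norm  # CompoundSpectralModel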
See Also -------- PowerLawSpectralModel, PowerLaw2SpectralModel """ tag = ["PowerLawNormSpectralModel", "pl-norm"] norm = Parameter("norm", 1, unit="", interp="log") tilt = Parameter("tilt", 0, frozen=True) reference = Parameter("reference", "1 TeV", frozen=True) @staticmethod def evaluate(energy, tilt, norm, reference): """Evaluate the model (static function).""" return norm * np.power((energy / reference), -tilt) @staticmethod def evaluate_integral(energy_min, energy_max, tilt, norm, reference): """Evaluate powerlaw integral.""" val = -1 * tilt + 1 prefactor = norm * reference / val upper = np.power((energy_max / reference), val) lower = np.power((energy_min / reference), val) integral = prefactor * (upper - lower) mask = np.isclose(val, 0) if mask.any(): integral[mask] = (norm * reference * np.log(energy_max / energy_min))[mask] return integral @staticmethod def evaluate_energy_flux(energy_min, energy_max, tilt, norm, reference): """Evaluate the energy flux (static function).""" val = -1 * tilt + 2 prefactor = norm * reference**2 / val upper = (energy_max / reference) ** val lower = (energy_min / reference) ** val energy_flux = prefactor * (upper - lower) mask = np.isclose(val, 0) if mask.any(): # see https://www.wolframalpha.com/input/?i=a+*+x+*+(x%2Fb)+%5E+(-2) # for reference energy_flux[mask] = ( norm * reference**2 * np.log(energy_max / energy_min)[mask] ) return energy_flux def inverse(self, value, *args): """Return energy for a given function value of the spectral model. Parameters ---------- value : `~astropy.units.Quantity` Function value of the spectral model. """ base = value / self.norm.quantity return self.reference.quantity * np.power(base, -1.0 / self.tilt.value) @property def pivot_energy(self): r"""The pivot or decorrelation energy is defined as: .. math:: E_D = E_0 * \exp{cov(\phi_0, \Gamma) / (\phi_0 \Delta \Gamma^2)} Formula (1) in https://arxiv.org/pdf/0910.4881.pdf Returns ------- pivot energy : `~astropy.units.Quantity` If no minimum is found, NaN will be returned. """ tilt_err = self.tilt.error reference = self.reference.quantity norm = self.norm.quantity cov_tilt_norm = self.covariance.data[0, 1] * norm.unit return reference * np.exp(cov_tilt_norm / (norm * tilt_err**2)) class PowerLaw2SpectralModel(SpectralModel): r"""Spectral power-law model with integral as amplitude parameter. For more information see :ref:`powerlaw2-spectral-model`. Parameters ---------- index : `~astropy.units.Quantity` Spectral index :math:`\Gamma`. Default is 2. amplitude : `~astropy.units.Quantity` Integral flux :math:`F_0`. Default is 1e-12 cm-2 s-1. emin : `~astropy.units.Quantity` Lower energy limit :math:`E_{0, min}`. Default is 0.1 TeV. emax : `~astropy.units.Quantity` Upper energy limit :math:`E_{0, max}`. Default is 100 TeV. 
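Examples
--------
A minimal sketch; by construction the integral between ``emin`` and
``emax`` equals the ``amplitude`` (values are illustrative):

>>> import astropy.units as u
>>> from gammapy.modeling.models import PowerLaw2SpectralModel
>>> pl2 = PowerLaw2SpectralModel(amplitude="1e-12 cm-2 s-1", index=2.3)
>>> flux = pl2.integral(0.1 * u.TeV, 100 * u.TeV)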
See Also -------- PowerLawSpectralModel, PowerLawNormSpectralModel """ tag = ["PowerLaw2SpectralModel", "pl-2"] amplitude = Parameter( name="amplitude", value="1e-12 cm-2 s-1", scale_method="scale10", interp="log", ) index = Parameter("index", 2) emin = Parameter("emin", "0.1 TeV", frozen=True) emax = Parameter("emax", "100 TeV", frozen=True) @staticmethod def evaluate(energy, amplitude, index, emin, emax): """Evaluate the model (static function).""" top = -index + 1 # to get the energies dimensionless we use a modified formula bottom = emax - emin * (emin / emax) ** (-index) return amplitude * (top / bottom) * np.power(energy / emax, -index) @staticmethod def evaluate_integral(energy_min, energy_max, amplitude, index, emin, emax): r"""Integrate power law analytically. .. math:: F(E_{min}, E_{max}) = F_0 \cdot \frac{E_{max}^{\Gamma + 1} \ - E_{min}^{\Gamma + 1}}{E_{0, max}^{\Gamma + 1} \ - E_{0, min}^{\Gamma + 1}} Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range. """ temp1 = np.power(energy_max, -index.value + 1) temp2 = np.power(energy_min, -index.value + 1) top = temp1 - temp2 temp1 = np.power(emax, -index.value + 1) temp2 = np.power(emin, -index.value + 1) bottom = temp1 - temp2 return amplitude * top / bottom def inverse(self, value, *args): """Return energy for a given function value of the spectral model. Parameters ---------- value : `~astropy.units.Quantity` Function value of the spectral model. """ amplitude = self.amplitude.quantity index = self.index.value energy_min = self.emin.quantity energy_max = self.emax.quantity # to get the energies dimensionless we use a modified formula top = -index + 1 bottom = energy_max - energy_min * (energy_min / energy_max) ** (-index) term = (bottom / top) * (value / amplitude) return np.power(term.to_value(""), -1.0 / index) * energy_max class BrokenPowerLawSpectralModel(SpectralModel): r"""Spectral broken power-law model. For more information see :ref:`broken-powerlaw-spectral-model`. Parameters ---------- index1 : `~astropy.units.Quantity` :math:`\Gamma1`. Default is 2. index2 : `~astropy.units.Quantity` :math:`\Gamma2`. Default is 2. amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1. ebreak : `~astropy.units.Quantity` :math:`E_{break}`. Default is 1 TeV. See Also -------- SmoothBrokenPowerLawSpectralModel """ tag = ["BrokenPowerLawSpectralModel", "bpl"] index1 = Parameter("index1", 2.0) index2 = Parameter("index2", 2.0) amplitude = Parameter( name="amplitude", value="1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) ebreak = Parameter("ebreak", "1 TeV") @staticmethod def evaluate(energy, index1, index2, amplitude, ebreak): """Evaluate the model (static function).""" energy = np.atleast_1d(energy) cond = energy < ebreak bpwl = amplitude * np.ones(energy.shape) bpwl[cond] *= (energy[cond] / ebreak) ** (-index1) bpwl[~cond] *= (energy[~cond] / ebreak) ** (-index2) return bpwl class SmoothBrokenPowerLawSpectralModel(SpectralModel): r"""Spectral smooth broken power-law model. For more information see :ref:`smooth-broken-powerlaw-spectral-model`. Parameters ---------- index1 : `~astropy.units.Quantity` :math:`\Gamma_1`. Default is 2. index2 : `~astropy.units.Quantity` :math:`\Gamma_2`. Default is 2. amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. ebreak : `~astropy.units.Quantity` :math:`E_{break}`. Default is 1 TeV. 
beta : `~astropy.units.Quantity` :math:`\beta`. Default is 1. See Also -------- BrokenPowerLawSpectralModel """ tag = ["SmoothBrokenPowerLawSpectralModel", "sbpl"] index1 = Parameter("index1", 2.0) index2 = Parameter("index2", 2.0) amplitude = Parameter( name="amplitude", value="1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) ebreak = Parameter("ebreak", "1 TeV") reference = Parameter("reference", "1 TeV", frozen=True) beta = Parameter("beta", 1, frozen=True) @staticmethod def evaluate(energy, index1, index2, amplitude, ebreak, reference, beta): """Evaluate the model (static function).""" beta *= np.sign(index2 - index1) pwl = amplitude * (energy / reference) ** (-index1) brk = (1 + (energy / ebreak) ** ((index2 - index1) / beta)) ** (-beta) return pwl * brk class PiecewiseNormSpectralModel(SpectralModel): """Piecewise spectral correction with a free normalization at each fixed energy node. For more information see :ref:`piecewise-norm-spectral`. Parameters ---------- energy : `~astropy.units.Quantity` Array of energies at which the model values are given (nodes). norms : `~numpy.ndarray` or list of `Parameter` Array with the initial norms of the model at energies ``energy``. Normalisation parameters are created for each value. Default is one at each node. interp : str Interpolation scaling in {"log", "lin"}. Default is "log". """ tag = ["PiecewiseNormSpectralModel", "piecewise-norm"] def __init__(self, energy, norms=None, interp="log"): self._energy = energy self._interp = interp if norms is None: norms = np.ones(len(energy)) if len(norms) != len(energy): raise ValueError("dimension mismatch") if len(norms) < 2: raise ValueError("Input arrays must contain at least 2 elements") parameters_list = [] if not isinstance(norms[0], Parameter): parameters_list += [ Parameter(f"norm_{k}", norm) for k, norm in enumerate(norms) ] else: parameters_list += norms self.default_parameters = Parameters(parameters_list) super().__init__() @property def energy(self): """Energy nodes.""" return self._energy @property def norms(self): """Norm values.""" return u.Quantity([p.value for p in self.parameters]) def evaluate(self, energy, **norms): """Evaluate the model at given energies.""" scale = interpolation_scale(scale=self._interp) e_eval = scale(np.atleast_1d(energy.value)) e_nodes = scale(self.energy.to(energy.unit).value) v_nodes = scale(self.norms) log_interp = scale.inverse(np.interp(e_eval, e_nodes, v_nodes)) return log_interp def to_dict(self, full_output=False): data = super().to_dict(full_output=full_output) data["spectral"]["energy"] = { "data": self.energy.data.tolist(), "unit": str(self.energy.unit), } data["spectral"]["interp"] = self._interp return data @classmethod def from_dict(cls, data, **kwargs): """Create model from dictionary.""" data = data["spectral"] energy = u.Quantity(data["energy"]["data"], data["energy"]["unit"]) parameters = Parameters.from_dict(data["parameters"]) if "interp" in data: return cls.from_parameters(parameters, energy=energy, interp=data["interp"]) else: return cls.from_parameters(parameters, energy=energy) @classmethod def from_parameters(cls, parameters, **kwargs): """Create model from parameters.""" return cls(norms=parameters, **kwargs) class ExpCutoffPowerLawSpectralModel(SpectralModel): r"""Spectral exponential cutoff power-law model. For more information see :ref:`exp-cutoff-powerlaw-spectral-model`. Parameters ---------- index : `~astropy.units.Quantity` :math:`\Gamma`. Default is 1.5. amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1.
reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. lambda_ : `~astropy.units.Quantity` :math:`\lambda`. Default is 0.1 TeV-1. alpha : `~astropy.units.Quantity` :math:`\alpha`. Default is 1. See Also -------- ExpCutoffPowerLawNormSpectralModel """ tag = ["ExpCutoffPowerLawSpectralModel", "ecpl"] index = Parameter("index", 1.5) amplitude = Parameter( name="amplitude", value="1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) reference = Parameter("reference", "1 TeV", frozen=True) lambda_ = Parameter("lambda_", "0.1 TeV-1") alpha = Parameter("alpha", "1.0", frozen=True) @staticmethod def evaluate(energy, index, amplitude, reference, lambda_, alpha): """Evaluate the model (static function).""" pwl = amplitude * (energy / reference) ** (-index) cutoff = np.exp(-np.power(energy * lambda_, alpha)) return pwl * cutoff @property def e_peak(self): r"""Spectral energy distribution peak energy (`~astropy.units.Quantity`). This is the peak in E^2 x dN/dE and is given by: .. math:: E_{Peak} = \left(\frac{2 - \Gamma}{\alpha}\right)^{1/\alpha} / \lambda """ reference = self.reference.quantity index = self.index.quantity lambda_ = self.lambda_.quantity alpha = self.alpha.quantity if index >= 2 or lambda_ == 0.0 or alpha == 0.0: return np.nan * reference.unit else: return np.power((2 - index) / alpha, 1 / alpha) / lambda_ class ExpCutoffPowerLawNormSpectralModel(SpectralModel): r"""Norm spectral exponential cutoff power-law model. Parameters ---------- index : `~astropy.units.Quantity` :math:`\Gamma`. Default is 0. norm : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. lambda_ : `~astropy.units.Quantity` :math:`\lambda`. Default is 0.1 TeV-1. alpha : `~astropy.units.Quantity` :math:`\alpha`. Default is 1. See Also -------- ExpCutoffPowerLawSpectralModel """ tag = ["ExpCutoffPowerLawNormSpectralModel", "ecpl-norm"] index = Parameter("index", 0.0) norm = Parameter("norm", 1, unit="", interp="log") reference = Parameter("reference", "1 TeV", frozen=True) lambda_ = Parameter("lambda_", "0.1 TeV-1") alpha = Parameter("alpha", "1.0", frozen=True) def __init__( self, index=None, norm=None, reference=None, lambda_=None, alpha=None, **kwargs ): if index is None: warnings.warn( "The default index value changed from 1.5 to 0 since v1.3", GammapyDeprecationWarning, ) if norm is not None: kwargs.update({"norm": norm}) if index is not None: kwargs.update({"index": index}) if reference is not None: kwargs.update({"reference": reference}) if lambda_ is not None: kwargs.update({"lambda_": lambda_}) if alpha is not None: kwargs.update({"alpha": alpha}) super().__init__(**kwargs) @staticmethod def evaluate(energy, index, norm, reference, lambda_, alpha): """Evaluate the model (static function).""" pwl = norm * (energy / reference) ** (-index) cutoff = np.exp(-np.power(energy * lambda_, alpha)) return pwl * cutoff class ExpCutoffPowerLaw3FGLSpectralModel(SpectralModel): r"""Spectral exponential cutoff power-law model used for 3FGL. For more information see :ref:`exp-cutoff-powerlaw-3fgl-spectral-model`. Parameters ---------- index : `~astropy.units.Quantity` :math:`\Gamma`. Default is 1.5. amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. ecut : `~astropy.units.Quantity` :math:`E_{C}`. Default is 10 TeV. 
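Examples
--------
A minimal usage sketch; the parameter values are illustrative:

>>> import astropy.units as u
>>> from gammapy.modeling.models import ExpCutoffPowerLaw3FGLSpectralModel
>>> model = ExpCutoffPowerLaw3FGLSpectralModel(index=1.8, ecut="5 TeV")
>>> dnde = model(1 * u.TeV)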
""" tag = ["ExpCutoffPowerLaw3FGLSpectralModel", "ecpl-3fgl"] index = Parameter("index", 1.5) amplitude = Parameter( "amplitude", "1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) reference = Parameter("reference", "1 TeV", frozen=True) ecut = Parameter("ecut", "10 TeV") @staticmethod def evaluate(energy, index, amplitude, reference, ecut): """Evaluate the model (static function).""" pwl = amplitude * (energy / reference) ** (-index) cutoff = np.exp((reference - energy) / ecut) return pwl * cutoff class SuperExpCutoffPowerLaw3FGLSpectralModel(SpectralModel): r"""Spectral super exponential cutoff power-law model used for 3FGL. For more information see :ref:`super-exp-cutoff-powerlaw-3fgl-spectral-model`. .. math:: \phi(E) = \phi_0 \cdot \left(\frac{E}{E_0}\right)^{-\Gamma_1} \exp \left( \left(\frac{E_0}{E_{C}} \right)^{\Gamma_2} - \left(\frac{E}{E_{C}} \right)^{\Gamma_2} \right) Parameters ---------- index_1 : `~astropy.units.Quantity` :math:`\Gamma_1`. Default is 1.5. index_2 : `~astropy.units.Quantity` :math:`\Gamma_2`. Default is 2. amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. ecut : `~astropy.units.Quantity` :math:`E_{C}`. Default is 10 TeV. """ tag = ["SuperExpCutoffPowerLaw3FGLSpectralModel", "secpl-3fgl"] amplitude = Parameter( "amplitude", "1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) reference = Parameter("reference", "1 TeV", frozen=True) ecut = Parameter("ecut", "10 TeV") index_1 = Parameter("index_1", 1.5) index_2 = Parameter("index_2", 2) @staticmethod def evaluate(energy, amplitude, reference, ecut, index_1, index_2): """Evaluate the model (static function).""" pwl = amplitude * (energy / reference) ** (-index_1) cutoff = np.exp((reference / ecut) ** index_2 - (energy / ecut) ** index_2) return pwl * cutoff class SuperExpCutoffPowerLaw4FGLSpectralModel(SpectralModel): r"""Spectral super exponential cutoff power-law model used for 4FGL-DR1 (and DR2). See equation (4) of https://arxiv.org/pdf/1902.10045.pdf For more information see :ref:`super-exp-cutoff-powerlaw-4fgl-spectral-model`. Parameters ---------- index_1 : `~astropy.units.Quantity` :math:`\Gamma_1`. Default is 1.5. index_2 : `~astropy.units.Quantity` :math:`\Gamma_2`. Default is 2. amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. expfactor : `~astropy.units.Quantity` :math:`a`, given as dimensionless value but internally assumes unit of :math:`E_0^{-\Gamma_2}`. Default is 1e-2. 
""" tag = ["SuperExpCutoffPowerLaw4FGLSpectralModel", "secpl-4fgl"] amplitude = Parameter( "amplitude", "1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) reference = Parameter("reference", "1 TeV", frozen=True) expfactor = Parameter("expfactor", "1e-2") index_1 = Parameter("index_1", 1.5) index_2 = Parameter("index_2", 2) @staticmethod def evaluate(energy, amplitude, reference, expfactor, index_1, index_2): """Evaluate the model (static function).""" if isinstance(index_1, u.Quantity): index_1 = index_1.to_value(u.one) if isinstance(index_2, u.Quantity): index_2 = index_2.to_value(u.one) pwl = amplitude * (energy / reference) ** (-index_1) cutoff = np.exp( expfactor / reference.unit**index_2 * (reference**index_2 - energy**index_2) ) return pwl * cutoff class SuperExpCutoffPowerLaw4FGLDR3SpectralModel(SpectralModel): r"""Spectral super exponential cutoff power-law model used for 4FGL-DR3. See equations (2) and (3) of https://arxiv.org/pdf/2201.11184.pdf For more information see :ref:`super-exp-cutoff-powerlaw-4fgl-dr3-spectral-model`. Parameters ---------- index_1 : `~astropy.units.Quantity` :math:`\Gamma_1`. Default is 1.5. index_2 : `~astropy.units.Quantity` :math:`\Gamma_2`. Default is 2. amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 1 TeV. expfactor : `~astropy.units.Quantity` :math:`a`, given as dimensionless value. Default is 1e-2. """ tag = ["SuperExpCutoffPowerLaw4FGLDR3SpectralModel", "secpl-4fgl-dr3"] amplitude = Parameter( name="amplitude", value="1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) reference = Parameter("reference", "1 TeV", frozen=True) expfactor = Parameter("expfactor", "1e-2") index_1 = Parameter("index_1", 1.5) index_2 = Parameter("index_2", 2) @staticmethod def evaluate(energy, amplitude, reference, expfactor, index_1, index_2): """Evaluate the model (static function).""" # https://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/source_models.html#PLSuperExpCutoff4 pwl = amplitude * (energy / reference) ** (-index_1) cutoff = (energy / reference) ** (expfactor / index_2) * np.exp( expfactor / index_2**2 * (1 - (energy / reference) ** index_2) ) mask = np.abs(index_2 * np.log(energy / reference)) < 1e-2 ln_ = np.log(energy[mask] / reference) power = expfactor * ( ln_ / 2.0 + index_2 / 6.0 * ln_**2.0 + index_2**2.0 / 24.0 * ln_**3 ) cutoff[mask] = (energy[mask] / reference) ** power return pwl * cutoff class LogParabolaSpectralModel(SpectralModel): r"""Spectral log parabola model. For more information see :ref:`logparabola-spectral-model`. Parameters ---------- amplitude : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1e-12 cm-2 s-1 TeV-1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 10 TeV. alpha : `~astropy.units.Quantity` :math:`\alpha`. Default is 2. beta : `~astropy.units.Quantity` :math:`\beta`. Default is 1. 
See Also -------- LogParabolaNormSpectralModel """ tag = ["LogParabolaSpectralModel", "lp"] amplitude = Parameter( "amplitude", "1e-12 cm-2 s-1 TeV-1", scale_method="scale10", interp="log", ) reference = Parameter("reference", "10 TeV", frozen=True) alpha = Parameter("alpha", 2) beta = Parameter("beta", 1) @classmethod def from_log10(cls, amplitude, reference, alpha, beta): """Construct from :math:`log_{10}` parametrization.""" beta_ = beta / np.log(10) return cls(amplitude=amplitude, reference=reference, alpha=alpha, beta=beta_) @staticmethod def evaluate(energy, amplitude, reference, alpha, beta): """Evaluate the model (static function).""" xx = energy / reference exponent = -alpha - beta * np.log(xx) return amplitude * np.power(xx, exponent) @property def e_peak(self): r"""Spectral energy distribution peak energy (`~astropy.units.Quantity`). This is the peak in E^2 x dN/dE and is given by: .. math:: E_{Peak} = E_{0} \exp{ (2 - \alpha) / (2 * \beta)} """ reference = self.reference.quantity alpha = self.alpha.quantity beta = self.beta.quantity return reference * np.exp((2 - alpha) / (2 * beta)) class LogParabolaNormSpectralModel(SpectralModel): r"""Norm spectral log parabola model. Parameters ---------- norm : `~astropy.units.Quantity` :math:`\phi_0`. Default is 1. reference : `~astropy.units.Quantity` :math:`E_0`. Default is 10 TeV. alpha : `~astropy.units.Quantity` :math:`\alpha`. Default is 0. beta : `~astropy.units.Quantity` :math:`\beta`. Default is 0. See Also -------- LogParabolaSpectralModel """ tag = ["LogParabolaNormSpectralModel", "lp-norm"] norm = Parameter("norm", 1, unit="", interp="log") reference = Parameter("reference", "10 TeV", frozen=True) alpha = Parameter("alpha", 0) beta = Parameter("beta", 0) def __init__(self, norm=None, reference=None, alpha=None, beta=None, **kwargs): if norm is not None: kwargs.update({"norm": norm}) if beta is not None: kwargs.update({"beta": beta}) if reference is not None: kwargs.update({"reference": reference}) if alpha is not None: kwargs.update({"alpha": alpha}) super().__init__(**kwargs) @classmethod def from_log10(cls, norm, reference, alpha, beta): """Construct from :math:`log_{10}` parametrization.""" beta_ = beta / np.log(10) return cls(norm=norm, reference=reference, alpha=alpha, beta=beta_) @staticmethod def evaluate(energy, norm, reference, alpha, beta): """Evaluate the model (static function).""" xx = energy / reference exponent = -alpha - beta * np.log(xx) return norm * np.power(xx, exponent) class TemplateSpectralModel(SpectralModel): """A model generated from a table of energy and value arrays. For more information see :ref:`template-spectral-model`. Parameters ---------- energy : `~astropy.units.Quantity` Array of energies at which the model values are given values : `~numpy.ndarray` Array with the values of the model at energies ``energy``. interp_kwargs : dict Interpolation option passed to `~gammapy.utils.interpolation.ScaledRegularGridInterpolator`. By default, all values outside the interpolation range are set to NaN. If you want to apply linear extrapolation you can pass `interp_kwargs={'extrapolate': True, 'method': 'linear'}`. If you want to choose the interpolation scaling applied to values, you can use `interp_kwargs={"values_scale": "log"}`. meta : dict, optional Meta information, meta['filename'] will be used for serialisation. 
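Examples -------- A minimal sketch building a template from arrays; the node energies and values below are invented for illustration: >>> import astropy.units as u >>> from gammapy.modeling.models import TemplateSpectralModel >>> energy = [1, 3, 10] * u.TeV >>> values = [3e-12, 5e-13, 2e-14] * u.Unit("cm-2 s-1 TeV-1") >>> model = TemplateSpectralModel(energy=energy, values=values) >>> dnde = model(5 * u.TeV) # interpolated between the nodes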
""" tag = ["TemplateSpectralModel", "template"] def __init__( self, energy, values, interp_kwargs=None, meta=None, ): self.energy = energy self.values = u.Quantity(values, copy=COPY_IF_NEEDED) self.meta = {} if meta is None else meta interp_kwargs = interp_kwargs or {} interp_kwargs.setdefault("values_scale", "log") interp_kwargs.setdefault("points_scale", ("log",)) if len(energy) == 1: interp_kwargs["method"] = "nearest" self._evaluate = ScaledRegularGridInterpolator( points=(energy,), values=values, **interp_kwargs ) super().__init__() @classmethod def read_xspec_model(cls, filename, param, **kwargs): """Read XSPEC table model. The input is a table containing absorbed values from a XSPEC model as a function of energy. TODO: Format of the file should be described and discussed in https://gamma-astro-data-formats.readthedocs.io/en/latest/index.html Parameters ---------- filename : str File containing the XSPEC model. param : float Model parameter value. Examples -------- Fill table from an EBL model (Franceschini, 2008) >>> from gammapy.modeling.models import TemplateSpectralModel >>> filename = '$GAMMAPY_DATA/ebl/ebl_franceschini.fits.gz' >>> table_model = TemplateSpectralModel.read_xspec_model(filename=filename, param=0.3) """ filename = make_path(filename) # Check if parameter value is in range table_param = Table.read(filename, hdu="PARAMETERS") pmin = table_param["MINIMUM"] pmax = table_param["MAXIMUM"] if param < pmin or param > pmax: raise ValueError(f"Out of range: param={param}, min={pmin}, max={pmax}") # Get energy values table_energy = Table.read(filename, hdu="ENERGIES") energy_lo = table_energy["ENERG_LO"] energy_hi = table_energy["ENERG_HI"] # set energy to log-centers energy = np.sqrt(energy_lo * energy_hi) # Get spectrum values (no interpolation, take closest value for param) table_spectra = Table.read(filename, hdu="SPECTRA") idx = np.abs(table_spectra["PARAMVAL"] - param).argmin() values = u.Quantity( table_spectra[idx][1], "", copy=COPY_IF_NEEDED ) # no dimension kwargs.setdefault("interp_kwargs", {"values_scale": "lin"}) return cls(energy=energy, values=values, **kwargs) def evaluate(self, energy): """Evaluate the model (static function).""" return self._evaluate((energy,), clip=True) def to_dict(self, full_output=False): data = super().to_dict(full_output) data["spectral"]["energy"] = { "data": self.energy.data.tolist(), "unit": str(self.energy.unit), } data["spectral"]["values"] = { "data": self.values.data.tolist(), "unit": str(self.values.unit), } return data @classmethod def from_dict(cls, data, **kwargs): data = data["spectral"] energy = u.Quantity(data["energy"]["data"], data["energy"]["unit"]) values = u.Quantity(data["values"]["data"], data["values"]["unit"]) return cls(energy=energy, values=values) @classmethod def from_region_map(cls, map, **kwargs): """Create model from region map.""" energy = map.geom.axes["energy_true"].center values = map.quantity[:, 0, 0] return cls(energy=energy, values=values, **kwargs) class TemplateNDSpectralModel(SpectralModel): """A model generated from a ND map where extra dimensions define the parameter space. Parameters ---------- map : `~gammapy.maps.RegionNDMap` Map template. meta : dict, optional Meta information, meta['filename'] will be used for serialisation. interp_kwargs : dict Interpolation keyword arguments passed to `gammapy.maps.Map.interp_by_pix`. Default arguments are {'method': 'linear', 'fill_value': 0}. 
""" tag = ["TemplateNDSpectralModel", "templateND"] def __init__(self, map, interp_kwargs=None, meta=None, filename=None): self._map = map.copy() self.meta = dict() if meta is None else meta if filename is not None: filename = str(make_path(filename)) self.filename = filename parameters = [] has_energy = False for axis in map.geom.axes: if axis.name not in ["energy_true", "energy"]: unit = axis.unit center = (axis.bounds[1] + axis.bounds[0]) / 2 parameter = Parameter( name=axis.name, value=center.to_value(unit), unit=unit, scale_method="scale10", min=axis.bounds[0].to_value(unit), max=axis.bounds[-1].to_value(unit), interp=axis.interp, ) parameters.append(parameter) else: has_energy |= True if not has_energy: raise ValueError("Invalid map, no energy axis found") self.default_parameters = Parameters(parameters) interp_kwargs = interp_kwargs or {} interp_kwargs.setdefault("values_scale", "log") self._interp_kwargs = interp_kwargs super().__init__() @property def map(self): """Template map as a `~gammapy.maps.RegionNDMap`.""" return self._map def evaluate(self, energy, **kwargs): coord = {"energy_true": energy} coord.update(kwargs) pixels = [0, 0] + [ self.map.geom.axes[key].coord_to_pix(value) for key, value in coord.items() ] val = self.map.interp_by_pix(pixels, **self._interp_kwargs) return u.Quantity(val, self.map.unit, copy=COPY_IF_NEEDED) def write(self, overwrite=False): """ Write the map. Parameters ---------- overwrite: bool, optional Overwrite existing file. Default is False, which will raise a warning if the template file exists already. """ if self.filename is None: raise IOError("Missing filename") elif os.path.isfile(self.filename) and not overwrite: log.warning("Template file already exits, and overwrite is False") else: self.map.write(self.filename) @classmethod def from_dict(cls, data, **kwargs): data = data["spectral"] filename = data["filename"] m = RegionNDMap.read(filename) model = cls(m, filename=filename) for idx, p in enumerate(model.parameters): par = p.to_dict() par.update(data["parameters"][idx]) setattr(model, p.name, Parameter(**par)) return model def to_dict(self, full_output=False): """Create dictionary for YAML serilisation.""" data = super().to_dict(full_output) data["spectral"]["filename"] = self.filename data["spectral"]["unit"] = str(self.map.unit) return data class ScaleSpectralModel(SpectralModel): """Wrapper to scale another spectral model by a norm factor. Parameters ---------- model : `SpectralModel` Spectral model to wrap. norm : float Multiplicative norm factor for the model value. Default is 1. """ tag = ["ScaleSpectralModel", "scale"] norm = Parameter("norm", 1, unit="", interp="log") def __init__(self, model, norm=norm.quantity): self.model = model self._covariance = None super().__init__(norm=norm) def evaluate(self, energy, norm): return norm * self.model(energy) def integral(self, energy_min, energy_max, **kwargs): return self.norm.value * self.model.integral(energy_min, energy_max, **kwargs) class EBLAbsorptionNormSpectralModel(SpectralModel): r"""Gamma-ray absorption models. For more information see :ref:`absorption-spectral-model`. Parameters ---------- energy : `~astropy.units.Quantity` Energy node values. param : `~astropy.units.Quantity` Parameter node values. data : `~astropy.units.Quantity` Model value. redshift : float Redshift of the absorption model. Default is 0.1. alpha_norm: float Norm of the EBL model. Default is 1. interp_kwargs : dict Interpolation option passed to `~gammapy.utils.interpolation.ScaledRegularGridInterpolator`. 
By default the models are extrapolated outside the range. To prevent this and raise an error instead use interp_kwargs = {"extrapolate": False}. """ tag = ["EBLAbsorptionNormSpectralModel", "ebl-norm"] alpha_norm = Parameter("alpha_norm", 1.0, frozen=True) redshift = Parameter("redshift", 0.1, frozen=True) def __init__(self, energy, param, data, redshift, alpha_norm, interp_kwargs=None): self.filename = None # set values log centers self.param = param self.energy = energy self.data = u.Quantity(data, copy=COPY_IF_NEEDED) interp_kwargs = interp_kwargs or {} interp_kwargs.setdefault("points_scale", ("lin", "log")) interp_kwargs.setdefault("values_scale", "log") interp_kwargs.setdefault("extrapolate", True) self._evaluate_table_model = ScaledRegularGridInterpolator( points=(self.param, self.energy), values=self.data, **interp_kwargs ) super().__init__(redshift=redshift, alpha_norm=alpha_norm) def to_dict(self, full_output=False): data = super().to_dict(full_output=full_output) param = u.Quantity(self.param) if self.filename is None: data["spectral"]["energy"] = { "data": self.energy.data.tolist(), "unit": str(self.energy.unit), } data["spectral"]["param"] = { "data": param.data.tolist(), "unit": str(param.unit), } data["spectral"]["values"] = { "data": self.data.value.tolist(), "unit": str(self.data.unit), } else: data["spectral"]["filename"] = str(self.filename) return data @classmethod def from_dict(cls, data, **kwargs): data = data["spectral"] redshift = [p["value"] for p in data["parameters"] if p["name"] == "redshift"][ 0 ] alpha_norm = [ p["value"] for p in data["parameters"] if p["name"] == "alpha_norm" ][0] if "filename" in data: if os.path.exists(data["filename"]): return cls.read( data["filename"], redshift=redshift, alpha_norm=alpha_norm ) else: for reference, filename in EBL_DATA_BUILTIN.items(): if Path(filename).stem in data["filename"]: return cls.read_builtin( reference, redshift=redshift, alpha_norm=alpha_norm ) raise IOError(f'File {data["filename"]} not found') else: energy = u.Quantity(data["energy"]["data"], data["energy"]["unit"]) param = u.Quantity(data["param"]["data"], data["param"]["unit"]) values = u.Quantity(data["values"]["data"], data["values"]["unit"]) return cls( energy=energy, param=param, data=values, redshift=redshift, alpha_norm=alpha_norm, ) @classmethod def read(cls, filename, redshift=0.1, alpha_norm=1, interp_kwargs=None): """Build object from an XSPEC model. Parameters ---------- filename : str File containing the model. redshift : float, optional Redshift of the absorption model. Default is 0.1. alpha_norm: float, optional Norm of the EBL model. Default is 1. interp_kwargs : dict, optional Interpolation option passed to `~gammapy.utils.interpolation.ScaledRegularGridInterpolator`. Default is None. 
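Examples -------- A sketch, assuming the standard ``$GAMMAPY_DATA`` layout for the EBL files: >>> from gammapy.modeling.models import EBLAbsorptionNormSpectralModel >>> filename = "$GAMMAPY_DATA/ebl/ebl_dominguez11.fits.gz" >>> model = EBLAbsorptionNormSpectralModel.read(filename, redshift=0.5) # doctest: +SKIP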
""" # Create EBL data array filename = make_path(filename) table_param = Table.read(filename, hdu="PARAMETERS") # TODO: for some reason the table contain duplicated values param, idx = np.unique(table_param[0]["VALUE"], return_index=True) # Get energy values table_energy = Table.read(filename, hdu="ENERGIES") energy_lo = u.Quantity( table_energy["ENERG_LO"], "keV", copy=COPY_IF_NEEDED ) # unit not stored in file energy_hi = u.Quantity( table_energy["ENERG_HI"], "keV", copy=COPY_IF_NEEDED ) # unit not stored in file energy = np.sqrt(energy_lo * energy_hi) # Get spectrum values table_spectra = Table.read(filename, hdu="SPECTRA") data = table_spectra["INTPSPEC"].data[idx, :] model = cls( energy=energy, param=param, data=data, redshift=redshift, alpha_norm=alpha_norm, interp_kwargs=interp_kwargs, ) model.filename = filename return model @classmethod def read_builtin( cls, reference="dominguez", redshift=0.1, alpha_norm=1, interp_kwargs=None ): """Read from one of the built-in absorption models. Parameters ---------- reference : {'franceschini', 'dominguez', 'finke'}, optional Name of one of the available model in gammapy-data. Default is 'dominquez'. redshift : float, optional Redshift of the absorption model. Default is 0.1. alpha_norm : float, optional Norm of the EBL model. Default is 1. interp_kwargs : dict, optional Interpolation keyword arguments. Default is None. References ---------- .. [1] Franceschini et al. (2008), "Extragalactic optical-infrared background radiation, its time evolution and the cosmic photon-photon opacity", # noqa: E501 `Link `__ .. [2] Dominguez et al. (2011), " Extragalactic background light inferred from AEGIS galaxy-SED-type fractions" # noqa: E501 `Link `__ .. [3] Finke et al. (2010), "Modeling the Extragalactic Background Light from Stars and Dust" `Link `__ .. [4] Franceschini et al. (2017), "The extragalactic background light revisited and the cosmic photon-photon opacity" `Link `__ .. [5] Saldana-Lopez et al. (2021) "An observational determination of the evolving extragalactic background light from the multiwavelength HST/CANDELS survey in the Fermi and CTA era" `Link `__ """ return cls.read( EBL_DATA_BUILTIN[reference], redshift, alpha_norm, interp_kwargs=interp_kwargs, ) def evaluate(self, energy, redshift, alpha_norm): """Evaluate model for energy and parameter value.""" absorption = np.clip(self._evaluate_table_model((redshift, energy)), 0, 1) return np.power(absorption, alpha_norm) class NaimaSpectralModel(SpectralModel): r"""A wrapper for Naima models. For more information see :ref:`naima-spectral-model`. Parameters ---------- radiative_model : `~naima.models.BaseRadiative` An instance of a radiative model defined in `~naima.models`. distance : `~astropy.units.Quantity`, optional Distance to the source. If set to 0, the intrinsic differential luminosity will be returned. Default is 1 kpc. seed : str or list of str, optional Seed photon field(s) to be considered for the `radiative_model` flux computation, in case of a `~naima.models.InverseCompton` model. It can be a subset of the `seed_photon_fields` list defining the `radiative_model`. Default is the whole list of photon fields. nested_models : dict Additional parameters for nested models not supplied by the radiative model, for now this is used only for synchrotron self-compton model. 
""" tag = ["NaimaSpectralModel", "naima"] def __init__( self, radiative_model, distance=1.0 * u.kpc, seed=None, nested_models=None, use_cache=False, ): import naima self.radiative_model = radiative_model self.radiative_model._memoize = use_cache self.distance = u.Quantity(distance) self.seed = seed if nested_models is None: nested_models = {} self.nested_models = nested_models if isinstance(self.particle_distribution, naima.models.TableModel): param_names = ["amplitude"] else: param_names = self.particle_distribution.param_names parameters = [] for name in param_names: value = getattr(self.particle_distribution, name) parameter = Parameter(name, value) parameters.append(parameter) # In case of a synchrotron radiative model, append B to the fittable parameters if "B" in self.radiative_model.param_names: value = getattr(self.radiative_model, "B") parameter = Parameter("B", value) parameters.append(parameter) # In case of a synchrotron self compton model, append B and Rpwn to the fittable parameters if self.include_ssc: B = self.nested_models["SSC"]["B"] radius = self.nested_models["SSC"]["radius"] parameters.append(Parameter("B", B)) parameters.append(Parameter("radius", radius, frozen=True)) self.default_parameters = Parameters(parameters) self.ssc_energy = np.logspace(-7, 9, 100) * u.eV super().__init__() @property def include_ssc(self): """Whether the model includes an SSC component.""" import naima is_ic_model = isinstance(self.radiative_model, naima.models.InverseCompton) return is_ic_model and "SSC" in self.nested_models @property def ssc_model(self): """Synchrotron model.""" import naima if self.include_ssc: return naima.models.Synchrotron( self.particle_distribution, B=self.B.quantity, Eemax=self.radiative_model.Eemax, Eemin=self.radiative_model.Eemin, ) @property def particle_distribution(self): """Particle distribution.""" return self.radiative_model.particle_distribution def _evaluate_ssc( self, energy, ): """ Compute photon density spectrum from synchrotron emission for synchrotron self-compton model, assuming uniform synchrotron emissivity inside a sphere of radius R (see Section 4.1 of Atoyan & Aharonian 1996). Based on : https://naima.readthedocs.io/en/latest/examples.html#crab-nebula-ssc-model """ Lsy = self.ssc_model.flux( self.ssc_energy, distance=0 * u.cm ) # use distance 0 to get luminosity phn_sy = Lsy / (4 * np.pi * self.radius.quantity**2 * const.c) * 2.24 # The factor 2.24 comes from the assumption on uniform synchrotron # emissivity inside a sphere if "SSC" not in self.radiative_model.seed_photon_fields: self.radiative_model.seed_photon_fields["SSC"] = { "isotropic": True, "type": "array", "energy": self.ssc_energy, "photon_density": phn_sy, } else: self.radiative_model.seed_photon_fields["SSC"]["photon_density"] = phn_sy dnde = self.radiative_model.flux( energy, seed=self.seed, distance=self.distance ) + self.ssc_model.flux(energy, distance=self.distance) return dnde def _update_naima_parameters(self, **kwargs): """Update Naima model parameters.""" for name, value in kwargs.items(): setattr(self.particle_distribution, name, value) if "B" in self.radiative_model.param_names: self.radiative_model.B = self.B.quantity def evaluate(self, energy, **kwargs): """Evaluate the model. Parameters ---------- energy : `~astropy.units.Quantity` Energy to evaluate the model at. Returns ------- dnde : `~astropy.units.Quantity` Differential flux at given energy. 
""" self._update_naima_parameters(**kwargs) if self.include_ssc: dnde = self._evaluate_ssc(energy.flatten()) elif self.seed is not None: dnde = self.radiative_model.flux( energy.flatten(), seed=self.seed, distance=self.distance ) else: dnde = self.radiative_model.flux(energy.flatten(), distance=self.distance) dnde = dnde.reshape(energy.shape) unit = 1 / (energy.unit * u.cm**2 * u.s) return dnde.to(unit) def to_dict(self, full_output=True): # for full_output to True otherwise broken return super().to_dict(full_output=True) @classmethod def from_dict(cls, data, **kwargs): raise NotImplementedError( "Currently the NaimaSpectralModel cannot be read from YAML" ) @classmethod def from_parameters(cls, parameters, **kwargs): raise NotImplementedError( "Currently the NaimaSpectralModel cannot be built from a list of parameters." ) class GaussianSpectralModel(SpectralModel): r"""Gaussian spectral model. For more information see :ref:`gaussian-spectral-model`. Parameters ---------- amplitude : `~astropy.units.Quantity` :math:`N_0`. Default is 1e-12 cm-2 s-1. mean : `~astropy.units.Quantity` :math:`\bar{E}`. Default is 1 TeV. sigma : `~astropy.units.Quantity` :math:`\sigma`. Default is 2 TeV. """ tag = ["GaussianSpectralModel", "gauss"] amplitude = Parameter("amplitude", 1e-12 * u.Unit("cm-2 s-1"), interp="log") mean = Parameter("mean", 1 * u.TeV) sigma = Parameter("sigma", 2 * u.TeV) @staticmethod def evaluate(energy, amplitude, mean, sigma): return ( amplitude / (sigma * np.sqrt(2 * np.pi)) * np.exp(-((energy - mean) ** 2) / (2 * sigma**2)) ) def integral(self, energy_min, energy_max, **kwargs): r"""Integrate Gaussian analytically. .. math:: F(E_{min}, E_{max}) = \frac{N_0}{2} \left[ erf(\frac{E - \bar{E}}{\sqrt{2} \sigma})\right]_{E_{min}}^{E_{max}} Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range """ # noqa: E501 # kwargs are passed to this function but not used # this is to get a consistent API with SpectralModel.integral() u_min = ( (energy_min - self.mean.quantity) / (np.sqrt(2) * self.sigma.quantity) ).to_value("") u_max = ( (energy_max - self.mean.quantity) / (np.sqrt(2) * self.sigma.quantity) ).to_value("") return ( self.amplitude.quantity / 2 * (scipy.special.erf(u_max) - scipy.special.erf(u_min)) ) def energy_flux(self, energy_min, energy_max): r"""Compute energy flux in given energy range analytically. .. math:: G(E_{min}, E_{max}) = \frac{N_0 \sigma}{\sqrt{2*\pi}}* \left[ - \exp(\frac{E_{min}-\bar{E}}{\sqrt{2} \sigma}) \right]_{E_{min}}^{E_{max}} + \frac{N_0 * \bar{E}}{2} \left[ erf(\frac{E - \bar{E}}{\sqrt{2} \sigma}) \right]_{E_{min}}^{E_{max}} Parameters ---------- energy_min, energy_max : `~astropy.units.Quantity` Lower and upper bound of integration range. 
""" # noqa: E501 u_min = ( (energy_min - self.mean.quantity) / (np.sqrt(2) * self.sigma.quantity) ).to_value("") u_max = ( (energy_max - self.mean.quantity) / (np.sqrt(2) * self.sigma.quantity) ).to_value("") a = self.amplitude.quantity * self.sigma.quantity / np.sqrt(2 * np.pi) b = self.amplitude.quantity * self.mean.quantity / 2 return a * (np.exp(-(u_min**2)) - np.exp(-(u_max**2))) + b * ( scipy.special.erf(u_max) - scipy.special.erf(u_min) ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/spectral_cosmic_ray.py0000644000175100001770000000625014721316200023527 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Simple models for cosmic ray spectra at Earth. For measurements, the "Database of Charged Cosmic Rays (CRDB)" is a great resource: https://lpsc.in2p3.fr/crdb/ """ import numpy as np from astropy import units as u from gammapy.modeling import Parameter from .spectral import PowerLawSpectralModel, SpectralModel __all__ = [ "create_cosmic_ray_spectral_model", ] class _LogGaussianSpectralModel(SpectralModel): r"""Log Gaussian spectral model with a weird parametrisation. This should not be exposed to end-users as a Gammapy spectral model! See Table 3 in https://ui.adsabs.harvard.edu/abs/2013APh....43..171B """ L = Parameter("L", 1e-12 * u.Unit("cm-2 s-1")) Ep = Parameter("Ep", 0.107 * u.TeV) w = Parameter("w", 0.776) @staticmethod def evaluate(energy, L, Ep, w): return ( L / (energy * w * np.sqrt(2 * np.pi)) * np.exp(-((np.log(energy / Ep)) ** 2) / (2 * w**2)) ) def create_cosmic_ray_spectral_model(particle="proton"): """Cosmic a cosmic ray spectral model at Earth. These are the spectra assumed in this CTAO study: Table 3 in https://ui.adsabs.harvard.edu/abs/2013APh....43..171B The spectrum given is a differential flux ``dnde`` in units of ``cm-2 s-1 TeV-1``, as the value integrated over the whole sky. To get a surface brightness you need to compute ``dnde / (4 * np.pi * u.sr)``. To get the ``dnde`` in a region of solid angle ``omega``, you need to compute ``dnde * omega / (4 * np.pi * u.sr)``. The hadronic spectra are simple power-laws, the electron spectrum is the sum of a power law and a log-normal component to model the "Fermi shoulder". Parameters ---------- particle : {'electron', 'proton', 'He', 'N', 'Si', 'Fe'}, optional Particle type. Default is 'proton'. Returns ------- `~gammapy.modeling.models.SpectralModel` Spectral model (for all-sky cosmic ray flux). 
""" omega = 4 * np.pi * u.sr if particle == "proton": return PowerLawSpectralModel( amplitude=0.096 * u.Unit("1 / (m2 s TeV sr)") * omega, index=2.70, reference=1 * u.TeV, ) elif particle == "N": return PowerLawSpectralModel( amplitude=0.0719 * u.Unit("1 / (m2 s TeV sr)") * omega, index=2.64, reference=1 * u.TeV, ) elif particle == "Si": return PowerLawSpectralModel( amplitude=0.0284 * u.Unit("1 / (m2 s TeV sr)") * omega, index=2.66, reference=1 * u.TeV, ) elif particle == "Fe": return PowerLawSpectralModel( amplitude=0.0134 * u.Unit("1 / (m2 s TeV sr)") * omega, index=2.63, reference=1 * u.TeV, ) elif particle == "electron": return PowerLawSpectralModel( amplitude=6.85e-5 * u.Unit("1 / (m2 s TeV sr)") * omega, index=3.21, reference=1 * u.TeV, ) + _LogGaussianSpectralModel(L=3.19e-3 * u.Unit("1 / (m2 s sr)") * omega) else: raise ValueError(f"Invalid particle: {particle!r}") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/spectral_crab.py0000644000175100001770000001046214721316200022306 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy import units as u from gammapy.modeling import Parameter from gammapy.utils.compat import COPY_IF_NEEDED from .spectral import ( ExpCutoffPowerLawSpectralModel, LogParabolaSpectralModel, PowerLawSpectralModel, SpectralModel, ) __all__ = [ "create_crab_spectral_model", "MeyerCrabSpectralModel", ] class MeyerCrabSpectralModel(SpectralModel): """Meyer 2010 log polynomial Crab spectral model. Reference: https://ui.adsabs.harvard.edu/abs/2010A%26A...523A...2M, Appendix D """ norm = Parameter("norm", value=1, frozen=True) coefficients = [-0.00449161, 0, 0.0473174, -0.179475, -0.53616, -10.2708] @staticmethod def evaluate(energy, norm): """Evaluate the model.""" polynomial = np.poly1d(MeyerCrabSpectralModel.coefficients) log_energy = np.log10(energy.to_value("TeV")) log_flux = polynomial(log_energy) flux = u.Quantity(np.power(10, log_flux), "erg / (cm2 s)", copy=COPY_IF_NEEDED) return norm * flux / energy**2 def create_crab_spectral_model(reference="meyer"): """Create a Crab nebula reference spectral model. The Crab nebula is often used as a standard candle in gamma-ray astronomy. Fluxes and sensitivities are often quoted relative to the Crab spectrum. The following references are available: * 'meyer', https://ui.adsabs.harvard.edu/abs/2010A%26A...523A...2M, Appendix D * 'hegra', https://ui.adsabs.harvard.edu/abs/2000ApJ...539..317A * 'hess_pl' and 'hess_ecpl': https://ui.adsabs.harvard.edu/abs/2006A%26A...457..899A * 'magic_lp' and 'magic_ecpl': https://ui.adsabs.harvard.edu/abs/2015JHEAp...5...30A Parameters ---------- reference : {'meyer', 'hegra', 'hess_pl', 'hess_ecpl', 'magic_lp', 'magic_ecpl'}, optional Which reference to use for the spectral model. Default is 'meyer'. Examples -------- Let's first import what we need:: >>> import astropy.units as u >>> from gammapy.modeling.models import PowerLawSpectralModel, create_crab_spectral_model Plot the 'hess_ecpl' reference Crab spectrum between 1 TeV and 100 TeV:: >>> crab_hess_ecpl = create_crab_spectral_model('hess_ecpl') >>> crab_hess_ecpl.plot([1, 100] * u.TeV) #doctest: +SKIP Use a reference crab spectrum as unit to measure a differential flux (at 10 TeV):: >>> pwl = PowerLawSpectralModel( ... index=2.3, amplitude=1e-12 * u.Unit('1 / (cm2 s TeV)'), reference=1 * u.TeV ... 
) >>> crab = create_crab_spectral_model('hess_pl') >>> energy = 10 * u.TeV >>> dnde_cu = (pwl(energy) / crab(energy)).to('%') >>> print(dnde_cu) 6.196991563774588 % And the same for integral fluxes (between 1 and 10 TeV):: >>> # compute integral flux in crab units >>> emin, emax = [1, 10] * u.TeV >>> flux_int_cu = (pwl.integral(emin, emax) / crab.integral(emin, emax)).to('%') >>> print(flux_int_cu) 3.535058216604496 % """ if reference == "meyer": return MeyerCrabSpectralModel() elif reference == "hegra": return PowerLawSpectralModel( amplitude=2.83e-11 * u.Unit("1 / (cm2 s TeV)"), index=2.62, reference=1 * u.TeV, ) elif reference == "hess_pl": return PowerLawSpectralModel( amplitude=3.45e-11 * u.Unit("1 / (cm2 s TeV)"), index=2.63, reference=1 * u.TeV, ) elif reference == "hess_ecpl": return ExpCutoffPowerLawSpectralModel( amplitude=3.76e-11 * u.Unit("1 / (cm2 s TeV)"), index=2.39, lambda_=1 / (14.3 * u.TeV), reference=1 * u.TeV, ) elif reference == "magic_lp": return LogParabolaSpectralModel( amplitude=3.23e-11 * u.Unit("1 / (cm2 s TeV)"), alpha=2.47, beta=0.24 / np.log(10), reference=1 * u.TeV, ) elif reference == "magic_ecpl": return ExpCutoffPowerLawSpectralModel( amplitude=3.80e-11 * u.Unit("1 / (cm2 s TeV)"), index=2.21, lambda_=1 / (6.0 * u.TeV), reference=1 * u.TeV, ) else: raise ValueError(f"Invalid reference: {reference!r}") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/temporal.py0000644000175100001770000011704414721316200021331 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Time-dependent models.""" import logging import numpy as np import scipy.interpolate from astropy import units as u from astropy.io import fits from astropy.table import Table from astropy.time import Time from gammapy.maps import MapAxis, RegionNDMap, TimeMapAxis from astropy.utils import lazyproperty from gammapy.modeling import Parameter from gammapy.utils.compat import COPY_IF_NEEDED from gammapy.utils.random import InverseCDFSampler, get_random_state from gammapy.utils.scripts import make_path from gammapy.utils.time import time_ref_from_dict, time_ref_to_dict from .core import ModelBase, _build_parameters_from_dict __all__ = [ "ConstantTemporalModel", "ExpDecayTemporalModel", "GaussianTemporalModel", "GeneralizedGaussianTemporalModel", "LightCurveTemplateTemporalModel", "LinearTemporalModel", "PowerLawTemporalModel", "SineTemporalModel", "TemplatePhaseCurveTemporalModel", "TemporalModel", ] log = logging.getLogger(__name__) # TODO: make this a small ABC to define a uniform interface. class TemporalModel(ModelBase): """Temporal model base class. Evaluates on `~astropy.time.Time` objects. """ _type = "temporal" def __init__(self, **kwargs): scale = kwargs.pop("scale", "utc") if scale not in Time.SCALES: raise ValueError( f"{scale} is not a valid time scale. Choose from {Time.SCALES}" ) self.scale = scale super().__init__(**kwargs) def __call__(self, time, energy=None): """Evaluate model. Parameters ---------- time : `~astropy.time.Time` Time object. energy : `~astropy.units.Quantity`, optional Energy. Default is None. Returns ------- values : `~astropy.units.Quantity` Model values. 
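Examples -------- A sketch using the simplest concrete subclass; the date is arbitrary: >>> from astropy.time import Time >>> from gammapy.modeling.models import ConstantTemporalModel >>> model = ConstantTemporalModel() >>> norm = model(Time("2020-01-01"))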
""" kwargs = {par.name: par.quantity for par in self.parameters} if energy is not None: kwargs["energy"] = energy time = Time(time, scale=self.scale).mjd * u.d return self.evaluate(time, **kwargs) @property def type(self): return self._type @property def is_energy_dependent(self): return False @property def reference_time(self): """Reference time in MJD.""" return Time(self.t_ref.value, format="mjd", scale=self.scale) @reference_time.setter def reference_time(self, t_ref): """Reference time.""" if not isinstance(t_ref, Time): raise TypeError(f"{t_ref} is not a {Time} object") self.t_ref.value = Time(t_ref, scale=self.scale).mjd def to_dict(self, full_output=False): """Create dictionary for YAML serilisation.""" data = super().to_dict(full_output) data["temporal"]["scale"] = self.scale return data @classmethod def from_dict(cls, data, **kwargs): """Create a temporal model from a dictionary. Parameters ---------- data : dict Dictionary containing the model parameters. **kwargs : dict Keyword arguments passed to `~TemporalModel.from_parameters`. """ kwargs = kwargs or {} temporal_data = data.get("temporal", data) if "scale" in temporal_data: kwargs["scale"] = temporal_data["scale"] return super().from_dict(data, **kwargs) @staticmethod def time_sum(t_min, t_max): """Total time between t_min and t_max. Parameters ---------- t_min, t_max : `~astropy.time.Time` Lower and upper bound of integration range. Returns ------- time_sum : `~astropy.time.TimeDelta` Summed time in the intervals. """ diff = t_max - t_min return np.sum(diff).to(u.day) def plot(self, time_range, ax=None, n_points=100, **kwargs): """ Plot the temporal model. Parameters ---------- time_range : `~astropy.time.Time` Times to plot the model. ax : `~matplotlib.axes.Axes`, optional Axis to plot on. n_points : int Number of bins to plot model. Default is 100. **kwargs : dict Keywords forwarded to `~matplotlib.pyplot.errorbar`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. """ time_min, time_max = time_range time_axis = TimeMapAxis.from_time_bounds( time_min=time_min, time_max=time_max, nbin=n_points ) m = RegionNDMap.create(region=None, axes=[time_axis]) kwargs.setdefault("marker", "None") kwargs.setdefault("ls", "-") kwargs.setdefault("xerr", None) m.quantity = self(time_axis.time_mid).to(u.one) ax = m.plot(ax=ax, **kwargs) ax.set_ylabel("Norm / A.U.") return ax def sample_time(self, n_events, t_min, t_max, t_delta="1 s", random_state=0): """Sample arrival times of events. Parameters ---------- n_events : int Number of events to sample. t_min : `~astropy.time.Time` Start time of the sampling. t_max : `~astropy.time.Time` Stop time of the sampling. t_delta : `~astropy.units.Quantity`, optional Time step used for sampling of the temporal model. Default is 1 s. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is 0. Returns ------- time : `~astropy.units.Quantity` Array with times of the sampled events. 
""" t_min = Time(t_min, scale=self.scale) t_max = Time(t_max, scale=self.scale) t_delta = u.Quantity(t_delta) random_state = get_random_state(random_state) ontime = (t_max - t_min).to("s") n_step = (ontime / t_delta).to_value("").item() t_step = ontime / n_step indices = np.arange(n_step + 1) steps = indices * t_step t = Time(t_min + steps, format="mjd") pdf = self(t) sampler = InverseCDFSampler(pdf=pdf, random_state=random_state) time_pix = sampler.sample(n_events)[0] time = np.interp(time_pix, indices, steps) return t_min + time def integral(self, t_min, t_max, oversampling_factor=100, **kwargs): """Evaluate the integrated flux within the given time intervals. Parameters ---------- t_min: `~astropy.time.Time` Start times of observation. t_max: `~astropy.time.Time` Stop times of observation. oversampling_factor : int, optional Oversampling factor to be used for numerical integration. Default is 100. Returns ------- norm : float Integrated flux norm on the given time intervals. """ t_values, steps = np.linspace( t_min.mjd, t_max.mjd, oversampling_factor, retstep=True, axis=-1 ) times = Time(t_values, format="mjd", scale=self.scale) values = self(times) integral = np.sum(values, axis=-1) * steps return integral / self.time_sum(t_min, t_max).to_value("d") class ConstantTemporalModel(TemporalModel): """Constant temporal model. For more information see :ref:`constant-temporal-model`. """ tag = ["ConstantTemporalModel", "const"] @staticmethod def evaluate(time): """Evaluate at given times.""" return np.ones(time.shape) * u.one def integral(self, t_min, t_max): """Evaluate the integrated flux within the given time intervals. Parameters ---------- t_min : `~astropy.time.Time` Start times of observation. t_max : `~astropy.time.Time` Stop times of observation. Returns ------- norm : `~astropy.units.Quantity` Integrated flux norm on the given time intervals. """ return (t_max - t_min) / self.time_sum(t_min, t_max) class LinearTemporalModel(TemporalModel): """Temporal model with a linear variation. For more information see :ref:`linear-temporal-model`. Parameters ---------- alpha : float Constant term of the baseline flux. Default is 1. beta : `~astropy.units.Quantity` Time variation coefficient of the flux. Default is 0. t_ref : `~astropy.units.Quantity` The reference time in mjd. Frozen per default, at 2000-01-01. """ tag = ["LinearTemporalModel", "linear"] alpha = Parameter("alpha", 1.0, frozen=False) beta = Parameter("beta", 0.0, unit="d-1", frozen=False) _t_ref_default = Time("2000-01-01") t_ref = Parameter("t_ref", _t_ref_default.mjd, unit="day", frozen=True) @staticmethod def evaluate(time, alpha, beta, t_ref): """Evaluate at given times.""" return alpha + beta * (time - t_ref) def integral(self, t_min, t_max): """Evaluate the integrated flux within the given time intervals. Parameters ---------- t_min : `~astropy.time.Time` Start times of observation. t_max : `~astropy.time.Time` Stop times of observation. Returns ------- norm : float Integrated flux norm on the given time intervals. """ pars = self.parameters alpha = pars["alpha"] beta = pars["beta"].quantity t_ref = self.reference_time value = alpha * (t_max - t_min) + beta / 2.0 * ( (t_max - t_ref) * (t_max - t_ref) - (t_min - t_ref) * (t_min - t_ref) ) return value / self.time_sum(t_min, t_max) class ExpDecayTemporalModel(TemporalModel): r"""Temporal model with an exponential decay. For more information see :ref:`expdecay-temporal-model`. Parameters ---------- t0 : `~astropy.units.Quantity` Decay timescale. Default is 1 day. 
t_ref : `~astropy.units.Quantity` The reference time in mjd. Frozen per default, at 2000-01-01. """ tag = ["ExpDecayTemporalModel", "exp-decay"] t0 = Parameter("t0", "1 d", frozen=False) _t_ref_default = Time("2000-01-01") t_ref = Parameter("t_ref", _t_ref_default.mjd, unit="day", frozen=True) @staticmethod def evaluate(time, t0, t_ref): """Evaluate at given times.""" return np.exp(-(time - t_ref) / t0) def integral(self, t_min, t_max): """Evaluate the integrated flux within the given time intervals. Parameters ---------- t_min : `~astropy.time.Time` Start times of observation. t_max : `~astropy.time.Time` Stop times of observation. Returns ------- norm : float Integrated flux norm on the given time intervals. """ pars = self.parameters t0 = pars["t0"].quantity t_ref = self.reference_time value = self.evaluate(t_max, t0, t_ref) - self.evaluate(t_min, t0, t_ref) return -t0 * value / self.time_sum(t_min, t_max) class GaussianTemporalModel(TemporalModel): r"""A Gaussian temporal profile. For more information see :ref:`gaussian-temporal-model`. Parameters ---------- t_ref : `~astropy.units.Quantity` The reference time in mjd at the peak. Default is 2000-01-01. sigma : `~astropy.units.Quantity` Width of the gaussian profile. Default is 1 day. """ tag = ["GaussianTemporalModel", "gauss"] _t_ref_default = Time("2000-01-01") t_ref = Parameter("t_ref", _t_ref_default.mjd, unit="day", frozen=False) sigma = Parameter("sigma", "1 d", frozen=False) @staticmethod def evaluate(time, t_ref, sigma): return np.exp(-((time - t_ref) ** 2) / (2 * sigma**2)) def integral(self, t_min, t_max, **kwargs): """Evaluate the integrated flux within the given time intervals. Parameters ---------- t_min : `~astropy.time.Time` Start times of observation. t_max : `~astropy.time.Time` Stop times of observation. Returns ------- norm : float Integrated flux norm on the given time intervals. """ pars = self.parameters sigma = pars["sigma"].quantity t_ref = self.reference_time norm = np.sqrt(np.pi / 2) * sigma u_min = (t_min - t_ref) / (np.sqrt(2) * sigma) u_max = (t_max - t_ref) / (np.sqrt(2) * sigma) integral = norm * (scipy.special.erf(u_max) - scipy.special.erf(u_min)) return integral / self.time_sum(t_min, t_max) class GeneralizedGaussianTemporalModel(TemporalModel): r"""A generalized Gaussian temporal profile. For more information see :ref:`generalized-gaussian-temporal-model`. Parameters ---------- t_ref : `~astropy.units.Quantity` The time of the pulse's maximum intensity. Default is 2000-01-01. t_rise : `~astropy.units.Quantity` Rise time constant. Default is 1 day. t_decay : `~astropy.units.Quantity` Decay time constant. Default is 1 day. eta : `~astropy.units.Quantity` Inverse pulse sharpness -> higher values implies a more peaked pulse. Default is 1/2. """ tag = ["GeneralizedGaussianTemporalModel", "gengauss"] _t_ref_default = Time("2000-01-01") t_ref = Parameter("t_ref", _t_ref_default.mjd, unit="day", frozen=False) t_rise = Parameter("t_rise", "1d", frozen=False) t_decay = Parameter("t_decay", "1d", frozen=False) eta = Parameter("eta", 1 / 2, unit="", frozen=False) @staticmethod def evaluate(time, t_ref, t_rise, t_decay, eta): val_rise = np.exp( -0.5 * (np.abs(u.Quantity(time - t_ref, "d")) ** (1 / eta)) / (t_rise ** (1 / eta)) ) val_decay = np.exp( -0.5 * (np.abs(u.Quantity(time - t_ref, "d")) ** (1 / eta)) / (t_decay ** (1 / eta)) ) val = np.where(time < t_ref, val_rise, val_decay) return val class LightCurveTemplateTemporalModel(TemporalModel): """Temporal light curve model. 
The lightcurve is given at specific times (and optionally energies) as a ``norm`` It can be serialised either as an astropy table or a `~gammapy.maps.RegionNDMap` The ``norm`` is supposed to be a unit-less multiplicative factor in the model, to be multiplied with a spectral model. The model does linear interpolation for times between the given ``(time, energy, norm)`` values. When the temporal model is energy-dependent, the default interpolation scheme is linear with a log scale for the values. The interpolation method and scale values can be changed with the ``method`` and ``values_scale`` arguments. For more information see :ref:`LightCurve-temporal-model`. Examples -------- Read an example light curve object: >>> from gammapy.modeling.models import LightCurveTemplateTemporalModel >>> path = '$GAMMAPY_DATA/tests/models/light_curve/lightcrv_PKSB1222+216.fits' >>> light_curve = LightCurveTemplateTemporalModel.read(path) Show basic information about the lightcurve: >>> print(light_curve) LightCurveTemplateTemporalModel model summary: Reference time: 59000.49919925926 MJD Start time: 58999.99919925926 MJD End time: 61862.99919925926 MJD Norm min: 0.01551196351647377 Norm max: 1.0 Compute ``norm`` at a given time: >>> from astropy.time import Time >>> t = Time(59001.195, format="mjd") >>> light_curve.evaluate(t) Compute mean ``norm`` in a given time interval: >>> import astropy.units as u >>> t_r = Time(59000.5, format='mjd') >>> t_min = t_r + [1, 4, 8] * u.d >>> t_max = t_r + [1.5, 6, 9] * u.d >>> light_curve.integral(t_min, t_max) """ tag = ["LightCurveTemplateTemporalModel", "template"] _t_ref_default = Time("2000-01-01") t_ref = Parameter("t_ref", _t_ref_default.mjd, unit="day", frozen=True) def __init__(self, map, t_ref=None, filename=None, method=None, values_scale=None): if (map.data < 0).any(): log.warning("Map has negative values. Check and fix this!") self.map = map.copy() super().__init__() if t_ref: self.reference_time = t_ref self.filename = filename if method is None: method = "linear" if values_scale is None: if self.is_energy_dependent: values_scale = "log" else: values_scale = "lin" self.method = method self.values_scale = values_scale def __str__(self): start_time = self.t_ref.quantity + self.map.geom.axes["time"].edges[0] end_time = self.t_ref.quantity + self.map.geom.axes["time"].edges[-1] norm_min = np.nanmin(self.map.data) norm_max = np.nanmax(self.map.data) prnt = ( f"{self.__class__.__name__} model summary:\n " f"Reference time: {self.t_ref.value} MJD \n " f"Start time: {start_time.value} MJD \n " f"End time: {end_time.value} MJD \n " f"Norm min: {norm_min} \n" f"Norm max: {norm_max}" ) if self.is_energy_dependent: energy_min = self.map.geom.axes["energy"].center[0] energy_max = self.map.geom.axes["energy"].center[-1] prnt1 = f"Energy min: {energy_min} \n" f"Energy max: {energy_max} \n" prnt = prnt + prnt1 return prnt @property def is_energy_dependent(self): """Whether the model is energy dependent.""" return self.map.geom.has_energy_axis @classmethod def from_table(cls, table, filename=None): """Create a template model from an astropy table. Parameters ---------- table : `~astropy.table.Table` Table containing the template model. filename : str, optional Name of input file. Default is None. Returns ------- model : `LightCurveTemplateTemporalModel` Light curve template model. 
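Examples -------- A sketch of the expected table layout; the reference-time header keywords and the node values below are invented for illustration: >>> from astropy.table import Table >>> from gammapy.modeling.models import LightCurveTemplateTemporalModel >>> table = Table() >>> table["TIME"] = [0.0, 1.0, 2.0] >>> table["NORM"] = [1.0, 0.5, 0.25] >>> table.meta = {"MJDREFI": 55197, "MJDREFF": 0.0, "TIMEUNIT": "d"} >>> model = LightCurveTemplateTemporalModel.from_table(table) # doctest: +SKIP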
""" columns = [_.lower() for _ in table.colnames] if "time" not in columns: raise ValueError("A TIME column is necessary") t_ref = time_ref_from_dict(table.meta, scale="utc") nodes = table["TIME"] ax_unit = nodes.quantity.unit if not ax_unit.is_equivalent("d"): try: ax_unit = u.Unit(table.meta["TIMEUNIT"]) except KeyError: raise ValueError("Time unit not found in the table") time_axis = MapAxis.from_nodes(nodes=nodes, name="time", unit=ax_unit) axes = [time_axis] m = RegionNDMap.create(region=None, axes=axes, data=table["NORM"]) return cls(m, t_ref=t_ref, filename=filename) @classmethod def read(cls, filename, format="table"): """Read a template model. Parameters ---------- filename : str Name of file to read. format : {"table", "map"} Format of the input file. Returns ------- model : `LightCurveTemplateTemporalModel` Light curve template model. """ filename = str(make_path(filename)) if format == "table": table = Table.read(filename) return cls.from_table(table, filename=filename) elif format == "map": with fits.open(filename) as hdulist: header = hdulist["SKYMAP_BANDS"].header t_ref = time_ref_from_dict(header) # TODO : Ugly hack to prevent creating a TimeMapAxis # By default, MapAxis.from_table tries to create a # TimeMapAxis, failing which, it creates a normal MapAxis. # This ugly hack forces the fail. We need a normal Axis to # have the evaluate method work hdulist["SKYMAP_BANDS"].header.pop("MJDREFI") m = RegionNDMap.from_hdulist(hdulist) return cls(m, t_ref=t_ref, filename=filename) else: raise ValueError( f"Not a valid format: '{format}', choose from: {'table', 'map'}" ) def to_table(self): """Convert model to an astropy table.""" if self.is_energy_dependent: raise NotImplementedError("Not supported for energy dependent models") table = Table( data=[self.map.geom.axes["time"].center, self.map.quantity], names=["TIME", "NORM"], meta=time_ref_to_dict(self.reference_time, scale=self.scale), ) return table def write(self, filename, format="table", overwrite=False): """Write a model to disk as per the specified format. Parameters: filename : str Name of output file. format : {"table" or "map"} If format is "table", it is serialised as a `~astropy.table.Table`. If "map", then it is serialised as a `~gammapy.maps.RegionNDMap`. Default is "table". overwrite : bool, optional Overwrite existing file. Default is False. """ if self.filename is None: raise IOError("Missing filename") if format == "table": table = self.to_table() table.write(filename, overwrite=overwrite) elif format == "map": # RegionNDMap.from_hdulist does not update the header hdulist = self.map.to_hdulist() hdulist["SKYMAP_BANDS"].header.update( time_ref_to_dict(self.reference_time, scale=self.scale) ) hdulist.writeto(filename, overwrite=overwrite) else: raise ValueError("Not a valid format, choose from ['map', 'table']") def evaluate(self, time, t_ref=None, energy=None): """Evaluate the model at given coordinates. Parameters ---------- time: `~astropy.time.Time` Time. t_ref: `~gammapy.modeling.Parameter`, optional Reference time for the model. Default is None. energy: `~astropy.units.Quantity`, optional Energy. Default is None. Returns ------- values : `~astropy.units.Quantity` Model values. 
""" if t_ref is None: t_ref = self.reference_time t = (time - t_ref).to_value(self.map.geom.axes["time"].unit) coords = {"time": t} if self.is_energy_dependent: if energy is None: energy = self.map.geom.axes["energy"].center coords["energy"] = energy.reshape(-1, 1) val = self.map.interp_by_coord( coords, method=self.method, values_scale=self.values_scale ) val = np.clip(val, 0, a_max=None) return u.Quantity(val, unit=self.map.unit, copy=COPY_IF_NEEDED) def integral(self, t_min, t_max, oversampling_factor=100, **kwargs): if self.is_energy_dependent: raise NotImplementedError( "Integral not supported for energy dependent models" ) return super().integral(t_min, t_max, oversampling_factor, **kwargs) @classmethod def from_dict(cls, data): data = data["temporal"] filename = data["filename"] format = data.get("format", "table") return cls.read(filename, format) def _guess_format(self): if self.is_energy_dependent: format = "map" else: format = "table" log.info("Inferred format: " + format) return format def to_dict(self, full_output=False, format=None): """Create dictionary for YAML serialisation.""" data = super().to_dict(full_output) if format is None: format = self._guess_format() data["temporal"]["filename"] = self.filename data["temporal"]["format"] = format data["temporal"]["unit"] = str(self.map.unit) return data def plot(self, time_range, ax=None, n_points=100, energy=None, **kwargs): """ Plot the temporal model. Parameters ---------- time_range : `~astropy.time.Time` Times to plot the model. ax : `~matplotlib.axes.Axes`, optional Axis to plot on. Default is None. n_points : int, optional Number of bins to plot model. Default is 100. energy : `~astropy.units.quantity`, optional Energies to compute the model at for energy dependent models. Default is None. **kwargs : dict Keywords forwarded to `~matplotlib.pyplot.errorbar`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. """ if not self.is_energy_dependent: super().plot(time_range=time_range, ax=ax, n_points=n_points, **kwargs) else: time_min, time_max = Time(time_range, scale=self.scale) time_axis = TimeMapAxis.from_time_bounds( time_min=time_min, time_max=time_max, nbin=n_points ) if energy is None: energy_axis = self.map.geom.axes["energy"] else: energy_axis = MapAxis.from_nodes( nodes=energy, name="energy", interp="log" ) m = RegionNDMap.create(region=None, axes=[time_axis, energy_axis]) kwargs.setdefault("marker", "None") kwargs.setdefault("ls", "-") m.quantity = self.evaluate( time=time_axis.time_mid, energy=energy_axis.center ) ax = m.plot(axis_name="time", ax=ax, **kwargs) ax.set_ylabel("Norm / A.U.") return ax, m class PowerLawTemporalModel(TemporalModel): """Temporal model with a Power Law decay. For more information see :ref:`powerlaw-temporal-model`. Parameters ---------- alpha : float Decay time power. Default is 1. t_ref: `~astropy.units.Quantity` The reference time in mjd. Frozen by default, at 2000-01-01. t0: `~astropy.units.Quantity` The scaling time in mjd. Fixed by default, at 1 day. """ tag = ["PowerLawTemporalModel", "powerlaw"] alpha = Parameter("alpha", 1.0, frozen=False) _t_ref_default = Time("2000-01-01") t_ref = Parameter("t_ref", _t_ref_default.mjd, unit="day", frozen=True) t0 = Parameter("t0", "1 d", frozen=True) @staticmethod def evaluate(time, alpha, t_ref, t0=1 * u.day): """Evaluate at given times.""" return np.power((time - t_ref) / t0, alpha) def integral(self, t_min, t_max): """Evaluate the integrated flux within the given time intervals. 
Parameters ---------- t_min: `~astropy.time.Time` Start times of observation. t_max: `~astropy.time.Time` Stop times of observation. Returns ------- norm : float Integrated flux norm on the given time intervals. """ pars = self.parameters alpha = pars["alpha"].quantity t0 = pars["t0"].quantity t_ref = self.reference_time if alpha != -1: value = self.evaluate(t_max, alpha + 1.0, t_ref, t0) - self.evaluate( t_min, alpha + 1.0, t_ref, t0 ) return t0 / (alpha + 1.0) * value / self.time_sum(t_min, t_max) else: value = np.log((t_max - t_ref) / (t_min - t_ref)) return t0 * value / self.time_sum(t_min, t_max) class SineTemporalModel(TemporalModel): """Temporal model with a sinusoidal modulation. For more information see :ref:`sine-temporal-model`. Parameters ---------- amp : float Amplitude of the sinusoidal function. Default is 1. t_ref: `~astropy.units.Quantity` The reference time in mjd. Default is 2000-01-01. omega: `~astropy.units.Quantity` Pulsation of the signal. Default is 1 rad/day. """ tag = ["SineTemporalModel", "sinus"] amp = Parameter("amp", 1.0, frozen=False) omega = Parameter("omega", "1. rad/day", frozen=False) _t_ref_default = Time("2000-01-01") t_ref = Parameter("t_ref", _t_ref_default.mjd, unit="day", frozen=False) @staticmethod def evaluate(time, amp, omega, t_ref): """Evaluate at given times.""" return 1.0 + amp * np.sin(omega * (time - t_ref)) def integral(self, t_min, t_max): """Evaluate the integrated flux within the given time intervals. Parameters ---------- t_min: `~astropy.time.Time` Start times of observation. t_max: `~astropy.time.Time` Stop times of observation. Returns ------- norm : float Integrated flux norm on the given time intervals. """ pars = self.parameters omega = pars["omega"].quantity.to_value("rad/day") amp = pars["amp"].value t_ref = self.reference_time value = (t_max - t_min).to_value(u.day) - amp / omega * ( np.sin(omega * (t_max - t_ref).to_value(u.day)) - np.sin(omega * (t_min - t_ref).to_value(u.day)) ) return value / self.time_sum(t_min, t_max).to_value(u.day) class TemplatePhaseCurveTemporalModel(TemporalModel): """Temporal phase curve model. A timing solution is used to compute the phase corresponding to time and a template phase curve is used to determine the associated ``norm``. The phasecurve is given as a table with columns ``phase`` and ``norm``. The ``norm`` is supposed to be a unit-less multiplicative factor in the model, to be multiplied with a spectral model. The model does linear interpolation for times between the given ``(phase, norm)`` values. The implementation currently uses `scipy.interpolate. InterpolatedUnivariateSpline`, using degree ``k=1`` to get linear interpolation. This class also contains an ``integral`` method, making the computation of mean fluxes for a given time interval a one-liner. Parameters ---------- table : `~astropy.table.Table` A table with 'PHASE' vs 'NORM'. filename : str The name of the file containing the phase curve. t_ref : `~astropy.units.Quantity` The reference time in mjd. Default is 48442.5 mjd. phi_ref : `~astropy.units.Quantity` The phase at reference time. Default is 0. f0 : `~astropy.units.Quantity` The frequency at t_ref in s-1. Default is 29.946923 s-1. f1 : `~astropy.units.Quantity` The frequency derivative at t_ref in s-2. Default is 0 s-2. f2 : `~astropy.units.Quantity` The frequency second derivative at t_ref in s-3. Default is 0 s-3. 
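Examples -------- A sketch of the expected phase curve table; the pulse profile below is invented for illustration: >>> import numpy as np >>> from astropy.table import Table >>> from gammapy.modeling.models import TemplatePhaseCurveTemporalModel >>> phase = np.linspace(0, 1, 101) >>> norm = 1.0 + np.sin(np.pi * phase) ** 2 >>> table = Table(data={"PHASE": phase, "NORM": norm}) >>> model = TemplatePhaseCurveTemporalModel(table)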
""" tag = ["TemplatePhaseCurveTemporalModel", "template-phase"] _t_ref_default = Time(48442.5, format="mjd") _phi_ref_default = 0 _f0_default = 29.946923 * u.s**-1 _f1_default = 0 * u.s**-2 _f2_default = 0 * u.s**-3 t_ref = Parameter("t_ref", _t_ref_default.mjd, unit="day", frozen=True) phi_ref = Parameter("phi_ref", _phi_ref_default, unit="", frozen=True) f0 = Parameter("f0", _f0_default, frozen=True) f1 = Parameter("f1", _f1_default, frozen=True) f2 = Parameter("f2", _f2_default, frozen=True) def __init__(self, table, filename=None, **kwargs): self.table = self._normalise_table(table) if filename is not None: filename = str(make_path(filename)) self.filename = filename super().__init__(**kwargs) @staticmethod def _normalise_table(table): x = table["PHASE"].data y = table["NORM"].data interpolator = scipy.interpolate.InterpolatedUnivariateSpline( x, y, k=1, ext=2, bbox=[0.0, 1.0] ) integral = interpolator.integral(0, 1) table["NORM"] *= 1 / integral return table @classmethod def read( cls, path, t_ref=_t_ref_default.mjd * u.d, phi_ref=_phi_ref_default, f0=_f0_default, f1=_f1_default, f2=_f2_default, ): """Read phasecurve model table from FITS file. Beware : this does **not** read parameters. They will be set to defaults. Parameters ---------- path : str or `~pathlib.Path` Filename with path. """ filename = str(make_path(path)) return cls( Table.read(filename), filename=filename, t_ref=t_ref, phi_ref=phi_ref, f0=f0, f1=f1, f2=f2, ) @staticmethod def _time_to_phase(time, t_ref, phi_ref, f0, f1, f2): """Convert time to phase given timing solution parameters. Parameters ---------- time : `~astropy.units.Quantity` The time at which to compute the phase. t_ref : `~astropy.units.Quantity` The reference time in mjd. phi_ref : `~astropy.units.Quantity` The phase at reference time. Default is 0. f0 : `~astropy.units.Quantity` The frequency at t_ref in s-1. f1 : `~astropy.units.Quantity` The frequency derivative at t_ref in s-2. f2 : `~astropy.units.Quantity` The frequency second derivative at t_ref in s-3. Returns ------- phase : float Phase. period_number : int Number of period since t_ref. """ delta_t = time - t_ref phase = ( phi_ref + delta_t * (f0 + delta_t / 2.0 * (f1 + delta_t / 3 * f2)) ).to_value("") period_number = np.floor(phase) phase -= period_number return phase, period_number def write(self, path=None, overwrite=False): if path is None: path = self.filename if path is None: raise ValueError(f"filename is required for {self.tag}") else: self.filename = str(make_path(path)) self.table.write(self.filename, overwrite=overwrite) @lazyproperty def _interpolator(self): x = self.table["PHASE"].data y = self.table["NORM"].data return scipy.interpolate.InterpolatedUnivariateSpline( x, y, k=1, ext=2, bbox=[0.0, 1.0] ) def evaluate(self, time, t_ref, phi_ref, f0, f1, f2): phase, _ = self._time_to_phase(time, t_ref, phi_ref, f0, f1, f2) return self._interpolator(phase) * u.one def integral(self, t_min, t_max): """Evaluate the integrated flux within the given time intervals. Parameters ---------- t_min: `~astropy.time.Time` Start times of observation. t_max: `~astropy.time.Time` Stop times of observation. Returns ------- norm: The model integrated flux. 
""" kwargs = {par.name: par.quantity for par in self.parameters} ph_min, n_min = self._time_to_phase(t_min.mjd * u.d, **kwargs) ph_max, n_max = self._time_to_phase(t_max.mjd * u.d, **kwargs) # here we assume that the frequency does not change during the integration boundaries delta_t = (t_min - self.reference_time).to(u.d) frequency = self.f0.quantity + delta_t * ( self.f1.quantity + delta_t * self.f2.quantity / 2 ) # Compute integral of one phase phase_integral = self._interpolator.antiderivative()( 1 ) - self._interpolator.antiderivative()(0) # Multiply by the total number of phases phase_integral *= n_max - n_min - 1 # Compute integrals before first full phase and after the last full phase end_integral = self._interpolator.antiderivative()( ph_max ) - self._interpolator.antiderivative()(0) start_integral = self._interpolator.antiderivative()( 1 ) - self._interpolator.antiderivative()(ph_min) # Divide by Jacobian (here we neglect variations of frequency during the integration period) total = phase_integral + start_integral + end_integral # Normalize by total integration time n_period = (self.time_sum(t_min, t_max) * frequency).to("") if int(n_period) == 0: n_period = 1 integral_norm = total / n_period return integral_norm @classmethod def from_dict(cls, data): params = _build_parameters_from_dict( data["temporal"]["parameters"], cls.default_parameters ) filename = data["temporal"]["filename"] kwargs = {par.name: par for par in params} return cls.read(filename, **kwargs) def to_dict(self, full_output=False): """Create dictionary for YAML serialisation.""" model_dict = super().to_dict() model_dict["temporal"]["filename"] = self.filename return model_dict def plot_phasogram(self, ax=None, n_points=100, **kwargs): """ Plot phasogram of the phase model. Parameters ---------- ax : `~matplotlib.axes.Axes`, optional Axis to plot on. Default is None. n_points : int, optional Number of bins to plot model. Default is 100. **kwargs : dict Keywords forwarded to `~matplotlib.pyplot.errorbar`. Returns ------- ax : `~matplotlib.axes.Axes`, optional Matplotlib axes. """ phase_axis = MapAxis.from_bounds(0.0, 1, nbin=n_points, name="Phase", unit="") m = RegionNDMap.create(region=None, axes=[phase_axis]) kwargs.setdefault("marker", "None") kwargs.setdefault("ls", "-") kwargs.setdefault("xerr", None) m.quantity = self._interpolator(phase_axis.center) ax = m.plot(ax=ax, **kwargs) ax.set_ylabel("Norm / A.U.") return ax def sample_time(self, n_events, t_min, t_max, t_delta="1 s", random_state=0): """Sample arrival times of events. To fully cover the phase range, t_delta is the minimum between the input and product of the period at 0.5*(t_min + t_max) and the table bin size. Parameters ---------- n_events : int Number of events to sample. t_min : `~astropy.time.Time` Start time of the sampling. t_max : `~astropy.time.Time` Stop time of the sampling. t_delta : `~astropy.units.Quantity` Time step used for sampling of the temporal model. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Returns ------- time : `~astropy.units.Quantity` Array with times of the sampled events. 
""" t_delta = u.Quantity(t_delta) # Determine period at the mid time t_mid = Time(t_min, scale=self.scale) + 0.5 * (t_max - t_min) delta_t = (t_mid - self.reference_time).to(u.d) frequency = self.f0.quantity + delta_t * ( self.f1.quantity + delta_t * self.f2.quantity / 2 ) period = 1 / frequency # Take minimum time delta between user input and the period divided by the number of rows in the model table # this assumes that phase values are evenly spaced. t_delta = np.minimum(period / len(self.table), t_delta) return super().sample_time(n_events, t_min, t_max, t_delta, random_state) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.232642 gammapy-1.3/gammapy/modeling/models/tests/0000755000175100001770000000000014721316215020275 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/__init__.py0000644000175100001770000000010014721316200022367 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.232642 gammapy-1.3/gammapy/modeling/models/tests/data/0000755000175100001770000000000014721316215021206 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/data/example2.yaml0000644000175100001770000000273314721316200023606 0ustar00runnerdockercomponents: - name: example_2 type: SkyModel spectral: type: PowerLawSpectralModel parameters: - name: index value: 2.0 unit: '' min: .nan max: .nan frozen: false error: 0 - name: amplitude value: 1.0e-12 unit: cm-2 s-1 TeV-1 min: .nan max: .nan frozen: false error: 0 - name: reference value: 1.0 unit: TeV min: .nan max: .nan frozen: true error: 0 spatial: type: GaussianSpatialModel frame: icrs parameters: - name: lon_0 value: 0.0 unit: deg min: .nan max: .nan frozen: false error: 0 - name: lat_0 value: 0.0 unit: deg min: -90.0 max: 90.0 frozen: false error: 0 - name: sigma value: 1.0 unit: deg min: 0.0 max: .nan frozen: false error: 0 - name: e value: 0.0 unit: '' min: 0.0 max: 1.0 frozen: true error: 0 - name: phi value: 0.0 unit: deg min: .nan max: .nan frozen: true error: 0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/data/examples.yaml0000644000175100001770000000662314721316200023711 0ustar00runnerdockercomponents: - name: background_irf datasets_names: CTA-gc type: TemplateNPredModel filename: $GAMMAPY_DATA/tests/models/background_irf.fits parameters: - name: norm value: 1.01 scale: 1.0 unit: '' min: 0.0 max: .nan frozen: false - name: tilt value: 0.0 scale: 1.0 unit: '' min: .nan max: .nan frozen: true - name: reference value: 1.0 scale: 1.0 unit: TeV min: .nan max: .nan frozen: true - name: source0 type: SkyModel spatial: type: PointSpatialModel parameters: - name: lon_0 value: -0.5 scale: 0.01 unit: deg min: -180.0 max: 180.0 frozen: true - name: lat_0 value: -0.0005 scale: 0.01 unit: deg min: -90.0 max: 90.0 frozen: true spectral: type: ExpCutoffPowerLawSpectralModel parameters: - name: index value: 2.1 scale: 1.0 unit: '' min: .nan max: .nan frozen: false - name: amplitude value: 2.3e-12 scale: 1.0e-12 unit: cm-2 s-1 TeV-1 min: .nan max: .nan frozen: false - name: reference value: 1.0 scale: 1.0 unit: TeV min: .nan max: .nan frozen: true - name: lambda_ value: 0.006 scale: 0.1 unit: TeV-1 min: .nan 
gammapy-1.3/gammapy/modeling/models/tests/data/make.py
# Licensed under a 3-clause BSD style license - see LICENSE.rst
"""Create example model YAML files programmatically.

(Some will also be written manually.)
"""
from pathlib import Path
import numpy as np
import astropy.units as u
from gammapy.data import DataStore
from gammapy.datasets import Datasets, MapDataset
from gammapy.makers import MapDatasetMaker
from gammapy.maps import MapAxis, WcsGeom
from gammapy.modeling.models import (
    ExpCutoffPowerLawSpectralModel,
    GaussianSpatialModel,
    Models,
    PointSpatialModel,
    PowerLawSpectralModel,
    SkyModel,
    TemplateSpatialModel,
)

DATA_PATH = Path("./")


def make_example_2():
    spatial = GaussianSpatialModel(lon_0="0 deg", lat_0="0 deg", sigma="1 deg")
    model = SkyModel(PowerLawSpectralModel(), spatial, name="example_2")
    models = Models([model])
    models.write(DATA_PATH / "example2.yaml", overwrite=True, write_covariance=False)


def make_datasets_example():
    # Define which data to use and print some information
    energy_axis = MapAxis.from_edges(
        np.logspace(-1.0, 1.0, 4), unit="TeV", name="energy", interp="log"
    )
    geom0 = WcsGeom.create(
        skydir=(0, 0),
        binsz=0.1,
        width=(2, 2),
        frame="galactic",
        proj="CAR",
        axes=[energy_axis],
    )
    geom1 = WcsGeom.create(
        skydir=(1, 0),
        binsz=0.1,
        width=(2, 2),
        frame="galactic",
        proj="CAR",
        axes=[energy_axis],
    )
    geoms = [geom0, geom1]
    sources_coords = [(0, 0), (0.9, 0.1)]
    names = ["gc", "g09"]
    models = Models()

    for idx, (lon, lat) in enumerate(sources_coords):
        spatial_model = PointSpatialModel(
            lon_0=lon * u.deg, lat_0=lat * u.deg, frame="galactic"
        )
        spectral_model = ExpCutoffPowerLawSpectralModel(
            index=2 * u.Unit(""),
            amplitude=3e-12 * u.Unit("cm-2 s-1 TeV-1"),
            reference=1.0 * u.TeV,
            lambda_=0.1 / u.TeV,
        )
        model_ecpl = SkyModel(
            spatial_model=spatial_model, spectral_model=spectral_model, name=names[idx]
        )
        models.append(model_ecpl)

    models["gc"].spectral_model.reference = models["g09"].spectral_model.reference

    obs_ids = [110380, 111140, 111159]
    data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps/")

    diffuse_spatial = TemplateSpatialModel.read(
        "$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz"
    )
    diffuse_model = SkyModel(PowerLawSpectralModel(), diffuse_spatial)

    maker = MapDatasetMaker()
    datasets = Datasets()

    observations = data_store.get_observations(obs_ids)

    for idx, geom in enumerate(geoms):
        stacked = MapDataset.create(geom=geom, name=names[idx])

        for obs in observations:
            dataset = maker.run(stacked, obs)
            stacked.stack(dataset)

        bkg = stacked.models.pop(0)
        stacked.models = [models[idx], diffuse_model, bkg]
        datasets.append(stacked)

    datasets.write(
        filename="$GAMMAPY_DATA/tests/models/gc_example_datasets.yaml",
        filename_models="$GAMMAPY_DATA/tests/models/gc_example_models.yaml",
        overwrite=True,
        write_covariance=False,
    )


if __name__ == "__main__":
    make_example_2()
    make_datasets_example()
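# Hedged usage note, not part of make.py: running the script from this
# directory regenerates example2.yaml; make_datasets_example() additionally
# needs $GAMMAPY_DATA set, since it reads the CTA-1DC index files above.
#
#     $ cd gammapy/modeling/models/tests/data
#     $ python make.py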
(some will be also written manually) """ from pathlib import Path import numpy as np import astropy.units as u from gammapy.data import DataStore from gammapy.datasets import Datasets, MapDataset from gammapy.makers import MapDatasetMaker from gammapy.maps import MapAxis, WcsGeom from gammapy.modeling.models import ( ExpCutoffPowerLawSpectralModel, GaussianSpatialModel, Models, PointSpatialModel, PowerLawSpectralModel, SkyModel, TemplateSpatialModel, ) DATA_PATH = Path("./") def make_example_2(): spatial = GaussianSpatialModel(lon_0="0 deg", lat_0="0 deg", sigma="1 deg") model = SkyModel(PowerLawSpectralModel(), spatial, name="example_2") models = Models([model]) models.write(DATA_PATH / "example2.yaml", overwrite=True, write_covariance=False) def make_datasets_example(): # Define which data to use and print some information energy_axis = MapAxis.from_edges( np.logspace(-1.0, 1.0, 4), unit="TeV", name="energy", interp="log" ) geom0 = WcsGeom.create( skydir=(0, 0), binsz=0.1, width=(2, 2), frame="galactic", proj="CAR", axes=[energy_axis], ) geom1 = WcsGeom.create( skydir=(1, 0), binsz=0.1, width=(2, 2), frame="galactic", proj="CAR", axes=[energy_axis], ) geoms = [geom0, geom1] sources_coords = [(0, 0), (0.9, 0.1)] names = ["gc", "g09"] models = Models() for idx, (lon, lat) in enumerate(sources_coords): spatial_model = PointSpatialModel( lon_0=lon * u.deg, lat_0=lat * u.deg, frame="galactic" ) spectral_model = ExpCutoffPowerLawSpectralModel( index=2 * u.Unit(""), amplitude=3e-12 * u.Unit("cm-2 s-1 TeV-1"), reference=1.0 * u.TeV, lambda_=0.1 / u.TeV, ) model_ecpl = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name=names[idx] ) models.append(model_ecpl) models["gc"].spectral_model.reference = models["g09"].spectral_model.reference obs_ids = [110380, 111140, 111159] data_store = DataStore.from_dir("$GAMMAPY_DATA/cta-1dc/index/gps/") diffuse_spatial = TemplateSpatialModel.read( "$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz" ) diffuse_model = SkyModel(PowerLawSpectralModel(), diffuse_spatial) maker = MapDatasetMaker() datasets = Datasets() observations = data_store.get_observations(obs_ids) for idx, geom in enumerate(geoms): stacked = MapDataset.create(geom=geom, name=names[idx]) for obs in observations: dataset = maker.run(stacked, obs) stacked.stack(dataset) bkg = stacked.models.pop(0) stacked.models = [models[idx], diffuse_model, bkg] datasets.append(stacked) datasets.write( filename="$GAMMAPY_DATA/tests/models/gc_example_datasets.yaml", filename_models="$GAMMAPY_DATA/tests/models/gc_example_models.yaml", overwrite=True, write_covariance=False, ) if __name__ == "__main__": make_example_2() make_datasets_example() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/test_core.py0000644000175100001770000002211014721316200022624 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import sys import pytest from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord from gammapy.catalog import SourceCatalog4FGL from gammapy.maps import MapAxis, WcsGeom from gammapy.modeling import Parameter, Parameters from gammapy.modeling.models import ( GaussianSpatialModel, Model, ModelBase, Models, PointSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.testing import mpl_plot_check, requires_data class MyModel(ModelBase): """Simple model example""" x = Parameter("x", 1, "cm") y = Parameter("y", 2) 
class CoModel(ModelBase):
    """Compound model example"""

    norm = Parameter("norm", 42, "cm")

    def __init__(self, m1, m2, norm=norm.quantity):
        self.m1 = m1
        self.m2 = m2
        super().__init__(norm=norm)

    @property
    def parameters(self):
        return Parameters([self.norm]) + self.m1.parameters + self.m2.parameters


class WrapperModel(ModelBase):
    """Wrapper compound model.

    Its parameters are generated dynamically in `__init__`, and its parameter
    name "y" conflicts with the wrapped model, which also has a parameter
    called "y".
    """

    def __init__(self, m1, a=1, y=99):
        self.m1 = m1
        a = Parameter("a", a)
        y = Parameter("y", y)
        self.default_parameters = Parameters([a, y])
        super().__init__(a=a, y=y)

    @property
    def parameters(self):
        return Parameters([self.a, self.y]) + self.m1.parameters


def test_model_class():
    assert isinstance(MyModel.parameters, property)
    assert MyModel.x.name == "x"
    assert MyModel.default_parameters["x"] is MyModel.x


def test_model_class_par_init():
    x = Parameter("x", 4, "cm")
    y = Parameter("y", 10)

    model = MyModel(x=x, y=y)

    assert x is model.x
    assert y is model.y


def test_model_init():
    m = MyModel()
    assert m.x.name == "x"
    assert m.x.value == 1
    assert m.x is m.parameters[0]
    assert m.y is m.parameters[1]
    assert m.parameters is not MyModel.default_parameters

    m = MyModel(x="99 cm")
    assert m.x.value == 99
    assert m.y.value == 2

    # Currently we always convert to the default unit of a parameter,
    # so 99 m becomes 9900 cm here.
    # TODO: discuss if this is the behaviour we want, or if we instead
    # should change to the user-set unit, as long as it's compatible
    m = MyModel(x=99 * u.m)
    assert_allclose(m.x.value, 9900)
    assert m.x.unit == "cm"

    with pytest.raises(u.UnitConversionError):
        MyModel(x=99)

    with pytest.raises(u.UnitConversionError):
        MyModel(x=99 * u.s)


def test_wrapper_model():
    outer = MyModel()
    m = WrapperModel(outer)

    assert isinstance(m.a, Parameter)
    assert m.y.value == 99
    assert m.parameters.names == ["a", "y", "x", "y"]


def test_model_parameter():
    m = MyModel(x="99 cm")
    assert isinstance(m.x, Parameter)
    assert m.x.value == 99
    assert m.x.unit == "cm"

    with pytest.raises(TypeError):
        m.x = 99

    with pytest.raises(TypeError):
        m.x = 99 * u.cm


# TODO: implement parameter linking. Not working ATM!
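# Hedged sketch (hypothetical helper): the linking semantics that already
# works, as exercised by test_parameter_link_init further down, is to share
# one Parameter instance between models.
def _sketch_parameter_link():
    m1 = MyModel()
    m2 = MyModel(y=m1.y)  # same Parameter object held by both models
    m1.y.value = 100
    assert m2.y.value == 100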
def test_model_parameter_link(): # Assigning a parameter should create a link m = MyModel() par = MyModel.x.copy() m.x = par assert isinstance(m.x, Parameter) assert m.x is par # model.parameters should be in sync with attributes assert m.x is m.parameters["x"] def test_model_copy(): m = MyModel() m2 = m.copy() # Models should be independent assert m.parameters is not m2.parameters assert m.parameters[0] is not m2.parameters[0] def test_model_create(): spectral_model = Model.create( "pl-2", model_type="spectral", amplitude="1e-10 cm-2 s-1", index=3 ) assert "PowerLaw2SpectralModel" in spectral_model.tag assert_allclose(spectral_model.index.value, 3) def test_compound_model(): m1 = MyModel() m2 = MyModel(x=10 * u.cm, y=20) m = CoModel(m1, m2) assert len(m.parameters) == 5 assert m.parameters.names == ["norm", "x", "y", "x", "y"] assert_allclose(m.parameters.value, [42, 1, 2, 10, 20]) def test_parameter_link_init(): m1 = MyModel() m2 = MyModel(y=m1.y) assert m1.y is m2.y m1.y.value = 100 assert_allclose(m2.y.value, 100) def test_parameter_link(): m1 = MyModel() m2 = MyModel() m2.y = m1.y m1.y.value = 100 assert_allclose(m2.y.value, 100) @requires_data() def test_set_parameters_from_table(): # read gammapy models models = Models.read("$GAMMAPY_DATA/tests/models/gc_example_models.yaml") tab = models.to_parameters_table() tab["value"][0] = 3.0 tab["min"][0] = -10 tab["max"][0] = 10 tab["frozen"][0] = True tab["name"][0] = "index2" tab["frozen"][1] = True models.update_parameters_from_table(tab) d = models.parameters.to_dict() assert d[0]["value"] == 3.0 assert d[0]["min"] == -10 assert d[0]["max"] == 10 assert d[0]["frozen"] assert d[0]["name"] == "index" assert d[1]["frozen"] @requires_data() def test_plot_models(caplog): models = Models.read("$GAMMAPY_DATA/tests/models/gc_example_models.yaml") with mpl_plot_check(): models.plot_regions(linewidth=2) models.plot_regions() assert models.wcs_geom.data_shape == models.wcs_geom.wcs.array_shape regions = models.to_regions() assert len(regions) == 3 p1 = Model.create( "pl-2", model_type="spectral", ) g1 = Model.create("gauss", model_type="spatial") p2 = Model.create( "pl-2", model_type="spectral", ) m1 = SkyModel(spectral_model=p1, spatial_model=g1, name="m1") m2 = SkyModel(spectral_model=p2, name="m2") models = Models([m1, m2]) models.plot_regions() assert "WARNING" in [_.levelname for _ in caplog.records] assert "Skipping model m2 - no spatial component present" in [ _.message for _ in caplog.records ] def test_plot_models_empty(caplog): models = Models([]) models.plot_regions() def test_positions(): p1 = Model.create( "pl", model_type="spectral", ) g1 = Model.create("gauss", model_type="spatial") m1 = SkyModel(spectral_model=p1, spatial_model=g1, name="m1") g3 = Model.create("gauss", model_type="spatial", frame="galactic") m3 = SkyModel(spectral_model=p1, spatial_model=g3, name="m3") models = Models([m1, m3]) pos = models.positions assert_allclose(pos.galactic[0].l.value, 96.337, rtol=1e-3) def test_parameter_name(): # From the 3.12 changelog: # Exceptions raised in a class or type’s __set_name__ method are no longer # wrapped by a RuntimeError. 
if sys.version_info < (3, 12): exc_class = RuntimeError else: exc_class = ValueError with pytest.raises(exc_class): class MyTestModel: par = Parameter("wrong-name", value=3) _ = MyTestModel() @requires_data() def test_select_models(): cat = SourceCatalog4FGL() mask_models = cat.table["GLAT"].quantity > 80 * u.deg subcat = cat[mask_models] models = subcat.to_models() pos = SkyCoord(182, 25, unit="deg", frame="icrs") geom = WcsGeom.create(skydir=pos, width=2 * u.deg, binsz=0.02, frame="icrs") models_selected = models.select_from_geom(geom) assert len(models_selected) == 2 def test_to_template(): energy_bounds = [1, 100] * u.TeV energy_axis = MapAxis.from_energy_bounds( energy_bounds[0], energy_bounds[1], nbin=2, per_decade=True, name="energy_true" ) spatial_model = GaussianSpatialModel() spectral_model = PowerLawSpectralModel() geom = spatial_model._evaluation_geom.to_cube([energy_axis]) model = SkyModel( spatial_model=spatial_model, spectral_model=PowerLawSpectralModel() ) models = Models([model]) template3d = models.to_template_sky_model(geom) template_1d_direct = models.to_template_spectral_model(geom) template_1d_from3d = Models([template3d]).to_template_spectral_model(geom) energy_axis_down = energy_axis.upsample(2) values_ref = spectral_model(energy_axis_down.edges) values_direct = template_1d_direct(energy_axis_down.edges) values_from3d = template_1d_from3d(energy_axis_down.edges) assert_allclose(values_ref, values_direct, rtol=1e-5) assert_allclose(values_ref, values_from3d, rtol=1e-5) def test_add_not_unique_models(): spec_model1 = PowerLawSpectralModel() spatial_model1 = PointSpatialModel() model1 = SkyModel( spectral_model=spec_model1, spatial_model=spatial_model1, name="source1" ) model2 = SkyModel( spectral_model=spec_model1, spatial_model=spatial_model1, name="source2" ) model3 = SkyModel( spectral_model=spec_model1, spatial_model=spatial_model1, name="source3" ) model4 = SkyModel( spectral_model=spec_model1, spatial_model=spatial_model1, name="source1" ) models1 = Models([model1, model2]) models2 = Models([model3, model4]) with pytest.raises( ValueError, match="Model names must be unique. 
Models named 'source1' are duplicated.", ): models1.extend(models2) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/test_cube.py0000644000175100001770000010750414721316200022625 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import operator import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.coordinates import SkyCoord, angular_separation from astropy.time import Time from regions import CircleSkyRegion from gammapy.data.gti import GTI from gammapy.datasets.map import MapEvaluator from gammapy.irf import EDispKernel, PSFKernel from gammapy.maps import Map, MapAxis, RegionGeom, RegionNDMap, TimeMapAxis, WcsGeom from gammapy.modeling import Parameter from gammapy.modeling.models import ( CompoundSpectralModel, ConstantSpatialModel, ConstantSpectralModel, ConstantTemporalModel, FoVBackgroundModel, GaussianSpatialModel, LightCurveTemplateTemporalModel, LogParabolaSpectralModel, Models, PiecewiseNormSpatialModel, PointSpatialModel, PowerLawNormSpectralModel, PowerLawSpectralModel, PowerLawTemporalModel, SkyModel, SpatialModel, TemplateNPredModel, TemplateSpatialModel, create_fermi_isotropic_diffuse_model, ) from gammapy.utils.scripts import make_path from gammapy.utils.testing import mpl_plot_check, requires_data @pytest.fixture(scope="session") def sky_model(): spatial_model = GaussianSpatialModel( lon_0="3 deg", lat_0="4 deg", sigma="3 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel( index=2, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV" ) spectral_model.index.error = 0.1 spectral_model.amplitude.error = "1e-12 cm-2 s-1 TeV-1" temporal_model = ConstantTemporalModel() return SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, temporal_model=temporal_model, name="source-1", ) @pytest.fixture(scope="session") def gti(): start = [1, 3, 5] * u.day stop = [2, 3.5, 6] * u.day t_ref = Time(55555, format="mjd") gti = GTI.create(start, stop, reference_time=t_ref) return gti @pytest.fixture(scope="session") def diffuse_model(): axis = MapAxis.from_edges( [0.1, 1, 100], name="energy_true", unit="TeV", interp="log" ) m = Map.create( npix=(4, 3), binsz=2, axes=[axis], unit="cm-2 s-1 MeV-1 sr-1", frame="galactic" ) m.data += 42 spatial_model = TemplateSpatialModel( m, normalize=False, filename="diffuse_test.fits" ) return SkyModel(PowerLawNormSpectralModel(), spatial_model) @pytest.fixture(scope="session") def geom(): axis = MapAxis.from_edges(np.logspace(-1, 1, 3), unit=u.TeV, name="energy") return WcsGeom.create(skydir=(0, 0), npix=(5, 4), frame="galactic", axes=[axis]) @pytest.fixture(scope="session") def geom_true(): axis = MapAxis.from_edges(np.logspace(-1, 1, 4), unit=u.TeV, name="energy_true") return WcsGeom.create(skydir=(0, 0), npix=(5, 4), frame="galactic", axes=[axis]) @pytest.fixture(scope="session") def exposure(geom_true): m = Map.from_geom(geom_true) m.quantity = np.ones(geom_true.data_shape) * u.Quantity("100 m2 s") m.data[1] *= 10 return m @pytest.fixture(scope="session") def background(geom): m = Map.from_geom(geom) m.quantity = np.ones(geom.data_shape) * 1e-7 return m @pytest.fixture(scope="session") def edisp(geom, geom_true): e_reco = geom.axes["energy"] e_true = geom_true.axes["energy_true"] return EDispKernel.from_diagonal_response( energy_axis_true=e_true, energy_axis=e_reco ) @pytest.fixture(scope="session") def psf(geom_true): sigma = 0.5 * u.deg return 
PSFKernel.from_gauss(geom_true, sigma) @pytest.fixture(scope="session") def evaluator(sky_model, exposure, psf, edisp, gti): return MapEvaluator(sky_model, exposure, psf=psf, edisp=edisp, gti=gti) @pytest.fixture(scope="session") def diffuse_evaluator(diffuse_model, exposure, psf, edisp): return MapEvaluator(diffuse_model, exposure, psf=psf, edisp=edisp) @pytest.fixture(scope="session") def diffuse_evaluator_edisp_false(diffuse_model, exposure, psf, edisp): model = diffuse_model.copy() model.apply_irf["edisp"] = False return MapEvaluator(model, exposure, psf=psf, edisp=edisp) @pytest.fixture(scope="session") def sky_models(sky_model): sky_model_2 = sky_model.copy(name="source-2") sky_model_3 = sky_model.copy(name="source-3") return Models([sky_model_2, sky_model_3]) @pytest.fixture(scope="session") def sky_models_2(sky_model): sky_model_4 = sky_model.copy(name="source-4") sky_model_5 = sky_model.copy(name="source-5") return Models([sky_model_4, sky_model_5]) @requires_data() def test_sky_model_init(): with pytest.raises(TypeError): spatial_model = GaussianSpatialModel() SkyModel(spectral_model=1234, spatial_model=spatial_model) with pytest.raises(TypeError): SkyModel(spectral_model=PowerLawSpectralModel(), spatial_model=1234) # test init of energy dependent temporal models filename = make_path( "$GAMMAPY_DATA/gravitational_waves/GW_example_DC_map_file.fits.gz" ) temporal_model = LightCurveTemplateTemporalModel.read(filename, format="map") spatial_model = PointSpatialModel() spectral_model_fake = ConstantSpectralModel() model = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model_fake, temporal_model=temporal_model, name="test-source", ) assert model.name == "test-source" def test_sky_model_spatial_none_io(tmpdir): pwl = PowerLawSpectralModel() model = SkyModel(spectral_model=pwl, name="test") models = Models([model]) filename = tmpdir / "test-models-none.yaml" models.write(filename) models = Models.read(filename) assert models["test"].spatial_model is None def test_sky_model_spatial_none_evaluate(geom_true, gti): pwl = PowerLawSpectralModel() model = SkyModel(spectral_model=pwl, name="test") data = model.evaluate_geom(geom_true, gti).to_value("cm-2 s-1 TeV-1") assert data.shape == (3, 1, 1) assert_allclose(data[0], 1.256774e-11, rtol=1e-6) def test_skymodel_addition(sky_model, sky_models, sky_models_2, diffuse_model): models = sky_model + sky_model.copy() assert isinstance(models, Models) assert len(models) == 2 models = sky_model + sky_models assert isinstance(models, Models) assert len(models) == 3 models = sky_models + sky_model assert isinstance(models, Models) assert len(models) == 3 models = sky_models + diffuse_model assert isinstance(models, Models) assert len(models) == 3 models = sky_models + sky_models_2 assert isinstance(models, Models) assert len(models) == 4 models = sky_model + sky_models assert isinstance(models, Models) assert len(models) == 3 def test_background_model(background): bkg1 = TemplateNPredModel(background) bkg1.spectral_model.norm.value = 2.0 npred1 = bkg1.evaluate() assert_allclose(npred1.data[0][0][0], background.data[0][0][0] * 2.0, rtol=1e-3) assert_allclose(npred1.data.sum(), background.data.sum() * 2.0, rtol=1e-3) bkg2 = TemplateNPredModel(background) bkg2.spectral_model.norm.value = 2.0 bkg2.spectral_model.tilt.value = 0.2 bkg2.spectral_model.reference.quantity = "1000 GeV" npred2 = bkg2.evaluate() assert_allclose(npred2.data[0][0][0], 2.254e-07, rtol=1e-3) assert_allclose(npred2.data.sum(), 7.352e-06, rtol=1e-3) def 
test_background_slice(background): bkg1 = TemplateNPredModel(background) e_edges = background.geom.axes[0].edges bkg1_slice = bkg1.slice_by_energy(e_edges[0], e_edges[1]) # 1 bin slice assert bkg1_slice.name == bkg1_slice.name assert bkg1_slice.map.data.shape == bkg1.map.sum_over_axes().data.shape assert_allclose(bkg1_slice.map.data[0, :, :], bkg1.map.data[0, :, :], rtol=1e-5) def test_background_model_io(tmpdir, background): filename = str(tmpdir / "test-bkg-file.fits") bkg = TemplateNPredModel(background, filename=filename) bkg.spectral_model.norm.value = 2.0 bkg.write(overwrite=False) bkg.write(overwrite=True) bkg_dict = bkg.to_dict() bkg_read = bkg.from_dict(bkg_dict) assert_allclose( bkg_read.evaluate().data.sum(), background.data.sum() * 2.0, rtol=1e-3 ) assert bkg_read.filename == filename def test_background_model_io_missing_file(tmpdir, background): bkg = TemplateNPredModel(background, filename=None) with pytest.raises(IOError): bkg.write(overwrite=True) def test_background_model_copy(background): background_copy = background.copy() bkg = TemplateNPredModel(background_copy) bkg.map.data += 1.0 assert np.all( background_copy.data == background.data ) # Check that the original map is unchanged bkg_copy = bkg.copy() bkg_copy.map.data += 1.0 assert np.all( bkg_copy.map.data == bkg.map.data ) # Check that the map has now changed def test_parameters(sky_models): parnames = [ "index", "amplitude", "reference", "lon_0", "lat_0", "sigma", "e", "phi", ] * 2 assert sky_models.parameters.names == parnames # Check that model parameters are references to the parts p1 = sky_models.parameters["lon_0"] p2 = sky_models[0].parameters["lon_0"] assert p1 is p2 def test_str(sky_models): assert "Component 0" in str(sky_models) assert "Component 1" in str(sky_models) def test_get_item(sky_models): model = sky_models["source-2"] assert model.name == "source-2" model = sky_models["source-3"] assert model.name == "source-3" with pytest.raises(ValueError): sky_models["spam"] def test_names(sky_models): assert sky_models.names == ["source-2", "source-3"] @requires_data() def test_models_mutation(sky_model, sky_models, sky_models_2): mods = sky_models mods.insert(0, sky_model) assert mods.names == ["source-1", "source-2", "source-3"] mods.extend(sky_models_2) assert mods.names == ["source-1", "source-2", "source-3", "source-4", "source-5"] mod3 = mods[3] mods.remove(mods[3]) assert mods.names == ["source-1", "source-2", "source-3", "source-5"] mods.append(mod3) assert mods.names == ["source-1", "source-2", "source-3", "source-5", "source-4"] mods.pop(3) assert mods.names == ["source-1", "source-2", "source-3", "source-4"] with pytest.raises(ValueError, match="Model names must be unique"): mods.append(sky_model) with pytest.raises(ValueError, match="Model names must be unique"): mods.insert(0, sky_model) with pytest.raises(ValueError, match="Model names must be unique"): mods.extend(sky_models_2) with pytest.raises(ValueError, match="Model names must be unique"): mods = sky_models + sky_models_2 class TestSkyModel: @staticmethod def test_repr(sky_model): assert "SkyModel" in repr(sky_model) @staticmethod def test_str(sky_model): string_model = str(sky_model) model_lines = string_model.splitlines() assert "SkyModel" in string_model assert "2.000 +/- 0.10" in model_lines[8] @staticmethod def test_parameters(sky_model): # Check that model parameters are references to the spatial and spectral parts p1 = sky_model.parameters["lon_0"] p2 = sky_model.spatial_model.parameters["lon_0"] assert p1 is p2 p1 = 
sky_model.parameters["amplitude"] p2 = sky_model.spectral_model.parameters["amplitude"] assert p1 is p2 @staticmethod def test_evaluate_scalar(sky_model): lon = 3 * u.deg lat = 4 * u.deg energy = 1 * u.TeV q = sky_model.evaluate(lon, lat, energy) assert q.unit == "cm-2 s-1 TeV-1 sr-1" assert np.isscalar(q.value) assert_allclose(q.to_value("cm-2 s-1 TeV-1 deg-2"), 1.76879232e-13) @staticmethod def test_evaluate_array(sky_model): lon = 3 * u.deg * np.ones(shape=(3, 4)) lat = 4 * u.deg * np.ones(shape=(3, 4)) energy = [1, 1, 1, 1, 1] * u.TeV q = sky_model.evaluate(lon, lat, energy[:, np.newaxis, np.newaxis]) assert q.shape == (5, 3, 4) assert_allclose(q.to_value("cm-2 s-1 TeV-1 deg-2"), 1.76879232e-13) @staticmethod def test_processing(sky_model): assert sky_model.apply_irf == {"exposure": True, "psf": True, "edisp": True} out = sky_model.to_dict() assert "apply_irf" not in out sky_model.apply_irf["edisp"] = False out = sky_model.to_dict() assert out["apply_irf"] == {"exposure": True, "psf": True, "edisp": False} sky_model.apply_irf["edisp"] = True class Test_Template_with_cube: @staticmethod def test_evaluate_scalar(diffuse_model): # Check pixel inside map val = diffuse_model.evaluate(0 * u.deg, 0 * u.deg, 10 * u.TeV) assert val.unit == "cm-2 s-1 MeV-1 sr-1" assert val.shape == (1,) assert_allclose(val.value, 42) # Check pixel outside map (spatially) val = diffuse_model.evaluate(100 * u.deg, 0 * u.deg, 10 * u.TeV) assert_allclose(val.value, 0) # Check pixel outside energy range val = diffuse_model.evaluate(0 * u.deg, 0 * u.deg, 200 * u.TeV) assert_allclose(val.value, 0) @staticmethod def test_evaluate_array(diffuse_model): lon = 1 * u.deg * np.ones(shape=(3, 4)) lat = 2 * u.deg * np.ones(shape=(3, 4)) energy = [1, 1, 1, 1, 1] * u.TeV q = diffuse_model.evaluate(lon, lat, energy[:, np.newaxis, np.newaxis]) assert q.shape == (5, 3, 4) assert_allclose(q.value.mean(), 42) @staticmethod def test_write(tmpdir, diffuse_model): filename = tmpdir / diffuse_model.spatial_model.filename diffuse_model.spatial_model.filename = None with pytest.raises(IOError): diffuse_model.spatial_model.write() with pytest.raises(IOError): Models(diffuse_model).to_dict() diffuse_model.spatial_model.filename = filename diffuse_model.spatial_model.write(overwrite=False) TemplateSpatialModel.read(filename) @staticmethod @requires_data() def test_read(): model = TemplateSpatialModel.read( "$GAMMAPY_DATA/tests/unbundled/fermi/gll_iem_v02_cutout.fits", normalize=False, ) assert model.map.unit == "cm-2 s-1 MeV-1 sr-1" # Check pixel inside map val = model.evaluate(0 * u.deg, 0 * u.deg, energy=100 * u.GeV) assert val.unit == "cm-2 s-1 MeV-1 sr-1" assert val.shape == (1,) assert_allclose(val.value, 1.395156e-12, rtol=1e-5) @staticmethod def test_evaluation_radius(diffuse_model): radius = diffuse_model.evaluation_radius assert radius.unit == "deg" assert_allclose(radius.value, 4) @staticmethod def test_frame(diffuse_model): assert diffuse_model.frame == "galactic" @staticmethod def test_processing(diffuse_model): assert diffuse_model.apply_irf == {"exposure": True, "psf": True, "edisp": True} out = diffuse_model.to_dict() assert "apply_irf" not in out diffuse_model.apply_irf["edisp"] = False out = diffuse_model.to_dict() assert out["apply_irf"] == {"exposure": True, "psf": True, "edisp": False} diffuse_model.apply_irf["edisp"] = True @staticmethod def test_datasets_name(diffuse_model): assert diffuse_model.datasets_names is None diffuse_model.datasets_names = ["1", "2"] out = diffuse_model.to_dict() assert out["datasets_names"] == 
["1", "2"] diffuse_model.datasets_names = None out = diffuse_model.to_dict() assert "datasets_names" not in out class Test_template_cube_MapEvaluator: @staticmethod def test_compute_dnde(diffuse_evaluator): out = diffuse_evaluator.compute_dnde() assert out.shape == (3, 4, 5) out = out.to("cm-2 s-1 MeV-1 sr-1") assert_allclose(out.value.sum(), 2520.0, rtol=1e-5) assert_allclose(out.value[0, 0, 0], 42, rtol=1e-5) @staticmethod def test_compute_flux(diffuse_evaluator): out = diffuse_evaluator.compute_flux() assert out.data.shape == (3, 4, 5) out = out.quantity.to("cm-2 s-1") assert_allclose(out.value.sum(), 633263.444803, rtol=5e-3) assert_allclose(out.value[0, 0, 0], 1164.656176, rtol=5e-3) @staticmethod def test_apply_psf(diffuse_evaluator): flux = diffuse_evaluator.compute_flux() npred = diffuse_evaluator.apply_exposure(flux) out = diffuse_evaluator.apply_psf(npred) assert out.data.shape == (3, 4, 5) assert_allclose(out.data.sum(), 1.106404e12, rtol=5e-3) assert_allclose(out.data[0, 0, 0], 5.586508e08, rtol=5e-3) @staticmethod def test_apply_edisp(diffuse_evaluator): flux = diffuse_evaluator.compute_flux() npred = diffuse_evaluator.apply_exposure(flux) out = diffuse_evaluator.apply_edisp(npred) assert out.data.shape == (2, 4, 5) assert_allclose(out.data.sum(), 1.606345e12, rtol=5e-3) assert_allclose(out.data[0, 0, 0], 1.83018e10, rtol=5e-3) @staticmethod def test_compute_npred(diffuse_evaluator): out = diffuse_evaluator.compute_npred() assert out.data.shape == (2, 4, 5) assert_allclose(out.data.sum(), 1.106403e12, rtol=5e-3) assert_allclose(out.data[0, 0, 0], 8.778828e09, rtol=5e-3) @staticmethod def test_apply_edisp_false(diffuse_evaluator_edisp_false): out = diffuse_evaluator_edisp_false.compute_npred() assert "energy" in out.geom.axes.names assert out.data.shape == (2, 4, 5) assert_allclose(out.data.sum(), 1.106403e12, rtol=5e-3) assert_allclose(out.data[0, 0, 0], 8.778828e09, rtol=5e-3) class TestSkyModelMapEvaluator: @staticmethod def test_compute_dnde(evaluator): out = evaluator.compute_dnde() assert out.shape == (3, 4, 5) assert out.unit == "cm-2 s-1 TeV-1 sr-1" assert_allclose( out.to_value("cm-2 s-1 TeV-1 deg-2").sum(), 1.1788166328203174e-11, rtol=1e-5, ) assert_allclose( out.to_value("cm-2 s-1 TeV-1 deg-2")[0, 0, 0], 5.087056282039508e-13, rtol=1e-5, ) @staticmethod def test_compute_flux(evaluator): out = evaluator.compute_flux() out = out.quantity.to_value("cm-2 s-1") assert out.shape == (3, 4, 5) assert_allclose(out.sum(), 2.213817e-12, rtol=1e-5) assert_allclose(out[0, 0, 0], 7.938388e-14, rtol=1e-5) @staticmethod def test_apply_psf(evaluator): flux = evaluator.compute_flux() npred = evaluator.apply_exposure(flux) out = evaluator.apply_psf(npred) assert out.data.shape == (3, 4, 5) assert_allclose(out.data.sum(), 3.862314e-06, rtol=5e-3) assert_allclose(out.data[0, 0, 0], 4.126612e-08, rtol=5e-3) @staticmethod def test_apply_edisp(evaluator): flux = evaluator.compute_flux() npred = evaluator.apply_exposure(flux) out = evaluator.apply_edisp(npred) assert out.data.shape == (2, 4, 5) assert_allclose(out.data.sum(), 5.615601e-06, rtol=1e-5) assert_allclose(out.data[0, 0, 0], 1.33602e-07, rtol=1e-5) @staticmethod def test_compute_npred(evaluator, gti): out = evaluator.compute_npred() assert out.data.shape == (2, 4, 5) assert_allclose(out.data.sum(), 3.862314e-06, rtol=5e-3) assert_allclose(out.data[0, 0, 0], 6.94503e-08, rtol=5e-3) def test_sky_point_source(): # Test special case of point source. Regression test for GH 2367. 
energy_axis = MapAxis.from_edges( [1, 10], unit="TeV", name="energy_true", interp="log" ) exposure = Map.create( skydir=(100, 70), npix=(4, 4), binsz=0.1, proj="AIT", unit="cm2 s", axes=[energy_axis], ) exposure.data = np.ones_like(exposure.data) spatial_model = PointSpatialModel( lon_0=100.06 * u.deg, lat_0=70.03 * u.deg, frame="icrs" ) # Create a spectral model with integral flux of 1 cm-2 s-1 in this energy band spectral_model = ConstantSpectralModel(const="1 cm-2 s-1 TeV-1") spectral_model.const.value /= spectral_model.integral(1 * u.TeV, 10 * u.TeV).value model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model) evaluator = MapEvaluator(model=model, exposure=exposure) flux = evaluator.compute_flux().quantity.to_value("cm-2 s-1")[0] expected = [ [0, 0, 0, 0], [0, 0.140, 0.058, 0.0], [0, 0.564, 0.236, 0], [0, 0, 0, 0], ] assert_allclose(flux, expected, atol=0.01) assert_allclose(flux.sum(), 1) @requires_data() def test_fermi_isotropic(): filename = "$GAMMAPY_DATA/fermi_3fhl/iso_P8R2_SOURCE_V6_v06.txt" energy = [0.01, 1, 10, 100, 1000] * u.GeV coords = {"lon": 0 * u.deg, "lat": 0 * u.deg, "energy": energy} model_noextrapolate = create_fermi_isotropic_diffuse_model( filename=filename, interp_kwargs={"extrapolate": False}, ) model_extrapolate = create_fermi_isotropic_diffuse_model( filename=filename, interp_kwargs={"extrapolate": True, "method": "nearest"}, ) flux_noextrapolate = model_noextrapolate(**coords) assert_allclose( flux_noextrapolate.value, [np.nan, 5.98959823e-10, 6.26407059e-12, 2.83721193e-14, np.nan], rtol=1e-3, ) assert flux_noextrapolate.unit == "MeV-1 cm-2 s-1 sr-1" assert isinstance(model_noextrapolate.spectral_model, CompoundSpectralModel) assert_allclose( model_extrapolate(**coords).value, [2.52894e-06, 5.86237e-10, 5.78221e-12, 2.32045e-14, 2.74918e-16], rtol=1e-3, ) # No extrapolation with bounds_error with pytest.raises(ValueError): create_fermi_isotropic_diffuse_model( filename=filename, interp_kwargs={"extrapolate": False, "bounds_error": True}, ) class MyCustomGaussianModel(SpatialModel): """My custom gaussian model. 
Parameters ---------- lon_0, lat_0 : `~astropy.coordinates.Angle` Center position sigma_1TeV : `~astropy.coordinates.Angle` Width of the Gaussian at 1 TeV sigma_10TeV : `~astropy.coordinates.Angle` Width of the Gaussian at 10 TeV """ tag = "MyCustomGaussianModel" lon_0 = Parameter("lon_0", "0 deg") lat_0 = Parameter("lat_0", "0 deg", min=-90, max=90) sigma_1TeV = Parameter("sigma_1TeV", "0.5 deg", min=0) sigma_10TeV = Parameter("sigma_10TeV", "0.1 deg", min=0) @staticmethod def evaluate(lon, lat, energy, lon_0, lat_0, sigma_1TeV, sigma_10TeV): """Evaluate custom Gaussian model""" sigmas = u.Quantity([sigma_1TeV, sigma_10TeV]) energy_nodes = [1, 10] * u.TeV sigma = np.interp(energy, energy_nodes, sigmas) sigma = sigma.to("rad") sep = angular_separation(lon, lat, lon_0, lat_0) exponent = -0.5 * (sep / sigma) ** 2 norm = 1 / (2 * np.pi * sigma**2) return norm * np.exp(exponent) @property def evaluation_radius(self): """Evaluation radius (`~astropy.coordinates.Angle`).""" return 5 * self.sigma_1TeV.quantity def test_energy_dependent_model(): axis = MapAxis.from_edges(np.logspace(-1, 1, 4), unit=u.TeV, name="energy_true") geom_true = WcsGeom.create( skydir=(0, 0), binsz="0.1 deg", npix=(50, 50), frame="galactic", axes=[axis] ) spectral_model = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") spatial_model = MyCustomGaussianModel(frame="galactic") sky_model = SkyModel(spectral_model=spectral_model, spatial_model=spatial_model) model = sky_model.integrate_geom(geom_true) assert_allclose(model.data.sum(), 9.9e-11, rtol=1e-3) def test_plot_grid(geom_true): spatial_model = MyCustomGaussianModel(frame="galactic") with mpl_plot_check(): spatial_model.plot_grid(geom=geom_true) def test_sky_model_create(): m = SkyModel.create("pl", "point", name="my-source") assert isinstance(m.spatial_model, PointSpatialModel) assert isinstance(m.spectral_model, PowerLawSpectralModel) assert m.name == "my-source" def test_integrate_geom(): model = GaussianSpatialModel(lon="0d", lat="0d", sigma=0.1 * u.deg, frame="icrs") spectral_model = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") sky_model = SkyModel(spectral_model=spectral_model, spatial_model=model) center = SkyCoord("0d", "0d", frame="icrs") radius = 0.3 * u.deg square = CircleSkyRegion(center, radius) axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3, name="energy_true") geom = RegionGeom(region=square, axes=[axis], binsz_wcs="0.01deg") integral = sky_model.integrate_geom(geom).data assert_allclose(integral / 1e-12, [[[5.299]], [[2.460]], [[1.142]]], rtol=1e-3) def test_evaluate_integrate_nd_geom(): model = GaussianSpatialModel(lon="0d", lat="0d", sigma=0.1 * u.deg, frame="icrs") spectral_model = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") sky_model = SkyModel(spectral_model=spectral_model, spatial_model=model) center = SkyCoord("0d", "0d", frame="icrs") radius = 0.3 * u.deg region = CircleSkyRegion(center, radius) energy_axis = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=3, name="energy_true" ) other_axis = MapAxis.from_edges([0.0, 1.0, 2.0], name="other") wcs_geom = WcsGeom.create( width=[1, 1.2], binsz=0.05, skydir=center, axes=[energy_axis, other_axis] ) region_geom = RegionGeom( region=region, axes=[other_axis, energy_axis], binsz_wcs="0.01deg" ) evaluation = sky_model.evaluate_geom(wcs_geom) assert evaluation.shape == (2, 3, 24, 20) assert_allclose(evaluation[0], evaluation[1]) assert_allclose( evaluation.value[0, :, 12, 10], [2.278184e-07, 4.908198e-08, 1.057439e-08], rtol=1e-6, ) integral = 
sky_model.integrate_geom(wcs_geom).data assert integral.shape == (2, 3, 24, 20) assert_allclose(integral[0], integral[1]) assert_allclose(integral[0, :, 12, 10], [1.973745e-13, 9.161312e-14, 4.252304e-14]) integral = sky_model.integrate_geom(region_geom).data assert integral.shape == (3, 2, 1, 1) assert_allclose(integral[:, 0, :, :], integral[:, 1, :, :]) assert_allclose( integral[:, 0] / 1e-12, [[[5.299]], [[2.460]], [[1.142]]], rtol=1e-3 ) def test_evaluate_integrate_geom_with_time(): spatial_model = GaussianSpatialModel( lon="0d", lat="0d", sigma=0.1 * u.deg, frame="icrs" ) spectral_model = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") temporal_model = PowerLawTemporalModel() temporal_model.t_ref.value = 55000 sky_model = SkyModel( spectral_model=spectral_model, spatial_model=spatial_model, temporal_model=temporal_model, ) center = SkyCoord("0d", "0d", frame="icrs") energy_axis = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=3, name="energy_true" ) other_axis = MapAxis.from_edges([0.0, 1.0, 2.0], name="other") t_ref = Time(temporal_model.t_ref.value, format="mjd") time_min = t_ref + [1, 3, 5, 7] * u.day time_max = t_ref + [2, 4, 6, 8] * u.day time_axis = TimeMapAxis.from_time_edges(time_min=time_min, time_max=time_max) wcs_geom = WcsGeom.create( width=[1, 1.2], binsz=0.05, skydir=center, axes=[energy_axis, other_axis, time_axis], ) unit_exp = 1 / u.TeV / u.cm**2 / u.s / u.sr evaluation = sky_model.evaluate_geom(wcs_geom) assert evaluation.shape == (4, 2, 3, 24, 20) assert evaluation.unit.is_equivalent(unit_exp) assert_allclose( evaluation.value[0, 0, 1, 12, 10], 7.362297e-08, rtol=1e-6, ) radius = 0.3 * u.deg region = CircleSkyRegion(center, radius) sky_model1 = SkyModel(spectral_model=spectral_model, temporal_model=temporal_model) region_geom = RegionGeom( region=region, axes=[time_axis, energy_axis], binsz_wcs="0.01deg" ) with pytest.raises(ValueError): sky_model1.evaluate_geom(region_geom) region_geom = RegionGeom( region=region, axes=[energy_axis, time_axis], binsz_wcs="0.01deg" ) evaluation = sky_model1.evaluate_geom(geom=region_geom) assert evaluation.shape == (4, 3, 1, 1) assert_allclose( evaluation[0].value, [[[6.96238325e-12]], [[1.5000000e-12]], [[3.23165204e-13]]], rtol=1e-6, ) unit_exp = 1 / u.TeV / u.cm**2 / u.s assert evaluation.unit.is_equivalent(unit_exp) integral = sky_model1.integrate_geom(geom=region_geom) assert integral.geom.data_shape == (4, 3, 1, 1) assert_allclose( integral.data[0], [[[8.03761675e-12]], [[3.73073122e-12]], [[1.73165204e-12]]], rtol=1e-6, ) unit_exp = 1 / u.cm**2 / u.s assert integral.unit.is_equivalent(unit_exp) def test_evaluate_integrate_geom_with_time_and_gti(): spatial_model = GaussianSpatialModel( lon="0d", lat="0d", sigma=0.1 * u.deg, frame="icrs" ) spectral_model = PowerLawSpectralModel(amplitude="1e-11 cm-2 s-1 TeV-1") temporal_model = PowerLawTemporalModel() temporal_model.t_ref.value = 55000 sky_model = SkyModel( spectral_model=spectral_model, spatial_model=spatial_model, temporal_model=temporal_model, ) center = SkyCoord("0d", "0d", frame="icrs") start = np.linspace(0, 8, 10) * u.day stop = np.linspace(0.5, 8.5, 10) * u.day t_ref = Time(temporal_model.t_ref.value, format="mjd") gti = GTI.create(start, stop, reference_time=t_ref) energy_axis = MapAxis.from_energy_bounds( "1 TeV", "10 TeV", nbin=3, name="energy_true" ) other_axis = MapAxis.from_edges([0.0, 1.0, 2.0], name="other") time_min = t_ref + [1, 3, 5, 7] * u.day time_max = t_ref + [2, 4, 6, 8] * u.day time_axis = TimeMapAxis.from_time_edges(time_min=time_min, 
time_max=time_max) wcs_geom = WcsGeom.create( width=[1, 1.2], binsz=0.05, skydir=center, axes=[energy_axis, other_axis, time_axis], ) evaluation = sky_model.evaluate_geom(geom=wcs_geom, gti=gti) assert evaluation.shape == (4, 2, 3, 24, 20) unit_exp = 1 / u.TeV / u.cm**2 / u.s / u.sr assert evaluation.unit.is_equivalent(unit_exp) assert_allclose( evaluation.value[0, 0, 1, 12, 10], 7.102014e-08, rtol=1e-6, ) radius = 0.3 * u.deg region = CircleSkyRegion(center, radius) sky_model1 = SkyModel(spectral_model=spectral_model, temporal_model=temporal_model) region_geom = RegionGeom( region=region, axes=[time_axis, energy_axis], binsz_wcs="0.01deg" ) with pytest.raises(ValueError): sky_model1.evaluate_geom(region_geom, gti) region_geom = RegionGeom( region=region, axes=[energy_axis, time_axis], binsz_wcs="0.01deg" ) evaluation = sky_model1.evaluate_geom(geom=region_geom, gti=gti) assert evaluation.shape == (4, 3, 1, 1) assert_allclose( evaluation[0].value, [[[6.71623839e-12]], [[1.44696970e-12]], [[3.11740171e-13]]], rtol=1e-3, ) unit_exp = 1 / u.TeV / u.cm**2 / u.s assert evaluation.unit.is_equivalent(unit_exp) integral = sky_model1.integrate_geom(geom=region_geom, gti=gti) assert integral.geom.data_shape == (4, 3, 1, 1) assert_allclose( integral.data[0], [[[7.75345858e-12]], [[3.59883668e-12]], [[1.67043201e-12]]], rtol=1e-3, ) unit_exp = 1 / u.cm**2 / u.s assert integral.unit.is_equivalent(unit_exp) def test_compound_spectral_model(caplog): spatial_model = GaussianSpatialModel( lon_0="3 deg", lat_0="4 deg", sigma="3 deg", frame="galactic" ) pwl = PowerLawSpectralModel( index=2, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV" ) lp = LogParabolaSpectralModel( amplitude="1e-12 cm-2 s-1 TeV-1", reference="10 TeV", alpha=2.0, beta=1.0 ) temporal_model = ConstantTemporalModel() spectral_model = CompoundSpectralModel(pwl, lp, operator.add) m = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, temporal_model=temporal_model, name="source-1", ) assert_allclose(m.spectral_model(5 * u.TeV).value, 2.87e-12, rtol=1e-2) def test_sky_model_contributes_point_region(): model = SkyModel.create("pl", "point") geom = RegionGeom.create("icrs;point(0.05, 0.05)", binsz_wcs="0.01 deg") mask = RegionNDMap.from_geom(geom) assert np.any(model.contributes(mask)) def test_spatial_model_background(background): geom = background.geom spatial_model = ConstantSpatialModel(frame="galactic") identical_npred = TemplateNPredModel( background, spatial_model=spatial_model ).evaluate() assert_allclose(identical_npred, background.data) reference = FoVBackgroundModel( spatial_model=None, dataset_name="test" ).evaluate_geom(geom) identical = FoVBackgroundModel( spatial_model=spatial_model, dataset_name="test" ).evaluate_geom(geom) assert_allclose(identical, reference) spatial_model2 = ConstantSpatialModel(frame="galactic") spatial_model2.value.value = 2 twice_npred = TemplateNPredModel( background, spatial_model=spatial_model2 ).evaluate() assert_allclose(twice_npred, background.data * 2) twice = FoVBackgroundModel( spatial_model=spatial_model2, dataset_name="test" ).evaluate_geom(geom) assert_allclose(twice, reference * 2) def test_spatial_model_io_background(tmp_path, background): spatial_model = ConstantSpatialModel(frame="galactic") fbkg_irf = str(tmp_path / "background_irf_test.fits") model = TemplateNPredModel(background, spatial_model=None, filename=fbkg_irf) model.write() model_dict = model.to_dict() assert "spatial" not in model_dict new_model = TemplateNPredModel.from_dict(model_dict) assert 
new_model.spatial_model is None model = TemplateNPredModel( background, spatial_model=spatial_model, filename=fbkg_irf ) model_dict = model.to_dict() assert "spatial" in model_dict new_model = TemplateNPredModel.from_dict(model_dict) assert isinstance(new_model.spatial_model, ConstantSpatialModel) assert new_model.spatial_model.frame == "icrs" model = FoVBackgroundModel(spatial_model=None, dataset_name="test") model_dict = model.to_dict() assert "spatial" not in model_dict new_model = FoVBackgroundModel.from_dict(model_dict) assert new_model.spatial_model is None model = FoVBackgroundModel(spatial_model=spatial_model, dataset_name="test") model_dict = model.to_dict() assert "spatial" in model_dict new_model = FoVBackgroundModel.from_dict(model_dict) assert isinstance(new_model.spatial_model, ConstantSpatialModel) def test_piecewise_spatial_model_background(background): geom = background.geom coords = geom.to_image().get_coord().flat spatial_model = PiecewiseNormSpatialModel(coords, frame="galactic") identical_npred = TemplateNPredModel( background, spatial_model=spatial_model ).evaluate() assert_allclose(identical_npred, background.data) reference = Map.from_geom(geom, data=1) identical = FoVBackgroundModel( spatial_model=spatial_model, dataset_name="test" ).evaluate_geom(geom) assert_allclose(identical, reference) spatial_model2 = PiecewiseNormSpatialModel( coords, norms=2 * np.ones(coords.shape[0]), frame="galactic" ) twice_npred = TemplateNPredModel( background, spatial_model=spatial_model2 ).evaluate() assert_allclose(twice_npred, background.data * 2) twice = FoVBackgroundModel( spatial_model=spatial_model2, dataset_name="test" ).evaluate_geom(geom) assert_allclose(twice, reference * 2.0) copied = FoVBackgroundModel(spatial_model=spatial_model, dataset_name="test").copy() assert isinstance(copied.spatial_model, PiecewiseNormSpatialModel) assert "Spatial model type" in copied.__str__() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/test_io.py0000644000175100001770000004415014721316200022313 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import operator import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.utils.data import get_pkg_data_filename from gammapy.maps import Map, MapAxis, RegionNDMap from gammapy.modeling.models import ( MODEL_REGISTRY, CompoundSpectralModel, ConstantTemporalModel, EBLAbsorptionNormSpectralModel, ExpDecayTemporalModel, FoVBackgroundModel, GaussianTemporalModel, LinearTemporalModel, LogParabolaSpectralModel, Model, Models, PiecewiseNormSpectralModel, PowerLawSpectralModel, PowerLawTemporalModel, SineTemporalModel, SkyModel, TemplateNPredModel, ) from gammapy.utils.deprecation import GammapyDeprecationWarning from gammapy.utils.scripts import make_path, read_yaml, to_yaml, write_yaml from gammapy.utils.testing import requires_data, requires_dependency @pytest.fixture(scope="session") @requires_data() def models(): filename = get_pkg_data_filename("./data/examples.yaml") models_data = read_yaml(filename) models = Models.from_dict(models_data) return models @requires_data() def test_dict_to_skymodels(models): assert len(models) == 5 model0 = models[0] assert isinstance(model0, TemplateNPredModel) assert model0.name == "background_irf" model0 = models[1] assert "ExpCutoffPowerLawSpectralModel" in model0.spectral_model.tag assert "PointSpatialModel" in model0.spatial_model.tag pars0 
= model0.parameters assert pars0["index"].value == 2.1 assert pars0["index"].unit == "" assert np.isnan(pars0["index"].max) assert np.isnan(pars0["index"].min) assert not pars0["index"].frozen assert pars0["lon_0"].value == -0.5 assert pars0["lon_0"].unit == "deg" assert pars0["lon_0"].max == 180.0 assert pars0["lon_0"].min == -180.0 assert pars0["lon_0"].frozen assert pars0["lat_0"].value == -0.0005 assert pars0["lat_0"].unit == "deg" assert pars0["lat_0"].max == 90.0 assert pars0["lat_0"].min == -90.0 assert pars0["lat_0"].frozen assert pars0["lambda_"].value == 0.006 assert pars0["lambda_"].unit == "TeV-1" assert np.isnan(pars0["lambda_"].min) assert np.isnan(pars0["lambda_"].max) model1 = models[2] assert "pl" in model1.spectral_model.tag assert "PowerLawSpectralModel" in model1.spectral_model.tag assert "DiskSpatialModel" in model1.spatial_model.tag assert "disk" in model1.spatial_model.tag assert "LightCurveTemplateTemporalModel" in model1.temporal_model.tag pars1 = model1.parameters assert pars1["index"].value == 2.2 assert pars1["index"].unit == "" assert pars1["lat_0"].scale == 1.0 assert pars1["lat_0"].factor == pars1["lat_0"].value assert np.isnan(pars1["index"].max) assert np.isnan(pars1["index"].min) assert pars1["r_0"].unit == "deg" model2 = models[3] assert_allclose(model2.spectral_model.energy.data, [34.171, 44.333, 57.517]) assert model2.spectral_model.energy.unit == "MeV" assert_allclose( model2.spectral_model.values.data, [2.52894e-06, 1.2486e-06, 6.14648e-06] ) assert model2.spectral_model.values.unit == "1 / (cm2 MeV s)" assert "TemplateSpectralModel" in model2.spectral_model.tag assert "TemplateSpatialModel" in model2.spatial_model.tag assert not model2.spatial_model.normalize @requires_data() def test_sky_models_io(tmpdir, models): models.covariance = np.eye(len(models.parameters)) models.write(tmpdir / "tmp.yaml", full_output=True, overwrite_templates=False) models = Models.read(tmpdir / "tmp.yaml") assert models._covar_file == "tmp_covariance.dat" assert_allclose(models.covariance.data, np.eye(len(models.parameters))) assert_allclose(models.parameters["lat_0"].min, -90.0) # TODO: not sure if we should just round-trip, or if we should # check YAML file content (e.g. 
against a ref file in the repo) # or check serialised dict content @requires_data() def test_sky_models_checksum(tmpdir, models): import yaml models.write( tmpdir / "tmp.yaml", full_output=True, overwrite_templates=False, checksum=True ) file = open(tmpdir / "tmp.yaml", "rb") yaml_content = file.read() file.close() assert "checksum: " in str(yaml_content) data = yaml.safe_load(yaml_content) data["checksum"] = "bad" yaml_str = yaml.dump( data, sort_keys=False, indent=4, width=80, default_flow_style=False ) path = make_path(tmpdir) / "bad_checksum.yaml" path.write_text(yaml_str) with pytest.warns(UserWarning): Models.read(tmpdir / "bad_checksum.yaml", checksum=True) @requires_data() def test_sky_models_io_auto_write(tmp_path, models): models_new = models.copy() fsource2 = str(tmp_path / "source2_test.fits") fbkg_iem = str(tmp_path / "cube_iem_test.fits") fbkg_irf = str(tmp_path / "background_irf_test.fits") models_new["source2"].spatial_model.filename = fsource2 models_new["cube_iem"].spatial_model.filename = fbkg_iem models_new["background_irf"].filename = fbkg_irf models_new.write(tmp_path / "tmp.yaml", full_output=True) models = Models.read(tmp_path / "tmp.yaml") assert models._covar_file == "tmp_covariance.dat" assert models["source2"].spatial_model.filename == fsource2 assert models["cube_iem"].spatial_model.filename == fbkg_iem assert models["background_irf"].filename == fbkg_irf assert_allclose( models_new["source2"].spatial_model.map.data, models["source2"].spatial_model.map.data, ) assert_allclose( models_new["cube_iem"].spatial_model.map.data, models["cube_iem"].spatial_model.map.data, ) assert_allclose( models_new["background_irf"].map.data, models["background_irf"].map.data ) def test_piecewise_norm_spectral_model_init(): with pytest.raises(ValueError): PiecewiseNormSpectralModel( energy=[1] * u.TeV, norms=[1, 5], ) with pytest.raises(ValueError): PiecewiseNormSpectralModel( energy=[1] * u.TeV, norms=[1], ) def test_piecewise_norm_spectral_model_io(): energy = [1, 3, 7, 10] * u.TeV norms = [1, 5, 3, 0.5] * u.Unit("") model = PiecewiseNormSpectralModel(energy=energy, norms=norms) model.parameters["norm_0"].value = 2 model_dict = model.to_dict() parnames = [_["name"] for _ in model_dict["spectral"]["parameters"]] for k, parname in enumerate(parnames): assert parname == f"norm_{k}" new_model = PiecewiseNormSpectralModel.from_dict(model_dict) assert_allclose(new_model.parameters["norm_0"].value, 2) assert_allclose(new_model.energy, energy) assert_allclose(new_model.norms, [2, 5, 3, 0.5]) bkg = FoVBackgroundModel(spectral_model=model, dataset_name="") bkg_dict = bkg.to_dict() new_bkg = FoVBackgroundModel.from_dict(bkg_dict) assert_allclose(new_bkg.spectral_model.parameters["norm_0"].value, 2) assert_allclose(new_bkg.spectral_model.energy, energy) assert_allclose(new_bkg.spectral_model.norms, [2, 5, 3, 0.5]) @requires_data() def test_absorption_io_invalid_path(tmp_path): dominguez = EBLAbsorptionNormSpectralModel.read_builtin("dominguez", redshift=0.5) dominguez.filename = "/not/a/valid/path/ebl_dominguez11.fits.gz" assert len(dominguez.parameters) == 2 model_dict = dominguez.to_dict() parnames = [_["name"] for _ in model_dict["spectral"]["parameters"]] assert parnames == [ "alpha_norm", "redshift", ] new_model = EBLAbsorptionNormSpectralModel.from_dict(model_dict) assert new_model.redshift.value == 0.5 assert new_model.alpha_norm.name == "alpha_norm" assert new_model.alpha_norm.value == 1 assert_allclose(new_model.energy, dominguez.energy) assert_allclose(new_model.param, 
@requires_data()
def test_absorption_io_invalid_path(tmp_path):
    dominguez = EBLAbsorptionNormSpectralModel.read_builtin("dominguez", redshift=0.5)
    dominguez.filename = "/not/a/valid/path/ebl_dominguez11.fits.gz"
    assert len(dominguez.parameters) == 2

    model_dict = dominguez.to_dict()
    parnames = [_["name"] for _ in model_dict["spectral"]["parameters"]]
    assert parnames == [
        "alpha_norm",
        "redshift",
    ]

    new_model = EBLAbsorptionNormSpectralModel.from_dict(model_dict)

    assert new_model.redshift.value == 0.5
    assert new_model.alpha_norm.name == "alpha_norm"
    assert new_model.alpha_norm.value == 1
    assert_allclose(new_model.energy, dominguez.energy)
    assert_allclose(new_model.param, dominguez.param)
    assert len(new_model.parameters) == 2

    dominguez.filename = "/not/a/valid/path/dominguez.fits.gz"
    model_dict = dominguez.to_dict()
    with pytest.raises(IOError):
        EBLAbsorptionNormSpectralModel.from_dict(model_dict)


@requires_data()
def test_absorption_io_no_filename(tmp_path):
    dominguez = EBLAbsorptionNormSpectralModel.read_builtin("dominguez", redshift=0.5)
    dominguez.filename = None
    assert len(dominguez.parameters) == 2

    model_dict = dominguez.to_dict()
    parnames = [_["name"] for _ in model_dict["spectral"]["parameters"]]
    assert parnames == [
        "alpha_norm",
        "redshift",
    ]

    new_model = EBLAbsorptionNormSpectralModel.from_dict(model_dict)

    assert new_model.redshift.value == 0.5
    assert new_model.alpha_norm.name == "alpha_norm"
    assert new_model.alpha_norm.value == 1
    assert_allclose(new_model.energy, dominguez.energy)
    assert_allclose(new_model.param, dominguez.param)
    assert len(new_model.parameters) == 2


@requires_data()
def test_absorption_io(tmp_path):
    dominguez = EBLAbsorptionNormSpectralModel.read_builtin("dominguez", redshift=0.5)
    assert len(dominguez.parameters) == 2

    model_dict = dominguez.to_dict()
    parnames = [_["name"] for _ in model_dict["spectral"]["parameters"]]
    assert parnames == [
        "alpha_norm",
        "redshift",
    ]

    new_model = EBLAbsorptionNormSpectralModel.from_dict(model_dict)

    assert new_model.redshift.value == 0.5
    assert new_model.alpha_norm.name == "alpha_norm"
    assert new_model.alpha_norm.value == 1
    assert_allclose(new_model.energy, dominguez.energy)
    assert_allclose(new_model.param, dominguez.param)
    assert len(new_model.parameters) == 2

    model = EBLAbsorptionNormSpectralModel(
        u.Quantity(range(3), "keV"),
        u.Quantity(range(2), ""),
        u.Quantity(np.ones((2, 3)), ""),
        redshift=0.5,
        alpha_norm=1,
    )
    model_dict = model.to_dict()
    new_model = EBLAbsorptionNormSpectralModel.from_dict(model_dict)
    assert_allclose(new_model.energy, model.energy)
    assert_allclose(new_model.param, model.param)
    assert_allclose(new_model.data, model.data)

    write_yaml(to_yaml(model_dict), tmp_path / "tmp.yaml")
    read_yaml(tmp_path / "tmp.yaml")


@requires_dependency("naima")
def test_naima_model():
    import naima

    particle_distribution = naima.models.PowerLaw(
        amplitude=2e33 / u.eV, e_0=10 * u.TeV, alpha=2.5
    )
    radiative_model = naima.radiative.PionDecay(particle_distribution, nh=1 * u.cm**-3)
    yield Model.create(
        "NaimaSpectralModel", "spectral", radiative_model=radiative_model
    )


def make_all_models():
    """Make an instance of each model, for testing."""
    yield Model.create("ConstantSpatialModel", "spatial")
    map_constantmodel = Map.create(npix=(10, 20), unit="sr-1")
    yield Model.create("TemplateSpatialModel", "spatial", map=map_constantmodel)
    yield Model.create(
        "DiskSpatialModel", "spatial", lon_0="1 deg", lat_0="2 deg", r_0="3 deg"
    )
    yield Model.create("gauss", "spatial", lon_0="1 deg", lat_0="2 deg", sigma="3 deg")
    yield Model.create("PointSpatialModel", "spatial", lon_0="1 deg", lat_0="2 deg")
    yield Model.create(
        "ShellSpatialModel",
        "spatial",
        lon_0="1 deg",
        lat_0="2 deg",
        radius="3 deg",
        width="4 deg",
    )
    yield Model.create("ConstantSpectralModel", "spectral", const="99 cm-2 s-1 TeV-1")
    yield Model.create(
        "CompoundSpectralModel",
        "spectral",
        model1=Model.create("PowerLawSpectralModel", "spectral"),
        model2=Model.create("PowerLawSpectralModel", "spectral"),
        operator=np.add,
    )
    yield Model.create("PowerLawSpectralModel", "spectral")
    yield Model.create("PowerLawNormSpectralModel", "spectral")
    yield Model.create("PowerLaw2SpectralModel", "spectral")
    yield Model.create("ExpCutoffPowerLawSpectralModel", "spectral")
    with pytest.warns(GammapyDeprecationWarning):
        yield Model.create("ExpCutoffPowerLawNormSpectralModel", "spectral")
    yield Model.create("ExpCutoffPowerLaw3FGLSpectralModel", "spectral")
    yield Model.create("SuperExpCutoffPowerLaw3FGLSpectralModel", "spectral")
    yield Model.create("SuperExpCutoffPowerLaw4FGLDR3SpectralModel", "spectral")
    yield Model.create("SuperExpCutoffPowerLaw4FGLSpectralModel", "spectral")
    yield Model.create("LogParabolaSpectralModel", "spectral")
    yield Model.create(
        "TemplateSpectralModel", "spectral", energy=[1, 2] * u.cm, values=[3, 4] * u.cm
    )
    # TODO: add unit validation?
    yield Model.create(
        "PiecewiseNormSpectralModel",
        "spectral",
        energy=[1, 2] * u.cm,
        norms=[3, 4] * u.cm,
    )
    yield Model.create("GaussianSpectralModel", "spectral")
    yield Model.create(
        "EBLAbsorptionNormSpectralModel",
        "spectral",
        energy=[0, 1, 2] * u.keV,
        param=[0, 1],
        data=np.ones((2, 3)),
        redshift=0.1,
        alpha_norm=1.0,
    )
    yield Model.create("ScaleSpectralModel", "spectral", model=PowerLawSpectralModel())
    yield Model.create("ConstantTemporalModel", "temporal")
    yield Model.create("LinearTemporalModel", "temporal")
    yield Model.create("PowerLawTemporalModel", "temporal")
    yield Model.create("SineTemporalModel", "temporal")
    m = RegionNDMap.create(region=None)
    yield Model.create("LightCurveTemplateTemporalModel", "temporal", m)
    yield Model.create(
        "SkyModel",
        spatial_model=Model.create("ConstantSpatialModel", "spatial"),
        spectral_model=Model.create("PowerLawSpectralModel", "spectral"),
    )
    m1 = Map.create(
        npix=(10, 20, 30), axes=[MapAxis.from_nodes([1, 2] * u.TeV, name="energy")]
    )
    yield Model.create("TemplateSpatialModel", "spatial", map=m1)
    m2 = Map.create(
        npix=(10, 20, 30), axes=[MapAxis.from_edges([1, 2] * u.TeV, name="energy")]
    )
    yield Model.create("TemplateNPredModel", map=m2)


@pytest.mark.parametrize("model_class", MODEL_REGISTRY)
def test_all_model_classes(model_class):
    if isinstance(model_class.tag, list):
        assert model_class.tag[0] == model_class.__name__
    else:
        assert model_class.tag == model_class.__name__


@pytest.mark.parametrize("model", make_all_models())
def test_all_model_instances(model):
    tag = model.tag[0] if isinstance(model.tag, list) else model.tag
    assert tag == model.__class__.__name__


@requires_data()
def test_missing_parameters(models):
    assert models["source1"].spatial_model.e in models.parameters
    assert len(models["source1"].spatial_model.parameters) == 6


def test_simplified_output():
    model = PowerLawSpectralModel()
    full = model.to_dict(full_output=True)
    simplified = model.to_dict()
    for k, _ in enumerate(model.parameters.names):
        for item in ["min", "max", "error"]:
            assert item in full["spectral"]["parameters"][k]
            assert item not in simplified["spectral"]["parameters"][k]


def test_registries_print():
    assert "Registry" in str(MODEL_REGISTRY)


def test_io_temporal():
    classes = [
        ConstantTemporalModel,
        LinearTemporalModel,
        ExpDecayTemporalModel,
        GaussianTemporalModel,
        PowerLawTemporalModel,
        SineTemporalModel,
    ]
    for c in classes:
        sky_model = SkyModel(
            spectral_model=PowerLawSpectralModel(), temporal_model=c()
        )
        model_dict = sky_model.to_dict()
        read_model = SkyModel.from_dict(model_dict)
        for p in sky_model.temporal_model.parameters:
            assert_allclose(read_model.temporal_model.parameters[p.name].value, p.value)
            assert read_model.temporal_model.parameters[p.name].unit == p.unit
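
# Illustrative sketch (assumption, not from the original file): Model.create
# accepts either the class name or its short tag, as exercised in
# make_all_models above, and parameter values can be passed as keyword
# arguments. The name `_example_model_create` is hypothetical.
def _example_model_create():
    pwl = Model.create("pl", "spectral", index=2.5)
    gauss = Model.create(
        "gauss", "spatial", lon_0="0 deg", lat_0="0 deg", sigma="1 deg"
    )
    return SkyModel(spectral_model=pwl, spatial_model=gauss, name="example")
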
dict_["components"][0]["spectral"]["parameters"][2]["link"] label1 = dict_["components"][1]["spectral"]["parameters"][2]["link"] assert label0 == label1 txt = skymodels.__str__() lines = txt.splitlines() n_link = 0 for line in lines: if "@" in line: assert "reference" in line n_link += 1 assert n_link == 2 def test_to_dict_not_default(): model = PowerLawSpectralModel() model.index.min = -1 model.index.max = -5 model.index.frozen = True mdict = model.to_dict(full_output=False) index_dict = mdict["spectral"]["parameters"][0] assert "min" in index_dict assert "max" in index_dict assert "frozen" in index_dict assert "error" not in index_dict assert "interp" not in index_dict assert "scale_method" not in index_dict model_2 = model.from_dict(mdict) assert model_2.index.min == model.index.min assert model_2.index.max == model.index.max assert model_2.index.frozen == model.index.frozen def test_to_dict_unfreeze_parameters_frozen_by_default(): model = PowerLawSpectralModel() mdict = model.to_dict(full_output=False) index_dict = mdict["spectral"]["parameters"][2] assert "frozen" not in index_dict model.reference.frozen = False mdict = model.to_dict(full_output=False) index_dict = mdict["spectral"]["parameters"][2] assert index_dict["frozen"] is False def test_compound_models_io(tmp_path): m1 = PowerLawSpectralModel() m2 = LogParabolaSpectralModel() m = CompoundSpectralModel(m1, m2, operator.add) sk = SkyModel(spectral_model=m, name="model") Models([sk]).write(tmp_path / "test.yaml") sk1 = Models.read(tmp_path / "test.yaml") assert_allclose(sk1.covariance.data, sk.covariance.data, rtol=1e-3) assert_allclose(np.sum(sk1.covariance.data), 0.0) assert Models([sk]).parameters_unique_names == [ "model.spectral.model1.index", "model.spectral.model1.amplitude", "model.spectral.model1.reference", "model.spectral.model2.amplitude", "model.spectral.model2.reference", "model.spectral.model2.alpha", "model.spectral.model2.beta", ] def test_meta_io(caplog, tmp_path): m = PowerLawSpectralModel() sk = SkyModel(spectral_model=m, name="model") Models([sk]).write(tmp_path / "test.yaml") sk_dict = read_yaml(tmp_path / "test.yaml") assert "metadata" in sk_dict assert "Gammapy" in sk_dict["metadata"]["creator"] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/test_management.py0000644000175100001770000003064114721316200024020 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion from gammapy.maps import Map, MapAxis, RegionGeom, WcsGeom from gammapy.modeling import Covariance from gammapy.modeling.models import ( FoVBackgroundModel, GaussianSpatialModel, Models, PointSpatialModel, PowerLawSpectralModel, SkyModel, TemplateNPredModel, ) @pytest.fixture(scope="session") def backgrounds(): axis = MapAxis.from_edges(np.logspace(-1, 1, 3), unit=u.TeV, name="energy") geom = WcsGeom.create(skydir=(0, 0), npix=(5, 4), frame="galactic", axes=[axis]) m = Map.from_geom(geom) m.quantity = np.ones(geom.data_shape) * 1e-7 background1 = TemplateNPredModel(m, name="bkg1", datasets_names="dataset-1") background2 = TemplateNPredModel(m, name="bkg2", datasets_names=["dataset-2"]) backgrounds = [background1, background2] return backgrounds @pytest.fixture(scope="session") def models(backgrounds): spatial_model = GaussianSpatialModel( lon_0="3 deg", 
lat_0="4 deg", sigma="3 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel( index=2, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV" ) model1 = SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="source-1", ) model2 = model1.copy(name="source-2") model2.datasets_names = ["dataset-1"] model3 = model1.copy(name="source-3") model3.datasets_names = "dataset-2" model3.spatial_model = PointSpatialModel(frame="galactic") model3.parameters.freeze_all() models = Models([model1, model2, model3] + backgrounds) return models @pytest.fixture(scope="session") def models_gauss(): model1 = SkyModel.create("pl", "gauss", name="source-1") model1.spatial_model.sigma.value = 0.1 model1.spatial_model.lon_0.value = 0 model1.spatial_model.lat_0.value = 0 model2 = SkyModel.create("pl", "gauss", name="source-2") model2.spatial_model.sigma.value = 0.1 model2.spatial_model.lon_0.value = 1.1 model2.spatial_model.lat_0.value = 0 model3 = SkyModel.create("pl", "gauss", name="source-3") model3.spatial_model.sigma.value = 0.1 model3.spatial_model.lon_0.value = -1.8 model3.spatial_model.lat_0.value = 0 return Models([model1, model2, model3]) def test_select_region(models): center_sky = SkyCoord(3, 4, unit="deg", frame="galactic") circle_sky_12 = CircleSkyRegion(center=center_sky, radius=1 * u.deg) selected = models.select_region([circle_sky_12]) assert selected.names == ["source-1", "source-2"] center_sky = SkyCoord(0, 0.5, unit="deg", frame="galactic") circle_sky_3 = CircleSkyRegion(center=center_sky, radius=1 * u.deg) selected = models.select_region([circle_sky_3]) assert selected.names == ["source-3"] selected = models.select_region([circle_sky_3, circle_sky_12]) assert selected.names == ["source-1", "source-2", "source-3"] def test_select_mask(models_gauss): center_sky = SkyCoord("0d", "0d") circle = CircleSkyRegion(center=center_sky, radius=1 * u.deg) axis = MapAxis.from_energy_edges(np.logspace(-1, 1, 3), unit="TeV") geom = WcsGeom.create(skydir=center_sky, width=(5, 4), axes=[axis], binsz=0.02) mask = geom.region_mask([circle]) contribute = models_gauss.select_mask(mask, use_evaluation_region=True) assert contribute.names == ["source-1", "source-2"] inside = models_gauss.select_mask(mask, use_evaluation_region=False) assert inside.names == ["source-1"] contribute_margin = models_gauss.select_mask( mask, margin=0.6 * u.deg, use_evaluation_region=True ) assert contribute_margin.names == ["source-1", "source-2", "source-3"] def test_contributes(): center_sky = SkyCoord(3, 4, unit="deg", frame="galactic") circle_sky_12 = CircleSkyRegion(center=center_sky, radius=1 * u.deg) axis = MapAxis.from_edges(np.logspace(-1, 1, 3), unit=u.TeV, name="energy") geom = WcsGeom.create(skydir=(3, 4), npix=(5, 4), frame="galactic", axes=[axis]) mask = geom.region_mask([circle_sky_12]) spatial_model = GaussianSpatialModel( lon_0="0 deg", lat_0="0 deg", sigma="0.9 deg", frame="galactic" ) assert spatial_model.evaluation_region.height == 2 * spatial_model.evaluation_radius model4 = SkyModel( spatial_model=spatial_model, spectral_model=PowerLawSpectralModel(), name="source-4", ) assert model4.contributes(mask, margin=0 * u.deg) def test_contributes_region_mask(): axis = MapAxis.from_edges(np.logspace(-1, 1, 3), unit=u.TeV, name="energy") geom = RegionGeom.create( "galactic;circle(0, 0, 0.2)", axes=[axis], binsz_wcs="0.05 deg" ) mask = Map.from_geom(geom, unit="", dtype="bool") mask.data[...] 
def test_contributes_region_mask():
    axis = MapAxis.from_edges(np.logspace(-1, 1, 3), unit=u.TeV, name="energy")
    geom = RegionGeom.create(
        "galactic;circle(0, 0, 0.2)", axes=[axis], binsz_wcs="0.05 deg"
    )
    mask = Map.from_geom(geom, unit="", dtype="bool")
    mask.data[...] = True

    spatial_model1 = GaussianSpatialModel(
        lon_0="0.2 deg", lat_0="0 deg", sigma="0.1 deg", frame="galactic"
    )
    spatial_model2 = PointSpatialModel(
        lon_0="0.3 deg", lat_0="0.3 deg", frame="galactic"
    )
    model1 = SkyModel(
        spatial_model=spatial_model1,
        spectral_model=PowerLawSpectralModel(),
        name="source-1",
    )
    model2 = SkyModel(
        spatial_model=spatial_model2,
        spectral_model=PowerLawSpectralModel(),
        name="source-2",
    )
    assert model1.contributes(mask, margin=0 * u.deg)
    assert not model2.contributes(mask, margin=0 * u.deg)
    assert model2.contributes(mask, margin=0.3 * u.deg)


def test_select(models):
    conditions = [
        {"datasets_names": "dataset-1"},
        {"datasets_names": "dataset-2"},
        {"datasets_names": ["dataset-1", "dataset-2"]},
        {"datasets_names": None},
        {"tag": "TemplateNPredModel"},
        {"tag": ["SkyModel", "TemplateNPredModel"]},
        {"tag": "point", "model_type": "spatial"},
        {"tag": ["point", "gauss"], "model_type": "spatial"},
        {"tag": "pl", "model_type": "spectral"},
        {"tag": ["pl", "pl-norm"], "model_type": "spectral"},
        {"name_substring": "bkg"},
        {"frozen": True},
        {"frozen": False},
        {"datasets_names": "dataset-1", "tag": "pl", "model_type": "spectral"},
    ]

    expected = [
        ["source-1", "source-2", "bkg1"],
        ["source-1", "source-3", "bkg2"],
        ["source-1", "source-2", "source-3", "bkg1", "bkg2"],
        ["source-1", "source-2", "source-3", "bkg1", "bkg2"],
        ["bkg1", "bkg2"],
        ["source-1", "source-2", "source-3", "bkg1", "bkg2"],
        ["source-3"],
        ["source-1", "source-2", "source-3"],
        ["source-1", "source-2", "source-3"],
        ["source-1", "source-2", "source-3", "bkg1", "bkg2"],
        ["bkg1", "bkg2"],
        ["source-3"],
        ["source-1", "source-2", "bkg1", "bkg2"],
        ["source-1", "source-2"],
    ]

    for cdt, xp in zip(conditions, expected):
        selected = models.select(**cdt)
        assert selected.names == xp

    mask = models.selection_mask(**conditions[4]) | models.selection_mask(
        **conditions[6]
    )
    selected = models[mask]
    assert selected.names == ["source-3", "bkg1", "bkg2"]


def test_restore_status(models):
    model = models[1].spectral_model
    covariance_data = np.array([[1.0, 1.0, 0.0], [1.0, 1.0, 0.0], [0.0, 0.0, 0.0]])
    # the covariance is reset for frozen parameters
    # because of from_factor_matrix (used by the optimizer),
    # so if amplitude is frozen we get
    covariance_frozen = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])

    model.covariance = Covariance.from_factor_matrix(model.parameters, np.ones((2, 2)))
    assert_allclose(model.covariance.data, covariance_data)

    with models.restore_status(restore_values=True):
        model.amplitude.value = 0
        model.amplitude.frozen = True
        model.covariance = Covariance.from_factor_matrix(
            model.parameters, np.ones((1, 1))
        )
        assert_allclose(model.covariance.data, covariance_frozen)
        assert model.amplitude.value == 0
        assert model.amplitude.error == 0

    assert_allclose(model.amplitude.value, 1e-11)
    assert model.amplitude.frozen is False
    assert isinstance(models.covariance, Covariance)
    assert_allclose(model.covariance.data, covariance_data)
    assert model.amplitude.error == 1

    with models.parameters.restore_status(restore_values=True):
        model.amplitude.value = 0
        model.amplitude.frozen = True
        assert model.amplitude.value == 0
        assert model.amplitude.frozen
    assert_allclose(model.amplitude.value, 1e-11)
    assert not model.amplitude.frozen

    with models.parameters.restore_status(restore_values=False):
        model.amplitude.value = 0
        model.amplitude.frozen = True
    assert model.amplitude.value == 0
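
# Illustrative sketch (assumption, not from the original file): select()
# returns a new Models object, so selection criteria can be combined either as
# keyword arguments, as above, or via boolean selection masks.
def _example_select(models):
    pl_on_dataset1 = models.select(
        datasets_names="dataset-1", tag="pl", model_type="spectral"
    )
    return pl_on_dataset1.names
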
def test_bounds(models):
    models.set_parameters_bounds(
        tag="pl",
        model_type="spectral",
        parameters_names="index",
        min=0,
        max=5,
        value=2.4,
    )
    pl_mask = models.selection_mask(tag="pl", model_type="spectral")
    assert np.all([m.spectral_model.index.value == 2.4 for m in models[pl_mask]])
    assert np.all([m.spectral_model.index.min == 0 for m in models[pl_mask]])
    assert np.all([m.spectral_model.index.max == 5 for m in models[pl_mask]])

    models.set_parameters_bounds(
        tag=["pl", "pl-norm"],
        model_type="spectral",
        min=0,
        max=None,
    )
    assert np.all([m.spectral_model.amplitude.min == 0 for m in models[pl_mask]])

    bkg_mask = models.selection_mask(tag="TemplateNPredModel")
    assert np.all([m.spectral_model.norm.min == 0 for m in models[bkg_mask]])


def test_freeze(models):
    models.freeze()
    assert np.all([p.frozen for p in models.parameters])

    models.unfreeze()
    assert not models["source-1"].spatial_model.lon_0.frozen
    assert models["source-1"].spectral_model.reference.frozen
    assert not models["source-3"].spatial_model.lon_0.frozen
    assert models["bkg1"].spectral_model.tilt.frozen
    assert not models["bkg1"].spectral_model.norm.frozen

    models.freeze("spatial")
    assert models["source-1"].spatial_model.lon_0.frozen
    assert models["source-3"].spatial_model.lon_0.frozen
    assert not models["bkg1"].spectral_model.norm.frozen

    models.unfreeze("spatial")
    assert not models["source-1"].spatial_model.lon_0.frozen
    assert models["source-1"].spectral_model.reference.frozen
    assert not models["source-3"].spatial_model.lon_0.frozen

    models.freeze("spectral")
    assert models["bkg1"].spectral_model.tilt.frozen
    assert models["bkg1"].spectral_model.norm.frozen
    assert models["source-1"].spectral_model.index.frozen
    assert not models["source-3"].spatial_model.lon_0.frozen

    models.unfreeze("spectral")
    assert models["bkg1"].spectral_model.tilt.frozen
    assert not models["bkg1"].spectral_model.norm.frozen
    assert not models["source-1"].spectral_model.index.frozen


def test_parameters(models):
    pars = models.parameters.select(frozen=False)
    pars.freeze_all()
    assert np.all([p.frozen for p in pars])
    assert len(pars.select(frozen=True)) == len(pars)

    pars.unfreeze_all()
    assert np.all([not p.frozen for p in pars])
    assert len(pars.min) == len(pars)
    assert len(pars.max) == len(pars)

    with pytest.raises(TypeError):
        pars.min = 1
    with pytest.raises(ValueError):
        pars.min = [1]
    with pytest.raises(ValueError):
        pars.max = [2]


def test_fov_bkg_models():
    models = Models(
        [FoVBackgroundModel(dataset_name=name) for name in ["test-1", "test-2"]]
    )
    models.freeze()
    assert models.frozen

    models.parameters.select(name="tilt").unfreeze_all()
    assert not models["test-1-bkg"].spectral_model.tilt.frozen
    assert not models["test-2-bkg"].spectral_model.tilt.frozen

    models.parameters.select(name=["tilt", "norm"]).freeze_all()
    assert models["test-1-bkg"].spectral_model.tilt.frozen
    assert models["test-1-bkg"].spectral_model.norm.frozen


def test_reassign_dataset(models):
    ref = models.select(datasets_names="dataset-2")
    models = models.reassign("dataset-2", "dataset-2-copy")
    assert len(models.select(datasets_names="dataset-2")) == np.sum(
        [m.datasets_names is None for m in models]
    )
    new = models.select(datasets_names="dataset-2-copy")
    assert len(new) == len(ref)
    assert new["source-1"].datasets_names is None
    assert new["source-3"].datasets_names == ["dataset-2-copy"]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/gammapy/modeling/models/tests/test_prior.py0000644000175100001770000000406614721316200023041 0ustar00runnerdocker
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import pytest
from numpy.testing import assert_allclose
import astropy.units as u
from gammapy.modeling.models import PRIOR_REGISTRY, GaussianPrior, Model, UniformPrior
from gammapy.utils.testing import assert_quantity_allclose

TEST_PRIORS = [
    dict(
        name="gaussian",
        model=GaussianPrior(mu=4.0, sigma=1.0),
        val_at_0=16.0,
        val_at_1=9.0,
        val_with_weight_2=32.0,
    ),
    dict(
        name="uni",
        model=UniformPrior(min=0.0),
        val_at_0=1.0,
        val_at_1=0.0,
        val_with_weight_2=2.0,
    ),
]


@pytest.mark.parametrize("prior", TEST_PRIORS)
def test_priors(prior):
    model = prior["model"]
    for p in model.parameters:
        assert p.type == "prior"

    value_0 = model(0.0 * u.Unit(""))
    value_1 = model(1.0 * u.Unit(""))
    assert_allclose(value_0, prior["val_at_0"], rtol=1e-7)
    assert_allclose(value_1, prior["val_at_1"], rtol=1e-7)

    model.weight = 2.0
    value_0_weight = model(0.0 * u.Unit(""))
    assert_allclose(value_0_weight, prior["val_with_weight_2"], rtol=1e-7)


def test_to_from_dict():
    prior = TEST_PRIORS[1]
    model = prior["model"]
    model.weight = 2.0
    model_dict = model.to_dict()
    # Here we reverse the order of parameters list to ensure assignment is correct
    model_dict["prior"]["parameters"].reverse()

    model_class = PRIOR_REGISTRY.get_cls(model_dict["prior"]["type"])
    new_model = model_class.from_dict(model_dict)
    assert isinstance(new_model, UniformPrior)

    actual = [par.value for par in new_model.parameters]
    desired = [par.value for par in model.parameters]
    assert_quantity_allclose(actual, desired)
    assert_allclose(model.weight, new_model.weight, rtol=1e-7)

    new_model = Model.from_dict(model_dict)
    assert isinstance(new_model, UniformPrior)

    actual = [par.value for par in new_model.parameters]
    desired = [par.value for par in model.parameters]
    assert_quantity_allclose(actual, desired)
    assert_allclose(model.weight, new_model.weight, rtol=1e-7)
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/gammapy/modeling/models/tests/test_spatial.py0000644000175100001770000006723014721316200023345 0ustar00runnerdocker
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import logging
from pathlib import Path
import pytest
import numpy as np
from numpy.testing import assert_allclose
import astropy.units as u
from astropy.coordinates import SkyCoord
from astropy.wcs import WCS
from regions import (
    CircleAnnulusSkyRegion,
    CircleSkyRegion,
    EllipseSkyRegion,
    PointSkyRegion,
    RectangleSkyRegion,
)
from gammapy.maps import Map, MapAxis, MapCoord, RegionGeom, WcsGeom, WcsNDMap
from gammapy.modeling.models import (
    SPATIAL_MODEL_REGISTRY,
    ConstantSpatialModel,
    DiskSpatialModel,
    FoVBackgroundModel,
    GaussianSpatialModel,
    GeneralizedGaussianSpatialModel,
    PiecewiseNormSpatialModel,
    PointSpatialModel,
    PowerLawSpectralModel,
    Shell2SpatialModel,
    ShellSpatialModel,
    SkyModel,
    TemplateNDSpatialModel,
    TemplateSpatialModel,
)
from gammapy.utils.testing import mpl_plot_check, requires_data, requires_dependency


def test_sky_point_source():
    geom = WcsGeom.create(skydir=(2.4, 2.3), npix=(10, 10), binsz=0.3)
    model = PointSpatialModel(lon_0="2.5 deg", lat_0="2.5 deg", frame="icrs")

    assert model.is_energy_dependent is False

    assert model.evaluation_radius.unit == "deg"
    assert_allclose(model.evaluation_radius.value, 0)

    assert model.frame == "icrs"

    assert_allclose(model.position.ra.deg, 2.5)
    assert_allclose(model.position.dec.deg, 2.5)

    val = model.evaluate_geom(geom)
    assert val.unit == "sr-1"
    assert_allclose(np.sum(val * geom.solid_angle()), 1)

    assert isinstance(model.to_region(), PointSkyRegion)
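
# Illustrative sketch (assumption, not from the original file): a normalised
# spatial model integrates to one over the sky, which is what the solid-angle
# weighted sum above checks on a pixelised geometry.
def _example_point_source_normalisation():
    geom = WcsGeom.create(skydir=(0, 0), npix=(50, 50), binsz=0.1)
    model = PointSpatialModel(lon_0="0 deg", lat_0="0 deg", frame="icrs")
    total = np.sum(model.evaluate_geom(geom) * geom.solid_angle())
    return total  # expected to be close to 1 (dimensionless)
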
deg", sigma=sigma) assert model.parameters["sigma"].min == 0 val_0 = model(5 * u.deg, 15 * u.deg) val_sigma = model(5 * u.deg, 16 * u.deg) assert val_0.unit == "sr-1" ratio = val_0 / val_sigma assert_allclose(ratio, np.exp(0.5)) radius = model.evaluation_radius assert radius.unit == "deg" assert_allclose(radius.value, 5 * sigma.value) assert_allclose(model.evaluation_bin_size_min, (1.0 / 3.0) * u.deg) # test the normalization for an elongated Gaussian near the Galactic Plane m_geom_1 = WcsGeom.create( binsz=0.05, width=(20, 20), skydir=(2, 2), frame="galactic", proj="AIT" ) coords = m_geom_1.get_coord() solid_angle = m_geom_1.solid_angle() lon = coords.lon lat = coords.lat sigma = 3 * u.deg model_1 = GaussianSpatialModel( lon_0=2 * u.deg, lat_0=2 * u.deg, sigma=sigma, e=0.8, phi=30 * u.deg ) vals_1 = model_1(lon, lat) assert vals_1.unit == "sr-1" assert_allclose(np.sum(vals_1 * solid_angle), 1, rtol=1.0e-3) radius = model_1.evaluation_radius assert radius.unit == "deg" assert_allclose(radius.value, 5 * sigma.value) # check the ratio between the value at the peak and on the 1-sigma isocontour sigma = 4 * u.deg semi_minor = 2 * u.deg e = np.sqrt(1 - (semi_minor / sigma) ** 2) model_2 = GaussianSpatialModel( lon_0=0 * u.deg, lat_0=0 * u.deg, sigma=sigma, e=e, phi=0 * u.deg ) val_0 = model_2(0 * u.deg, 0 * u.deg) val_major = model_2(0 * u.deg, 4 * u.deg) val_minor = model_2(2 * u.deg, 0 * u.deg) assert val_0.unit == "sr-1" ratio_major = val_0 / val_major ratio_minor = val_0 / val_minor assert_allclose(ratio_major, np.exp(0.5)) assert_allclose(ratio_minor, np.exp(0.5)) # check the rotation model_3 = GaussianSpatialModel( lon_0=0 * u.deg, lat_0=0 * u.deg, sigma=sigma, e=e, phi=90 * u.deg ) val_minor_rotated = model_3(0 * u.deg, 2 * u.deg) ratio_minor_rotated = val_0 / val_minor_rotated assert_allclose(ratio_minor_rotated, np.exp(0.5)) assert isinstance(model.to_region(), EllipseSkyRegion) @pytest.mark.parametrize("eta", np.arange(0.1, 1.01, 0.3)) @pytest.mark.parametrize("r_0", np.arange(0.01, 1.01, 0.3)) @pytest.mark.parametrize("e", np.arange(0.0, 0.801, 0.4)) def test_generalized_gaussian(eta, r_0, e): # check normalization is robust for a large set of values model = GeneralizedGaussianSpatialModel( eta=eta, r_0=r_0 * u.deg, e=e, frame="galactic" ) width = np.maximum(2 * model.evaluation_radius.to_value("deg"), 0.5) geom = WcsGeom.create( skydir=(0, 0), binsz=0.02, width=width, frame="galactic", ) integral = model.integrate_geom(geom) assert integral.unit.is_equivalent("") assert_allclose(integral.data.sum(), 1.0, atol=5e-3) def test_generalized_gaussian_io(): model = GeneralizedGaussianSpatialModel(e=0.5) reg = model.to_region() assert isinstance(reg, EllipseSkyRegion) assert_allclose(reg.width.value, 1.73205, rtol=1e-5) new_model = GeneralizedGaussianSpatialModel.from_dict(model.to_dict()) assert isinstance(new_model, GeneralizedGaussianSpatialModel) def test_sky_disk(): # Test the disk case (e=0) r_0 = 2 * u.deg model = DiskSpatialModel(lon_0="1 deg", lat_0="45 deg", r_0=r_0) lon = [1, 5, 359] * u.deg lat = 46 * u.deg val = model(lon, lat) assert val.unit == "sr-1" desired = [261.263956, 0, 261.263956] assert_allclose(val.value, desired) radius = model.evaluation_radius assert radius.unit == "deg" assert_allclose(radius.to_value("deg"), 2.222) assert_allclose(model.evaluation_bin_size_min, 0.198 * u.deg) # test the normalization for an elongated ellipse near the Galactic Plane m_geom_1 = WcsGeom.create( binsz=0.015, width=(20, 20), skydir=(2, 2), frame="galactic", proj="AIT" ) coords = 
def test_sky_disk():
    # Test the disk case (e=0)
    r_0 = 2 * u.deg
    model = DiskSpatialModel(lon_0="1 deg", lat_0="45 deg", r_0=r_0)
    lon = [1, 5, 359] * u.deg
    lat = 46 * u.deg
    val = model(lon, lat)
    assert val.unit == "sr-1"
    desired = [261.263956, 0, 261.263956]
    assert_allclose(val.value, desired)
    radius = model.evaluation_radius
    assert radius.unit == "deg"
    assert_allclose(radius.to_value("deg"), 2.222)
    assert_allclose(model.evaluation_bin_size_min, 0.198 * u.deg)

    # test the normalization for an elongated ellipse near the Galactic Plane
    m_geom_1 = WcsGeom.create(
        binsz=0.015, width=(20, 20), skydir=(2, 2), frame="galactic", proj="AIT"
    )
    coords = m_geom_1.get_coord()
    solid_angle = m_geom_1.solid_angle()
    lon = coords.lon
    lat = coords.lat
    r_0 = 10 * u.deg
    model_1 = DiskSpatialModel(
        lon_0=2 * u.deg, lat_0=2 * u.deg, r_0=r_0, e=0.4, phi=30 * u.deg
    )
    vals_1 = model_1(lon, lat)
    assert vals_1.unit == "sr-1"
    assert_allclose(np.sum(vals_1 * solid_angle), 1, rtol=1.0e-3)

    radius = model_1.evaluation_radius
    assert radius.unit == "deg"
    assert_allclose(radius.to_value("deg"), 11.11)

    # test rotation
    r_0 = 2 * u.deg
    semi_minor = 1 * u.deg
    eccentricity = np.sqrt(1 - (semi_minor / r_0) ** 2)
    model_rot_test = DiskSpatialModel(
        lon_0=0 * u.deg, lat_0=0 * u.deg, r_0=r_0, e=eccentricity, phi=90 * u.deg
    )
    assert_allclose(model_rot_test(0 * u.deg, 1.5 * u.deg).value, 0)

    # test the normalization for a disk (ellipse with e=0) at the Galactic Pole
    m_geom_2 = WcsGeom.create(
        binsz=0.1, width=(6, 6), skydir=(0, 90), frame="galactic", proj="AIT"
    )
    coords = m_geom_2.get_coord()
    lon = coords.lon
    lat = coords.lat

    r_0 = 5 * u.deg
    disk = DiskSpatialModel(lon_0=0 * u.deg, lat_0=90 * u.deg, r_0=r_0)
    vals_disk = disk(lon, lat)

    solid_angle = 2 * np.pi * (1 - np.cos(5 * u.deg))
    assert_allclose(np.max(vals_disk).value * solid_angle, 1)

    assert isinstance(model.to_region(), EllipseSkyRegion)


def test_sky_disk_edge():
    r_0 = 2 * u.deg
    model = DiskSpatialModel(
        lon_0="0 deg",
        lat_0="0 deg",
        r_0=r_0,
        e=0.5,
        phi="0 deg",
    )
    value_center = model(0 * u.deg, 0 * u.deg)
    value_edge = model(0 * u.deg, r_0)
    assert_allclose((value_edge / value_center).to_value(""), 0.5)

    edge = model.edge_width.value * r_0
    value_edge_pwidth = model(0 * u.deg, r_0 + edge / 2)
    assert_allclose((value_edge_pwidth / value_center).to_value(""), 0.05)

    value_edge_nwidth = model(0 * u.deg, r_0 - edge / 2)
    assert_allclose((value_edge_nwidth / value_center).to_value(""), 0.95)


def test_disk_from_region():
    region = EllipseSkyRegion(
        center=SkyCoord(20, 17, unit="deg"),
        height=0.3 * u.deg,
        width=1.0 * u.deg,
        angle=30 * u.deg,
    )
    disk = DiskSpatialModel.from_region(region, frame="galactic")
    assert_allclose(disk.parameters["lon_0"].value, 132.666, rtol=1e-2)
    assert_allclose(disk.parameters["lat_0"].value, -45.33118067, rtol=1e-2)
    assert_allclose(disk.parameters["r_0"].quantity, 0.5 * u.deg, rtol=1e-2)
    assert_allclose(disk.parameters["e"].value, 0.9539, rtol=1e-2)
    assert_allclose(disk.parameters["phi"].quantity, 110.946048 * u.deg)

    reg1 = disk.to_region()
    assert_allclose(reg1.height, region.width, rtol=1e-2)

    center = SkyCoord(20, 17, unit="deg", frame="galactic")
    region = EllipseSkyRegion(
        center=center,
        height=1 * u.deg,
        width=0.3 * u.deg,
        angle=30 * u.deg,
    )
    disk = DiskSpatialModel.from_region(region, frame="icrs")
    reg1 = disk.to_region()
    assert_allclose(reg1.angle, -30.323 * u.deg, rtol=1e-2)
    assert_allclose(reg1.height, region.height, rtol=1e-3)

    region = CircleSkyRegion(center=region.center, radius=1.0 * u.deg)
    disk = DiskSpatialModel.from_region(region)
    assert_allclose(disk.parameters["e"].value, 0.0, rtol=1e-2)
    assert_allclose(disk.parameters["lon_0"].value, 20, rtol=1e-2)
    assert disk.frame == "galactic"

    geom = WcsGeom.create(skydir=center, npix=(10, 10), binsz=0.3)
    res = disk.evaluate_geom(geom)
    assert_allclose(np.sum(res.value), 50157.904662)

    region = PointSkyRegion(center=region.center)
    with pytest.raises(ValueError):
        DiskSpatialModel.from_region(region)
def test_from_position():
    center = SkyCoord(20, 17, unit="deg")
    spatial_model = GaussianSpatialModel.from_position(
        position=center, sigma=0.5 * u.deg
    )
    geom = WcsGeom.create(skydir=center, npix=(10, 10), binsz=0.3)
    res = spatial_model.evaluate_geom(geom)
    assert_allclose(np.sum(res.value), 36307.440813)

    model = SkyModel(
        spectral_model=PowerLawSpectralModel(), spatial_model=spatial_model
    )
    assert_allclose(model.position.ra.value, center.ra.value, rtol=1e-3)


def test_sky_shell():
    width = 2 * u.deg
    rad = 2 * u.deg
    model = ShellSpatialModel(lon_0="1 deg", lat_0="45 deg", radius=rad, width=width)
    lon = [1, 2, 4] * u.deg
    lat = 45 * u.deg
    val = model(lon, lat)
    assert val.unit == "deg-2"
    desired = [55.979449, 57.831651, 94.919895]
    assert_allclose(val.to_value("sr-1"), desired)
    radius = model.evaluation_radius
    assert radius.unit == "deg"
    assert_allclose(radius.value, rad.value + width.value)
    assert isinstance(model.to_region(), CircleAnnulusSkyRegion)
    assert_allclose(model.evaluation_bin_size_min, 2 * u.deg)


def test_sky_shell2():
    width = 2 * u.deg
    rad = 2 * u.deg
    model = Shell2SpatialModel(lon_0="1 deg", lat_0="45 deg", r_0=rad + width, eta=0.5)
    lon = [1, 2, 4] * u.deg
    lat = 45 * u.deg
    val = model(lon, lat)
    assert val.unit == "deg-2"
    desired = [55.979449, 57.831651, 94.919895]
    assert_allclose(val.to_value("sr-1"), desired)
    radius = model.evaluation_radius
    assert radius.unit == "deg"
    assert_allclose(radius.value, rad.value + width.value)
    assert_allclose(model.r_in.value, rad.value)
    assert isinstance(model.to_region(), CircleAnnulusSkyRegion)
    assert_allclose(model.evaluation_bin_size_min, 2 * u.deg)


def test_sky_diffuse_constant():
    model = ConstantSpatialModel(value="42 sr-1")
    lon = [1, 2] * u.deg
    lat = 45 * u.deg
    val = model(lon, lat)
    assert val.unit == "sr-1"
    assert_allclose(val.value, 42)
    radius = model.evaluation_radius
    assert radius is None
    assert isinstance(model.to_region(), RectangleSkyRegion)


@requires_data()
@requires_dependency("ipywidgets")
def test_sky_diffuse_map(caplog):
    filename = "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v18/Templates/RXJ1713_2016_250GeV.fits"  # noqa: E501
    model = TemplateSpatialModel.read(filename, normalize=False)
    lon = [258.5, 0] * u.deg
    lat = -39.8 * u.deg
    val = model(lon, lat)

    assert "WARNING" in [_.levelname for _ in caplog.records]
    assert "Missing spatial template unit, assuming sr^-1" in [
        _.message for _ in caplog.records
    ]

    assert val.unit == "sr-1"
    desired = [3265.6559, 0]
    assert_allclose(val.value, desired)

    res = model.evaluate_geom(model.map.geom)
    assert_allclose(np.sum(res.value), 32826159.74707)
    radius = model.evaluation_radius

    assert radius.unit == "deg"
    assert_allclose(radius.value, 0.64, rtol=1.0e-2)
    assert model.frame == "fk5"

    assert isinstance(model.to_region(), RectangleSkyRegion)

    with pytest.raises(TypeError):
        model.plot_interactive()

    with pytest.raises(TypeError):
        model.plot_grid()

    # change central position
    model.lon_0.value = 12.0
    model.lat_0.value = 6
    val = model([11.8, 12.8] * u.deg, 6.1 * u.deg)
    assert_allclose(val.value, [2850.8103, 89.629447], rtol=1e-3)

    # test to and from dict
    dict1 = model.to_dict()
    model2 = TemplateSpatialModel.from_dict(dict1)
    assert_allclose(model2.lon_0.quantity, 12.0 * u.deg, rtol=1e-3)

    # test dict with an empty parameters list
    dict1["spatial"]["parameters"] = []
    model3 = TemplateSpatialModel.from_dict(dict1)
    assert_allclose(model3.lon_0.quantity, 258.388 * u.deg, rtol=1e-3)

    # test dict without parameters
    dict1["spatial"].pop("parameters")
    model3 = TemplateSpatialModel.from_dict(dict1)
    assert_allclose(model3.lon_0.quantity, 258.388 * u.deg, rtol=1e-3)
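
# Illustrative sketch (assumption, not from the original file): building a
# template model from an in-memory map; with `normalize=True` the map is
# rescaled so it integrates to one, the behaviour exercised below.
def _example_template_from_map():
    template_map = Map.create(map_type="wcs", width=(2, 2), binsz=0.1, unit="sr-1")
    template_map.data += 1.0
    return TemplateSpatialModel(template_map, normalize=True)
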
@requires_data()
@requires_dependency("ipywidgets")
def test_sky_diffuse_map_3d():
    filename = "$GAMMAPY_DATA/fermi-3fhl-gc/gll_iem_v06_gc.fits.gz"
    model = TemplateSpatialModel.read(filename, normalize=False)
    lon = [258.5, 0] * u.deg
    lat = -39.8 * u.deg
    energy = 1 * u.GeV
    val = model(lon, lat, energy)

    with pytest.raises(ValueError):
        model(lon, lat)

    assert model.map.unit == "cm-2 s-1 MeV-1 sr-1"

    val = model(lon, lat, energy)
    assert val.unit == "cm-2 s-1 MeV-1 sr-1"

    res = model.evaluate_geom(model.map.geom)
    assert_allclose(np.sum(res.value), 0.11803847221522712)

    with pytest.raises(TypeError):
        model.plot()

    with mpl_plot_check():
        model.plot_grid()

    with mpl_plot_check():
        model.plot_interactive()


def test_sky_diffuse_map_normalize():
    # define model map with a constant value of 1
    model_map = Map.create(map_type="wcs", width=(10, 5), binsz=0.5, unit="sr-1")
    model_map.data += 1.0
    model = TemplateSpatialModel(model_map)

    # define data map with a different spatial binning
    data_map = Map.create(map_type="wcs", width=(10, 5), binsz=1)
    coords = data_map.geom.get_coord()
    solid_angle = data_map.geom.solid_angle()
    vals = model(coords.lon, coords.lat) * solid_angle

    assert vals.unit == ""
    integral = vals.sum()
    assert_allclose(integral.value, 1, rtol=1e-4)


def test_sky_diffuse_map_copy():
    # define model map with a constant value of 1
    model_map = Map.create(map_type="wcs", width=(1, 1), binsz=0.5, unit="sr-1")
    model_map.data += 1.0

    model = TemplateSpatialModel(model_map, normalize=False)
    assert np.all(model.map.data == model_map.data)

    model.map.data += 1
    # Check that the original map is unchanged
    assert np.all(model_map.data == np.ones_like(model_map.data))

    model = TemplateSpatialModel(model_map, normalize=False, copy_data=False)
    assert np.all(model.map.data == model_map.data)
    model.map.data += 1
    # Check that the original map has also been changed
    assert np.all(model.map.data == model_map.data)

    model_copy = model.copy(copy_data=False)
    model_copy.map.data += 1
    # Check that the original map has also been changed
    assert np.all(model.map.data == model_copy.map.data)
def test_sky_diffuse_map_empty(caplog):
    # define model map with 0 values
    model_map = Map.create(map_type="wcs", width=(1, 1), binsz=0.5, unit="sr-1")

    with caplog.at_level(logging.WARNING):
        model = TemplateSpatialModel(model_map, normalize=True)
        assert "Map values are all zeros. Check and fix this!" in [
            _.message for _ in caplog.records
        ]
        assert np.all(np.isfinite(model.map.data))

    axes = [MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=2)]
    model_map = Map.create(
        map_type="wcs", width=(1, 1), binsz=0.5, unit="sr-1", axes=axes
    )
    model_map.data[0, :, :] = 1

    with caplog.at_level(logging.WARNING):
        model = TemplateSpatialModel(model_map, normalize=True)
        assert (
            "Map values are all zeros in at least one energy bin. Check and fix this!"
            in [_.message for _ in caplog.records]
        )
        assert np.all(np.isfinite(model.map.data))

    # model with nan
    model_map.data[0, 0, 0] = np.nan
    with caplog.at_level(logging.WARNING):
        TemplateSpatialModel(model_map, normalize=True)
        assert "Map has NaN values. Check and fix this!" in [
            _.message for _ in caplog.records
        ]


@pytest.mark.parametrize("model_cls", SPATIAL_MODEL_REGISTRY)
def test_model_from_dict(tmpdir, model_cls):
    if model_cls in [TemplateSpatialModel, TemplateNDSpatialModel]:
        default_map = Map.create(map_type="wcs", width=(1, 1), binsz=0.5, unit="sr-1")
        filename = str(tmpdir / "template.fits")
        model = model_cls(default_map, filename=filename)
        model.write()
    elif model_cls is PiecewiseNormSpatialModel:
        geom = WcsGeom.create(skydir=(0, 0), npix=(2, 2), binsz=0.3, frame="galactic")
        default_coords = MapCoord.create(geom.footprint)
        default_coords["lon"] *= u.deg
        default_coords["lat"] *= u.deg
        model = model_cls(default_coords, frame="galactic")
    else:
        model = model_cls()

    data = model.to_dict()
    model_from_dict = model_cls.from_dict(data)
    assert model.tag == model_from_dict.tag

    bkg_model = FoVBackgroundModel(spatial_model=model, dataset_name="test")
    bkg_model_dict = bkg_model.to_dict()
    bkg_model_from_dict = FoVBackgroundModel.from_dict(bkg_model_dict)
    assert bkg_model_from_dict.spatial_model is not None


def test_evaluate_on_fk5_map():
    # Check if spatial model can be evaluated on a map with FK5 frame
    # Regression test for GH-2402
    header = {}
    header["CDELT1"] = 1.0
    header["CDELT2"] = 1.0
    header["CTYPE1"] = "RA---TAN"
    header["CTYPE2"] = "DEC--TAN"
    header["RADESYS"] = "FK5"
    header["CRVAL1"] = 0
    header["CRVAL2"] = 0
    header["CRPIX1"] = 5
    header["CRPIX2"] = 5

    wcs = WCS(header)
    geom = WcsGeom(wcs, npix=(10, 10))
    model = GaussianSpatialModel(lon_0="0 deg", lat_0="0 deg", sigma="1 deg")
    data = model.evaluate_geom(geom)
    assert data.sum() > 0


def test_evaluate_fk5_model():
    geom = WcsGeom.create(width=(5, 5), binsz=0.1, frame="icrs")
    model = GaussianSpatialModel(
        lon_0="0 deg", lat_0="0 deg", sigma="0.1 deg", frame="fk5"
    )
    data = model.evaluate_geom(geom)
    assert data.sum() > 0


def test_spatial_model_plot():
    model = PointSpatialModel()
    model.covariance = np.diag([0.01, 0.01])

    with mpl_plot_check():
        ax = model.plot()

    with mpl_plot_check():
        model.plot_error(ax=ax, which="all")


models_test = [
    (GaussianSpatialModel, "sigma"),
    (GeneralizedGaussianSpatialModel, "r_0"),
    (DiskSpatialModel, "r_0"),
]


@pytest.mark.parametrize(("model_class", "extension_param"), models_test)
def test_spatial_model_plot_error(model_class, extension_param):
    model = model_class(lon="0d", lat="0d", sigma=0.2 * u.deg, frame="galactic")
    model.lat_0.error = 0.04
    model.lon_0.error = 0.02
    model.parameters[extension_param].error = 0.04
    model.e.error = 0.002

    empty_map = Map.create(
        skydir=model.position, frame=model.frame, width=1, binsz=0.02
    )
    with mpl_plot_check():
        ax = empty_map.plot()
        model.plot_error(ax=ax, which="all")
        model.plot_error(ax=ax, which="position")
        model.plot_error(ax=ax, which="extension")


def test_integrate_region_geom():
    center = SkyCoord("0d", "0d", frame="icrs")
    model = GaussianSpatialModel(
        lon_0="0 deg", lat_0="0 deg", sigma=0.1 * u.deg, frame="icrs"
    )

    radius_large = 1 * u.deg
    circle_large = CircleSkyRegion(center, radius_large)
    radius_small = 0.1 * u.deg
    circle_small = CircleSkyRegion(center, radius_small)

    geom_large, geom_small = (
        RegionGeom(region=circle_large),
        RegionGeom(region=circle_small, binsz_wcs="0.01 deg"),
    )

    integral_large, integral_small = (
        model.integrate_geom(geom_large).data,
        model.integrate_geom(geom_small).data,
    )

    assert_allclose(integral_large[0], 1, rtol=0.001)
    assert_allclose(integral_small[0], 0.3953, rtol=0.001)
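
# Illustrative sketch (assumption, not from the original file): integrate_geom
# gives the flux fraction contained in a region, so a region much larger than
# the model extension should contain essentially all of the flux.
def _example_containment_fraction():
    model = GaussianSpatialModel(
        lon_0="0 deg", lat_0="0 deg", sigma=0.1 * u.deg, frame="icrs"
    )
    region = CircleSkyRegion(SkyCoord("0d", "0d", frame="icrs"), 1 * u.deg)
    return model.integrate_geom(RegionGeom(region=region)).data[0]  # ~1
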
# TODO: solve issue with small radii (e.g. 1e-5) and improve tolerance
@pytest.mark.parametrize("width", np.geomspace(1e-4, 1e-1, 10) * u.deg)
@pytest.mark.parametrize(
    "model",
    [
        (GaussianSpatialModel, "sigma", 6e-3),
        (DiskSpatialModel, "r_0", 0.4),
        (GeneralizedGaussianSpatialModel, "r_0", 3e-4),
    ],
)
def test_integrate_wcs_geom(width, model):
    model_cls, param_name, tolerance = model
    param_dict = {param_name: width}
    spatial_model = model_cls(
        lon_0="0.234 deg", lat_0="-0.172 deg", frame="icrs", **param_dict
    )
    geom = WcsGeom.create(skydir=(0, 0), npix=100, binsz=0.02)

    integrated = spatial_model.integrate_geom(geom)
    assert_allclose(integrated.data.sum(), 1, atol=tolerance)


def test_integrate_geom_no_overlap():
    center = SkyCoord("0d", "0d", frame="icrs")
    model = GaussianSpatialModel(
        lon_0="10.234 deg", lat_0="-0.172 deg", sigma=1e-2 * u.deg, frame="icrs"
    )
    geom = WcsGeom.create(skydir=center, npix=100, binsz=0.02)

    # This should not fail but return a map filled with 0
    integrated = model.integrate_geom(geom)
    assert_allclose(integrated.data, 0.0)


def test_integrate_geom_parameter_issue():
    center = SkyCoord("0d", "0d", frame="icrs")
    model = GaussianSpatialModel(
        lon_0="0.234 deg", lat_0="-0.172 deg", sigma=np.nan * u.deg, frame="icrs"
    )
    geom = WcsGeom.create(skydir=center, npix=100, binsz=0.02)

    # This should not fail but return a map filled with nan
    integrated = model.integrate_geom(geom)
    assert_allclose(integrated.data, np.nan)


def test_integrate_geom_energy_axis():
    center = SkyCoord("0d", "0d", frame="icrs")
    model = GaussianSpatialModel(lon="0d", lat="0d", sigma=0.1 * u.deg, frame="icrs")

    radius = 1 * u.deg
    square = RectangleSkyRegion(center, radius, radius)

    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=10)
    geom = RegionGeom(region=square, axes=[axis])

    integral = model.integrate_geom(geom).data
    assert_allclose(integral, 1, rtol=0.0001)


def test_templatemap_clip():
    model_map = Map.create(map_type="wcs", width=(2, 2), binsz=0.5, unit="sr-1")
    model_map.data += 1.0
    model = TemplateSpatialModel(model_map)
    model.map.data = model.map.data * -1

    lon = np.array([0, 0.2, 0.3]) * u.deg
    lat = np.array([0, 0.2, 0.3]) * u.deg

    val = model.evaluate(lon, lat)
    assert_allclose(val, 0, rtol=0.0001)


def test_piecewise_spatial_model_gc():
    geom = WcsGeom.create(skydir=(0, 0), npix=(2, 2), binsz=0.3, frame="galactic")
    coords = MapCoord.create(geom.footprint)
    coords["lon"] *= u.deg
    coords["lat"] *= u.deg

    model = PiecewiseNormSpatialModel(coords, frame="galactic")

    assert_allclose(model(*geom.to_image().center_coord), 1.0)

    norms = np.arange(coords.shape[0])
    model = PiecewiseNormSpatialModel(coords, norms, frame="galactic")

    assert not model.is_energy_dependent

    expected = np.array([[0, 3], [1, 2]])
    assert_allclose(model(*geom.to_image().get_coord()), expected, atol=1e-5)
    assert_allclose(model.evaluate_geom(geom.to_image()), expected, atol=1e-5)
    assert_allclose(model.evaluate_geom(geom), expected, atol=1e-5)

    model_dict = model.to_dict()
    new_model = PiecewiseNormSpatialModel.from_dict(model_dict)
    assert model_dict == new_model.to_dict()

    assert_allclose(new_model.evaluate_geom(geom.to_image()), expected, atol=1e-5)

    assert_allclose(
        model.evaluate(-0.1 * u.deg, 2.3 * u.deg),
        model.evaluate(359.9 * u.deg, 2.3 * u.deg),
    )
def test_piecewise_spatial_model():
    for lon in range(-360, 360):
        geom = WcsGeom.create(
            skydir=(lon, 2.3), npix=(2, 2), binsz=0.3, frame="galactic"
        )
        coords = MapCoord.create(geom.footprint)
        coords["lon"] *= u.deg
        coords["lat"] *= u.deg

        model = PiecewiseNormSpatialModel(coords, frame="galactic")

        assert_allclose(model(*geom.to_image().center_coord), 1.0)

        norms = np.arange(coords.shape[0])
        model = PiecewiseNormSpatialModel(coords, norms, frame="galactic")

        expected = np.array([[0, 3], [1, 2]])
        assert_allclose(model(*geom.to_image().get_coord()), expected, atol=1e-5)
        assert_allclose(model.evaluate_geom(geom.to_image()), expected, atol=1e-5)
        assert_allclose(model.evaluate_geom(geom), expected, atol=1e-5)

        model_dict = model.to_dict()
        new_model = PiecewiseNormSpatialModel.from_dict(model_dict)
        assert model_dict == new_model.to_dict()

        assert_allclose(new_model.evaluate_geom(geom.to_image()), expected, atol=1e-5)


def test_piecewise_spatial_model_3d():
    axis = MapAxis.from_energy_bounds("1 TeV", "10 TeV", nbin=3)
    geom = WcsGeom.create(
        skydir=(2.4, 2.3), npix=(2, 2), binsz=0.3, frame="galactic", axes=[axis]
    )
    coords = geom.get_coord().flat

    with pytest.raises(ValueError):
        PiecewiseNormSpatialModel(coords, frame="galactic")


@requires_data()
def test_template_ND(tmpdir):
    filename = "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v18/Templates/RXJ1713_2016_250GeV.fits"  # noqa: E501
    map_ = Map.read(filename)
    map_.data[map_.data < 0] = 0
    geom2d = map_.geom
    norm = MapAxis.from_nodes(range(0, 11, 2), interp="lin", name="norm", unit="")
    cste = MapAxis.from_bounds(-1, 1, 3, interp="lin", name="cste", unit="")
    geom = geom2d.to_cube([norm, cste])
    nd_map = WcsNDMap(geom)
    for kn, norm_value in enumerate(norm.center):
        for kp, cste_value in enumerate(cste.center):
            nd_map.data[kp, kn, :, :] = norm_value * map_.data + cste_value

    template = TemplateNDSpatialModel(nd_map, interp_kwargs={"values_scale": "lin"})

    assert len(template.parameters) == 2
    assert_allclose(template.parameters["norm"].value, 5)
    assert_allclose(template.parameters["cste"].value, 0)
    assert_allclose(
        template.evaluate(
            geom2d.center_skydir.ra, geom2d.center_skydir.dec, norm=0, cste=0
        ),
        [0],
    )
    assert_allclose(template.evaluate_geom(geom2d), 5 * map_.data, rtol=0.03, atol=10)

    template.parameters["norm"].value = 2
    template.parameters["cste"].value = 0
    assert_allclose(template.evaluate_geom(geom2d), 2 * map_.data, rtol=0.03, atol=10)

    template.filename = str(tmpdir / "template_ND.fits")
    template.write()
    dict_ = template.to_dict()
    template_new = TemplateNDSpatialModel.from_dict(dict_)
    assert_allclose(template_new.map.data, nd_map.data)
    assert len(template_new.parameters) == 2
    assert template_new.parameters["norm"].value == 2
    assert template_new.parameters["cste"].value == 0


@requires_data()
def test_templatespatial_write(tmpdir):
    filename = "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v18/Templates/RXJ1713_2016_250GeV.fits"
    map_ = Map.read(filename)
    template = TemplateSpatialModel(map_, filename=filename)
    filename_new = str(tmpdir / "template_test.fits")
    template.write(overwrite=True, filename=filename_new)
    assert Path(filename_new).is_file()


@requires_data()
def test_template_spatial_parameters_copy():
    filename = "$GAMMAPY_DATA/catalogs/fermi/Extended_archive_v18/Templates/RXJ1713_2016_250GeV.fits"
    model = TemplateSpatialModel.read(filename, normalize=False)
    model.position = SkyCoord(0, 0, unit="deg", frame="galactic")
    model_copy = model.copy()
    assert_allclose(model.parameters.value, model_copy.parameters.value)
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/gammapy/modeling/models/tests/test_spectral.py0000644000175100001770000013157414721316200023530 0ustar00runnerdocker
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import operator
import pytest
import numpy as np
from numpy.testing import assert_allclose
import astropy.units as u
from astropy.table import Table
import matplotlib.pyplot as plt
from gammapy.maps import MapAxis, RegionNDMap
from gammapy.modeling.models import (
    SPECTRAL_MODEL_REGISTRY,
    BrokenPowerLawSpectralModel,
    CompoundSpectralModel,
    ConstantSpectralModel,
    EBLAbsorptionNormSpectralModel,
    ExpCutoffPowerLaw3FGLSpectralModel,
    ExpCutoffPowerLawNormSpectralModel,
    ExpCutoffPowerLawSpectralModel,
    GaussianSpectralModel,
    LogParabolaNormSpectralModel,
    LogParabolaSpectralModel,
    Model,
    NaimaSpectralModel,
    PiecewiseNormSpectralModel,
    PowerLaw2SpectralModel,
    PowerLawNormSpectralModel,
    PowerLawSpectralModel,
    SkyModel,
    SmoothBrokenPowerLawSpectralModel,
    SuperExpCutoffPowerLaw4FGLDR3SpectralModel,
    SuperExpCutoffPowerLaw4FGLSpectralModel,
    TemplateNDSpectralModel,
    TemplateSpectralModel,
)
from gammapy.utils.compat import COPY_IF_NEEDED
from gammapy.utils.scripts import make_path
from gammapy.utils.testing import (
    assert_quantity_allclose,
    mpl_plot_check,
    requires_data,
    requires_dependency,
)


def table_model():
    energy = MapAxis.from_energy_bounds(0.1 * u.TeV, 100 * u.TeV, 1000).center

    model = PowerLawSpectralModel(
        index=2.3, amplitude="4 cm-2 s-1 TeV-1", reference="1 TeV"
    )
    dnde = model(energy)

    return TemplateSpectralModel(energy, dnde)


TEST_MODELS = [
    dict(
        name="constant",
        model=ConstantSpectralModel(const=4 / u.cm**2 / u.s / u.TeV),
        val_at_2TeV=u.Quantity(4, "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(35.9999999999999, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(198.00000000000006, "TeV cm-2 s-1"),
    ),
    dict(
        name="powerlaw",
        model=PowerLawSpectralModel(
            index=2.3 * u.Unit(""),
            amplitude=4 / u.cm**2 / u.s / u.TeV,
            reference=1 * u.TeV,
        ),
        val_at_2TeV=u.Quantity(4 * 2.0 ** (-2.3), "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(2.9227116204223784, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(6.650836884969039, "TeV cm-2 s-1"),
    ),
    dict(
        name="powerlaw",
        model=PowerLawSpectralModel(
            index=2 * u.Unit(""),
            amplitude=4 / u.cm**2 / u.s / u.TeV,
            reference=1 * u.TeV,
        ),
        val_at_2TeV=u.Quantity(1.0, "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(3.6, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(9.210340371976184, "TeV cm-2 s-1"),
    ),
    dict(
        name="norm-powerlaw",
        model=PowerLawNormSpectralModel(
            tilt=2 * u.Unit(""),
            norm=4.0 * u.Unit(""),
            reference=1 * u.TeV,
        ),
        val_at_2TeV=u.Quantity(1.0, ""),
        integral_1_10TeV=u.Quantity(3.6, "TeV"),
        eflux_1_10TeV=u.Quantity(9.210340371976184, "TeV2"),
    ),
    dict(
        name="powerlaw2",
        model=PowerLaw2SpectralModel(
            amplitude=u.Quantity(2.9227116204223784, "cm-2 s-1"),
            index=2.3 * u.Unit(""),
            emin=1 * u.TeV,
            emax=10 * u.TeV,
        ),
        val_at_2TeV=u.Quantity(4 * 2.0 ** (-2.3), "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(2.9227116204223784, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(6.650836884969039, "TeV cm-2 s-1"),
    ),
    dict(
        name="ecpl",
        model=ExpCutoffPowerLawSpectralModel(
            index=1.6 * u.Unit(""),
            amplitude=4 / u.cm**2 / u.s / u.TeV,
            reference=1 * u.TeV,
            lambda_=0.1 / u.TeV,
        ),
        val_at_2TeV=u.Quantity(1.080321705479446, "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(3.765838739678921, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(9.901735870666526, "TeV cm-2 s-1"),
        e_peak=4 * u.TeV,
    ),
    dict(
        name="norm-ecpl",
        model=ExpCutoffPowerLawNormSpectralModel(
            index=1.6 * u.Unit(""),
            norm=4 * u.Unit(""),
            reference=1 * u.TeV,
            lambda_=0.1 / u.TeV,
        ),
        val_at_2TeV=u.Quantity(1.080321705479446, ""),
        integral_1_10TeV=u.Quantity(3.765838739678921, "TeV"),
        eflux_1_10TeV=u.Quantity(9.901735870666526, "TeV2"),
    ),
u.Unit(""), amplitude=4 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, ecut=10 * u.TeV, ), val_at_2TeV=u.Quantity(0.7349563611124971, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(2.6034046173089, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(5.340285560055799, "TeV cm-2 s-1"), ), dict( name="plsec_4fgl_dr1", model=SuperExpCutoffPowerLaw4FGLSpectralModel( index_1=1.5, index_2=2, amplitude=1 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, expfactor=1e-2, ), val_at_2TeV=u.Quantity(0.3431043087721737, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(1.2125247, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(3.38072082, "TeV cm-2 s-1"), ), dict( name="plsec_4fgl", model=SuperExpCutoffPowerLaw4FGLDR3SpectralModel( index_1=1.5, index_2=2, amplitude=1 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, expfactor=1e-2, ), val_at_2TeV=u.Quantity(0.35212994, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(1.328499, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(4.067067, "TeV cm-2 s-1"), ), dict( name="logpar", model=LogParabolaSpectralModel( alpha=2.3 * u.Unit(""), amplitude=4 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, beta=0.5 * u.Unit(""), ), val_at_2TeV=u.Quantity(0.6387956571420305, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(2.255689748270628, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(3.9586515834989267, "TeV cm-2 s-1"), e_peak=0.74082 * u.TeV, ), dict( name="norm-logpar", model=LogParabolaNormSpectralModel( alpha=2.3 * u.Unit(""), norm=4 * u.Unit(""), reference=1 * u.TeV, beta=0.5 * u.Unit(""), ), val_at_2TeV=u.Quantity(0.6387956571420305, ""), integral_1_10TeV=u.Quantity(2.255689748270628, "TeV"), eflux_1_10TeV=u.Quantity(3.9586515834989267, "TeV2"), ), dict( name="logpar10", model=LogParabolaSpectralModel.from_log10( alpha=2.3 * u.Unit(""), amplitude=4 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, beta=1.151292546497023 * u.Unit(""), ), val_at_2TeV=u.Quantity(0.6387956571420305, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(2.255689748270628, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(3.9586515834989267, "TeV cm-2 s-1"), e_peak=0.74082 * u.TeV, ), dict( name="powerlaw_index1", model=PowerLawSpectralModel( index=1 * u.Unit(""), amplitude=2 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, ), val_at_2TeV=u.Quantity(1.0, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(4.605170185, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(18.0, "TeV cm-2 s-1"), ), dict( name="ecpl_2", model=ExpCutoffPowerLawSpectralModel( index=2.0 * u.Unit(""), amplitude=4 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, lambda_=0.1 / u.TeV, ), val_at_2TeV=u.Quantity(0.81873075, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(2.83075297, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(6.41406327, "TeV cm-2 s-1"), e_peak=np.nan * u.TeV, ), dict( name="GaussianSpectralModel", model=GaussianSpectralModel( amplitude=4 / u.cm**2 / u.s, mean=2 * u.TeV, sigma=0.2 * u.TeV ), val_at_2TeV=u.Quantity(7.978845608028654, "cm-2 s-1 TeV-1"), val_at_3TeV=u.Quantity(2.973439029468601e-05, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(3.9999988533937123, "cm-2 s-1"), integral_infinity=u.Quantity(4, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(7.999998896163037, "TeV cm-2 s-1"), ), dict( name="ecpl", model=ExpCutoffPowerLawSpectralModel( index=1.8 * u.Unit(""), amplitude=4 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, lambda_=0.1 / u.TeV, alpha=0.8, ), val_at_2TeV=u.Quantity(0.871694294554192, "cm-2 s-1 TeV-1"), integral_1_10TeV=u.Quantity(3.026342, "cm-2 s-1"), eflux_1_10TeV=u.Quantity(7.38652453, "TeV cm-2 s-1"), e_peak=1.7677669529663684 * u.TeV, ), dict( name="bpl", model=BrokenPowerLawSpectralModel( 
    dict(
        name="bpl",
        model=BrokenPowerLawSpectralModel(
            index1=1.5 * u.Unit(""),
            index2=2.5 * u.Unit(""),
            amplitude=4 / u.cm**2 / u.s / u.TeV,
            ebreak=0.5 * u.TeV,
        ),
        val_at_2TeV=u.Quantity(0.125, "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(0.45649740094103286, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(0.9669999668731384, "TeV cm-2 s-1"),
    ),
    dict(
        name="sbpl",
        model=SmoothBrokenPowerLawSpectralModel(
            index1=1.5 * u.Unit(""),
            index2=2.5 * u.Unit(""),
            amplitude=4 / u.cm**2 / u.s / u.TeV,
            ebreak=0.5 * u.TeV,
            reference=1 * u.TeV,
            beta=1,
        ),
        val_at_2TeV=u.Quantity(0.28284271247461906, "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(0.9956923907948155, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(2.2372256145972207, "TeV cm-2 s-1"),
    ),
    dict(
        name="sbpl-hard",
        model=SmoothBrokenPowerLawSpectralModel(
            index1=2.5 * u.Unit(""),
            index2=1.5 * u.Unit(""),
            amplitude=4 / u.cm**2 / u.s / u.TeV,
            ebreak=0.5 * u.TeV,
            reference=1 * u.TeV,
            beta=1,
        ),
        val_at_2TeV=u.Quantity(3.5355339059327378, "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(13.522782989735022, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(40.06681812966845, "TeV cm-2 s-1"),
    ),
    dict(
        name="pbpl",
        model=PiecewiseNormSpectralModel(
            energy=[1, 3, 7, 10] * u.TeV,
            norms=[1, 5, 3, 0.5] * u.Unit(""),
        ),
        val_at_2TeV=u.Quantity(2.76058404, ""),
        integral_1_10TeV=u.Quantity(24.758255, "TeV"),
        eflux_1_10TeV=u.Quantity(117.745068, "TeV2"),
    ),
    dict(
        name="pbpllin",
        model=PiecewiseNormSpectralModel(
            energy=[1, 3, 7, 10] * u.TeV,
            norms=[1, 5, 3, 0.5] * u.Unit(""),
            interp="lin",
        ),
        val_at_2TeV=u.Quantity(3.0, ""),
        integral_1_10TeV=u.Quantity(27.24066846, "TeV"),
        eflux_1_10TeV=u.Quantity(133.34487, "TeV2"),
    ),
]

# Add compound models
TEST_MODELS.append(
    dict(
        name="compound6",
        model=TEST_MODELS[0]["model"] + u.Quantity(4, "cm-2 s-1 TeV-1"),
        val_at_2TeV=TEST_MODELS[0]["val_at_2TeV"] * 2,
        integral_1_10TeV=TEST_MODELS[0]["integral_1_10TeV"] * 2,
        eflux_1_10TeV=TEST_MODELS[0]["eflux_1_10TeV"] * 2,
    )
)

TEST_MODELS.append(
    dict(
        name="compound3",
        model=TEST_MODELS[1]["model"] + TEST_MODELS[1]["model"],
        val_at_2TeV=TEST_MODELS[1]["val_at_2TeV"] * 2,
        integral_1_10TeV=TEST_MODELS[1]["integral_1_10TeV"] * 2,
        eflux_1_10TeV=TEST_MODELS[1]["eflux_1_10TeV"] * 2,
    )
)

TEST_MODELS.append(
    dict(
        name="table_model",
        model=table_model(),
        # Values taken from the power-law expectation
        val_at_2TeV=u.Quantity(4 * 2.0 ** (-2.3), "cm-2 s-1 TeV-1"),
        integral_1_10TeV=u.Quantity(2.9227116204223784, "cm-2 s-1"),
        eflux_1_10TeV=u.Quantity(6.650836884969039, "TeV cm-2 s-1"),
    )
)
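
# Illustrative sketch (assumption, not from the original file): the quantities
# tabulated in TEST_MODELS can be reproduced directly from the model API, here
# for the first power-law entry above. The helper name is hypothetical.
def _example_expected_values():
    model = PowerLawSpectralModel(
        index=2.3, amplitude=4 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV
    )
    val_at_2TeV = model(2 * u.TeV)
    integral = model.integral(energy_min=1 * u.TeV, energy_max=10 * u.TeV)
    eflux = model.energy_flux(energy_min=1 * u.TeV, energy_max=10 * u.TeV)
    return val_at_2TeV, integral, eflux
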
# inverse for Gaussian and PiecewiseNormSpectralModel have a degeneracy if not ( isinstance(model, ConstantSpectralModel) or spectrum["name"] == "compound6" or spectrum["name"] == "GaussianSpectralModel" or "pbpl" in spectrum["name"] ): assert_quantity_allclose(model.inverse(value), energy, rtol=0.01) inverse = model.inverse_all(values) for ke, ener in enumerate(energies): assert_quantity_allclose(inverse[ke], energies[ke], rtol=0.01) if "integral_infinity" in spectrum: energy_min = 0 * u.TeV energy_max = 10000 * u.TeV assert_quantity_allclose( model.integral(energy_min=energy_min, energy_max=energy_max), spectrum["integral_infinity"], ) model.to_dict() assert "" in str(model) # check that an array evaluation works (otherwise e.g. plotting raises an error) e_array = [2, 10, 20] * u.TeV e_array = e_array[:, np.newaxis, np.newaxis] val = model(e_array) assert val.shape == e_array.shape assert_quantity_allclose(val[0], spectrum["val_at_2TeV"]) def test_model_unit(): pwl = PowerLawSpectralModel() value = pwl(500 * u.MeV) assert value.unit == "cm-2 s-1 TeV-1" def test_model_plot(): pwl = PowerLawSpectralModel( amplitude=1e-12 * u.Unit("TeV-1 cm-2 s-1"), reference=1 * u.Unit("TeV"), index=2 ) pwl.amplitude.error = 0.1e-12 * u.Unit("TeV-1 cm-2 s-1") energy_axis = MapAxis.from_energy_bounds(1 * u.TeV, 10 * u.TeV, 100) with mpl_plot_check(): pwl.plot((1 * u.TeV, 10 * u.TeV)) with mpl_plot_check(): pwl.plot_error((1 * u.TeV, 10 * u.TeV)) with mpl_plot_check(): pwl.plot([0.08, 20] * u.TeV) with mpl_plot_check(): pwl.plot_error([0.08, 20] * u.TeV) with mpl_plot_check(): pwl.plot([0.08 * u.TeV, 20 * u.TeV]) with mpl_plot_check(): pwl.plot_error([0.08 * u.TeV, 20 * u.TeV]) with mpl_plot_check(): pwl.plot(energy_bounds=energy_axis) with mpl_plot_check(): pwl.plot_error(energy_bounds=energy_axis) def test_model_plot_sed_type(): pwl = PowerLawSpectralModel( amplitude=1e-12 * u.Unit("TeV-1 cm-2 s-1"), reference=1 * u.Unit("TeV"), index=2 ) pwl.amplitude.error = 0.1e-12 * u.Unit("TeV-1 cm-2 s-1") with mpl_plot_check(): ax1 = pwl.plot((1 * u.TeV, 100 * u.TeV), sed_type="dnde") ax2 = pwl.plot_error((1 * u.TeV, 100 * u.TeV), sed_type="dnde") assert ax1.yaxis.units == u.Unit("1 / (s cm2 TeV)") assert ax1.axes.axes.get_ylabel().split()[0] == "dnde" assert ax2.yaxis.units == u.Unit("1 / (s cm2 TeV)") assert ax2.axes.axes.get_ylabel().split()[0] == "dnde" with mpl_plot_check(): ax1 = pwl.plot((1 * u.TeV, 100 * u.TeV), sed_type="e2dnde") ax2 = pwl.plot_error((1 * u.TeV, 100 * u.TeV), sed_type="e2dnde") assert ax1.yaxis.units == u.Unit("erg / (cm2 s)") assert ax1.axes.axes.get_ylabel().split()[0] == "e2dnde" assert ax2.yaxis.units == u.Unit("erg / (cm2 s)") assert ax2.axes.axes.get_ylabel().split()[0] == "e2dnde" with mpl_plot_check(): ax1 = pwl.plot((1 * u.TeV, 100 * u.TeV), sed_type="flux") ax2 = pwl.plot_error((1 * u.TeV, 100 * u.TeV), sed_type="flux") assert ax1.yaxis.units == u.Unit("1 / (s cm2)") assert ax1.axes.axes.get_ylabel().split()[0] == "flux" assert ax2.yaxis.units == u.Unit("1 / (s cm2)") assert ax2.axes.axes.get_ylabel().split()[0] == "flux" with mpl_plot_check(): ax1 = pwl.plot((1 * u.TeV, 100 * u.TeV), sed_type="eflux") ax2 = pwl.plot_error((1 * u.TeV, 100 * u.TeV), sed_type="eflux") assert ax1.yaxis.units == u.Unit("erg / (cm2 s)") assert ax1.axes.axes.get_ylabel().split()[0] == "eflux" assert ax2.yaxis.units == u.Unit("erg / (cm2 s)") assert ax2.axes.axes.get_ylabel().split()[0] == "eflux" def test_to_from_dict(): spectrum = TEST_MODELS[1] model = spectrum["model"] model_dict = model.to_dict() # 
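# Illustrative aside: the four SED conventions exercised by
# test_model_plot_sed_type above are, schematically (hypothetical standalone
# calls, mirroring the assertions there):
#
#     pwl.plot(energy_bounds, sed_type="dnde")    # dN/dE
#     pwl.plot(energy_bounds, sed_type="e2dnde")  # E**2 dN/dE
#     pwl.plot(energy_bounds, sed_type="flux")    # integral photon flux
#     pwl.plot(energy_bounds, sed_type="eflux")   # integral energy flux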
Here we reverse the order of parameters list to ensure assignment is correct model_dict["spectral"]["parameters"].reverse() model_class = SPECTRAL_MODEL_REGISTRY.get_cls(model_dict["spectral"]["type"]) new_model = model_class.from_dict(model_dict) assert isinstance(new_model, PowerLawSpectralModel) actual = [par.value for par in new_model.parameters] desired = [par.value for par in model.parameters] assert_quantity_allclose(actual, desired) actual = [par.frozen for par in new_model.parameters] desired = [par.frozen for par in model.parameters] assert_allclose(actual, desired) new_model = Model.from_dict(model_dict) assert isinstance(new_model, PowerLawSpectralModel) actual = [par.value for par in new_model.parameters] desired = [par.value for par in model.parameters] assert_quantity_allclose(actual, desired) actual = [par.frozen for par in new_model.parameters] desired = [par.frozen for par in model.parameters] assert_allclose(actual, desired) def test_to_from_dict_partial_input(caplog): spectrum = TEST_MODELS[1] model = spectrum["model"] model_dict = model.to_dict() # Here we remove the reference energy model_dict["spectral"]["parameters"].remove(model_dict["spectral"]["parameters"][2]) model_class = SPECTRAL_MODEL_REGISTRY.get_cls(model_dict["spectral"]["type"]) new_model = model_class.from_dict(model_dict) assert isinstance(new_model, PowerLawSpectralModel) actual = [par.value for par in new_model.parameters] desired = [par.value for par in model.parameters] assert_quantity_allclose(actual, desired) actual = [par.frozen for par in new_model.parameters] desired = [par.frozen for par in model.parameters] assert_allclose(actual, desired) assert "WARNING" in [_.levelname for _ in caplog.records] assert ( "Parameter 'reference' not defined in YAML file. 
Using default value: 1.0 TeV" in [_.message for _ in caplog.records]
    )


def test_to_from_dict_compound():
    spectrum = TEST_MODELS[-3]
    model = spectrum["model"]
    assert spectrum["name"] == "compound6"
    model_dict = model.to_dict()
    assert model_dict["spectral"]["operator"] == "add"
    model_class = SPECTRAL_MODEL_REGISTRY.get_cls(model_dict["spectral"]["type"])
    new_model = model_class.from_dict(model_dict)
    assert isinstance(new_model, CompoundSpectralModel)
    actual = [par.value for par in new_model.parameters]
    desired = [par.value for par in model.parameters]
    assert_quantity_allclose(actual, desired)


def test_to_from_dict_piecewise_lin():
    spectrum = TEST_MODELS[-4]
    model = spectrum["model"]
    assert spectrum["name"] == "pbpllin"
    model_dict = model.to_dict()
    assert model_dict["spectral"]["interp"] == "lin"
    model_class = SPECTRAL_MODEL_REGISTRY.get_cls(model_dict["spectral"]["type"])
    new_model = model_class.from_dict(model_dict)
    assert isinstance(new_model, PiecewiseNormSpectralModel)
    actual = [par.value for par in new_model.parameters]
    desired = [par.value for par in model.parameters]
    assert_quantity_allclose(actual, desired)
    assert new_model._interp == model._interp
    model_dict["spectral"].pop("interp")
    new_model_default = model_class.from_dict(model_dict)
    assert isinstance(new_model_default, PiecewiseNormSpectralModel)
    assert new_model_default._interp == "log"


def test_to_from_dict_piecewise():
    spectrum = TEST_MODELS[-5]
    model = spectrum["model"]
    assert spectrum["name"] == "pbpl"
    model_dict = model.to_dict()
    assert model_dict["spectral"]["interp"] == "log"
    model_class = SPECTRAL_MODEL_REGISTRY.get_cls(model_dict["spectral"]["type"])
    new_model = model_class.from_dict(model_dict)
    assert isinstance(new_model, PiecewiseNormSpectralModel)
    actual = [par.value for par in new_model.parameters]
    desired = [par.value for par in model.parameters]
    assert_quantity_allclose(actual, desired)
    assert new_model._interp == model._interp
    model_dict["spectral"].pop("interp")
    new_model_default = model_class.from_dict(model_dict)
    assert isinstance(new_model_default, PiecewiseNormSpectralModel)
    assert new_model_default._interp == "log"


@requires_data()
def test_table_model_from_file():
    filename = "$GAMMAPY_DATA/ebl/ebl_franceschini.fits.gz"
    absorption_z03 = TemplateSpectralModel.read_xspec_model(
        filename=filename, param=0.3
    )
    value = absorption_z03(1 * u.TeV)
    assert_allclose(value, 1)


@requires_data()
def test_absorption():
    # absorption values for a given redshift
    redshift = 0.117
    absorption = EBLAbsorptionNormSpectralModel.read_builtin(
        "dominguez", redshift=redshift
    )
    # Spectral model corresponding to PKS 2155-304 (quiescent state)
    index = 3.53
    amplitude = 1.81 * 1e-12 * u.Unit("cm-2 s-1 TeV-1")
    reference = 1 * u.TeV
    pwl = PowerLawSpectralModel(index=index, amplitude=amplitude, reference=reference)
    # EBL + PWL model
    model = pwl * absorption
    desired = u.Quantity(5.140765e-13, "TeV-1 s-1 cm-2")
    assert_quantity_allclose(model(1 * u.TeV), desired, rtol=1e-3)
    assert model.model2.alpha_norm.value == 1.0
    # EBL + PWL model: check that for an EBL norm of 0 the model reduces to the PWL
    model.parameters["alpha_norm"].value = 0
    assert_quantity_allclose(model(1 * u.TeV), pwl(1 * u.TeV), rtol=1e-3)
    # EBL + PWL model: test with a norm different from 1
    absorption = EBLAbsorptionNormSpectralModel.read_builtin(
        "dominguez", redshift=redshift, alpha_norm=1.5
    )
    model = pwl * absorption
    desired = u.Quantity(2.739695e-13, "TeV-1 s-1 cm-2")
    assert model.model2.alpha_norm.value == 1.5
    assert_quantity_allclose(model(1 * u.TeV), desired, rtol=1e-3)
    # Test error
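# Illustrative standalone sketch of the EBL-absorbed model built above
# (requires $GAMMAPY_DATA; all values mirror the test, none are new):
#
#     absorption = EBLAbsorptionNormSpectralModel.read_builtin(
#         "dominguez", redshift=0.117
#     )
#     pwl = PowerLawSpectralModel(index=3.53, amplitude="1.81e-12 cm-2 s-1 TeV-1")
#     absorbed = pwl * absorption  # CompoundSpectralModel: pwl(E) * absorption(E)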
propagation model.model1.amplitude.error = 0.1 * model.model1.amplitude.value dnde, dnde_err = model.evaluate_error(1 * u.TeV) assert_allclose(dnde_err / dnde, 0.1) @requires_data() def test_absorbed_extrapolate(): ebl_model = "dominguez" z = 0.0001 alpha_norm = 1 absorption = EBLAbsorptionNormSpectralModel.read_builtin(ebl_model) values = absorption.evaluate(1 * u.TeV, z, alpha_norm) assert_allclose(values, 1) def test_ecpl_integrate(): # regression test to check the numerical integration for small energy bins ecpl = ExpCutoffPowerLawSpectralModel() value = ecpl.integral(1 * u.TeV, 1.1 * u.TeV) assert value.isscalar assert_quantity_allclose(value, 8.380714e-14 * u.Unit("s-1 cm-2")) def test_pwl_pivot_energy(): pwl = PowerLawSpectralModel(amplitude="5.35510540e-11 cm-2 s-1 TeV-1") assert_quantity_allclose(pwl.pivot_energy, np.nan * u.TeV, rtol=1e-5) pwl.covariance = [ [0.08**2, 6.56889e-14, 0], [6.56889e-14, (5.5e-12) ** 2, 0], [0, 0, 0], ] assert_quantity_allclose(pwl.pivot_energy, 1.2112653 * u.TeV, rtol=1e-5) ecpl = ExpCutoffPowerLawSpectralModel( amplitude="5.35510540e-11 cm-2 s-1 TeV-1", lambda_=0.001 * (1 / u.TeV), index=2 ) ecpl.covariance = [ [0.08**2, 6.56889e-14, 0, 0, 0], [6.56889e-14, (5.5e-12) ** 2, 0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0], [0, 0, 0, 0, 0], ] assert_quantity_allclose(pwl.pivot_energy, ecpl.pivot_energy, rtol=1e-5) def test_num_pivot_energy(): lp = LogParabolaSpectralModel( amplitude="5.82442e-11 cm-2 s-1 GeV-1", reference="17.337 GeV", alpha="1.9134", beta="0.2217", ) lp.amplitude.error = "2.8804e-12 cm-2 s-1 GeV-1" assert_quantity_allclose(lp.pivot_energy, np.nan * u.GeV, rtol=1e-5) lp.alpha.error = "0.1126" lp.beta.error = "0.0670" assert_quantity_allclose(lp.pivot_energy, 17.337042 * u.GeV, rtol=1e-5) def test_template_spectral_model_evaluate_tiny(): energy = np.array([1.00000000e06, 1.25892541e06, 1.58489319e06, 1.99526231e06]) values = np.array([4.39150790e-38, 1.96639562e-38, 8.80497507e-39, 3.94262401e-39]) model = TemplateSpectralModel( energy=energy, values=values * u.Unit("MeV-1 s-1 sr-1") ) result = model(energy) tiny = np.finfo(np.float32).tiny mask = abs(values) - tiny > tiny np.testing.assert_allclose( values[mask] / values.max(), result[mask].value / values.max() ) mask = abs(result.value) - tiny <= tiny assert np.all(result[mask] == 0.0) def test_template_spectral_model_single_value(): energy = [1] * u.TeV values = [1e-12] * u.Unit("TeV-1 s-1 cm-2") model = TemplateSpectralModel(energy=energy, values=values) result = model(energy=[0.5, 2] * u.TeV) assert_allclose(result.data, 1e-12) data = model.to_dict() model2 = TemplateSpectralModel.from_dict(data) assert model2.to_dict() == data def test_template_spectral_model_compound(): energy = [1.00e06, 1.25e06, 1.58e06, 1.99e06] * u.MeV values = [4.39e-7, 1.96e-7, 8.80e-7, 3.94e-7] * u.Unit("MeV-1 s-1 sr-1") template = TemplateSpectralModel(energy=energy, values=values) correction = PowerLawNormSpectralModel(norm=2) model = CompoundSpectralModel(template, correction, operator=operator.mul) assert np.allclose(model(energy), 2 * values) model_mul = template * correction assert isinstance(model_mul, CompoundSpectralModel) assert np.allclose(model_mul(energy), 2 * values) model_dict = model.to_dict() assert model_dict["spectral"]["operator"] == "mul" model_class = SPECTRAL_MODEL_REGISTRY.get_cls(model_dict["spectral"]["type"]) new_model = model_class.from_dict(model_dict) assert isinstance(new_model, CompoundSpectralModel) assert np.allclose(new_model(energy), 2 * values) def 
test_template_spectral_model_options():
    energy = [1.00e06, 1.25e06, 1.58e06, 1.99e06] * u.MeV
    values = [4.39e-7, 1.96e-7, 8.80e-7, 3.94e-7] * u.Unit("MeV-1 s-1 sr-1")
    model = TemplateSpectralModel(
        energy=energy,
        values=values,
        interp_kwargs={"extrapolate": True, "method": "linear"},
    )
    assert np.allclose(model(energy), values)


def test_covariance_spectral_model_compound():
    model = TEST_MODELS[2]["model"] + TEST_MODELS[1]["model"]
    covar = np.eye(len(model.parameters))
    covar[0, -1] = 1.0
    covar[-1, 0] = 1.0
    covar[0, 1] = 1.0
    covar[-1, -3] = 1.0
    model.covariance = covar
    assert_allclose(model.covariance.data, covar)
    assert_allclose(model.model1.covariance.data, covar[:3, :3])
    assert_allclose(model.model2.covariance.data, covar[-3:, -3:])


def test_evaluate_spectral_model_compound():
    energy = [1.00e06, 1.25e06, 1.58e06, 1.99e06] * u.MeV
    model = TEST_MODELS[-2]["model"]
    values = model.evaluate(energy, *[p.quantity for p in model.parameters])
    assert_allclose(values, model(energy))
    model = TEST_MODELS[-3]["model"]
    values = model.evaluate(energy, *[p.quantity for p in model.parameters])
    assert_allclose(values, model(energy))


def test_evaluate_nested_spectral_model_compound():
    energy = [1.00e06, 1.25e06, 1.58e06, 1.99e06] * u.MeV
    model1 = TEST_MODELS[-2]["model"]
    model2 = TEST_MODELS[-3]["model"]
    model = model1 + model2
    values = model.evaluate(energy, *[p.quantity for p in model.parameters])
    assert_allclose(values, model(energy))


@requires_dependency("naima")
class TestNaimaModel:
    # Used to test model value at 2 TeV
    energy = 2 * u.TeV

    # Used to test model integral and energy flux
    energy_min = 1 * u.TeV
    energy_max = 10 * u.TeV

    # Used to test that array evaluation works
    e_array = [2, 10, 20] * u.TeV
    e_array = e_array[:, np.newaxis, np.newaxis]

    def test_pion_decay(self):
        import naima

        particle_distribution = naima.models.PowerLaw(
            amplitude=2e33 / u.eV, e_0=10 * u.TeV, alpha=2.5
        )
        radiative_model = naima.radiative.PionDecay(
            particle_distribution, nh=1 * u.cm**-3
        )
        model = NaimaSpectralModel(radiative_model)
        for p in model.parameters:
            assert p._type == "spectral"

        val_at_2TeV = 9.725347355450884e-14 * u.Unit("cm-2 s-1 TeV-1")
        integral_1_10TeV = 3.530537143620737e-13 * u.Unit("cm-2 s-1")
        eflux_1_10TeV = 7.643559573105779e-13 * u.Unit("TeV cm-2 s-1")

        value = model(self.energy)
        assert_quantity_allclose(value, val_at_2TeV)
        assert_quantity_allclose(
            model.integral(energy_min=self.energy_min, energy_max=self.energy_max),
            integral_1_10TeV,
        )
        assert_quantity_allclose(
            model.energy_flux(energy_min=self.energy_min, energy_max=self.energy_max),
            eflux_1_10TeV,
        )
        val = model(self.e_array)
        assert val.shape == self.e_array.shape

        model.amplitude.error = 0.1 * model.amplitude.value
        out = model.evaluate_error(1 * u.TeV)
        assert_allclose(out.data, [5.266068e-13, 5.266068e-14], rtol=1e-3)

    def test_ic(self):
        import naima

        particle_distribution = naima.models.ExponentialCutoffBrokenPowerLaw(
            amplitude=2e33 / u.eV,
            e_0=10 * u.TeV,
            alpha_1=2.5,
            alpha_2=2.7,
            e_break=900 * u.GeV,
            e_cutoff=10 * u.TeV,
        )
        radiative_model = naima.radiative.InverseCompton(
            particle_distribution, seed_photon_fields=["CMB"]
        )
        model = NaimaSpectralModel(radiative_model)
        for p in model.parameters:
            assert p._type == "spectral"

        val_at_2TeV = 4.347836316893546e-12 * u.Unit("cm-2 s-1 TeV-1")
        integral_1_10TeV = 1.595813e-11 * u.Unit("cm-2 s-1")
        eflux_1_10TeV = 2.851283e-11 * u.Unit("TeV cm-2 s-1")

        value = model(self.energy)
        assert_quantity_allclose(value, val_at_2TeV)
        assert_quantity_allclose(
            model.integral(energy_min=self.energy_min, energy_max=self.energy_max),
integral_1_10TeV, rtol=1e-5, ) assert_quantity_allclose( model.energy_flux(energy_min=self.energy_min, energy_max=self.energy_max), eflux_1_10TeV, rtol=1e-5, ) val = model(self.e_array) assert val.shape == self.e_array.shape def test_synchrotron(self): import naima particle_distribution = naima.models.LogParabola( amplitude=2e33 / u.eV, e_0=10 * u.TeV, alpha=1.3, beta=0.5 ) radiative_model = naima.radiative.Synchrotron(particle_distribution, B=2 * u.G) model = NaimaSpectralModel(radiative_model) for p in model.parameters: assert p._type == "spectral" val_at_2TeV = 1.0565840392550432e-24 * u.Unit("cm-2 s-1 TeV-1") integral_1_10TeV = 4.449186e-13 * u.Unit("cm-2 s-1") eflux_1_10TeV = 4.594121e-13 * u.Unit("TeV cm-2 s-1") value = model(self.energy) assert_quantity_allclose(value, val_at_2TeV) assert_quantity_allclose( model.integral(energy_min=self.energy_min, energy_max=self.energy_max), integral_1_10TeV, rtol=1e-5, ) assert_quantity_allclose( model.energy_flux(energy_min=self.energy_min, energy_max=self.energy_max), eflux_1_10TeV, rtol=1e-5, ) val = model(self.e_array) assert val.shape == self.e_array.shape model.B.value = 3 # update B val_at_2TeV = 5.1985064062296e-16 * u.Unit("cm-2 s-1 TeV-1") value = model(self.energy) assert_quantity_allclose(value, val_at_2TeV) def test_ssc(self): import naima ECBPL = naima.models.ExponentialCutoffBrokenPowerLaw( amplitude=3.699e36 / u.eV, e_0=1 * u.TeV, e_break=0.265 * u.TeV, alpha_1=1.5, alpha_2=3.233, e_cutoff=1863 * u.TeV, beta=2.0, ) radiative_model = naima.radiative.InverseCompton( ECBPL, seed_photon_fields=[ "CMB", ["FIR", 70 * u.K, 0.5 * u.eV / u.cm**3], ["NIR", 5000 * u.K, 1 * u.eV / u.cm**3], ], Eemax=50 * u.PeV, Eemin=0.1 * u.GeV, ) B = 125 * u.uG radius = 2.1 * u.pc nested_models = {"SSC": {"B": B, "radius": radius}} model = NaimaSpectralModel(radiative_model, nested_models=nested_models) assert_quantity_allclose(model.B.quantity, B) assert_quantity_allclose(model.radius.quantity, radius) val_at_2TeV = 1.6703761561806372e-11 * u.Unit("cm-2 s-1 TeV-1") value = model(self.energy) assert_quantity_allclose(value, val_at_2TeV, rtol=1e-5) model.parameters["B"].value = 100 val_at_2TeV = 1.441331153167876e-11 * u.Unit("cm-2 s-1 TeV-1") value = model(self.energy) assert_quantity_allclose(value, val_at_2TeV, rtol=1e-5) def test_bad_init(self): import naima particle_distribution = naima.models.PowerLaw( amplitude=2e33 / u.eV, e_0=10 * u.TeV, alpha=2.5 ) radiative_model = naima.radiative.PionDecay( particle_distribution, nh=1 * u.cm**-3 ) model = NaimaSpectralModel(radiative_model) with pytest.raises(NotImplementedError): NaimaSpectralModel.from_dict(model.to_dict()) with pytest.raises(NotImplementedError): NaimaSpectralModel.from_parameters(model.parameters) def test_cache(self): import naima particle_distribution = naima.models.ExponentialCutoffPowerLaw( 1e30 / u.eV, 10 * u.TeV, 3.0, 30 * u.TeV ) radiative_model = naima.radiative.InverseCompton( particle_distribution, seed_photon_fields=["CMB", ["FIR", 26.5 * u.K, 0.415 * u.eV / u.cm**3]], Eemin=100 * u.GeV, ) model = NaimaSpectralModel(radiative_model, distance=1.5 * u.kpc) opts = { "energy_bounds": [10 * u.GeV, 80 * u.TeV], "sed_type": "e2dnde", } _, ax = plt.subplots() model.plot(label="IC (total)", ax=ax, **opts) for seed, ls in zip(["CMB", "FIR"], ["-", "--"]): model = NaimaSpectralModel(radiative_model, seed=seed, distance=1.5 * u.kpc) model.plot(label=f"IC ({seed})", ls=ls, color="gray", **opts) skymodel = SkyModel(model) # fail if cache is on : _, ax = plt.subplots() 
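# Illustrative minimal NaimaSpectralModel construction (mirrors the tests
# above; requires the optional "naima" dependency):
#
#     import naima
#     particle_distribution = naima.models.PowerLaw(
#         amplitude=2e33 / u.eV, e_0=10 * u.TeV, alpha=2.5
#     )
#     radiative_model = naima.radiative.PionDecay(
#         particle_distribution, nh=1 * u.cm**-3
#     )
#     model = NaimaSpectralModel(radiative_model, distance=1.5 * u.kpc)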
skymodel.spectral_model.plot( energy_bounds=[10 * u.GeV, 80 * u.TeV], energy_power=2, ax=ax ) assert not radiative_model._memoize class TestSpectralModelErrorPropagation: """Test spectral model error propagation. https://github.com/gammapy/gammapy/blob/master/docs/development/pigs/pig-014.rst#proposal https://nbviewer.jupyter.org/github/gammapy/gammapy-extra/blob/master/experiments/uncertainty_estimation_prototype.ipynb """ def setup_method(self): self.model = LogParabolaSpectralModel( amplitude=3.76e-11 * u.Unit("cm-2 s-1 TeV-1"), reference=1 * u.TeV, alpha=2.44, beta=0.25, ) self.model.covariance = [ [1.31e-23, 0, -6.80e-14, 3.04e-13], [0, 0, 0, 0], [-6.80e-14, 0, 0.00899, 0.00904], [3.04e-13, 0, 0.00904, 0.0284], ] def test_evaluate_error_scalar(self): # evaluate_error on scalar out = self.model.evaluate_error(1 * u.TeV) assert isinstance(out, u.Quantity) assert out.unit == "cm-2 s-1 TeV-1" assert out.shape == (2,) assert_allclose(out.data, [3.7600e-11, 3.6193e-12], rtol=1e-3) def test_evaluate_error_array(self): out = self.model.evaluate_error([1, 100] * u.TeV) assert out.shape == (2, 2) expected = [[3.76e-11, 2.469e-18], [3.619e-12, 9.375e-18]] assert_allclose(out.data, expected, rtol=1e-3) def test_evaluate_error_unit(self): out = self.model.evaluate_error(1e6 * u.MeV) assert out.unit == "cm-2 s-1 TeV-1" assert_allclose(out.data, [3.760e-11, 3.6193e-12], rtol=1e-3) def test_integral_error(self): out = self.model.integral_error(1 * u.TeV, 10 * u.TeV) assert out.unit == "cm-2 s-1" assert out.shape == (2,) assert_allclose(out.data, [2.197e-11, 2.796e-12], rtol=1e-3) def test_energy_flux_error(self): out = self.model.energy_flux_error(1 * u.TeV, 10 * u.TeV) assert out.unit == "TeV cm-2 s-1" assert out.shape == (2,) assert_allclose(out.data, [4.119e-11, 8.157e-12], rtol=1e-3) def test_logpar_index_error(): model = LogParabolaSpectralModel( amplitude=3.81e-11 / u.cm**2 / u.s / u.TeV, reference=1 * u.TeV, alpha=2.19, beta=0.226, ) model.alpha.error = 0.4 out = model.spectral_index_error(energy=1.0 * u.TeV) assert_allclose(out, [2.19, 0.4], rtol=1e-3) def test_dnde_error_ecpl_model(): # Regression test for ECPL model # https://github.com/gammapy/gammapy/issues/2007 model = ExpCutoffPowerLawSpectralModel( amplitude=2.076183759227292e-12 * u.Unit("cm-2 s-1 TeV-1"), index=1.8763343736076483, lambda_=0.08703226432146616 * u.Unit("TeV-1"), reference=1 * u.TeV, ) model.covariance = [ [0.00204191498, -1.507724e-14, 0.0, -0.001834819, 0.0], [-1.507724e-14, 1.6864740e-25, 0.0, 1.854251e-14, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0], [-0.001834819175, 1.8542517e-14, 0.0, 0.0032559101, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0], ] out = model.evaluate_error(1 * u.TeV) assert_allclose(out.data, [1.903129e-12, 2.979976e-13], rtol=1e-3) out = model.evaluate_error(0.1 * u.TeV) assert_allclose(out.data, [1.548176e-10, 1.933612e-11], rtol=1e-3) def test_integral_error_power_law(): energy = np.linspace(1 * u.TeV, 10 * u.TeV, 10) energy_min = energy[:-1] energy_max = energy[1:] powerlaw = PowerLawSpectralModel() powerlaw.parameters["index"].error = 0.4 powerlaw.parameters["amplitude"].error = 1e-13 flux, flux_error = powerlaw.integral_error(energy_min, energy_max) assert_allclose(flux.value[0] / 1e-13, 5.0, rtol=1e-3) assert_allclose(flux_error.value[0] / 1e-14, 7.915984, rtol=1e-3) def test_integral_error_exp_cut_off_power_law(): energy = np.linspace(1 * u.TeV, 10 * u.TeV, 10) energy_min = energy[:-1] energy_max = energy[1:] exppowerlaw = ExpCutoffPowerLawSpectralModel() exppowerlaw.parameters["index"].error = 0.4 
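# Illustrative aside: because dN/dE is linear in ``amplitude``, an error on
# the amplitude alone propagates to the same relative error on the flux,
# which several of the tests above rely on. A minimal sketch:
#
#     pwl = PowerLawSpectralModel(amplitude="1e-12 cm-2 s-1 TeV-1")
#     pwl.amplitude.error = 0.1 * pwl.amplitude.value
#     dnde, dnde_err = pwl.evaluate_error(1 * u.TeV)
#     # dnde_err / dnde ~= 0.1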
exppowerlaw.parameters["amplitude"].error = 1e-13 exppowerlaw.parameters["lambda_"].error = 0.03 flux, flux_error = exppowerlaw.integral_error(energy_min, energy_max) assert_allclose(flux.value[0] / 1e-13, 5.05855622, rtol=0.01) assert_allclose(flux_error.value[0] / 1e-14, 8.552617, rtol=0.01) def test_energy_flux_error_power_law(): energy_min = 1 * u.TeV energy_max = 10 * u.TeV powerlaw = PowerLawSpectralModel() powerlaw.parameters["index"].error = 0.4 powerlaw.parameters["amplitude"].error = 1e-13 enrg_flux, enrg_flux_error = powerlaw.energy_flux_error(energy_min, energy_max) assert_allclose(enrg_flux.value / 1e-12, 2.303, rtol=0.001) assert_allclose(enrg_flux_error.value / 1e-12, 1.085, rtol=0.001) def test_energy_flux_error_exp_cutoff_power_law(): energy_min = 1 * u.TeV energy_max = 10 * u.TeV exppowerlaw = ExpCutoffPowerLawSpectralModel() exppowerlaw.parameters["index"].error = 0.4 exppowerlaw.parameters["amplitude"].error = 1e-13 exppowerlaw.parameters["lambda_"].error = 0.03 enrg_flux, enrg_flux_error = exppowerlaw.energy_flux_error(energy_min, energy_max) assert_allclose(enrg_flux.value / 1e-12, 2.788, rtol=0.001) assert_allclose(enrg_flux_error.value / 1e-12, 1.419, rtol=0.001) def test_integral_exp_cut_off_power_law_large_number_of_bins(): energy = np.geomspace(1, 10, 100) * u.TeV energy_min = energy[:-1] energy_max = energy[1:] exppowerlaw = ExpCutoffPowerLawSpectralModel( amplitude="1e-11 TeV-1 cm-2 s-1", index=2 ) exppowerlaw.parameters["lambda_"].value = 1e-3 powerlaw = PowerLawSpectralModel(amplitude="1e-11 TeV-1 cm-2 s-1", index=2) expected_flux = powerlaw.integral(energy_min, energy_max) flux = exppowerlaw.integral(energy_min, energy_max) assert_allclose(flux.value, expected_flux.value, rtol=0.01) def test_template_ND(tmpdir): energy_axis = MapAxis.from_bounds( 1.0, 100, 10, interp="log", name="energy_true", unit="GeV" ) norm = MapAxis.from_bounds(0, 10, 10, interp="lin", name="norm", unit="") tilt = MapAxis.from_bounds(-1.0, 1, 5, interp="lin", name="tilt", unit="") region_map = RegionNDMap.create( region="icrs;point(83.63, 22.01)", axes=[energy_axis, norm, tilt] ) region_map.data[:, :, :5, 0, 0] = 1 region_map.data[:, :, 5:, 0, 0] = 2 template = TemplateNDSpectralModel(region_map) assert len(template.parameters) == 2 assert template.parameters["norm"].value == 5 assert template.parameters["tilt"].value == 0 assert_allclose(template([1, 100, 1000] * u.GeV), [1.0, 2.0, 2.0]) template.parameters["norm"].value = 1 template.filename = str(tmpdir / "template_ND.fits") template.write() dict_ = template.to_dict() template_new = TemplateNDSpectralModel.from_dict(dict_) assert_allclose(template_new.map.data, region_map.data) assert len(template_new.parameters) == 2 assert template_new.parameters["norm"].value == 1 assert template_new.parameters["tilt"].value == 0 def test_template_ND_no_energy(tmpdir): norm = MapAxis.from_bounds(0, 10, 10, interp="lin", name="norm", unit="") tilt = MapAxis.from_bounds(-1.0, 1, 5, interp="lin", name="tilt", unit="") region_map = RegionNDMap.create( region="icrs;point(83.63, 22.01)", axes=[norm, tilt] ) region_map.data[:, :5, 0, 0] = 1 region_map.data[:, 5:, 0, 0] = 2 with pytest.raises(ValueError): TemplateNDSpectralModel(region_map) @requires_data() def test_template_ND_EBL(tmpdir): # TODO: add RegionNDMap.read(format="xspec") # Create EBL data array filename = "$GAMMAPY_DATA/ebl/ebl_franceschini.fits.gz" filename = make_path(filename) table_param = Table.read(filename, hdu="PARAMETERS") npar = len(table_param) par_axes = [] idx_data = [] for k in 
range(npar): name = table_param["NAME"][k].lower().strip() param, idx = np.unique(table_param[0]["VALUE"], return_index=True) par_axes.append( MapAxis(param, node_type="center", interp="lin", name=name, unit="") ) idx_data.append(idx) idx_data.append(Ellipsis) idx_data = tuple(idx_data) # Get energy values table_energy = Table.read(filename, hdu="ENERGIES") energy_lo = u.Quantity( table_energy["ENERG_LO"], "keV", copy=COPY_IF_NEEDED ) # unit not stored in file energy_hi = u.Quantity( table_energy["ENERG_HI"], "keV", copy=COPY_IF_NEEDED ) # unit not stored in file energy = np.sqrt(energy_lo * energy_hi) # Get spectrum values table_spectra = Table.read(filename, hdu="SPECTRA") energy_axis = MapAxis(energy, node_type="center", interp="log", name="energy_true") region_map = RegionNDMap.create( region="galactic;point(0, 0)", axes=[energy_axis] + par_axes ) # TODO: here we use a fake position, is it possible to allow region=None ? data = table_spectra["INTPSPEC"].data[idx_data] region_map.data[:, :, 0, 0] = data template = TemplateNDSpectralModel(region_map) assert len(template.parameters) == 1 assert_allclose(template.parameters["redshift"].value, 1.001, rtol=1e-3) expected = [1.132092e00, 4.967878e-01, 1.596544e-06] assert_allclose(template([1, 100, 1000] * u.GeV), expected, rtol=1e-3) template.parameters["redshift"].value = 0.1 template.filename = str(tmpdir / "template_ND_ebl_franceschini.fits") template.write() dict_ = template.to_dict() template_new = TemplateNDSpectralModel.from_dict(dict_) assert_allclose(template_new.map.data, region_map.data) assert len(template.parameters) == 1 assert_allclose(template.parameters["redshift"].value, 0.1) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/test_spectral_cosmic_ray.py0000644000175100001770000000235014721316200025725 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose import astropy.units as u from gammapy.modeling.models import create_cosmic_ray_spectral_model COSMIC_RAY_SPECTRA = [ {"name": "proton", "dnde": 1.856522e-01, "flux": 7.096247e-01, "index": 2.70}, {"name": "N", "dnde": 1.449504e-01, "flux": 5.509215e-01, "index": 2.64}, {"name": "Si", "dnde": 5.646618e-02, "flux": 2.149887e-01, "index": 2.66}, {"name": "Fe", "dnde": 2.720231e-02, "flux": 1.03305e-01, "index": 2.63}, {"name": "electron", "dnde": 1.013671e-04, "flux": 4.691975e-04, "index": 3.428318}, ] @pytest.mark.parametrize("spec", COSMIC_RAY_SPECTRA, ids=lambda _: _["name"]) def test_cosmic_ray_spectrum(spec): cr_spectrum = create_cosmic_ray_spectral_model(particle=spec["name"]) dnde = cr_spectrum(2 * u.TeV) assert_allclose(dnde.value, spec["dnde"], rtol=1e-3) assert dnde.unit == "m-2 s-1 TeV-1" flux = cr_spectrum.integral(1 * u.TeV, 1e3 * u.TeV) assert_allclose(flux.value, spec["flux"], rtol=1e-3) assert flux.unit == "m-2 s-1" index = cr_spectrum.spectral_index(2 * u.TeV) assert_allclose(index.value, spec["index"], rtol=1e-3) assert index.unit == "" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/test_spectral_crab.py0000644000175100001770000000377614721316200024521 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import astropy.units as u from gammapy.modeling.models import create_crab_spectral_model from gammapy.utils.testing import assert_quantity_allclose 
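# Illustrative usage sketch of the factory exercised below (reference names
# as listed in CRAB_SPECTRA):
#
#     crab = create_crab_spectral_model(reference="hegra")
#     dnde = crab(2 * u.TeV)                        # differential flux
#     flux = crab.integral(1 * u.TeV, 1e3 * u.TeV)  # integral flux
#     index = crab.spectral_index(2 * u.TeV)        # local spectral index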
CRAB_SPECTRA = [ { "name": "meyer", "dnde": u.Quantity(5.572437502365652e-12, "cm-2 s-1 TeV-1"), "flux": u.Quantity(2.0744425607240974e-11, "cm-2 s-1"), "index": 2.631535530090332, }, { "name": "hegra", "dnde": u.Quantity(4.60349681e-12, "cm-2 s-1 TeV-1"), "flux": u.Quantity(1.74688947e-11, "cm-2 s-1"), "index": 2.62000000, }, { "name": "hess_pl", "dnde": u.Quantity(5.57327158e-12, "cm-2 s-1 TeV-1"), "flux": u.Quantity(2.11653715e-11, "cm-2 s-1"), "index": 2.63000000, }, { "name": "hess_ecpl", "dnde": u.Quantity(6.23714253e-12, "cm-2 s-1 TeV-1"), "flux": u.Quantity(2.267957713046026e-11, "cm-2 s-1"), "index": 2.529860258102417, }, { "name": "magic_lp", "dnde": u.Quantity(5.5451060834144166e-12, "cm-2 s-1 TeV-1"), "flux": u.Quantity(2.028222279117e-11, "cm-2 s-1"), "index": 2.614495440236207, }, { "name": "magic_ecpl", "dnde": u.Quantity(5.88494595619e-12, "cm-2 s-1 TeV-1"), "flux": u.Quantity(2.070767119534948e-11, "cm-2 s-1"), "index": 2.5433349999859405, }, ] @pytest.mark.parametrize("spec", CRAB_SPECTRA, ids=lambda _: _["name"]) def test_crab_spectrum(spec): crab_spectrum = create_crab_spectral_model(reference=spec["name"]) dnde = crab_spectrum(2 * u.TeV) assert_quantity_allclose(dnde, spec["dnde"]) flux = crab_spectrum.integral(1 * u.TeV, 1e3 * u.TeV) assert_quantity_allclose(flux, spec["flux"], rtol=1e-6) index = crab_spectrum.spectral_index(2 * u.TeV) assert_quantity_allclose(index, spec["index"], rtol=1e-5) def test_invalid_format(): with pytest.raises(ValueError): create_crab_spectral_model("spam") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/test_temporal.py0000644000175100001770000004132214721316200023525 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.table import Table from astropy.time import Time from gammapy.modeling.models import ( ConstantTemporalModel, ExpDecayTemporalModel, GaussianTemporalModel, GeneralizedGaussianTemporalModel, LightCurveTemplateTemporalModel, LinearTemporalModel, Model, PowerLawSpectralModel, PowerLawTemporalModel, SineTemporalModel, SkyModel, TemplatePhaseCurveTemporalModel, ) from gammapy.utils.scripts import make_path from gammapy.utils.testing import mpl_plot_check, requires_data from gammapy.utils.time import time_ref_to_dict # TODO: add light-curve test case from scratch # only use the FITS one for I/O (or not at all) @pytest.fixture(scope="session") def light_curve(): path = "$GAMMAPY_DATA/tests/models/light_curve/lightcrv_PKSB1222+216.fits" return LightCurveTemplateTemporalModel.read(path) @pytest.fixture() def phase_curve_table(): phase = np.linspace(0.0, 1, 101) norm = phase * (phase < 0.5) + (1 - phase) * (phase >= 0.5) return Table(data={"PHASE": phase, "NORM": norm}) @requires_data() def test_light_curve_str(light_curve): ss = str(light_curve) assert "LightCurveTemplateTemporalModel" in ss @requires_data() def test_light_curve_evaluate(light_curve): t = Time(59500, format="mjd") val = light_curve(t) assert_allclose(val, 0.015512, rtol=1e-5) @requires_data() def test_energy_dependent_lightcurve(tmp_path): filename = "$GAMMAPY_DATA/gravitational_waves/GW_example_DC_map_file.fits.gz" mod = LightCurveTemplateTemporalModel.read(filename, format="map") assert mod.is_energy_dependent is True t = Time(55555.6157407407, format="mjd") val = mod.evaluate(t, energy=[0.3, 2] * u.TeV) assert_allclose(val.data, 
[[2.278068e-21], [4.280503e-23]], rtol=1e-5) t = Time([55555, 55556, 55557], format="mjd") val = mod.evaluate(t) assert val.data.shape == (41, 3) with mpl_plot_check(): mod.plot( time_range=(Time(55555.50, format="mjd"), Time(55563.0, format="mjd")), energy=[0.3, 2, 10.0] * u.TeV, ) filename = make_path(tmp_path / "test.fits") with pytest.raises(NotImplementedError): mod.write(filename=filename, format="table", overwrite=True) with pytest.raises(NotImplementedError): time_start = Time("2010-01-01T00:00:00") + [1, 3, 5] * u.hour time_stop = Time("2010-01-01T00:00:00") + [2, 3.5, 6] * u.hour mod.integral(time_start, time_stop) def ph_curve(x, amplitude=0.5, x0=0.01): return 100.0 + amplitude * np.sin(2 * np.pi * (x - x0) / 1.0) @requires_data() def test_light_curve_to_from_table(light_curve): table = light_curve.to_table() assert_allclose(table.meta["MJDREFI"], 59000) assert_allclose(table.meta["MJDREFF"], 0.4991992, rtol=1e-6) assert table.meta["TIMESYS"] == "utc" lc1 = LightCurveTemplateTemporalModel.from_table(table) assert lc1.map == light_curve.map assert_allclose( lc1.reference_time.value, Time(59000.5, format="mjd").value, rtol=1e-2 ) # test failing cases table1 = table.copy() table1["TIME"].unit = None table.meta = None with pytest.raises(ValueError, match="Time unit not found in the table"): LightCurveTemplateTemporalModel.from_table(table1) @requires_data() def test_light_curve_to_dict(light_curve): data = light_curve.to_dict() assert data["temporal"]["format"] == "table" assert data["temporal"]["unit"] == "" assert data["temporal"]["type"] == "LightCurveTemplateTemporalModel" assert data["temporal"]["parameters"][0]["name"] == "t_ref" lc1 = LightCurveTemplateTemporalModel.from_dict(data) assert lc1.map == light_curve.map assert_allclose( lc1.reference_time.value, light_curve.reference_time.value, rtol=1e-9 ) @requires_data() def test_light_curve_map_serialisation(light_curve, tmp_path): filename = str(make_path(tmp_path / "tmp.fits")) light_curve.write(filename, format="map") lc1 = LightCurveTemplateTemporalModel.read(filename, format="map") assert_allclose( lc1.reference_time.value, light_curve.reference_time.value, rtol=1e-9 ) assert lc1.map == light_curve.map def test_time_sampling_template(): time_ref = Time(55197.00000000, format="mjd") livetime = 3.0 * u.hr sigma = 0.5 * u.h t_min = "2010-01-01T00:00:00" t_max = "2010-01-01T03:00:00" t_delta = "3 min" times = time_ref + livetime * np.linspace(0, 1, 1000) flare_model = GaussianTemporalModel(t_ref=(times[500].mjd) * u.d, sigma=sigma) lc = Table() meta = time_ref_to_dict(times[0]) lc.meta = meta lc.meta["TIMEUNIT"] = "s" lc["TIME"] = (times - times[0]).to("s") lc["NORM"] = flare_model(times) temporal_model = LightCurveTemplateTemporalModel.from_table(lc) sampler_template = temporal_model.sample_time( n_events=1000, t_min=t_min, t_max=t_max, random_state=0, t_delta=t_delta ) assert len(sampler_template) == 1000 mean = np.mean(sampler_template.mjd) std = np.std(sampler_template.mjd) assert_allclose(mean - times[500].mjd, 0.0, atol=1e-3) assert_allclose(std - sigma.to("d").value, 0.0, atol=3e-4) def test_time_sampling_gaussian(): time_ref = Time(55197.00000000, format="mjd") sigma = 0.5 * u.h t_min = "2010-01-01T00:00:00" t_max = "2010-01-01T03:00:00" t_delta = "3 min" temporal_model = GaussianTemporalModel( t_ref=(time_ref.mjd + 0.03) * u.d, sigma=sigma ) sampler = temporal_model.sample_time( n_events=1000, t_min=t_min, t_max=t_max, random_state=0, t_delta=t_delta ) assert len(sampler) == 1000 mean = np.mean(sampler.mjd) std = 
np.std(sampler.mjd) assert_allclose(mean - (time_ref.mjd + 0.03), 0.0, atol=4e-3) assert_allclose(std - sigma.to("d").value, 0.0, atol=3e-3) def test_lightcurve_temporal_model_integral(): time = np.arange(0, 10, 0.06) * u.hour table = Table() table["TIME"] = time table["NORM"] = np.ones(len(time)) table.meta = dict(MJDREFI=55197.0, MJDREFF=0, TIMEUNIT="hour") temporal_model = LightCurveTemplateTemporalModel.from_table(table) assert not temporal_model.is_energy_dependent time_start = Time("2010-01-01T00:00:00") + [1, 3, 5] * u.hour time_stop = Time("2010-01-01T00:00:00") + [2, 3.5, 6] * u.hour val = temporal_model.integral(time_start, time_stop) assert len(val) == 3 assert_allclose(np.sum(val), 1.0101, rtol=1e-5) with mpl_plot_check(): temporal_model.plot( time_range=(Time(55555.50, format="mjd"), Time(55563.0, format="mjd")) ) def test_constant_temporal_model_evaluate(): temporal_model = ConstantTemporalModel() t = Time(46300, format="mjd") val = temporal_model(t) assert_allclose(val, 1.0, rtol=1e-5) def test_constant_temporal_model_integral(): temporal_model = ConstantTemporalModel() time_start = Time("2010-01-01T00:00:00") + [1, 3, 5] * u.day time_stop = Time("2010-01-01T00:00:00") + [2, 3.5, 6] * u.day val = temporal_model.integral(time_start, time_stop) assert len(val) == 3 assert_allclose(np.sum(val), 1.0, rtol=1e-5) def test_linear_temporal_model_evaluate(): t = Time(46301, format="mjd") t_ref = 46300 * u.d temporal_model = LinearTemporalModel(alpha=1.0, beta=0.1 / u.day, t_ref=t_ref) val = temporal_model(t) assert_allclose(val, 1.1, rtol=1e-5) def test_linear_temporal_model_integral(): t_ref = Time(55555, format="mjd") temporal_model = LinearTemporalModel( alpha=1.0, beta=0.1 / u.day, t_ref=t_ref.mjd * u.d ) time_start = t_ref + [1, 3, 5] * u.day time_stop = t_ref + [2, 3.5, 6] * u.day val = temporal_model.integral(time_start, time_stop) assert len(val) == 3 assert_allclose(np.sum(val), 1.345, rtol=1e-5) def test_exponential_temporal_model_evaluate(): t = Time(46301, format="mjd") t_ref = 46300 * u.d t0 = 2.0 * u.d temporal_model = ExpDecayTemporalModel(t_ref=t_ref, t0=t0) val = temporal_model(t) assert_allclose(val, 0.6065306597126334, rtol=1e-5) def test_exponential_temporal_model_integral(): t_ref = Time(55555, format="mjd") temporal_model = ExpDecayTemporalModel(t_ref=t_ref.mjd * u.d) time_start = t_ref + [1, 3, 5] * u.day time_stop = t_ref + [2, 3.5, 6] * u.day val = temporal_model.integral(time_start, time_stop) assert len(val) == 3 assert_allclose(np.sum(val), 0.102557, rtol=1e-5) def test_gaussian_temporal_model_evaluate(): t = Time(46301, format="mjd") t_ref = 46300 * u.d sigma = 2.0 * u.d temporal_model = GaussianTemporalModel(t_ref=t_ref, sigma=sigma) val = temporal_model(t) assert_allclose(val, 0.882497, rtol=1e-5) def test_gaussian_temporal_model_integral(): temporal_model = GaussianTemporalModel(t_ref=50003 * u.d, sigma="2.0 day") t_ref = Time(50000, format="mjd") time_start = t_ref + [1, 3, 5] * u.day time_stop = t_ref + [2, 3.5, 6] * u.day val = temporal_model.integral(time_start, time_stop) assert len(val) == 3 assert_allclose(np.sum(val), 0.682679, rtol=1e-5) def test_generalized_gaussian_temporal_model_evaluate(): t = Time(46301, format="mjd") t_ref = 46300 * u.d t_rise = 2.0 * u.d t_decay = 2.0 * u.d eta = 1 / 2 temporal_model = GeneralizedGaussianTemporalModel( t_ref=t_ref, t_rise=t_rise, t_decay=t_decay, eta=eta ) val = temporal_model(t) assert_allclose(val, 0.882497, rtol=1e-5) def test_generalized_gaussian_temporal_model_integral(): temporal_model = 
GeneralizedGaussianTemporalModel( t_ref=50003 * u.d, t_rise="2.0 day", t_decay="2.0 day", eta=1 / 2 ) start = 1 * u.day stop = 2 * u.day t_ref = Time(50000, format="mjd", scale="utc") time_start = t_ref + start time_stop = t_ref + stop val = temporal_model.integral(time_start, time_stop) assert_allclose(val, 0.758918, rtol=1e-4) def test_powerlaw_temporal_model_evaluate(): t = Time(46302, format="mjd") t_ref = 46300 * u.d alpha = -2.0 temporal_model = PowerLawTemporalModel(t_ref=t_ref, alpha=alpha) val = temporal_model(t) assert_allclose(val, 0.25, rtol=1e-5) def test_powerlaw_temporal_model_integral(): t_ref = Time(55555, format="mjd") temporal_model = PowerLawTemporalModel(alpha=-2.0, t_ref=t_ref.mjd * u.d) time_start = t_ref + [1] * u.day time_stop = t_ref + [4] * u.day val = temporal_model.integral(time_start, time_stop) assert len(val) == 1 assert_allclose(np.sum(val), 0.25, rtol=1e-5) temporal_model.parameters["alpha"].value = -1 time_start = t_ref + [1, 3, 5] * u.day time_stop = t_ref + [2, 3.5, 6] * u.day val = temporal_model.integral(time_start, time_stop) assert len(val) == 3 assert_allclose(np.sum(val), 0.411847, rtol=1e-5) def test_sine_temporal_model_evaluate(): t = Time(46302, format="mjd") t_ref = 46300 * u.d omega = np.pi / 4.0 * u.rad / u.day temporal_model = SineTemporalModel(amp=0.5, omega=omega, t_ref=t_ref) val = temporal_model(t) assert_allclose(val, 1.5, rtol=1e-5) def test_sine_temporal_model_integral(): t_ref = Time(55555, format="mjd") omega = np.pi / 4.0 * u.rad / u.day temporal_model = SineTemporalModel(amp=0.5, omega=omega, t_ref=t_ref.mjd * u.d) time_start = t_ref + [1, 3, 5] * u.day time_stop = t_ref + [2, 3.5, 6] * u.day val = temporal_model.integral(time_start, time_stop) assert len(val) == 3 assert_allclose(np.sum(val), 1.08261, rtol=1e-5) @requires_data() def test_to_dict(light_curve): out = light_curve.to_dict() assert out["temporal"]["type"] == "LightCurveTemplateTemporalModel" assert "lightcrv_PKSB1222+216.fits" in out["temporal"]["filename"] @requires_data() def test_with_skymodel(light_curve): sky_model = SkyModel(spectral_model=PowerLawSpectralModel()) out = sky_model.to_dict() assert "temporal" not in out sky_model = SkyModel( spectral_model=PowerLawSpectralModel(), temporal_model=light_curve ) assert "LightCurveTemplateTemporalModel" in sky_model.temporal_model.tag out = sky_model.to_dict() assert "temporal" in out def test_plot_constant_model(): time_range = [Time.now(), Time.now() + 1 * u.d] constant_model = ConstantTemporalModel(const=1) with mpl_plot_check(): constant_model.plot(time_range) def test_phase_curve_model(tmp_path): phase = np.linspace(0.0, 1, 101) norm = phase * (phase < 0.5) + (1 - phase) * (phase >= 0.5) table = Table(data={"PHASE": phase, "NORM": norm}) t_ref = Time("2022-06-01") phase_model = TemplatePhaseCurveTemporalModel( table=table, f0="20 Hz", phi_ref=0.0, f1="0 s-2", f2="0 s-3", t_ref=t_ref.mjd * u.d, ) result = phase_model(t_ref + [0, 0.025, 0.05] * u.s) assert_allclose(result, [0.000000e00, 1.999989e00, 2.169609e-05], atol=1e-5) phase_model.filename = str(make_path(tmp_path / "tmp.fits")) phase_model.write() model_dict = phase_model.to_dict() new_model = Model.from_dict(model_dict) assert_allclose(phase_model.parameters.value, new_model.parameters.value) assert phase_model.parameters.names == new_model.parameters.names assert ( phase_model.parameters.free_parameters.names == new_model.parameters.free_parameters.names ) assert_allclose(new_model.table["PHASE"].data, phase) assert_allclose(new_model.table["NORM"].data, 
norm * 4) # exact number of phases integral = phase_model.integral(t_ref, t_ref + 10 * u.s) assert_allclose(integral, 1.0, rtol=1e-5) # long duration. Should be equal to the phase average integral = phase_model.integral(t_ref + 1 * u.h, t_ref + 3 * u.h) assert_allclose(integral, 1.0, rtol=1e-5) # 1.25 phase integral = phase_model.integral(t_ref, t_ref + 62.5 * u.ms) assert_allclose(integral, 0.9, rtol=1e-5) def test_phase_curve_model_sample_time(): phase = np.linspace(0.0, 1, 51) norm = 1.0 * (phase < 0.5) table = Table(data={"PHASE": phase, "NORM": norm}) t_ref = Time("2020-06-01", scale="utc") phase_model = TemplatePhaseCurveTemporalModel( table=table, f0="50 Hz", phi_ref=0.0, f1="0 s-2", f2="0 s-3", t_ref=t_ref.mjd * u.d, scale="utc", ) tmin = Time("2023-06-01", scale="tt") tmax = tmin + 0.5 * u.h times = phase_model.sample_time(10, tmin, tmax) phases, _ = phase_model._time_to_phase( times, phase_model.reference_time, phase_model.phi_ref.quantity, phase_model.f0.quantity, phase_model.f1.quantity, phase_model.f2.quantity, ) assert np.all(phases <= 0.5) @requires_data() def test_phasecurve_DC1(): filename = "$GAMMAPY_DATA/tests/phasecurve_LSI_DC.fits" t_ref = 43366.275 * u.d P0 = 26.7 * u.d f0 = 1 / P0 model = TemplatePhaseCurveTemporalModel.read(filename, t_ref, 0.0, f0) times = Time(t_ref, format="mjd") + [0.0, 0.5, 0.65, 1.0] * P0 norm = model(times) assert_allclose(norm, [0.294118, 0.882353, 5.882353, 0.294118], atol=1e-5) with mpl_plot_check(): model.plot_phasogram(n_points=200) def test_model_scale(): model = GaussianTemporalModel(t_ref=50003.2503033 * u.d, sigma="2.43 day") assert model.scale == "utc" model.scale = "tai" assert_allclose(model.reference_time.mjd, 50003.2503033, rtol=1e-9) dict1 = model.to_dict() model1 = GaussianTemporalModel.from_dict(dict1) assert model1.scale == "tai" assert_allclose(model1.sigma.quantity, 2.43 * u.d, rtol=1e-3) time_start = model.reference_time + [1, 3, 5] * u.day time_stop = model.reference_time + [2, 3.5, 6] * u.day val = model.integral(time_start, time_stop) assert_allclose(np.sum(val), 0.442833, rtol=1e-5) model1.reference_time = Time(52398.23456, format="mjd", scale="utc") assert model1.scale == "tai" assert_allclose(model1.t_ref.value, 52398.23493, rtol=1e-9) with pytest.raises(TypeError): model1.reference_time = 23456 with pytest.raises(ValueError): model = GaussianTemporalModel( t_ref=50003.2503033 * u.d, sigma="2.43 day", scale="ms" ) @requires_data() def test_template_temporal_model_format(): temporal_model = LightCurveTemplateTemporalModel.read( "$GAMMAPY_DATA/gravitational_waves/GW_example_DC_map_file.fits.gz", format="map" ) mod_dict = temporal_model.to_dict() assert mod_dict["temporal"]["format"] == "map" path = "$GAMMAPY_DATA/tests/models/light_curve/lightcrv_PKSB1222+216.fits" temporal_model = LightCurveTemplateTemporalModel.read(path) mod_dict = temporal_model.to_dict() assert mod_dict["temporal"]["format"] == "table" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/tests/test_utils.py0000644000175100001770000000356214721316200023046 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from numpy.testing import assert_allclose import astropy.units as u from astropy.time import Time from gammapy.modeling.models import LightCurveTemplateTemporalModel from gammapy.modeling.models.utils import _template_model_from_cta_sdc, read_hermes_cube from gammapy.utils.scripts import make_path from gammapy.utils.testing import 
requires_data, requires_dependency @requires_data() def test__template_model_from_cta_sdc(tmp_path): filename = "$GAMMAPY_DATA/gravitational_waves/GW_example_DC_file.fits.gz" mod = _template_model_from_cta_sdc(filename) assert isinstance(mod, LightCurveTemplateTemporalModel) t = Time(55555.6157407407, format="mjd") val = mod.evaluate(t, energy=[0.3, 2] * u.TeV) assert_allclose(val.data, [[2.281216e-21], [4.281390e-23]], rtol=1e-5) filename = make_path(tmp_path / "test.fits") mod.write(filename=filename, format="map", overwrite=True) mod1 = LightCurveTemplateTemporalModel.read(filename=filename, format="map") assert_allclose(mod1.t_ref.value, mod.t_ref.value, rtol=1e-7) @requires_data() def test__reference_time(): filename = "$GAMMAPY_DATA/gravitational_waves/GW_example_DC_file.fits.gz" t_ref = Time("2028-01-01T00:00:00", format="isot", scale="utc") mod = _template_model_from_cta_sdc(filename, t_ref=t_ref) assert_allclose(mod.reference_time.mjd, t_ref.mjd, rtol=1e-7) @requires_data() @requires_dependency("healpy") def test_read_hermes_cube(): filename = make_path( "$GAMMAPY_DATA/tests/hermes/hermes-VariableMin-pi0-Htot_CMZ_nside256.fits.gz" ) map_ = read_hermes_cube(filename) assert map_.geom.frame == "galactic" assert map_.geom.axes[0].unit == "GeV" assert_allclose(map_.geom.axes[0].center[3], 1 * u.GeV) assert_allclose(map_.get_by_coord((0 * u.deg, 0 * u.deg, 1 * u.GeV)), 2.6391575) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/models/utils.py0000644000175100001770000001065414721316200020645 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy import units as u from astropy.coordinates import SkyCoord from astropy.io import fits from astropy.nddata import NoOverlapError from astropy.time import Time from regions import PointSkyRegion from gammapy.maps import HpxNDMap, Map, MapAxis, RegionNDMap from gammapy.maps.hpx.io import HPX_FITS_CONVENTIONS, HpxConv from gammapy.utils.scripts import make_path from gammapy.utils.time import time_ref_from_dict, time_ref_to_dict from . import LightCurveTemplateTemporalModel, Models, SkyModel, TemplateSpatialModel __all__ = ["read_hermes_cube"] def _template_model_from_cta_sdc(filename, t_ref=None): """To create a `LightCurveTemplateTemporalModel` from the energy-dependent temporal model files of the cta-sdc1. This format is subject to change. 
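
    Examples
    --------
    A minimal usage sketch, assuming ``$GAMMAPY_DATA`` is available (the file
    below is the one used in the tests)::

        model = _template_model_from_cta_sdc(
            "$GAMMAPY_DATA/gravitational_waves/GW_example_DC_file.fits.gz"
        )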
""" filename = str(make_path(filename)) with fits.open(filename) as hdul: frame = hdul[0].header.get("frame", "icrs") position = SkyCoord( hdul[0].header["LONG"] * u.deg, hdul[0].header["LAT"] * u.deg, frame=frame ) energy_hdu = hdul["ENERGIES"] energy_axis = MapAxis.from_nodes( nodes=energy_hdu.data, unit=energy_hdu.header["TUNIT1"], name="energy", interp="log", ) time_hdu = hdul["TIMES"] time_header = time_hdu.header if t_ref is None: t_ref = Time(55555.5, format="mjd", scale="tt") time_header.update(time_ref_to_dict(t_ref, t_ref.scale)) time_min = time_hdu.data["Initial Time"] time_max = time_hdu.data["Final Time"] edges = np.append(time_min, time_max[-1]) * u.Unit(time_header["TUNIT1"]) data = hdul["SPECTRA"] time_ref = time_ref_from_dict(time_header, scale=t_ref.scale) time_axis = MapAxis.from_edges(edges=edges, name="time", interp="log") reg_map = RegionNDMap.create( region=PointSkyRegion(center=position), axes=[energy_axis, time_axis], data=np.array(list(data.data) * u.Unit(data.header["UNITS"])), ) return LightCurveTemplateTemporalModel(reg_map, t_ref=time_ref, filename=filename) def read_hermes_cube(filename): """Read 3d templates produced with hermes.""" # add hermes conventions to the list used by gammapy hermes_conv = HpxConv( convname="hermes-template", colstring="TFLOAT", hduname="xtension", frame="COORDTYPE", quantity_type="differential", ) HPX_FITS_CONVENTIONS["hermes-template"] = hermes_conv maps = [] energy_nodes = [] with fits.open(filename) as hdulist: # cannot read directly in 3d with Map.read because BANDS HDU is missing # https://gamma-astro-data-formats.readthedocs.io/en/v0.2/skymaps/index.html#bands-hdu # so we have to loop over hdus and create the energy axis for hdu in hdulist[1:]: template = HpxNDMap.from_hdu(hdu, format="hermes-template") # fix missing/incompatible infos template._unit = u.Unit(hdu.header["TUNIT1"]) # .from_hdu expect "BUNIT" if template.geom.frame == "G": template._geom._frame = "galactic" maps.append(template) energy_nodes.append(hdu.header["ENERGY"]) # SI unit (see header comment) # create energy axis and set unit energy_nodes *= u.Joule energy_nodes = energy_nodes.to("GeV") axis = MapAxis( energy_nodes, interp="log", name="energy_true", node_type="center", unit="GeV" ) return Map.from_stack(maps, axis=axis) def cutout_template_models(models, cutout_kwargs, datasets_names=None): """Apply cutout to template models.""" models_cut = Models() if models is None: return models_cut for m in models: if isinstance(m.spatial_model, TemplateSpatialModel): try: map_ = m.spatial_model.map.cutout(**cutout_kwargs) except (NoOverlapError, ValueError): continue template_cut = TemplateSpatialModel( map_, normalize=m.spatial_model.normalize, ) model_cut = SkyModel( spatial_model=template_cut, spectral_model=m.spectral_model, datasets_names=datasets_names, ) models_cut.append(model_cut) else: models_cut.append(m) return models_cut ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/parameter.py0000644000175100001770000006263414721316200020207 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Model parameter classes.""" import collections.abc import copy import html import itertools import logging import numpy as np from astropy import units as u from astropy.table import Table from gammapy.utils.interpolation import interpolation_scale __all__ = ["Parameter", "Parameters", "PriorParameter", "PriorParameters"] log = logging.getLogger(__name__) def 
_get_parameters_str(parameters): str_ = "" for par in parameters: if par.name == "amplitude": value_format, error_format = "{:10.2e}", "{:7.1e}" else: value_format, error_format = "{:10.3f}", "{:7.2f}" line = "\t{:21} {:8}: " + value_format + "\t {} {:<12s}\n" if par._link_label_io is not None: name = par._link_label_io else: name = par.name if par.frozen: frozen, error = "(frozen)", "\t\t" else: frozen = "" try: error = "+/- " + error_format.format(par.error) except AttributeError: error = "" str_ += line.format(name, frozen, par.value, error, par.unit) return str_.expandtabs(tabsize=2) class Parameter: """A model parameter. Note that the parameter value has been split into a factor and scale like this:: value = factor x scale Users should interact with the ``value``, ``quantity`` or ``min`` and ``max`` properties and consider the fact that there is a ``factor``` and ``scale`` an implementation detail. That was introduced for numerical stability in parameter and error estimation methods, only in the Gammapy optimiser interface do we interact with the ``factor``, ``factor_min`` and ``factor_max`` properties, i.e. the optimiser "sees" the well-scaled problem. Parameters ---------- name : str Name. value : float or `~astropy.units.Quantity` Value. scale : float, optional Scale (sometimes used in fitting). unit : `~astropy.units.Unit` or str, optional Unit. min : float, optional Minimum (sometimes used in fitting). max : float, optional Maximum (sometimes used in fitting). frozen : bool, optional Frozen (used in fitting). error : float Parameter error. scan_min : float Minimum value for the parameter scan. Overwrites scan_n_sigma. scan_max : float Minimum value for the parameter scan. Overwrites scan_n_sigma. scan_n_values: int Number of values to be used for the parameter scan. scan_n_sigma : int Number of sigmas to scan. scan_values: `numpy.array` Scan values. Overwrites all the scan keywords before. scale_method : {'scale10', 'factor1', None} Method used to set ``factor`` and ``scale``. interp : {"lin", "sqrt", "log"} Parameter scaling to use for the scan. prior : `~gammapy.modeling.models.Prior` Prior set on the parameter. """ def __init__( self, name, value, unit="", scale=1, min=np.nan, max=np.nan, frozen=False, error=0, scan_min=None, scan_max=None, scan_n_values=11, scan_n_sigma=2, scan_values=None, scale_method="scale10", interp="lin", prior=None, ): if not isinstance(name, str): raise TypeError(f"Name must be string, got '{type(name)}' instead") self._name = name self._link_label_io = None self.scale = scale self.min = min self.max = max self.frozen = frozen self._error = error self._type = None # TODO: move this to a setter method that can be called from `__set__` also! # Having it here is bad: behaviour not clear if Quantity and `unit` is passed. 
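        # Illustrative aside: with the default scale_method="scale10",
        # autoscale() later picks ``scale`` as a power of ten so that
        # abs(factor) lies in [1, 10), e.g. value=3e-12 gives scale=1e-12 and
        # factor=3.0, while value = factor * scale stays unchanged.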
if isinstance(value, u.Quantity) or isinstance(value, str): val = u.Quantity(value) self.value = val.value self.unit = val.unit else: self.value = float(value) self.unit = unit self.scan_min = scan_min self.scan_max = scan_max self.scan_values = scan_values self.scan_n_values = scan_n_values self.scan_n_sigma = scan_n_sigma self.interp = interp self.scale_method = scale_method self.prior = prior def __get__(self, instance, owner): if instance is None: return self par = instance.__dict__[self.name] par._type = getattr(instance, "type", None) return par def __set__(self, instance, value): if isinstance(value, Parameter): instance.__dict__[self.name] = value else: par = instance.__dict__[self.name] raise TypeError(f"Cannot assign {value!r} to parameter {par!r}") def __set_name__(self, owner, name): if not self._name == name: raise ValueError(f"Expected parameter name '{name}', got {self._name}") @property def prior(self): """Prior applied to the parameter as a `~gammapy.modeling.models.Prior`.""" return self._prior @prior.setter def prior(self, value): if value is not None: from .models import Prior if isinstance(value, dict): from .models import Model self._prior = Model.from_dict({"prior": value}) elif isinstance(value, Prior): self._prior = value else: raise TypeError(f"Invalid type: {value!r}") else: self._prior = value def prior_stat_sum(self): if self.prior is not None: return self.prior(self) @property def type(self): return self._type @property def error(self): return self._error @error.setter def error(self, value): self._error = float(u.Quantity(value, unit=self.unit).value) @property def name(self): """Name as a string.""" return self._name @property def factor(self): """Factor as a float.""" return self._factor @factor.setter def factor(self, val): self._factor = float(val) @property def scale(self): """Scale as a float.""" return self._scale @scale.setter def scale(self, val): self._scale = float(val) @property def unit(self): """Unit as a `~astropy.units.Unit` object.""" return self._unit @unit.setter def unit(self, val): self._unit = u.Unit(val) @property def min(self): """Minimum as a float.""" return self._min @min.setter def min(self, val): """`~astropy.table.Table` has masked values for NaN. Replacing with NaN.""" if isinstance(val, np.ma.core.MaskedConstant): self._min = np.nan else: self._min = float(val) @property def factor_min(self): """Factor minimum as a float. This ``factor_min = min / scale`` is for the optimizer interface. """ return self.min / self.scale @property def max(self): """Maximum as a float.""" return self._max @max.setter def max(self, val): """`~astropy.table.Table` has masked values for NaN. Replacing with NaN.""" if isinstance(val, np.ma.core.MaskedConstant): self._max = np.nan else: self._max = float(val) @property def factor_max(self): """Factor maximum as a float. This ``factor_max = max / scale`` is for the optimizer interface. 
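        For example, with ``max = 1e-10`` and ``scale = 1e-12`` the optimizer
        sees ``factor_max = 100.0`` (illustrative values).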
""" return self.max / self.scale @property def scale_method(self): """Method used to set ``factor`` and ``scale``.""" return self._scale_method @scale_method.setter def scale_method(self, val): if val not in ["scale10", "factor1"] and val is not None: raise ValueError(f"Invalid method: {val}") self._scale_method = val @property def frozen(self): """Frozen (used in fitting) (bool).""" return self._frozen @frozen.setter def frozen(self, val): if val in ["True", "False"]: val = bool(val) if not isinstance(val, bool) and not isinstance(val, np.bool_): raise TypeError(f"Invalid type: {val}, {type(val)}") self._frozen = val @property def value(self): """Value = factor x scale (float).""" return self._factor * self._scale @value.setter def value(self, val): self.factor = float(val) / self._scale @property def quantity(self): """Value times unit as a `~astropy.units.Quantity`.""" return self.value * self.unit @quantity.setter def quantity(self, val): val = u.Quantity(val) if not val.unit.is_equivalent(self.unit): raise u.UnitConversionError( f"Unit must be equivalent to {self.unit} for parameter {self.name}" ) self.value = val.value self.unit = val.unit # TODO: possibly allow to set this independently @property def conf_min(self): """Confidence minimum value as a `float`. Return parameter minimum if defined, otherwise return the scan_min. """ if not np.isnan(self.min): return self.min else: return self.scan_min # TODO: possibly allow to set this independently @property def conf_max(self): """Confidence maximum value as a `float`. Return parameter maximum if defined, otherwise return the scan_max. """ if not np.isnan(self.max): return self.max else: return self.scan_max @property def scan_min(self): """Stat scan minimum.""" if self._scan_min is None: return self.value - self.error * self.scan_n_sigma return self._scan_min @property def scan_max(self): """Stat scan maximum.""" if self._scan_max is None: return self.value + self.error * self.scan_n_sigma return self._scan_max @scan_min.setter def scan_min(self, value): """Stat scan minimum setter.""" self._scan_min = value @scan_max.setter def scan_max(self, value): """Stat scan maximum setter.""" self._scan_max = value @property def scan_n_sigma(self): """Stat scan n sigma.""" return self._scan_n_sigma @scan_n_sigma.setter def scan_n_sigma(self, n_sigma): """Stat scan n sigma.""" self._scan_n_sigma = int(n_sigma) @property def scan_values(self): """Stat scan values as a `~numpy.ndarray`.""" if self._scan_values is None: scale = interpolation_scale(self.interp) parmin, parmax = scale([self.scan_min, self.scan_max]) values = np.linspace(parmin, parmax, self.scan_n_values) return scale.inverse(values) return self._scan_values @scan_values.setter def scan_values(self, values): """Set scan values.""" self._scan_values = values def check_limits(self): """Emit a warning or error if value is outside the minimum/maximum range.""" if not self.frozen: if (~np.isnan(self.min) and (self.value <= self.min)) or ( ~np.isnan(self.max) and (self.value >= self.max) ): log.warning( f"Value {self.value} is outside bounds [{self.min}, {self.max}]" f" for parameter '{self.name}'" ) def __repr__(self): return ( f"{self.__class__.__name__}(name={self.name!r}, value={self.value!r}, " f"factor={self.factor!r}, scale={self.scale!r}, unit={self.unit!r}, " f"min={self.min!r}, max={self.max!r}, frozen={self.frozen!r}, prior={self.prior!r}, id={hex(id(self))})" ) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def copy(self): """Deep copy.""" return copy.deepcopy(self) def update_from_dict(self, data): """Update parameters from a dictionary.""" keys = ["value", "unit", "min", "max", "frozen", "prior"] for k in keys: if k == "prior" and data[k] == "": data[k] = None setattr(self, k, data[k]) def to_dict(self): """Convert to dictionary.""" output = { "name": self.name, "value": self.value, "unit": self.unit.to_string("fits"), "error": self.error, "min": self.min, "max": self.max, "frozen": self.frozen, "interp": self.interp, "scale_method": self.scale_method, } if self._link_label_io is not None: output["link"] = self._link_label_io if self.prior is not None: output["prior"] = self.prior.to_dict()["prior"] return output def autoscale(self): """Autoscale the parameters. Set ``factor`` and ``scale`` according to ``scale_method`` attribute. Available ``scale_method``. * ``scale10`` sets ``scale`` to power of 10, so that abs(factor) is in the range 1 to 10 * ``factor1`` sets ``factor, scale = 1, value`` In both cases the sign of value is stored in ``factor``, i.e. the ``scale`` is always positive. If ``scale_method`` is None the scaling is ignored. """ if self.scale_method == "scale10": value = self.value if value != 0: exponent = np.floor(np.log10(np.abs(value))) scale = np.power(10.0, exponent) self.factor = value / scale self.scale = scale elif self.scale_method == "factor1": self.factor, self.scale = 1, self.value class Parameters(collections.abc.Sequence): """Parameters container. - List of `Parameter` objects. - Covariance matrix. Parameters ---------- parameters : list of `Parameter` List of parameters. """ def __init__(self, parameters=None): if parameters is None: parameters = [] else: parameters = list(parameters) self._parameters = parameters def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def check_limits(self): """Check parameter limits and emit a warning.""" for par in self: par.check_limits() @property def prior(self): return [par.prior for par in self] def prior_stat_sum(self): parameters_stat_sum = 0 for par in self: if par.prior is not None: parameters_stat_sum += par.prior_stat_sum() return parameters_stat_sum @property def types(self): """Parameter types.""" return [par.type for par in self] @property def min(self): """Parameter minima as a `numpy.ndarray`.""" return np.array([_.min for _ in self._parameters], dtype=np.float64) @min.setter def min(self, min_array): """Parameter minima as a `numpy.ndarray`.""" if not len(self) == len(min_array): raise ValueError("Minima must have same length as parameter list") for min_, par in zip(min_array, self): par.min = min_ @property def max(self): """Parameter maxima as a `numpy.ndarray`.""" return np.array([_.max for _ in self._parameters], dtype=np.float64) @max.setter def max(self, max_array): """Parameter maxima as a `numpy.ndarray`.""" if not len(self) == len(max_array): raise ValueError("Maxima must have same length as parameter list") for max_, par in zip(max_array, self): par.max = max_ @property def value(self): """Parameter values as a `numpy.ndarray`.""" return np.array([_.value for _ in self._parameters], dtype=np.float64) @value.setter def value(self, values): """Parameter values as a `numpy.ndarray`.""" if not len(self) == len(values): raise ValueError("Values must have same length as parameter list") for value, par in zip(values, self): par.value = value @classmethod def from_stack(cls, parameters_list): """Create `Parameters` by stacking a list of other `Parameters` objects. Parameters ---------- parameters_list : list of `Parameters` List of `Parameters` objects. """ pars = itertools.chain(*parameters_list) return cls(pars) def copy(self): """Deep copy.""" return copy.deepcopy(self) @property def free_parameters(self): """List of free parameters.""" return self.__class__([par for par in self._parameters if not par.frozen]) @property def unique_parameters(self): """Unique parameters as a `Parameters` object.""" return self.__class__(dict.fromkeys(self._parameters)) @property def names(self): """List of parameter names.""" return [par.name for par in self._parameters] def index(self, val): """Get position index for a given parameter. The input can be a parameter object, parameter name (str) or if a parameter index (int) is passed in, it is simply returned. 
""" if isinstance(val, int): return val elif isinstance(val, Parameter): return self._parameters.index(val) elif isinstance(val, str): for idx, par in enumerate(self._parameters): if val == par.name: return idx raise IndexError(f"No parameter: {val!r}") else: raise TypeError(f"Invalid type: {type(val)!r}") def __getitem__(self, key): """Access parameter by name, index or boolean mask.""" if isinstance(key, np.ndarray) and key.dtype == bool: return self.__class__(list(np.array(self._parameters)[key])) else: idx = self.index(key) return self._parameters[idx] def __len__(self): return len(self._parameters) def __add__(self, other): if isinstance(other, Parameters): return Parameters.from_stack([self, other]) else: raise TypeError(f"Invalid type: {other!r}") def to_dict(self): data = [] for par in self._parameters: data.append(par.to_dict()) return data @staticmethod def _create_default_table(): name_to_type = { "type": "str", "name": "str", "value": "float", "unit": "str", "error": "float", "min": "float", "max": "float", "frozen": "bool", "link": "str", "prior": "str", } return Table(names=name_to_type.keys(), dtype=name_to_type.values()) def to_table(self): """Convert parameter attributes to `~astropy.table.Table`.""" table = self._create_default_table() for p in self._parameters: d = {k: v for k, v in p.to_dict().items() if k in table.colnames} if "prior" in d: d["prior"] = d["prior"]["type"] table.add_row(d) table["value"].format = ".4e" for name in ["error", "min", "max"]: table[name].format = ".3e" return table def __eq__(self, other): all_equal = np.all([p is p_new for p, p_new in zip(self, other)]) return all_equal and len(self) == len(other) @classmethod def from_dict(cls, data): parameters = [] for par in data: link_label = par.pop("link", None) par.pop("is_norm", None) parameter = Parameter(**par) parameter._link_label_io = link_label parameters.append(parameter) return cls(parameters=parameters) def set_parameter_factors(self, factors): """Set factor of all parameters. Used in the optimizer interface. """ idx = 0 for parameter in self._parameters: if not parameter.frozen: parameter.factor = factors[idx] idx += 1 def autoscale(self): """Autoscale all parameters. See :func:`~gammapy.modeling.Parameter.autoscale`. """ for par in self._parameters: par.autoscale() def select( self, name=None, type=None, frozen=None, ): """Create a mask of models, true if all conditions are verified. Parameters ---------- name : str or list, optional Name of the parameter. Default is None. type : {None, "spatial", "spectral", "temporal"} Type of models. Default is None. frozen : bool, optional Select frozen parameters if True, exclude them if False. Default is None. Returns ------- parameters : `Parameters` Selected parameters. """ selection = np.ones(len(self), dtype=bool) if name and not isinstance(name, list): name = [name] for idx, par in enumerate(self): if name: selection[idx] &= np.any([_ == par.name for _ in name]) if type: selection[idx] &= type == par.type if frozen is not None: if frozen: selection[idx] &= par.frozen else: selection[idx] &= ~par.frozen return self[selection] def freeze_all(self): """Freeze all parameters.""" for par in self._parameters: par.frozen = True def unfreeze_all(self): """Unfreeze all parameters (even those frozen by default).""" for par in self._parameters: par.frozen = False def restore_status(self, restore_values=True): """Context manager to restore status. A copy of the values is made on enter, and those values are restored on exit. 
        Parameters
        ----------
        restore_values : bool, optional
            Restore values if True, otherwise restore only the frozen status.
            Default is True.

        Examples
        --------
        >>> from gammapy.modeling.models import PowerLawSpectralModel
        >>> pwl = PowerLawSpectralModel(index=2)
        >>> with pwl.parameters.restore_status():
        ...     pwl.parameters["index"].value = 3
        >>> print(pwl.parameters["index"].value) # doctest: +SKIP
        """
        return restore_parameters_status(self, restore_values)


class restore_parameters_status:
    def __init__(self, parameters, restore_values=True):
        self.restore_values = restore_values
        self._parameters = parameters
        self.values = [_.value for _ in parameters]
        self.frozen = [_.frozen for _ in parameters]

    def __enter__(self):
        pass

    def __exit__(self, type, value, traceback):
        for value, par, frozen in zip(self.values, self._parameters, self.frozen):
            if self.restore_values:
                par.value = value
            par.frozen = frozen


class PriorParameter(Parameter):
    def __init__(
        self,
        name,
        value,
        unit="",
        scale=1,
        min=np.nan,
        max=np.nan,
        error=0,
    ):
        if not isinstance(name, str):
            raise TypeError(f"Name must be string, got '{type(name)}' instead")

        self._name = name
        self.scale = scale
        self.min = min
        self.max = max
        self._error = error

        if isinstance(value, u.Quantity) or isinstance(value, str):
            val = u.Quantity(value)
            self.value = val.value
            self.unit = val.unit
        else:
            self.factor = value
            self.unit = unit

        self._type = "prior"

    def to_dict(self):
        """Convert to dictionary."""
        output = {
            "name": self.name,
            "value": self.value,
            "unit": self.unit.to_string("fits"),
            "error": self.error,
            "min": self.min,
            "max": self.max,
        }
        return output

    def __repr__(self):
        return (
            f"{self.__class__.__name__}(name={self.name!r}, value={self.value!r}, "
            f"factor={self.factor!r}, scale={self.scale!r}, unit={self.unit!r}, "
            f"min={self.min!r}, max={self.max!r})"
        )


class PriorParameters(Parameters):
    def __init__(self, parameters=None):
        if parameters is None:
            parameters = []
        else:
            parameters = list(parameters)

        self._parameters = parameters

    def to_table(self):
        """Convert parameter attributes to `~astropy.table.Table`."""
        rows = []
        for p in self._parameters:
            d = p.to_dict()
            rows.append({**dict(type=p.type), **d})
        table = Table(rows)

        table["value"].format = ".4e"
        for name in ["error", "min", "max"]:
            table[name].format = ".3e"

        return table

    @classmethod
    def from_dict(cls, data):
        parameters = []

        for par in data:
            parameter = PriorParameter(**par)
            parameters.append(parameter)

        return cls(parameters=parameters)
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/gammapy/modeling/scipy.py0000644000175100001770000001206714721316200017351 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst
import numpy as np
import scipy.optimize
from gammapy.utils.interpolation import interpolate_profile
from gammapy.utils.roots import find_roots
from .likelihood import Likelihood

__all__ = [
    "confidence_scipy",
    "covariance_scipy",
    "optimize_scipy",
    "stat_profile_ul_scipy",
]


def optimize_scipy(parameters, function, store_trace=False, **kwargs):
    method = kwargs.pop("method", "Nelder-Mead")
    pars = [par.factor for par in parameters.free_parameters]

    bounds = []
    for par in parameters.free_parameters:
        parmin = par.factor_min if not np.isnan(par.factor_min) else None
        parmax = par.factor_max if not np.isnan(par.factor_max) else None
        bounds.append((parmin, parmax))

    likelihood = Likelihood(function, parameters, store_trace)
    result = scipy.optimize.minimize(
        likelihood.fcn, pars, bounds=bounds, method=method, **kwargs
    )
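    # Repackage the scipy result into the (factors, info, optimizer) tuple
    # expected by Gammapy's fit backend interface.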
factors = result.x info = { "success": result.success, "message": result.message, "nfev": result.nfev, "trace": likelihood.trace, } optimizer = None return factors, info, optimizer class TSDifference: """Fit statistic function wrapper to compute TS differences.""" def __init__(self, function, parameters, parameter, reoptimize, ts_diff): self.stat_null = function() self.parameters = parameters self.function = function self.parameter = parameter self.ts_diff = ts_diff self.reoptimize = reoptimize def fcn(self, factor): self.parameter.factor = factor if self.reoptimize: self.parameter.frozen = True optimize_scipy(self.parameters, self.function, method="L-BFGS-B") value = self.function() - self.stat_null - self.ts_diff return value def _confidence_scipy_brentq( parameters, parameter, function, sigma, reoptimize, upper=True, **kwargs ): ts_diff = TSDifference( function, parameters, parameter, reoptimize, ts_diff=sigma**2 ) lower_bound = parameter.factor if upper: upper_bound = parameter.conf_max / parameter.scale else: upper_bound = parameter.conf_min / parameter.scale message, success = "Confidence terminated successfully.", True kwargs.setdefault("nbin", 1) roots, res = find_roots( ts_diff.fcn, lower_bound=lower_bound, upper_bound=upper_bound, **kwargs ) result = (roots[0], res[0]) if np.isnan(roots[0]): message = ( "Confidence estimation failed. Try to set the parameter.min/max by hand." ) success = False suffix = "errp" if upper else "errn" return { "nfev_" + suffix: result[1].iterations, suffix: np.abs(result[0] - lower_bound), "success_" + suffix: success, "message_" + suffix: message, "stat_null": ts_diff.stat_null, } def confidence_scipy(parameters, parameter, function, sigma, reoptimize=True, **kwargs): if len(parameters.free_parameters) <= 1: reoptimize = False with parameters.restore_status(): result = _confidence_scipy_brentq( parameters=parameters, parameter=parameter, function=function, sigma=sigma, upper=False, reoptimize=reoptimize, **kwargs, ) with parameters.restore_status(): result_errp = _confidence_scipy_brentq( parameters=parameters, parameter=parameter, function=function, sigma=sigma, upper=True, reoptimize=reoptimize, **kwargs, ) result.update(result_errp) return result # TODO: implement, e.g. with numdifftools.Hessian def covariance_scipy(parameters, function): raise NotImplementedError def stat_profile_ul_scipy( value_scan, stat_scan, delta_ts=4, interp_scale="sqrt", **kwargs ): """Compute upper limit of a parameter from a likelihood profile. Parameters ---------- value_scan : `~numpy.ndarray` Array of parameter values. stat_scan : `~numpy.ndarray` Array of delta fit statistic values, with respect to the minimum. delta_ts : float, optional Difference in test statistics for the upper limit. Default is 4. interp_scale : {"sqrt", "lin"}, optional Interpolation scale applied to the fit statistic profile. If the profile is of parabolic shape, a "sqrt" scaling is recommended. In other cases or for fine sampled profiles a "lin" can also be used. Default is "sqrt". **kwargs : dict Keyword arguments passed to `~scipy.optimize.brentq`. Returns ------- ul : float Upper limit value. 
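    Examples
    --------
    A minimal illustration on an idealised parabolic profile (the values below
    are made up for demonstration; the root is found where the profile rises
    ``delta_ts`` above its minimum, here at about 4):

    >>> import numpy as np
    >>> value_scan = np.linspace(0, 10, 21)
    >>> stat_scan = (value_scan - 2) ** 2
    >>> stat_profile_ul_scipy(value_scan, stat_scan, delta_ts=4)  # doctest: +SKIP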
""" interp = interpolate_profile(value_scan, stat_scan, interp_scale=interp_scale) def f(x): return interp((x,)) - delta_ts idx = np.argmin(stat_scan) norm_best_fit = value_scan[idx] roots, res = find_roots( f, lower_bound=norm_best_fit, upper_bound=value_scan[-1], nbin=1, **kwargs ) return roots[0] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/selection.py0000644000175100001770000002560314721316200020207 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from gammapy.modeling import Fit, Parameter, Covariance from gammapy.stats.utils import sigma_to_ts from .fit import FitResult, OptimizeResult __all__ = ["select_nested_models"] class TestStatisticNested: """Compute the test statistic (TS) between two nested hypothesis. The null hypothesis is the minimal one, for which a set of parameters are frozen to given values. The model is updated to the alternative hypothesis if there is a significant improvement (larger than the given threshold). Parameters ---------- parameters : `~gammapy.modeling.Parameters` or list of `~gammapy.modeling.Parameter` List of parameters frozen for the null hypothesis but free for the test hypothesis. null_values : list of float or `~gammapy.modeling.Parameters` Values of the parameters frozen for the null hypothesis. If a `Parameters` object or a list of `Parameters` is given the null hypothesis follows the values of these parameters, so this tests linked parameters versus unliked. n_sigma : float Threshold in number of sigma to switch from the null hypothesis to the alternative one. Default is 2. The TS is converted to sigma assuming that the Wilk's theorem is verified. n_free_parameters : int Number of free parameters to consider between the two hypothesis in order to estimate the `ts_threshold` from the `n_sigma` threshold. Default is len(parameters). fit : `Fit` Fit instance specifying the backend and fit options. """ __test__ = False def __init__( self, parameters, null_values, n_sigma=2, n_free_parameters=None, fit=None ): self.parameters = parameters self.null_values = null_values self.n_sigma = n_sigma if n_free_parameters is None: n_free_parameters = len(parameters) self.n_free_parameters = n_free_parameters if fit is None: fit = Fit() minuit_opts = {"tol": 0.1, "strategy": 1} fit.backend = "minuit" fit.optimize_opts = minuit_opts self.fit = fit @property def ts_threshold(self): """Threshold value in TS corresponding to `n_sigma`. This assumes that the TS follows a chi squared distribution with a number of degree of freedom equal to `n_free_parameters`. """ return np.sign(self.n_sigma) * sigma_to_ts(self.n_sigma, self.n_free_parameters) def ts_known_bkg(self, datasets): """Perform the alternative hypothesis testing assuming known background (all parameters frozen). This implicitly assumes that the non-null model is a good representation of the true model. If the assumption is true the ts_known_bkg should tend to the ts_asimov (deviation would indicate a bad fit of the data). Deviations between ts and frozen_ts can be used to identify potential sources of confusion depending on which parameters are let free for the ts computation (for example considereing diffuse background or nearby source). 
""" stat = datasets.stat_sum() cache = self._apply_null_hypothesis(datasets) stat_null = datasets.stat_sum() self._restore_status(datasets, cache) return stat_null - stat def ts_asimov(self, datasets): """Perform the alternative hypothesis testing in the Asimov dataset. The Asimov dataset is defined by counts=npred such as the non-null model is the true model. """ counts_cache = [d.counts for d in datasets] for d in datasets: d.counts = d.npred() ts = self.ts_known_bkg(datasets) for kd, d in enumerate(datasets): d.counts = counts_cache[kd] return ts def ts(self, datasets): """Perform the alternative hypothesis testing.""" return self.run(datasets, apply_selection=False)["ts"] def run(self, datasets, apply_selection=True): """Perform the alternative hypothesis testing and apply model selection. Parameters ---------- datasets : `~gammapy.datasets.Datasets` Datasets. apply_selection : bool Apply or not the model selection. Default is True. Returns ------- result : dict Dictionary with the TS of the best fit value compared to the null hypothesis and fit results for the two hypotheses. Entries are: * "ts" : fit statistic difference with null hypothesis * "fit_results" : results for the best fit * "fit_results_null" : fit results for the null hypothesis """ for p in self.parameters: p.frozen = False fit_results = self.fit.run(datasets) stat = datasets.stat_sum() cache = self._apply_null_hypothesis(datasets) if len(datasets.models.parameters.free_parameters) > 0: fit_results_null = self.fit.run(datasets) else: fit_results_null = FitResult( OptimizeResult( models=datasets.models.copy(), nfev=0, total_stat=datasets.stat_sum(), trace=None, backend=None, method=None, success=None, message=None, ) ) stat_null = datasets.stat_sum() ts = stat_null - stat if not apply_selection or ts > self.ts_threshold: # restore default model if preferred against null hypothesis or if selection is ignored self._restore_status(datasets, cache) return dict( ts=ts, fit_results=fit_results, fit_results_null=fit_results_null, ) def _apply_null_hypothesis(self, datasets): cache = dict() cache["object"] = [p.__dict__ for p in datasets.models.parameters] cache["values"] = [p.value for p in datasets.models.parameters] cache["error"] = [p.error for p in datasets.models.parameters] for p, val in zip(self.parameters, self.null_values): if isinstance(val, Parameter): p.__dict__ = val.__dict__ else: p.value = val p.frozen = True cache["covar"] = Covariance( datasets.models.parameters, datasets.models.covariance.data ) return cache def _restore_status(self, datasets, cache): """Restore parameters to given cached objects and values""" for p in self.parameters: p.frozen = False for kp, p in enumerate(datasets.models.parameters): p.__dict__ = cache["object"][kp] p.value = cache["values"][kp] p.error = cache["error"][kp] datasets._covariance = cache["covar"] def select_nested_models( datasets, parameters, null_values, n_sigma=2, n_free_parameters=None, fit=None ): """Compute the test statistic (TS) between two nested hypothesis. The null hypothesis is the minimal one, for which a set of parameters are frozen to given values. The model is updated to the alternative hypothesis if there is a significant improvement (larger than the given threshold). Parameters ---------- datasets : `~gammapy.datasets.Datasets` Datasets. parameters : `~gammapy.modeling.Parameters` or list of `~gammapy.modeling.Parameter` List of parameters frozen for the null hypothesis but free for the test hypothesis. 
    null_values : list of float or `~gammapy.modeling.Parameters`
        Values of the parameters frozen for the null hypothesis.
        If a `Parameters` object or a list of `Parameters` is given,
        the null hypothesis follows the values of these parameters,
        so this tests linked parameters versus unlinked.
    n_sigma : float, optional
        Threshold in number of sigma to switch from the null hypothesis
        to the alternative one. Default is 2.
        The TS is converted to sigma assuming that Wilks' theorem holds.
    n_free_parameters : int, optional
        Number of free parameters to consider between the two hypotheses
        in order to estimate the `ts_threshold` from the `n_sigma` threshold.
        Default is ``len(parameters)``.
    fit : `Fit`, optional
        Fit instance specifying the backend and fit options.
        Default is None, which utilises the "minuit" backend with tol=0.1 and strategy=1.

    Returns
    -------
    result : dict
        Dictionary with the TS of the best fit value compared to the null hypothesis
        and fit results for the two hypotheses. Entries are:

            * "ts" : fit statistic difference with null hypothesis
            * "fit_results" : results for the best fit
            * "fit_results_null" : fit results for the null hypothesis

    Examples
    --------
    .. testcode::

        from gammapy.modeling.selection import select_nested_models
        from gammapy.datasets import Datasets, SpectrumDatasetOnOff
        from gammapy.modeling.models import SkyModel

        # Test if the cutoff is significant
        dataset = SpectrumDatasetOnOff.read("$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits")
        datasets = Datasets(dataset)
        model = SkyModel.create(spectral_model="ecpl", spatial_model="point", name='hess')
        datasets.models = model
        result = select_nested_models(datasets,
                                      parameters=[model.spectral_model.lambda_],
                                      null_values=[0],
                                      )

        # Test if the source is significant
        filename = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_datasets.yaml"
        filename_models = "$GAMMAPY_DATA/fermi-3fhl-crab/Fermi-LAT-3FHL_models.yaml"
        fermi_datasets = Datasets.read(filename=filename, filename_models=filename_models)
        model = fermi_datasets.models["Crab Nebula"]
        # Number of parameters previously fit for the source of interest
        n_free_parameters = len(model.parameters.free_parameters)
        # Freeze spatial parameters to ensure another weaker source does not move from its position
        # to replace the source of interest during the null hypothesis test.
        # (With all parameters free you test N versus N+1 models, and not the detection of a specific source.)
fermi_datasets.models.freeze(model_type='spatial') results = select_nested_models(fermi_datasets, parameters=[model.spectral_model.amplitude], null_values=[0], n_free_parameters=n_free_parameters, n_sigma=4, ) """ test = TestStatisticNested(parameters, null_values, n_sigma, n_free_parameters, fit) return test.run(datasets) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/sherpa.py0000644000175100001770000000420614721316200017500 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from .likelihood import Likelihood __all__ = ["optimize_sherpa", "covariance_sherpa"] def get_sherpa_optimizer(name): from sherpa.optmethods import GridSearch, LevMar, MonCar, NelderMead return { "levmar": LevMar, "simplex": NelderMead, "moncar": MonCar, "gridsearch": GridSearch, }[name]() class SherpaLikelihood(Likelihood): """Likelihood function interface for Sherpa.""" def fcn(self, factors): self.parameters.set_parameter_factors(factors) total_stat = self.function() if self.store_trace: self.store_trace_iteration(total_stat) return total_stat, 0 def optimize_sherpa(parameters, function, store_trace=False, **kwargs): """Sherpa optimization wrapper method. Parameters ---------- parameters : `~gammapy.modeling.Parameters` Parameter list with starting values. function : callable Likelihood function. **kwargs : dict Options passed to the optimizer instance. Returns ------- result : (factors, info, optimizer) Tuple containing the best fit factors, some information and the optimizer instance. """ method = kwargs.pop("method", "simplex") optimizer = get_sherpa_optimizer(method) optimizer.config.update(kwargs) pars = [par.factor for par in parameters.free_parameters] parmins = np.nan_to_num( [par.factor_min for par in parameters.free_parameters], nan=-np.inf ) parmaxes = np.nan_to_num( [par.factor_max for par in parameters.free_parameters], nan=np.inf ) statfunc = SherpaLikelihood(function, parameters, store_trace) with np.errstate(invalid="ignore"): result = optimizer.fit( statfunc=statfunc.fcn, pars=pars, parmins=parmins, parmaxes=parmaxes ) factors = result[1] info = { "success": result[0], "message": result[3], "nfev": result[4]["nfev"], "trace": statfunc.trace, } return factors, info, optimizer def covariance_sherpa(): raise NotImplementedError ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.236642 gammapy-1.3/gammapy/modeling/tests/0000755000175100001770000000000014721316215017012 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/tests/__init__.py0000644000175100001770000000010014721316200021104 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/tests/test_covariance.py0000644000175100001770000000355614721316200022540 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Unit tests for the Covariance class""" import pytest import numpy as np from numpy.testing import assert_allclose from gammapy.modeling import Covariance, Parameter, Parameters from gammapy.utils.testing import mpl_plot_check @pytest.fixture def covariance_diagonal(): x = Parameter("x", 1, error=0.1) y = Parameter("y", 2, error=0.2) z = Parameter("z", 3, error=0.3) parameters = Parameters([x, y, z]) 
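    # With no explicit data, Covariance defaults to a diagonal matrix built
    # from the parameter errors, i.e. diag(0.1**2, 0.2**2, 0.3**2) here.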
return Covariance(parameters=parameters) @pytest.fixture def covariance(covariance_diagonal): x = covariance_diagonal.parameters["x"] y = covariance_diagonal.parameters["y"] parameters = Parameters([x, y]) data = np.ones((2, 2)) return Covariance(parameters=parameters, data=data) def test_str(covariance_diagonal): assert "0.01" in str(covariance_diagonal) def test_shape(covariance_diagonal): assert_allclose(covariance_diagonal.shape, (3, 3)) def test_set_data(covariance_diagonal): data = np.ones((2, 2)) with pytest.raises(ValueError): covariance_diagonal.data = data def test_set_subcovariance(covariance_diagonal, covariance): covariance_diagonal.set_subcovariance(covariance) assert_allclose(covariance_diagonal.data[:2, :2], np.ones((2, 2))) def test_get_subcovariance(covariance_diagonal, covariance): covar = covariance_diagonal.get_subcovariance(covariance.parameters) assert_allclose(np.diag(covar), [0.1**2, 0.2**2]) def test_scipy_mvn(covariance): # Adapt test because scipy 1.9.3 does not work with semi positive matrices of rank one mvn = Covariance(covariance.parameters, np.array([[1, 0.5], [0.5, 1]])).scipy_mvn value = mvn.pdf(2) assert_allclose(value, 0.094354, rtol=1e-3) def test_plot_correlation(covariance_diagonal): with mpl_plot_check(): covariance_diagonal.plot_correlation() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/tests/test_fit.py0000644000175100001770000002731314721316200021205 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Unit tests for the Fit class""" import pytest from numpy.testing import assert_allclose from astropy.table import Table from gammapy.datasets import Dataset, Datasets, SpectrumDatasetOnOff from gammapy.modeling import Fit, Parameter from gammapy.modeling.fit import FitResult from gammapy.modeling.models import ( LogParabolaSpectralModel, ModelBase, Models, SkyModel, ) from gammapy.utils.scripts import read_yaml from gammapy.utils.testing import requires_data, requires_dependency class MyModel(ModelBase): x = Parameter("x", 2) y = Parameter("y", 3e2) z = Parameter("z", 4e-2) name = "test" datasets_names = ["test"] type = "model" class MyDataset(Dataset): tag = "MyDataset" def __init__(self, name="test"): self._name = name self._models = Models([MyModel(x=1.99, y=2.99e3, z=3.99e-2)]) self.data_shape = (1,) self.meta_table = Table() @property def models(self): return self._models def stat_sum(self): # self._model.parameters = parameters x, y, z = [p.value for p in self.models.parameters.unique_parameters] x_opt, y_opt, z_opt = 2, 3e2, 4e-2 return (x - x_opt) ** 2 + (y - y_opt) ** 2 + (z - z_opt) ** 2 def fcn(self): x, y, z = [p.value for p in self.models.parameters.unique_parameters] x_opt, y_opt, z_opt = 2, 3e5, 4e-5 x_err, y_err, z_err = 0.2, 3e4, 4e-6 return ( ((x - x_opt) / x_err) ** 2 + ((y - y_opt) / y_err) ** 2 + ((z - z_opt) / z_err) ** 2 ) def stat_array(self): """Statistic array, one value per data point.""" @requires_dependency("sherpa") @pytest.mark.parametrize("backend", ["sherpa", "scipy"]) def test_optimize_backend_and_covariance(backend): dataset = MyDataset() if backend == "scipy": kwargs = {"method": "L-BFGS-B"} else: kwargs = {} kwargs["backend"] = backend fit = Fit(optimize_opts=kwargs) result = fit.run([dataset]) assert result is not None pars = dataset.models.parameters assert_allclose(pars["x"].value, 2, rtol=1e-3) assert_allclose(pars["y"].value, 3e2, rtol=1e-3) assert_allclose(pars["z"].value, 4e-2, rtol=1e-2) 
assert_allclose(pars["x"].error, 1, rtol=1e-7) assert_allclose(pars["y"].error, 1, rtol=1e-7) assert_allclose(pars["z"].error, 1, rtol=1e-7) correlation = dataset.models.covariance.correlation assert_allclose(correlation[0, 1], 0, atol=1e-7) assert_allclose(correlation[0, 2], 0, atol=1e-7) assert_allclose(correlation[1, 2], 0, atol=1e-7) @pytest.mark.parametrize("backend", ["minuit"]) def test_run(backend): dataset = MyDataset() fit = Fit(backend=backend) result = fit.run([dataset]) pars = dataset.models.parameters assert fit._minuit is not None assert result.success assert result.optimize_result.method == "migrad" assert result.covariance_result.method == "hesse" assert result.covariance_result.success assert_allclose(pars["x"].value, 2, rtol=1e-3) assert_allclose(pars["y"].value, 3e2, rtol=1e-3) assert_allclose(pars["z"].value, 4e-2, rtol=1e-3) assert_allclose(pars["x"].error, 1, rtol=1e-7) assert_allclose(pars["y"].error, 1, rtol=1e-7) assert_allclose(pars["z"].error, 1, rtol=1e-7) correlation = dataset.models.covariance.correlation assert_allclose(correlation[0, 1], 0, atol=1e-7) assert_allclose(correlation[0, 2], 0, atol=1e-7) assert_allclose(correlation[1, 2], 0, atol=1e-7) # Verify that the fit result models are independent of the dataset ones pars["x"].value = 3 assert_allclose(result.parameters["x"].value, 2, rtol=1e-3) # check parameters from the result object pars = result.parameters assert_allclose(pars["x"].error, 1, rtol=1e-7) assert_allclose(pars["y"].error, 1, rtol=1e-7) assert_allclose(pars["z"].error, 1, rtol=1e-7) def test_run_no_free_parameters(): dataset = MyDataset() for par in dataset.models.parameters.free_parameters: par.frozen = True fit = Fit() with pytest.raises(ValueError, match="No free parameters for fitting"): fit.run(dataset) @pytest.mark.parametrize("backend", ["minuit"]) def test_run_linked(backend): dataset = MyDataset() model1 = MyModel(x=1.99, y=2.99e3, z=3.99e-2) model1.name = "model1" model2 = MyModel(x=0.99, y=0.99e3, z=3.99e-2) model2.name = "model2" model2.x = model1.x model2.y = model1.y model2.z = model1.z dataset._models = Models([model1, model2]) fit = Fit(backend=backend) fit.run([dataset]) assert len(dataset.models.parameters.unique_parameters) == 3 assert dataset.models.covariance.shape == (6, 6) expected = [ [1.00000000e00, 1.69728073e-30, 4.76456033e-16], [1.69728073e-30, 1.00000000e00, 3.56230294e-15], [4.76456033e-16, 3.56230294e-15, 1.00000000e00], ] assert_allclose(dataset.models[0].covariance.data, expected) assert_allclose(dataset.models[1].covariance.data, expected) @requires_dependency("sherpa") @pytest.mark.parametrize("backend", ["minuit", "sherpa", "scipy"]) def test_optimize(backend): dataset = MyDataset() if backend == "scipy": kwargs = {"method": "L-BFGS-B"} else: kwargs = {} fit = Fit(store_trace=True, backend=backend, optimize_opts=kwargs) result = fit.optimize([dataset]) pars = dataset.models.parameters assert result.success assert_allclose(result.total_stat, 0, atol=1) assert_allclose(pars["x"].value, 2, rtol=1e-3) assert_allclose(pars["y"].value, 3e2, rtol=1e-3) assert_allclose(pars["z"].value, 4e-2, rtol=1e-2) assert len(result.trace) == result.nfev @pytest.mark.parametrize("backend", ["minuit"]) def test_confidence(backend): dataset = MyDataset() fit = Fit(backend=backend) fit.optimize([dataset]) result = fit.confidence(datasets=[dataset], parameter="x") assert result["success"] assert_allclose(result["errp"], 1) assert_allclose(result["errn"], 1) # Check that original value state wasn't changed 
assert_allclose(dataset.models.parameters["x"].value, 2) @pytest.mark.parametrize("backend", ["minuit"]) def test_confidence_frozen(backend): dataset = MyDataset() dataset.models.parameters["x"].frozen = True fit = Fit(backend=backend) fit.optimize([dataset]) result = fit.confidence(datasets=[dataset], parameter="y") assert result["success"] assert_allclose(result["errp"], 1) assert_allclose(result["errn"], 1) def test_stat_profile(): dataset = MyDataset() fit = Fit() fit.run([dataset]) dataset.models.parameters["x"].scan_n_values = 3 result = fit.stat_profile(datasets=[dataset], parameter="x") assert_allclose(result["test.x_scan"], [0, 2, 4], atol=1e-7) assert_allclose(result["stat_scan"], [4, 0, 4], atol=1e-7) assert len(result["fit_results"]) == 0 # Check that original value state wasn't changed assert_allclose(dataset.models.parameters["x"].value, 2) def test_stat_profile_reoptimize(): dataset = MyDataset() fit = Fit() fit.run([dataset]) dataset.models.parameters["y"].value = 0 dataset.models.parameters["x"].scan_n_values = 3 result = fit.stat_profile(datasets=[dataset], parameter="x", reoptimize=True) assert_allclose(result["test.x_scan"], [0, 2, 4], atol=1e-7) assert_allclose(result["stat_scan"], [4, 0, 4], atol=1e-7) assert_allclose( result["fit_results"][0].total_stat, result["stat_scan"][0], atol=1e-7 ) def test_stat_surface(): dataset = MyDataset() fit = Fit() fit.run([dataset]) x_values = [1, 2, 3] y_values = [2e2, 3e2, 4e2] dataset.models.parameters["x"].scan_values = x_values dataset.models.parameters["y"].scan_values = y_values result = fit.stat_surface(datasets=[dataset], x="x", y="y") assert_allclose(result["test.x_scan"], x_values, atol=1e-7) assert_allclose(result["test.y_scan"], y_values, atol=1e-7) expected_stat = [ [1.0001e04, 1.0000e00, 1.0001e04], [1.0000e04, 0.0000e00, 1.0000e04], [1.0001e04, 1.0000e00, 1.0001e04], ] assert_allclose(list(result["stat_scan"]), expected_stat, atol=1e-7) assert len(result["fit_results"]) == 0 # Check that original value state wasn't changed assert_allclose(dataset.models.parameters["x"].value, 2) assert_allclose(dataset.models.parameters["y"].value, 3e2) def test_stat_surface_reoptimize(): dataset = MyDataset() fit = Fit() fit.run([dataset]) x_values = [1, 2, 3] y_values = [2e2, 3e2, 4e2] dataset.models.parameters["z"].value = 0 dataset.models.parameters["x"].scan_values = x_values dataset.models.parameters["y"].scan_values = y_values result = fit.stat_surface(datasets=[dataset], x="x", y="y", reoptimize=True) assert_allclose(result["test.x_scan"], x_values, atol=1e-7) assert_allclose(result["test.y_scan"], y_values, atol=1e-7) expected_stat = [ [1.0001e04, 1.0000e00, 1.0001e04], [1.0000e04, 0.0000e00, 1.0000e04], [1.0001e04, 1.0000e00, 1.0001e04], ] assert_allclose(list(result["stat_scan"]), expected_stat, atol=1e-7) assert_allclose( result["fit_results"][0][0].total_stat, result["stat_scan"][0][0], atol=1e-7 ) def test_stat_contour(): dataset = MyDataset() dataset.models.parameters["x"].frozen = True fit = Fit(backend="minuit") fit.optimize([dataset]) result = fit.stat_contour(datasets=[dataset], x="y", y="z") assert result["success"] x = result["test.y"] assert len(x) in [10, 11] # Behavior changed after iminuit>=2.13 assert_allclose(x[0], 299, rtol=1e-5) assert_allclose(x[9], 299.133975, rtol=1e-5) y = result["test.z"] assert len(x) == len(y) assert len(y) in [10, 11] assert_allclose(y[0], 0.04, rtol=1e-5) assert_allclose(y[9], 0.54, rtol=1e-5) # Check that original value state wasn't changed 
assert_allclose(dataset.models.parameters["y"].value, 300) @requires_data() def test_write(tmpdir): datasets = Datasets() for obs_id in [23523, 23526]: dataset = SpectrumDatasetOnOff.read( f"$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs{obs_id}.fits" ) datasets.append(dataset) datasets = datasets.stack_reduce(name="HESS") model = SkyModel(spectral_model=LogParabolaSpectralModel(), name="crab") datasets.models = model fit = Fit() result = fit.run(datasets) result_dict = result.covariance_result.to_dict() assert ( result_dict["CovarianceResult"]["backend"] == result.covariance_result.backend ) result_dict = result.optimize_result.to_dict() assert result_dict["OptimizeResult"]["nfev"] == result.optimize_result.nfev assert ( result_dict["OptimizeResult"]["total_stat"] == result.optimize_result.total_stat ) filename = tmpdir / "test-fit-result.yaml" result.write(filename) data = read_yaml(filename) assert "CovarianceResult" in data assert "OptimizeResult" in data optimize_result = fit.optimize(datasets) result = FitResult(optimize_result=optimize_result) result.write(filename, overwrite=True) data = read_yaml(filename) assert "CovarianceResult" not in data assert "OptimizeResult" in data @requires_data() def test_covariance_no_optimize_results(): spec = SpectrumDatasetOnOff.read( "$GAMMAPY_DATA/joint-crab/spectra/hess/pha_obs23523.fits" ) spec.models = [SkyModel.create(spectral_model="pl")] fit = Fit() fit.optimize([spec]) res = fit.covariance([spec]) assert_allclose(res.matrix.data[0, 1], 6.163970e-13, rtol=1e-3) assert_allclose(res.matrix.data[0, 0], 2.239832e-02, rtol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/tests/test_iminuit.py0000644000175100001770000000772314721316200022104 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose from gammapy.modeling import Parameter from gammapy.modeling.iminuit import confidence_iminuit, optimize_iminuit from gammapy.modeling.models import ModelBase, Models pytest.importorskip("iminuit") class MyModel(ModelBase): x = Parameter("x", 2.1, error=0.2) y = Parameter("y", 3.1e5, scale=1e5, error=3e4) z = Parameter("z", 4.1e-5, scale=1e-5, error=4e-6) name = "test" datasets_names = ["test"] class MyDataset: def __init__(self, name="test"): self.name = name self.models = Models(MyModel()) def fcn(self): x, y, z = [p.value for p in self.models.parameters] x_opt, y_opt, z_opt = 2, 3e5, 4e-5 x_err, y_err, z_err = 0.2, 3e4, 4e-6 return ( ((x - x_opt) / x_err) ** 2 + ((y - y_opt) / y_err) ** 2 + ((z - z_opt) / z_err) ** 2 ) def test_iminuit_basic(): ds = MyDataset() pars = ds.models.parameters factors, info, minuit = optimize_iminuit(function=ds.fcn, parameters=pars) assert info["success"] assert_allclose(ds.fcn(), 0, atol=1e-5) # Check the result in parameters is OK assert_allclose(pars["x"].value, 2, rtol=1e-3) assert_allclose(pars["y"].value, 3e5, rtol=1e-3) # Precision of estimate on "z" is very poor (0.040488). Why is it so bad? 
assert_allclose(pars["z"].value, 4e-5, rtol=2e-2) # Check that minuit sees the parameter factors correctly assert_allclose(factors, [2, 3, 4], rtol=1e-3) assert_allclose(minuit.values["par_000_x"], 2, rtol=1e-3) assert_allclose(minuit.values["par_001_y"], 3, rtol=1e-3) assert_allclose(minuit.values["par_002_z"], 4, rtol=1e-3) def test_iminuit_stepsize(): ds = MyDataset() pars = ds.models.parameters factors, info, minuit = optimize_iminuit(function=ds.fcn, parameters=pars) assert info["success"] assert_allclose(ds.fcn(), 0, atol=1e-5) assert_allclose(pars["x"].value, 2, rtol=1e-3) assert_allclose(pars["y"].value, 3e5, rtol=1e-3) assert_allclose(pars["z"].value, 4e-5, rtol=2e-2) def test_iminuit_frozen(): ds = MyDataset() pars = ds.models.parameters pars["y"].frozen = True factors, info, minuit = optimize_iminuit(function=ds.fcn, parameters=pars) assert info["success"] assert_allclose(pars["x"].value, 2, rtol=1e-4) assert_allclose(pars["y"].value, 3.1e5) assert_allclose(pars["z"].value, 4.0e-5, rtol=1e-4) assert_allclose(ds.fcn(), 0.111112, rtol=1e-5) def test_iminuit_limits(): ds = MyDataset() pars = ds.models.parameters pars["y"].min = 301000 factors, info, minuit = optimize_iminuit(function=ds.fcn, parameters=pars) assert info["success"] # Check the result in parameters is OK assert_allclose(pars["x"].value, 2, rtol=1e-2) assert_allclose(pars["y"].value, 301000, rtol=1e-3) # Check that minuit sees the limit factors correctly params = minuit.init_params assert not params["x"].has_limits assert params["par_001_y"].has_limits assert_allclose(params["par_001_y"].lower_limit, 3.01) assert params["par_001_y"].upper_limit is None def test_opts(): ds = MyDataset() pars = ds.models.parameters factors, info, minuit = optimize_iminuit( function=ds.fcn, parameters=pars, migrad_opts={"ncall": 20}, tol=1.0, strategy=2 ) assert info["nfev"] == 29 assert minuit.tol == 1.0 assert minuit.strategy == 2 def test_iminuit_confidence(): ds = MyDataset() pars = ds.models.parameters factors, info, minuit = optimize_iminuit(function=ds.fcn, parameters=pars) assert_allclose(ds.fcn(), 0, atol=1e-5) par = pars["x"] par.min, par.max = 0, 10 result = confidence_iminuit( function=ds.fcn, parameters=pars, parameter=par, sigma=1, reoptimize=True ) assert result["success"] assert_allclose(result["errp"], 0.2) assert_allclose(result["errn"], 0.2) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/tests/test_parameter.py0000644000175100001770000002270714721316200022405 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy.table import Table from gammapy.modeling import Parameter, Parameters, PriorParameter, PriorParameters from gammapy.modeling.models import GaussianPrior def test_parameter_init(): par = Parameter("spam", 42, "deg") assert par.name == "spam" assert par.factor == 42 assert isinstance(par.factor, float) assert par.scale == 1 assert isinstance(par.scale, float) assert par.value == 42 assert isinstance(par.value, float) assert par.unit == "deg" assert par.min is np.nan assert par.max is np.nan assert not par.frozen par = Parameter("spam", "42 deg") assert par.factor == 42 assert par.scale == 1 assert par.unit == "deg" with pytest.raises(TypeError): Parameter(1, 2) def test_parameter_outside_limit(caplog): par = Parameter("spam", 50, min=0, max=40) par.check_limits() assert "WARNING" in [_.levelname for _ in caplog.records] 
message1 = "Value 50.0 is outside bounds [0.0, 40.0] for parameter 'spam'" assert message1 in [_.message for _ in caplog.records] def test_parameter_scale(): # Basic check how scale is used for value, min, max par = Parameter("spam", 420, "deg", 10, 400, 500) assert par.value == 420 assert par.min == 400 assert_allclose(par.factor_min, 40) assert par.max == 500 assert_allclose(par.factor_max, 50) par.value = 70 assert par.scale == 10 assert_allclose(par.factor, 7) def test_parameter_quantity(): par = Parameter("spam", 420, "deg", 10) quantity = par.quantity assert quantity.unit == "deg" assert quantity.value == 420 par.quantity = "70 deg" assert_allclose(par.factor, 7) assert par.scale == 10 assert par.unit == "deg" def test_parameter_repr(): par = Parameter("spam", 42, "deg") assert repr(par).startswith("Parameter(name=") def test_parameter_to_dict(): par = Parameter("spam", 42, "deg") d = par.to_dict() assert isinstance(d["unit"], str) @pytest.mark.parametrize( "method,value,factor,scale", [ # Check method="scale10" in detail ("scale10", 2e-10, 2, 1e-10), ("scale10", 2e10, 2, 1e10), ("scale10", -2e-10, -2, 1e-10), ("scale10", -2e10, -2, 1e10), # Check that results are OK for very large numbers # Regression test for https://github.com/gammapy/gammapy/issues/1883 ("scale10", 9e35, 9, 1e35), # Checks for the simpler method="factor1" ("factor1", 2e10, 1, 2e10), ("factor1", -2e10, 1, -2e10), ], ) def test_parameter_autoscale(method, value, factor, scale): par = Parameter("", value, scale_method=method) par.autoscale() assert_allclose(par.factor, factor) assert_allclose(par.scale, scale) assert isinstance(par.scale, float) @pytest.fixture() def pars(): return Parameters( [ Parameter("spam", 42, "deg"), Parameter("ham", 99, "TeV", prior=GaussianPrior()), ] ) def test_parameters_basics(pars): # This applies a unit transformation pars["ham"].error = "10000 GeV" pars["spam"].error = 0.1 assert_allclose(pars["spam"].error, 0.1) assert_allclose(pars[1].error, 10) def test_parameters_copy(pars): pars2 = pars.copy() assert pars is not pars2 assert pars[0] is not pars2[0] def test_parameters_from_stack(): a = Parameter("a", 1) b = Parameter("b", 2) c = Parameter("c", 3) pars = Parameters([a, b]) + Parameters([]) + Parameters([c]) assert pars.names == ["a", "b", "c"] def test_unique_parameters(): a = Parameter("a", 1) b = Parameter("b", 2) c = Parameter("c", 3) parameters = Parameters([a, b, a, c]) assert parameters.names == ["a", "b", "a", "c"] parameters_unique = parameters.unique_parameters assert parameters_unique.names == ["a", "b", "c"] def test_parameters_getitem(pars): assert pars[1].name == "ham" assert pars["ham"].name == "ham" assert pars[pars[1]].name == "ham" with pytest.raises(TypeError): pars[42.3] with pytest.raises(IndexError): pars[3] with pytest.raises(IndexError): pars["lamb"] with pytest.raises(ValueError): pars[Parameter("bam!", 99)] def test_parameters_to_table(pars, tmp_path): pars["ham"].error = 1e-10 pars["spam"]._link_label_io = "test" table = pars.to_table() assert len(table) == 2 assert len(table.columns) == 10 assert table["link"][0] == "test" assert table["link"][1] == "" assert table["prior"][0] == "" assert table["prior"][1] == "GaussianPrior" assert table["type"][1] == "" table.write(tmp_path / "test_parameters.fits") Table.read(tmp_path / "test_parameters.fits") def test_parameters_create_table(): table = Parameters._create_default_table() assert len(table) == 0 assert len(table.columns) == 10 assert table.colnames == [ "type", "name", "value", "unit", "error", "min", 
"max", "frozen", "link", "prior", ] assert table.dtype == np.dtype( [ ("type", " ts_eval.ts_threshold assert model2.spectral_model.alpha.value != model.spectral_model.alpha.value assert model2.spectral_model.alpha.error != model.spectral_model.alpha.error assert model2.spectral_model.alpha.error != 0 ts_eval = TestStatisticNested( [model.spectral_model.alpha], [model2.spectral_model.alpha], fit=fit, n_sigma=np.inf, ) results = ts_eval.run(fermi_datasets) assert results["ts"] < ts_eval.ts_threshold assert_allclose(model2.spectral_model.alpha.value, model.spectral_model.alpha.value) assert_allclose(model2.spectral_model.alpha.error, model.spectral_model.alpha.error) assert model2.spectral_model.alpha.error != 0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/modeling/tests/test_sherpa.py0000644000175100001770000000433514721316200021704 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose from gammapy.modeling import Parameter, Parameters from gammapy.modeling.sherpa import optimize_sherpa pytest.importorskip("sherpa") class MyDataset: def __init__(self, parameters): self.parameters = parameters def fcn(self): x, y, z = [p.value for p in self.parameters] x_opt, y_opt, z_opt = 2, 3e5, 4e-5 x_err, y_err, z_err = 0.2, 3e4, 4e-6 return ( ((x - x_opt) / x_err) ** 2 + ((y - y_opt) / y_err) ** 2 + ((z - z_opt) / z_err) ** 2 ) @pytest.fixture() def pars(): x = Parameter("x", 2.1) y = Parameter("y", 3.1e5, scale=1e5) z = Parameter("z", 4.1e-5, scale=1e-5) return Parameters([x, y, z]) # TODO: levmar doesn't work yet; needs array of statval as return in likelihood # method='gridsearch' would require very low tolerance asserts, not added for now @pytest.mark.parametrize("method", ["moncar", "simplex"]) def test_sherpa(method, pars): ds = MyDataset(pars) factors, info, _ = optimize_sherpa(function=ds.fcn, parameters=pars, method=method) assert info["success"] assert_allclose(ds.fcn(), 0, atol=1e-12) # Check the result in parameters is OK assert_allclose(pars["x"].value, 2) assert_allclose(pars["y"].value, 3e5) assert_allclose(pars["z"].value, 4e-5) # Check that sherpa sees the parameter factors correctly assert_allclose(factors, [2, 3, 4]) def test_sherpa_frozen(pars): ds = MyDataset(pars) pars["y"].frozen = True factors, info, _ = optimize_sherpa(function=ds.fcn, parameters=pars) assert info["success"] assert_allclose(pars["x"].value, 2) assert_allclose(pars["y"].value, 3.1e5) assert_allclose(pars["z"].value, 4.0e-5) assert_allclose(ds.fcn(), 0.11111111, rtol=1e-6) @pytest.mark.xfail def test_sherpa_limits(pars): ds = MyDataset(pars) pars["y"].min = 301000 factors, info, _ = optimize_sherpa(function=ds.fcn, parameters=pars) assert info["success"] # Check the result in parameters is OK assert_allclose(pars["x"].value, 2, rtol=1e-2) assert_allclose(pars["y"].value, 301000, rtol=1e-3) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.236642 gammapy-1.3/gammapy/scripts/0000755000175100001770000000000014721316215015541 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/__init__.py0000644000175100001770000000016014721316200017641 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Gammapy command line interface (scripts).""" ././@PaxHeader0000000000000000000000000000002600000000000010213 
xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/analysis.py0000644000175100001770000000271214721316200017732 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import click from gammapy.analysis import Analysis, AnalysisConfig log = logging.getLogger(__name__) @click.command(name="config") @click.option( "--filename", default="config.yaml", help="Filename to store the default configuration values.", show_default=True, ) @click.option( "--overwrite", default=False, is_flag=True, help="Overwrite existing file." ) def cli_make_config(filename, overwrite): """Writes default configuration file.""" config = AnalysisConfig() config.write(filename, overwrite=overwrite) log.info(f"Configuration file produced: {filename}") @click.command(name="run") @click.option( "--filename", default="config.yaml", help="Filename with default configuration values.", show_default=True, ) @click.option( "--out", default="datasets", help="Output folder where reduced datasets are stored.", show_default=True, ) @click.option( "--overwrite", default=False, is_flag=True, help="Overwrite existing datasets." ) def cli_run_analysis(filename, out, overwrite): """Performs automated data reduction process.""" config = AnalysisConfig.read(filename) config.datasets.background.method = "reflected" analysis = Analysis(config) analysis.get_observations() analysis.get_datasets() analysis.datasets.write(out, overwrite=overwrite) log.info(f"Datasets stored in {out} folder.") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/check.py0000644000175100001770000000134514721316200017165 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Command line tool to check various things about a Gammapy installation. This file is called `check` instead of `test` to prevent confusion for developers and the test runner from including it in test collection. 
""" import logging import warnings import click log = logging.getLogger(__name__) @click.group("check") def cli_check(): """Run checks for Gammapy.""" @cli_check.command("logging") def cli_check_logging(): """Check logging.""" log.debug("this is log.debug() output") log.info("this is log.info() output") log.warning("this is log.warning() output") warnings.warn("this is warnings.warn() output") print("this is stdout output") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/download.py0000644000175100001770000001253314721316200017720 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Command line tool to download datasets and notebooks.""" import logging import tarfile import zipfile from pathlib import Path from urllib.parse import urlparse from astropy.utils import lazyproperty import click from gammapy import __version__ log = logging.getLogger(__name__) BUNDLESIZE = 152 # in MB GAMMAPY_BASE_URL = "https://gammapy.org/download/" RELEASE = __version__ if "dev" in __version__: RELEASE = "dev" class DownloadIndex: """Download index.""" _notebooks_key = "notebooks" _datasets_key = "datasets" _environment_key = "conda-environment" _index_json = "index.json" def __init__(self, release=RELEASE): self.release = release @lazyproperty def index(self): """Index for a given release.""" import requests response = requests.get(GAMMAPY_BASE_URL + self._index_json) data = response.json() if self.release not in data: raise ValueError( f"Download not available for release {self.release}, " f"choose from: {list(data.keys())}" ) return data[self.release] @property def notebooks_url(self): """Notebooks URL.""" return self.index[self._notebooks_key] @property def environment_url(self): """Environment URL""" return self.index[self._environment_key] @property def datasets_url(self): """Datasets URL.""" return self.index[self._datasets_key] def progress_download(source, destination): """ Notes ----- The progress bar can be displayed for this function. 
""" import requests from tqdm import tqdm destination.parent.mkdir(parents=True, exist_ok=True) with requests.get(source, stream=True) as r: total_size = ( int(r.headers.get("content-length")) if r.headers.get("content-length") else BUNDLESIZE * 1024 * 1024 ) progress_bar = tqdm( total=total_size, unit="B", unit_scale=True, unit_divisor=1024 ) with open(destination, "wb") as f: for chunk in r.iter_content(chunk_size=1024): if chunk: f.write(chunk) progress_bar.update(len(chunk)) progress_bar.close() def members(tf): list_members = tf.getmembers() root_folder = list_members[0].name for member in list_members: if member.path.startswith(root_folder): member.path = member.path[len(root_folder) + 1 :] # noqa: E203 yield member def extract_bundle(bundle, destination): with tarfile.open(bundle) as tar: tar.extractall(path=destination, members=members(tar)) def show_info_notebooks(outfolder, release): print("") print( "*** Enter the following commands below to get started with this version of Gammapy" ) print(f"cd {outfolder}") if __version__ != release: print(f"conda env create -f gammapy-{release}-environment.yml") print(f"conda activate gammapy-{release}") print("jupyter lab") print("") def show_info_datasets(outfolder, release): print("") print("*** You might want to declare GAMMAPY_DATA as a global env variable") print(f"export GAMMAPY_DATA=$PWD/{outfolder}") print("") print("Or as part of your conda environment:") print(f"conda env config vars set GAMMAPY_DATA=$PWD/{outfolder}") print(f"conda activate gammapy-{release}") print("") @click.command(name="notebooks") @click.option( "--release", default=RELEASE, help="Number of stable release - ex: 0.18.2)", show_default=True, ) @click.option( "--out", default="gammapy-notebooks", help="Path where the versioned notebook files will be copied.", show_default=True, ) def cli_download_notebooks(release, out): """Download notebooks.""" index = DownloadIndex(release=release) path = Path(out) / index.release filename = path / f"gammapy-{index.release}-environment.yml" progress_download(index.environment_url, filename) url_path = urlparse(index.notebooks_url).path filename_destination = path / Path(url_path).name progress_download(index.notebooks_url, filename_destination) if "zip" in index.notebooks_url: archive = zipfile.ZipFile(filename_destination, "r") else: archive = tarfile.open(filename_destination) with archive as ref: ref.extractall(path) # delete file filename_destination.unlink() show_info_notebooks(path, release) @click.command(name="datasets") @click.option( "--release", default=RELEASE, help="Number of stable release - ex: 0.18.2)", show_default=True, ) @click.option( "--out", default="gammapy-datasets", help="Destination folder.", show_default=True, ) def cli_download_datasets(release, out): """Download datasets.""" index = DownloadIndex(release=release) localfolder = Path(out) / index.release log.info(f"Downloading datasets from {index.datasets_url}") tar_destination_file = localfolder / "datasets.tar.gz" progress_download(index.datasets_url, tar_destination_file) log.info(f"Extracting {tar_destination_file}") extract_bundle(tar_destination_file, localfolder) Path(tar_destination_file).unlink() show_info_datasets(localfolder, release) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/info.py0000644000175100001770000000566514721316200017054 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import importlib import logging import os 
import platform import sys import warnings import click from gammapy import __version__ log = logging.getLogger(__name__) # Should be in sync with `docs/install/dependencies.rst` GAMMAPY_DEPENDENCIES = [ # required "numpy", "scipy", "astropy", "regions", "click", "yaml", # "pydantic", # has no __version__ # optional "IPython", # "jupyter", # has no __version__ "jupyterlab", "matplotlib", "pandas", "healpy", "iminuit", "sherpa", "naima", "emcee", "corner", "ray", ] GAMMAPY_ENV_VARIABLES = ["GAMMAPY_DATA"] @click.command(name="info") @click.option("--system/--no-system", default=True, help="Show system info") @click.option("--version/--no-version", default=True, help="Show version") @click.option( "--dependencies/--no-dependencies", default=True, help="Show dependencies" ) @click.option("--envvar/--no-envvar", default=True, help="Show environment variables") def cli_info(system, version, dependencies, envvar): """Display information about Gammapy.""" if system: info = get_info_system() print_info(info=info, title="System") if version: info = get_info_version() print_info(info=info, title="Gammapy package") if dependencies: info = get_info_dependencies() print_info(info=info, title="Other packages") if envvar: info = get_info_envvar() print_info(info=info, title="Gammapy environment variables") def print_info(info, title): """Print Gammapy info.""" info_all = f"\n{title}:\n\n" for key, value in info.items(): info_all += f"\t{key:22s} : {value:<10s} \n" print(info_all) def get_info_system(): """Get information about user system.""" return { "python_executable": sys.executable, "python_version": platform.python_version(), "machine": platform.machine(), "system": platform.system(), } def get_info_version(): """Get detailed info about Gammapy version.""" info = {"version": __version__} try: path = sys.modules["gammapy"].__path__[0] except Exception: path = "unknown" info["path"] = path return info def get_info_dependencies(): """Get info about Gammapy dependencies.""" info = {} for name in GAMMAPY_DEPENDENCIES: try: with warnings.catch_warnings(): warnings.simplefilter("ignore") module = importlib.import_module(name) module_version = getattr(module, "__version__", "no version info found") except ImportError: module_version = "not installed" info[name] = module_version return info def get_info_envvar(): """Get info about Gammapy environment variables.""" return {name: os.environ.get(name, "not set") for name in GAMMAPY_ENV_VARIABLES} ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/main.py0000644000175100001770000000656714721316200017047 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import warnings import click from gammapy import __version__ # We implement the --version following the example from here: # http://click.pocoo.org/5/options/#callbacks-and-eager-options def print_version(ctx, param, value): if not value or ctx.resilient_parsing: return print(f"gammapy version {__version__}") ctx.exit() # http://click.pocoo.org/5/documentation/#help-parameter-customization CONTEXT_SETTINGS = dict(help_option_names=["-h", "--help"]) # https://click.palletsprojects.com/en/5.x/python3/#unicode-literals click.disable_unicode_literals_warning = True @click.group("gammapy", context_settings=CONTEXT_SETTINGS) @click.option( "--log-level", default="info", help="Logging verbosity level.", type=click.Choice(["debug", "info", "warning", "error"]), ) @click.option("--ignore-warnings", 
is_flag=True, help="Ignore warnings?") @click.option( "--version", is_flag=True, callback=print_version, expose_value=False, is_eager=True, help="Print version and exit.", ) def cli(log_level, ignore_warnings): # noqa: D301 """Gammapy command line interface (CLI). Gammapy is a Python package for gamma-ray astronomy. Use ``--help`` to see available sub-commands, as well as the available arguments and options for each sub-command. For further information, see https://gammapy.org/ and https://docs.gammapy.org/ \b Examples -------- \b $ gammapy --help $ gammapy --version $ gammapy info --help $ gammapy info """ logging.basicConfig(level=log_level.upper()) if ignore_warnings: warnings.simplefilter("ignore") @cli.group("analysis") def cli_analysis(): """Automation of configuration driven data reduction process. \b Examples -------- \b $ gammapy analysis config $ gammapy analysis run $ gammapy analysis config --overwrite $ gammapy analysis config --filename myconfig.yaml $ gammapy analysis run --filename myconfig.yaml """ @cli.group("download", short_help="Download datasets and notebooks") @click.pass_context def cli_download(ctx): # noqa: D301 """Download notebooks and datasets. \b Download notebooks published in the Gammapy documentation as well as the related datasets needed to execute them. \b - The option `notebooks` will download the notebook files into a `gammapy-notebooks` folder created at the current working directory. \b - The option `datasets` will download the datasets used in the documentation into a `gammapy-datasets` folder created at the current working directory. \b Examples -------- \b $ gammapy download datasets --out localfolder $ gammapy download notebooks --release 0.18 --out localfolder """ def add_subcommands(): from .info import cli_info cli.add_command(cli_info) from .check import cli_check cli.add_command(cli_check) from .download import cli_download_notebooks cli_download.add_command(cli_download_notebooks) from .download import cli_download_datasets cli_download.add_command(cli_download_datasets) from .analysis import cli_make_config cli_analysis.add_command(cli_make_config) from .analysis import cli_run_analysis cli_analysis.add_command(cli_run_analysis) add_subcommands() ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.236642 gammapy-1.3/gammapy/scripts/tests/0000755000175100001770000000000014721316215016703 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/tests/__init__.py0000644000175100001770000000010014721316200020775 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/tests/test_analysis.py0000644000175100001770000000150614721316200022133 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from gammapy.scripts.main import cli from gammapy.utils.testing import requires_data, run_cli from ...analysis.tests.test_analysis import get_example_config def test_cli_analysis_config(tmp_path): path_config = tmp_path / "config.yaml" args = ["analysis", "config", f"--filename={path_config}"] run_cli(cli, args) assert path_config.exists() @requires_data() def test_cli_analysis_run(tmp_path): path_config = tmp_path / "config.yaml" config = get_example_config("1d") config.write(path_config) path_datasets = tmp_path / "datasets" args = [ "analysis", "run", 
f"--filename={path_config}", f"--out={path_datasets}", "--overwrite", ] run_cli(cli, args) assert path_datasets.exists() ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/tests/test_download.py0000644000175100001770000000333614721316200022122 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from gammapy.scripts.main import cli from gammapy.utils.testing import requires_dependency, run_cli @pytest.fixture(scope="session") def config(): return { "release": "0.20", "notebook": "overview", "envfilename": "gammapy-0.20-environment.yml", } def test_cli_download_help(): result = run_cli(cli, ["download", "--help"]) assert "Usage" in result.output @requires_dependency("requests") @requires_dependency("tqdm") @pytest.mark.remote_data def test_cli_download_notebooks_stable(tmp_path, config): args = [ "download", "notebooks", f"--out={tmp_path}", f"--release={config['release']}", ] run_cli(cli, args) assert (tmp_path / config["release"] / config["envfilename"]).exists() assert ( tmp_path / config["release"] / "tutorials" / "starting" / f"{config['notebook']}.ipynb" ).exists() @requires_dependency("requests") @requires_dependency("tqdm") @pytest.mark.remote_data def test_cli_download_notebooks_dev(tmp_path): args = [ "download", "notebooks", f"--out={tmp_path}", "--release=dev", ] run_cli(cli, args) assert (tmp_path / "dev" / "gammapy-dev-environment.yml").exists() assert (tmp_path / "dev" / "starting" / "analysis_1.ipynb").exists() @requires_dependency("requests") @requires_dependency("tqdm") @pytest.mark.remote_data def test_cli_download_datasets(tmp_path): option_out = f"--out={tmp_path}" args = ["download", "datasets", option_out] result = run_cli(cli, args) assert (tmp_path / "dev").exists() assert "GAMMAPY_DATA" in result.output ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/tests/test_info.py0000644000175100001770000000104614721316200021242 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from gammapy.scripts.main import cli from gammapy.utils.testing import run_cli def test_cli_info_help(): result = run_cli(cli, ["info", "--help"]) assert "Usage" in result.output def test_cli_info_no_args(): # No arguments should print all infos result = run_cli(cli, ["info"]) assert "System" in result.output assert "Gammapy package" in result.output assert "Other packages" in result.output assert "Gammapy environment variables" in result.output ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/scripts/tests/test_main.py0000644000175100001770000000120214721316200021225 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from gammapy import __version__ from gammapy.scripts.main import cli from gammapy.utils.testing import run_cli def test_cli_no_args(): # No arguments should print help result = run_cli(cli, []) assert "Usage" in result.output def test_cli_help(): result = run_cli(cli, ["--help"]) assert "Usage" in result.output def test_cli_version(): result = run_cli(cli, ["--version"]) assert f"gammapy version {__version__}" in result.output def test_check_logging(): result = run_cli(cli, ["check", "logging"]) assert "output" in result.output ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.240642 
gammapy-1.3/gammapy/stats/0000755000175100001770000000000014721316215015210 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/__init__.py0000644000175100001770000000170414721316200017315 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Statistics.""" from .counts_statistic import CashCountsStatistic, WStatCountsStatistic from .fit_statistics import cash, cstat, get_wstat_gof_terms, get_wstat_mu_bkg, wstat from .fit_statistics_cython import ( cash_sum_cython, f_cash_root_cython, norm_bounds_cython, ) from .variability import ( TimmerKonig_lightcurve_simulator, compute_chisq, compute_flux_doubling, compute_fpp, compute_fvar, discrete_correlation, structure_function, ) __all__ = [ "cash", "cash_sum_cython", "CashCountsStatistic", "cstat", "f_cash_root_cython", "get_wstat_gof_terms", "get_wstat_mu_bkg", "norm_bounds_cython", "wstat", "WStatCountsStatistic", "compute_fvar", "compute_fpp", "compute_flux_doubling", "compute_chisq", "structure_function", "discrete_correlation", "TimmerKonig_lightcurve_simulator", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/counts_statistic.py0000644000175100001770000003510014721316200021155 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import abc import html import numpy as np from scipy.special import lambertw from scipy.stats import chi2 from gammapy.utils.roots import find_roots from .fit_statistics import cash, wstat __all__ = ["WStatCountsStatistic", "CashCountsStatistic"] class CountsStatistic(abc.ABC): """Counts statistics base class.""" @property @abc.abstractmethod def stat_null(self): pass @property @abc.abstractmethod def stat_max(self): pass @property @abc.abstractmethod def n_sig(self): pass @property @abc.abstractmethod def n_bkg(self): pass @property @abc.abstractmethod def error(self): pass @abc.abstractmethod def _stat_fcn(self): pass @property def ts(self): """Return stat difference (TS) of measured excess versus no excess.""" # Remove (small) negative TS due to error in root finding ts = np.clip(self.stat_null - self.stat_max, 0, None) return ts @property def sqrt_ts(self): """Return statistical significance of measured excess. The sign of the excess is applied to distinguish positive and negative fluctuations. """ return np.sign(self.n_sig) * np.sqrt(self.ts) @property def p_value(self): """Return p_value of measured excess. Here the value accounts only for the positive excess significance (i.e. one-sided). """ return 0.5 * chi2.sf(self.ts, 1) def __str__(self): str_ = "\t{:32}: {{n_on:.2f}} \n".format("Total counts") str_ += "\t{:32}: {{background:.2f}}\n".format("Total background counts") str_ += "\t{:32}: {{excess:.2f}}\n".format("Total excess counts") str_ += "\t{:32}: {{significance:.2f}}\n".format("Total significance") str_ += "\t{:32}: {{p_value:.3f}}\n".format("p - value") str_ += "\t{:32}: {{n_bins:.0f}}\n".format("Total number of bins") info = self.info_dict() info["n_bins"] = np.array(self.n_on).size str_ = str_.format(**info) return str_.expandtabs(tabsize=2) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def info_dict(self): """A dictionary of the relevant quantities. Returns ------- info_dict : dict Dictionary with summary information. """ info_dict = {} info_dict["n_on"] = self.n_on info_dict["background"] = self.n_bkg info_dict["excess"] = self.n_sig info_dict["significance"] = self.sqrt_ts info_dict["p_value"] = self.p_value return info_dict def compute_errn(self, n_sigma=1.0): """Compute downward excess uncertainties. Searches the signal value for which the test statistics is n_sigma**2 away from the maximum. Parameters ---------- n_sigma : float Confidence level of the uncertainty expressed in number of sigma. Default is 1. """ errn = np.zeros_like(self.n_sig, dtype="float") min_range = self.n_sig - 2 * n_sigma * (self.error + 1) it = np.nditer(errn, flags=["multi_index"]) while not it.finished: roots, res = find_roots( self._stat_fcn, min_range[it.multi_index], self.n_sig[it.multi_index], nbin=1, args=(self.stat_max[it.multi_index] + n_sigma**2, it.multi_index), ) if np.isnan(roots[0]): errn[it.multi_index] = self.n_on[it.multi_index] else: errn[it.multi_index] = self.n_sig[it.multi_index] - roots[0] it.iternext() return errn def compute_errp(self, n_sigma=1): """Compute upward excess uncertainties. Searches the signal value for which the test statistics is n_sigma**2 away from the maximum. Parameters ---------- n_sigma : float Confidence level of the uncertainty expressed in number of sigma. Default is 1. """ errp = np.zeros_like(self.n_on, dtype="float") max_range = self.n_sig + 2 * n_sigma * (self.error + 1) it = np.nditer(errp, flags=["multi_index"]) while not it.finished: roots, res = find_roots( self._stat_fcn, self.n_sig[it.multi_index], max_range[it.multi_index], nbin=1, args=(self.stat_max[it.multi_index] + n_sigma**2, it.multi_index), ) errp[it.multi_index] = roots[0] it.iternext() return errp - self.n_sig def compute_upper_limit(self, n_sigma=3): """Compute upper limit on the signal. Searches the signal value for which the test statistics is n_sigma**2 away from the maximum or from 0 if the measured excess is negative. Parameters ---------- n_sigma : float Confidence level of the upper limit expressed in number of sigma. Default is 3. """ ul = np.zeros_like(self.n_on, dtype="float") min_range = self.n_sig max_range = min_range + 2 * n_sigma * (self.error + 1) it = np.nditer(ul, flags=["multi_index"]) while not it.finished: ts_ref = self._stat_fcn(min_range[it.multi_index], 0.0, it.multi_index) roots, res = find_roots( self._stat_fcn, min_range[it.multi_index], max_range[it.multi_index], nbin=1, args=(ts_ref + n_sigma**2, it.multi_index), ) ul[it.multi_index] = roots[0] it.iternext() return ul @abc.abstractmethod def _n_sig_matching_significance_fcn(self): pass def n_sig_matching_significance(self, significance): """Compute excess matching a given significance. This function is the inverse of `significance`. Parameters ---------- significance : float Significance. Returns ------- n_sig : `numpy.ndarray` Excess. 
""" n_sig = np.zeros_like(self.n_bkg, dtype="float") it = np.nditer(n_sig, flags=["multi_index"]) while not it.finished: lower_bound = np.sqrt(self.n_bkg[it.multi_index]) * significance # find upper bounds for secant method as in scipy eps = 1e-4 upper_bound = lower_bound * (1 + eps) upper_bound += eps if upper_bound >= 0 else -eps roots, res = find_roots( self._n_sig_matching_significance_fcn, lower_bound=lower_bound, upper_bound=upper_bound, args=(significance, it.multi_index), nbin=1, method="secant", ) n_sig[it.multi_index] = roots[0] # return NaN if fail it.iternext() return n_sig @abc.abstractmethod def sum(self, axis=None): """Return summed CountsStatistics. Parameters ---------- axis : None or int or tuple of ints, optional Axis or axes on which to perform the summation. Default, axis=None, will perform the sum over the whole array. Returns ------- stat : `~gammapy.stats.CountsStatistics` The summed stat object. """ pass class CashCountsStatistic(CountsStatistic): """Class to compute statistics for Poisson distributed variable with known background. Parameters ---------- n_on : int Measured counts. mu_bkg : float Known level of background. """ def __init__(self, n_on, mu_bkg): self.n_on = np.asanyarray(n_on) self.mu_bkg = np.asanyarray(mu_bkg) @property def n_bkg(self): """Expected background counts.""" return self.mu_bkg @property def n_sig(self): """Excess.""" return self.n_on - self.n_bkg @property def error(self): """Approximate error from the covariance matrix.""" return np.sqrt(self.n_on) @property def stat_null(self): """Stat value for null hypothesis, i.e. 0 expected signal counts.""" return cash(self.n_on, self.mu_bkg + 0) @property def stat_max(self): """Stat value for best fit hypothesis, i.e. expected signal mu = n_on - mu_bkg.""" return cash(self.n_on, self.n_on) def info_dict(self): """A dictionary of the relevant quantities. Returns ------- info_dict : dict Dictionary with summary info. 
""" info_dict = super().info_dict() info_dict["mu_bkg"] = self.mu_bkg return info_dict def __str__(self): str_ = f"{self.__class__.__name__}\n" str_ += super().__str__() str_ += "\t{:32}: {:.2f} \n".format( "Predicted background counts", self.info_dict()["mu_bkg"] ) return str_.expandtabs(tabsize=2) def _stat_fcn(self, mu, delta=0, index=None): return cash(self.n_on[index], self.mu_bkg[index] + mu) - delta def _n_sig_matching_significance_fcn(self, n_sig, significance, index): TS0 = cash(n_sig + self.mu_bkg[index], self.mu_bkg[index]) TS1 = cash(n_sig + self.mu_bkg[index], self.mu_bkg[index] + n_sig) return np.sign(n_sig) * np.sqrt(np.clip(TS0 - TS1, 0, None)) - significance def sum(self, axis=None): n_on = self.n_on.sum(axis=axis) bkg = self.n_bkg.sum(axis=axis) return CashCountsStatistic(n_on=n_on, mu_bkg=bkg) def __getitem__(self, key): return CashCountsStatistic(n_on=self.n_on[key], mu_bkg=self.n_bkg[key]) def compute_errn(self, n_sigma=1.0): result = np.zeros_like(self.n_on, dtype="float") c = n_sigma**2 / 2 mask = self.n_on > 0 on = self.n_on[mask] result[mask] = on * (lambertw(-np.exp(-c / on - 1), k=0).real + 1) result[~mask] = 0 return result def compute_errp(self, n_sigma=1.0): result = np.zeros_like(self.n_on, dtype="float") c = n_sigma**2 / 2 mask = self.n_on > 0 on = self.n_on[mask] result[mask] = -on * (lambertw(-np.exp(-c / on - 1), k=-1).real + 1) result[~mask] = c return result def compute_upper_limit(self, n_sigma=3): result = np.zeros_like(self.n_on, dtype="float") c = n_sigma**2 / 2 mask = self.n_on > 0 on = self.n_on[mask] result[mask] = ( -on * (lambertw(-np.exp(-c / on - 1), k=-1).real + 1) + self.n_sig[mask] ) result[~mask] = c return result def n_sig_matching_significance(self, significance): result = np.zeros_like(self.mu_bkg, dtype="float") c = significance**2 / 2 mask = self.mu_bkg > 0 bkg = self.mu_bkg[mask] branch = 0 if significance > 0 else -1 res = lambertw((c / bkg - 1) / np.exp(1), k=branch).real result[mask] = bkg * (np.exp(res + 1) - 1) result[~mask] = np.nan return result class WStatCountsStatistic(CountsStatistic): """Class to compute statistics for Poisson distributed variable with unknown background. Parameters ---------- n_on : int Measured counts in on region. n_off : int Measured counts in off region. alpha : float Acceptance ratio of on and off measurements. mu_sig : float Expected signal counts in on region. """ def __init__(self, n_on, n_off, alpha, mu_sig=None): self.n_on = np.asanyarray(n_on) self.n_off = np.asanyarray(n_off) self.alpha = np.asanyarray(alpha) if mu_sig is None: self.mu_sig = np.zeros_like(self.n_on) else: self.mu_sig = np.asanyarray(mu_sig) @property def n_bkg(self): """Known background computed alpha * n_off.""" return self.alpha * self.n_off @property def n_sig(self): """Excess.""" return self.n_on - self.n_bkg - self.mu_sig @property def error(self): """Approximate error from the covariance matrix.""" return np.sqrt(self.n_on + self.alpha**2 * self.n_off) @property def stat_null(self): """Stat value for null hypothesis, i.e. mu_sig expected signal counts.""" return wstat(self.n_on, self.n_off, self.alpha, self.mu_sig) @property def stat_max(self): """Stat value for best fit hypothesis. i.e. expected signal mu = n_on - alpha * n_off - mu_sig. """ return wstat(self.n_on, self.n_off, self.alpha, self.n_sig + self.mu_sig) def info_dict(self): """A dictionary of the relevant quantities. Returns ------- info_dict : dict Dictionary with summary info. 
""" info_dict = super().info_dict() info_dict["n_off"] = self.n_off info_dict["alpha"] = self.alpha info_dict["mu_sig"] = self.mu_sig return info_dict def __str__(self): str_ = f"{self.__class__.__name__}\n" str_ += super().__str__() info_dict = self.info_dict() str_ += "\t{:32}: {:.2f} \n".format("Off counts", info_dict["n_off"]) str_ += "\t{:32}: {:.2f} \n".format("alpha ", info_dict["alpha"]) str_ += "\t{:32}: {:.2f} \n".format( "Predicted signal counts", info_dict["mu_sig"] ) return str_.expandtabs(tabsize=2) def _stat_fcn(self, mu, delta=0, index=None): return ( wstat( self.n_on[index], self.n_off[index], self.alpha[index], (mu + self.mu_sig[index]), ) - delta ) def _n_sig_matching_significance_fcn(self, n_sig, significance, index): stat0 = wstat( n_sig + self.n_bkg[index], self.n_off[index], self.alpha[index], 0 ) stat1 = wstat( n_sig + self.n_bkg[index], self.n_off[index], self.alpha[index], n_sig, ) return np.sign(n_sig) * np.sqrt(np.clip(stat0 - stat1, 0, None)) - significance def sum(self, axis=None): n_on = self.n_on.sum(axis=axis) n_off = self.n_off.sum(axis=axis) alpha = self.n_bkg.sum(axis=axis) / n_off return WStatCountsStatistic(n_on=n_on, n_off=n_off, alpha=alpha) def __getitem__(self, key): return WStatCountsStatistic( n_on=self.n_on[key], n_off=self.n_off[key], alpha=self.alpha[key] ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/fit_statistics.py0000644000175100001770000001671014721316200020615 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Common fit statistics used in gamma-ray astronomy. see :ref:`fit-statistics` """ import numpy as np from gammapy.stats.fit_statistics_cython import TRUNCATION_VALUE __all__ = ["cash", "cstat", "wstat", "get_wstat_mu_bkg", "get_wstat_gof_terms"] def cash(n_on, mu_on, truncation_value=TRUNCATION_VALUE): r"""Cash statistic, for Poisson data. The Cash statistic is defined as: .. math:: C = 2 \left( \mu_{on} - n_{on} \log \mu_{on} \right) and :math:`C = 0` where :math:`\mu <= 0`. For more information see :ref:`fit-statistics`. Parameters ---------- n_on : `~numpy.ndarray` or array_like Observed counts. mu_on : `~numpy.ndarray` or array_like Expected counts. truncation_value : `~numpy.ndarray` or array_like Minimum value use for ``mu_on`` ``mu_on`` = ``truncation_value`` where ``mu_on`` <= ``truncation_value``. Default is 1e-25. Returns ------- stat : ndarray Statistic per bin. References ---------- * `Sherpa statistics page section on the Cash statistic: `_ * `Sherpa help page on the Cash statistic: `_ * `Cash 1979, ApJ 228, 939, `_ """ n_on = np.asanyarray(n_on) mu_on = np.asanyarray(mu_on) truncation_value = np.asanyarray(truncation_value) if np.any(truncation_value) <= 0: raise ValueError("Cash statistic truncation value must be positive.") mu_on = np.where(mu_on <= truncation_value, truncation_value, mu_on) # suppress zero division warnings, they are corrected below with np.errstate(divide="ignore", invalid="ignore"): stat = 2 * (mu_on - n_on * np.log(mu_on)) return stat def cstat(n_on, mu_on, truncation_value=TRUNCATION_VALUE): r"""C statistic, for Poisson data. The C statistic is defined as: .. math:: C = 2 \left[ \mu_{on} - n_{on} + n_{on} (\log(n_{on}) - log(\mu_{on}) \right] and :math:`C = 0` where :math:`\mu_{on} <= 0`. ``truncation_value`` handles the case where ``n_on`` or ``mu_on`` is 0 or less and the log cannot be taken. For more information see :ref:`fit-statistics`. 
Parameters ---------- n_on : `~numpy.ndarray` or array_like Observed counts. mu_on : `~numpy.ndarray` or array_like Expected counts. truncation_value : float ``n_on`` = ``truncation_value`` where ``n_on`` <= ``truncation_value``. ``mu_on`` = ``truncation_value`` where ``mu_on`` <= ``truncation_value``. Default is 1e-25. Returns ------- stat : ndarray Statistic per bin. References ---------- * `Sherpa stats page section on the C statistic: `_ * `Sherpa help page on the C statistic: `_ * `Cash 1979, ApJ 228, 939, `_ """ n_on = np.asanyarray(n_on, dtype=np.float64) mu_on = np.asanyarray(mu_on, dtype=np.float64) truncation_value = np.asanyarray(truncation_value, dtype=np.float64) if np.any(truncation_value <= 0): raise ValueError("Cstat statistic truncation value must be positive.") n_on = np.where(n_on <= truncation_value, truncation_value, n_on) mu_on = np.where(mu_on <= truncation_value, truncation_value, mu_on) term1 = np.log(n_on) - np.log(mu_on) stat = 2 * (mu_on - n_on + n_on * term1) stat = np.where(mu_on > 0, stat, 0) return stat def wstat(n_on, n_off, alpha, mu_sig, mu_bkg=None, extra_terms=True): r"""W statistic, for Poisson data with Poisson background. For a definition of WStat see :ref:`wstat`. If ``mu_bkg`` is not provided it will be calculated according to the profile likelihood formula. Parameters ---------- n_on : `~numpy.ndarray` or array_like Total observed counts. n_off : `~numpy.ndarray` or array_like Total observed background counts. alpha : `~numpy.ndarray` or array_like Exposure ratio between on and off region. mu_sig : `~numpy.ndarray` or array_like Signal expected counts. mu_bkg : `~numpy.ndarray` or array_like, optional Background expected counts. extra_terms : bool, optional Add model independent terms to convert stat into goodness-of-fit parameter. Default is True. Returns ------- stat : ndarray Statistic per bin. References ---------- * `Habilitation M. de Naurois, p. 141, `_ * `XSPEC page on Poisson data with Poisson background, `_ """ # Note: This is equivalent to what's defined on the XSPEC page under the # following assumptions # t_s * m_i = mu_sig # t_b * m_b = mu_bkg # t_s / t_b = alpha n_on = np.asanyarray(n_on, dtype=np.float64) n_off = np.asanyarray(n_off, dtype=np.float64) alpha = np.asanyarray(alpha, dtype=np.float64) mu_sig = np.asanyarray(mu_sig, dtype=np.float64) if mu_bkg is None: mu_bkg = get_wstat_mu_bkg(n_on, n_off, alpha, mu_sig) term1 = mu_sig + (1 + alpha) * mu_bkg # suppress zero division warnings, they are corrected below with np.errstate(divide="ignore", invalid="ignore"): # This is a false positive error from pylint # See https://github.com/PyCQA/pylint/issues/2436 term2_ = -n_on * np.log(mu_sig + alpha * mu_bkg) # pylint:disable=invalid-unary-operand-type # Handle n_on == 0 condition = n_on == 0 term2 = np.where(condition, 0, term2_) # suppress zero division warnings, they are corrected below with np.errstate(divide="ignore", invalid="ignore"): # This is a false positive error from pylint # See https://github.com/PyCQA/pylint/issues/2436 term3_ = -n_off * np.log(mu_bkg) # pylint:disable=invalid-unary-operand-type # Handle n_off == 0 condition = n_off == 0 term3 = np.where(condition, 0, term3_) stat = 2 * (term1 + term2 + term3) if extra_terms: stat += get_wstat_gof_terms(n_on, n_off) return stat def get_wstat_mu_bkg(n_on, n_off, alpha, mu_sig): """Background estimate ``mu_bkg`` for WSTAT. See :ref:`wstat`.
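# --- Sanity check (not part of the distributed file; example values) ---
# With the background profiled out, wstat above is a goodness-of-fit
# statistic: it is zero when the model matches the data exactly, i.e. at
# mu_sig = n_on - alpha * n_off (the profiled mu_bkg is then n_off).
import numpy as np
from gammapy.stats import wstat

n_on, n_off, alpha = 25.0, 100.0, 0.1
mu_sig = n_on - alpha * n_off  # 15.0, the best-fit signal
np.testing.assert_allclose(wstat(n_on, n_off, alpha, mu_sig), 0, atol=1e-8)
# --- end sketch ---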
""" n_on = np.asanyarray(n_on, dtype=np.float64) n_off = np.asanyarray(n_off, dtype=np.float64) alpha = np.asanyarray(alpha, dtype=np.float64) mu_sig = np.asanyarray(mu_sig, dtype=np.float64) # NOTE: Corner cases in the docs are all handled correctly by this formula C = alpha * (n_on + n_off) - (1 + alpha) * mu_sig D = np.sqrt(C**2 + 4 * alpha * (alpha + 1) * n_off * mu_sig) with np.errstate(invalid="ignore", divide="ignore"): mu_bkg = (C + D) / (2 * alpha * (alpha + 1)) return mu_bkg def get_wstat_gof_terms(n_on, n_off): """Goodness of fit terms for WSTAT. See :ref:`wstat`. """ term = np.zeros(n_on.shape) # suppress zero division warnings, they are corrected below with np.errstate(divide="ignore", invalid="ignore"): term1 = -n_on * (1 - np.log(n_on)) term2 = -n_off * (1 - np.log(n_off)) term += np.where(n_on == 0, 0, term1) term += np.where(n_off == 0, 0, term2) return 2 * term ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615307.0 gammapy-1.3/gammapy/stats/fit_statistics_cython.c0000644000175100001770000163453314721316213022011 0ustar00runnerdocker/* Generated by Cython 3.0.11 */ /* BEGIN: Cython Metadata { "distutils": { "define_macros": [ [ "NPY_NO_DEPRECATED_API", "NPY_1_7_API_VERSION" ], [ "NPY_TARGET_VERSION", "NPY_1_21_API_VERSION" ] ], "depends": [ "/tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/_core/include/numpy/arrayobject.h", "/tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/_core/include/numpy/arrayscalars.h", "/tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/_core/include/numpy/ndarrayobject.h", "/tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/_core/include/numpy/ndarraytypes.h", "/tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/_core/include/numpy/ufuncobject.h" ], "include_dirs": [ "/tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/_core/include" ], "name": "gammapy.stats.fit_statistics_cython", "sources": [ "gammapy/stats/fit_statistics_cython.pyx" ] }, "module_name": "gammapy.stats.fit_statistics_cython" } END: Cython Metadata */ #ifndef PY_SSIZE_T_CLEAN #define PY_SSIZE_T_CLEAN #endif /* PY_SSIZE_T_CLEAN */ #if defined(CYTHON_LIMITED_API) && 0 #ifndef Py_LIMITED_API #if CYTHON_LIMITED_API+0 > 0x03030000 #define Py_LIMITED_API CYTHON_LIMITED_API #else #define Py_LIMITED_API 0x03030000 #endif #endif #endif #include "Python.h" #ifndef Py_PYTHON_H #error Python headers needed to compile C extensions, please install development version of Python. #elif PY_VERSION_HEX < 0x02070000 || (0x03000000 <= PY_VERSION_HEX && PY_VERSION_HEX < 0x03030000) #error Cython requires Python 2.7+ or Python 3.3+. #else #if defined(CYTHON_LIMITED_API) && CYTHON_LIMITED_API #define __PYX_EXTRA_ABI_MODULE_NAME "limited" #else #define __PYX_EXTRA_ABI_MODULE_NAME "" #endif #define CYTHON_ABI "3_0_11" __PYX_EXTRA_ABI_MODULE_NAME #define __PYX_ABI_MODULE_NAME "_cython_" CYTHON_ABI #define __PYX_TYPE_MODULE_PREFIX __PYX_ABI_MODULE_NAME "." 
#define CYTHON_HEX_VERSION 0x03000BF0 #define CYTHON_FUTURE_DIVISION 1 #include #ifndef offsetof #define offsetof(type, member) ( (size_t) & ((type*)0) -> member ) #endif #if !defined(_WIN32) && !defined(WIN32) && !defined(MS_WINDOWS) #ifndef __stdcall #define __stdcall #endif #ifndef __cdecl #define __cdecl #endif #ifndef __fastcall #define __fastcall #endif #endif #ifndef DL_IMPORT #define DL_IMPORT(t) t #endif #ifndef DL_EXPORT #define DL_EXPORT(t) t #endif #define __PYX_COMMA , #ifndef HAVE_LONG_LONG #define HAVE_LONG_LONG #endif #ifndef PY_LONG_LONG #define PY_LONG_LONG LONG_LONG #endif #ifndef Py_HUGE_VAL #define Py_HUGE_VAL HUGE_VAL #endif #define __PYX_LIMITED_VERSION_HEX PY_VERSION_HEX #if defined(GRAALVM_PYTHON) /* For very preliminary testing purposes. Most variables are set the same as PyPy. The existence of this section does not imply that anything works or is even tested */ #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_CPYTHON 0 #define CYTHON_COMPILING_IN_LIMITED_API 0 #define CYTHON_COMPILING_IN_GRAAL 1 #define CYTHON_COMPILING_IN_NOGIL 0 #undef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 0 #undef CYTHON_USE_TYPE_SPECS #define CYTHON_USE_TYPE_SPECS 0 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #if PY_VERSION_HEX < 0x03050000 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #undef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #undef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 1 #undef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 0 #undef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 0 #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_GIL #define CYTHON_FAST_GIL 0 #undef CYTHON_METH_FASTCALL #define CYTHON_METH_FASTCALL 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #ifndef CYTHON_PEP487_INIT_SUBCLASS #define CYTHON_PEP487_INIT_SUBCLASS (PY_MAJOR_VERSION >= 3) #endif #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 1 #undef CYTHON_USE_MODULE_STATE #define CYTHON_USE_MODULE_STATE 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC #define CYTHON_UPDATE_DESCRIPTOR_DOC 0 #endif #undef CYTHON_USE_FREELISTS #define CYTHON_USE_FREELISTS 0 #elif defined(PYPY_VERSION) #define CYTHON_COMPILING_IN_PYPY 1 #define CYTHON_COMPILING_IN_CPYTHON 0 #define CYTHON_COMPILING_IN_LIMITED_API 0 #define CYTHON_COMPILING_IN_GRAAL 0 #define CYTHON_COMPILING_IN_NOGIL 0 #undef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 0 #ifndef CYTHON_USE_TYPE_SPECS #define CYTHON_USE_TYPE_SPECS 0 #endif #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #if PY_VERSION_HEX < 0x03050000 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #undef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 0 #undef CYTHON_USE_UNICODE_WRITER #define 
CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #undef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 1 #undef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 0 #undef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 0 #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_GIL #define CYTHON_FAST_GIL 0 #undef CYTHON_METH_FASTCALL #define CYTHON_METH_FASTCALL 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #ifndef CYTHON_PEP487_INIT_SUBCLASS #define CYTHON_PEP487_INIT_SUBCLASS (PY_MAJOR_VERSION >= 3) #endif #if PY_VERSION_HEX < 0x03090000 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #elif !defined(CYTHON_PEP489_MULTI_PHASE_INIT) #define CYTHON_PEP489_MULTI_PHASE_INIT 1 #endif #undef CYTHON_USE_MODULE_STATE #define CYTHON_USE_MODULE_STATE 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE (PY_VERSION_HEX >= 0x030400a1 && PYPY_VERSION_NUM >= 0x07030C00) #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC #define CYTHON_UPDATE_DESCRIPTOR_DOC 0 #endif #undef CYTHON_USE_FREELISTS #define CYTHON_USE_FREELISTS 0 #elif defined(CYTHON_LIMITED_API) #ifdef Py_LIMITED_API #undef __PYX_LIMITED_VERSION_HEX #define __PYX_LIMITED_VERSION_HEX Py_LIMITED_API #endif #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_CPYTHON 0 #define CYTHON_COMPILING_IN_LIMITED_API 1 #define CYTHON_COMPILING_IN_GRAAL 0 #define CYTHON_COMPILING_IN_NOGIL 0 #undef CYTHON_CLINE_IN_TRACEBACK #define CYTHON_CLINE_IN_TRACEBACK 0 #undef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 0 #undef CYTHON_USE_TYPE_SPECS #define CYTHON_USE_TYPE_SPECS 1 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #undef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 0 #ifndef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #endif #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #undef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 0 #undef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 0 #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_GIL #define CYTHON_FAST_GIL 0 #undef CYTHON_METH_FASTCALL #define CYTHON_METH_FASTCALL 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #ifndef CYTHON_PEP487_INIT_SUBCLASS #define CYTHON_PEP487_INIT_SUBCLASS 1 #endif #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_MODULE_STATE #define CYTHON_USE_MODULE_STATE 1 #ifndef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #endif #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC #define CYTHON_UPDATE_DESCRIPTOR_DOC 0 #endif #undef CYTHON_USE_FREELISTS #define CYTHON_USE_FREELISTS 0 #elif defined(Py_GIL_DISABLED) || defined(Py_NOGIL) #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_CPYTHON 0 #define CYTHON_COMPILING_IN_LIMITED_API 0 #define CYTHON_COMPILING_IN_GRAAL 0 #define CYTHON_COMPILING_IN_NOGIL 1 #ifndef CYTHON_USE_TYPE_SLOTS #define 
CYTHON_USE_TYPE_SLOTS 1 #endif #ifndef CYTHON_USE_TYPE_SPECS #define CYTHON_USE_TYPE_SPECS 0 #endif #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #ifndef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 1 #endif #ifndef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #endif #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_GIL #define CYTHON_FAST_GIL 0 #ifndef CYTHON_METH_FASTCALL #define CYTHON_METH_FASTCALL 1 #endif #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #ifndef CYTHON_PEP487_INIT_SUBCLASS #define CYTHON_PEP487_INIT_SUBCLASS 1 #endif #ifndef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 1 #endif #ifndef CYTHON_USE_MODULE_STATE #define CYTHON_USE_MODULE_STATE 0 #endif #ifndef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 1 #endif #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC #define CYTHON_UPDATE_DESCRIPTOR_DOC 1 #endif #ifndef CYTHON_USE_FREELISTS #define CYTHON_USE_FREELISTS 0 #endif #else #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_CPYTHON 1 #define CYTHON_COMPILING_IN_LIMITED_API 0 #define CYTHON_COMPILING_IN_GRAAL 0 #define CYTHON_COMPILING_IN_NOGIL 0 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #ifndef CYTHON_USE_TYPE_SPECS #define CYTHON_USE_TYPE_SPECS 0 #endif #ifndef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 1 #endif #if PY_MAJOR_VERSION < 3 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #ifndef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 1 #endif #ifndef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 1 #endif #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #if PY_VERSION_HEX < 0x030300F0 || PY_VERSION_HEX >= 0x030B00A2 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #elif !defined(CYTHON_USE_UNICODE_WRITER) #define CYTHON_USE_UNICODE_WRITER 1 #endif #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #ifndef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 1 #endif #ifndef CYTHON_FAST_GIL #define CYTHON_FAST_GIL (PY_MAJOR_VERSION < 3 || PY_VERSION_HEX >= 0x03060000 && PY_VERSION_HEX < 0x030C00A6) #endif #ifndef CYTHON_METH_FASTCALL #define CYTHON_METH_FASTCALL (PY_VERSION_HEX >= 0x030700A1) #endif #ifndef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 1 #endif #ifndef CYTHON_PEP487_INIT_SUBCLASS #define CYTHON_PEP487_INIT_SUBCLASS 1 #endif #if PY_VERSION_HEX < 0x03050000 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #elif !defined(CYTHON_PEP489_MULTI_PHASE_INIT) #define CYTHON_PEP489_MULTI_PHASE_INIT 1 #endif #ifndef 
CYTHON_USE_MODULE_STATE #define CYTHON_USE_MODULE_STATE 0 #endif #if PY_VERSION_HEX < 0x030400a1 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #elif !defined(CYTHON_USE_TP_FINALIZE) #define CYTHON_USE_TP_FINALIZE 1 #endif #if PY_VERSION_HEX < 0x030600B1 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #elif !defined(CYTHON_USE_DICT_VERSIONS) #define CYTHON_USE_DICT_VERSIONS (PY_VERSION_HEX < 0x030C00A5) #endif #if PY_VERSION_HEX < 0x030700A3 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #elif !defined(CYTHON_USE_EXC_INFO_STACK) #define CYTHON_USE_EXC_INFO_STACK 1 #endif #ifndef CYTHON_UPDATE_DESCRIPTOR_DOC #define CYTHON_UPDATE_DESCRIPTOR_DOC 1 #endif #ifndef CYTHON_USE_FREELISTS #define CYTHON_USE_FREELISTS 1 #endif #endif #if !defined(CYTHON_FAST_PYCCALL) #define CYTHON_FAST_PYCCALL (CYTHON_FAST_PYCALL && PY_VERSION_HEX >= 0x030600B1) #endif #if !defined(CYTHON_VECTORCALL) #define CYTHON_VECTORCALL (CYTHON_FAST_PYCCALL && PY_VERSION_HEX >= 0x030800B1) #endif #define CYTHON_BACKPORT_VECTORCALL (CYTHON_METH_FASTCALL && PY_VERSION_HEX < 0x030800B1) #if CYTHON_USE_PYLONG_INTERNALS #if PY_MAJOR_VERSION < 3 #include "longintrepr.h" #endif #undef SHIFT #undef BASE #undef MASK #ifdef SIZEOF_VOID_P enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) }; #endif #endif #ifndef __has_attribute #define __has_attribute(x) 0 #endif #ifndef __has_cpp_attribute #define __has_cpp_attribute(x) 0 #endif #ifndef CYTHON_RESTRICT #if defined(__GNUC__) #define CYTHON_RESTRICT __restrict__ #elif defined(_MSC_VER) && _MSC_VER >= 1400 #define CYTHON_RESTRICT __restrict #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_RESTRICT restrict #else #define CYTHON_RESTRICT #endif #endif #ifndef CYTHON_UNUSED #if defined(__cplusplus) /* for clang __has_cpp_attribute(maybe_unused) is true even before C++17 * but leads to warnings with -pedantic, since it is a C++17 feature */ #if ((defined(_MSVC_LANG) && _MSVC_LANG >= 201703L) || __cplusplus >= 201703L) #if __has_cpp_attribute(maybe_unused) #define CYTHON_UNUSED [[maybe_unused]] #endif #endif #endif #endif #ifndef CYTHON_UNUSED # if defined(__GNUC__) # if !(defined(__cplusplus)) || (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif # elif defined(__ICC) || (defined(__INTEL_COMPILER) && !defined(_MSC_VER)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif #endif #ifndef CYTHON_UNUSED_VAR # if defined(__cplusplus) template void CYTHON_UNUSED_VAR( const T& ) { } # else # define CYTHON_UNUSED_VAR(x) (void)(x) # endif #endif #ifndef CYTHON_MAYBE_UNUSED_VAR #define CYTHON_MAYBE_UNUSED_VAR(x) CYTHON_UNUSED_VAR(x) #endif #ifndef CYTHON_NCP_UNUSED # if CYTHON_COMPILING_IN_CPYTHON # define CYTHON_NCP_UNUSED # else # define CYTHON_NCP_UNUSED CYTHON_UNUSED # endif #endif #ifndef CYTHON_USE_CPP_STD_MOVE #if defined(__cplusplus) && (\ __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1600)) #define CYTHON_USE_CPP_STD_MOVE 1 #else #define CYTHON_USE_CPP_STD_MOVE 0 #endif #endif #define __Pyx_void_to_None(void_result) ((void)(void_result), Py_INCREF(Py_None), Py_None) #ifdef _MSC_VER #ifndef _MSC_STDINT_H_ #if _MSC_VER < 1300 typedef unsigned char uint8_t; typedef unsigned short uint16_t; typedef unsigned int uint32_t; #else typedef unsigned __int8 uint8_t; typedef unsigned __int16 uint16_t; typedef unsigned __int32 uint32_t; #endif 
#endif #if _MSC_VER < 1300 #ifdef _WIN64 typedef unsigned long long __pyx_uintptr_t; #else typedef unsigned int __pyx_uintptr_t; #endif #else #ifdef _WIN64 typedef unsigned __int64 __pyx_uintptr_t; #else typedef unsigned __int32 __pyx_uintptr_t; #endif #endif #else #include typedef uintptr_t __pyx_uintptr_t; #endif #ifndef CYTHON_FALLTHROUGH #if defined(__cplusplus) /* for clang __has_cpp_attribute(fallthrough) is true even before C++17 * but leads to warnings with -pedantic, since it is a C++17 feature */ #if ((defined(_MSVC_LANG) && _MSVC_LANG >= 201703L) || __cplusplus >= 201703L) #if __has_cpp_attribute(fallthrough) #define CYTHON_FALLTHROUGH [[fallthrough]] #endif #endif #ifndef CYTHON_FALLTHROUGH #if __has_cpp_attribute(clang::fallthrough) #define CYTHON_FALLTHROUGH [[clang::fallthrough]] #elif __has_cpp_attribute(gnu::fallthrough) #define CYTHON_FALLTHROUGH [[gnu::fallthrough]] #endif #endif #endif #ifndef CYTHON_FALLTHROUGH #if __has_attribute(fallthrough) #define CYTHON_FALLTHROUGH __attribute__((fallthrough)) #else #define CYTHON_FALLTHROUGH #endif #endif #if defined(__clang__) && defined(__apple_build_version__) #if __apple_build_version__ < 7000000 #undef CYTHON_FALLTHROUGH #define CYTHON_FALLTHROUGH #endif #endif #endif #ifdef __cplusplus template struct __PYX_IS_UNSIGNED_IMPL {static const bool value = T(0) < T(-1);}; #define __PYX_IS_UNSIGNED(type) (__PYX_IS_UNSIGNED_IMPL::value) #else #define __PYX_IS_UNSIGNED(type) (((type)-1) > 0) #endif #if CYTHON_COMPILING_IN_PYPY == 1 #define __PYX_NEED_TP_PRINT_SLOT (PY_VERSION_HEX >= 0x030800b4 && PY_VERSION_HEX < 0x030A0000) #else #define __PYX_NEED_TP_PRINT_SLOT (PY_VERSION_HEX >= 0x030800b4 && PY_VERSION_HEX < 0x03090000) #endif #define __PYX_REINTERPRET_FUNCION(func_pointer, other_pointer) ((func_pointer)(void(*)(void))(other_pointer)) #ifndef CYTHON_INLINE #if defined(__clang__) #define CYTHON_INLINE __inline__ __attribute__ ((__unused__)) #elif defined(__GNUC__) #define CYTHON_INLINE __inline__ #elif defined(_MSC_VER) #define CYTHON_INLINE __inline #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_INLINE inline #else #define CYTHON_INLINE #endif #endif #define __PYX_BUILD_PY_SSIZE_T "n" #define CYTHON_FORMAT_SSIZE_T "z" #if PY_MAJOR_VERSION < 3 #define __Pyx_BUILTIN_MODULE_NAME "__builtin__" #define __Pyx_DefaultClassType PyClass_Type #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a+k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #else #define __Pyx_BUILTIN_MODULE_NAME "builtins" #define __Pyx_DefaultClassType PyType_Type #if CYTHON_COMPILING_IN_LIMITED_API static CYTHON_INLINE PyObject* __Pyx_PyCode_New(int a, int p, int k, int l, int s, int f, PyObject *code, PyObject *c, PyObject* n, PyObject *v, PyObject *fv, PyObject *cell, PyObject* fn, PyObject *name, int fline, PyObject *lnos) { PyObject *exception_table = NULL; PyObject *types_module=NULL, *code_type=NULL, *result=NULL; #if __PYX_LIMITED_VERSION_HEX < 0x030B0000 PyObject *version_info; PyObject *py_minor_version = NULL; #endif long minor_version = 0; PyObject *type, *value, *traceback; PyErr_Fetch(&type, &value, &traceback); #if __PYX_LIMITED_VERSION_HEX >= 0x030B0000 minor_version = 11; #else if (!(version_info = PySys_GetObject("version_info"))) goto end; if (!(py_minor_version = PySequence_GetItem(version_info, 1))) goto end; minor_version = PyLong_AsLong(py_minor_version); Py_DECREF(py_minor_version); if (minor_version == -1 && PyErr_Occurred()) goto end; #endif if 
(!(types_module = PyImport_ImportModule("types"))) goto end; if (!(code_type = PyObject_GetAttrString(types_module, "CodeType"))) goto end; if (minor_version <= 7) { (void)p; result = PyObject_CallFunction(code_type, "iiiiiOOOOOOiOO", a, k, l, s, f, code, c, n, v, fn, name, fline, lnos, fv, cell); } else if (minor_version <= 10) { result = PyObject_CallFunction(code_type, "iiiiiiOOOOOOiOO", a,p, k, l, s, f, code, c, n, v, fn, name, fline, lnos, fv, cell); } else { if (!(exception_table = PyBytes_FromStringAndSize(NULL, 0))) goto end; result = PyObject_CallFunction(code_type, "iiiiiiOOOOOOOiOO", a,p, k, l, s, f, code, c, n, v, fn, name, name, fline, lnos, exception_table, fv, cell); } end: Py_XDECREF(code_type); Py_XDECREF(exception_table); Py_XDECREF(types_module); if (type) { PyErr_Restore(type, value, traceback); } return result; } #ifndef CO_OPTIMIZED #define CO_OPTIMIZED 0x0001 #endif #ifndef CO_NEWLOCALS #define CO_NEWLOCALS 0x0002 #endif #ifndef CO_VARARGS #define CO_VARARGS 0x0004 #endif #ifndef CO_VARKEYWORDS #define CO_VARKEYWORDS 0x0008 #endif #ifndef CO_ASYNC_GENERATOR #define CO_ASYNC_GENERATOR 0x0200 #endif #ifndef CO_GENERATOR #define CO_GENERATOR 0x0020 #endif #ifndef CO_COROUTINE #define CO_COROUTINE 0x0080 #endif #elif PY_VERSION_HEX >= 0x030B0000 static CYTHON_INLINE PyCodeObject* __Pyx_PyCode_New(int a, int p, int k, int l, int s, int f, PyObject *code, PyObject *c, PyObject* n, PyObject *v, PyObject *fv, PyObject *cell, PyObject* fn, PyObject *name, int fline, PyObject *lnos) { PyCodeObject *result; PyObject *empty_bytes = PyBytes_FromStringAndSize("", 0); if (!empty_bytes) return NULL; result = #if PY_VERSION_HEX >= 0x030C0000 PyUnstable_Code_NewWithPosOnlyArgs #else PyCode_NewWithPosOnlyArgs #endif (a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, name, fline, lnos, empty_bytes); Py_DECREF(empty_bytes); return result; } #elif PY_VERSION_HEX >= 0x030800B2 && !CYTHON_COMPILING_IN_PYPY #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_NewWithPosOnlyArgs(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #else #define __Pyx_PyCode_New(a, p, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #endif #endif #if PY_VERSION_HEX >= 0x030900A4 || defined(Py_IS_TYPE) #define __Pyx_IS_TYPE(ob, type) Py_IS_TYPE(ob, type) #else #define __Pyx_IS_TYPE(ob, type) (((const PyObject*)ob)->ob_type == (type)) #endif #if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_Is) #define __Pyx_Py_Is(x, y) Py_Is(x, y) #else #define __Pyx_Py_Is(x, y) ((x) == (y)) #endif #if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsNone) #define __Pyx_Py_IsNone(ob) Py_IsNone(ob) #else #define __Pyx_Py_IsNone(ob) __Pyx_Py_Is((ob), Py_None) #endif #if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsTrue) #define __Pyx_Py_IsTrue(ob) Py_IsTrue(ob) #else #define __Pyx_Py_IsTrue(ob) __Pyx_Py_Is((ob), Py_True) #endif #if PY_VERSION_HEX >= 0x030A00B1 || defined(Py_IsFalse) #define __Pyx_Py_IsFalse(ob) Py_IsFalse(ob) #else #define __Pyx_Py_IsFalse(ob) __Pyx_Py_Is((ob), Py_False) #endif #define __Pyx_NoneAsNull(obj) (__Pyx_Py_IsNone(obj) ? 
NULL : (obj)) #if PY_VERSION_HEX >= 0x030900F0 && !CYTHON_COMPILING_IN_PYPY #define __Pyx_PyObject_GC_IsFinalized(o) PyObject_GC_IsFinalized(o) #else #define __Pyx_PyObject_GC_IsFinalized(o) _PyGC_FINALIZED(o) #endif #ifndef CO_COROUTINE #define CO_COROUTINE 0x80 #endif #ifndef CO_ASYNC_GENERATOR #define CO_ASYNC_GENERATOR 0x200 #endif #ifndef Py_TPFLAGS_CHECKTYPES #define Py_TPFLAGS_CHECKTYPES 0 #endif #ifndef Py_TPFLAGS_HAVE_INDEX #define Py_TPFLAGS_HAVE_INDEX 0 #endif #ifndef Py_TPFLAGS_HAVE_NEWBUFFER #define Py_TPFLAGS_HAVE_NEWBUFFER 0 #endif #ifndef Py_TPFLAGS_HAVE_FINALIZE #define Py_TPFLAGS_HAVE_FINALIZE 0 #endif #ifndef Py_TPFLAGS_SEQUENCE #define Py_TPFLAGS_SEQUENCE 0 #endif #ifndef Py_TPFLAGS_MAPPING #define Py_TPFLAGS_MAPPING 0 #endif #ifndef METH_STACKLESS #define METH_STACKLESS 0 #endif #if PY_VERSION_HEX <= 0x030700A3 || !defined(METH_FASTCALL) #ifndef METH_FASTCALL #define METH_FASTCALL 0x80 #endif typedef PyObject *(*__Pyx_PyCFunctionFast) (PyObject *self, PyObject *const *args, Py_ssize_t nargs); typedef PyObject *(*__Pyx_PyCFunctionFastWithKeywords) (PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames); #else #if PY_VERSION_HEX >= 0x030d00A4 # define __Pyx_PyCFunctionFast PyCFunctionFast # define __Pyx_PyCFunctionFastWithKeywords PyCFunctionFastWithKeywords #else # define __Pyx_PyCFunctionFast _PyCFunctionFast # define __Pyx_PyCFunctionFastWithKeywords _PyCFunctionFastWithKeywords #endif #endif #if CYTHON_METH_FASTCALL #define __Pyx_METH_FASTCALL METH_FASTCALL #define __Pyx_PyCFunction_FastCall __Pyx_PyCFunctionFast #define __Pyx_PyCFunction_FastCallWithKeywords __Pyx_PyCFunctionFastWithKeywords #else #define __Pyx_METH_FASTCALL METH_VARARGS #define __Pyx_PyCFunction_FastCall PyCFunction #define __Pyx_PyCFunction_FastCallWithKeywords PyCFunctionWithKeywords #endif #if CYTHON_VECTORCALL #define __pyx_vectorcallfunc vectorcallfunc #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET PY_VECTORCALL_ARGUMENTS_OFFSET #define __Pyx_PyVectorcall_NARGS(n) PyVectorcall_NARGS((size_t)(n)) #elif CYTHON_BACKPORT_VECTORCALL typedef PyObject *(*__pyx_vectorcallfunc)(PyObject *callable, PyObject *const *args, size_t nargsf, PyObject *kwnames); #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET ((size_t)1 << (8 * sizeof(size_t) - 1)) #define __Pyx_PyVectorcall_NARGS(n) ((Py_ssize_t)(((size_t)(n)) & ~__Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET)) #else #define __Pyx_PY_VECTORCALL_ARGUMENTS_OFFSET 0 #define __Pyx_PyVectorcall_NARGS(n) ((Py_ssize_t)(n)) #endif #if PY_VERSION_HEX >= 0x030900B1 #define __Pyx_PyCFunction_CheckExact(func) PyCFunction_CheckExact(func) #else #define __Pyx_PyCFunction_CheckExact(func) PyCFunction_Check(func) #endif #define __Pyx_CyOrPyCFunction_Check(func) PyCFunction_Check(func) #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_CyOrPyCFunction_GET_FUNCTION(func) (((PyCFunctionObject*)(func))->m_ml->ml_meth) #elif !CYTHON_COMPILING_IN_LIMITED_API #define __Pyx_CyOrPyCFunction_GET_FUNCTION(func) PyCFunction_GET_FUNCTION(func) #endif #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_CyOrPyCFunction_GET_FLAGS(func) (((PyCFunctionObject*)(func))->m_ml->ml_flags) static CYTHON_INLINE PyObject* __Pyx_CyOrPyCFunction_GET_SELF(PyObject *func) { return (__Pyx_CyOrPyCFunction_GET_FLAGS(func) & METH_STATIC) ?
NULL : ((PyCFunctionObject*)func)->m_self; } #endif static CYTHON_INLINE int __Pyx__IsSameCFunction(PyObject *func, void *cfunc) { #if CYTHON_COMPILING_IN_LIMITED_API return PyCFunction_Check(func) && PyCFunction_GetFunction(func) == (PyCFunction) cfunc; #else return PyCFunction_Check(func) && PyCFunction_GET_FUNCTION(func) == (PyCFunction) cfunc; #endif } #define __Pyx_IsSameCFunction(func, cfunc) __Pyx__IsSameCFunction(func, cfunc) #if __PYX_LIMITED_VERSION_HEX < 0x030900B1 #define __Pyx_PyType_FromModuleAndSpec(m, s, b) ((void)m, PyType_FromSpecWithBases(s, b)) typedef PyObject *(*__Pyx_PyCMethod)(PyObject *, PyTypeObject *, PyObject *const *, size_t, PyObject *); #else #define __Pyx_PyType_FromModuleAndSpec(m, s, b) PyType_FromModuleAndSpec(m, s, b) #define __Pyx_PyCMethod PyCMethod #endif #ifndef METH_METHOD #define METH_METHOD 0x200 #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Malloc) #define PyObject_Malloc(s) PyMem_Malloc(s) #define PyObject_Free(p) PyMem_Free(p) #define PyObject_Realloc(p) PyMem_Realloc(p) #endif #if CYTHON_COMPILING_IN_LIMITED_API #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) #else #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) (frame)->f_lineno = (lineno) #endif #if CYTHON_COMPILING_IN_LIMITED_API #define __Pyx_PyThreadState_Current PyThreadState_Get() #elif !CYTHON_FAST_THREAD_STATE #define __Pyx_PyThreadState_Current PyThreadState_GET() #elif PY_VERSION_HEX >= 0x030d00A1 #define __Pyx_PyThreadState_Current PyThreadState_GetUnchecked() #elif PY_VERSION_HEX >= 0x03060000 #define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet() #elif PY_VERSION_HEX >= 0x03000000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #else #define __Pyx_PyThreadState_Current _PyThreadState_Current #endif #if CYTHON_COMPILING_IN_LIMITED_API static CYTHON_INLINE void *__Pyx_PyModule_GetState(PyObject *op) { void *result; result = PyModule_GetState(op); if (!result) Py_FatalError("Couldn't find the module state"); return result; } #endif #define __Pyx_PyObject_GetSlot(obj, name, func_ctype) __Pyx_PyType_GetSlot(Py_TYPE(obj), name, func_ctype) #if CYTHON_COMPILING_IN_LIMITED_API #define __Pyx_PyType_GetSlot(type, name, func_ctype) ((func_ctype) PyType_GetSlot((type), Py_##name)) #else #define __Pyx_PyType_GetSlot(type, name, func_ctype) ((type)->name) #endif #if PY_VERSION_HEX < 0x030700A2 && !defined(PyThread_tss_create) && !defined(Py_tss_NEEDS_INIT) #include "pythread.h" #define Py_tss_NEEDS_INIT 0 typedef int Py_tss_t; static CYTHON_INLINE int PyThread_tss_create(Py_tss_t *key) { *key = PyThread_create_key(); return 0; } static CYTHON_INLINE Py_tss_t * PyThread_tss_alloc(void) { Py_tss_t *key = (Py_tss_t *)PyObject_Malloc(sizeof(Py_tss_t)); *key = Py_tss_NEEDS_INIT; return key; } static CYTHON_INLINE void PyThread_tss_free(Py_tss_t *key) { PyObject_Free(key); } static CYTHON_INLINE int PyThread_tss_is_created(Py_tss_t *key) { return *key != Py_tss_NEEDS_INIT; } static CYTHON_INLINE void PyThread_tss_delete(Py_tss_t *key) { PyThread_delete_key(*key); *key = Py_tss_NEEDS_INIT; } static CYTHON_INLINE int PyThread_tss_set(Py_tss_t *key, void *value) { return PyThread_set_key_value(*key, value); } static CYTHON_INLINE void * PyThread_tss_get(Py_tss_t *key) { return PyThread_get_key_value(*key); } #endif #if PY_MAJOR_VERSION < 3 #if CYTHON_COMPILING_IN_PYPY #if PYPY_VERSION_NUM < 0x07030600 #if defined(__cplusplus) && 
__cplusplus >= 201402L [[deprecated("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6")]] #elif defined(__GNUC__) || defined(__clang__) __attribute__ ((__deprecated__("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6"))) #elif defined(_MSC_VER) __declspec(deprecated("`with nogil:` inside a nogil function will not release the GIL in PyPy2 < 7.3.6")) #endif static CYTHON_INLINE int PyGILState_Check(void) { return 0; } #else // PYPY_VERSION_NUM < 0x07030600 #endif // PYPY_VERSION_NUM < 0x07030600 #else static CYTHON_INLINE int PyGILState_Check(void) { PyThreadState * tstate = _PyThreadState_Current; return tstate && (tstate == PyGILState_GetThisThreadState()); } #endif #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030d0000 || defined(_PyDict_NewPresized) #define __Pyx_PyDict_NewPresized(n) ((n <= 8) ? PyDict_New() : _PyDict_NewPresized(n)) #else #define __Pyx_PyDict_NewPresized(n) PyDict_New() #endif #if PY_MAJOR_VERSION >= 3 || CYTHON_FUTURE_DIVISION #define __Pyx_PyNumber_Divide(x,y) PyNumber_TrueDivide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceTrueDivide(x,y) #else #define __Pyx_PyNumber_Divide(x,y) PyNumber_Divide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceDivide(x,y) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX > 0x030600B4 && PY_VERSION_HEX < 0x030d0000 && CYTHON_USE_UNICODE_INTERNALS #define __Pyx_PyDict_GetItemStrWithError(dict, name) _PyDict_GetItem_KnownHash(dict, name, ((PyASCIIObject *) name)->hash) static CYTHON_INLINE PyObject * __Pyx_PyDict_GetItemStr(PyObject *dict, PyObject *name) { PyObject *res = __Pyx_PyDict_GetItemStrWithError(dict, name); if (res == NULL) PyErr_Clear(); return res; } #elif PY_MAJOR_VERSION >= 3 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07020000) #define __Pyx_PyDict_GetItemStrWithError PyDict_GetItemWithError #define __Pyx_PyDict_GetItemStr PyDict_GetItem #else static CYTHON_INLINE PyObject * __Pyx_PyDict_GetItemStrWithError(PyObject *dict, PyObject *name) { #if CYTHON_COMPILING_IN_PYPY return PyDict_GetItem(dict, name); #else PyDictEntry *ep; PyDictObject *mp = (PyDictObject*) dict; long hash = ((PyStringObject *) name)->ob_shash; assert(hash != -1); ep = (mp->ma_lookup)(mp, name, hash); if (ep == NULL) { return NULL; } return ep->me_value; #endif } #define __Pyx_PyDict_GetItemStr PyDict_GetItem #endif #if CYTHON_USE_TYPE_SLOTS #define __Pyx_PyType_GetFlags(tp) (((PyTypeObject *)tp)->tp_flags) #define __Pyx_PyType_HasFeature(type, feature) ((__Pyx_PyType_GetFlags(type) & (feature)) != 0) #define __Pyx_PyObject_GetIterNextFunc(obj) (Py_TYPE(obj)->tp_iternext) #else #define __Pyx_PyType_GetFlags(tp) (PyType_GetFlags((PyTypeObject *)tp)) #define __Pyx_PyType_HasFeature(type, feature) PyType_HasFeature(type, feature) #define __Pyx_PyObject_GetIterNextFunc(obj) PyIter_Next #endif #if CYTHON_COMPILING_IN_LIMITED_API #define __Pyx_SetItemOnTypeDict(tp, k, v) PyObject_GenericSetAttr((PyObject*)tp, k, v) #else #define __Pyx_SetItemOnTypeDict(tp, k, v) PyDict_SetItem(tp->tp_dict, k, v) #endif #if CYTHON_USE_TYPE_SPECS && PY_VERSION_HEX >= 0x03080000 #define __Pyx_PyHeapTypeObject_GC_Del(obj) {\ PyTypeObject *type = Py_TYPE((PyObject*)obj);\ assert(__Pyx_PyType_HasFeature(type, Py_TPFLAGS_HEAPTYPE));\ PyObject_GC_Del(obj);\ Py_DECREF(type);\ } #else #define __Pyx_PyHeapTypeObject_GC_Del(obj) PyObject_GC_Del(obj) #endif #if CYTHON_COMPILING_IN_LIMITED_API #define CYTHON_PEP393_ENABLED 1 #define __Pyx_PyUnicode_READY(op) (0) 
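/* The PyUnicode shims in this block give the module a single string API across builds: under the Limited API they route through stable PyUnicode_* calls, on CPython >= 3.3 they map onto the PEP 393 compact-unicode macros, and the final branch falls back to the legacy Py_UNICODE layout. */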
#define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GetLength(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_ReadChar(u, i) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((void)u, 1114111U) #define __Pyx_PyUnicode_KIND(u) ((void)u, (0)) #define __Pyx_PyUnicode_DATA(u) ((void*)u) #define __Pyx_PyUnicode_READ(k, d, i) ((void)k, PyUnicode_ReadChar((PyObject*)(d), i)) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GetLength(u)) #elif PY_VERSION_HEX > 0x03030000 && defined(PyUnicode_KIND) #define CYTHON_PEP393_ENABLED 1 #if PY_VERSION_HEX >= 0x030C0000 #define __Pyx_PyUnicode_READY(op) (0) #else #define __Pyx_PyUnicode_READY(op) (likely(PyUnicode_IS_READY(op)) ?\ 0 : _PyUnicode_Ready((PyObject *)(op))) #endif #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_LENGTH(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_READ_CHAR(u, i) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) PyUnicode_MAX_CHAR_VALUE(u) #define __Pyx_PyUnicode_KIND(u) ((int)PyUnicode_KIND(u)) #define __Pyx_PyUnicode_DATA(u) PyUnicode_DATA(u) #define __Pyx_PyUnicode_READ(k, d, i) PyUnicode_READ(k, d, i) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) PyUnicode_WRITE(k, d, i, (Py_UCS4) ch) #if PY_VERSION_HEX >= 0x030C0000 #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_LENGTH(u)) #else #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03090000 #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : ((PyCompactUnicodeObject *)(u))->wstr_length)) #else #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : PyUnicode_GET_SIZE(u))) #endif #endif #else #define CYTHON_PEP393_ENABLED 0 #define PyUnicode_1BYTE_KIND 1 #define PyUnicode_2BYTE_KIND 2 #define PyUnicode_4BYTE_KIND 4 #define __Pyx_PyUnicode_READY(op) (0) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_SIZE(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) ((Py_UCS4)(PyUnicode_AS_UNICODE(u)[i])) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((sizeof(Py_UNICODE) == 2) ? 65535U : 1114111U) #define __Pyx_PyUnicode_KIND(u) ((int)sizeof(Py_UNICODE)) #define __Pyx_PyUnicode_DATA(u) ((void*)PyUnicode_AS_UNICODE(u)) #define __Pyx_PyUnicode_READ(k, d, i) ((void)(k), (Py_UCS4)(((Py_UNICODE*)d)[i])) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) (((void)(k)), ((Py_UNICODE*)d)[i] = (Py_UNICODE) ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_SIZE(u)) #endif #if CYTHON_COMPILING_IN_PYPY #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b) #else #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ?\ PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b)) #endif #if CYTHON_COMPILING_IN_PYPY #if !defined(PyUnicode_DecodeUnicodeEscape) #define PyUnicode_DecodeUnicodeEscape(s, size, errors) PyUnicode_Decode(s, size, "unicode_escape", errors) #endif #if !defined(PyUnicode_Contains) || (PY_MAJOR_VERSION == 2 && PYPY_VERSION_NUM < 0x07030500) #undef PyUnicode_Contains #define PyUnicode_Contains(u, s) PySequence_Contains(u, s) #endif #if !defined(PyByteArray_Check) #define PyByteArray_Check(obj) PyObject_TypeCheck(obj, &PyByteArray_Type) #endif #if !defined(PyObject_Format) #define PyObject_Format(obj, fmt) PyObject_CallMethod(obj, "__format__", "O", fmt) #endif #endif #define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyString_Check(b) && !PyString_CheckExact(b)))) ? 
PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b)) #define __Pyx_PyUnicode_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyUnicode_Check(b) && !PyUnicode_CheckExact(b)))) ? PyNumber_Remainder(a, b) : PyUnicode_Format(a, b)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyString_Format(a, b) PyUnicode_Format(a, b) #else #define __Pyx_PyString_Format(a, b) PyString_Format(a, b) #endif #if PY_MAJOR_VERSION < 3 && !defined(PyObject_ASCII) #define PyObject_ASCII(o) PyObject_Repr(o) #endif #if PY_MAJOR_VERSION >= 3 #define PyBaseString_Type PyUnicode_Type #define PyStringObject PyUnicodeObject #define PyString_Type PyUnicode_Type #define PyString_Check PyUnicode_Check #define PyString_CheckExact PyUnicode_CheckExact #ifndef PyObject_Unicode #define PyObject_Unicode PyObject_Str #endif #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyBaseString_Check(obj) PyUnicode_Check(obj) #define __Pyx_PyBaseString_CheckExact(obj) PyUnicode_CheckExact(obj) #else #define __Pyx_PyBaseString_Check(obj) (PyString_Check(obj) || PyUnicode_Check(obj)) #define __Pyx_PyBaseString_CheckExact(obj) (PyString_CheckExact(obj) || PyUnicode_CheckExact(obj)) #endif #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_PySequence_ListKeepNew(obj)\ (likely(PyList_CheckExact(obj) && Py_REFCNT(obj) == 1) ? __Pyx_NewRef(obj) : PySequence_List(obj)) #else #define __Pyx_PySequence_ListKeepNew(obj) PySequence_List(obj) #endif #ifndef PySet_CheckExact #define PySet_CheckExact(obj) __Pyx_IS_TYPE(obj, &PySet_Type) #endif #if PY_VERSION_HEX >= 0x030900A4 #define __Pyx_SET_REFCNT(obj, refcnt) Py_SET_REFCNT(obj, refcnt) #define __Pyx_SET_SIZE(obj, size) Py_SET_SIZE(obj, size) #else #define __Pyx_SET_REFCNT(obj, refcnt) Py_REFCNT(obj) = (refcnt) #define __Pyx_SET_SIZE(obj, size) Py_SIZE(obj) = (size) #endif #if CYTHON_ASSUME_SAFE_MACROS #define __Pyx_PySequence_ITEM(o, i) PySequence_ITEM(o, i) #define __Pyx_PySequence_SIZE(seq) Py_SIZE(seq) #define __Pyx_PyTuple_SET_ITEM(o, i, v) (PyTuple_SET_ITEM(o, i, v), (0)) #define __Pyx_PyList_SET_ITEM(o, i, v) (PyList_SET_ITEM(o, i, v), (0)) #define __Pyx_PyTuple_GET_SIZE(o) PyTuple_GET_SIZE(o) #define __Pyx_PyList_GET_SIZE(o) PyList_GET_SIZE(o) #define __Pyx_PySet_GET_SIZE(o) PySet_GET_SIZE(o) #define __Pyx_PyBytes_GET_SIZE(o) PyBytes_GET_SIZE(o) #define __Pyx_PyByteArray_GET_SIZE(o) PyByteArray_GET_SIZE(o) #else #define __Pyx_PySequence_ITEM(o, i) PySequence_GetItem(o, i) #define __Pyx_PySequence_SIZE(seq) PySequence_Size(seq) #define __Pyx_PyTuple_SET_ITEM(o, i, v) PyTuple_SetItem(o, i, v) #define __Pyx_PyList_SET_ITEM(o, i, v) PyList_SetItem(o, i, v) #define __Pyx_PyTuple_GET_SIZE(o) PyTuple_Size(o) #define __Pyx_PyList_GET_SIZE(o) PyList_Size(o) #define __Pyx_PySet_GET_SIZE(o) PySet_Size(o) #define __Pyx_PyBytes_GET_SIZE(o) PyBytes_Size(o) #define __Pyx_PyByteArray_GET_SIZE(o) PyByteArray_Size(o) #endif #if __PYX_LIMITED_VERSION_HEX >= 0x030d00A1 #define __Pyx_PyImport_AddModuleRef(name) PyImport_AddModuleRef(name) #else static CYTHON_INLINE PyObject *__Pyx_PyImport_AddModuleRef(const char *name) { PyObject *module = PyImport_AddModule(name); Py_XINCREF(module); return module; } #endif #if PY_MAJOR_VERSION >= 3 #define PyIntObject PyLongObject #define PyInt_Type PyLong_Type #define PyInt_Check(op) PyLong_Check(op) #define PyInt_CheckExact(op) PyLong_CheckExact(op) #define __Pyx_Py3Int_Check(op) PyLong_Check(op) #define __Pyx_Py3Int_CheckExact(op) PyLong_CheckExact(op) #define PyInt_FromString PyLong_FromString #define PyInt_FromUnicode PyLong_FromUnicode #define PyInt_FromLong PyLong_FromLong #define 
PyInt_FromSize_t PyLong_FromSize_t #define PyInt_FromSsize_t PyLong_FromSsize_t #define PyInt_AsLong PyLong_AsLong #define PyInt_AS_LONG PyLong_AS_LONG #define PyInt_AsSsize_t PyLong_AsSsize_t #define PyInt_AsUnsignedLongMask PyLong_AsUnsignedLongMask #define PyInt_AsUnsignedLongLongMask PyLong_AsUnsignedLongLongMask #define PyNumber_Int PyNumber_Long #else #define __Pyx_Py3Int_Check(op) (PyLong_Check(op) || PyInt_Check(op)) #define __Pyx_Py3Int_CheckExact(op) (PyLong_CheckExact(op) || PyInt_CheckExact(op)) #endif #if PY_MAJOR_VERSION >= 3 #define PyBoolObject PyLongObject #endif #if PY_MAJOR_VERSION >= 3 && CYTHON_COMPILING_IN_PYPY #ifndef PyUnicode_InternFromString #define PyUnicode_InternFromString(s) PyUnicode_FromString(s) #endif #endif #if PY_VERSION_HEX < 0x030200A4 typedef long Py_hash_t; #define __Pyx_PyInt_FromHash_t PyInt_FromLong #define __Pyx_PyInt_AsHash_t __Pyx_PyIndex_AsHash_t #else #define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t #define __Pyx_PyInt_AsHash_t __Pyx_PyIndex_AsSsize_t #endif #if CYTHON_USE_ASYNC_SLOTS #if PY_VERSION_HEX >= 0x030500B1 #define __Pyx_PyAsyncMethodsStruct PyAsyncMethods #define __Pyx_PyType_AsAsync(obj) (Py_TYPE(obj)->tp_as_async) #else #define __Pyx_PyType_AsAsync(obj) ((__Pyx_PyAsyncMethodsStruct*) (Py_TYPE(obj)->tp_reserved)) #endif #else #define __Pyx_PyType_AsAsync(obj) NULL #endif #ifndef __Pyx_PyAsyncMethodsStruct typedef struct { unaryfunc am_await; unaryfunc am_aiter; unaryfunc am_anext; } __Pyx_PyAsyncMethodsStruct; #endif #if defined(_WIN32) || defined(WIN32) || defined(MS_WINDOWS) #if !defined(_USE_MATH_DEFINES) #define _USE_MATH_DEFINES #endif #endif #include <math.h> #ifdef NAN #define __PYX_NAN() ((float) NAN) #else static CYTHON_INLINE float __PYX_NAN() { float value; memset(&value, 0xFF, sizeof(value)); return value; } #endif #if defined(__CYGWIN__) && defined(_LDBL_EQ_DBL) #define __Pyx_truncl trunc #else #define __Pyx_truncl truncl #endif #define __PYX_MARK_ERR_POS(f_index, lineno) \ { __pyx_filename = __pyx_f[f_index]; (void)__pyx_filename; __pyx_lineno = lineno; (void)__pyx_lineno; __pyx_clineno = __LINE__; (void)__pyx_clineno; } #define __PYX_ERR(f_index, lineno, Ln_error) \ { __PYX_MARK_ERR_POS(f_index, lineno) goto Ln_error; } #ifdef CYTHON_EXTERN_C #undef __PYX_EXTERN_C #define __PYX_EXTERN_C CYTHON_EXTERN_C #elif defined(__PYX_EXTERN_C) #ifdef _MSC_VER #pragma message ("Please do not define the '__PYX_EXTERN_C' macro externally. Use 'CYTHON_EXTERN_C' instead.") #else #warning Please do not define the '__PYX_EXTERN_C' macro externally. Use 'CYTHON_EXTERN_C' instead.
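/* If neither CYTHON_EXTERN_C nor __PYX_EXTERN_C was defined by the user, the fallback branch below declares __PYX_EXTERN_C as extern "C" under C++ and as plain extern in C. */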
#endif #else #ifdef __cplusplus #define __PYX_EXTERN_C extern "C" #else #define __PYX_EXTERN_C extern #endif #endif #define __PYX_HAVE__gammapy__stats__fit_statistics_cython #define __PYX_HAVE_API__gammapy__stats__fit_statistics_cython /* Early includes */ #include #include /* Using NumPy API declarations from "numpy/__init__.cython-30.pxd" */ #include "numpy/arrayobject.h" #include "numpy/ndarrayobject.h" #include "numpy/ndarraytypes.h" #include "numpy/arrayscalars.h" #include "numpy/ufuncobject.h" #include "math.h" #ifdef _OPENMP #include <omp.h> #endif /* _OPENMP */ #if defined(PYREX_WITHOUT_ASSERTIONS) && !defined(CYTHON_WITHOUT_ASSERTIONS) #define CYTHON_WITHOUT_ASSERTIONS #endif typedef struct {PyObject **p; const char *s; const Py_ssize_t n; const char* encoding; const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry; #define __PYX_DEFAULT_STRING_ENCODING_IS_ASCII 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_UTF8 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT (PY_MAJOR_VERSION >= 3 && __PYX_DEFAULT_STRING_ENCODING_IS_UTF8) #define __PYX_DEFAULT_STRING_ENCODING "" #define __Pyx_PyObject_FromString __Pyx_PyBytes_FromString #define __Pyx_PyObject_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #define __Pyx_uchar_cast(c) ((unsigned char)c) #define __Pyx_long_cast(x) ((long)x) #define __Pyx_fits_Py_ssize_t(v, type, is_signed) (\ (sizeof(type) < sizeof(Py_ssize_t)) ||\ (sizeof(type) > sizeof(Py_ssize_t) &&\ likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX) &&\ (!is_signed || likely(v > (type)PY_SSIZE_T_MIN ||\ v == (type)PY_SSIZE_T_MIN))) ||\ (sizeof(type) == sizeof(Py_ssize_t) &&\ (is_signed || likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX))) ) static CYTHON_INLINE int __Pyx_is_valid_index(Py_ssize_t i, Py_ssize_t limit) { return (size_t) i < (size_t) limit; } #if defined (__cplusplus) && __cplusplus >= 201103L #include <cstdlib> #define __Pyx_sst_abs(value) std::abs(value) #elif SIZEOF_INT >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) abs(value) #elif SIZEOF_LONG >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) labs(value) #elif defined (_MSC_VER) #define __Pyx_sst_abs(value) ((Py_ssize_t)_abs64(value)) #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define __Pyx_sst_abs(value) llabs(value) #elif defined (__GNUC__) #define __Pyx_sst_abs(value) __builtin_llabs(value) #else #define __Pyx_sst_abs(value) ((value<0) ?
-value : value) #endif static CYTHON_INLINE Py_ssize_t __Pyx_ssize_strlen(const char *s); static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject*); static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject*, Py_ssize_t* length); static CYTHON_INLINE PyObject* __Pyx_PyByteArray_FromString(const char*); #define __Pyx_PyByteArray_FromStringAndSize(s, l) PyByteArray_FromStringAndSize((const char*)s, l) #define __Pyx_PyBytes_FromString PyBytes_FromString #define __Pyx_PyBytes_FromStringAndSize PyBytes_FromStringAndSize static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char*); #if PY_MAJOR_VERSION < 3 #define __Pyx_PyStr_FromString __Pyx_PyBytes_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #else #define __Pyx_PyStr_FromString __Pyx_PyUnicode_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyUnicode_FromStringAndSize #endif #define __Pyx_PyBytes_AsWritableString(s) ((char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableSString(s) ((signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableUString(s) ((unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsString(s) ((const char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsSString(s) ((const signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsUString(s) ((const unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyObject_AsWritableString(s) ((char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableSString(s) ((signed char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableUString(s) ((unsigned char*)(__pyx_uintptr_t) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsSString(s) ((const signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsUString(s) ((const unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_FromCString(s) __Pyx_PyObject_FromString((const char*)s) #define __Pyx_PyBytes_FromCString(s) __Pyx_PyBytes_FromString((const char*)s) #define __Pyx_PyByteArray_FromCString(s) __Pyx_PyByteArray_FromString((const char*)s) #define __Pyx_PyStr_FromCString(s) __Pyx_PyStr_FromString((const char*)s) #define __Pyx_PyUnicode_FromCString(s) __Pyx_PyUnicode_FromString((const char*)s) #define __Pyx_PyUnicode_FromOrdinal(o) PyUnicode_FromOrdinal((int)o) #define __Pyx_PyUnicode_AsUnicode PyUnicode_AsUnicode #define __Pyx_NewRef(obj) (Py_INCREF(obj), obj) #define __Pyx_Owned_Py_None(b) __Pyx_NewRef(Py_None) static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b); static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject*); static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject*); static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x); #define __Pyx_PySequence_Tuple(obj)\ (likely(PyTuple_CheckExact(obj)) ? __Pyx_NewRef(obj) : PySequence_Tuple(obj)) static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject*); static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t); static CYTHON_INLINE Py_hash_t __Pyx_PyIndex_AsHash_t(PyObject*); #if CYTHON_ASSUME_SAFE_MACROS #define __pyx_PyFloat_AsDouble(x) (PyFloat_CheckExact(x) ? PyFloat_AS_DOUBLE(x) : PyFloat_AsDouble(x)) #else #define __pyx_PyFloat_AsDouble(x) PyFloat_AsDouble(x) #endif #define __pyx_PyFloat_AsFloat(x) ((float) __pyx_PyFloat_AsDouble(x)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyNumber_Int(x) (PyLong_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Long(x)) #else #define __Pyx_PyNumber_Int(x) (PyInt_CheckExact(x) ? 
__Pyx_NewRef(x) : PyNumber_Int(x)) #endif #if CYTHON_USE_PYLONG_INTERNALS #if PY_VERSION_HEX >= 0x030C00A7 #ifndef _PyLong_SIGN_MASK #define _PyLong_SIGN_MASK 3 #endif #ifndef _PyLong_NON_SIZE_BITS #define _PyLong_NON_SIZE_BITS 3 #endif #define __Pyx_PyLong_Sign(x) (((PyLongObject*)x)->long_value.lv_tag & _PyLong_SIGN_MASK) #define __Pyx_PyLong_IsNeg(x) ((__Pyx_PyLong_Sign(x) & 2) != 0) #define __Pyx_PyLong_IsNonNeg(x) (!__Pyx_PyLong_IsNeg(x)) #define __Pyx_PyLong_IsZero(x) (__Pyx_PyLong_Sign(x) & 1) #define __Pyx_PyLong_IsPos(x) (__Pyx_PyLong_Sign(x) == 0) #define __Pyx_PyLong_CompactValueUnsigned(x) (__Pyx_PyLong_Digits(x)[0]) #define __Pyx_PyLong_DigitCount(x) ((Py_ssize_t) (((PyLongObject*)x)->long_value.lv_tag >> _PyLong_NON_SIZE_BITS)) #define __Pyx_PyLong_SignedDigitCount(x)\ ((1 - (Py_ssize_t) __Pyx_PyLong_Sign(x)) * __Pyx_PyLong_DigitCount(x)) #if defined(PyUnstable_Long_IsCompact) && defined(PyUnstable_Long_CompactValue) #define __Pyx_PyLong_IsCompact(x) PyUnstable_Long_IsCompact((PyLongObject*) x) #define __Pyx_PyLong_CompactValue(x) PyUnstable_Long_CompactValue((PyLongObject*) x) #else #define __Pyx_PyLong_IsCompact(x) (((PyLongObject*)x)->long_value.lv_tag < (2 << _PyLong_NON_SIZE_BITS)) #define __Pyx_PyLong_CompactValue(x) ((1 - (Py_ssize_t) __Pyx_PyLong_Sign(x)) * (Py_ssize_t) __Pyx_PyLong_Digits(x)[0]) #endif typedef Py_ssize_t __Pyx_compact_pylong; typedef size_t __Pyx_compact_upylong; #else #define __Pyx_PyLong_IsNeg(x) (Py_SIZE(x) < 0) #define __Pyx_PyLong_IsNonNeg(x) (Py_SIZE(x) >= 0) #define __Pyx_PyLong_IsZero(x) (Py_SIZE(x) == 0) #define __Pyx_PyLong_IsPos(x) (Py_SIZE(x) > 0) #define __Pyx_PyLong_CompactValueUnsigned(x) ((Py_SIZE(x) == 0) ? 0 : __Pyx_PyLong_Digits(x)[0]) #define __Pyx_PyLong_DigitCount(x) __Pyx_sst_abs(Py_SIZE(x)) #define __Pyx_PyLong_SignedDigitCount(x) Py_SIZE(x) #define __Pyx_PyLong_IsCompact(x) (Py_SIZE(x) == 0 || Py_SIZE(x) == 1 || Py_SIZE(x) == -1) #define __Pyx_PyLong_CompactValue(x)\ ((Py_SIZE(x) == 0) ? (sdigit) 0 : ((Py_SIZE(x) < 0) ? 
-(sdigit)__Pyx_PyLong_Digits(x)[0] : (sdigit)__Pyx_PyLong_Digits(x)[0])) typedef sdigit __Pyx_compact_pylong; typedef digit __Pyx_compact_upylong; #endif #if PY_VERSION_HEX >= 0x030C00A5 #define __Pyx_PyLong_Digits(x) (((PyLongObject*)x)->long_value.ob_digit) #else #define __Pyx_PyLong_Digits(x) (((PyLongObject*)x)->ob_digit) #endif #endif #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII #include <string.h> static int __Pyx_sys_getdefaultencoding_not_ascii; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; PyObject* ascii_chars_u = NULL; PyObject* ascii_chars_b = NULL; const char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; if (strcmp(default_encoding_c, "ascii") == 0) { __Pyx_sys_getdefaultencoding_not_ascii = 0; } else { char ascii_chars[128]; int c; for (c = 0; c < 128; c++) { ascii_chars[c] = (char) c; } __Pyx_sys_getdefaultencoding_not_ascii = 1; ascii_chars_u = PyUnicode_DecodeASCII(ascii_chars, 128, NULL); if (!ascii_chars_u) goto bad; ascii_chars_b = PyUnicode_AsEncodedString(ascii_chars_u, default_encoding_c, NULL); if (!ascii_chars_b || !PyBytes_Check(ascii_chars_b) || memcmp(ascii_chars, PyBytes_AS_STRING(ascii_chars_b), 128) != 0) { PyErr_Format( PyExc_ValueError, "This module compiled with c_string_encoding=ascii, but default encoding '%.200s' is not a superset of ascii.", default_encoding_c); goto bad; } Py_DECREF(ascii_chars_u); Py_DECREF(ascii_chars_b); } Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); Py_XDECREF(ascii_chars_u); Py_XDECREF(ascii_chars_b); return -1; } #endif #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT && PY_MAJOR_VERSION >= 3 #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_DecodeUTF8(c_str, size, NULL) #else #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_Decode(c_str, size, __PYX_DEFAULT_STRING_ENCODING, NULL) #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT #include <string.h> static char* __PYX_DEFAULT_STRING_ENCODING; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) (const char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; __PYX_DEFAULT_STRING_ENCODING = (char*) malloc(strlen(default_encoding_c) + 1); if (!__PYX_DEFAULT_STRING_ENCODING) goto bad; strcpy(__PYX_DEFAULT_STRING_ENCODING, default_encoding_c); Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); return -1; } #endif #endif /* Test for GCC > 2.95 */ #if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && (__GNUC_MINOR__ > 95))) #define likely(x) __builtin_expect(!!(x), 1) #define unlikely(x) __builtin_expect(!!(x), 0) #else /* !__GNUC__ or GCC < 2.95 */ #define likely(x) (x) #define unlikely(x) (x) #endif /* __GNUC__ */ static CYTHON_INLINE void __Pyx_pretend_to_initialize(void* ptr) { (void)ptr; } #if !CYTHON_USE_MODULE_STATE static PyObject *__pyx_m = NULL; #endif static int __pyx_lineno; static int __pyx_clineno = 0; static const char * __pyx_cfilenm = __FILE__; static const char
*__pyx_filename; /* Header.proto */ #if !defined(CYTHON_CCOMPLEX) #if defined(__cplusplus) #define CYTHON_CCOMPLEX 1 #elif (defined(_Complex_I) && !defined(_MSC_VER)) || ((defined (__STDC_VERSION__) && __STDC_VERSION__ >= 201112L) && !defined(__STDC_NO_COMPLEX__) && !defined(_MSC_VER)) #define CYTHON_CCOMPLEX 1 #else #define CYTHON_CCOMPLEX 0 #endif #endif #if CYTHON_CCOMPLEX #ifdef __cplusplus #include <complex> #else #include <complex.h> #endif #endif #if CYTHON_CCOMPLEX && !defined(__cplusplus) && defined(__sun__) && defined(__GNUC__) #undef _Complex_I #define _Complex_I 1.0fj #endif /* #### Code section: filename_table ### */ static const char *__pyx_f[] = { "gammapy/stats/fit_statistics_cython.pyx", "__init__.cython-30.pxd", "type.pxd", }; /* #### Code section: utility_code_proto_before_types ### */ /* ForceInitThreads.proto */ #ifndef __PYX_FORCE_INIT_THREADS #define __PYX_FORCE_INIT_THREADS 0 #endif /* BufferFormatStructs.proto */ struct __Pyx_StructField_; #define __PYX_BUF_FLAGS_PACKED_STRUCT (1 << 0) typedef struct { const char* name; struct __Pyx_StructField_* fields; size_t size; size_t arraysize[8]; int ndim; char typegroup; char is_unsigned; int flags; } __Pyx_TypeInfo; typedef struct __Pyx_StructField_ { __Pyx_TypeInfo* type; const char* name; size_t offset; } __Pyx_StructField; typedef struct { __Pyx_StructField* field; size_t parent_offset; } __Pyx_BufFmt_StackElem; typedef struct { __Pyx_StructField root; __Pyx_BufFmt_StackElem* head; size_t fmt_offset; size_t new_count, enc_count; size_t struct_alignment; int is_complex; char enc_type; char new_packmode; char enc_packmode; char is_valid_array; } __Pyx_BufFmt_Context; /* #### Code section: numeric_typedefs ### */ /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":770 * # in Cython to enable them only on the right systems.
* * ctypedef npy_int8 int8_t # <<<<<<<<<<<<<< * ctypedef npy_int16 int16_t * ctypedef npy_int32 int32_t */ typedef npy_int8 __pyx_t_5numpy_int8_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":771 * * ctypedef npy_int8 int8_t * ctypedef npy_int16 int16_t # <<<<<<<<<<<<<< * ctypedef npy_int32 int32_t * ctypedef npy_int64 int64_t */ typedef npy_int16 __pyx_t_5numpy_int16_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":772 * ctypedef npy_int8 int8_t * ctypedef npy_int16 int16_t * ctypedef npy_int32 int32_t # <<<<<<<<<<<<<< * ctypedef npy_int64 int64_t * #ctypedef npy_int96 int96_t */ typedef npy_int32 __pyx_t_5numpy_int32_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":773 * ctypedef npy_int16 int16_t * ctypedef npy_int32 int32_t * ctypedef npy_int64 int64_t # <<<<<<<<<<<<<< * #ctypedef npy_int96 int96_t * #ctypedef npy_int128 int128_t */ typedef npy_int64 __pyx_t_5numpy_int64_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":777 * #ctypedef npy_int128 int128_t * * ctypedef npy_uint8 uint8_t # <<<<<<<<<<<<<< * ctypedef npy_uint16 uint16_t * ctypedef npy_uint32 uint32_t */ typedef npy_uint8 __pyx_t_5numpy_uint8_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":778 * * ctypedef npy_uint8 uint8_t * ctypedef npy_uint16 uint16_t # <<<<<<<<<<<<<< * ctypedef npy_uint32 uint32_t * ctypedef npy_uint64 uint64_t */ typedef npy_uint16 __pyx_t_5numpy_uint16_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":779 * ctypedef npy_uint8 uint8_t * ctypedef npy_uint16 uint16_t * ctypedef npy_uint32 uint32_t # <<<<<<<<<<<<<< * ctypedef npy_uint64 uint64_t * #ctypedef npy_uint96 uint96_t */ typedef npy_uint32 __pyx_t_5numpy_uint32_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":780 * ctypedef npy_uint16 uint16_t * ctypedef npy_uint32 uint32_t * ctypedef npy_uint64 uint64_t # <<<<<<<<<<<<<< * #ctypedef npy_uint96 uint96_t * #ctypedef npy_uint128 uint128_t */ typedef npy_uint64 __pyx_t_5numpy_uint64_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":784 * #ctypedef npy_uint128 uint128_t * * ctypedef npy_float32 float32_t # <<<<<<<<<<<<<< * ctypedef npy_float64 float64_t * #ctypedef npy_float80 float80_t */ typedef npy_float32 __pyx_t_5numpy_float32_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":785 * * ctypedef npy_float32 float32_t * ctypedef npy_float64 float64_t # <<<<<<<<<<<<<< * #ctypedef npy_float80 float80_t * #ctypedef npy_float128 float128_t */ typedef npy_float64 __pyx_t_5numpy_float64_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":792 * ctypedef double complex complex128_t * * ctypedef npy_longlong longlong_t # <<<<<<<<<<<<<< * ctypedef npy_ulonglong ulonglong_t * */ typedef npy_longlong __pyx_t_5numpy_longlong_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":793 * * ctypedef npy_longlong longlong_t * ctypedef npy_ulonglong ulonglong_t # <<<<<<<<<<<<<< * * ctypedef npy_intp intp_t */ typedef npy_ulonglong __pyx_t_5numpy_ulonglong_t; /* 
"../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":795 * ctypedef npy_ulonglong ulonglong_t * * ctypedef npy_intp intp_t # <<<<<<<<<<<<<< * ctypedef npy_uintp uintp_t * */ typedef npy_intp __pyx_t_5numpy_intp_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":796 * * ctypedef npy_intp intp_t * ctypedef npy_uintp uintp_t # <<<<<<<<<<<<<< * * ctypedef npy_double float_t */ typedef npy_uintp __pyx_t_5numpy_uintp_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":798 * ctypedef npy_uintp uintp_t * * ctypedef npy_double float_t # <<<<<<<<<<<<<< * ctypedef npy_double double_t * ctypedef npy_longdouble longdouble_t */ typedef npy_double __pyx_t_5numpy_float_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":799 * * ctypedef npy_double float_t * ctypedef npy_double double_t # <<<<<<<<<<<<<< * ctypedef npy_longdouble longdouble_t * */ typedef npy_double __pyx_t_5numpy_double_t; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":800 * ctypedef npy_double float_t * ctypedef npy_double double_t * ctypedef npy_longdouble longdouble_t # <<<<<<<<<<<<<< * * ctypedef float complex cfloat_t */ typedef npy_longdouble __pyx_t_5numpy_longdouble_t; /* #### Code section: complex_type_declarations ### */ /* Declarations.proto */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #ifdef __cplusplus typedef ::std::complex< float > __pyx_t_float_complex; #else typedef float _Complex __pyx_t_float_complex; #endif #else typedef struct { float real, imag; } __pyx_t_float_complex; #endif static CYTHON_INLINE __pyx_t_float_complex __pyx_t_float_complex_from_parts(float, float); /* Declarations.proto */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #ifdef __cplusplus typedef ::std::complex< double > __pyx_t_double_complex; #else typedef double _Complex __pyx_t_double_complex; #endif #else typedef struct { double real, imag; } __pyx_t_double_complex; #endif static CYTHON_INLINE __pyx_t_double_complex __pyx_t_double_complex_from_parts(double, double); /* Declarations.proto */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #ifdef __cplusplus typedef ::std::complex< long double > __pyx_t_long_double_complex; #else typedef long double _Complex __pyx_t_long_double_complex; #endif #else typedef struct { long double real, imag; } __pyx_t_long_double_complex; #endif static CYTHON_INLINE __pyx_t_long_double_complex __pyx_t_long_double_complex_from_parts(long double, long double); /* #### Code section: type_declarations ### */ /*--- Type declarations ---*/ /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1096 * * # Iterator API added in v1.6 * ctypedef int (*NpyIter_IterNextFunc)(NpyIter* it) noexcept nogil # <<<<<<<<<<<<<< * ctypedef void (*NpyIter_GetMultiIndexFunc)(NpyIter* it, npy_intp* outcoords) noexcept nogil * */ typedef int (*__pyx_t_5numpy_NpyIter_IterNextFunc)(NpyIter *); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1097 * # Iterator API added in v1.6 * ctypedef int (*NpyIter_IterNextFunc)(NpyIter* it) noexcept nogil * ctypedef void (*NpyIter_GetMultiIndexFunc)(NpyIter* it, npy_intp* outcoords) noexcept nogil # <<<<<<<<<<<<<< * * cdef extern from "numpy/arrayobject.h": */ typedef void (*__pyx_t_5numpy_NpyIter_GetMultiIndexFunc)(NpyIter *, npy_intp *); /* #### Code 
section: utility_code_proto ### */ /* --- Runtime support code (head) --- */ /* Refnanny.proto */ #ifndef CYTHON_REFNANNY #define CYTHON_REFNANNY 0 #endif #if CYTHON_REFNANNY typedef struct { void (*INCREF)(void*, PyObject*, Py_ssize_t); void (*DECREF)(void*, PyObject*, Py_ssize_t); void (*GOTREF)(void*, PyObject*, Py_ssize_t); void (*GIVEREF)(void*, PyObject*, Py_ssize_t); void* (*SetupContext)(const char*, Py_ssize_t, const char*); void (*FinishContext)(void**); } __Pyx_RefNannyAPIStruct; static __Pyx_RefNannyAPIStruct *__Pyx_RefNanny = NULL; static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname); #define __Pyx_RefNannyDeclarations void *__pyx_refnanny = NULL; #ifdef WITH_THREAD #define __Pyx_RefNannySetupContext(name, acquire_gil)\ if (acquire_gil) {\ PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__));\ PyGILState_Release(__pyx_gilstate_save);\ } else {\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__));\ } #define __Pyx_RefNannyFinishContextNogil() {\ PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ __Pyx_RefNannyFinishContext();\ PyGILState_Release(__pyx_gilstate_save);\ } #else #define __Pyx_RefNannySetupContext(name, acquire_gil)\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), (__LINE__), (__FILE__)) #define __Pyx_RefNannyFinishContextNogil() __Pyx_RefNannyFinishContext() #endif #define __Pyx_RefNannyFinishContextNogil() {\ PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ __Pyx_RefNannyFinishContext();\ PyGILState_Release(__pyx_gilstate_save);\ } #define __Pyx_RefNannyFinishContext()\ __Pyx_RefNanny->FinishContext(&__pyx_refnanny) #define __Pyx_INCREF(r) __Pyx_RefNanny->INCREF(__pyx_refnanny, (PyObject *)(r), (__LINE__)) #define __Pyx_DECREF(r) __Pyx_RefNanny->DECREF(__pyx_refnanny, (PyObject *)(r), (__LINE__)) #define __Pyx_GOTREF(r) __Pyx_RefNanny->GOTREF(__pyx_refnanny, (PyObject *)(r), (__LINE__)) #define __Pyx_GIVEREF(r) __Pyx_RefNanny->GIVEREF(__pyx_refnanny, (PyObject *)(r), (__LINE__)) #define __Pyx_XINCREF(r) do { if((r) == NULL); else {__Pyx_INCREF(r); }} while(0) #define __Pyx_XDECREF(r) do { if((r) == NULL); else {__Pyx_DECREF(r); }} while(0) #define __Pyx_XGOTREF(r) do { if((r) == NULL); else {__Pyx_GOTREF(r); }} while(0) #define __Pyx_XGIVEREF(r) do { if((r) == NULL); else {__Pyx_GIVEREF(r);}} while(0) #else #define __Pyx_RefNannyDeclarations #define __Pyx_RefNannySetupContext(name, acquire_gil) #define __Pyx_RefNannyFinishContextNogil() #define __Pyx_RefNannyFinishContext() #define __Pyx_INCREF(r) Py_INCREF(r) #define __Pyx_DECREF(r) Py_DECREF(r) #define __Pyx_GOTREF(r) #define __Pyx_GIVEREF(r) #define __Pyx_XINCREF(r) Py_XINCREF(r) #define __Pyx_XDECREF(r) Py_XDECREF(r) #define __Pyx_XGOTREF(r) #define __Pyx_XGIVEREF(r) #endif #define __Pyx_Py_XDECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; Py_XDECREF(tmp);\ } while (0) #define __Pyx_XDECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_XDECREF(tmp);\ } while (0) #define __Pyx_DECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_DECREF(tmp);\ } while (0) #define __Pyx_CLEAR(r) do { PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);} while(0) #define __Pyx_XCLEAR(r) do { if((r) != NULL) {PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);}} while(0) /* PyErrExceptionMatches.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_ExceptionMatches(err) 
__Pyx_PyErr_ExceptionMatchesInState(__pyx_tstate, err) static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err); #else #define __Pyx_PyErr_ExceptionMatches(err) PyErr_ExceptionMatches(err) #endif /* PyThreadStateGet.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyThreadState_declare PyThreadState *__pyx_tstate; #define __Pyx_PyThreadState_assign __pyx_tstate = __Pyx_PyThreadState_Current; #if PY_VERSION_HEX >= 0x030C00A6 #define __Pyx_PyErr_Occurred() (__pyx_tstate->current_exception != NULL) #define __Pyx_PyErr_CurrentExceptionType() (__pyx_tstate->current_exception ? (PyObject*) Py_TYPE(__pyx_tstate->current_exception) : (PyObject*) NULL) #else #define __Pyx_PyErr_Occurred() (__pyx_tstate->curexc_type != NULL) #define __Pyx_PyErr_CurrentExceptionType() (__pyx_tstate->curexc_type) #endif #else #define __Pyx_PyThreadState_declare #define __Pyx_PyThreadState_assign #define __Pyx_PyErr_Occurred() (PyErr_Occurred() != NULL) #define __Pyx_PyErr_CurrentExceptionType() PyErr_Occurred() #endif /* PyErrFetchRestore.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_Clear() __Pyx_ErrRestore(NULL, NULL, NULL) #define __Pyx_ErrRestoreWithState(type, value, tb) __Pyx_ErrRestoreInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) __Pyx_ErrFetchInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrRestore(type, value, tb) __Pyx_ErrRestoreInState(__pyx_tstate, type, value, tb) #define __Pyx_ErrFetch(type, value, tb) __Pyx_ErrFetchInState(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A6 #define __Pyx_PyErr_SetNone(exc) (Py_INCREF(exc), __Pyx_ErrRestore((exc), NULL, NULL)) #else #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #endif #else #define __Pyx_PyErr_Clear() PyErr_Clear() #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #define __Pyx_ErrRestoreWithState(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestoreInState(tstate, type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchInState(tstate, type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestore(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetch(type, value, tb) PyErr_Fetch(type, value, tb) #endif /* PyObjectGetAttrStr.proto */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GetAttrStr(o,n) PyObject_GetAttr(o,n) #endif /* PyObjectGetAttrStrNoError.proto */ static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStrNoError(PyObject* obj, PyObject* attr_name); /* GetBuiltinName.proto */ static PyObject *__Pyx_GetBuiltinName(PyObject *name); /* GetTopmostException.proto */ #if CYTHON_USE_EXC_INFO_STACK && CYTHON_FAST_THREAD_STATE static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate); #endif /* SaveResetException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_ExceptionSave(type, value, tb) __Pyx__ExceptionSave(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #define 
__Pyx_ExceptionReset(type, value, tb) __Pyx__ExceptionReset(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); #else #define __Pyx_ExceptionSave(type, value, tb) PyErr_GetExcInfo(type, value, tb) #define __Pyx_ExceptionReset(type, value, tb) PyErr_SetExcInfo(type, value, tb) #endif /* GetException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_GetException(type, value, tb) __Pyx__GetException(__pyx_tstate, type, value, tb) static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #else static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb); #endif /* PyObjectCall.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw); #else #define __Pyx_PyObject_Call(func, arg, kw) PyObject_Call(func, arg, kw) #endif /* RaiseException.proto */ static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause); /* TupleAndListFromArray.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyList_FromArray(PyObject *const *src, Py_ssize_t n); static CYTHON_INLINE PyObject* __Pyx_PyTuple_FromArray(PyObject *const *src, Py_ssize_t n); #endif /* IncludeStringH.proto */ #include <string.h> /* BytesEquals.proto */ static CYTHON_INLINE int __Pyx_PyBytes_Equals(PyObject* s1, PyObject* s2, int equals); /* UnicodeEquals.proto */ static CYTHON_INLINE int __Pyx_PyUnicode_Equals(PyObject* s1, PyObject* s2, int equals); /* fastcall.proto */ #if CYTHON_AVOID_BORROWED_REFS #define __Pyx_Arg_VARARGS(args, i) PySequence_GetItem(args, i) #elif CYTHON_ASSUME_SAFE_MACROS #define __Pyx_Arg_VARARGS(args, i) PyTuple_GET_ITEM(args, i) #else #define __Pyx_Arg_VARARGS(args, i) PyTuple_GetItem(args, i) #endif #if CYTHON_AVOID_BORROWED_REFS #define __Pyx_Arg_NewRef_VARARGS(arg) __Pyx_NewRef(arg) #define __Pyx_Arg_XDECREF_VARARGS(arg) Py_XDECREF(arg) #else #define __Pyx_Arg_NewRef_VARARGS(arg) arg #define __Pyx_Arg_XDECREF_VARARGS(arg) #endif #define __Pyx_NumKwargs_VARARGS(kwds) PyDict_Size(kwds) #define __Pyx_KwValues_VARARGS(args, nargs) NULL #define __Pyx_GetKwValue_VARARGS(kw, kwvalues, s) __Pyx_PyDict_GetItemStrWithError(kw, s) #define __Pyx_KwargsAsDict_VARARGS(kw, kwvalues) PyDict_Copy(kw) #if CYTHON_METH_FASTCALL #define __Pyx_Arg_FASTCALL(args, i) args[i] #define __Pyx_NumKwargs_FASTCALL(kwds) PyTuple_GET_SIZE(kwds) #define __Pyx_KwValues_FASTCALL(args, nargs) ((args) + (nargs)) static CYTHON_INLINE PyObject * __Pyx_GetKwValue_FASTCALL(PyObject *kwnames, PyObject *const *kwvalues, PyObject *s); #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030d0000 CYTHON_UNUSED static PyObject *__Pyx_KwargsAsDict_FASTCALL(PyObject *kwnames, PyObject *const *kwvalues); #else #define __Pyx_KwargsAsDict_FASTCALL(kw, kwvalues) _PyStack_AsDict(kwvalues, kw) #endif #define __Pyx_Arg_NewRef_FASTCALL(arg) arg /* no-op, __Pyx_Arg_FASTCALL is direct and this needs to have the same reference counting */ #define __Pyx_Arg_XDECREF_FASTCALL(arg)
__Pyx_Arg_XDECREF_VARARGS(arg) #endif #if CYTHON_COMPILING_IN_CPYTHON && CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS #define __Pyx_ArgsSlice_VARARGS(args, start, stop) __Pyx_PyTuple_FromArray(&__Pyx_Arg_VARARGS(args, start), stop - start) #define __Pyx_ArgsSlice_FASTCALL(args, start, stop) __Pyx_PyTuple_FromArray(&__Pyx_Arg_FASTCALL(args, start), stop - start) #else #define __Pyx_ArgsSlice_VARARGS(args, start, stop) PyTuple_GetSlice(args, start, stop) #define __Pyx_ArgsSlice_FASTCALL(args, start, stop) PyTuple_GetSlice(args, start, stop) #endif /* RaiseArgTupleInvalid.proto */ static void __Pyx_RaiseArgtupleInvalid(const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found); /* RaiseDoubleKeywords.proto */ static void __Pyx_RaiseDoubleKeywordsError(const char* func_name, PyObject* kw_name); /* ParseKeywords.proto */ static int __Pyx_ParseOptionalKeywords(PyObject *kwds, PyObject *const *kwvalues, PyObject **argnames[], PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args, const char* function_name); /* ArgTypeTest.proto */ #define __Pyx_ArgTypeTest(obj, type, none_allowed, name, exact)\ ((likely(__Pyx_IS_TYPE(obj, type) | (none_allowed && (obj == Py_None)))) ? 1 :\ __Pyx__ArgTypeTest(obj, type, name, exact)) static int __Pyx__ArgTypeTest(PyObject *obj, PyTypeObject *type, const char *name, int exact); /* IsLittleEndian.proto */ static CYTHON_INLINE int __Pyx_Is_Little_Endian(void); /* BufferFormatCheck.proto */ static const char* __Pyx_BufFmt_CheckString(__Pyx_BufFmt_Context* ctx, const char* ts); static void __Pyx_BufFmt_Init(__Pyx_BufFmt_Context* ctx, __Pyx_BufFmt_StackElem* stack, __Pyx_TypeInfo* type); /* BufferGetAndValidate.proto */ #define __Pyx_GetBufferAndValidate(buf, obj, dtype, flags, nd, cast, stack)\ ((obj == Py_None || obj == NULL) ?\ (__Pyx_ZeroBuffer(buf), 0) :\ __Pyx__GetBufferAndValidate(buf, obj, dtype, flags, nd, cast, stack)) static int __Pyx__GetBufferAndValidate(Py_buffer* buf, PyObject* obj, __Pyx_TypeInfo* dtype, int flags, int nd, int cast, __Pyx_BufFmt_StackElem* stack); static void __Pyx_ZeroBuffer(Py_buffer* buf); static CYTHON_INLINE void __Pyx_SafeReleaseBuffer(Py_buffer* info); static Py_ssize_t __Pyx_minusones[] = { -1, -1, -1, -1, -1, -1, -1, -1 }; static Py_ssize_t __Pyx_zeros[] = { 0, 0, 0, 0, 0, 0, 0, 0 }; /* PyDictVersioning.proto */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS #define __PYX_DICT_VERSION_INIT ((PY_UINT64_T) -1) #define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)\ (version_var) = __PYX_GET_DICT_VERSION(dict);\ (cache_var) = (value); #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\ (VAR) = __pyx_dict_cached_value;\ } else {\ (VAR) = __pyx_dict_cached_value = (LOOKUP);\ __pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\ }\ } static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj); static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj); static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version); #else #define __PYX_GET_DICT_VERSION(dict) (0) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var) #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) (VAR) = (LOOKUP); #endif 
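/* GetModuleGlobalName below caches module-level lookups keyed on the dict version tag: while the module dict __pyx_d is unchanged the cached PyObject* is reused (or builtins are consulted when nothing is cached), otherwise the slow lookup runs and refreshes the cache. */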
/* GetModuleGlobalName.proto */ #if CYTHON_USE_DICT_VERSIONS #define __Pyx_GetModuleGlobalName(var, name) do {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ (var) = (likely(__pyx_dict_version == __PYX_GET_DICT_VERSION(__pyx_d))) ?\ (likely(__pyx_dict_cached_value) ? __Pyx_NewRef(__pyx_dict_cached_value) : __Pyx_GetBuiltinName(name)) :\ __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } while(0) #define __Pyx_GetModuleGlobalNameUncached(var, name) do {\ PY_UINT64_T __pyx_dict_version;\ PyObject *__pyx_dict_cached_value;\ (var) = __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } while(0) static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value); #else #define __Pyx_GetModuleGlobalName(var, name) (var) = __Pyx__GetModuleGlobalName(name) #define __Pyx_GetModuleGlobalNameUncached(var, name) (var) = __Pyx__GetModuleGlobalName(name) static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name); #endif #define __Pyx_BufPtrStrided1d(type, buf, i0, s0) (type)((char*)buf + i0 * s0) /* TypeImport.proto */ #ifndef __PYX_HAVE_RT_ImportType_proto_3_0_11 #define __PYX_HAVE_RT_ImportType_proto_3_0_11 #if defined (__STDC_VERSION__) && __STDC_VERSION__ >= 201112L #include <stdalign.h> #endif #if (defined (__STDC_VERSION__) && __STDC_VERSION__ >= 201112L) || __cplusplus >= 201103L #define __PYX_GET_STRUCT_ALIGNMENT_3_0_11(s) alignof(s) #else #define __PYX_GET_STRUCT_ALIGNMENT_3_0_11(s) sizeof(void*) #endif enum __Pyx_ImportType_CheckSize_3_0_11 { __Pyx_ImportType_CheckSize_Error_3_0_11 = 0, __Pyx_ImportType_CheckSize_Warn_3_0_11 = 1, __Pyx_ImportType_CheckSize_Ignore_3_0_11 = 2 }; static PyTypeObject *__Pyx_ImportType_3_0_11(PyObject* module, const char *module_name, const char *class_name, size_t size, size_t alignment, enum __Pyx_ImportType_CheckSize_3_0_11 check_size); #endif /* Import.proto */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level); /* ImportDottedModule.proto */ static PyObject *__Pyx_ImportDottedModule(PyObject *name, PyObject *parts_tuple); #if PY_MAJOR_VERSION >= 3 static PyObject *__Pyx_ImportDottedModule_WalkParts(PyObject *module, PyObject *name, PyObject *parts_tuple); #endif /* IncludeStructmemberH.proto */ #include <structmember.h> /* FixUpExtensionType.proto */ #if CYTHON_USE_TYPE_SPECS static int __Pyx_fix_up_extension_type_from_spec(PyType_Spec *spec, PyTypeObject *type); #endif /* FetchSharedCythonModule.proto */ static PyObject *__Pyx_FetchSharedCythonABIModule(void); /* FetchCommonType.proto */ #if !CYTHON_USE_TYPE_SPECS static PyTypeObject* __Pyx_FetchCommonType(PyTypeObject* type); #else static PyTypeObject* __Pyx_FetchCommonTypeFromSpec(PyObject *module, PyType_Spec *spec, PyObject *bases); #endif /* PyMethodNew.proto */ #if CYTHON_COMPILING_IN_LIMITED_API static PyObject *__Pyx_PyMethod_New(PyObject *func, PyObject *self, PyObject *typ) { PyObject *typesModule=NULL, *methodType=NULL, *result=NULL; CYTHON_UNUSED_VAR(typ); if (!self) return __Pyx_NewRef(func); typesModule = PyImport_ImportModule("types"); if (!typesModule) return NULL; methodType = PyObject_GetAttrString(typesModule, "MethodType"); Py_DECREF(typesModule); if (!methodType) return NULL; result = PyObject_CallFunctionObjArgs(methodType, func, self, NULL); Py_DECREF(methodType); return result; } #elif PY_MAJOR_VERSION >= 3 static PyObject *__Pyx_PyMethod_New(PyObject *func, PyObject *self, PyObject *typ) { CYTHON_UNUSED_VAR(typ); if
(!self) return __Pyx_NewRef(func); return PyMethod_New(func, self); } #else #define __Pyx_PyMethod_New PyMethod_New #endif /* PyVectorcallFastCallDict.proto */ #if CYTHON_METH_FASTCALL static CYTHON_INLINE PyObject *__Pyx_PyVectorcall_FastCallDict(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw); #endif /* CythonFunctionShared.proto */ #define __Pyx_CyFunction_USED #define __Pyx_CYFUNCTION_STATICMETHOD 0x01 #define __Pyx_CYFUNCTION_CLASSMETHOD 0x02 #define __Pyx_CYFUNCTION_CCLASS 0x04 #define __Pyx_CYFUNCTION_COROUTINE 0x08 #define __Pyx_CyFunction_GetClosure(f)\ (((__pyx_CyFunctionObject *) (f))->func_closure) #if PY_VERSION_HEX < 0x030900B1 || CYTHON_COMPILING_IN_LIMITED_API #define __Pyx_CyFunction_GetClassObj(f)\ (((__pyx_CyFunctionObject *) (f))->func_classobj) #else #define __Pyx_CyFunction_GetClassObj(f)\ ((PyObject*) ((PyCMethodObject *) (f))->mm_class) #endif #define __Pyx_CyFunction_SetClassObj(f, classobj)\ __Pyx__CyFunction_SetClassObj((__pyx_CyFunctionObject *) (f), (classobj)) #define __Pyx_CyFunction_Defaults(type, f)\ ((type *)(((__pyx_CyFunctionObject *) (f))->defaults)) #define __Pyx_CyFunction_SetDefaultsGetter(f, g)\ ((__pyx_CyFunctionObject *) (f))->defaults_getter = (g) typedef struct { #if CYTHON_COMPILING_IN_LIMITED_API PyObject_HEAD PyObject *func; #elif PY_VERSION_HEX < 0x030900B1 PyCFunctionObject func; #else PyCMethodObject func; #endif #if CYTHON_BACKPORT_VECTORCALL __pyx_vectorcallfunc func_vectorcall; #endif #if PY_VERSION_HEX < 0x030500A0 || CYTHON_COMPILING_IN_LIMITED_API PyObject *func_weakreflist; #endif PyObject *func_dict; PyObject *func_name; PyObject *func_qualname; PyObject *func_doc; PyObject *func_globals; PyObject *func_code; PyObject *func_closure; #if PY_VERSION_HEX < 0x030900B1 || CYTHON_COMPILING_IN_LIMITED_API PyObject *func_classobj; #endif void *defaults; int defaults_pyobjects; size_t defaults_size; int flags; PyObject *defaults_tuple; PyObject *defaults_kwdict; PyObject *(*defaults_getter)(PyObject *); PyObject *func_annotations; PyObject *func_is_coroutine; } __pyx_CyFunctionObject; #undef __Pyx_CyOrPyCFunction_Check #define __Pyx_CyFunction_Check(obj) __Pyx_TypeCheck(obj, __pyx_CyFunctionType) #define __Pyx_CyOrPyCFunction_Check(obj) __Pyx_TypeCheck2(obj, __pyx_CyFunctionType, &PyCFunction_Type) #define __Pyx_CyFunction_CheckExact(obj) __Pyx_IS_TYPE(obj, __pyx_CyFunctionType) static CYTHON_INLINE int __Pyx__IsSameCyOrCFunction(PyObject *func, void *cfunc); #undef __Pyx_IsSameCFunction #define __Pyx_IsSameCFunction(func, cfunc) __Pyx__IsSameCyOrCFunction(func, cfunc) static PyObject *__Pyx_CyFunction_Init(__pyx_CyFunctionObject* op, PyMethodDef *ml, int flags, PyObject* qualname, PyObject *closure, PyObject *module, PyObject *globals, PyObject* code); static CYTHON_INLINE void __Pyx__CyFunction_SetClassObj(__pyx_CyFunctionObject* f, PyObject* classobj); static CYTHON_INLINE void *__Pyx_CyFunction_InitDefaults(PyObject *m, size_t size, int pyobjects); static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsTuple(PyObject *m, PyObject *tuple); static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsKwDict(PyObject *m, PyObject *dict); static CYTHON_INLINE void __Pyx_CyFunction_SetAnnotationsDict(PyObject *m, PyObject *dict); static int __pyx_CyFunction_init(PyObject *module); #if CYTHON_METH_FASTCALL static PyObject * __Pyx_CyFunction_Vectorcall_NOARGS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames); static PyObject * __Pyx_CyFunction_Vectorcall_O(PyObject *func, PyObject 
*const *args, size_t nargsf, PyObject *kwnames); static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames); static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames); #if CYTHON_BACKPORT_VECTORCALL #define __Pyx_CyFunction_func_vectorcall(f) (((__pyx_CyFunctionObject*)f)->func_vectorcall) #else #define __Pyx_CyFunction_func_vectorcall(f) (((PyCFunctionObject*)f)->vectorcall) #endif #endif /* CythonFunction.proto */ static PyObject *__Pyx_CyFunction_New(PyMethodDef *ml, int flags, PyObject* qualname, PyObject *closure, PyObject *module, PyObject *globals, PyObject* code); /* CLineInTraceback.proto */ #ifdef CYTHON_CLINE_IN_TRACEBACK #define __Pyx_CLineForTraceback(tstate, c_line) (((CYTHON_CLINE_IN_TRACEBACK)) ? c_line : 0) #else static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line); #endif /* CodeObjectCache.proto */ #if !CYTHON_COMPILING_IN_LIMITED_API typedef struct { PyCodeObject* code_object; int code_line; } __Pyx_CodeObjectCacheEntry; struct __Pyx_CodeObjectCache { int count; int max_count; __Pyx_CodeObjectCacheEntry* entries; }; static struct __Pyx_CodeObjectCache __pyx_code_cache = {0,0,NULL}; static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line); static PyCodeObject *__pyx_find_code_object(int code_line); static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object); #endif /* AddTraceback.proto */ static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename); /* BufferStructDeclare.proto */ typedef struct { Py_ssize_t shape, strides, suboffsets; } __Pyx_Buf_DimInfo; typedef struct { size_t refcount; Py_buffer pybuffer; } __Pyx_Buffer; typedef struct { __Pyx_Buffer *rcbuffer; char *data; __Pyx_Buf_DimInfo diminfo[8]; } __Pyx_LocalBuf_ND; #if PY_MAJOR_VERSION < 3 static int __Pyx_GetBuffer(PyObject *obj, Py_buffer *view, int flags); static void __Pyx_ReleaseBuffer(Py_buffer *view); #else #define __Pyx_GetBuffer PyObject_GetBuffer #define __Pyx_ReleaseBuffer PyBuffer_Release #endif /* GCCDiagnostics.proto */ #if !defined(__INTEL_COMPILER) && defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 6)) #define __Pyx_HAS_GCC_DIAGNOSTIC #endif /* RealImag.proto */ #if CYTHON_CCOMPLEX #ifdef __cplusplus #define __Pyx_CREAL(z) ((z).real()) #define __Pyx_CIMAG(z) ((z).imag()) #else #define __Pyx_CREAL(z) (__real__(z)) #define __Pyx_CIMAG(z) (__imag__(z)) #endif #else #define __Pyx_CREAL(z) ((z).real) #define __Pyx_CIMAG(z) ((z).imag) #endif #if defined(__cplusplus) && CYTHON_CCOMPLEX\ && (defined(_WIN32) || defined(__clang__) || (defined(__GNUC__) && (__GNUC__ >= 5 || __GNUC__ == 4 && __GNUC_MINOR__ >= 4 )) || __cplusplus >= 201103) #define __Pyx_SET_CREAL(z,x) ((z).real(x)) #define __Pyx_SET_CIMAG(z,y) ((z).imag(y)) #else #define __Pyx_SET_CREAL(z,x) __Pyx_CREAL(z) = (x) #define __Pyx_SET_CIMAG(z,y) __Pyx_CIMAG(z) = (y) #endif /* Arithmetic.proto */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #define __Pyx_c_eq_float(a, b) ((a)==(b)) #define __Pyx_c_sum_float(a, b) ((a)+(b)) #define __Pyx_c_diff_float(a, b) ((a)-(b)) #define __Pyx_c_prod_float(a, b) ((a)*(b)) #define __Pyx_c_quot_float(a, b) ((a)/(b)) #define __Pyx_c_neg_float(a) (-(a)) #ifdef __cplusplus #define __Pyx_c_is_zero_float(z) ((z)==(float)0) #define __Pyx_c_conj_float(z) (::std::conj(z)) #if 1 #define __Pyx_c_abs_float(z) 
(::std::abs(z)) #define __Pyx_c_pow_float(a, b) (::std::pow(a, b)) #endif #else #define __Pyx_c_is_zero_float(z) ((z)==0) #define __Pyx_c_conj_float(z) (conjf(z)) #if 1 #define __Pyx_c_abs_float(z) (cabsf(z)) #define __Pyx_c_pow_float(a, b) (cpowf(a, b)) #endif #endif #else static CYTHON_INLINE int __Pyx_c_eq_float(__pyx_t_float_complex, __pyx_t_float_complex); static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_sum_float(__pyx_t_float_complex, __pyx_t_float_complex); static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_diff_float(__pyx_t_float_complex, __pyx_t_float_complex); static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_prod_float(__pyx_t_float_complex, __pyx_t_float_complex); static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_quot_float(__pyx_t_float_complex, __pyx_t_float_complex); static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_neg_float(__pyx_t_float_complex); static CYTHON_INLINE int __Pyx_c_is_zero_float(__pyx_t_float_complex); static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_conj_float(__pyx_t_float_complex); #if 1 static CYTHON_INLINE float __Pyx_c_abs_float(__pyx_t_float_complex); static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_pow_float(__pyx_t_float_complex, __pyx_t_float_complex); #endif #endif /* Arithmetic.proto */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #define __Pyx_c_eq_double(a, b) ((a)==(b)) #define __Pyx_c_sum_double(a, b) ((a)+(b)) #define __Pyx_c_diff_double(a, b) ((a)-(b)) #define __Pyx_c_prod_double(a, b) ((a)*(b)) #define __Pyx_c_quot_double(a, b) ((a)/(b)) #define __Pyx_c_neg_double(a) (-(a)) #ifdef __cplusplus #define __Pyx_c_is_zero_double(z) ((z)==(double)0) #define __Pyx_c_conj_double(z) (::std::conj(z)) #if 1 #define __Pyx_c_abs_double(z) (::std::abs(z)) #define __Pyx_c_pow_double(a, b) (::std::pow(a, b)) #endif #else #define __Pyx_c_is_zero_double(z) ((z)==0) #define __Pyx_c_conj_double(z) (conj(z)) #if 1 #define __Pyx_c_abs_double(z) (cabs(z)) #define __Pyx_c_pow_double(a, b) (cpow(a, b)) #endif #endif #else static CYTHON_INLINE int __Pyx_c_eq_double(__pyx_t_double_complex, __pyx_t_double_complex); static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_sum_double(__pyx_t_double_complex, __pyx_t_double_complex); static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_diff_double(__pyx_t_double_complex, __pyx_t_double_complex); static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_prod_double(__pyx_t_double_complex, __pyx_t_double_complex); static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_quot_double(__pyx_t_double_complex, __pyx_t_double_complex); static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_neg_double(__pyx_t_double_complex); static CYTHON_INLINE int __Pyx_c_is_zero_double(__pyx_t_double_complex); static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_conj_double(__pyx_t_double_complex); #if 1 static CYTHON_INLINE double __Pyx_c_abs_double(__pyx_t_double_complex); static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_pow_double(__pyx_t_double_complex, __pyx_t_double_complex); #endif #endif /* Arithmetic.proto */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #define __Pyx_c_eq_long__double(a, b) ((a)==(b)) #define __Pyx_c_sum_long__double(a, b) ((a)+(b)) #define __Pyx_c_diff_long__double(a, b) ((a)-(b)) #define __Pyx_c_prod_long__double(a, b) ((a)*(b)) #define __Pyx_c_quot_long__double(a, b) ((a)/(b)) #define __Pyx_c_neg_long__double(a) (-(a)) #ifdef __cplusplus #define __Pyx_c_is_zero_long__double(z) ((z)==(long double)0) #define __Pyx_c_conj_long__double(z) (::std::conj(z)) #if 1 #define __Pyx_c_abs_long__double(z) (::std::abs(z)) 
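/* Note: the Arithmetic.proto helpers are emitted once per complex scalar width (float, double, long double). When the compiler has native complex support (CYTHON_CCOMPLEX), each __Pyx_c_* name collapses to a plain operator or to the matching <complex.h> / std::complex routine, as in the macros above and below (e.g. cabsl/cpowl in C, ::std::abs/::std::pow in C++); otherwise the #else branch declares out-of-line fallbacks implementing the same operations on the struct-based __pyx_t_*_complex types. */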
#define __Pyx_c_pow_long__double(a, b) (::std::pow(a, b)) #endif #else #define __Pyx_c_is_zero_long__double(z) ((z)==0) #define __Pyx_c_conj_long__double(z) (conjl(z)) #if 1 #define __Pyx_c_abs_long__double(z) (cabsl(z)) #define __Pyx_c_pow_long__double(a, b) (cpowl(a, b)) #endif #endif #else static CYTHON_INLINE int __Pyx_c_eq_long__double(__pyx_t_long_double_complex, __pyx_t_long_double_complex); static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_sum_long__double(__pyx_t_long_double_complex, __pyx_t_long_double_complex); static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_diff_long__double(__pyx_t_long_double_complex, __pyx_t_long_double_complex); static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_prod_long__double(__pyx_t_long_double_complex, __pyx_t_long_double_complex); static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_quot_long__double(__pyx_t_long_double_complex, __pyx_t_long_double_complex); static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_neg_long__double(__pyx_t_long_double_complex); static CYTHON_INLINE int __Pyx_c_is_zero_long__double(__pyx_t_long_double_complex); static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_conj_long__double(__pyx_t_long_double_complex); #if 1 static CYTHON_INLINE long double __Pyx_c_abs_long__double(__pyx_t_long_double_complex); static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_pow_long__double(__pyx_t_long_double_complex, __pyx_t_long_double_complex); #endif #endif /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_unsigned_int(unsigned int value); /* CIntFromPy.proto */ static CYTHON_INLINE unsigned int __Pyx_PyInt_As_unsigned_int(PyObject *); /* FormatTypeName.proto */ #if CYTHON_COMPILING_IN_LIMITED_API typedef PyObject *__Pyx_TypeName; #define __Pyx_FMT_TYPENAME "%U" static __Pyx_TypeName __Pyx_PyType_GetName(PyTypeObject* tp); #define __Pyx_DECREF_TypeName(obj) Py_XDECREF(obj) #else typedef const char *__Pyx_TypeName; #define __Pyx_FMT_TYPENAME "%.200s" #define __Pyx_PyType_GetName(tp) ((tp)->tp_name) #define __Pyx_DECREF_TypeName(obj) #endif /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value); /* CIntFromPy.proto */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *); /* CIntFromPy.proto */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *); /* FastTypeChecks.proto */ #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_TypeCheck(obj, type) __Pyx_IsSubtype(Py_TYPE(obj), (PyTypeObject *)type) #define __Pyx_TypeCheck2(obj, type1, type2) __Pyx_IsAnySubtype2(Py_TYPE(obj), (PyTypeObject *)type1, (PyTypeObject *)type2) static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b); static CYTHON_INLINE int __Pyx_IsAnySubtype2(PyTypeObject *cls, PyTypeObject *a, PyTypeObject *b); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject *type); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *type1, PyObject *type2); #else #define __Pyx_TypeCheck(obj, type) PyObject_TypeCheck(obj, (PyTypeObject *)type) #define __Pyx_TypeCheck2(obj, type1, type2) (PyObject_TypeCheck(obj, (PyTypeObject *)type1) || PyObject_TypeCheck(obj, (PyTypeObject *)type2)) #define __Pyx_PyErr_GivenExceptionMatches(err, type) PyErr_GivenExceptionMatches(err, type) #define __Pyx_PyErr_GivenExceptionMatches2(err, type1, type2) (PyErr_GivenExceptionMatches(err, type1) || PyErr_GivenExceptionMatches(err, type2)) #endif #define __Pyx_PyErr_ExceptionMatches2(err1, err2) 
__Pyx_PyErr_GivenExceptionMatches2(__Pyx_PyErr_CurrentExceptionType(), err1, err2) #define __Pyx_PyException_Check(obj) __Pyx_TypeCheck(obj, PyExc_Exception) /* CheckBinaryVersion.proto */ static unsigned long __Pyx_get_runtime_version(void); static int __Pyx_check_binary_version(unsigned long ct_version, unsigned long rt_version, int allow_newer); /* InitStrings.proto */ static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); /* #### Code section: module_declarations ### */ static CYTHON_INLINE npy_intp __pyx_f_5numpy_5dtype_8itemsize_itemsize(PyArray_Descr *__pyx_v_self); /* proto*/ static CYTHON_INLINE npy_intp __pyx_f_5numpy_5dtype_9alignment_alignment(PyArray_Descr *__pyx_v_self); /* proto*/ static CYTHON_INLINE PyObject *__pyx_f_5numpy_5dtype_6fields_fields(PyArray_Descr *__pyx_v_self); /* proto*/ static CYTHON_INLINE PyObject *__pyx_f_5numpy_5dtype_5names_names(PyArray_Descr *__pyx_v_self); /* proto*/ static CYTHON_INLINE PyArray_ArrayDescr *__pyx_f_5numpy_5dtype_8subarray_subarray(PyArray_Descr *__pyx_v_self); /* proto*/ static CYTHON_INLINE npy_uint64 __pyx_f_5numpy_5dtype_5flags_flags(PyArray_Descr *__pyx_v_self); /* proto*/ static CYTHON_INLINE int __pyx_f_5numpy_9broadcast_7numiter_numiter(PyArrayMultiIterObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE npy_intp __pyx_f_5numpy_9broadcast_4size_size(PyArrayMultiIterObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE npy_intp __pyx_f_5numpy_9broadcast_5index_index(PyArrayMultiIterObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE int __pyx_f_5numpy_9broadcast_2nd_nd(PyArrayMultiIterObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE npy_intp *__pyx_f_5numpy_9broadcast_10dimensions_dimensions(PyArrayMultiIterObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE void **__pyx_f_5numpy_9broadcast_5iters_iters(PyArrayMultiIterObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE PyObject *__pyx_f_5numpy_7ndarray_4base_base(PyArrayObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE PyArray_Descr *__pyx_f_5numpy_7ndarray_5descr_descr(PyArrayObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE int __pyx_f_5numpy_7ndarray_4ndim_ndim(PyArrayObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE npy_intp *__pyx_f_5numpy_7ndarray_5shape_shape(PyArrayObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE npy_intp *__pyx_f_5numpy_7ndarray_7strides_strides(PyArrayObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE npy_intp __pyx_f_5numpy_7ndarray_4size_size(PyArrayObject *__pyx_v_self); /* proto*/ static CYTHON_INLINE char *__pyx_f_5numpy_7ndarray_4data_data(PyArrayObject *__pyx_v_self); /* proto*/ /* Module declarations from "libc.string" */ /* Module declarations from "libc.stdio" */ /* Module declarations from "__builtin__" */ /* Module declarations from "cpython.type" */ /* Module declarations from "cpython" */ /* Module declarations from "cpython.object" */ /* Module declarations from "cpython.ref" */ /* Module declarations from "numpy" */ /* Module declarations from "numpy" */ /* Module declarations from "cython" */ /* Module declarations from "gammapy.stats.fit_statistics_cython" */ /* #### Code section: typeinfo ### */ static __Pyx_TypeInfo __Pyx_TypeInfo_nn___pyx_t_5numpy_float_t = { "float_t", NULL, sizeof(__pyx_t_5numpy_float_t), { 0 }, 0, 'R', 0, 0 }; /* #### Code section: before_global_var ### */ #define __Pyx_MODULE_NAME "gammapy.stats.fit_statistics_cython" extern int __pyx_module_is_main_gammapy__stats__fit_statistics_cython; int __pyx_module_is_main_gammapy__stats__fit_statistics_cython = 0; /* 
Implementation of "gammapy.stats.fit_statistics_cython" */ /* #### Code section: global_var ### */ static PyObject *__pyx_builtin_range; static PyObject *__pyx_builtin_ImportError; /* #### Code section: string_decls ### */ static const char __pyx_k_i[] = "i"; static const char __pyx_k_x[] = "x"; static const char __pyx_k__3[] = "*"; static const char __pyx_k_ni[] = "ni"; static const char __pyx_k_np[] = "np"; static const char __pyx_k_sn[] = "sn"; static const char __pyx_k__10[] = "?"; static const char __pyx_k_npr[] = "npr"; static const char __pyx_k_sum[] = "sum"; static const char __pyx_k_main[] = "__main__"; static const char __pyx_k_name[] = "__name__"; static const char __pyx_k_spec[] = "__spec__"; static const char __pyx_k_test[] = "__test__"; static const char __pyx_k_b_max[] = "b_max"; static const char __pyx_k_b_min[] = "b_min"; static const char __pyx_k_c_min[] = "c_min"; static const char __pyx_k_model[] = "model"; static const char __pyx_k_npred[] = "npred"; static const char __pyx_k_numpy[] = "numpy"; static const char __pyx_k_range[] = "range"; static const char __pyx_k_trunc[] = "trunc"; static const char __pyx_k_counts[] = "counts"; static const char __pyx_k_import[] = "__import__"; static const char __pyx_k_lognpr[] = "lognpr"; static const char __pyx_k_sn_min[] = "sn_min"; static const char __pyx_k_s_model[] = "s_model"; static const char __pyx_k_logtrunc[] = "logtrunc"; static const char __pyx_k_s_counts[] = "s_counts"; static const char __pyx_k_background[] = "background"; static const char __pyx_k_ImportError[] = "ImportError"; static const char __pyx_k_initializing[] = "_initializing"; static const char __pyx_k_is_coroutine[] = "_is_coroutine"; static const char __pyx_k_sn_min_total[] = "sn_min_total"; static const char __pyx_k_cash_sum_cython[] = "cash_sum_cython"; static const char __pyx_k_TRUNCATION_VALUE[] = "TRUNCATION_VALUE"; static const char __pyx_k_asyncio_coroutines[] = "asyncio.coroutines"; static const char __pyx_k_cline_in_traceback[] = "cline_in_traceback"; static const char __pyx_k_f_cash_root_cython[] = "f_cash_root_cython"; static const char __pyx_k_norm_bounds_cython[] = "norm_bounds_cython"; static const char __pyx_k_gammapy_stats_fit_statistics_cyt[] = "gammapy/stats/fit_statistics_cython.pyx"; static const char __pyx_k_numpy__core_multiarray_failed_to[] = "numpy._core.multiarray failed to import"; static const char __pyx_k_numpy__core_umath_failed_to_impo[] = "numpy._core.umath failed to import"; static const char __pyx_k_gammapy_stats_fit_statistics_cyt_2[] = "gammapy.stats.fit_statistics_cython"; /* #### Code section: decls ### */ static PyObject *__pyx_pf_7gammapy_5stats_21fit_statistics_cython_cash_sum_cython(CYTHON_UNUSED PyObject *__pyx_self, PyArrayObject *__pyx_v_counts, PyArrayObject *__pyx_v_npred); /* proto */ static PyObject *__pyx_pf_7gammapy_5stats_21fit_statistics_cython_2f_cash_root_cython(CYTHON_UNUSED PyObject *__pyx_self, __pyx_t_5numpy_float_t __pyx_v_x, PyArrayObject *__pyx_v_counts, PyArrayObject *__pyx_v_background, PyArrayObject *__pyx_v_model); /* proto */ static PyObject *__pyx_pf_7gammapy_5stats_21fit_statistics_cython_4norm_bounds_cython(CYTHON_UNUSED PyObject *__pyx_self, PyArrayObject *__pyx_v_counts, PyArrayObject *__pyx_v_background, PyArrayObject *__pyx_v_model); /* proto */ /* #### Code section: late_includes ### */ /* #### Code section: module_state ### */ typedef struct { PyObject *__pyx_d; PyObject *__pyx_b; PyObject *__pyx_cython_runtime; PyObject *__pyx_empty_tuple; PyObject *__pyx_empty_bytes; PyObject 
*__pyx_empty_unicode; #ifdef __Pyx_CyFunction_USED PyTypeObject *__pyx_CyFunctionType; #endif #ifdef __Pyx_FusedFunction_USED PyTypeObject *__pyx_FusedFunctionType; #endif #ifdef __Pyx_Generator_USED PyTypeObject *__pyx_GeneratorType; #endif #ifdef __Pyx_IterableCoroutine_USED PyTypeObject *__pyx_IterableCoroutineType; #endif #ifdef __Pyx_Coroutine_USED PyTypeObject *__pyx_CoroutineAwaitType; #endif #ifdef __Pyx_Coroutine_USED PyTypeObject *__pyx_CoroutineType; #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif PyTypeObject *__pyx_ptype_7cpython_4type_type; #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif PyTypeObject *__pyx_ptype_5numpy_dtype; PyTypeObject *__pyx_ptype_5numpy_flatiter; PyTypeObject *__pyx_ptype_5numpy_broadcast; PyTypeObject *__pyx_ptype_5numpy_ndarray; PyTypeObject *__pyx_ptype_5numpy_generic; PyTypeObject *__pyx_ptype_5numpy_number; PyTypeObject *__pyx_ptype_5numpy_integer; PyTypeObject *__pyx_ptype_5numpy_signedinteger; PyTypeObject *__pyx_ptype_5numpy_unsignedinteger; PyTypeObject *__pyx_ptype_5numpy_inexact; PyTypeObject *__pyx_ptype_5numpy_floating; PyTypeObject *__pyx_ptype_5numpy_complexfloating; PyTypeObject *__pyx_ptype_5numpy_flexible; PyTypeObject *__pyx_ptype_5numpy_character; PyTypeObject *__pyx_ptype_5numpy_ufunc; #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif PyObject *__pyx_n_s_ImportError; PyObject *__pyx_n_s_TRUNCATION_VALUE; PyObject *__pyx_n_s__10; PyObject *__pyx_n_s__3; PyObject *__pyx_n_s_asyncio_coroutines; PyObject *__pyx_n_s_b_max; PyObject *__pyx_n_s_b_min; PyObject *__pyx_n_s_background; PyObject *__pyx_n_s_c_min; PyObject *__pyx_n_s_cash_sum_cython; PyObject *__pyx_n_s_cline_in_traceback; PyObject *__pyx_n_s_counts; PyObject *__pyx_n_s_f_cash_root_cython; PyObject *__pyx_kp_s_gammapy_stats_fit_statistics_cyt; PyObject *__pyx_n_s_gammapy_stats_fit_statistics_cyt_2; PyObject *__pyx_n_s_i; PyObject *__pyx_n_s_import; PyObject *__pyx_n_s_initializing; PyObject *__pyx_n_s_is_coroutine; PyObject *__pyx_n_s_lognpr; PyObject *__pyx_n_s_logtrunc; PyObject *__pyx_n_s_main; PyObject *__pyx_n_s_model; PyObject *__pyx_n_s_name; PyObject *__pyx_n_s_ni; PyObject *__pyx_n_s_norm_bounds_cython; PyObject *__pyx_n_s_np; PyObject *__pyx_n_s_npr; PyObject *__pyx_n_s_npred; PyObject *__pyx_n_s_numpy; PyObject *__pyx_kp_u_numpy__core_multiarray_failed_to; PyObject *__pyx_kp_u_numpy__core_umath_failed_to_impo; PyObject *__pyx_n_s_range; PyObject *__pyx_n_s_s_counts; PyObject *__pyx_n_s_s_model; PyObject *__pyx_n_s_sn; PyObject *__pyx_n_s_sn_min; PyObject *__pyx_n_s_sn_min_total; PyObject *__pyx_n_s_spec; PyObject *__pyx_n_s_sum; PyObject *__pyx_n_s_test; PyObject *__pyx_n_s_trunc; PyObject *__pyx_n_s_x; PyObject *__pyx_float_1eneg_25; PyObject *__pyx_tuple_; PyObject *__pyx_tuple__2; PyObject *__pyx_tuple__4; PyObject *__pyx_tuple__6; PyObject *__pyx_tuple__8; PyObject *__pyx_codeobj__5; PyObject *__pyx_codeobj__7; PyObject *__pyx_codeobj__9; } __pyx_mstate; #if CYTHON_USE_MODULE_STATE #ifdef __cplusplus namespace { extern struct PyModuleDef __pyx_moduledef; } /* anonymous namespace */ #else static struct PyModuleDef __pyx_moduledef; #endif #define __pyx_mstate(o) ((__pyx_mstate *)__Pyx_PyModule_GetState(o)) #define __pyx_mstate_global (__pyx_mstate(PyState_FindModule(&__pyx_moduledef))) #define __pyx_m 
(PyState_FindModule(&__pyx_moduledef)) #else static __pyx_mstate __pyx_mstate_global_static = #ifdef __cplusplus {}; #else {0}; #endif static __pyx_mstate *__pyx_mstate_global = &__pyx_mstate_global_static; #endif /* #### Code section: module_state_clear ### */ #if CYTHON_USE_MODULE_STATE static int __pyx_m_clear(PyObject *m) { __pyx_mstate *clear_module_state = __pyx_mstate(m); if (!clear_module_state) return 0; Py_CLEAR(clear_module_state->__pyx_d); Py_CLEAR(clear_module_state->__pyx_b); Py_CLEAR(clear_module_state->__pyx_cython_runtime); Py_CLEAR(clear_module_state->__pyx_empty_tuple); Py_CLEAR(clear_module_state->__pyx_empty_bytes); Py_CLEAR(clear_module_state->__pyx_empty_unicode); #ifdef __Pyx_CyFunction_USED Py_CLEAR(clear_module_state->__pyx_CyFunctionType); #endif #ifdef __Pyx_FusedFunction_USED Py_CLEAR(clear_module_state->__pyx_FusedFunctionType); #endif Py_CLEAR(clear_module_state->__pyx_ptype_7cpython_4type_type); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_dtype); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_flatiter); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_broadcast); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_ndarray); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_generic); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_number); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_integer); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_signedinteger); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_unsignedinteger); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_inexact); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_floating); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_complexfloating); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_flexible); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_character); Py_CLEAR(clear_module_state->__pyx_ptype_5numpy_ufunc); Py_CLEAR(clear_module_state->__pyx_n_s_ImportError); Py_CLEAR(clear_module_state->__pyx_n_s_TRUNCATION_VALUE); Py_CLEAR(clear_module_state->__pyx_n_s__10); Py_CLEAR(clear_module_state->__pyx_n_s__3); Py_CLEAR(clear_module_state->__pyx_n_s_asyncio_coroutines); Py_CLEAR(clear_module_state->__pyx_n_s_b_max); Py_CLEAR(clear_module_state->__pyx_n_s_b_min); Py_CLEAR(clear_module_state->__pyx_n_s_background); Py_CLEAR(clear_module_state->__pyx_n_s_c_min); Py_CLEAR(clear_module_state->__pyx_n_s_cash_sum_cython); Py_CLEAR(clear_module_state->__pyx_n_s_cline_in_traceback); Py_CLEAR(clear_module_state->__pyx_n_s_counts); Py_CLEAR(clear_module_state->__pyx_n_s_f_cash_root_cython); Py_CLEAR(clear_module_state->__pyx_kp_s_gammapy_stats_fit_statistics_cyt); Py_CLEAR(clear_module_state->__pyx_n_s_gammapy_stats_fit_statistics_cyt_2); Py_CLEAR(clear_module_state->__pyx_n_s_i); Py_CLEAR(clear_module_state->__pyx_n_s_import); Py_CLEAR(clear_module_state->__pyx_n_s_initializing); Py_CLEAR(clear_module_state->__pyx_n_s_is_coroutine); Py_CLEAR(clear_module_state->__pyx_n_s_lognpr); Py_CLEAR(clear_module_state->__pyx_n_s_logtrunc); Py_CLEAR(clear_module_state->__pyx_n_s_main); Py_CLEAR(clear_module_state->__pyx_n_s_model); Py_CLEAR(clear_module_state->__pyx_n_s_name); Py_CLEAR(clear_module_state->__pyx_n_s_ni); Py_CLEAR(clear_module_state->__pyx_n_s_norm_bounds_cython); Py_CLEAR(clear_module_state->__pyx_n_s_np); Py_CLEAR(clear_module_state->__pyx_n_s_npr); Py_CLEAR(clear_module_state->__pyx_n_s_npred); Py_CLEAR(clear_module_state->__pyx_n_s_numpy); Py_CLEAR(clear_module_state->__pyx_kp_u_numpy__core_multiarray_failed_to); Py_CLEAR(clear_module_state->__pyx_kp_u_numpy__core_umath_failed_to_impo); 
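/* Note: __pyx_m_clear serves as the module's m_clear hook when CYTHON_USE_MODULE_STATE is enabled. Every owned reference held in the per-module state (interned strings, cached tuples and code objects, imported type pointers) is dropped with Py_CLEAR, which also NULLs the field so that clearing is safe to run more than once. */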
Py_CLEAR(clear_module_state->__pyx_n_s_range); Py_CLEAR(clear_module_state->__pyx_n_s_s_counts); Py_CLEAR(clear_module_state->__pyx_n_s_s_model); Py_CLEAR(clear_module_state->__pyx_n_s_sn); Py_CLEAR(clear_module_state->__pyx_n_s_sn_min); Py_CLEAR(clear_module_state->__pyx_n_s_sn_min_total); Py_CLEAR(clear_module_state->__pyx_n_s_spec); Py_CLEAR(clear_module_state->__pyx_n_s_sum); Py_CLEAR(clear_module_state->__pyx_n_s_test); Py_CLEAR(clear_module_state->__pyx_n_s_trunc); Py_CLEAR(clear_module_state->__pyx_n_s_x); Py_CLEAR(clear_module_state->__pyx_float_1eneg_25); Py_CLEAR(clear_module_state->__pyx_tuple_); Py_CLEAR(clear_module_state->__pyx_tuple__2); Py_CLEAR(clear_module_state->__pyx_tuple__4); Py_CLEAR(clear_module_state->__pyx_tuple__6); Py_CLEAR(clear_module_state->__pyx_tuple__8); Py_CLEAR(clear_module_state->__pyx_codeobj__5); Py_CLEAR(clear_module_state->__pyx_codeobj__7); Py_CLEAR(clear_module_state->__pyx_codeobj__9); return 0; } #endif /* #### Code section: module_state_traverse ### */ #if CYTHON_USE_MODULE_STATE static int __pyx_m_traverse(PyObject *m, visitproc visit, void *arg) { __pyx_mstate *traverse_module_state = __pyx_mstate(m); if (!traverse_module_state) return 0; Py_VISIT(traverse_module_state->__pyx_d); Py_VISIT(traverse_module_state->__pyx_b); Py_VISIT(traverse_module_state->__pyx_cython_runtime); Py_VISIT(traverse_module_state->__pyx_empty_tuple); Py_VISIT(traverse_module_state->__pyx_empty_bytes); Py_VISIT(traverse_module_state->__pyx_empty_unicode); #ifdef __Pyx_CyFunction_USED Py_VISIT(traverse_module_state->__pyx_CyFunctionType); #endif #ifdef __Pyx_FusedFunction_USED Py_VISIT(traverse_module_state->__pyx_FusedFunctionType); #endif Py_VISIT(traverse_module_state->__pyx_ptype_7cpython_4type_type); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_dtype); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_flatiter); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_broadcast); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_ndarray); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_generic); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_number); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_integer); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_signedinteger); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_unsignedinteger); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_inexact); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_floating); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_complexfloating); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_flexible); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_character); Py_VISIT(traverse_module_state->__pyx_ptype_5numpy_ufunc); Py_VISIT(traverse_module_state->__pyx_n_s_ImportError); Py_VISIT(traverse_module_state->__pyx_n_s_TRUNCATION_VALUE); Py_VISIT(traverse_module_state->__pyx_n_s__10); Py_VISIT(traverse_module_state->__pyx_n_s__3); Py_VISIT(traverse_module_state->__pyx_n_s_asyncio_coroutines); Py_VISIT(traverse_module_state->__pyx_n_s_b_max); Py_VISIT(traverse_module_state->__pyx_n_s_b_min); Py_VISIT(traverse_module_state->__pyx_n_s_background); Py_VISIT(traverse_module_state->__pyx_n_s_c_min); Py_VISIT(traverse_module_state->__pyx_n_s_cash_sum_cython); Py_VISIT(traverse_module_state->__pyx_n_s_cline_in_traceback); Py_VISIT(traverse_module_state->__pyx_n_s_counts); Py_VISIT(traverse_module_state->__pyx_n_s_f_cash_root_cython); Py_VISIT(traverse_module_state->__pyx_kp_s_gammapy_stats_fit_statistics_cyt); 
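/* Note: the companion m_traverse hook. Py_VISIT reports each strongly held reference to CPython's cyclic garbage collector so that reference cycles passing through the module state can be detected; the list mirrors the Py_CLEAR list in __pyx_m_clear entry for entry. */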
Py_VISIT(traverse_module_state->__pyx_n_s_gammapy_stats_fit_statistics_cyt_2); Py_VISIT(traverse_module_state->__pyx_n_s_i); Py_VISIT(traverse_module_state->__pyx_n_s_import); Py_VISIT(traverse_module_state->__pyx_n_s_initializing); Py_VISIT(traverse_module_state->__pyx_n_s_is_coroutine); Py_VISIT(traverse_module_state->__pyx_n_s_lognpr); Py_VISIT(traverse_module_state->__pyx_n_s_logtrunc); Py_VISIT(traverse_module_state->__pyx_n_s_main); Py_VISIT(traverse_module_state->__pyx_n_s_model); Py_VISIT(traverse_module_state->__pyx_n_s_name); Py_VISIT(traverse_module_state->__pyx_n_s_ni); Py_VISIT(traverse_module_state->__pyx_n_s_norm_bounds_cython); Py_VISIT(traverse_module_state->__pyx_n_s_np); Py_VISIT(traverse_module_state->__pyx_n_s_npr); Py_VISIT(traverse_module_state->__pyx_n_s_npred); Py_VISIT(traverse_module_state->__pyx_n_s_numpy); Py_VISIT(traverse_module_state->__pyx_kp_u_numpy__core_multiarray_failed_to); Py_VISIT(traverse_module_state->__pyx_kp_u_numpy__core_umath_failed_to_impo); Py_VISIT(traverse_module_state->__pyx_n_s_range); Py_VISIT(traverse_module_state->__pyx_n_s_s_counts); Py_VISIT(traverse_module_state->__pyx_n_s_s_model); Py_VISIT(traverse_module_state->__pyx_n_s_sn); Py_VISIT(traverse_module_state->__pyx_n_s_sn_min); Py_VISIT(traverse_module_state->__pyx_n_s_sn_min_total); Py_VISIT(traverse_module_state->__pyx_n_s_spec); Py_VISIT(traverse_module_state->__pyx_n_s_sum); Py_VISIT(traverse_module_state->__pyx_n_s_test); Py_VISIT(traverse_module_state->__pyx_n_s_trunc); Py_VISIT(traverse_module_state->__pyx_n_s_x); Py_VISIT(traverse_module_state->__pyx_float_1eneg_25); Py_VISIT(traverse_module_state->__pyx_tuple_); Py_VISIT(traverse_module_state->__pyx_tuple__2); Py_VISIT(traverse_module_state->__pyx_tuple__4); Py_VISIT(traverse_module_state->__pyx_tuple__6); Py_VISIT(traverse_module_state->__pyx_tuple__8); Py_VISIT(traverse_module_state->__pyx_codeobj__5); Py_VISIT(traverse_module_state->__pyx_codeobj__7); Py_VISIT(traverse_module_state->__pyx_codeobj__9); return 0; } #endif /* #### Code section: module_state_defines ### */ #define __pyx_d __pyx_mstate_global->__pyx_d #define __pyx_b __pyx_mstate_global->__pyx_b #define __pyx_cython_runtime __pyx_mstate_global->__pyx_cython_runtime #define __pyx_empty_tuple __pyx_mstate_global->__pyx_empty_tuple #define __pyx_empty_bytes __pyx_mstate_global->__pyx_empty_bytes #define __pyx_empty_unicode __pyx_mstate_global->__pyx_empty_unicode #ifdef __Pyx_CyFunction_USED #define __pyx_CyFunctionType __pyx_mstate_global->__pyx_CyFunctionType #endif #ifdef __Pyx_FusedFunction_USED #define __pyx_FusedFunctionType __pyx_mstate_global->__pyx_FusedFunctionType #endif #ifdef __Pyx_Generator_USED #define __pyx_GeneratorType __pyx_mstate_global->__pyx_GeneratorType #endif #ifdef __Pyx_IterableCoroutine_USED #define __pyx_IterableCoroutineType __pyx_mstate_global->__pyx_IterableCoroutineType #endif #ifdef __Pyx_Coroutine_USED #define __pyx_CoroutineAwaitType __pyx_mstate_global->__pyx_CoroutineAwaitType #endif #ifdef __Pyx_Coroutine_USED #define __pyx_CoroutineType __pyx_mstate_global->__pyx_CoroutineType #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #define __pyx_ptype_7cpython_4type_type __pyx_mstate_global->__pyx_ptype_7cpython_4type_type #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #define __pyx_ptype_5numpy_dtype 
__pyx_mstate_global->__pyx_ptype_5numpy_dtype #define __pyx_ptype_5numpy_flatiter __pyx_mstate_global->__pyx_ptype_5numpy_flatiter #define __pyx_ptype_5numpy_broadcast __pyx_mstate_global->__pyx_ptype_5numpy_broadcast #define __pyx_ptype_5numpy_ndarray __pyx_mstate_global->__pyx_ptype_5numpy_ndarray #define __pyx_ptype_5numpy_generic __pyx_mstate_global->__pyx_ptype_5numpy_generic #define __pyx_ptype_5numpy_number __pyx_mstate_global->__pyx_ptype_5numpy_number #define __pyx_ptype_5numpy_integer __pyx_mstate_global->__pyx_ptype_5numpy_integer #define __pyx_ptype_5numpy_signedinteger __pyx_mstate_global->__pyx_ptype_5numpy_signedinteger #define __pyx_ptype_5numpy_unsignedinteger __pyx_mstate_global->__pyx_ptype_5numpy_unsignedinteger #define __pyx_ptype_5numpy_inexact __pyx_mstate_global->__pyx_ptype_5numpy_inexact #define __pyx_ptype_5numpy_floating __pyx_mstate_global->__pyx_ptype_5numpy_floating #define __pyx_ptype_5numpy_complexfloating __pyx_mstate_global->__pyx_ptype_5numpy_complexfloating #define __pyx_ptype_5numpy_flexible __pyx_mstate_global->__pyx_ptype_5numpy_flexible #define __pyx_ptype_5numpy_character __pyx_mstate_global->__pyx_ptype_5numpy_character #define __pyx_ptype_5numpy_ufunc __pyx_mstate_global->__pyx_ptype_5numpy_ufunc #if CYTHON_USE_MODULE_STATE #endif #if CYTHON_USE_MODULE_STATE #endif #define __pyx_n_s_ImportError __pyx_mstate_global->__pyx_n_s_ImportError #define __pyx_n_s_TRUNCATION_VALUE __pyx_mstate_global->__pyx_n_s_TRUNCATION_VALUE #define __pyx_n_s__10 __pyx_mstate_global->__pyx_n_s__10 #define __pyx_n_s__3 __pyx_mstate_global->__pyx_n_s__3 #define __pyx_n_s_asyncio_coroutines __pyx_mstate_global->__pyx_n_s_asyncio_coroutines #define __pyx_n_s_b_max __pyx_mstate_global->__pyx_n_s_b_max #define __pyx_n_s_b_min __pyx_mstate_global->__pyx_n_s_b_min #define __pyx_n_s_background __pyx_mstate_global->__pyx_n_s_background #define __pyx_n_s_c_min __pyx_mstate_global->__pyx_n_s_c_min #define __pyx_n_s_cash_sum_cython __pyx_mstate_global->__pyx_n_s_cash_sum_cython #define __pyx_n_s_cline_in_traceback __pyx_mstate_global->__pyx_n_s_cline_in_traceback #define __pyx_n_s_counts __pyx_mstate_global->__pyx_n_s_counts #define __pyx_n_s_f_cash_root_cython __pyx_mstate_global->__pyx_n_s_f_cash_root_cython #define __pyx_kp_s_gammapy_stats_fit_statistics_cyt __pyx_mstate_global->__pyx_kp_s_gammapy_stats_fit_statistics_cyt #define __pyx_n_s_gammapy_stats_fit_statistics_cyt_2 __pyx_mstate_global->__pyx_n_s_gammapy_stats_fit_statistics_cyt_2 #define __pyx_n_s_i __pyx_mstate_global->__pyx_n_s_i #define __pyx_n_s_import __pyx_mstate_global->__pyx_n_s_import #define __pyx_n_s_initializing __pyx_mstate_global->__pyx_n_s_initializing #define __pyx_n_s_is_coroutine __pyx_mstate_global->__pyx_n_s_is_coroutine #define __pyx_n_s_lognpr __pyx_mstate_global->__pyx_n_s_lognpr #define __pyx_n_s_logtrunc __pyx_mstate_global->__pyx_n_s_logtrunc #define __pyx_n_s_main __pyx_mstate_global->__pyx_n_s_main #define __pyx_n_s_model __pyx_mstate_global->__pyx_n_s_model #define __pyx_n_s_name __pyx_mstate_global->__pyx_n_s_name #define __pyx_n_s_ni __pyx_mstate_global->__pyx_n_s_ni #define __pyx_n_s_norm_bounds_cython __pyx_mstate_global->__pyx_n_s_norm_bounds_cython #define __pyx_n_s_np __pyx_mstate_global->__pyx_n_s_np #define __pyx_n_s_npr __pyx_mstate_global->__pyx_n_s_npr #define __pyx_n_s_npred __pyx_mstate_global->__pyx_n_s_npred #define __pyx_n_s_numpy __pyx_mstate_global->__pyx_n_s_numpy #define __pyx_kp_u_numpy__core_multiarray_failed_to 
__pyx_mstate_global->__pyx_kp_u_numpy__core_multiarray_failed_to #define __pyx_kp_u_numpy__core_umath_failed_to_impo __pyx_mstate_global->__pyx_kp_u_numpy__core_umath_failed_to_impo #define __pyx_n_s_range __pyx_mstate_global->__pyx_n_s_range #define __pyx_n_s_s_counts __pyx_mstate_global->__pyx_n_s_s_counts #define __pyx_n_s_s_model __pyx_mstate_global->__pyx_n_s_s_model #define __pyx_n_s_sn __pyx_mstate_global->__pyx_n_s_sn #define __pyx_n_s_sn_min __pyx_mstate_global->__pyx_n_s_sn_min #define __pyx_n_s_sn_min_total __pyx_mstate_global->__pyx_n_s_sn_min_total #define __pyx_n_s_spec __pyx_mstate_global->__pyx_n_s_spec #define __pyx_n_s_sum __pyx_mstate_global->__pyx_n_s_sum #define __pyx_n_s_test __pyx_mstate_global->__pyx_n_s_test #define __pyx_n_s_trunc __pyx_mstate_global->__pyx_n_s_trunc #define __pyx_n_s_x __pyx_mstate_global->__pyx_n_s_x #define __pyx_float_1eneg_25 __pyx_mstate_global->__pyx_float_1eneg_25 #define __pyx_tuple_ __pyx_mstate_global->__pyx_tuple_ #define __pyx_tuple__2 __pyx_mstate_global->__pyx_tuple__2 #define __pyx_tuple__4 __pyx_mstate_global->__pyx_tuple__4 #define __pyx_tuple__6 __pyx_mstate_global->__pyx_tuple__6 #define __pyx_tuple__8 __pyx_mstate_global->__pyx_tuple__8 #define __pyx_codeobj__5 __pyx_mstate_global->__pyx_codeobj__5 #define __pyx_codeobj__7 __pyx_mstate_global->__pyx_codeobj__7 #define __pyx_codeobj__9 __pyx_mstate_global->__pyx_codeobj__9 /* #### Code section: module_code ### */ /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":286 * * @property * cdef inline npy_intp itemsize(self) noexcept nogil: # <<<<<<<<<<<<<< * return PyDataType_ELSIZE(self) * */ static CYTHON_INLINE npy_intp __pyx_f_5numpy_5dtype_8itemsize_itemsize(PyArray_Descr *__pyx_v_self) { npy_intp __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":287 * @property * cdef inline npy_intp itemsize(self) noexcept nogil: * return PyDataType_ELSIZE(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyDataType_ELSIZE(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":286 * * @property * cdef inline npy_intp itemsize(self) noexcept nogil: # <<<<<<<<<<<<<< * return PyDataType_ELSIZE(self) * */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":290 * * @property * cdef inline npy_intp alignment(self) noexcept nogil: # <<<<<<<<<<<<<< * return PyDataType_ALIGNMENT(self) * */ static CYTHON_INLINE npy_intp __pyx_f_5numpy_5dtype_9alignment_alignment(PyArray_Descr *__pyx_v_self) { npy_intp __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":291 * @property * cdef inline npy_intp alignment(self) noexcept nogil: * return PyDataType_ALIGNMENT(self) # <<<<<<<<<<<<<< * * # Use fields/names with care as they may be NULL. You must check */ __pyx_r = PyDataType_ALIGNMENT(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":290 * * @property * cdef inline npy_intp alignment(self) noexcept nogil: # <<<<<<<<<<<<<< * return PyDataType_ALIGNMENT(self) * */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":296 * # for this using PyDataType_HASFIELDS. 
* @property * cdef inline object fields(self): # <<<<<<<<<<<<<< * return PyDataType_FIELDS(self) * */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_5dtype_6fields_fields(PyArray_Descr *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1; __Pyx_RefNannySetupContext("fields", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":297 * @property * cdef inline object fields(self): * return PyDataType_FIELDS(self) # <<<<<<<<<<<<<< * * @property */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PyDataType_FIELDS(__pyx_v_self); __Pyx_INCREF(((PyObject *)__pyx_t_1)); __pyx_r = ((PyObject *)__pyx_t_1); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":296 * # for this using PyDataType_HASFIELDS. * @property * cdef inline object fields(self): # <<<<<<<<<<<<<< * return PyDataType_FIELDS(self) * */ /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":300 * * @property * cdef inline tuple names(self): # <<<<<<<<<<<<<< * return PyDataType_NAMES(self) * */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_5dtype_5names_names(PyArray_Descr *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1; __Pyx_RefNannySetupContext("names", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":301 * @property * cdef inline tuple names(self): * return PyDataType_NAMES(self) # <<<<<<<<<<<<<< * * # Use PyDataType_HASSUBARRAY to test whether this field is */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PyDataType_NAMES(__pyx_v_self); __Pyx_INCREF(((PyObject*)__pyx_t_1)); __pyx_r = ((PyObject*)__pyx_t_1); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":300 * * @property * cdef inline tuple names(self): # <<<<<<<<<<<<<< * return PyDataType_NAMES(self) * */ /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":307 * # this field via the inline helper method PyDataType_SHAPE. * @property * cdef inline PyArray_ArrayDescr* subarray(self) noexcept nogil: # <<<<<<<<<<<<<< * return PyDataType_SUBARRAY(self) * */ static CYTHON_INLINE PyArray_ArrayDescr *__pyx_f_5numpy_5dtype_8subarray_subarray(PyArray_Descr *__pyx_v_self) { PyArray_ArrayDescr *__pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":308 * @property * cdef inline PyArray_ArrayDescr* subarray(self) noexcept nogil: * return PyDataType_SUBARRAY(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyDataType_SUBARRAY(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":307 * # this field via the inline helper method PyDataType_SHAPE. 
* @property * cdef inline PyArray_ArrayDescr* subarray(self) noexcept nogil: # <<<<<<<<<<<<<< * return PyDataType_SUBARRAY(self) * */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":311 * * @property * cdef inline npy_uint64 flags(self) noexcept nogil: # <<<<<<<<<<<<<< * """The data types flags.""" * return PyDataType_FLAGS(self) */ static CYTHON_INLINE npy_uint64 __pyx_f_5numpy_5dtype_5flags_flags(PyArray_Descr *__pyx_v_self) { npy_uint64 __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":313 * cdef inline npy_uint64 flags(self) noexcept nogil: * """The data types flags.""" * return PyDataType_FLAGS(self) # <<<<<<<<<<<<<< * * */ __pyx_r = PyDataType_FLAGS(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":311 * * @property * cdef inline npy_uint64 flags(self) noexcept nogil: # <<<<<<<<<<<<<< * """The data types flags.""" * return PyDataType_FLAGS(self) */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":323 * * @property * cdef inline int numiter(self) noexcept nogil: # <<<<<<<<<<<<<< * """The number of arrays that need to be broadcast to the same shape.""" * return PyArray_MultiIter_NUMITER(self) */ static CYTHON_INLINE int __pyx_f_5numpy_9broadcast_7numiter_numiter(PyArrayMultiIterObject *__pyx_v_self) { int __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":325 * cdef inline int numiter(self) noexcept nogil: * """The number of arrays that need to be broadcast to the same shape.""" * return PyArray_MultiIter_NUMITER(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_MultiIter_NUMITER(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":323 * * @property * cdef inline int numiter(self) noexcept nogil: # <<<<<<<<<<<<<< * """The number of arrays that need to be broadcast to the same shape.""" * return PyArray_MultiIter_NUMITER(self) */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":328 * * @property * cdef inline npy_intp size(self) noexcept nogil: # <<<<<<<<<<<<<< * """The total broadcasted size.""" * return PyArray_MultiIter_SIZE(self) */ static CYTHON_INLINE npy_intp __pyx_f_5numpy_9broadcast_4size_size(PyArrayMultiIterObject *__pyx_v_self) { npy_intp __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":330 * cdef inline npy_intp size(self) noexcept nogil: * """The total broadcasted size.""" * return PyArray_MultiIter_SIZE(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_MultiIter_SIZE(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":328 * * @property * cdef inline npy_intp size(self) noexcept nogil: # <<<<<<<<<<<<<< * """The total broadcasted size.""" * return PyArray_MultiIter_SIZE(self) */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":333 * * @property * cdef inline npy_intp index(self) noexcept nogil: # <<<<<<<<<<<<<< * """The current (1-d) index 
into the broadcasted result.""" * return PyArray_MultiIter_INDEX(self) */ static CYTHON_INLINE npy_intp __pyx_f_5numpy_9broadcast_5index_index(PyArrayMultiIterObject *__pyx_v_self) { npy_intp __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":335 * cdef inline npy_intp index(self) noexcept nogil: * """The current (1-d) index into the broadcasted result.""" * return PyArray_MultiIter_INDEX(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_MultiIter_INDEX(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":333 * * @property * cdef inline npy_intp index(self) noexcept nogil: # <<<<<<<<<<<<<< * """The current (1-d) index into the broadcasted result.""" * return PyArray_MultiIter_INDEX(self) */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":338 * * @property * cdef inline int nd(self) noexcept nogil: # <<<<<<<<<<<<<< * """The number of dimensions in the broadcasted result.""" * return PyArray_MultiIter_NDIM(self) */ static CYTHON_INLINE int __pyx_f_5numpy_9broadcast_2nd_nd(PyArrayMultiIterObject *__pyx_v_self) { int __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":340 * cdef inline int nd(self) noexcept nogil: * """The number of dimensions in the broadcasted result.""" * return PyArray_MultiIter_NDIM(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_MultiIter_NDIM(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":338 * * @property * cdef inline int nd(self) noexcept nogil: # <<<<<<<<<<<<<< * """The number of dimensions in the broadcasted result.""" * return PyArray_MultiIter_NDIM(self) */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":343 * * @property * cdef inline npy_intp* dimensions(self) noexcept nogil: # <<<<<<<<<<<<<< * """The shape of the broadcasted result.""" * return PyArray_MultiIter_DIMS(self) */ static CYTHON_INLINE npy_intp *__pyx_f_5numpy_9broadcast_10dimensions_dimensions(PyArrayMultiIterObject *__pyx_v_self) { npy_intp *__pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":345 * cdef inline npy_intp* dimensions(self) noexcept nogil: * """The shape of the broadcasted result.""" * return PyArray_MultiIter_DIMS(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_MultiIter_DIMS(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":343 * * @property * cdef inline npy_intp* dimensions(self) noexcept nogil: # <<<<<<<<<<<<<< * """The shape of the broadcasted result.""" * return PyArray_MultiIter_DIMS(self) */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":348 * * @property * cdef inline void** iters(self) noexcept nogil: # <<<<<<<<<<<<<< * """An array of iterator objects that holds the iterators for the arrays to be broadcast together. 
* On return, the iterators are adjusted for broadcasting.""" */ static CYTHON_INLINE void **__pyx_f_5numpy_9broadcast_5iters_iters(PyArrayMultiIterObject *__pyx_v_self) { void **__pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":351 * """An array of iterator objects that holds the iterators for the arrays to be broadcast together. * On return, the iterators are adjusted for broadcasting.""" * return PyArray_MultiIter_ITERS(self) # <<<<<<<<<<<<<< * * */ __pyx_r = PyArray_MultiIter_ITERS(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":348 * * @property * cdef inline void** iters(self) noexcept nogil: # <<<<<<<<<<<<<< * """An array of iterator objects that holds the iterators for the arrays to be broadcast together. * On return, the iterators are adjusted for broadcasting.""" */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":366 * * @property * cdef inline PyObject* base(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns a borrowed reference to the object owning the data/memory. * """ */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_7ndarray_4base_base(PyArrayObject *__pyx_v_self) { PyObject *__pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":369 * """Returns a borrowed reference to the object owning the data/memory. * """ * return PyArray_BASE(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_BASE(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":366 * * @property * cdef inline PyObject* base(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns a borrowed reference to the object owning the data/memory. * """ */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":372 * * @property * cdef inline dtype descr(self): # <<<<<<<<<<<<<< * """Returns an owned reference to the dtype of the array. * """ */ static CYTHON_INLINE PyArray_Descr *__pyx_f_5numpy_7ndarray_5descr_descr(PyArrayObject *__pyx_v_self) { PyArray_Descr *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyArray_Descr *__pyx_t_1; __Pyx_RefNannySetupContext("descr", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":375 * """Returns an owned reference to the dtype of the array. * """ * return PyArray_DESCR(self) # <<<<<<<<<<<<<< * * @property */ __Pyx_XDECREF((PyObject *)__pyx_r); __pyx_t_1 = PyArray_DESCR(__pyx_v_self); __Pyx_INCREF((PyObject *)((PyArray_Descr *)__pyx_t_1)); __pyx_r = ((PyArray_Descr *)__pyx_t_1); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":372 * * @property * cdef inline dtype descr(self): # <<<<<<<<<<<<<< * """Returns an owned reference to the dtype of the array. * """ */ /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF((PyObject *)__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":378 * * @property * cdef inline int ndim(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns the number of dimensions in the array. 
* """ */ static CYTHON_INLINE int __pyx_f_5numpy_7ndarray_4ndim_ndim(PyArrayObject *__pyx_v_self) { int __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":381 * """Returns the number of dimensions in the array. * """ * return PyArray_NDIM(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_NDIM(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":378 * * @property * cdef inline int ndim(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns the number of dimensions in the array. * """ */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":384 * * @property * cdef inline npy_intp *shape(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns a pointer to the dimensions/shape of the array. * The number of elements matches the number of dimensions of the array (ndim). */ static CYTHON_INLINE npy_intp *__pyx_f_5numpy_7ndarray_5shape_shape(PyArrayObject *__pyx_v_self) { npy_intp *__pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":389 * Can return NULL for 0-dimensional arrays. * """ * return PyArray_DIMS(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_DIMS(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":384 * * @property * cdef inline npy_intp *shape(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns a pointer to the dimensions/shape of the array. * The number of elements matches the number of dimensions of the array (ndim). */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":392 * * @property * cdef inline npy_intp *strides(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns a pointer to the strides of the array. * The number of elements matches the number of dimensions of the array (ndim). */ static CYTHON_INLINE npy_intp *__pyx_f_5numpy_7ndarray_7strides_strides(PyArrayObject *__pyx_v_self) { npy_intp *__pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":396 * The number of elements matches the number of dimensions of the array (ndim). * """ * return PyArray_STRIDES(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_STRIDES(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":392 * * @property * cdef inline npy_intp *strides(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns a pointer to the strides of the array. * The number of elements matches the number of dimensions of the array (ndim). */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":399 * * @property * cdef inline npy_intp size(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns the total size (in number of elements) of the array. * """ */ static CYTHON_INLINE npy_intp __pyx_f_5numpy_7ndarray_4size_size(PyArrayObject *__pyx_v_self) { npy_intp __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":402 * """Returns the total size (in number of elements) of the array. 
* """ * return PyArray_SIZE(self) # <<<<<<<<<<<<<< * * @property */ __pyx_r = PyArray_SIZE(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":399 * * @property * cdef inline npy_intp size(self) noexcept nogil: # <<<<<<<<<<<<<< * """Returns the total size (in number of elements) of the array. * """ */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":405 * * @property * cdef inline char* data(self) noexcept nogil: # <<<<<<<<<<<<<< * """The pointer to the data buffer as a char*. * This is provided for legacy reasons to avoid direct struct field access. */ static CYTHON_INLINE char *__pyx_f_5numpy_7ndarray_4data_data(PyArrayObject *__pyx_v_self) { char *__pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":411 * of `PyArray_DATA()` instead, which returns a 'void*'. * """ * return PyArray_BYTES(self) # <<<<<<<<<<<<<< * * */ __pyx_r = PyArray_BYTES(__pyx_v_self); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":405 * * @property * cdef inline char* data(self) noexcept nogil: # <<<<<<<<<<<<<< * """The pointer to the data buffer as a char*. * This is provided for legacy reasons to avoid direct struct field access. */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":807 * ctypedef long double complex clongdouble_t * * cdef inline object PyArray_MultiIterNew1(a): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(1, a) * */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_PyArray_MultiIterNew1(PyObject *__pyx_v_a) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("PyArray_MultiIterNew1", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":808 * * cdef inline object PyArray_MultiIterNew1(a): * return PyArray_MultiIterNew(1, a) # <<<<<<<<<<<<<< * * cdef inline object PyArray_MultiIterNew2(a, b): */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PyArray_MultiIterNew(1, ((void *)__pyx_v_a)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 808, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":807 * ctypedef long double complex clongdouble_t * * cdef inline object PyArray_MultiIterNew1(a): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(1, a) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("numpy.PyArray_MultiIterNew1", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":810 * return PyArray_MultiIterNew(1, a) * * cdef inline object PyArray_MultiIterNew2(a, b): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(2, a, b) * */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_PyArray_MultiIterNew2(PyObject *__pyx_v_a, PyObject *__pyx_v_b) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int 
__pyx_clineno = 0; __Pyx_RefNannySetupContext("PyArray_MultiIterNew2", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":811 * * cdef inline object PyArray_MultiIterNew2(a, b): * return PyArray_MultiIterNew(2, a, b) # <<<<<<<<<<<<<< * * cdef inline object PyArray_MultiIterNew3(a, b, c): */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PyArray_MultiIterNew(2, ((void *)__pyx_v_a), ((void *)__pyx_v_b)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 811, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":810 * return PyArray_MultiIterNew(1, a) * * cdef inline object PyArray_MultiIterNew2(a, b): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(2, a, b) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("numpy.PyArray_MultiIterNew2", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":813 * return PyArray_MultiIterNew(2, a, b) * * cdef inline object PyArray_MultiIterNew3(a, b, c): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(3, a, b, c) * */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_PyArray_MultiIterNew3(PyObject *__pyx_v_a, PyObject *__pyx_v_b, PyObject *__pyx_v_c) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("PyArray_MultiIterNew3", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":814 * * cdef inline object PyArray_MultiIterNew3(a, b, c): * return PyArray_MultiIterNew(3, a, b, c) # <<<<<<<<<<<<<< * * cdef inline object PyArray_MultiIterNew4(a, b, c, d): */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PyArray_MultiIterNew(3, ((void *)__pyx_v_a), ((void *)__pyx_v_b), ((void *)__pyx_v_c)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 814, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":813 * return PyArray_MultiIterNew(2, a, b) * * cdef inline object PyArray_MultiIterNew3(a, b, c): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(3, a, b, c) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("numpy.PyArray_MultiIterNew3", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":816 * return PyArray_MultiIterNew(3, a, b, c) * * cdef inline object PyArray_MultiIterNew4(a, b, c, d): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(4, a, b, c, d) * */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_PyArray_MultiIterNew4(PyObject *__pyx_v_a, PyObject *__pyx_v_b, PyObject *__pyx_v_c, PyObject *__pyx_v_d) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("PyArray_MultiIterNew4", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":817 * * cdef inline object 
PyArray_MultiIterNew4(a, b, c, d): * return PyArray_MultiIterNew(4, a, b, c, d) # <<<<<<<<<<<<<< * * cdef inline object PyArray_MultiIterNew5(a, b, c, d, e): */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PyArray_MultiIterNew(4, ((void *)__pyx_v_a), ((void *)__pyx_v_b), ((void *)__pyx_v_c), ((void *)__pyx_v_d)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 817, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":816 * return PyArray_MultiIterNew(3, a, b, c) * * cdef inline object PyArray_MultiIterNew4(a, b, c, d): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(4, a, b, c, d) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("numpy.PyArray_MultiIterNew4", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":819 * return PyArray_MultiIterNew(4, a, b, c, d) * * cdef inline object PyArray_MultiIterNew5(a, b, c, d, e): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(5, a, b, c, d, e) * */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_PyArray_MultiIterNew5(PyObject *__pyx_v_a, PyObject *__pyx_v_b, PyObject *__pyx_v_c, PyObject *__pyx_v_d, PyObject *__pyx_v_e) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("PyArray_MultiIterNew5", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":820 * * cdef inline object PyArray_MultiIterNew5(a, b, c, d, e): * return PyArray_MultiIterNew(5, a, b, c, d, e) # <<<<<<<<<<<<<< * * cdef inline tuple PyDataType_SHAPE(dtype d): */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PyArray_MultiIterNew(5, ((void *)__pyx_v_a), ((void *)__pyx_v_b), ((void *)__pyx_v_c), ((void *)__pyx_v_d), ((void *)__pyx_v_e)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 820, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":819 * return PyArray_MultiIterNew(4, a, b, c, d) * * cdef inline object PyArray_MultiIterNew5(a, b, c, d, e): # <<<<<<<<<<<<<< * return PyArray_MultiIterNew(5, a, b, c, d, e) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("numpy.PyArray_MultiIterNew5", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":822 * return PyArray_MultiIterNew(5, a, b, c, d, e) * * cdef inline tuple PyDataType_SHAPE(dtype d): # <<<<<<<<<<<<<< * if PyDataType_HASSUBARRAY(d): * return d.subarray.shape */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_PyDataType_SHAPE(PyArray_Descr *__pyx_v_d) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2; __Pyx_RefNannySetupContext("PyDataType_SHAPE", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":823 * * cdef inline tuple PyDataType_SHAPE(dtype d): * if PyDataType_HASSUBARRAY(d): # <<<<<<<<<<<<<< * return d.subarray.shape * else: */ __pyx_t_1 = 
PyDataType_HASSUBARRAY(__pyx_v_d); if (__pyx_t_1) { /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":824 * cdef inline tuple PyDataType_SHAPE(dtype d): * if PyDataType_HASSUBARRAY(d): * return d.subarray.shape # <<<<<<<<<<<<<< * else: * return () */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = __pyx_f_5numpy_5dtype_8subarray_subarray(__pyx_v_d)->shape; __Pyx_INCREF(((PyObject*)__pyx_t_2)); __pyx_r = ((PyObject*)__pyx_t_2); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":823 * * cdef inline tuple PyDataType_SHAPE(dtype d): * if PyDataType_HASSUBARRAY(d): # <<<<<<<<<<<<<< * return d.subarray.shape * else: */ } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":826 * return d.subarray.shape * else: * return () # <<<<<<<<<<<<<< * * */ /*else*/ { __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_empty_tuple); __pyx_r = __pyx_empty_tuple; goto __pyx_L0; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":822 * return PyArray_MultiIterNew(5, a, b, c, d, e) * * cdef inline tuple PyDataType_SHAPE(dtype d): # <<<<<<<<<<<<<< * if PyDataType_HASSUBARRAY(d): * return d.subarray.shape */ /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1010 * int _import_umath() except -1 * * cdef inline void set_array_base(ndarray arr, object base) except *: # <<<<<<<<<<<<<< * Py_INCREF(base) # important to do this before stealing the reference below! * PyArray_SetBaseObject(arr, base) */ static CYTHON_INLINE void __pyx_f_5numpy_set_array_base(PyArrayObject *__pyx_v_arr, PyObject *__pyx_v_base) { int __pyx_t_1; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1011 * * cdef inline void set_array_base(ndarray arr, object base) except *: * Py_INCREF(base) # important to do this before stealing the reference below! # <<<<<<<<<<<<<< * PyArray_SetBaseObject(arr, base) * */ Py_INCREF(__pyx_v_base); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1012 * cdef inline void set_array_base(ndarray arr, object base) except *: * Py_INCREF(base) # important to do this before stealing the reference below! * PyArray_SetBaseObject(arr, base) # <<<<<<<<<<<<<< * * cdef inline object get_array_base(ndarray arr): */ __pyx_t_1 = PyArray_SetBaseObject(__pyx_v_arr, __pyx_v_base); if (unlikely(__pyx_t_1 == ((int)-1))) __PYX_ERR(1, 1012, __pyx_L1_error) /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1010 * int _import_umath() except -1 * * cdef inline void set_array_base(ndarray arr, object base) except *: # <<<<<<<<<<<<<< * Py_INCREF(base) # important to do this before stealing the reference below! 
* PyArray_SetBaseObject(arr, base) */ /* function exit code */ goto __pyx_L0; __pyx_L1_error:; __Pyx_AddTraceback("numpy.set_array_base", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_L0:; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1014 * PyArray_SetBaseObject(arr, base) * * cdef inline object get_array_base(ndarray arr): # <<<<<<<<<<<<<< * base = PyArray_BASE(arr) * if base is NULL: */ static CYTHON_INLINE PyObject *__pyx_f_5numpy_get_array_base(PyArrayObject *__pyx_v_arr) { PyObject *__pyx_v_base; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; __Pyx_RefNannySetupContext("get_array_base", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1015 * * cdef inline object get_array_base(ndarray arr): * base = PyArray_BASE(arr) # <<<<<<<<<<<<<< * if base is NULL: * return None */ __pyx_v_base = PyArray_BASE(__pyx_v_arr); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1016 * cdef inline object get_array_base(ndarray arr): * base = PyArray_BASE(arr) * if base is NULL: # <<<<<<<<<<<<<< * return None * return base */ __pyx_t_1 = (__pyx_v_base == NULL); if (__pyx_t_1) { /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1017 * base = PyArray_BASE(arr) * if base is NULL: * return None # <<<<<<<<<<<<<< * return base * */ __Pyx_XDECREF(__pyx_r); __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1016 * cdef inline object get_array_base(ndarray arr): * base = PyArray_BASE(arr) * if base is NULL: # <<<<<<<<<<<<<< * return None * return base */ } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1018 * if base is NULL: * return None * return base # <<<<<<<<<<<<<< * * # Versions of the import_* functions which are more suitable for */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(((PyObject *)__pyx_v_base)); __pyx_r = ((PyObject *)__pyx_v_base); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1014 * PyArray_SetBaseObject(arr, base) * * cdef inline object get_array_base(ndarray arr): # <<<<<<<<<<<<<< * base = PyArray_BASE(arr) * if base is NULL: */ /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1022 * # Versions of the import_* functions which are more suitable for * # Cython code. * cdef inline int import_array() except -1: # <<<<<<<<<<<<<< * try: * __pyx_import_array() */ static CYTHON_INLINE int __pyx_f_5numpy_import_array(void) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; int __pyx_t_4; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("import_array", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1023 * # Cython code. 
* cdef inline int import_array() except -1: * try: # <<<<<<<<<<<<<< * __pyx_import_array() * except Exception: */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_1, &__pyx_t_2, &__pyx_t_3); __Pyx_XGOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); /*try:*/ { /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1024 * cdef inline int import_array() except -1: * try: * __pyx_import_array() # <<<<<<<<<<<<<< * except Exception: * raise ImportError("numpy._core.multiarray failed to import") */ __pyx_t_4 = _import_array(); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(1, 1024, __pyx_L3_error) /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1023 * # Cython code. * cdef inline int import_array() except -1: * try: # <<<<<<<<<<<<<< * __pyx_import_array() * except Exception: */ } __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; goto __pyx_L8_try_end; __pyx_L3_error:; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1025 * try: * __pyx_import_array() * except Exception: # <<<<<<<<<<<<<< * raise ImportError("numpy._core.multiarray failed to import") * */ __pyx_t_4 = __Pyx_PyErr_ExceptionMatches(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0]))); if (__pyx_t_4) { __Pyx_AddTraceback("numpy.import_array", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_5, &__pyx_t_6, &__pyx_t_7) < 0) __PYX_ERR(1, 1025, __pyx_L5_except_error) __Pyx_XGOTREF(__pyx_t_5); __Pyx_XGOTREF(__pyx_t_6); __Pyx_XGOTREF(__pyx_t_7); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1026 * __pyx_import_array() * except Exception: * raise ImportError("numpy._core.multiarray failed to import") # <<<<<<<<<<<<<< * * cdef inline int import_umath() except -1: */ __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple_, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(1, 1026, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __PYX_ERR(1, 1026, __pyx_L5_except_error) } goto __pyx_L5_except_error; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1023 * # Cython code. * cdef inline int import_array() except -1: * try: # <<<<<<<<<<<<<< * __pyx_import_array() * except Exception: */ __pyx_L5_except_error:; __Pyx_XGIVEREF(__pyx_t_1); __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); goto __pyx_L1_error; __pyx_L8_try_end:; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1022 * # Versions of the import_* functions which are more suitable for * # Cython code. 
* cdef inline int import_array() except -1: # <<<<<<<<<<<<<< * try: * __pyx_import_array() */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_AddTraceback("numpy.import_array", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1028 * raise ImportError("numpy._core.multiarray failed to import") * * cdef inline int import_umath() except -1: # <<<<<<<<<<<<<< * try: * _import_umath() */ static CYTHON_INLINE int __pyx_f_5numpy_import_umath(void) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; int __pyx_t_4; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("import_umath", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1029 * * cdef inline int import_umath() except -1: * try: # <<<<<<<<<<<<<< * _import_umath() * except Exception: */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_1, &__pyx_t_2, &__pyx_t_3); __Pyx_XGOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); /*try:*/ { /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1030 * cdef inline int import_umath() except -1: * try: * _import_umath() # <<<<<<<<<<<<<< * except Exception: * raise ImportError("numpy._core.umath failed to import") */ __pyx_t_4 = _import_umath(); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(1, 1030, __pyx_L3_error) /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1029 * * cdef inline int import_umath() except -1: * try: # <<<<<<<<<<<<<< * _import_umath() * except Exception: */ } __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; goto __pyx_L8_try_end; __pyx_L3_error:; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1031 * try: * _import_umath() * except Exception: # <<<<<<<<<<<<<< * raise ImportError("numpy._core.umath failed to import") * */ __pyx_t_4 = __Pyx_PyErr_ExceptionMatches(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0]))); if (__pyx_t_4) { __Pyx_AddTraceback("numpy.import_umath", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_5, &__pyx_t_6, &__pyx_t_7) < 0) __PYX_ERR(1, 1031, __pyx_L5_except_error) __Pyx_XGOTREF(__pyx_t_5); __Pyx_XGOTREF(__pyx_t_6); __Pyx_XGOTREF(__pyx_t_7); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1032 * _import_umath() * except Exception: * raise ImportError("numpy._core.umath failed to import") # <<<<<<<<<<<<<< * * cdef inline int import_ufunc() except -1: */ __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__2, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(1, 1032, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __PYX_ERR(1, 1032, __pyx_L5_except_error) } goto __pyx_L5_except_error; /* 
"../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1029 * * cdef inline int import_umath() except -1: * try: # <<<<<<<<<<<<<< * _import_umath() * except Exception: */ __pyx_L5_except_error:; __Pyx_XGIVEREF(__pyx_t_1); __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); goto __pyx_L1_error; __pyx_L8_try_end:; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1028 * raise ImportError("numpy._core.multiarray failed to import") * * cdef inline int import_umath() except -1: # <<<<<<<<<<<<<< * try: * _import_umath() */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_AddTraceback("numpy.import_umath", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1034 * raise ImportError("numpy._core.umath failed to import") * * cdef inline int import_ufunc() except -1: # <<<<<<<<<<<<<< * try: * _import_umath() */ static CYTHON_INLINE int __pyx_f_5numpy_import_ufunc(void) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; int __pyx_t_4; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("import_ufunc", 1); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1035 * * cdef inline int import_ufunc() except -1: * try: # <<<<<<<<<<<<<< * _import_umath() * except Exception: */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_1, &__pyx_t_2, &__pyx_t_3); __Pyx_XGOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); /*try:*/ { /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1036 * cdef inline int import_ufunc() except -1: * try: * _import_umath() # <<<<<<<<<<<<<< * except Exception: * raise ImportError("numpy._core.umath failed to import") */ __pyx_t_4 = _import_umath(); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(1, 1036, __pyx_L3_error) /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1035 * * cdef inline int import_ufunc() except -1: * try: # <<<<<<<<<<<<<< * _import_umath() * except Exception: */ } __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; goto __pyx_L8_try_end; __pyx_L3_error:; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1037 * try: * _import_umath() * except Exception: # <<<<<<<<<<<<<< * raise ImportError("numpy._core.umath failed to import") * */ __pyx_t_4 = __Pyx_PyErr_ExceptionMatches(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0]))); if (__pyx_t_4) { __Pyx_AddTraceback("numpy.import_ufunc", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_5, &__pyx_t_6, &__pyx_t_7) < 0) __PYX_ERR(1, 1037, __pyx_L5_except_error) __Pyx_XGOTREF(__pyx_t_5); __Pyx_XGOTREF(__pyx_t_6); __Pyx_XGOTREF(__pyx_t_7); /* 
"../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1038 * _import_umath() * except Exception: * raise ImportError("numpy._core.umath failed to import") # <<<<<<<<<<<<<< * * */ __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__2, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(1, 1038, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __PYX_ERR(1, 1038, __pyx_L5_except_error) } goto __pyx_L5_except_error; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1035 * * cdef inline int import_ufunc() except -1: * try: # <<<<<<<<<<<<<< * _import_umath() * except Exception: */ __pyx_L5_except_error:; __Pyx_XGIVEREF(__pyx_t_1); __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); goto __pyx_L1_error; __pyx_L8_try_end:; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1034 * raise ImportError("numpy._core.umath failed to import") * * cdef inline int import_ufunc() except -1: # <<<<<<<<<<<<<< * try: * _import_umath() */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_AddTraceback("numpy.import_ufunc", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1041 * * * cdef inline bint is_timedelta64_object(object obj) noexcept: # <<<<<<<<<<<<<< * """ * Cython equivalent of `isinstance(obj, np.timedelta64)` */ static CYTHON_INLINE int __pyx_f_5numpy_is_timedelta64_object(PyObject *__pyx_v_obj) { int __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1053 * bool * """ * return PyObject_TypeCheck(obj, &PyTimedeltaArrType_Type) # <<<<<<<<<<<<<< * * */ __pyx_r = PyObject_TypeCheck(__pyx_v_obj, (&PyTimedeltaArrType_Type)); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1041 * * * cdef inline bint is_timedelta64_object(object obj) noexcept: # <<<<<<<<<<<<<< * """ * Cython equivalent of `isinstance(obj, np.timedelta64)` */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1056 * * * cdef inline bint is_datetime64_object(object obj) noexcept: # <<<<<<<<<<<<<< * """ * Cython equivalent of `isinstance(obj, np.datetime64)` */ static CYTHON_INLINE int __pyx_f_5numpy_is_datetime64_object(PyObject *__pyx_v_obj) { int __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1068 * bool * """ * return PyObject_TypeCheck(obj, &PyDatetimeArrType_Type) # <<<<<<<<<<<<<< * * */ __pyx_r = PyObject_TypeCheck(__pyx_v_obj, (&PyDatetimeArrType_Type)); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1056 * * * cdef inline bint is_datetime64_object(object obj) noexcept: # <<<<<<<<<<<<<< * """ * Cython equivalent of `isinstance(obj, np.datetime64)` */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1071 * * * 
cdef inline npy_datetime get_datetime64_value(object obj) noexcept nogil: # <<<<<<<<<<<<<< * """ * returns the int64 value underlying scalar numpy datetime64 object */ static CYTHON_INLINE npy_datetime __pyx_f_5numpy_get_datetime64_value(PyObject *__pyx_v_obj) { npy_datetime __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1078 * also needed. That can be found using `get_datetime64_unit`. * """ * return (obj).obval # <<<<<<<<<<<<<< * * */ __pyx_r = ((PyDatetimeScalarObject *)__pyx_v_obj)->obval; goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1071 * * * cdef inline npy_datetime get_datetime64_value(object obj) noexcept nogil: # <<<<<<<<<<<<<< * """ * returns the int64 value underlying scalar numpy datetime64 object */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1081 * * * cdef inline npy_timedelta get_timedelta64_value(object obj) noexcept nogil: # <<<<<<<<<<<<<< * """ * returns the int64 value underlying scalar numpy timedelta64 object */ static CYTHON_INLINE npy_timedelta __pyx_f_5numpy_get_timedelta64_value(PyObject *__pyx_v_obj) { npy_timedelta __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1085 * returns the int64 value underlying scalar numpy timedelta64 object * """ * return (obj).obval # <<<<<<<<<<<<<< * * */ __pyx_r = ((PyTimedeltaScalarObject *)__pyx_v_obj)->obval; goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1081 * * * cdef inline npy_timedelta get_timedelta64_value(object obj) noexcept nogil: # <<<<<<<<<<<<<< * """ * returns the int64 value underlying scalar numpy timedelta64 object */ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1088 * * * cdef inline NPY_DATETIMEUNIT get_datetime64_unit(object obj) noexcept nogil: # <<<<<<<<<<<<<< * """ * returns the unit part of the dtype for a numpy datetime64 object. */ static CYTHON_INLINE NPY_DATETIMEUNIT __pyx_f_5numpy_get_datetime64_unit(PyObject *__pyx_v_obj) { NPY_DATETIMEUNIT __pyx_r; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1092 * returns the unit part of the dtype for a numpy datetime64 object. * """ * return (obj).obmeta.base # <<<<<<<<<<<<<< * * */ __pyx_r = ((NPY_DATETIMEUNIT)((PyDatetimeScalarObject *)__pyx_v_obj)->obmeta.base); goto __pyx_L0; /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1088 * * * cdef inline NPY_DATETIMEUNIT get_datetime64_unit(object obj) noexcept nogil: # <<<<<<<<<<<<<< * """ * returns the unit part of the dtype for a numpy datetime64 object. 
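*
* [Editor's note] The scalar helpers in this block (is_timedelta64_object,
* is_datetime64_object, get_datetime64_value, get_timedelta64_value and
* get_datetime64_unit) read the raw int64 payload `obval` and the unit code
* `obmeta.base` directly from the NumPy scalar structs. A minimal pure-Python
* sketch of the same information, assumed equivalent and illustrative only:
*
*     import numpy as np
*     d = np.datetime64("2020-01-02", "D")
*     payload = d.astype("int64")       # the int64 that get_datetime64_value reads
*     unit = np.datetime_data(d.dtype)  # ('D', 1): the unit that obmeta.base encodes
*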
*/ /* function exit code */ __pyx_L0:; return __pyx_r; } /* "gammapy/stats/fit_statistics_cython.pyx":15 * TRUNCATION_VALUE = 1e-25 * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def cash_sum_cython(np.ndarray[np.float_t, ndim=1] counts, */ /* Python wrapper */ static PyObject *__pyx_pw_7gammapy_5stats_21fit_statistics_cython_1cash_sum_cython(PyObject *__pyx_self, #if CYTHON_METH_FASTCALL PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds #else PyObject *__pyx_args, PyObject *__pyx_kwds #endif ); /*proto*/ PyDoc_STRVAR(__pyx_doc_7gammapy_5stats_21fit_statistics_cython_cash_sum_cython, "Summed cash fit statistics.\n\n Parameters\n ----------\n counts : `~numpy.ndarray`\n Counts array.\n npred : `~numpy.ndarray`\n Predicted counts array.\n "); static PyMethodDef __pyx_mdef_7gammapy_5stats_21fit_statistics_cython_1cash_sum_cython = {"cash_sum_cython", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_7gammapy_5stats_21fit_statistics_cython_1cash_sum_cython, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_7gammapy_5stats_21fit_statistics_cython_cash_sum_cython}; static PyObject *__pyx_pw_7gammapy_5stats_21fit_statistics_cython_1cash_sum_cython(PyObject *__pyx_self, #if CYTHON_METH_FASTCALL PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds #else PyObject *__pyx_args, PyObject *__pyx_kwds #endif ) { PyArrayObject *__pyx_v_counts = 0; PyArrayObject *__pyx_v_npred = 0; #if !CYTHON_METH_FASTCALL CYTHON_UNUSED Py_ssize_t __pyx_nargs; #endif CYTHON_UNUSED PyObject *const *__pyx_kwvalues; PyObject* values[2] = {0,0}; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("cash_sum_cython (wrapper)", 0); #if !CYTHON_METH_FASTCALL #if CYTHON_ASSUME_SAFE_MACROS __pyx_nargs = PyTuple_GET_SIZE(__pyx_args); #else __pyx_nargs = PyTuple_Size(__pyx_args); if (unlikely(__pyx_nargs < 0)) return NULL; #endif #endif __pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs); { PyObject **__pyx_pyargnames[] = {&__pyx_n_s_counts,&__pyx_n_s_npred,0}; if (__pyx_kwds) { Py_ssize_t kw_args; switch (__pyx_nargs) { case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds); switch (__pyx_nargs) { case 0: if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_counts)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[0]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 15, __pyx_L3_error) else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_npred)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[1]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 15, __pyx_L3_error) else { __Pyx_RaiseArgtupleInvalid("cash_sum_cython", 1, 2, 2, 1); __PYX_ERR(0, 15, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { const Py_ssize_t kwd_pos_args = __pyx_nargs; if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "cash_sum_cython") < 0)) __PYX_ERR(0, 15, __pyx_L3_error) } } else if (unlikely(__pyx_nargs != 2)) { goto __pyx_L5_argtuple_error; } else { values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 
1); } __pyx_v_counts = ((PyArrayObject *)values[0]); __pyx_v_npred = ((PyArrayObject *)values[1]); } goto __pyx_L6_skip; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("cash_sum_cython", 1, 2, 2, __pyx_nargs); __PYX_ERR(0, 15, __pyx_L3_error) __pyx_L6_skip:; goto __pyx_L4_argument_unpacking_done; __pyx_L3_error:; { Py_ssize_t __pyx_temp; for (__pyx_temp=0; __pyx_temp < (Py_ssize_t)(sizeof(values)/sizeof(values[0])); ++__pyx_temp) { __Pyx_Arg_XDECREF_FASTCALL(values[__pyx_temp]); } } __Pyx_AddTraceback("gammapy.stats.fit_statistics_cython.cash_sum_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_counts), __pyx_ptype_5numpy_ndarray, 1, "counts", 0))) __PYX_ERR(0, 17, __pyx_L1_error) if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_npred), __pyx_ptype_5numpy_ndarray, 1, "npred", 0))) __PYX_ERR(0, 18, __pyx_L1_error) __pyx_r = __pyx_pf_7gammapy_5stats_21fit_statistics_cython_cash_sum_cython(__pyx_self, __pyx_v_counts, __pyx_v_npred); /* function exit code */ goto __pyx_L0; __pyx_L1_error:; __pyx_r = NULL; __pyx_L0:; { Py_ssize_t __pyx_temp; for (__pyx_temp=0; __pyx_temp < (Py_ssize_t)(sizeof(values)/sizeof(values[0])); ++__pyx_temp) { __Pyx_Arg_XDECREF_FASTCALL(values[__pyx_temp]); } } __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7gammapy_5stats_21fit_statistics_cython_cash_sum_cython(CYTHON_UNUSED PyObject *__pyx_self, PyArrayObject *__pyx_v_counts, PyArrayObject *__pyx_v_npred) { __pyx_t_5numpy_float_t __pyx_v_sum; __pyx_t_5numpy_float_t __pyx_v_npr; __pyx_t_5numpy_float_t __pyx_v_lognpr; unsigned int __pyx_v_i; unsigned int __pyx_v_ni; __pyx_t_5numpy_float_t __pyx_v_trunc; __pyx_t_5numpy_float_t __pyx_v_logtrunc; __Pyx_LocalBuf_ND __pyx_pybuffernd_counts; __Pyx_Buffer __pyx_pybuffer_counts; __Pyx_LocalBuf_ND __pyx_pybuffernd_npred; __Pyx_Buffer __pyx_pybuffer_npred; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __pyx_t_5numpy_float_t __pyx_t_2; float __pyx_t_3; unsigned int __pyx_t_4; unsigned int __pyx_t_5; unsigned int __pyx_t_6; size_t __pyx_t_7; int __pyx_t_8; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("cash_sum_cython", 1); __pyx_pybuffer_counts.pybuffer.buf = NULL; __pyx_pybuffer_counts.refcount = 0; __pyx_pybuffernd_counts.data = NULL; __pyx_pybuffernd_counts.rcbuffer = &__pyx_pybuffer_counts; __pyx_pybuffer_npred.pybuffer.buf = NULL; __pyx_pybuffer_npred.refcount = 0; __pyx_pybuffernd_npred.data = NULL; __pyx_pybuffernd_npred.rcbuffer = &__pyx_pybuffer_npred; { __Pyx_BufFmt_StackElem __pyx_stack[1]; if (unlikely(__Pyx_GetBufferAndValidate(&__pyx_pybuffernd_counts.rcbuffer->pybuffer, (PyObject*)__pyx_v_counts, &__Pyx_TypeInfo_nn___pyx_t_5numpy_float_t, PyBUF_FORMAT| PyBUF_STRIDES, 1, 0, __pyx_stack) == -1)) __PYX_ERR(0, 15, __pyx_L1_error) } __pyx_pybuffernd_counts.diminfo[0].strides = __pyx_pybuffernd_counts.rcbuffer->pybuffer.strides[0]; __pyx_pybuffernd_counts.diminfo[0].shape = __pyx_pybuffernd_counts.rcbuffer->pybuffer.shape[0]; { __Pyx_BufFmt_StackElem __pyx_stack[1]; if (unlikely(__Pyx_GetBufferAndValidate(&__pyx_pybuffernd_npred.rcbuffer->pybuffer, (PyObject*)__pyx_v_npred, &__Pyx_TypeInfo_nn___pyx_t_5numpy_float_t, PyBUF_FORMAT| PyBUF_STRIDES, 1, 0, __pyx_stack) == -1)) __PYX_ERR(0, 15, __pyx_L1_error) } __pyx_pybuffernd_npred.diminfo[0].strides = __pyx_pybuffernd_npred.rcbuffer->pybuffer.strides[0]; 
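/* [Editor's note] With both buffers acquired, the loop below implements the
 * summed Cash statistic C = 2 * sum_i [ mu_i - n_i * log(mu_i) ], with
 * n_i = counts[i] and mu_i = npred[i] clamped from below at TRUNCATION_VALUE
 * (1e-25, per the .pyx source echoed above) so that log(mu_i) stays finite;
 * the n_i * log(mu_i) term is accumulated only where n_i > 0. A minimal NumPy
 * sketch of the same computation, assumed equivalent and illustrative only:
 *
 *     import numpy as np
 *     def cash_sum(counts, npred, trunc=1e-25):
 *         mu = np.maximum(npred, trunc)                    # clamp like the C loop
 *         log_term = np.where(counts > 0, counts * np.log(mu), 0.0)
 *         return 2.0 * np.sum(mu - log_term)
 */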
__pyx_pybuffernd_npred.diminfo[0].shape = __pyx_pybuffernd_npred.rcbuffer->pybuffer.shape[0]; /* "gammapy/stats/fit_statistics_cython.pyx":28 * Predicted counts array. * """ * cdef np.float_t sum = 0 # <<<<<<<<<<<<<< * cdef np.float_t npr, lognpr * cdef unsigned int i, ni */ __pyx_v_sum = 0.0; /* "gammapy/stats/fit_statistics_cython.pyx":31 * cdef np.float_t npr, lognpr * cdef unsigned int i, ni * cdef np.float_t trunc = TRUNCATION_VALUE # <<<<<<<<<<<<<< * cdef np.float_t logtrunc = log(TRUNCATION_VALUE) * */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_TRUNCATION_VALUE); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 31, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = __pyx_PyFloat_AsDouble(__pyx_t_1); if (unlikely((__pyx_t_2 == ((npy_double)-1)) && PyErr_Occurred())) __PYX_ERR(0, 31, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_v_trunc = __pyx_t_2; /* "gammapy/stats/fit_statistics_cython.pyx":32 * cdef unsigned int i, ni * cdef np.float_t trunc = TRUNCATION_VALUE * cdef np.float_t logtrunc = log(TRUNCATION_VALUE) # <<<<<<<<<<<<<< * * ni = counts.shape[0] */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_TRUNCATION_VALUE); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 32, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_3 = __pyx_PyFloat_AsFloat(__pyx_t_1); if (unlikely((__pyx_t_3 == (float)-1) && PyErr_Occurred())) __PYX_ERR(0, 32, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_v_logtrunc = log(__pyx_t_3); /* "gammapy/stats/fit_statistics_cython.pyx":34 * cdef np.float_t logtrunc = log(TRUNCATION_VALUE) * * ni = counts.shape[0] # <<<<<<<<<<<<<< * for i in range(ni): * npr = npred[i] */ __pyx_v_ni = (__pyx_f_5numpy_7ndarray_5shape_shape(((PyArrayObject *)__pyx_v_counts))[0]); /* "gammapy/stats/fit_statistics_cython.pyx":35 * * ni = counts.shape[0] * for i in range(ni): # <<<<<<<<<<<<<< * npr = npred[i] * if npr > trunc: */ __pyx_t_4 = __pyx_v_ni; __pyx_t_5 = __pyx_t_4; for (__pyx_t_6 = 0; __pyx_t_6 < __pyx_t_5; __pyx_t_6+=1) { __pyx_v_i = __pyx_t_6; /* "gammapy/stats/fit_statistics_cython.pyx":36 * ni = counts.shape[0] * for i in range(ni): * npr = npred[i] # <<<<<<<<<<<<<< * if npr > trunc: * lognpr = log(npr) */ __pyx_t_7 = __pyx_v_i; __pyx_v_npr = (*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_npred.rcbuffer->pybuffer.buf, __pyx_t_7, __pyx_pybuffernd_npred.diminfo[0].strides)); /* "gammapy/stats/fit_statistics_cython.pyx":37 * for i in range(ni): * npr = npred[i] * if npr > trunc: # <<<<<<<<<<<<<< * lognpr = log(npr) * else: */ __pyx_t_8 = (__pyx_v_npr > __pyx_v_trunc); if (__pyx_t_8) { /* "gammapy/stats/fit_statistics_cython.pyx":38 * npr = npred[i] * if npr > trunc: * lognpr = log(npr) # <<<<<<<<<<<<<< * else: * npr = trunc */ __pyx_v_lognpr = log(__pyx_v_npr); /* "gammapy/stats/fit_statistics_cython.pyx":37 * for i in range(ni): * npr = npred[i] * if npr > trunc: # <<<<<<<<<<<<<< * lognpr = log(npr) * else: */ goto __pyx_L5; } /* "gammapy/stats/fit_statistics_cython.pyx":40 * lognpr = log(npr) * else: * npr = trunc # <<<<<<<<<<<<<< * lognpr = logtrunc * */ /*else*/ { __pyx_v_npr = __pyx_v_trunc; /* "gammapy/stats/fit_statistics_cython.pyx":41 * else: * npr = trunc * lognpr = logtrunc # <<<<<<<<<<<<<< * * sum += npr */ __pyx_v_lognpr = __pyx_v_logtrunc; } __pyx_L5:; /* "gammapy/stats/fit_statistics_cython.pyx":43 * lognpr = logtrunc * * sum += npr # <<<<<<<<<<<<<< * if counts[i] > 0: * sum -= counts[i] * lognpr */ __pyx_v_sum = (__pyx_v_sum + __pyx_v_npr); /* "gammapy/stats/fit_statistics_cython.pyx":44 * * sum += npr * if counts[i] > 0: # 
<<<<<<<<<<<<<< * sum -= counts[i] * lognpr * */ __pyx_t_7 = __pyx_v_i; __pyx_t_8 = ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_counts.rcbuffer->pybuffer.buf, __pyx_t_7, __pyx_pybuffernd_counts.diminfo[0].strides)) > 0.0); if (__pyx_t_8) { /* "gammapy/stats/fit_statistics_cython.pyx":45 * sum += npr * if counts[i] > 0: * sum -= counts[i] * lognpr # <<<<<<<<<<<<<< * * return 2 * sum */ __pyx_t_7 = __pyx_v_i; __pyx_v_sum = (__pyx_v_sum - ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_counts.rcbuffer->pybuffer.buf, __pyx_t_7, __pyx_pybuffernd_counts.diminfo[0].strides)) * __pyx_v_lognpr)); /* "gammapy/stats/fit_statistics_cython.pyx":44 * * sum += npr * if counts[i] > 0: # <<<<<<<<<<<<<< * sum -= counts[i] * lognpr * */ } } /* "gammapy/stats/fit_statistics_cython.pyx":47 * sum -= counts[i] * lognpr * * return 2 * sum # <<<<<<<<<<<<<< * * */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PyFloat_FromDouble((2.0 * __pyx_v_sum)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 47, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "gammapy/stats/fit_statistics_cython.pyx":15 * TRUNCATION_VALUE = 1e-25 * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def cash_sum_cython(np.ndarray[np.float_t, ndim=1] counts, */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); { PyObject *__pyx_type, *__pyx_value, *__pyx_tb; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ErrFetch(&__pyx_type, &__pyx_value, &__pyx_tb); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_counts.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_npred.rcbuffer->pybuffer); __Pyx_ErrRestore(__pyx_type, __pyx_value, __pyx_tb);} __Pyx_AddTraceback("gammapy.stats.fit_statistics_cython.cash_sum_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; goto __pyx_L2; __pyx_L0:; __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_counts.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_npred.rcbuffer->pybuffer); __pyx_L2:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "gammapy/stats/fit_statistics_cython.pyx":50 * * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def f_cash_root_cython(np.float_t x, np.ndarray[np.float_t, ndim=1] counts, */ /* Python wrapper */ static PyObject *__pyx_pw_7gammapy_5stats_21fit_statistics_cython_3f_cash_root_cython(PyObject *__pyx_self, #if CYTHON_METH_FASTCALL PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds #else PyObject *__pyx_args, PyObject *__pyx_kwds #endif ); /*proto*/ PyDoc_STRVAR(__pyx_doc_7gammapy_5stats_21fit_statistics_cython_2f_cash_root_cython, "Function to find root of. 
Described in Appendix A, Stewart (2009).\n\n Parameters\n ----------\n x : float\n Model amplitude.\n counts : `~numpy.ndarray`\n Count image slice, where model is defined.\n background : `~numpy.ndarray`\n Background image slice, where model is defined.\n model : `~numpy.ndarray`\n Source template (multiplied with exposure).\n "); static PyMethodDef __pyx_mdef_7gammapy_5stats_21fit_statistics_cython_3f_cash_root_cython = {"f_cash_root_cython", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_7gammapy_5stats_21fit_statistics_cython_3f_cash_root_cython, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_7gammapy_5stats_21fit_statistics_cython_2f_cash_root_cython}; static PyObject *__pyx_pw_7gammapy_5stats_21fit_statistics_cython_3f_cash_root_cython(PyObject *__pyx_self, #if CYTHON_METH_FASTCALL PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds #else PyObject *__pyx_args, PyObject *__pyx_kwds #endif ) { __pyx_t_5numpy_float_t __pyx_v_x; PyArrayObject *__pyx_v_counts = 0; PyArrayObject *__pyx_v_background = 0; PyArrayObject *__pyx_v_model = 0; #if !CYTHON_METH_FASTCALL CYTHON_UNUSED Py_ssize_t __pyx_nargs; #endif CYTHON_UNUSED PyObject *const *__pyx_kwvalues; PyObject* values[4] = {0,0,0,0}; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("f_cash_root_cython (wrapper)", 0); #if !CYTHON_METH_FASTCALL #if CYTHON_ASSUME_SAFE_MACROS __pyx_nargs = PyTuple_GET_SIZE(__pyx_args); #else __pyx_nargs = PyTuple_Size(__pyx_args); if (unlikely(__pyx_nargs < 0)) return NULL; #endif #endif __pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs); { PyObject **__pyx_pyargnames[] = {&__pyx_n_s_x,&__pyx_n_s_counts,&__pyx_n_s_background,&__pyx_n_s_model,0}; if (__pyx_kwds) { Py_ssize_t kw_args; switch (__pyx_nargs) { case 4: values[3] = __Pyx_Arg_FASTCALL(__pyx_args, 3); CYTHON_FALLTHROUGH; case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds); switch (__pyx_nargs) { case 0: if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_x)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[0]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 50, __pyx_L3_error) else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_counts)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[1]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 50, __pyx_L3_error) else { __Pyx_RaiseArgtupleInvalid("f_cash_root_cython", 1, 4, 4, 1); __PYX_ERR(0, 50, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (likely((values[2] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_background)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[2]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 50, __pyx_L3_error) else { __Pyx_RaiseArgtupleInvalid("f_cash_root_cython", 1, 4, 4, 2); __PYX_ERR(0, 50, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 3: if (likely((values[3] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_model)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[3]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 
50, __pyx_L3_error) else { __Pyx_RaiseArgtupleInvalid("f_cash_root_cython", 1, 4, 4, 3); __PYX_ERR(0, 50, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { const Py_ssize_t kwd_pos_args = __pyx_nargs; if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "f_cash_root_cython") < 0)) __PYX_ERR(0, 50, __pyx_L3_error) } } else if (unlikely(__pyx_nargs != 4)) { goto __pyx_L5_argtuple_error; } else { values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); values[3] = __Pyx_Arg_FASTCALL(__pyx_args, 3); } __pyx_v_x = __pyx_PyFloat_AsDouble(values[0]); if (unlikely((__pyx_v_x == ((npy_double)-1)) && PyErr_Occurred())) __PYX_ERR(0, 52, __pyx_L3_error) __pyx_v_counts = ((PyArrayObject *)values[1]); __pyx_v_background = ((PyArrayObject *)values[2]); __pyx_v_model = ((PyArrayObject *)values[3]); } goto __pyx_L6_skip; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("f_cash_root_cython", 1, 4, 4, __pyx_nargs); __PYX_ERR(0, 50, __pyx_L3_error) __pyx_L6_skip:; goto __pyx_L4_argument_unpacking_done; __pyx_L3_error:; { Py_ssize_t __pyx_temp; for (__pyx_temp=0; __pyx_temp < (Py_ssize_t)(sizeof(values)/sizeof(values[0])); ++__pyx_temp) { __Pyx_Arg_XDECREF_FASTCALL(values[__pyx_temp]); } } __Pyx_AddTraceback("gammapy.stats.fit_statistics_cython.f_cash_root_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_counts), __pyx_ptype_5numpy_ndarray, 1, "counts", 0))) __PYX_ERR(0, 52, __pyx_L1_error) if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_background), __pyx_ptype_5numpy_ndarray, 1, "background", 0))) __PYX_ERR(0, 53, __pyx_L1_error) if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_model), __pyx_ptype_5numpy_ndarray, 1, "model", 0))) __PYX_ERR(0, 54, __pyx_L1_error) __pyx_r = __pyx_pf_7gammapy_5stats_21fit_statistics_cython_2f_cash_root_cython(__pyx_self, __pyx_v_x, __pyx_v_counts, __pyx_v_background, __pyx_v_model); /* function exit code */ goto __pyx_L0; __pyx_L1_error:; __pyx_r = NULL; __pyx_L0:; { Py_ssize_t __pyx_temp; for (__pyx_temp=0; __pyx_temp < (Py_ssize_t)(sizeof(values)/sizeof(values[0])); ++__pyx_temp) { __Pyx_Arg_XDECREF_FASTCALL(values[__pyx_temp]); } } __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7gammapy_5stats_21fit_statistics_cython_2f_cash_root_cython(CYTHON_UNUSED PyObject *__pyx_self, __pyx_t_5numpy_float_t __pyx_v_x, PyArrayObject *__pyx_v_counts, PyArrayObject *__pyx_v_background, PyArrayObject *__pyx_v_model) { __pyx_t_5numpy_float_t __pyx_v_sum; unsigned int __pyx_v_i; unsigned int __pyx_v_ni; __Pyx_LocalBuf_ND __pyx_pybuffernd_background; __Pyx_Buffer __pyx_pybuffer_background; __Pyx_LocalBuf_ND __pyx_pybuffernd_counts; __Pyx_Buffer __pyx_pybuffer_counts; __Pyx_LocalBuf_ND __pyx_pybuffernd_model; __Pyx_Buffer __pyx_pybuffer_model; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations unsigned int __pyx_t_1; unsigned int __pyx_t_2; unsigned int __pyx_t_3; size_t __pyx_t_4; int __pyx_t_5; size_t __pyx_t_6; size_t __pyx_t_7; size_t __pyx_t_8; PyObject *__pyx_t_9 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("f_cash_root_cython", 1); __pyx_pybuffer_counts.pybuffer.buf = NULL; __pyx_pybuffer_counts.refcount = 0; __pyx_pybuffernd_counts.data = NULL; __pyx_pybuffernd_counts.rcbuffer = 
&__pyx_pybuffer_counts; __pyx_pybuffer_background.pybuffer.buf = NULL; __pyx_pybuffer_background.refcount = 0; __pyx_pybuffernd_background.data = NULL; __pyx_pybuffernd_background.rcbuffer = &__pyx_pybuffer_background; __pyx_pybuffer_model.pybuffer.buf = NULL; __pyx_pybuffer_model.refcount = 0; __pyx_pybuffernd_model.data = NULL; __pyx_pybuffernd_model.rcbuffer = &__pyx_pybuffer_model; { __Pyx_BufFmt_StackElem __pyx_stack[1]; if (unlikely(__Pyx_GetBufferAndValidate(&__pyx_pybuffernd_counts.rcbuffer->pybuffer, (PyObject*)__pyx_v_counts, &__Pyx_TypeInfo_nn___pyx_t_5numpy_float_t, PyBUF_FORMAT| PyBUF_STRIDES, 1, 0, __pyx_stack) == -1)) __PYX_ERR(0, 50, __pyx_L1_error) } __pyx_pybuffernd_counts.diminfo[0].strides = __pyx_pybuffernd_counts.rcbuffer->pybuffer.strides[0]; __pyx_pybuffernd_counts.diminfo[0].shape = __pyx_pybuffernd_counts.rcbuffer->pybuffer.shape[0]; { __Pyx_BufFmt_StackElem __pyx_stack[1]; if (unlikely(__Pyx_GetBufferAndValidate(&__pyx_pybuffernd_background.rcbuffer->pybuffer, (PyObject*)__pyx_v_background, &__Pyx_TypeInfo_nn___pyx_t_5numpy_float_t, PyBUF_FORMAT| PyBUF_STRIDES, 1, 0, __pyx_stack) == -1)) __PYX_ERR(0, 50, __pyx_L1_error) } __pyx_pybuffernd_background.diminfo[0].strides = __pyx_pybuffernd_background.rcbuffer->pybuffer.strides[0]; __pyx_pybuffernd_background.diminfo[0].shape = __pyx_pybuffernd_background.rcbuffer->pybuffer.shape[0]; { __Pyx_BufFmt_StackElem __pyx_stack[1]; if (unlikely(__Pyx_GetBufferAndValidate(&__pyx_pybuffernd_model.rcbuffer->pybuffer, (PyObject*)__pyx_v_model, &__Pyx_TypeInfo_nn___pyx_t_5numpy_float_t, PyBUF_FORMAT| PyBUF_STRIDES, 1, 0, __pyx_stack) == -1)) __PYX_ERR(0, 50, __pyx_L1_error) } __pyx_pybuffernd_model.diminfo[0].strides = __pyx_pybuffernd_model.rcbuffer->pybuffer.strides[0]; __pyx_pybuffernd_model.diminfo[0].shape = __pyx_pybuffernd_model.rcbuffer->pybuffer.shape[0]; /* "gammapy/stats/fit_statistics_cython.pyx":68 * Source template (multiplied with exposure). 
* """ * cdef np.float_t sum = 0 # <<<<<<<<<<<<<< * cdef unsigned int i, ni * ni = counts.shape[0] */ __pyx_v_sum = 0.0; /* "gammapy/stats/fit_statistics_cython.pyx":70 * cdef np.float_t sum = 0 * cdef unsigned int i, ni * ni = counts.shape[0] # <<<<<<<<<<<<<< * for i in range(ni): * if model[i] > 0: */ __pyx_v_ni = (__pyx_f_5numpy_7ndarray_5shape_shape(((PyArrayObject *)__pyx_v_counts))[0]); /* "gammapy/stats/fit_statistics_cython.pyx":71 * cdef unsigned int i, ni * ni = counts.shape[0] * for i in range(ni): # <<<<<<<<<<<<<< * if model[i] > 0: * if counts[i] > 0: */ __pyx_t_1 = __pyx_v_ni; __pyx_t_2 = __pyx_t_1; for (__pyx_t_3 = 0; __pyx_t_3 < __pyx_t_2; __pyx_t_3+=1) { __pyx_v_i = __pyx_t_3; /* "gammapy/stats/fit_statistics_cython.pyx":72 * ni = counts.shape[0] * for i in range(ni): * if model[i] > 0: # <<<<<<<<<<<<<< * if counts[i] > 0: * sum += model[i] * (1 - counts[i] / (x * model[i] + background[i])) */ __pyx_t_4 = __pyx_v_i; __pyx_t_5 = ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_4, __pyx_pybuffernd_model.diminfo[0].strides)) > 0.0); if (__pyx_t_5) { /* "gammapy/stats/fit_statistics_cython.pyx":73 * for i in range(ni): * if model[i] > 0: * if counts[i] > 0: # <<<<<<<<<<<<<< * sum += model[i] * (1 - counts[i] / (x * model[i] + background[i])) * else: */ __pyx_t_4 = __pyx_v_i; __pyx_t_5 = ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_counts.rcbuffer->pybuffer.buf, __pyx_t_4, __pyx_pybuffernd_counts.diminfo[0].strides)) > 0.0); if (__pyx_t_5) { /* "gammapy/stats/fit_statistics_cython.pyx":74 * if model[i] > 0: * if counts[i] > 0: * sum += model[i] * (1 - counts[i] / (x * model[i] + background[i])) # <<<<<<<<<<<<<< * else: * sum += model[i] */ __pyx_t_4 = __pyx_v_i; __pyx_t_6 = __pyx_v_i; __pyx_t_7 = __pyx_v_i; __pyx_t_8 = __pyx_v_i; __pyx_v_sum = (__pyx_v_sum + ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_4, __pyx_pybuffernd_model.diminfo[0].strides)) * (1.0 - ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_counts.rcbuffer->pybuffer.buf, __pyx_t_6, __pyx_pybuffernd_counts.diminfo[0].strides)) / ((__pyx_v_x * (*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_7, __pyx_pybuffernd_model.diminfo[0].strides))) + (*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_background.rcbuffer->pybuffer.buf, __pyx_t_8, __pyx_pybuffernd_background.diminfo[0].strides))))))); /* "gammapy/stats/fit_statistics_cython.pyx":73 * for i in range(ni): * if model[i] > 0: * if counts[i] > 0: # <<<<<<<<<<<<<< * sum += model[i] * (1 - counts[i] / (x * model[i] + background[i])) * else: */ goto __pyx_L6; } /* "gammapy/stats/fit_statistics_cython.pyx":76 * sum += model[i] * (1 - counts[i] / (x * model[i] + background[i])) * else: * sum += model[i] # <<<<<<<<<<<<<< * * # 2 is required to maintain the correct normalization of the */ /*else*/ { __pyx_t_8 = __pyx_v_i; __pyx_v_sum = (__pyx_v_sum + (*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_8, __pyx_pybuffernd_model.diminfo[0].strides))); } __pyx_L6:; /* "gammapy/stats/fit_statistics_cython.pyx":72 * ni = counts.shape[0] * for i in range(ni): * if model[i] > 0: # <<<<<<<<<<<<<< * if counts[i] > 0: * sum += model[i] * (1 - counts[i] / (x * model[i] + background[i])) */ } } /* "gammapy/stats/fit_statistics_cython.pyx":81 * # derivative of the likelihood function. 
It doesn't change the result of * # the fit. * return 2 * sum # <<<<<<<<<<<<<< * * */ __Pyx_XDECREF(__pyx_r); __pyx_t_9 = PyFloat_FromDouble((2.0 * __pyx_v_sum)); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 81, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_9); __pyx_r = __pyx_t_9; __pyx_t_9 = 0; goto __pyx_L0; /* "gammapy/stats/fit_statistics_cython.pyx":50 * * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def f_cash_root_cython(np.float_t x, np.ndarray[np.float_t, ndim=1] counts, */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_9); { PyObject *__pyx_type, *__pyx_value, *__pyx_tb; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ErrFetch(&__pyx_type, &__pyx_value, &__pyx_tb); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_background.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_counts.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_model.rcbuffer->pybuffer); __Pyx_ErrRestore(__pyx_type, __pyx_value, __pyx_tb);} __Pyx_AddTraceback("gammapy.stats.fit_statistics_cython.f_cash_root_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; goto __pyx_L2; __pyx_L0:; __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_background.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_counts.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_model.rcbuffer->pybuffer); __pyx_L2:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "gammapy/stats/fit_statistics_cython.pyx":84 * * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def norm_bounds_cython(np.ndarray[np.float_t, ndim=1] counts, */ /* Python wrapper */ static PyObject *__pyx_pw_7gammapy_5stats_21fit_statistics_cython_5norm_bounds_cython(PyObject *__pyx_self, #if CYTHON_METH_FASTCALL PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds #else PyObject *__pyx_args, PyObject *__pyx_kwds #endif ); /*proto*/ PyDoc_STRVAR(__pyx_doc_7gammapy_5stats_21fit_statistics_cython_4norm_bounds_cython, "Compute bounds for the root of `_f_cash_root_cython`.\n\n Parameters\n ----------\n counts : `~numpy.ndarray`\n Counts image\n background : `~numpy.ndarray`\n Background image\n model : `~numpy.ndarray`\n Source template (multiplied with exposure).\n "); static PyMethodDef __pyx_mdef_7gammapy_5stats_21fit_statistics_cython_5norm_bounds_cython = {"norm_bounds_cython", (PyCFunction)(void*)(__Pyx_PyCFunction_FastCallWithKeywords)__pyx_pw_7gammapy_5stats_21fit_statistics_cython_5norm_bounds_cython, __Pyx_METH_FASTCALL|METH_KEYWORDS, __pyx_doc_7gammapy_5stats_21fit_statistics_cython_4norm_bounds_cython}; static PyObject *__pyx_pw_7gammapy_5stats_21fit_statistics_cython_5norm_bounds_cython(PyObject *__pyx_self, #if CYTHON_METH_FASTCALL PyObject *const *__pyx_args, Py_ssize_t __pyx_nargs, PyObject *__pyx_kwds #else PyObject *__pyx_args, PyObject *__pyx_kwds #endif ) { PyArrayObject *__pyx_v_counts = 0; PyArrayObject *__pyx_v_background = 0; PyArrayObject *__pyx_v_model = 0; #if !CYTHON_METH_FASTCALL CYTHON_UNUSED Py_ssize_t __pyx_nargs; #endif CYTHON_UNUSED PyObject *const *__pyx_kwvalues; PyObject* values[3] = {0,0,0}; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("norm_bounds_cython (wrapper)", 0); #if !CYTHON_METH_FASTCALL #if CYTHON_ASSUME_SAFE_MACROS __pyx_nargs = PyTuple_GET_SIZE(__pyx_args); #else __pyx_nargs = PyTuple_Size(__pyx_args); if (unlikely(__pyx_nargs < 0)) return NULL; #endif 
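/* Note on the `return 2 * sum` above (f_cash_root_cython): with the Cash
   statistic written as C = 2 * \sum_i (npred_i - counts_i * \ln npred_i) and
   npred_i = x * model_i + background_i, the derivative with respect to the
   source normalisation x is

       dC/dx = 2 * \sum_i model_i * (1 - counts_i / (x * model_i + background_i)),

   so the factor 2 keeps the derivative normalised consistently with C while
   leaving its root, and hence the fit result, unchanged. A minimal NumPy
   sketch of the same loop, assuming 1-D float64 arrays and
   x * model + background > 0 in every bin (an illustration, not part of the
   generated module):

       import numpy as np

       def f_cash_root(x, counts, background, model):
           npred = x * model + background
           # bins with counts > 0 get the full term; counts == 0 bins reduce to model
           term = np.where(counts > 0, model * (1.0 - counts / npred), model)
           # only bins with model > 0 contribute, as in the loop above
           return 2.0 * float(np.sum(term[model > 0]))
*/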
#endif __pyx_kwvalues = __Pyx_KwValues_FASTCALL(__pyx_args, __pyx_nargs); { PyObject **__pyx_pyargnames[] = {&__pyx_n_s_counts,&__pyx_n_s_background,&__pyx_n_s_model,0}; if (__pyx_kwds) { Py_ssize_t kw_args; switch (__pyx_nargs) { case 3: values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = __Pyx_NumKwargs_FASTCALL(__pyx_kwds); switch (__pyx_nargs) { case 0: if (likely((values[0] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_counts)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[0]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 84, __pyx_L3_error) else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_background)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[1]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 84, __pyx_L3_error) else { __Pyx_RaiseArgtupleInvalid("norm_bounds_cython", 1, 3, 3, 1); __PYX_ERR(0, 84, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (likely((values[2] = __Pyx_GetKwValue_FASTCALL(__pyx_kwds, __pyx_kwvalues, __pyx_n_s_model)) != 0)) { (void)__Pyx_Arg_NewRef_FASTCALL(values[2]); kw_args--; } else if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 84, __pyx_L3_error) else { __Pyx_RaiseArgtupleInvalid("norm_bounds_cython", 1, 3, 3, 2); __PYX_ERR(0, 84, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { const Py_ssize_t kwd_pos_args = __pyx_nargs; if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_kwvalues, __pyx_pyargnames, 0, values + 0, kwd_pos_args, "norm_bounds_cython") < 0)) __PYX_ERR(0, 84, __pyx_L3_error) } } else if (unlikely(__pyx_nargs != 3)) { goto __pyx_L5_argtuple_error; } else { values[0] = __Pyx_Arg_FASTCALL(__pyx_args, 0); values[1] = __Pyx_Arg_FASTCALL(__pyx_args, 1); values[2] = __Pyx_Arg_FASTCALL(__pyx_args, 2); } __pyx_v_counts = ((PyArrayObject *)values[0]); __pyx_v_background = ((PyArrayObject *)values[1]); __pyx_v_model = ((PyArrayObject *)values[2]); } goto __pyx_L6_skip; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("norm_bounds_cython", 1, 3, 3, __pyx_nargs); __PYX_ERR(0, 84, __pyx_L3_error) __pyx_L6_skip:; goto __pyx_L4_argument_unpacking_done; __pyx_L3_error:; { Py_ssize_t __pyx_temp; for (__pyx_temp=0; __pyx_temp < (Py_ssize_t)(sizeof(values)/sizeof(values[0])); ++__pyx_temp) { __Pyx_Arg_XDECREF_FASTCALL(values[__pyx_temp]); } } __Pyx_AddTraceback("gammapy.stats.fit_statistics_cython.norm_bounds_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_counts), __pyx_ptype_5numpy_ndarray, 1, "counts", 0))) __PYX_ERR(0, 86, __pyx_L1_error) if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_background), __pyx_ptype_5numpy_ndarray, 1, "background", 0))) __PYX_ERR(0, 87, __pyx_L1_error) if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_model), __pyx_ptype_5numpy_ndarray, 1, "model", 0))) __PYX_ERR(0, 88, __pyx_L1_error) __pyx_r = __pyx_pf_7gammapy_5stats_21fit_statistics_cython_4norm_bounds_cython(__pyx_self, __pyx_v_counts, __pyx_v_background, __pyx_v_model); /* function exit code */ goto __pyx_L0; __pyx_L1_error:; __pyx_r = NULL; __pyx_L0:; { Py_ssize_t __pyx_temp; for (__pyx_temp=0; __pyx_temp < 
(Py_ssize_t)(sizeof(values)/sizeof(values[0])); ++__pyx_temp) { __Pyx_Arg_XDECREF_FASTCALL(values[__pyx_temp]); } } __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7gammapy_5stats_21fit_statistics_cython_4norm_bounds_cython(CYTHON_UNUSED PyObject *__pyx_self, PyArrayObject *__pyx_v_counts, PyArrayObject *__pyx_v_background, PyArrayObject *__pyx_v_model) { __pyx_t_5numpy_float_t __pyx_v_s_model; __pyx_t_5numpy_float_t __pyx_v_s_counts; __pyx_t_5numpy_float_t __pyx_v_sn; __pyx_t_5numpy_float_t __pyx_v_sn_min; __pyx_t_5numpy_float_t __pyx_v_c_min; __pyx_t_5numpy_float_t __pyx_v_b_min; __pyx_t_5numpy_float_t __pyx_v_b_max; __pyx_t_5numpy_float_t __pyx_v_sn_min_total; unsigned int __pyx_v_i; unsigned int __pyx_v_ni; __Pyx_LocalBuf_ND __pyx_pybuffernd_background; __Pyx_Buffer __pyx_pybuffer_background; __Pyx_LocalBuf_ND __pyx_pybuffernd_counts; __Pyx_Buffer __pyx_pybuffer_counts; __Pyx_LocalBuf_ND __pyx_pybuffernd_model; __Pyx_Buffer __pyx_pybuffer_model; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations unsigned int __pyx_t_1; unsigned int __pyx_t_2; unsigned int __pyx_t_3; size_t __pyx_t_4; int __pyx_t_5; size_t __pyx_t_6; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; PyObject *__pyx_t_10 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("norm_bounds_cython", 1); __pyx_pybuffer_counts.pybuffer.buf = NULL; __pyx_pybuffer_counts.refcount = 0; __pyx_pybuffernd_counts.data = NULL; __pyx_pybuffernd_counts.rcbuffer = &__pyx_pybuffer_counts; __pyx_pybuffer_background.pybuffer.buf = NULL; __pyx_pybuffer_background.refcount = 0; __pyx_pybuffernd_background.data = NULL; __pyx_pybuffernd_background.rcbuffer = &__pyx_pybuffer_background; __pyx_pybuffer_model.pybuffer.buf = NULL; __pyx_pybuffer_model.refcount = 0; __pyx_pybuffernd_model.data = NULL; __pyx_pybuffernd_model.rcbuffer = &__pyx_pybuffer_model; { __Pyx_BufFmt_StackElem __pyx_stack[1]; if (unlikely(__Pyx_GetBufferAndValidate(&__pyx_pybuffernd_counts.rcbuffer->pybuffer, (PyObject*)__pyx_v_counts, &__Pyx_TypeInfo_nn___pyx_t_5numpy_float_t, PyBUF_FORMAT| PyBUF_STRIDES, 1, 0, __pyx_stack) == -1)) __PYX_ERR(0, 84, __pyx_L1_error) } __pyx_pybuffernd_counts.diminfo[0].strides = __pyx_pybuffernd_counts.rcbuffer->pybuffer.strides[0]; __pyx_pybuffernd_counts.diminfo[0].shape = __pyx_pybuffernd_counts.rcbuffer->pybuffer.shape[0]; { __Pyx_BufFmt_StackElem __pyx_stack[1]; if (unlikely(__Pyx_GetBufferAndValidate(&__pyx_pybuffernd_background.rcbuffer->pybuffer, (PyObject*)__pyx_v_background, &__Pyx_TypeInfo_nn___pyx_t_5numpy_float_t, PyBUF_FORMAT| PyBUF_STRIDES, 1, 0, __pyx_stack) == -1)) __PYX_ERR(0, 84, __pyx_L1_error) } __pyx_pybuffernd_background.diminfo[0].strides = __pyx_pybuffernd_background.rcbuffer->pybuffer.strides[0]; __pyx_pybuffernd_background.diminfo[0].shape = __pyx_pybuffernd_background.rcbuffer->pybuffer.shape[0]; { __Pyx_BufFmt_StackElem __pyx_stack[1]; if (unlikely(__Pyx_GetBufferAndValidate(&__pyx_pybuffernd_model.rcbuffer->pybuffer, (PyObject*)__pyx_v_model, &__Pyx_TypeInfo_nn___pyx_t_5numpy_float_t, PyBUF_FORMAT| PyBUF_STRIDES, 1, 0, __pyx_stack) == -1)) __PYX_ERR(0, 84, __pyx_L1_error) } __pyx_pybuffernd_model.diminfo[0].strides = __pyx_pybuffernd_model.rcbuffer->pybuffer.strides[0]; __pyx_pybuffernd_model.diminfo[0].shape = __pyx_pybuffernd_model.rcbuffer->pybuffer.shape[0]; /* "gammapy/stats/fit_statistics_cython.pyx":100 * Source template (multiplied with exposure). 
* """ * cdef np.float_t s_model = 0, s_counts = 0, sn, sn_min = 1e14, c_min = 1 # <<<<<<<<<<<<<< * cdef np.float_t b_min, b_max, sn_min_total = 1e14 * cdef unsigned int i, ni */ __pyx_v_s_model = 0.0; __pyx_v_s_counts = 0.0; __pyx_v_sn_min = 1e14; __pyx_v_c_min = 1.0; /* "gammapy/stats/fit_statistics_cython.pyx":101 * """ * cdef np.float_t s_model = 0, s_counts = 0, sn, sn_min = 1e14, c_min = 1 * cdef np.float_t b_min, b_max, sn_min_total = 1e14 # <<<<<<<<<<<<<< * cdef unsigned int i, ni * ni = counts.shape[0] */ __pyx_v_sn_min_total = 1e14; /* "gammapy/stats/fit_statistics_cython.pyx":103 * cdef np.float_t b_min, b_max, sn_min_total = 1e14 * cdef unsigned int i, ni * ni = counts.shape[0] # <<<<<<<<<<<<<< * for i in range(ni): * if counts[i] > 0: */ __pyx_v_ni = (__pyx_f_5numpy_7ndarray_5shape_shape(((PyArrayObject *)__pyx_v_counts))[0]); /* "gammapy/stats/fit_statistics_cython.pyx":104 * cdef unsigned int i, ni * ni = counts.shape[0] * for i in range(ni): # <<<<<<<<<<<<<< * if counts[i] > 0: * s_counts += counts[i] */ __pyx_t_1 = __pyx_v_ni; __pyx_t_2 = __pyx_t_1; for (__pyx_t_3 = 0; __pyx_t_3 < __pyx_t_2; __pyx_t_3+=1) { __pyx_v_i = __pyx_t_3; /* "gammapy/stats/fit_statistics_cython.pyx":105 * ni = counts.shape[0] * for i in range(ni): * if counts[i] > 0: # <<<<<<<<<<<<<< * s_counts += counts[i] * if model[i] > 0: */ __pyx_t_4 = __pyx_v_i; __pyx_t_5 = ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_counts.rcbuffer->pybuffer.buf, __pyx_t_4, __pyx_pybuffernd_counts.diminfo[0].strides)) > 0.0); if (__pyx_t_5) { /* "gammapy/stats/fit_statistics_cython.pyx":106 * for i in range(ni): * if counts[i] > 0: * s_counts += counts[i] # <<<<<<<<<<<<<< * if model[i] > 0: * sn = background[i] / model[i] */ __pyx_t_4 = __pyx_v_i; __pyx_v_s_counts = (__pyx_v_s_counts + (*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_counts.rcbuffer->pybuffer.buf, __pyx_t_4, __pyx_pybuffernd_counts.diminfo[0].strides))); /* "gammapy/stats/fit_statistics_cython.pyx":107 * if counts[i] > 0: * s_counts += counts[i] * if model[i] > 0: # <<<<<<<<<<<<<< * sn = background[i] / model[i] * if sn < sn_min: */ __pyx_t_4 = __pyx_v_i; __pyx_t_5 = ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_4, __pyx_pybuffernd_model.diminfo[0].strides)) > 0.0); if (__pyx_t_5) { /* "gammapy/stats/fit_statistics_cython.pyx":108 * s_counts += counts[i] * if model[i] > 0: * sn = background[i] / model[i] # <<<<<<<<<<<<<< * if sn < sn_min: * sn_min = sn */ __pyx_t_4 = __pyx_v_i; __pyx_t_6 = __pyx_v_i; __pyx_v_sn = ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_background.rcbuffer->pybuffer.buf, __pyx_t_4, __pyx_pybuffernd_background.diminfo[0].strides)) / (*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_6, __pyx_pybuffernd_model.diminfo[0].strides))); /* "gammapy/stats/fit_statistics_cython.pyx":109 * if model[i] > 0: * sn = background[i] / model[i] * if sn < sn_min: # <<<<<<<<<<<<<< * sn_min = sn * c_min = counts[i] */ __pyx_t_5 = (__pyx_v_sn < __pyx_v_sn_min); if (__pyx_t_5) { /* "gammapy/stats/fit_statistics_cython.pyx":110 * sn = background[i] / model[i] * if sn < sn_min: * sn_min = sn # <<<<<<<<<<<<<< * c_min = counts[i] * if model[i] > 0: */ __pyx_v_sn_min = __pyx_v_sn; /* "gammapy/stats/fit_statistics_cython.pyx":111 * if sn < sn_min: * sn_min = sn * c_min = counts[i] # <<<<<<<<<<<<<< * if model[i] > 0: * s_model += model[i] */ __pyx_t_6 = __pyx_v_i; __pyx_v_c_min = 
(*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_counts.rcbuffer->pybuffer.buf, __pyx_t_6, __pyx_pybuffernd_counts.diminfo[0].strides)); /* "gammapy/stats/fit_statistics_cython.pyx":109 * if model[i] > 0: * sn = background[i] / model[i] * if sn < sn_min: # <<<<<<<<<<<<<< * sn_min = sn * c_min = counts[i] */ } /* "gammapy/stats/fit_statistics_cython.pyx":107 * if counts[i] > 0: * s_counts += counts[i] * if model[i] > 0: # <<<<<<<<<<<<<< * sn = background[i] / model[i] * if sn < sn_min: */ } /* "gammapy/stats/fit_statistics_cython.pyx":105 * ni = counts.shape[0] * for i in range(ni): * if counts[i] > 0: # <<<<<<<<<<<<<< * s_counts += counts[i] * if model[i] > 0: */ } /* "gammapy/stats/fit_statistics_cython.pyx":112 * sn_min = sn * c_min = counts[i] * if model[i] > 0: # <<<<<<<<<<<<<< * s_model += model[i] * sn = background[i] / model[i] */ __pyx_t_6 = __pyx_v_i; __pyx_t_5 = ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_6, __pyx_pybuffernd_model.diminfo[0].strides)) > 0.0); if (__pyx_t_5) { /* "gammapy/stats/fit_statistics_cython.pyx":113 * c_min = counts[i] * if model[i] > 0: * s_model += model[i] # <<<<<<<<<<<<<< * sn = background[i] / model[i] * if sn < sn_min_total: */ __pyx_t_6 = __pyx_v_i; __pyx_v_s_model = (__pyx_v_s_model + (*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_6, __pyx_pybuffernd_model.diminfo[0].strides))); /* "gammapy/stats/fit_statistics_cython.pyx":114 * if model[i] > 0: * s_model += model[i] * sn = background[i] / model[i] # <<<<<<<<<<<<<< * if sn < sn_min_total: * sn_min_total = sn */ __pyx_t_6 = __pyx_v_i; __pyx_t_4 = __pyx_v_i; __pyx_v_sn = ((*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_background.rcbuffer->pybuffer.buf, __pyx_t_6, __pyx_pybuffernd_background.diminfo[0].strides)) / (*__Pyx_BufPtrStrided1d(__pyx_t_5numpy_float_t *, __pyx_pybuffernd_model.rcbuffer->pybuffer.buf, __pyx_t_4, __pyx_pybuffernd_model.diminfo[0].strides))); /* "gammapy/stats/fit_statistics_cython.pyx":115 * s_model += model[i] * sn = background[i] / model[i] * if sn < sn_min_total: # <<<<<<<<<<<<<< * sn_min_total = sn * b_min = c_min / s_model - sn_min */ __pyx_t_5 = (__pyx_v_sn < __pyx_v_sn_min_total); if (__pyx_t_5) { /* "gammapy/stats/fit_statistics_cython.pyx":116 * sn = background[i] / model[i] * if sn < sn_min_total: * sn_min_total = sn # <<<<<<<<<<<<<< * b_min = c_min / s_model - sn_min * b_max = s_counts / s_model - sn_min */ __pyx_v_sn_min_total = __pyx_v_sn; /* "gammapy/stats/fit_statistics_cython.pyx":115 * s_model += model[i] * sn = background[i] / model[i] * if sn < sn_min_total: # <<<<<<<<<<<<<< * sn_min_total = sn * b_min = c_min / s_model - sn_min */ } /* "gammapy/stats/fit_statistics_cython.pyx":112 * sn_min = sn * c_min = counts[i] * if model[i] > 0: # <<<<<<<<<<<<<< * s_model += model[i] * sn = background[i] / model[i] */ } } /* "gammapy/stats/fit_statistics_cython.pyx":117 * if sn < sn_min_total: * sn_min_total = sn * b_min = c_min / s_model - sn_min # <<<<<<<<<<<<<< * b_max = s_counts / s_model - sn_min * return b_min, b_max, -sn_min_total */ __pyx_v_b_min = ((__pyx_v_c_min / __pyx_v_s_model) - __pyx_v_sn_min); /* "gammapy/stats/fit_statistics_cython.pyx":118 * sn_min_total = sn * b_min = c_min / s_model - sn_min * b_max = s_counts / s_model - sn_min # <<<<<<<<<<<<<< * return b_min, b_max, -sn_min_total */ __pyx_v_b_max = ((__pyx_v_s_counts / __pyx_v_s_model) - __pyx_v_sn_min); /* 
"gammapy/stats/fit_statistics_cython.pyx":119 * b_min = c_min / s_model - sn_min * b_max = s_counts / s_model - sn_min * return b_min, b_max, -sn_min_total # <<<<<<<<<<<<<< */ __Pyx_XDECREF(__pyx_r); __pyx_t_7 = PyFloat_FromDouble(__pyx_v_b_min); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 119, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_8 = PyFloat_FromDouble(__pyx_v_b_max); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 119, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_8); __pyx_t_9 = PyFloat_FromDouble((-__pyx_v_sn_min_total)); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 119, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_9); __pyx_t_10 = PyTuple_New(3); if (unlikely(!__pyx_t_10)) __PYX_ERR(0, 119, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_10); __Pyx_GIVEREF(__pyx_t_7); if (__Pyx_PyTuple_SET_ITEM(__pyx_t_10, 0, __pyx_t_7)) __PYX_ERR(0, 119, __pyx_L1_error); __Pyx_GIVEREF(__pyx_t_8); if (__Pyx_PyTuple_SET_ITEM(__pyx_t_10, 1, __pyx_t_8)) __PYX_ERR(0, 119, __pyx_L1_error); __Pyx_GIVEREF(__pyx_t_9); if (__Pyx_PyTuple_SET_ITEM(__pyx_t_10, 2, __pyx_t_9)) __PYX_ERR(0, 119, __pyx_L1_error); __pyx_t_7 = 0; __pyx_t_8 = 0; __pyx_t_9 = 0; __pyx_r = __pyx_t_10; __pyx_t_10 = 0; goto __pyx_L0; /* "gammapy/stats/fit_statistics_cython.pyx":84 * * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def norm_bounds_cython(np.ndarray[np.float_t, ndim=1] counts, */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_9); __Pyx_XDECREF(__pyx_t_10); { PyObject *__pyx_type, *__pyx_value, *__pyx_tb; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ErrFetch(&__pyx_type, &__pyx_value, &__pyx_tb); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_background.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_counts.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_model.rcbuffer->pybuffer); __Pyx_ErrRestore(__pyx_type, __pyx_value, __pyx_tb);} __Pyx_AddTraceback("gammapy.stats.fit_statistics_cython.norm_bounds_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; goto __pyx_L2; __pyx_L0:; __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_background.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_counts.rcbuffer->pybuffer); __Pyx_SafeReleaseBuffer(&__pyx_pybuffernd_model.rcbuffer->pybuffer); __pyx_L2:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyMethodDef __pyx_methods[] = { {0, 0, 0, 0} }; #ifndef CYTHON_SMALL_CODE #if defined(__clang__) #define CYTHON_SMALL_CODE #elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3)) #define CYTHON_SMALL_CODE __attribute__((cold)) #else #define CYTHON_SMALL_CODE #endif #endif /* #### Code section: pystring_table ### */ static int __Pyx_CreateStringTabAndInitStrings(void) { __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_n_s_ImportError, __pyx_k_ImportError, sizeof(__pyx_k_ImportError), 0, 0, 1, 1}, {&__pyx_n_s_TRUNCATION_VALUE, __pyx_k_TRUNCATION_VALUE, sizeof(__pyx_k_TRUNCATION_VALUE), 0, 0, 1, 1}, {&__pyx_n_s__10, __pyx_k__10, sizeof(__pyx_k__10), 0, 0, 1, 1}, {&__pyx_n_s__3, __pyx_k__3, sizeof(__pyx_k__3), 0, 0, 1, 1}, {&__pyx_n_s_asyncio_coroutines, __pyx_k_asyncio_coroutines, sizeof(__pyx_k_asyncio_coroutines), 0, 0, 1, 1}, {&__pyx_n_s_b_max, __pyx_k_b_max, sizeof(__pyx_k_b_max), 0, 0, 1, 1}, {&__pyx_n_s_b_min, __pyx_k_b_min, sizeof(__pyx_k_b_min), 0, 0, 1, 1}, {&__pyx_n_s_background, __pyx_k_background, sizeof(__pyx_k_background), 0, 0, 1, 1}, {&__pyx_n_s_c_min, __pyx_k_c_min, sizeof(__pyx_k_c_min), 0, 0, 
1, 1}, {&__pyx_n_s_cash_sum_cython, __pyx_k_cash_sum_cython, sizeof(__pyx_k_cash_sum_cython), 0, 0, 1, 1}, {&__pyx_n_s_cline_in_traceback, __pyx_k_cline_in_traceback, sizeof(__pyx_k_cline_in_traceback), 0, 0, 1, 1}, {&__pyx_n_s_counts, __pyx_k_counts, sizeof(__pyx_k_counts), 0, 0, 1, 1}, {&__pyx_n_s_f_cash_root_cython, __pyx_k_f_cash_root_cython, sizeof(__pyx_k_f_cash_root_cython), 0, 0, 1, 1}, {&__pyx_kp_s_gammapy_stats_fit_statistics_cyt, __pyx_k_gammapy_stats_fit_statistics_cyt, sizeof(__pyx_k_gammapy_stats_fit_statistics_cyt), 0, 0, 1, 0}, {&__pyx_n_s_gammapy_stats_fit_statistics_cyt_2, __pyx_k_gammapy_stats_fit_statistics_cyt_2, sizeof(__pyx_k_gammapy_stats_fit_statistics_cyt_2), 0, 0, 1, 1}, {&__pyx_n_s_i, __pyx_k_i, sizeof(__pyx_k_i), 0, 0, 1, 1}, {&__pyx_n_s_import, __pyx_k_import, sizeof(__pyx_k_import), 0, 0, 1, 1}, {&__pyx_n_s_initializing, __pyx_k_initializing, sizeof(__pyx_k_initializing), 0, 0, 1, 1}, {&__pyx_n_s_is_coroutine, __pyx_k_is_coroutine, sizeof(__pyx_k_is_coroutine), 0, 0, 1, 1}, {&__pyx_n_s_lognpr, __pyx_k_lognpr, sizeof(__pyx_k_lognpr), 0, 0, 1, 1}, {&__pyx_n_s_logtrunc, __pyx_k_logtrunc, sizeof(__pyx_k_logtrunc), 0, 0, 1, 1}, {&__pyx_n_s_main, __pyx_k_main, sizeof(__pyx_k_main), 0, 0, 1, 1}, {&__pyx_n_s_model, __pyx_k_model, sizeof(__pyx_k_model), 0, 0, 1, 1}, {&__pyx_n_s_name, __pyx_k_name, sizeof(__pyx_k_name), 0, 0, 1, 1}, {&__pyx_n_s_ni, __pyx_k_ni, sizeof(__pyx_k_ni), 0, 0, 1, 1}, {&__pyx_n_s_norm_bounds_cython, __pyx_k_norm_bounds_cython, sizeof(__pyx_k_norm_bounds_cython), 0, 0, 1, 1}, {&__pyx_n_s_np, __pyx_k_np, sizeof(__pyx_k_np), 0, 0, 1, 1}, {&__pyx_n_s_npr, __pyx_k_npr, sizeof(__pyx_k_npr), 0, 0, 1, 1}, {&__pyx_n_s_npred, __pyx_k_npred, sizeof(__pyx_k_npred), 0, 0, 1, 1}, {&__pyx_n_s_numpy, __pyx_k_numpy, sizeof(__pyx_k_numpy), 0, 0, 1, 1}, {&__pyx_kp_u_numpy__core_multiarray_failed_to, __pyx_k_numpy__core_multiarray_failed_to, sizeof(__pyx_k_numpy__core_multiarray_failed_to), 0, 1, 0, 0}, {&__pyx_kp_u_numpy__core_umath_failed_to_impo, __pyx_k_numpy__core_umath_failed_to_impo, sizeof(__pyx_k_numpy__core_umath_failed_to_impo), 0, 1, 0, 0}, {&__pyx_n_s_range, __pyx_k_range, sizeof(__pyx_k_range), 0, 0, 1, 1}, {&__pyx_n_s_s_counts, __pyx_k_s_counts, sizeof(__pyx_k_s_counts), 0, 0, 1, 1}, {&__pyx_n_s_s_model, __pyx_k_s_model, sizeof(__pyx_k_s_model), 0, 0, 1, 1}, {&__pyx_n_s_sn, __pyx_k_sn, sizeof(__pyx_k_sn), 0, 0, 1, 1}, {&__pyx_n_s_sn_min, __pyx_k_sn_min, sizeof(__pyx_k_sn_min), 0, 0, 1, 1}, {&__pyx_n_s_sn_min_total, __pyx_k_sn_min_total, sizeof(__pyx_k_sn_min_total), 0, 0, 1, 1}, {&__pyx_n_s_spec, __pyx_k_spec, sizeof(__pyx_k_spec), 0, 0, 1, 1}, {&__pyx_n_s_sum, __pyx_k_sum, sizeof(__pyx_k_sum), 0, 0, 1, 1}, {&__pyx_n_s_test, __pyx_k_test, sizeof(__pyx_k_test), 0, 0, 1, 1}, {&__pyx_n_s_trunc, __pyx_k_trunc, sizeof(__pyx_k_trunc), 0, 0, 1, 1}, {&__pyx_n_s_x, __pyx_k_x, sizeof(__pyx_k_x), 0, 0, 1, 1}, {0, 0, 0, 0, 0, 0, 0} }; return __Pyx_InitStrings(__pyx_string_tab); } /* #### Code section: cached_builtins ### */ static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { __pyx_builtin_range = __Pyx_GetBuiltinName(__pyx_n_s_range); if (!__pyx_builtin_range) __PYX_ERR(0, 35, __pyx_L1_error) __pyx_builtin_ImportError = __Pyx_GetBuiltinName(__pyx_n_s_ImportError); if (!__pyx_builtin_ImportError) __PYX_ERR(1, 1026, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } /* #### Code section: cached_constants ### */ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { __Pyx_RefNannyDeclarations 
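/* For reference while reading the norm_bounds_cython code above: the .pyx
   lines quoted in its comments derive a bracketing interval [b_min, b_max]
   for the root of f_cash_root_cython, plus the absolute lower bound on the
   norm. A minimal NumPy sketch, assuming 1-D float64 arrays with at least
   one bin where model > 0 and at least one where both counts > 0 and
   model > 0 (an illustration, not part of the generated module):

       import numpy as np

       def norm_bounds(counts, background, model):
           pos = model > 0
           s_model = model[pos].sum()             # total model counts
           s_counts = counts[counts > 0].sum()    # total observed counts
           sn_all = background[pos] / model[pos]  # background/model over all model > 0 bins
           both = (counts > 0) & pos
           sn = background[both] / model[both]
           i_min = np.argmin(sn)                  # bin with the smallest background/model
           b_min = counts[both][i_min] / s_model - sn[i_min]
           b_max = s_counts / s_model - sn[i_min]
           return b_min, b_max, -sn_all.min()
*/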
__Pyx_RefNannySetupContext("__Pyx_InitCachedConstants", 0); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1026 * __pyx_import_array() * except Exception: * raise ImportError("numpy._core.multiarray failed to import") # <<<<<<<<<<<<<< * * cdef inline int import_umath() except -1: */ __pyx_tuple_ = PyTuple_Pack(1, __pyx_kp_u_numpy__core_multiarray_failed_to); if (unlikely(!__pyx_tuple_)) __PYX_ERR(1, 1026, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple_); __Pyx_GIVEREF(__pyx_tuple_); /* "../../../../../tmp/build-env-hnwpxi_7/lib/python3.9/site-packages/numpy/__init__.cython-30.pxd":1032 * _import_umath() * except Exception: * raise ImportError("numpy._core.umath failed to import") # <<<<<<<<<<<<<< * * cdef inline int import_ufunc() except -1: */ __pyx_tuple__2 = PyTuple_Pack(1, __pyx_kp_u_numpy__core_umath_failed_to_impo); if (unlikely(!__pyx_tuple__2)) __PYX_ERR(1, 1032, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__2); __Pyx_GIVEREF(__pyx_tuple__2); /* "gammapy/stats/fit_statistics_cython.pyx":15 * TRUNCATION_VALUE = 1e-25 * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def cash_sum_cython(np.ndarray[np.float_t, ndim=1] counts, */ __pyx_tuple__4 = PyTuple_Pack(9, __pyx_n_s_counts, __pyx_n_s_npred, __pyx_n_s_sum, __pyx_n_s_npr, __pyx_n_s_lognpr, __pyx_n_s_i, __pyx_n_s_ni, __pyx_n_s_trunc, __pyx_n_s_logtrunc); if (unlikely(!__pyx_tuple__4)) __PYX_ERR(0, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__4); __Pyx_GIVEREF(__pyx_tuple__4); __pyx_codeobj__5 = (PyObject*)__Pyx_PyCode_New(2, 0, 0, 9, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__4, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_gammapy_stats_fit_statistics_cyt, __pyx_n_s_cash_sum_cython, 15, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__5)) __PYX_ERR(0, 15, __pyx_L1_error) /* "gammapy/stats/fit_statistics_cython.pyx":50 * * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def f_cash_root_cython(np.float_t x, np.ndarray[np.float_t, ndim=1] counts, */ __pyx_tuple__6 = PyTuple_Pack(7, __pyx_n_s_x, __pyx_n_s_counts, __pyx_n_s_background, __pyx_n_s_model, __pyx_n_s_sum, __pyx_n_s_i, __pyx_n_s_ni); if (unlikely(!__pyx_tuple__6)) __PYX_ERR(0, 50, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__6); __Pyx_GIVEREF(__pyx_tuple__6); __pyx_codeobj__7 = (PyObject*)__Pyx_PyCode_New(4, 0, 0, 7, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__6, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_gammapy_stats_fit_statistics_cyt, __pyx_n_s_f_cash_root_cython, 50, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__7)) __PYX_ERR(0, 50, __pyx_L1_error) /* "gammapy/stats/fit_statistics_cython.pyx":84 * * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def norm_bounds_cython(np.ndarray[np.float_t, ndim=1] counts, */ __pyx_tuple__8 = PyTuple_Pack(13, __pyx_n_s_counts, __pyx_n_s_background, __pyx_n_s_model, __pyx_n_s_s_model, __pyx_n_s_s_counts, __pyx_n_s_sn, __pyx_n_s_sn_min, __pyx_n_s_c_min, __pyx_n_s_b_min, __pyx_n_s_b_max, __pyx_n_s_sn_min_total, __pyx_n_s_i, __pyx_n_s_ni); if (unlikely(!__pyx_tuple__8)) __PYX_ERR(0, 84, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__8); __Pyx_GIVEREF(__pyx_tuple__8); __pyx_codeobj__9 = (PyObject*)__Pyx_PyCode_New(3, 0, 0, 13, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__8, __pyx_empty_tuple, __pyx_empty_tuple, 
__pyx_kp_s_gammapy_stats_fit_statistics_cyt, __pyx_n_s_norm_bounds_cython, 84, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__9)) __PYX_ERR(0, 84, __pyx_L1_error) __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } /* #### Code section: init_constants ### */ static CYTHON_SMALL_CODE int __Pyx_InitConstants(void) { if (__Pyx_CreateStringTabAndInitStrings() < 0) __PYX_ERR(0, 1, __pyx_L1_error); __pyx_float_1eneg_25 = PyFloat_FromDouble(1e-25); if (unlikely(!__pyx_float_1eneg_25)) __PYX_ERR(0, 1, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } /* #### Code section: init_globals ### */ static CYTHON_SMALL_CODE int __Pyx_InitGlobals(void) { /* NumpyImportArray.init */ /* * Cython has automatically inserted a call to _import_array since * you didn't include one when you cimported numpy. To disable this * add the line * numpy._import_array */ #ifdef NPY_FEATURE_VERSION #ifndef NO_IMPORT_ARRAY if (unlikely(_import_array() == -1)) { PyErr_SetString(PyExc_ImportError, "numpy.core.multiarray failed to import " "(auto-generated because you didn't call 'numpy.import_array()' after cimporting numpy; " "use 'numpy._import_array' to disable if you are certain you don't need it)."); } #endif #endif if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 1, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } /* #### Code section: init_module ### */ static CYTHON_SMALL_CODE int __Pyx_modinit_global_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_import_code(void); /*proto*/ static int __Pyx_modinit_global_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_global_init_code", 0); /*--- Global init code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_variable_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_export_code", 0); /*--- Variable export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_export_code", 0); /*--- Function export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_type_init_code", 0); /*--- Type init code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_import_code(void) { __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannySetupContext("__Pyx_modinit_type_import_code", 0); /*--- Type import code ---*/ __pyx_t_1 = PyImport_ImportModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_t_1)) __PYX_ERR(2, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_7cpython_4type_type = __Pyx_ImportType_3_0_11(__pyx_t_1, __Pyx_BUILTIN_MODULE_NAME, "type", #if defined(PYPY_VERSION_NUM) && PYPY_VERSION_NUM < 0x050B0000 sizeof(PyTypeObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyTypeObject), #elif CYTHON_COMPILING_IN_LIMITED_API sizeof(PyTypeObject), 
__PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyTypeObject), #else sizeof(PyHeapTypeObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyHeapTypeObject), #endif __Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_7cpython_4type_type) __PYX_ERR(2, 9, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = PyImport_ImportModule("numpy"); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 271, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_5numpy_dtype = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "dtype", sizeof(PyArray_Descr), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyArray_Descr),__Pyx_ImportType_CheckSize_Ignore_3_0_11); if (!__pyx_ptype_5numpy_dtype) __PYX_ERR(1, 271, __pyx_L1_error) __pyx_ptype_5numpy_flatiter = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "flatiter", sizeof(PyArrayIterObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyArrayIterObject),__Pyx_ImportType_CheckSize_Ignore_3_0_11); if (!__pyx_ptype_5numpy_flatiter) __PYX_ERR(1, 316, __pyx_L1_error) __pyx_ptype_5numpy_broadcast = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "broadcast", sizeof(PyArrayMultiIterObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyArrayMultiIterObject),__Pyx_ImportType_CheckSize_Ignore_3_0_11); if (!__pyx_ptype_5numpy_broadcast) __PYX_ERR(1, 320, __pyx_L1_error) __pyx_ptype_5numpy_ndarray = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "ndarray", sizeof(PyArrayObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyArrayObject),__Pyx_ImportType_CheckSize_Ignore_3_0_11); if (!__pyx_ptype_5numpy_ndarray) __PYX_ERR(1, 359, __pyx_L1_error) __pyx_ptype_5numpy_generic = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "generic", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_generic) __PYX_ERR(1, 848, __pyx_L1_error) __pyx_ptype_5numpy_number = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "number", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_number) __PYX_ERR(1, 850, __pyx_L1_error) __pyx_ptype_5numpy_integer = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "integer", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_integer) __PYX_ERR(1, 852, __pyx_L1_error) __pyx_ptype_5numpy_signedinteger = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "signedinteger", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_signedinteger) __PYX_ERR(1, 854, __pyx_L1_error) __pyx_ptype_5numpy_unsignedinteger = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "unsignedinteger", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_unsignedinteger) __PYX_ERR(1, 856, __pyx_L1_error) __pyx_ptype_5numpy_inexact = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "inexact", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_inexact) __PYX_ERR(1, 858, __pyx_L1_error) __pyx_ptype_5numpy_floating = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "floating", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_floating) __PYX_ERR(1, 860, __pyx_L1_error) __pyx_ptype_5numpy_complexfloating = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "complexfloating", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if 
(!__pyx_ptype_5numpy_complexfloating) __PYX_ERR(1, 862, __pyx_L1_error) __pyx_ptype_5numpy_flexible = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "flexible", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_flexible) __PYX_ERR(1, 864, __pyx_L1_error) __pyx_ptype_5numpy_character = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "character", sizeof(PyObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyObject),__Pyx_ImportType_CheckSize_Warn_3_0_11); if (!__pyx_ptype_5numpy_character) __PYX_ERR(1, 866, __pyx_L1_error) __pyx_ptype_5numpy_ufunc = __Pyx_ImportType_3_0_11(__pyx_t_1, "numpy", "ufunc", sizeof(PyUFuncObject), __PYX_GET_STRUCT_ALIGNMENT_3_0_11(PyUFuncObject),__Pyx_ImportType_CheckSize_Ignore_3_0_11); if (!__pyx_ptype_5numpy_ufunc) __PYX_ERR(1, 930, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_RefNannyFinishContext(); return -1; } static int __Pyx_modinit_variable_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_import_code", 0); /*--- Variable import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_import_code", 0); /*--- Function import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } #if PY_MAJOR_VERSION >= 3 #if CYTHON_PEP489_MULTI_PHASE_INIT static PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def); /*proto*/ static int __pyx_pymod_exec_fit_statistics_cython(PyObject* module); /*proto*/ static PyModuleDef_Slot __pyx_moduledef_slots[] = { {Py_mod_create, (void*)__pyx_pymod_create}, {Py_mod_exec, (void*)__pyx_pymod_exec_fit_statistics_cython}, {0, NULL} }; #endif #ifdef __cplusplus namespace { struct PyModuleDef __pyx_moduledef = #else static struct PyModuleDef __pyx_moduledef = #endif { PyModuleDef_HEAD_INIT, "fit_statistics_cython", 0, /* m_doc */ #if CYTHON_PEP489_MULTI_PHASE_INIT 0, /* m_size */ #elif CYTHON_USE_MODULE_STATE sizeof(__pyx_mstate), /* m_size */ #else -1, /* m_size */ #endif __pyx_methods /* m_methods */, #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_moduledef_slots, /* m_slots */ #else NULL, /* m_reload */ #endif #if CYTHON_USE_MODULE_STATE __pyx_m_traverse, /* m_traverse */ __pyx_m_clear, /* m_clear */ NULL /* m_free */ #else NULL, /* m_traverse */ NULL, /* m_clear */ NULL /* m_free */ #endif }; #ifdef __cplusplus } /* anonymous namespace */ #endif #endif #ifndef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #elif PY_MAJOR_VERSION < 3 #ifdef __cplusplus #define __Pyx_PyMODINIT_FUNC extern "C" void #else #define __Pyx_PyMODINIT_FUNC void #endif #else #ifdef __cplusplus #define __Pyx_PyMODINIT_FUNC extern "C" PyObject * #else #define __Pyx_PyMODINIT_FUNC PyObject * #endif #endif #if PY_MAJOR_VERSION < 3 __Pyx_PyMODINIT_FUNC initfit_statistics_cython(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC initfit_statistics_cython(void) #else __Pyx_PyMODINIT_FUNC PyInit_fit_statistics_cython(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC PyInit_fit_statistics_cython(void) #if CYTHON_PEP489_MULTI_PHASE_INIT { return PyModuleDef_Init(&__pyx_moduledef); } static CYTHON_SMALL_CODE int __Pyx_check_single_interpreter(void) { #if PY_VERSION_HEX >= 0x030700A1 static PY_INT64_T main_interpreter_id = -1; PY_INT64_T current_id = PyInterpreterState_GetID(PyThreadState_Get()->interp); if 
(main_interpreter_id == -1) { main_interpreter_id = current_id; return (unlikely(current_id == -1)) ? -1 : 0; } else if (unlikely(main_interpreter_id != current_id)) #else static PyInterpreterState *main_interpreter = NULL; PyInterpreterState *current_interpreter = PyThreadState_Get()->interp; if (!main_interpreter) { main_interpreter = current_interpreter; } else if (unlikely(main_interpreter != current_interpreter)) #endif { PyErr_SetString( PyExc_ImportError, "Interpreter change detected - this module can only be loaded into one interpreter per process."); return -1; } return 0; } #if CYTHON_COMPILING_IN_LIMITED_API static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *module, const char* from_name, const char* to_name, int allow_none) #else static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *moddict, const char* from_name, const char* to_name, int allow_none) #endif { PyObject *value = PyObject_GetAttrString(spec, from_name); int result = 0; if (likely(value)) { if (allow_none || value != Py_None) { #if CYTHON_COMPILING_IN_LIMITED_API result = PyModule_AddObject(module, to_name, value); #else result = PyDict_SetItemString(moddict, to_name, value); #endif } Py_DECREF(value); } else if (PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Clear(); } else { result = -1; } return result; } static CYTHON_SMALL_CODE PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def) { PyObject *module = NULL, *moddict, *modname; CYTHON_UNUSED_VAR(def); if (__Pyx_check_single_interpreter()) return NULL; if (__pyx_m) return __Pyx_NewRef(__pyx_m); modname = PyObject_GetAttrString(spec, "name"); if (unlikely(!modname)) goto bad; module = PyModule_NewObject(modname); Py_DECREF(modname); if (unlikely(!module)) goto bad; #if CYTHON_COMPILING_IN_LIMITED_API moddict = module; #else moddict = PyModule_GetDict(module); if (unlikely(!moddict)) goto bad; #endif if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "loader", "__loader__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "origin", "__file__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "parent", "__package__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "submodule_search_locations", "__path__", 0) < 0)) goto bad; return module; bad: Py_XDECREF(module); return NULL; } static CYTHON_SMALL_CODE int __pyx_pymod_exec_fit_statistics_cython(PyObject *__pyx_pyinit_module) #endif #endif { int stringtab_initialized = 0; #if CYTHON_USE_MODULE_STATE int pystate_addmodule_run = 0; #endif PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; int __pyx_lineno = 0; const char *__pyx_filename = NULL; int __pyx_clineno = 0; __Pyx_RefNannyDeclarations #if CYTHON_PEP489_MULTI_PHASE_INIT if (__pyx_m) { if (__pyx_m == __pyx_pyinit_module) return 0; PyErr_SetString(PyExc_RuntimeError, "Module 'fit_statistics_cython' has already been imported. 
Re-initialisation is not supported."); return -1; } #elif PY_MAJOR_VERSION >= 3 if (__pyx_m) return __Pyx_NewRef(__pyx_m); #endif /*--- Module creation code ---*/ #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_m = __pyx_pyinit_module; Py_INCREF(__pyx_m); #else #if PY_MAJOR_VERSION < 3 __pyx_m = Py_InitModule4("fit_statistics_cython", __pyx_methods, 0, 0, PYTHON_API_VERSION); Py_XINCREF(__pyx_m); if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) #elif CYTHON_USE_MODULE_STATE __pyx_t_1 = PyModule_Create(&__pyx_moduledef); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 1, __pyx_L1_error) { int add_module_result = PyState_AddModule(__pyx_t_1, &__pyx_moduledef); __pyx_t_1 = 0; /* transfer ownership from __pyx_t_1 to "fit_statistics_cython" pseudovariable */ if (unlikely((add_module_result < 0))) __PYX_ERR(0, 1, __pyx_L1_error) pystate_addmodule_run = 1; } #else __pyx_m = PyModule_Create(&__pyx_moduledef); if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) #endif #endif CYTHON_UNUSED_VAR(__pyx_t_1); __pyx_d = PyModule_GetDict(__pyx_m); if (unlikely(!__pyx_d)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_d); __pyx_b = __Pyx_PyImport_AddModuleRef(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_b)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_cython_runtime = __Pyx_PyImport_AddModuleRef((const char *) "cython_runtime"); if (unlikely(!__pyx_cython_runtime)) __PYX_ERR(0, 1, __pyx_L1_error) if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #if CYTHON_REFNANNY __Pyx_RefNanny = __Pyx_RefNannyImportAPI("refnanny"); if (!__Pyx_RefNanny) { PyErr_Clear(); __Pyx_RefNanny = __Pyx_RefNannyImportAPI("Cython.Runtime.refnanny"); if (!__Pyx_RefNanny) Py_FatalError("failed to import 'refnanny' module"); } #endif __Pyx_RefNannySetupContext("__Pyx_PyMODINIT_FUNC PyInit_fit_statistics_cython(void)", 0); if (__Pyx_check_binary_version(__PYX_LIMITED_VERSION_HEX, __Pyx_get_runtime_version(), CYTHON_COMPILING_IN_LIMITED_API) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pxy_PyFrame_Initialize_Offsets __Pxy_PyFrame_Initialize_Offsets(); #endif __pyx_empty_tuple = PyTuple_New(0); if (unlikely(!__pyx_empty_tuple)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_bytes = PyBytes_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_bytes)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_unicode = PyUnicode_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_unicode)) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pyx_CyFunction_USED if (__pyx_CyFunction_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_FusedFunction_USED if (__pyx_FusedFunction_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Coroutine_USED if (__pyx_Coroutine_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Generator_USED if (__pyx_Generator_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_AsyncGen_USED if (__pyx_AsyncGen_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_StopAsyncIteration_USED if (__pyx_StopAsyncIteration_init(__pyx_m) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /*--- Library function declarations ---*/ /*--- Threads initialization code ---*/ #if defined(WITH_THREAD) && PY_VERSION_HEX < 0x030700F0 && defined(__PYX_FORCE_INIT_THREADS) && __PYX_FORCE_INIT_THREADS PyEval_InitThreads(); #endif /*--- Initialize various global constants etc. 
---*/ if (__Pyx_InitConstants() < 0) __PYX_ERR(0, 1, __pyx_L1_error) stringtab_initialized = 1; if (__Pyx_InitGlobals() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #if PY_MAJOR_VERSION < 3 && (__PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT) if (__Pyx_init_sys_getdefaultencoding_params() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif if (__pyx_module_is_main_gammapy__stats__fit_statistics_cython) { if (PyObject_SetAttr(__pyx_m, __pyx_n_s_name, __pyx_n_s_main) < 0) __PYX_ERR(0, 1, __pyx_L1_error) } #if PY_MAJOR_VERSION >= 3 { PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) __PYX_ERR(0, 1, __pyx_L1_error) if (!PyDict_GetItemString(modules, "gammapy.stats.fit_statistics_cython")) { if (unlikely((PyDict_SetItemString(modules, "gammapy.stats.fit_statistics_cython", __pyx_m) < 0))) __PYX_ERR(0, 1, __pyx_L1_error) } } #endif /*--- Builtin init code ---*/ if (__Pyx_InitCachedBuiltins() < 0) __PYX_ERR(0, 1, __pyx_L1_error) /*--- Constants init code ---*/ if (__Pyx_InitCachedConstants() < 0) __PYX_ERR(0, 1, __pyx_L1_error) /*--- Global type/function init code ---*/ (void)__Pyx_modinit_global_init_code(); (void)__Pyx_modinit_variable_export_code(); (void)__Pyx_modinit_function_export_code(); (void)__Pyx_modinit_type_init_code(); if (unlikely((__Pyx_modinit_type_import_code() < 0))) __PYX_ERR(0, 1, __pyx_L1_error) (void)__Pyx_modinit_variable_import_code(); (void)__Pyx_modinit_function_import_code(); /*--- Execution code ---*/ #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) if (__Pyx_patch_abc() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /* "gammapy/stats/fit_statistics_cython.pyx":3 * # Licensed under a 3-clause BSD style license - see LICENSE.rst * # cython: language_level=3 * import numpy as np # <<<<<<<<<<<<<< * * cimport numpy as np */ __pyx_t_2 = __Pyx_ImportDottedModule(__pyx_n_s_numpy, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 3, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_np, __pyx_t_2) < 0) __PYX_ERR(0, 3, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "gammapy/stats/fit_statistics_cython.pyx":13 * * global TRUNCATION_VALUE * TRUNCATION_VALUE = 1e-25 # <<<<<<<<<<<<<< * * @cython.cdivision(True) */ if (PyDict_SetItem(__pyx_d, __pyx_n_s_TRUNCATION_VALUE, __pyx_float_1eneg_25) < 0) __PYX_ERR(0, 13, __pyx_L1_error) /* "gammapy/stats/fit_statistics_cython.pyx":15 * TRUNCATION_VALUE = 1e-25 * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def cash_sum_cython(np.ndarray[np.float_t, ndim=1] counts, */ __pyx_t_2 = __Pyx_CyFunction_New(&__pyx_mdef_7gammapy_5stats_21fit_statistics_cython_1cash_sum_cython, 0, __pyx_n_s_cash_sum_cython, NULL, __pyx_n_s_gammapy_stats_fit_statistics_cyt_2, __pyx_d, ((PyObject *)__pyx_codeobj__5)); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_cash_sum_cython, __pyx_t_2) < 0) __PYX_ERR(0, 15, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "gammapy/stats/fit_statistics_cython.pyx":50 * * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def f_cash_root_cython(np.float_t x, np.ndarray[np.float_t, ndim=1] counts, */ __pyx_t_2 = __Pyx_CyFunction_New(&__pyx_mdef_7gammapy_5stats_21fit_statistics_cython_3f_cash_root_cython, 0, __pyx_n_s_f_cash_root_cython, NULL, __pyx_n_s_gammapy_stats_fit_statistics_cyt_2, __pyx_d, ((PyObject *)__pyx_codeobj__7)); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 50, __pyx_L1_error) 
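/* Once this module-exec code completes, cash_sum_cython, f_cash_root_cython
   and norm_bounds_cython are ordinary Python callables in the module dict.
   A hypothetical usage sketch (the array values are made-up examples; the
   functions expect 1-D float64 arrays, as enforced by the buffer checks
   above):

       import numpy as np
       from gammapy.stats.fit_statistics_cython import (
           cash_sum_cython, f_cash_root_cython, norm_bounds_cython)

       counts = np.array([3.0, 0.0, 5.0])
       background = np.array([1.0, 1.0, 1.0])
       model = np.array([2.0, 2.0, 2.0])

       x = 1.0  # trial source normalisation (hypothetical)
       stat = cash_sum_cython(counts, x * model + background)
       deriv = f_cash_root_cython(x, counts, background, model)
       b_min, b_max, norm_min = norm_bounds_cython(counts, background, model)
*/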
__Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_f_cash_root_cython, __pyx_t_2) < 0) __PYX_ERR(0, 50, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "gammapy/stats/fit_statistics_cython.pyx":84 * * * @cython.cdivision(True) # <<<<<<<<<<<<<< * @cython.boundscheck(False) * def norm_bounds_cython(np.ndarray[np.float_t, ndim=1] counts, */ __pyx_t_2 = __Pyx_CyFunction_New(&__pyx_mdef_7gammapy_5stats_21fit_statistics_cython_5norm_bounds_cython, 0, __pyx_n_s_norm_bounds_cython, NULL, __pyx_n_s_gammapy_stats_fit_statistics_cyt_2, __pyx_d, ((PyObject *)__pyx_codeobj__9)); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 84, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_norm_bounds_cython, __pyx_t_2) < 0) __PYX_ERR(0, 84, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "gammapy/stats/fit_statistics_cython.pyx":1 * # Licensed under a 3-clause BSD style license - see LICENSE.rst # <<<<<<<<<<<<<< * # cython: language_level=3 * import numpy as np */ __pyx_t_2 = __Pyx_PyDict_NewPresized(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_test, __pyx_t_2) < 0) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /*--- Wrapped vars code ---*/ goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); if (__pyx_m) { if (__pyx_d && stringtab_initialized) { __Pyx_AddTraceback("init gammapy.stats.fit_statistics_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); } #if !CYTHON_USE_MODULE_STATE Py_CLEAR(__pyx_m); #else Py_DECREF(__pyx_m); if (pystate_addmodule_run) { PyObject *tp, *value, *tb; PyErr_Fetch(&tp, &value, &tb); PyState_RemoveModule(&__pyx_moduledef); PyErr_Restore(tp, value, tb); } #endif } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_ImportError, "init gammapy.stats.fit_statistics_cython"); } __pyx_L0:; __Pyx_RefNannyFinishContext(); #if CYTHON_PEP489_MULTI_PHASE_INIT return (__pyx_m != NULL) ? 
0 : -1; #elif PY_MAJOR_VERSION >= 3 return __pyx_m; #else return; #endif } /* #### Code section: cleanup_globals ### */ /* #### Code section: cleanup_module ### */ /* #### Code section: main_method ### */ /* #### Code section: utility_code_pragmas ### */ #ifdef _MSC_VER #pragma warning( push ) /* Warning 4127: conditional expression is constant * Cython uses constant conditional expressions to allow in inline functions to be optimized at * compile-time, so this warning is not useful */ #pragma warning( disable : 4127 ) #endif /* #### Code section: utility_code_def ### */ /* --- Runtime support code --- */ /* Refnanny */ #if CYTHON_REFNANNY static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname) { PyObject *m = NULL, *p = NULL; void *r = NULL; m = PyImport_ImportModule(modname); if (!m) goto end; p = PyObject_GetAttrString(m, "RefNannyAPI"); if (!p) goto end; r = PyLong_AsVoidPtr(p); end: Py_XDECREF(p); Py_XDECREF(m); return (__Pyx_RefNannyAPIStruct *)r; } #endif /* PyErrExceptionMatches */ #if CYTHON_FAST_THREAD_STATE static int __Pyx_PyErr_ExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; i<n; i++) { if (exc_type == PyTuple_GET_ITEM(tuple, i)) return 1; } #endif for (i=0; i<n; i++) { if (__Pyx_PyErr_GivenExceptionMatches(exc_type, PyTuple_GET_ITEM(tuple, i))) return 1; } return 0; } static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err) { int result; PyObject *exc_type; #if PY_VERSION_HEX >= 0x030C00A6 PyObject *current_exception = tstate->current_exception; if (unlikely(!current_exception)) return 0; exc_type = (PyObject*) Py_TYPE(current_exception); if (exc_type == err) return 1; #else exc_type = tstate->curexc_type; if (exc_type == err) return 1; if (unlikely(!exc_type)) return 0; #endif #if CYTHON_AVOID_BORROWED_REFS Py_INCREF(exc_type); #endif if (unlikely(PyTuple_Check(err))) { result = __Pyx_PyErr_ExceptionMatchesTuple(exc_type, err); } else { result = __Pyx_PyErr_GivenExceptionMatches(exc_type, err); } #if CYTHON_AVOID_BORROWED_REFS Py_DECREF(exc_type); #endif return result; } #endif /* PyErrFetchRestore */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { #if PY_VERSION_HEX >= 0x030C00A6 PyObject *tmp_value; assert(type == NULL || (value != NULL && type == (PyObject*) Py_TYPE(value))); if (value) { #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(((PyBaseExceptionObject*) value)->traceback != tb)) #endif PyException_SetTraceback(value, tb); } tmp_value = tstate->current_exception; tstate->current_exception = value; Py_XDECREF(tmp_value); Py_XDECREF(type); Py_XDECREF(tb); #else PyObject *tmp_type, *tmp_value, *tmp_tb; tmp_type = tstate->curexc_type; tmp_value = tstate->curexc_value; tmp_tb = tstate->curexc_traceback; tstate->curexc_type = type; tstate->curexc_value = value; tstate->curexc_traceback = tb; Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); #endif } static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { #if PY_VERSION_HEX >= 0x030C00A6 PyObject* exc_value; exc_value = tstate->current_exception; tstate->current_exception = 0; *value = exc_value; *type = NULL; *tb = NULL; if (exc_value) { *type = (PyObject*) Py_TYPE(exc_value); Py_INCREF(*type); #if CYTHON_COMPILING_IN_CPYTHON *tb = ((PyBaseExceptionObject*) exc_value)->traceback; Py_XINCREF(*tb); #else *tb = PyException_GetTraceback(exc_value); #endif } #else *type = tstate->curexc_type; *value = tstate->curexc_value; *tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; #endif } #endif /* PyObjectGetAttrStr */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* 
__Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name) { PyTypeObject* tp = Py_TYPE(obj); if (likely(tp->tp_getattro)) return tp->tp_getattro(obj, attr_name); #if PY_MAJOR_VERSION < 3 if (likely(tp->tp_getattr)) return tp->tp_getattr(obj, PyString_AS_STRING(attr_name)); #endif return PyObject_GetAttr(obj, attr_name); } #endif /* PyObjectGetAttrStrNoError */ #if __PYX_LIMITED_VERSION_HEX < 0x030d00A1 static void __Pyx_PyObject_GetAttrStr_ClearAttributeError(void) { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign if (likely(__Pyx_PyErr_ExceptionMatches(PyExc_AttributeError))) __Pyx_PyErr_Clear(); } #endif static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStrNoError(PyObject* obj, PyObject* attr_name) { PyObject *result; #if __PYX_LIMITED_VERSION_HEX >= 0x030d00A1 (void) PyObject_GetOptionalAttr(obj, attr_name, &result); return result; #else #if CYTHON_COMPILING_IN_CPYTHON && CYTHON_USE_TYPE_SLOTS && PY_VERSION_HEX >= 0x030700B1 PyTypeObject* tp = Py_TYPE(obj); if (likely(tp->tp_getattro == PyObject_GenericGetAttr)) { return _PyObject_GenericGetAttrWithDict(obj, attr_name, NULL, 1); } #endif result = __Pyx_PyObject_GetAttrStr(obj, attr_name); if (unlikely(!result)) { __Pyx_PyObject_GetAttrStr_ClearAttributeError(); } return result; #endif } /* GetBuiltinName */ static PyObject *__Pyx_GetBuiltinName(PyObject *name) { PyObject* result = __Pyx_PyObject_GetAttrStrNoError(__pyx_b, name); if (unlikely(!result) && !PyErr_Occurred()) { PyErr_Format(PyExc_NameError, #if PY_MAJOR_VERSION >= 3 "name '%U' is not defined", name); #else "name '%.200s' is not defined", PyString_AS_STRING(name)); #endif } return result; } /* GetTopmostException */ #if CYTHON_USE_EXC_INFO_STACK && CYTHON_FAST_THREAD_STATE static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate) { _PyErr_StackItem *exc_info = tstate->exc_info; while ((exc_info->exc_value == NULL || exc_info->exc_value == Py_None) && exc_info->previous_item != NULL) { exc_info = exc_info->previous_item; } return exc_info; } #endif /* SaveResetException */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { #if CYTHON_USE_EXC_INFO_STACK && PY_VERSION_HEX >= 0x030B00a4 _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate); PyObject *exc_value = exc_info->exc_value; if (exc_value == NULL || exc_value == Py_None) { *value = NULL; *type = NULL; *tb = NULL; } else { *value = exc_value; Py_INCREF(*value); *type = (PyObject*) Py_TYPE(exc_value); Py_INCREF(*type); *tb = PyException_GetTraceback(exc_value); } #elif CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate); *type = exc_info->exc_type; *value = exc_info->exc_value; *tb = exc_info->exc_traceback; Py_XINCREF(*type); Py_XINCREF(*value); Py_XINCREF(*tb); #else *type = tstate->exc_type; *value = tstate->exc_value; *tb = tstate->exc_traceback; Py_XINCREF(*type); Py_XINCREF(*value); Py_XINCREF(*tb); #endif } static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { #if CYTHON_USE_EXC_INFO_STACK && PY_VERSION_HEX >= 0x030B00a4 _PyErr_StackItem *exc_info = tstate->exc_info; PyObject *tmp_value = exc_info->exc_value; exc_info->exc_value = value; Py_XDECREF(tmp_value); Py_XDECREF(type); Py_XDECREF(tb); #else PyObject *tmp_type, *tmp_value, *tmp_tb; #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = 
exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = type; exc_info->exc_value = value; exc_info->exc_traceback = tb; #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = type; tstate->exc_value = value; tstate->exc_traceback = tb; #endif Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); #endif } #endif /* GetException */ #if CYTHON_FAST_THREAD_STATE static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) #else static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb) #endif { PyObject *local_type = NULL, *local_value, *local_tb = NULL; #if CYTHON_FAST_THREAD_STATE PyObject *tmp_type, *tmp_value, *tmp_tb; #if PY_VERSION_HEX >= 0x030C00A6 local_value = tstate->current_exception; tstate->current_exception = 0; if (likely(local_value)) { local_type = (PyObject*) Py_TYPE(local_value); Py_INCREF(local_type); local_tb = PyException_GetTraceback(local_value); } #else local_type = tstate->curexc_type; local_value = tstate->curexc_value; local_tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; #endif #else PyErr_Fetch(&local_type, &local_value, &local_tb); #endif PyErr_NormalizeException(&local_type, &local_value, &local_tb); #if CYTHON_FAST_THREAD_STATE && PY_VERSION_HEX >= 0x030C00A6 if (unlikely(tstate->current_exception)) #elif CYTHON_FAST_THREAD_STATE if (unlikely(tstate->curexc_type)) #else if (unlikely(PyErr_Occurred())) #endif goto bad; #if PY_MAJOR_VERSION >= 3 if (local_tb) { if (unlikely(PyException_SetTraceback(local_value, local_tb) < 0)) goto bad; } #endif Py_XINCREF(local_tb); Py_XINCREF(local_type); Py_XINCREF(local_value); *type = local_type; *value = local_value; *tb = local_tb; #if CYTHON_FAST_THREAD_STATE #if CYTHON_USE_EXC_INFO_STACK { _PyErr_StackItem *exc_info = tstate->exc_info; #if PY_VERSION_HEX >= 0x030B00a4 tmp_value = exc_info->exc_value; exc_info->exc_value = local_value; tmp_type = NULL; tmp_tb = NULL; Py_XDECREF(local_type); Py_XDECREF(local_tb); #else tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = local_type; exc_info->exc_value = local_value; exc_info->exc_traceback = local_tb; #endif } #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = local_type; tstate->exc_value = local_value; tstate->exc_traceback = local_tb; #endif Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); #else PyErr_SetExcInfo(local_type, local_value, local_tb); #endif return 0; bad: *type = 0; *value = 0; *tb = 0; Py_XDECREF(local_type); Py_XDECREF(local_value); Py_XDECREF(local_tb); return -1; } /* PyObjectCall */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw) { PyObject *result; ternaryfunc call = Py_TYPE(func)->tp_call; if (unlikely(!call)) return PyObject_Call(func, arg, kw); #if PY_MAJOR_VERSION < 3 if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; #else if (unlikely(Py_EnterRecursiveCall(" while calling a Python object"))) return NULL; #endif result = (*call)(func, arg, kw); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* RaiseException */ #if 
PY_MAJOR_VERSION < 3 static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) { __Pyx_PyThreadState_declare CYTHON_UNUSED_VAR(cause); Py_XINCREF(type); if (!value || value == Py_None) value = NULL; else Py_INCREF(value); if (!tb || tb == Py_None) tb = NULL; else { Py_INCREF(tb); if (!PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto raise_error; } } if (PyType_Check(type)) { #if CYTHON_COMPILING_IN_PYPY if (!value) { Py_INCREF(Py_None); value = Py_None; } #endif PyErr_NormalizeException(&type, &value, &tb); } else { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto raise_error; } value = type; type = (PyObject*) Py_TYPE(type); Py_INCREF(type); if (!PyType_IsSubtype((PyTypeObject *)type, (PyTypeObject *)PyExc_BaseException)) { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto raise_error; } } __Pyx_PyThreadState_assign __Pyx_ErrRestore(type, value, tb); return; raise_error: Py_XDECREF(value); Py_XDECREF(type); Py_XDECREF(tb); return; } #else static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) { PyObject* owned_instance = NULL; if (tb == Py_None) { tb = 0; } else if (tb && !PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto bad; } if (value == Py_None) value = 0; if (PyExceptionInstance_Check(type)) { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto bad; } value = type; type = (PyObject*) Py_TYPE(value); } else if (PyExceptionClass_Check(type)) { PyObject *instance_class = NULL; if (value && PyExceptionInstance_Check(value)) { instance_class = (PyObject*) Py_TYPE(value); if (instance_class != type) { int is_subclass = PyObject_IsSubclass(instance_class, type); if (!is_subclass) { instance_class = NULL; } else if (unlikely(is_subclass == -1)) { goto bad; } else { type = instance_class; } } } if (!instance_class) { PyObject *args; if (!value) args = PyTuple_New(0); else if (PyTuple_Check(value)) { Py_INCREF(value); args = value; } else args = PyTuple_Pack(1, value); if (!args) goto bad; owned_instance = PyObject_Call(type, args, NULL); Py_DECREF(args); if (!owned_instance) goto bad; value = owned_instance; if (!PyExceptionInstance_Check(value)) { PyErr_Format(PyExc_TypeError, "calling %R should have returned an instance of " "BaseException, not %R", type, Py_TYPE(value)); goto bad; } } } else { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto bad; } if (cause) { PyObject *fixed_cause; if (cause == Py_None) { fixed_cause = NULL; } else if (PyExceptionClass_Check(cause)) { fixed_cause = PyObject_CallObject(cause, NULL); if (fixed_cause == NULL) goto bad; } else if (PyExceptionInstance_Check(cause)) { fixed_cause = cause; Py_INCREF(fixed_cause); } else { PyErr_SetString(PyExc_TypeError, "exception causes must derive from " "BaseException"); goto bad; } PyException_SetCause(value, fixed_cause); } PyErr_SetObject(type, value); if (tb) { #if PY_VERSION_HEX >= 0x030C00A6 PyException_SetTraceback(value, tb); #elif CYTHON_FAST_THREAD_STATE PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject* tmp_tb = tstate->curexc_traceback; if (tb != tmp_tb) { Py_INCREF(tb); tstate->curexc_traceback = tb; Py_XDECREF(tmp_tb); } #else PyObject *tmp_type, *tmp_value, *tmp_tb; PyErr_Fetch(&tmp_type, &tmp_value, &tmp_tb); 
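/* legacy fallback: re-raise the freshly set exception with the requested traceback swapped in */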
Py_INCREF(tb); PyErr_Restore(tmp_type, tmp_value, tb); Py_XDECREF(tmp_tb); #endif } bad: Py_XDECREF(owned_instance); return; } #endif /* TupleAndListFromArray */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE void __Pyx_copy_object_array(PyObject *const *CYTHON_RESTRICT src, PyObject** CYTHON_RESTRICT dest, Py_ssize_t length) { PyObject *v; Py_ssize_t i; for (i = 0; i < length; i++) { v = dest[i] = src[i]; Py_INCREF(v); } } static CYTHON_INLINE PyObject * __Pyx_PyTuple_FromArray(PyObject *const *src, Py_ssize_t n) { PyObject *res; if (n <= 0) { Py_INCREF(__pyx_empty_tuple); return __pyx_empty_tuple; } res = PyTuple_New(n); if (unlikely(res == NULL)) return NULL; __Pyx_copy_object_array(src, ((PyTupleObject*)res)->ob_item, n); return res; } static CYTHON_INLINE PyObject * __Pyx_PyList_FromArray(PyObject *const *src, Py_ssize_t n) { PyObject *res; if (n <= 0) { return PyList_New(0); } res = PyList_New(n); if (unlikely(res == NULL)) return NULL; __Pyx_copy_object_array(src, ((PyListObject*)res)->ob_item, n); return res; } #endif /* BytesEquals */ static CYTHON_INLINE int __Pyx_PyBytes_Equals(PyObject* s1, PyObject* s2, int equals) { #if CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API return PyObject_RichCompareBool(s1, s2, equals); #else if (s1 == s2) { return (equals == Py_EQ); } else if (PyBytes_CheckExact(s1) & PyBytes_CheckExact(s2)) { const char *ps1, *ps2; Py_ssize_t length = PyBytes_GET_SIZE(s1); if (length != PyBytes_GET_SIZE(s2)) return (equals == Py_NE); ps1 = PyBytes_AS_STRING(s1); ps2 = PyBytes_AS_STRING(s2); if (ps1[0] != ps2[0]) { return (equals == Py_NE); } else if (length == 1) { return (equals == Py_EQ); } else { int result; #if CYTHON_USE_UNICODE_INTERNALS && (PY_VERSION_HEX < 0x030B0000) Py_hash_t hash1, hash2; hash1 = ((PyBytesObject*)s1)->ob_shash; hash2 = ((PyBytesObject*)s2)->ob_shash; if (hash1 != hash2 && hash1 != -1 && hash2 != -1) { return (equals == Py_NE); } #endif result = memcmp(ps1, ps2, (size_t)length); return (equals == Py_EQ) ? 
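/* map the memcmp outcome onto the requested Py_EQ/Py_NE comparison */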
(result == 0) : (result != 0); } } else if ((s1 == Py_None) & PyBytes_CheckExact(s2)) { return (equals == Py_NE); } else if ((s2 == Py_None) & PyBytes_CheckExact(s1)) { return (equals == Py_NE); } else { int result; PyObject* py_result = PyObject_RichCompare(s1, s2, equals); if (!py_result) return -1; result = __Pyx_PyObject_IsTrue(py_result); Py_DECREF(py_result); return result; } #endif } /* UnicodeEquals */ static CYTHON_INLINE int __Pyx_PyUnicode_Equals(PyObject* s1, PyObject* s2, int equals) { #if CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API return PyObject_RichCompareBool(s1, s2, equals); #else #if PY_MAJOR_VERSION < 3 PyObject* owned_ref = NULL; #endif int s1_is_unicode, s2_is_unicode; if (s1 == s2) { goto return_eq; } s1_is_unicode = PyUnicode_CheckExact(s1); s2_is_unicode = PyUnicode_CheckExact(s2); #if PY_MAJOR_VERSION < 3 if ((s1_is_unicode & (!s2_is_unicode)) && PyString_CheckExact(s2)) { owned_ref = PyUnicode_FromObject(s2); if (unlikely(!owned_ref)) return -1; s2 = owned_ref; s2_is_unicode = 1; } else if ((s2_is_unicode & (!s1_is_unicode)) && PyString_CheckExact(s1)) { owned_ref = PyUnicode_FromObject(s1); if (unlikely(!owned_ref)) return -1; s1 = owned_ref; s1_is_unicode = 1; } else if (((!s2_is_unicode) & (!s1_is_unicode))) { return __Pyx_PyBytes_Equals(s1, s2, equals); } #endif if (s1_is_unicode & s2_is_unicode) { Py_ssize_t length; int kind; void *data1, *data2; if (unlikely(__Pyx_PyUnicode_READY(s1) < 0) || unlikely(__Pyx_PyUnicode_READY(s2) < 0)) return -1; length = __Pyx_PyUnicode_GET_LENGTH(s1); if (length != __Pyx_PyUnicode_GET_LENGTH(s2)) { goto return_ne; } #if CYTHON_USE_UNICODE_INTERNALS { Py_hash_t hash1, hash2; #if CYTHON_PEP393_ENABLED hash1 = ((PyASCIIObject*)s1)->hash; hash2 = ((PyASCIIObject*)s2)->hash; #else hash1 = ((PyUnicodeObject*)s1)->hash; hash2 = ((PyUnicodeObject*)s2)->hash; #endif if (hash1 != hash2 && hash1 != -1 && hash2 != -1) { goto return_ne; } } #endif kind = __Pyx_PyUnicode_KIND(s1); if (kind != __Pyx_PyUnicode_KIND(s2)) { goto return_ne; } data1 = __Pyx_PyUnicode_DATA(s1); data2 = __Pyx_PyUnicode_DATA(s2); if (__Pyx_PyUnicode_READ(kind, data1, 0) != __Pyx_PyUnicode_READ(kind, data2, 0)) { goto return_ne; } else if (length == 1) { goto return_eq; } else { int result = memcmp(data1, data2, (size_t)(length * kind)); #if PY_MAJOR_VERSION < 3 Py_XDECREF(owned_ref); #endif return (equals == Py_EQ) ? 
(result == 0) : (result != 0); } } else if ((s1 == Py_None) & s2_is_unicode) { goto return_ne; } else if ((s2 == Py_None) & s1_is_unicode) { goto return_ne; } else { int result; PyObject* py_result = PyObject_RichCompare(s1, s2, equals); #if PY_MAJOR_VERSION < 3 Py_XDECREF(owned_ref); #endif if (!py_result) return -1; result = __Pyx_PyObject_IsTrue(py_result); Py_DECREF(py_result); return result; } return_eq: #if PY_MAJOR_VERSION < 3 Py_XDECREF(owned_ref); #endif return (equals == Py_EQ); return_ne: #if PY_MAJOR_VERSION < 3 Py_XDECREF(owned_ref); #endif return (equals == Py_NE); #endif } /* fastcall */ #if CYTHON_METH_FASTCALL static CYTHON_INLINE PyObject * __Pyx_GetKwValue_FASTCALL(PyObject *kwnames, PyObject *const *kwvalues, PyObject *s) { Py_ssize_t i, n = PyTuple_GET_SIZE(kwnames); for (i = 0; i < n; i++) { if (s == PyTuple_GET_ITEM(kwnames, i)) return kwvalues[i]; } for (i = 0; i < n; i++) { int eq = __Pyx_PyUnicode_Equals(s, PyTuple_GET_ITEM(kwnames, i), Py_EQ); if (unlikely(eq != 0)) { if (unlikely(eq < 0)) return NULL; return kwvalues[i]; } } return NULL; } #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030d0000 CYTHON_UNUSED static PyObject *__Pyx_KwargsAsDict_FASTCALL(PyObject *kwnames, PyObject *const *kwvalues) { Py_ssize_t i, nkwargs = PyTuple_GET_SIZE(kwnames); PyObject *dict; dict = PyDict_New(); if (unlikely(!dict)) return NULL; for (i=0; i<nkwargs; i++) { int status = PyDict_SetItem(dict, PyTuple_GET_ITEM(kwnames, i), kwvalues[i]); if (unlikely(status < 0)) { Py_DECREF(dict); return NULL; } } return dict; } #endif #endif /* RaiseDoubleKeywords */ static void __Pyx_RaiseDoubleKeywordsError( const char* func_name, PyObject* kw_name) { PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION >= 3 "%s() got multiple values for keyword argument '%U'", func_name, kw_name); #else "%s() got multiple values for keyword argument '%s'", func_name, PyString_AsString(kw_name)); #endif } /* ParseKeywords */ static int __Pyx_ParseOptionalKeywords( PyObject *kwds, PyObject *const *kwvalues, PyObject **argnames[], PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args, const char* function_name) { PyObject *key = 0, *value = 0; Py_ssize_t pos = 0; PyObject*** name; PyObject*** first_kw_arg = argnames + num_pos_args; int kwds_is_tuple = CYTHON_METH_FASTCALL && likely(PyTuple_Check(kwds)); while (1) { Py_XDECREF(key); key = NULL; Py_XDECREF(value); value = NULL; if (kwds_is_tuple) { Py_ssize_t size; #if CYTHON_ASSUME_SAFE_MACROS size = PyTuple_GET_SIZE(kwds); #else size = PyTuple_Size(kwds); if (size < 0) goto bad; #endif if (pos >= size) break; #if CYTHON_AVOID_BORROWED_REFS key = __Pyx_PySequence_ITEM(kwds, pos); if (!key) goto bad; #elif CYTHON_ASSUME_SAFE_MACROS key = PyTuple_GET_ITEM(kwds, pos); #else key = PyTuple_GetItem(kwds, pos); if (!key) goto bad; #endif value = kwvalues[pos]; pos++; } else { if (!PyDict_Next(kwds, &pos, &key, &value)) break; #if CYTHON_AVOID_BORROWED_REFS Py_INCREF(key); #endif } name = first_kw_arg; while (*name && (**name != key)) name++; if (*name) { values[name-argnames] = value; #if CYTHON_AVOID_BORROWED_REFS Py_INCREF(value); Py_DECREF(key); #endif key = NULL; value = NULL; continue; } #if !CYTHON_AVOID_BORROWED_REFS Py_INCREF(key); #endif Py_INCREF(value); name = first_kw_arg; #if PY_MAJOR_VERSION < 3 if (likely(PyString_Check(key))) { while (*name) { if ((CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**name) == PyString_GET_SIZE(key)) && _PyString_Eq(**name, key)) { values[name-argnames] = value; #if CYTHON_AVOID_BORROWED_REFS value = NULL; #endif break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { if ((**argname == key) || ( (CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**argname) == PyString_GET_SIZE(key)) && _PyString_Eq(**argname, key))) { goto arg_passed_twice; } argname++; } } } else #endif if
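/* Unicode keyword names: cheap length check first, then PyUnicode_Compare; identical (interned) pointers were already matched above. */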
(likely(PyUnicode_Check(key))) { while (*name) { int cmp = ( #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (__Pyx_PyUnicode_GET_LENGTH(**name) != __Pyx_PyUnicode_GET_LENGTH(key)) ? 1 : #endif PyUnicode_Compare(**name, key) ); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) { values[name-argnames] = value; #if CYTHON_AVOID_BORROWED_REFS value = NULL; #endif break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { int cmp = (**argname == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (__Pyx_PyUnicode_GET_LENGTH(**argname) != __Pyx_PyUnicode_GET_LENGTH(key)) ? 1 : #endif PyUnicode_Compare(**argname, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) goto arg_passed_twice; argname++; } } } else goto invalid_keyword_type; if (kwds2) { if (unlikely(PyDict_SetItem(kwds2, key, value))) goto bad; } else { goto invalid_keyword; } } Py_XDECREF(key); Py_XDECREF(value); return 0; arg_passed_twice: __Pyx_RaiseDoubleKeywordsError(function_name, key); goto bad; invalid_keyword_type: PyErr_Format(PyExc_TypeError, "%.200s() keywords must be strings", function_name); goto bad; invalid_keyword: #if PY_MAJOR_VERSION < 3 PyErr_Format(PyExc_TypeError, "%.200s() got an unexpected keyword argument '%.200s'", function_name, PyString_AsString(key)); #else PyErr_Format(PyExc_TypeError, "%s() got an unexpected keyword argument '%U'", function_name, key); #endif bad: Py_XDECREF(key); Py_XDECREF(value); return -1; } /* ArgTypeTest */ static int __Pyx__ArgTypeTest(PyObject *obj, PyTypeObject *type, const char *name, int exact) { __Pyx_TypeName type_name; __Pyx_TypeName obj_type_name; if (unlikely(!type)) { PyErr_SetString(PyExc_SystemError, "Missing type object"); return 0; } else if (exact) { #if PY_MAJOR_VERSION == 2 if ((type == &PyBaseString_Type) && likely(__Pyx_PyBaseString_CheckExact(obj))) return 1; #endif } else { if (likely(__Pyx_TypeCheck(obj, type))) return 1; } type_name = __Pyx_PyType_GetName(type); obj_type_name = __Pyx_PyType_GetName(Py_TYPE(obj)); PyErr_Format(PyExc_TypeError, "Argument '%.200s' has incorrect type (expected " __Pyx_FMT_TYPENAME ", got " __Pyx_FMT_TYPENAME ")", name, type_name, obj_type_name); __Pyx_DECREF_TypeName(type_name); __Pyx_DECREF_TypeName(obj_type_name); return 0; } /* IsLittleEndian */ static CYTHON_INLINE int __Pyx_Is_Little_Endian(void) { union { uint32_t u32; uint8_t u8[4]; } S; S.u32 = 0x01020304; return S.u8[0] == 4; } /* BufferFormatCheck */ static void __Pyx_BufFmt_Init(__Pyx_BufFmt_Context* ctx, __Pyx_BufFmt_StackElem* stack, __Pyx_TypeInfo* type) { stack[0].field = &ctx->root; stack[0].parent_offset = 0; ctx->root.type = type; ctx->root.name = "buffer dtype"; ctx->root.offset = 0; ctx->head = stack; ctx->head->field = &ctx->root; ctx->fmt_offset = 0; ctx->head->parent_offset = 0; ctx->new_packmode = '@'; ctx->enc_packmode = '@'; ctx->new_count = 1; ctx->enc_count = 0; ctx->enc_type = 0; ctx->is_complex = 0; ctx->is_valid_array = 0; ctx->struct_alignment = 0; while (type->typegroup == 'S') { ++ctx->head; ctx->head->field = type->fields; ctx->head->parent_offset = 0; type = type->fields->type; } } static int __Pyx_BufFmt_ParseNumber(const char** ts) { int count; const char* t = *ts; if (*t < '0' || *t > '9') { return -1; } else { count = *t++ - '0'; while (*t >= '0' && *t <= '9') { count *= 10; count += *t++ - '0'; } } *ts = t; return count; } static int __Pyx_BufFmt_ExpectNumber(const char **ts) { int number = __Pyx_BufFmt_ParseNumber(ts); if 
(number == -1) PyErr_Format(PyExc_ValueError,\ "Does not understand character buffer dtype format string ('%c')", **ts); return number; } static void __Pyx_BufFmt_RaiseUnexpectedChar(char ch) { PyErr_Format(PyExc_ValueError, "Unexpected format string character: '%c'", ch); } static const char* __Pyx_BufFmt_DescribeTypeChar(char ch, int is_complex) { switch (ch) { case '?': return "'bool'"; case 'c': return "'char'"; case 'b': return "'signed char'"; case 'B': return "'unsigned char'"; case 'h': return "'short'"; case 'H': return "'unsigned short'"; case 'i': return "'int'"; case 'I': return "'unsigned int'"; case 'l': return "'long'"; case 'L': return "'unsigned long'"; case 'q': return "'long long'"; case 'Q': return "'unsigned long long'"; case 'f': return (is_complex ? "'complex float'" : "'float'"); case 'd': return (is_complex ? "'complex double'" : "'double'"); case 'g': return (is_complex ? "'complex long double'" : "'long double'"); case 'T': return "a struct"; case 'O': return "Python object"; case 'P': return "a pointer"; case 's': case 'p': return "a string"; case 0: return "end"; default: return "unparsable format string"; } } static size_t __Pyx_BufFmt_TypeCharToStandardSize(char ch, int is_complex) { switch (ch) { case '?': case 'c': case 'b': case 'B': case 's': case 'p': return 1; case 'h': case 'H': return 2; case 'i': case 'I': case 'l': case 'L': return 4; case 'q': case 'Q': return 8; case 'f': return (is_complex ? 8 : 4); case 'd': return (is_complex ? 16 : 8); case 'g': { PyErr_SetString(PyExc_ValueError, "Python does not define a standard format string size for long double ('g').."); return 0; } case 'O': case 'P': return sizeof(void*); default: __Pyx_BufFmt_RaiseUnexpectedChar(ch); return 0; } } static size_t __Pyx_BufFmt_TypeCharToNativeSize(char ch, int is_complex) { switch (ch) { case '?': case 'c': case 'b': case 'B': case 's': case 'p': return 1; case 'h': case 'H': return sizeof(short); case 'i': case 'I': return sizeof(int); case 'l': case 'L': return sizeof(long); #ifdef HAVE_LONG_LONG case 'q': case 'Q': return sizeof(PY_LONG_LONG); #endif case 'f': return sizeof(float) * (is_complex ? 2 : 1); case 'd': return sizeof(double) * (is_complex ? 2 : 1); case 'g': return sizeof(long double) * (is_complex ? 
2 : 1); case 'O': case 'P': return sizeof(void*); default: { __Pyx_BufFmt_RaiseUnexpectedChar(ch); return 0; } } } typedef struct { char c; short x; } __Pyx_st_short; typedef struct { char c; int x; } __Pyx_st_int; typedef struct { char c; long x; } __Pyx_st_long; typedef struct { char c; float x; } __Pyx_st_float; typedef struct { char c; double x; } __Pyx_st_double; typedef struct { char c; long double x; } __Pyx_st_longdouble; typedef struct { char c; void *x; } __Pyx_st_void_p; #ifdef HAVE_LONG_LONG typedef struct { char c; PY_LONG_LONG x; } __Pyx_st_longlong; #endif static size_t __Pyx_BufFmt_TypeCharToAlignment(char ch, int is_complex) { CYTHON_UNUSED_VAR(is_complex); switch (ch) { case '?': case 'c': case 'b': case 'B': case 's': case 'p': return 1; case 'h': case 'H': return sizeof(__Pyx_st_short) - sizeof(short); case 'i': case 'I': return sizeof(__Pyx_st_int) - sizeof(int); case 'l': case 'L': return sizeof(__Pyx_st_long) - sizeof(long); #ifdef HAVE_LONG_LONG case 'q': case 'Q': return sizeof(__Pyx_st_longlong) - sizeof(PY_LONG_LONG); #endif case 'f': return sizeof(__Pyx_st_float) - sizeof(float); case 'd': return sizeof(__Pyx_st_double) - sizeof(double); case 'g': return sizeof(__Pyx_st_longdouble) - sizeof(long double); case 'P': case 'O': return sizeof(__Pyx_st_void_p) - sizeof(void*); default: __Pyx_BufFmt_RaiseUnexpectedChar(ch); return 0; } } /* These are for computing the padding at the end of the struct to align on the first member of the struct. This will probably be the same as above, but we don't have any guarantees. */ typedef struct { short x; char c; } __Pyx_pad_short; typedef struct { int x; char c; } __Pyx_pad_int; typedef struct { long x; char c; } __Pyx_pad_long; typedef struct { float x; char c; } __Pyx_pad_float; typedef struct { double x; char c; } __Pyx_pad_double; typedef struct { long double x; char c; } __Pyx_pad_longdouble; typedef struct { void *x; char c; } __Pyx_pad_void_p; #ifdef HAVE_LONG_LONG typedef struct { PY_LONG_LONG x; char c; } __Pyx_pad_longlong; #endif static size_t __Pyx_BufFmt_TypeCharToPadding(char ch, int is_complex) { CYTHON_UNUSED_VAR(is_complex); switch (ch) { case '?': case 'c': case 'b': case 'B': case 's': case 'p': return 1; case 'h': case 'H': return sizeof(__Pyx_pad_short) - sizeof(short); case 'i': case 'I': return sizeof(__Pyx_pad_int) - sizeof(int); case 'l': case 'L': return sizeof(__Pyx_pad_long) - sizeof(long); #ifdef HAVE_LONG_LONG case 'q': case 'Q': return sizeof(__Pyx_pad_longlong) - sizeof(PY_LONG_LONG); #endif case 'f': return sizeof(__Pyx_pad_float) - sizeof(float); case 'd': return sizeof(__Pyx_pad_double) - sizeof(double); case 'g': return sizeof(__Pyx_pad_longdouble) - sizeof(long double); case 'P': case 'O': return sizeof(__Pyx_pad_void_p) - sizeof(void*); default: __Pyx_BufFmt_RaiseUnexpectedChar(ch); return 0; } } static char __Pyx_BufFmt_TypeCharToGroup(char ch, int is_complex) { switch (ch) { case 'c': return 'H'; case 'b': case 'h': case 'i': case 'l': case 'q': case 's': case 'p': return 'I'; case '?': case 'B': case 'H': case 'I': case 'L': case 'Q': return 'U'; case 'f': case 'd': case 'g': return (is_complex ?
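/* 'C' = complex typegroup, 'R' = real floating point */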
'C' : 'R'); case 'O': return 'O'; case 'P': return 'P'; default: { __Pyx_BufFmt_RaiseUnexpectedChar(ch); return 0; } } } static void __Pyx_BufFmt_RaiseExpected(__Pyx_BufFmt_Context* ctx) { if (ctx->head == NULL || ctx->head->field == &ctx->root) { const char* expected; const char* quote; if (ctx->head == NULL) { expected = "end"; quote = ""; } else { expected = ctx->head->field->type->name; quote = "'"; } PyErr_Format(PyExc_ValueError, "Buffer dtype mismatch, expected %s%s%s but got %s", quote, expected, quote, __Pyx_BufFmt_DescribeTypeChar(ctx->enc_type, ctx->is_complex)); } else { __Pyx_StructField* field = ctx->head->field; __Pyx_StructField* parent = (ctx->head - 1)->field; PyErr_Format(PyExc_ValueError, "Buffer dtype mismatch, expected '%s' but got %s in '%s.%s'", field->type->name, __Pyx_BufFmt_DescribeTypeChar(ctx->enc_type, ctx->is_complex), parent->type->name, field->name); } } static int __Pyx_BufFmt_ProcessTypeChunk(__Pyx_BufFmt_Context* ctx) { char group; size_t size, offset, arraysize = 1; if (ctx->enc_type == 0) return 0; if (ctx->head->field->type->arraysize[0]) { int i, ndim = 0; if (ctx->enc_type == 's' || ctx->enc_type == 'p') { ctx->is_valid_array = ctx->head->field->type->ndim == 1; ndim = 1; if (ctx->enc_count != ctx->head->field->type->arraysize[0]) { PyErr_Format(PyExc_ValueError, "Expected a dimension of size %zu, got %zu", ctx->head->field->type->arraysize[0], ctx->enc_count); return -1; } } if (!ctx->is_valid_array) { PyErr_Format(PyExc_ValueError, "Expected %d dimensions, got %d", ctx->head->field->type->ndim, ndim); return -1; } for (i = 0; i < ctx->head->field->type->ndim; i++) { arraysize *= ctx->head->field->type->arraysize[i]; } ctx->is_valid_array = 0; ctx->enc_count = 1; } group = __Pyx_BufFmt_TypeCharToGroup(ctx->enc_type, ctx->is_complex); do { __Pyx_StructField* field = ctx->head->field; __Pyx_TypeInfo* type = field->type; if (ctx->enc_packmode == '@' || ctx->enc_packmode == '^') { size = __Pyx_BufFmt_TypeCharToNativeSize(ctx->enc_type, ctx->is_complex); } else { size = __Pyx_BufFmt_TypeCharToStandardSize(ctx->enc_type, ctx->is_complex); } if (ctx->enc_packmode == '@') { size_t align_at = __Pyx_BufFmt_TypeCharToAlignment(ctx->enc_type, ctx->is_complex); size_t align_mod_offset; if (align_at == 0) return -1; align_mod_offset = ctx->fmt_offset % align_at; if (align_mod_offset > 0) ctx->fmt_offset += align_at - align_mod_offset; if (ctx->struct_alignment == 0) ctx->struct_alignment = __Pyx_BufFmt_TypeCharToPadding(ctx->enc_type, ctx->is_complex); } if (type->size != size || type->typegroup != group) { if (type->typegroup == 'C' && type->fields != NULL) { size_t parent_offset = ctx->head->parent_offset + field->offset; ++ctx->head; ctx->head->field = type->fields; ctx->head->parent_offset = parent_offset; continue; } if ((type->typegroup == 'H' || group == 'H') && type->size == size) { } else { __Pyx_BufFmt_RaiseExpected(ctx); return -1; } } offset = ctx->head->parent_offset + field->offset; if (ctx->fmt_offset != offset) { PyErr_Format(PyExc_ValueError, "Buffer dtype mismatch; next field is at offset %" CYTHON_FORMAT_SSIZE_T "d but %" CYTHON_FORMAT_SSIZE_T "d expected", (Py_ssize_t)ctx->fmt_offset, (Py_ssize_t)offset); return -1; } ctx->fmt_offset += size; if (arraysize) ctx->fmt_offset += (arraysize - 1) * size; --ctx->enc_count; while (1) { if (field == &ctx->root) { ctx->head = NULL; if (ctx->enc_count != 0) { __Pyx_BufFmt_RaiseExpected(ctx); return -1; } break; } ctx->head->field = ++field; if (field->type == NULL) { --ctx->head; field = 
ctx->head->field; continue; } else if (field->type->typegroup == 'S') { size_t parent_offset = ctx->head->parent_offset + field->offset; if (field->type->fields->type == NULL) continue; field = field->type->fields; ++ctx->head; ctx->head->field = field; ctx->head->parent_offset = parent_offset; break; } else { break; } } } while (ctx->enc_count); ctx->enc_type = 0; ctx->is_complex = 0; return 0; } static int __pyx_buffmt_parse_array(__Pyx_BufFmt_Context* ctx, const char** tsp) { const char *ts = *tsp; int i = 0, number, ndim; ++ts; if (ctx->new_count != 1) { PyErr_SetString(PyExc_ValueError, "Cannot handle repeated arrays in format string"); return -1; } if (__Pyx_BufFmt_ProcessTypeChunk(ctx) == -1) return -1; ndim = ctx->head->field->type->ndim; while (*ts && *ts != ')') { switch (*ts) { case ' ': case '\f': case '\r': case '\n': case '\t': case '\v': continue; default: break; } number = __Pyx_BufFmt_ExpectNumber(&ts); if (number == -1) return -1; if (i < ndim && (size_t) number != ctx->head->field->type->arraysize[i]) { PyErr_Format(PyExc_ValueError, "Expected a dimension of size %zu, got %d", ctx->head->field->type->arraysize[i], number); return -1; } if (*ts != ',' && *ts != ')') { PyErr_Format(PyExc_ValueError, "Expected a comma in format string, got '%c'", *ts); return -1; } if (*ts == ',') ts++; i++; } if (i != ndim) { PyErr_Format(PyExc_ValueError, "Expected %d dimension(s), got %d", ctx->head->field->type->ndim, i); return -1; } if (!*ts) { PyErr_SetString(PyExc_ValueError, "Unexpected end of format string, expected ')'"); return -1; } ctx->is_valid_array = 1; ctx->new_count = 1; *tsp = ++ts; return 0; } static const char* __Pyx_BufFmt_CheckString(__Pyx_BufFmt_Context* ctx, const char* ts) { int got_Z = 0; while (1) { switch(*ts) { case 0: if (ctx->enc_type != 0 && ctx->head == NULL) { __Pyx_BufFmt_RaiseExpected(ctx); return NULL; } if (__Pyx_BufFmt_ProcessTypeChunk(ctx) == -1) return NULL; if (ctx->head != NULL) { __Pyx_BufFmt_RaiseExpected(ctx); return NULL; } return ts; case ' ': case '\r': case '\n': ++ts; break; case '<': if (!__Pyx_Is_Little_Endian()) { PyErr_SetString(PyExc_ValueError, "Little-endian buffer not supported on big-endian compiler"); return NULL; } ctx->new_packmode = '='; ++ts; break; case '>': case '!': if (__Pyx_Is_Little_Endian()) { PyErr_SetString(PyExc_ValueError, "Big-endian buffer not supported on little-endian compiler"); return NULL; } ctx->new_packmode = '='; ++ts; break; case '=': case '@': case '^': ctx->new_packmode = *ts++; break; case 'T': { const char* ts_after_sub; size_t i, struct_count = ctx->new_count; size_t struct_alignment = ctx->struct_alignment; ctx->new_count = 1; ++ts; if (*ts != '{') { PyErr_SetString(PyExc_ValueError, "Buffer acquisition: Expected '{' after 'T'"); return NULL; } if (__Pyx_BufFmt_ProcessTypeChunk(ctx) == -1) return NULL; ctx->enc_type = 0; ctx->enc_count = 0; ctx->struct_alignment = 0; ++ts; ts_after_sub = ts; for (i = 0; i != struct_count; ++i) { ts_after_sub = __Pyx_BufFmt_CheckString(ctx, ts); if (!ts_after_sub) return NULL; } ts = ts_after_sub; if (struct_alignment) ctx->struct_alignment = struct_alignment; } break; case '}': { size_t alignment = ctx->struct_alignment; ++ts; if (__Pyx_BufFmt_ProcessTypeChunk(ctx) == -1) return NULL; ctx->enc_type = 0; if (alignment && ctx->fmt_offset % alignment) { ctx->fmt_offset += alignment - (ctx->fmt_offset % alignment); } } return ts; case 'x': if (__Pyx_BufFmt_ProcessTypeChunk(ctx) == -1) return NULL; ctx->fmt_offset += ctx->new_count; ctx->new_count = 1; ctx->enc_count = 
0; ctx->enc_type = 0; ctx->enc_packmode = ctx->new_packmode; ++ts; break; case 'Z': got_Z = 1; ++ts; if (*ts != 'f' && *ts != 'd' && *ts != 'g') { __Pyx_BufFmt_RaiseUnexpectedChar('Z'); return NULL; } CYTHON_FALLTHROUGH; case '?': case 'c': case 'b': case 'B': case 'h': case 'H': case 'i': case 'I': case 'l': case 'L': case 'q': case 'Q': case 'f': case 'd': case 'g': case 'O': case 'p': if ((ctx->enc_type == *ts) && (got_Z == ctx->is_complex) && (ctx->enc_packmode == ctx->new_packmode) && (!ctx->is_valid_array)) { ctx->enc_count += ctx->new_count; ctx->new_count = 1; got_Z = 0; ++ts; break; } CYTHON_FALLTHROUGH; case 's': if (__Pyx_BufFmt_ProcessTypeChunk(ctx) == -1) return NULL; ctx->enc_count = ctx->new_count; ctx->enc_packmode = ctx->new_packmode; ctx->enc_type = *ts; ctx->is_complex = got_Z; ++ts; ctx->new_count = 1; got_Z = 0; break; case ':': ++ts; while(*ts != ':') ++ts; ++ts; break; case '(': if (__pyx_buffmt_parse_array(ctx, &ts) < 0) return NULL; break; default: { int number = __Pyx_BufFmt_ExpectNumber(&ts); if (number == -1) return NULL; ctx->new_count = (size_t)number; } } } } /* BufferGetAndValidate */ static CYTHON_INLINE void __Pyx_SafeReleaseBuffer(Py_buffer* info) { if (unlikely(info->buf == NULL)) return; if (info->suboffsets == __Pyx_minusones) info->suboffsets = NULL; __Pyx_ReleaseBuffer(info); } static void __Pyx_ZeroBuffer(Py_buffer* buf) { buf->buf = NULL; buf->obj = NULL; buf->strides = __Pyx_zeros; buf->shape = __Pyx_zeros; buf->suboffsets = __Pyx_minusones; } static int __Pyx__GetBufferAndValidate( Py_buffer* buf, PyObject* obj, __Pyx_TypeInfo* dtype, int flags, int nd, int cast, __Pyx_BufFmt_StackElem* stack) { buf->buf = NULL; if (unlikely(__Pyx_GetBuffer(obj, buf, flags) == -1)) { __Pyx_ZeroBuffer(buf); return -1; } if (unlikely(buf->ndim != nd)) { PyErr_Format(PyExc_ValueError, "Buffer has wrong number of dimensions (expected %d, got %d)", nd, buf->ndim); goto fail; } if (!cast) { __Pyx_BufFmt_Context ctx; __Pyx_BufFmt_Init(&ctx, stack, dtype); if (!__Pyx_BufFmt_CheckString(&ctx, buf->format)) goto fail; } if (unlikely((size_t)buf->itemsize != dtype->size)) { PyErr_Format(PyExc_ValueError, "Item size of buffer (%" CYTHON_FORMAT_SSIZE_T "d byte%s) does not match size of '%s' (%" CYTHON_FORMAT_SSIZE_T "d byte%s)", buf->itemsize, (buf->itemsize > 1) ? "s" : "", dtype->name, (Py_ssize_t)dtype->size, (dtype->size > 1) ? "s" : ""); goto fail; } if (buf->suboffsets == NULL) buf->suboffsets = __Pyx_minusones; return 0; fail:; __Pyx_SafeReleaseBuffer(buf); return -1; } /* PyDictVersioning */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj) { PyObject *dict = Py_TYPE(obj)->tp_dict; return likely(dict) ? __PYX_GET_DICT_VERSION(dict) : 0; } static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj) { PyObject **dictptr = NULL; Py_ssize_t offset = Py_TYPE(obj)->tp_dictoffset; if (offset) { #if CYTHON_COMPILING_IN_CPYTHON dictptr = (likely(offset > 0)) ? (PyObject **) ((char *)obj + offset) : _PyObject_GetDictPtr(obj); #else dictptr = _PyObject_GetDictPtr(obj); #endif } return (dictptr && *dictptr) ? 
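/* version tag of the instance dict, or 0 if the object has none */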
__PYX_GET_DICT_VERSION(*dictptr) : 0; } static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version) { PyObject *dict = Py_TYPE(obj)->tp_dict; if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict))) return 0; return obj_dict_version == __Pyx_get_object_dict_version(obj); } #endif /* GetModuleGlobalName */ #if CYTHON_USE_DICT_VERSIONS static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value) #else static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name) #endif { PyObject *result; #if !CYTHON_AVOID_BORROWED_REFS #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 && PY_VERSION_HEX < 0x030d0000 result = _PyDict_GetItem_KnownHash(__pyx_d, name, ((PyASCIIObject *) name)->hash); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } else if (unlikely(PyErr_Occurred())) { return NULL; } #elif CYTHON_COMPILING_IN_LIMITED_API if (unlikely(!__pyx_m)) { return NULL; } result = PyObject_GetAttr(__pyx_m, name); if (likely(result)) { return result; } #else result = PyDict_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } #endif #else result = PyObject_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } PyErr_Clear(); #endif return __Pyx_GetBuiltinName(name); } /* TypeImport */ #ifndef __PYX_HAVE_RT_ImportType_3_0_11 #define __PYX_HAVE_RT_ImportType_3_0_11 static PyTypeObject *__Pyx_ImportType_3_0_11(PyObject *module, const char *module_name, const char *class_name, size_t size, size_t alignment, enum __Pyx_ImportType_CheckSize_3_0_11 check_size) { PyObject *result = 0; char warning[200]; Py_ssize_t basicsize; Py_ssize_t itemsize; #if CYTHON_COMPILING_IN_LIMITED_API PyObject *py_basicsize; PyObject *py_itemsize; #endif result = PyObject_GetAttrString(module, class_name); if (!result) goto bad; if (!PyType_Check(result)) { PyErr_Format(PyExc_TypeError, "%.200s.%.200s is not a type object", module_name, class_name); goto bad; } #if !CYTHON_COMPILING_IN_LIMITED_API basicsize = ((PyTypeObject *)result)->tp_basicsize; itemsize = ((PyTypeObject *)result)->tp_itemsize; #else py_basicsize = PyObject_GetAttrString(result, "__basicsize__"); if (!py_basicsize) goto bad; basicsize = PyLong_AsSsize_t(py_basicsize); Py_DECREF(py_basicsize); py_basicsize = 0; if (basicsize == (Py_ssize_t)-1 && PyErr_Occurred()) goto bad; py_itemsize = PyObject_GetAttrString(result, "__itemsize__"); if (!py_itemsize) goto bad; itemsize = PyLong_AsSsize_t(py_itemsize); Py_DECREF(py_itemsize); py_itemsize = 0; if (itemsize == (Py_ssize_t)-1 && PyErr_Occurred()) goto bad; #endif if (itemsize) { if (size % alignment) { alignment = size % alignment; } if (itemsize < (Py_ssize_t)alignment) itemsize = (Py_ssize_t)alignment; } if ((size_t)(basicsize + itemsize) < size) { PyErr_Format(PyExc_ValueError, "%.200s.%.200s size changed, may indicate binary incompatibility. 
" "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize+itemsize); goto bad; } if (check_size == __Pyx_ImportType_CheckSize_Error_3_0_11 && ((size_t)basicsize > size || (size_t)(basicsize + itemsize) < size)) { PyErr_Format(PyExc_ValueError, "%.200s.%.200s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd-%zd from PyObject", module_name, class_name, size, basicsize, basicsize+itemsize); goto bad; } else if (check_size == __Pyx_ImportType_CheckSize_Warn_3_0_11 && (size_t)basicsize > size) { PyOS_snprintf(warning, sizeof(warning), "%s.%s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); if (PyErr_WarnEx(NULL, warning, 0) < 0) goto bad; } return (PyTypeObject *)result; bad: Py_XDECREF(result); return NULL; } #endif /* Import */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level) { PyObject *module = 0; PyObject *empty_dict = 0; PyObject *empty_list = 0; #if PY_MAJOR_VERSION < 3 PyObject *py_import; py_import = __Pyx_PyObject_GetAttrStr(__pyx_b, __pyx_n_s_import); if (unlikely(!py_import)) goto bad; if (!from_list) { empty_list = PyList_New(0); if (unlikely(!empty_list)) goto bad; from_list = empty_list; } #endif empty_dict = PyDict_New(); if (unlikely(!empty_dict)) goto bad; { #if PY_MAJOR_VERSION >= 3 if (level == -1) { if (strchr(__Pyx_MODULE_NAME, '.') != NULL) { module = PyImport_ImportModuleLevelObject( name, __pyx_d, empty_dict, from_list, 1); if (unlikely(!module)) { if (unlikely(!PyErr_ExceptionMatches(PyExc_ImportError))) goto bad; PyErr_Clear(); } } level = 0; } #endif if (!module) { #if PY_MAJOR_VERSION < 3 PyObject *py_level = PyInt_FromLong(level); if (unlikely(!py_level)) goto bad; module = PyObject_CallFunctionObjArgs(py_import, name, __pyx_d, empty_dict, from_list, py_level, (PyObject *)NULL); Py_DECREF(py_level); #else module = PyImport_ImportModuleLevelObject( name, __pyx_d, empty_dict, from_list, level); #endif } } bad: Py_XDECREF(empty_dict); Py_XDECREF(empty_list); #if PY_MAJOR_VERSION < 3 Py_XDECREF(py_import); #endif return module; } /* ImportDottedModule */ #if PY_MAJOR_VERSION >= 3 static PyObject *__Pyx__ImportDottedModule_Error(PyObject *name, PyObject *parts_tuple, Py_ssize_t count) { PyObject *partial_name = NULL, *slice = NULL, *sep = NULL; if (unlikely(PyErr_Occurred())) { PyErr_Clear(); } if (likely(PyTuple_GET_SIZE(parts_tuple) == count)) { partial_name = name; } else { slice = PySequence_GetSlice(parts_tuple, 0, count); if (unlikely(!slice)) goto bad; sep = PyUnicode_FromStringAndSize(".", 1); if (unlikely(!sep)) goto bad; partial_name = PyUnicode_Join(sep, slice); } PyErr_Format( #if PY_MAJOR_VERSION < 3 PyExc_ImportError, "No module named '%s'", PyString_AS_STRING(partial_name)); #else #if PY_VERSION_HEX >= 0x030600B1 PyExc_ModuleNotFoundError, #else PyExc_ImportError, #endif "No module named '%U'", partial_name); #endif bad: Py_XDECREF(sep); Py_XDECREF(slice); Py_XDECREF(partial_name); return NULL; } #endif #if PY_MAJOR_VERSION >= 3 static PyObject *__Pyx__ImportDottedModule_Lookup(PyObject *name) { PyObject *imported_module; #if PY_VERSION_HEX < 0x030700A1 || (CYTHON_COMPILING_IN_PYPY && PYPY_VERSION_NUM < 0x07030400) PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) return NULL; imported_module = __Pyx_PyDict_GetItemStr(modules, name); Py_XINCREF(imported_module); #else imported_module = PyImport_GetModule(name); #endif return 
imported_module; } #endif #if PY_MAJOR_VERSION >= 3 static PyObject *__Pyx_ImportDottedModule_WalkParts(PyObject *module, PyObject *name, PyObject *parts_tuple) { Py_ssize_t i, nparts; nparts = PyTuple_GET_SIZE(parts_tuple); for (i=1; i < nparts && module; i++) { PyObject *part, *submodule; #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS part = PyTuple_GET_ITEM(parts_tuple, i); #else part = PySequence_ITEM(parts_tuple, i); #endif submodule = __Pyx_PyObject_GetAttrStrNoError(module, part); #if !(CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS) Py_DECREF(part); #endif Py_DECREF(module); module = submodule; } if (unlikely(!module)) { return __Pyx__ImportDottedModule_Error(name, parts_tuple, i); } return module; } #endif static PyObject *__Pyx__ImportDottedModule(PyObject *name, PyObject *parts_tuple) { #if PY_MAJOR_VERSION < 3 PyObject *module, *from_list, *star = __pyx_n_s__3; CYTHON_UNUSED_VAR(parts_tuple); from_list = PyList_New(1); if (unlikely(!from_list)) return NULL; Py_INCREF(star); PyList_SET_ITEM(from_list, 0, star); module = __Pyx_Import(name, from_list, 0); Py_DECREF(from_list); return module; #else PyObject *imported_module; PyObject *module = __Pyx_Import(name, NULL, 0); if (!parts_tuple || unlikely(!module)) return module; imported_module = __Pyx__ImportDottedModule_Lookup(name); if (likely(imported_module)) { Py_DECREF(module); return imported_module; } PyErr_Clear(); return __Pyx_ImportDottedModule_WalkParts(module, name, parts_tuple); #endif } static PyObject *__Pyx_ImportDottedModule(PyObject *name, PyObject *parts_tuple) { #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030400B1 PyObject *module = __Pyx__ImportDottedModule_Lookup(name); if (likely(module)) { PyObject *spec = __Pyx_PyObject_GetAttrStrNoError(module, __pyx_n_s_spec); if (likely(spec)) { PyObject *unsafe = __Pyx_PyObject_GetAttrStrNoError(spec, __pyx_n_s_initializing); if (likely(!unsafe || !__Pyx_PyObject_IsTrue(unsafe))) { Py_DECREF(spec); spec = NULL; } Py_XDECREF(unsafe); } if (likely(!spec)) { PyErr_Clear(); return module; } Py_DECREF(spec); Py_DECREF(module); } else if (PyErr_Occurred()) { PyErr_Clear(); } #endif return __Pyx__ImportDottedModule(name, parts_tuple); } /* FixUpExtensionType */ #if CYTHON_USE_TYPE_SPECS static int __Pyx_fix_up_extension_type_from_spec(PyType_Spec *spec, PyTypeObject *type) { #if PY_VERSION_HEX > 0x030900B1 || CYTHON_COMPILING_IN_LIMITED_API CYTHON_UNUSED_VAR(spec); CYTHON_UNUSED_VAR(type); #else const PyType_Slot *slot = spec->slots; while (slot && slot->slot && slot->slot != Py_tp_members) slot++; if (slot && slot->slot == Py_tp_members) { int changed = 0; #if !(PY_VERSION_HEX <= 0x030900b1 && CYTHON_COMPILING_IN_CPYTHON) const #endif PyMemberDef *memb = (PyMemberDef*) slot->pfunc; while (memb && memb->name) { if (memb->name[0] == '_' && memb->name[1] == '_') { #if PY_VERSION_HEX < 0x030900b1 if (strcmp(memb->name, "__weaklistoffset__") == 0) { assert(memb->type == T_PYSSIZET); assert(memb->flags == READONLY); type->tp_weaklistoffset = memb->offset; changed = 1; } else if (strcmp(memb->name, "__dictoffset__") == 0) { assert(memb->type == T_PYSSIZET); assert(memb->flags == READONLY); type->tp_dictoffset = memb->offset; changed = 1; } #if CYTHON_METH_FASTCALL else if (strcmp(memb->name, "__vectorcalloffset__") == 0) { assert(memb->type == T_PYSSIZET); assert(memb->flags == READONLY); #if PY_VERSION_HEX >= 0x030800b4 type->tp_vectorcall_offset = memb->offset; #else type->tp_print = (printfunc) memb->offset; #endif changed = 1; } #endif #else if 
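/* empty statement keeps the following 'else if' chain well-formed when the pre-3.9 slot fixups above are compiled out */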
((0)); #endif #if PY_VERSION_HEX <= 0x030900b1 && CYTHON_COMPILING_IN_CPYTHON else if (strcmp(memb->name, "__module__") == 0) { PyObject *descr; assert(memb->type == T_OBJECT); assert(memb->flags == 0 || memb->flags == READONLY); descr = PyDescr_NewMember(type, memb); if (unlikely(!descr)) return -1; if (unlikely(PyDict_SetItem(type->tp_dict, PyDescr_NAME(descr), descr) < 0)) { Py_DECREF(descr); return -1; } Py_DECREF(descr); changed = 1; } #endif } memb++; } if (changed) PyType_Modified(type); } #endif return 0; } #endif /* FetchSharedCythonModule */ static PyObject *__Pyx_FetchSharedCythonABIModule(void) { return __Pyx_PyImport_AddModuleRef((char*) __PYX_ABI_MODULE_NAME); } /* FetchCommonType */ static int __Pyx_VerifyCachedType(PyObject *cached_type, const char *name, Py_ssize_t basicsize, Py_ssize_t expected_basicsize) { if (!PyType_Check(cached_type)) { PyErr_Format(PyExc_TypeError, "Shared Cython type %.200s is not a type object", name); return -1; } if (basicsize != expected_basicsize) { PyErr_Format(PyExc_TypeError, "Shared Cython type %.200s has the wrong size, try recompiling", name); return -1; } return 0; } #if !CYTHON_USE_TYPE_SPECS static PyTypeObject* __Pyx_FetchCommonType(PyTypeObject* type) { PyObject* abi_module; const char* object_name; PyTypeObject *cached_type = NULL; abi_module = __Pyx_FetchSharedCythonABIModule(); if (!abi_module) return NULL; object_name = strrchr(type->tp_name, '.'); object_name = object_name ? object_name+1 : type->tp_name; cached_type = (PyTypeObject*) PyObject_GetAttrString(abi_module, object_name); if (cached_type) { if (__Pyx_VerifyCachedType( (PyObject *)cached_type, object_name, cached_type->tp_basicsize, type->tp_basicsize) < 0) { goto bad; } goto done; } if (!PyErr_ExceptionMatches(PyExc_AttributeError)) goto bad; PyErr_Clear(); if (PyType_Ready(type) < 0) goto bad; if (PyObject_SetAttrString(abi_module, object_name, (PyObject *)type) < 0) goto bad; Py_INCREF(type); cached_type = type; done: Py_DECREF(abi_module); return cached_type; bad: Py_XDECREF(cached_type); cached_type = NULL; goto done; } #else static PyTypeObject *__Pyx_FetchCommonTypeFromSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) { PyObject *abi_module, *cached_type = NULL; const char* object_name = strrchr(spec->name, '.'); object_name = object_name ? object_name+1 : spec->name; abi_module = __Pyx_FetchSharedCythonABIModule(); if (!abi_module) return NULL; cached_type = PyObject_GetAttrString(abi_module, object_name); if (cached_type) { Py_ssize_t basicsize; #if CYTHON_COMPILING_IN_LIMITED_API PyObject *py_basicsize; py_basicsize = PyObject_GetAttrString(cached_type, "__basicsize__"); if (unlikely(!py_basicsize)) goto bad; basicsize = PyLong_AsSsize_t(py_basicsize); Py_DECREF(py_basicsize); py_basicsize = 0; if (unlikely(basicsize == (Py_ssize_t)-1) && PyErr_Occurred()) goto bad; #else basicsize = likely(PyType_Check(cached_type)) ? 
((PyTypeObject*) cached_type)->tp_basicsize : -1; #endif if (__Pyx_VerifyCachedType( cached_type, object_name, basicsize, spec->basicsize) < 0) { goto bad; } goto done; } if (!PyErr_ExceptionMatches(PyExc_AttributeError)) goto bad; PyErr_Clear(); CYTHON_UNUSED_VAR(module); cached_type = __Pyx_PyType_FromModuleAndSpec(abi_module, spec, bases); if (unlikely(!cached_type)) goto bad; if (unlikely(__Pyx_fix_up_extension_type_from_spec(spec, (PyTypeObject *) cached_type) < 0)) goto bad; if (PyObject_SetAttrString(abi_module, object_name, cached_type) < 0) goto bad; done: Py_DECREF(abi_module); assert(cached_type == NULL || PyType_Check(cached_type)); return (PyTypeObject *) cached_type; bad: Py_XDECREF(cached_type); cached_type = NULL; goto done; } #endif /* PyVectorcallFastCallDict */ #if CYTHON_METH_FASTCALL static PyObject *__Pyx_PyVectorcall_FastCallDict_kw(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw) { PyObject *res = NULL; PyObject *kwnames; PyObject **newargs; PyObject **kwvalues; Py_ssize_t i, pos; size_t j; PyObject *key, *value; unsigned long keys_are_strings; Py_ssize_t nkw = PyDict_GET_SIZE(kw); newargs = (PyObject **)PyMem_Malloc((nargs + (size_t)nkw) * sizeof(args[0])); if (unlikely(newargs == NULL)) { PyErr_NoMemory(); return NULL; } for (j = 0; j < nargs; j++) newargs[j] = args[j]; kwnames = PyTuple_New(nkw); if (unlikely(kwnames == NULL)) { PyMem_Free(newargs); return NULL; } kwvalues = newargs + nargs; pos = i = 0; keys_are_strings = Py_TPFLAGS_UNICODE_SUBCLASS; while (PyDict_Next(kw, &pos, &key, &value)) { keys_are_strings &= Py_TYPE(key)->tp_flags; Py_INCREF(key); Py_INCREF(value); PyTuple_SET_ITEM(kwnames, i, key); kwvalues[i] = value; i++; } if (unlikely(!keys_are_strings)) { PyErr_SetString(PyExc_TypeError, "keywords must be strings"); goto cleanup; } res = vc(func, newargs, nargs, kwnames); cleanup: Py_DECREF(kwnames); for (i = 0; i < nkw; i++) Py_DECREF(kwvalues[i]); PyMem_Free(newargs); return res; } static CYTHON_INLINE PyObject *__Pyx_PyVectorcall_FastCallDict(PyObject *func, __pyx_vectorcallfunc vc, PyObject *const *args, size_t nargs, PyObject *kw) { if (likely(kw == NULL) || PyDict_GET_SIZE(kw) == 0) { return vc(func, args, nargs, NULL); } return __Pyx_PyVectorcall_FastCallDict_kw(func, vc, args, nargs, kw); } #endif /* CythonFunctionShared */ #if CYTHON_COMPILING_IN_LIMITED_API static CYTHON_INLINE int __Pyx__IsSameCyOrCFunction(PyObject *func, void *cfunc) { if (__Pyx_CyFunction_Check(func)) { return PyCFunction_GetFunction(((__pyx_CyFunctionObject*)func)->func) == (PyCFunction) cfunc; } else if (PyCFunction_Check(func)) { return PyCFunction_GetFunction(func) == (PyCFunction) cfunc; } return 0; } #else static CYTHON_INLINE int __Pyx__IsSameCyOrCFunction(PyObject *func, void *cfunc) { return __Pyx_CyOrPyCFunction_Check(func) && __Pyx_CyOrPyCFunction_GET_FUNCTION(func) == (PyCFunction) cfunc; } #endif static CYTHON_INLINE void __Pyx__CyFunction_SetClassObj(__pyx_CyFunctionObject* f, PyObject* classobj) { #if PY_VERSION_HEX < 0x030900B1 || CYTHON_COMPILING_IN_LIMITED_API __Pyx_Py_XDECREF_SET( __Pyx_CyFunction_GetClassObj(f), ((classobj) ? __Pyx_NewRef(classobj) : NULL)); #else __Pyx_Py_XDECREF_SET( ((PyCMethodObject *) (f))->mm_class, (PyTypeObject*)((classobj) ? 
__Pyx_NewRef(classobj) : NULL)); #endif } static PyObject * __Pyx_CyFunction_get_doc(__pyx_CyFunctionObject *op, void *closure) { CYTHON_UNUSED_VAR(closure); if (unlikely(op->func_doc == NULL)) { #if CYTHON_COMPILING_IN_LIMITED_API op->func_doc = PyObject_GetAttrString(op->func, "__doc__"); if (unlikely(!op->func_doc)) return NULL; #else if (((PyCFunctionObject*)op)->m_ml->ml_doc) { #if PY_MAJOR_VERSION >= 3 op->func_doc = PyUnicode_FromString(((PyCFunctionObject*)op)->m_ml->ml_doc); #else op->func_doc = PyString_FromString(((PyCFunctionObject*)op)->m_ml->ml_doc); #endif if (unlikely(op->func_doc == NULL)) return NULL; } else { Py_INCREF(Py_None); return Py_None; } #endif } Py_INCREF(op->func_doc); return op->func_doc; } static int __Pyx_CyFunction_set_doc(__pyx_CyFunctionObject *op, PyObject *value, void *context) { CYTHON_UNUSED_VAR(context); if (value == NULL) { value = Py_None; } Py_INCREF(value); __Pyx_Py_XDECREF_SET(op->func_doc, value); return 0; } static PyObject * __Pyx_CyFunction_get_name(__pyx_CyFunctionObject *op, void *context) { CYTHON_UNUSED_VAR(context); if (unlikely(op->func_name == NULL)) { #if CYTHON_COMPILING_IN_LIMITED_API op->func_name = PyObject_GetAttrString(op->func, "__name__"); #elif PY_MAJOR_VERSION >= 3 op->func_name = PyUnicode_InternFromString(((PyCFunctionObject*)op)->m_ml->ml_name); #else op->func_name = PyString_InternFromString(((PyCFunctionObject*)op)->m_ml->ml_name); #endif if (unlikely(op->func_name == NULL)) return NULL; } Py_INCREF(op->func_name); return op->func_name; } static int __Pyx_CyFunction_set_name(__pyx_CyFunctionObject *op, PyObject *value, void *context) { CYTHON_UNUSED_VAR(context); #if PY_MAJOR_VERSION >= 3 if (unlikely(value == NULL || !PyUnicode_Check(value))) #else if (unlikely(value == NULL || !PyString_Check(value))) #endif { PyErr_SetString(PyExc_TypeError, "__name__ must be set to a string object"); return -1; } Py_INCREF(value); __Pyx_Py_XDECREF_SET(op->func_name, value); return 0; } static PyObject * __Pyx_CyFunction_get_qualname(__pyx_CyFunctionObject *op, void *context) { CYTHON_UNUSED_VAR(context); Py_INCREF(op->func_qualname); return op->func_qualname; } static int __Pyx_CyFunction_set_qualname(__pyx_CyFunctionObject *op, PyObject *value, void *context) { CYTHON_UNUSED_VAR(context); #if PY_MAJOR_VERSION >= 3 if (unlikely(value == NULL || !PyUnicode_Check(value))) #else if (unlikely(value == NULL || !PyString_Check(value))) #endif { PyErr_SetString(PyExc_TypeError, "__qualname__ must be set to a string object"); return -1; } Py_INCREF(value); __Pyx_Py_XDECREF_SET(op->func_qualname, value); return 0; } static PyObject * __Pyx_CyFunction_get_dict(__pyx_CyFunctionObject *op, void *context) { CYTHON_UNUSED_VAR(context); if (unlikely(op->func_dict == NULL)) { op->func_dict = PyDict_New(); if (unlikely(op->func_dict == NULL)) return NULL; } Py_INCREF(op->func_dict); return op->func_dict; } static int __Pyx_CyFunction_set_dict(__pyx_CyFunctionObject *op, PyObject *value, void *context) { CYTHON_UNUSED_VAR(context); if (unlikely(value == NULL)) { PyErr_SetString(PyExc_TypeError, "function's dictionary may not be deleted"); return -1; } if (unlikely(!PyDict_Check(value))) { PyErr_SetString(PyExc_TypeError, "setting function's dictionary to a non-dict"); return -1; } Py_INCREF(value); __Pyx_Py_XDECREF_SET(op->func_dict, value); return 0; } static PyObject * __Pyx_CyFunction_get_globals(__pyx_CyFunctionObject *op, void *context) { CYTHON_UNUSED_VAR(context); Py_INCREF(op->func_globals); return op->func_globals; } static PyObject * 
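/* __closure__ getter: Cython functions keep their closure state internally, so None is always reported to Python code */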
__Pyx_CyFunction_get_closure(__pyx_CyFunctionObject *op, void *context) { CYTHON_UNUSED_VAR(op); CYTHON_UNUSED_VAR(context); Py_INCREF(Py_None); return Py_None; } static PyObject * __Pyx_CyFunction_get_code(__pyx_CyFunctionObject *op, void *context) { PyObject* result = (op->func_code) ? op->func_code : Py_None; CYTHON_UNUSED_VAR(context); Py_INCREF(result); return result; } static int __Pyx_CyFunction_init_defaults(__pyx_CyFunctionObject *op) { int result = 0; PyObject *res = op->defaults_getter((PyObject *) op); if (unlikely(!res)) return -1; #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS op->defaults_tuple = PyTuple_GET_ITEM(res, 0); Py_INCREF(op->defaults_tuple); op->defaults_kwdict = PyTuple_GET_ITEM(res, 1); Py_INCREF(op->defaults_kwdict); #else op->defaults_tuple = __Pyx_PySequence_ITEM(res, 0); if (unlikely(!op->defaults_tuple)) result = -1; else { op->defaults_kwdict = __Pyx_PySequence_ITEM(res, 1); if (unlikely(!op->defaults_kwdict)) result = -1; } #endif Py_DECREF(res); return result; } static int __Pyx_CyFunction_set_defaults(__pyx_CyFunctionObject *op, PyObject* value, void *context) { CYTHON_UNUSED_VAR(context); if (!value) { value = Py_None; } else if (unlikely(value != Py_None && !PyTuple_Check(value))) { PyErr_SetString(PyExc_TypeError, "__defaults__ must be set to a tuple object"); return -1; } PyErr_WarnEx(PyExc_RuntimeWarning, "changes to cyfunction.__defaults__ will not " "currently affect the values used in function calls", 1); Py_INCREF(value); __Pyx_Py_XDECREF_SET(op->defaults_tuple, value); return 0; } static PyObject * __Pyx_CyFunction_get_defaults(__pyx_CyFunctionObject *op, void *context) { PyObject* result = op->defaults_tuple; CYTHON_UNUSED_VAR(context); if (unlikely(!result)) { if (op->defaults_getter) { if (unlikely(__Pyx_CyFunction_init_defaults(op) < 0)) return NULL; result = op->defaults_tuple; } else { result = Py_None; } } Py_INCREF(result); return result; } static int __Pyx_CyFunction_set_kwdefaults(__pyx_CyFunctionObject *op, PyObject* value, void *context) { CYTHON_UNUSED_VAR(context); if (!value) { value = Py_None; } else if (unlikely(value != Py_None && !PyDict_Check(value))) { PyErr_SetString(PyExc_TypeError, "__kwdefaults__ must be set to a dict object"); return -1; } PyErr_WarnEx(PyExc_RuntimeWarning, "changes to cyfunction.__kwdefaults__ will not " "currently affect the values used in function calls", 1); Py_INCREF(value); __Pyx_Py_XDECREF_SET(op->defaults_kwdict, value); return 0; } static PyObject * __Pyx_CyFunction_get_kwdefaults(__pyx_CyFunctionObject *op, void *context) { PyObject* result = op->defaults_kwdict; CYTHON_UNUSED_VAR(context); if (unlikely(!result)) { if (op->defaults_getter) { if (unlikely(__Pyx_CyFunction_init_defaults(op) < 0)) return NULL; result = op->defaults_kwdict; } else { result = Py_None; } } Py_INCREF(result); return result; } static int __Pyx_CyFunction_set_annotations(__pyx_CyFunctionObject *op, PyObject* value, void *context) { CYTHON_UNUSED_VAR(context); if (!value || value == Py_None) { value = NULL; } else if (unlikely(!PyDict_Check(value))) { PyErr_SetString(PyExc_TypeError, "__annotations__ must be set to a dict object"); return -1; } Py_XINCREF(value); __Pyx_Py_XDECREF_SET(op->func_annotations, value); return 0; } static PyObject * __Pyx_CyFunction_get_annotations(__pyx_CyFunctionObject *op, void *context) { PyObject* result = op->func_annotations; CYTHON_UNUSED_VAR(context); if (unlikely(!result)) { result = PyDict_New(); if (unlikely(!result)) return NULL; op->func_annotations = result; } 
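/* first access lazily creates and caches an empty __annotations__ dict, as CPython functions do */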
Py_INCREF(result); return result; } static PyObject * __Pyx_CyFunction_get_is_coroutine(__pyx_CyFunctionObject *op, void *context) { int is_coroutine; CYTHON_UNUSED_VAR(context); if (op->func_is_coroutine) { return __Pyx_NewRef(op->func_is_coroutine); } is_coroutine = op->flags & __Pyx_CYFUNCTION_COROUTINE; #if PY_VERSION_HEX >= 0x03050000 if (is_coroutine) { PyObject *module, *fromlist, *marker = __pyx_n_s_is_coroutine; fromlist = PyList_New(1); if (unlikely(!fromlist)) return NULL; Py_INCREF(marker); #if CYTHON_ASSUME_SAFE_MACROS PyList_SET_ITEM(fromlist, 0, marker); #else if (unlikely(PyList_SetItem(fromlist, 0, marker) < 0)) { Py_DECREF(marker); Py_DECREF(fromlist); return NULL; } #endif module = PyImport_ImportModuleLevelObject(__pyx_n_s_asyncio_coroutines, NULL, NULL, fromlist, 0); Py_DECREF(fromlist); if (unlikely(!module)) goto ignore; op->func_is_coroutine = __Pyx_PyObject_GetAttrStr(module, marker); Py_DECREF(module); if (likely(op->func_is_coroutine)) { return __Pyx_NewRef(op->func_is_coroutine); } ignore: PyErr_Clear(); } #endif op->func_is_coroutine = __Pyx_PyBool_FromLong(is_coroutine); return __Pyx_NewRef(op->func_is_coroutine); } #if CYTHON_COMPILING_IN_LIMITED_API static PyObject * __Pyx_CyFunction_get_module(__pyx_CyFunctionObject *op, void *context) { CYTHON_UNUSED_VAR(context); return PyObject_GetAttrString(op->func, "__module__"); } static int __Pyx_CyFunction_set_module(__pyx_CyFunctionObject *op, PyObject* value, void *context) { CYTHON_UNUSED_VAR(context); return PyObject_SetAttrString(op->func, "__module__", value); } #endif static PyGetSetDef __pyx_CyFunction_getsets[] = { {(char *) "func_doc", (getter)__Pyx_CyFunction_get_doc, (setter)__Pyx_CyFunction_set_doc, 0, 0}, {(char *) "__doc__", (getter)__Pyx_CyFunction_get_doc, (setter)__Pyx_CyFunction_set_doc, 0, 0}, {(char *) "func_name", (getter)__Pyx_CyFunction_get_name, (setter)__Pyx_CyFunction_set_name, 0, 0}, {(char *) "__name__", (getter)__Pyx_CyFunction_get_name, (setter)__Pyx_CyFunction_set_name, 0, 0}, {(char *) "__qualname__", (getter)__Pyx_CyFunction_get_qualname, (setter)__Pyx_CyFunction_set_qualname, 0, 0}, {(char *) "func_dict", (getter)__Pyx_CyFunction_get_dict, (setter)__Pyx_CyFunction_set_dict, 0, 0}, {(char *) "__dict__", (getter)__Pyx_CyFunction_get_dict, (setter)__Pyx_CyFunction_set_dict, 0, 0}, {(char *) "func_globals", (getter)__Pyx_CyFunction_get_globals, 0, 0, 0}, {(char *) "__globals__", (getter)__Pyx_CyFunction_get_globals, 0, 0, 0}, {(char *) "func_closure", (getter)__Pyx_CyFunction_get_closure, 0, 0, 0}, {(char *) "__closure__", (getter)__Pyx_CyFunction_get_closure, 0, 0, 0}, {(char *) "func_code", (getter)__Pyx_CyFunction_get_code, 0, 0, 0}, {(char *) "__code__", (getter)__Pyx_CyFunction_get_code, 0, 0, 0}, {(char *) "func_defaults", (getter)__Pyx_CyFunction_get_defaults, (setter)__Pyx_CyFunction_set_defaults, 0, 0}, {(char *) "__defaults__", (getter)__Pyx_CyFunction_get_defaults, (setter)__Pyx_CyFunction_set_defaults, 0, 0}, {(char *) "__kwdefaults__", (getter)__Pyx_CyFunction_get_kwdefaults, (setter)__Pyx_CyFunction_set_kwdefaults, 0, 0}, {(char *) "__annotations__", (getter)__Pyx_CyFunction_get_annotations, (setter)__Pyx_CyFunction_set_annotations, 0, 0}, {(char *) "_is_coroutine", (getter)__Pyx_CyFunction_get_is_coroutine, 0, 0, 0}, #if CYTHON_COMPILING_IN_LIMITED_API {"__module__", (getter)__Pyx_CyFunction_get_module, (setter)__Pyx_CyFunction_set_module, 0, 0}, #endif {0, 0, 0, 0, 0} }; static PyMemberDef __pyx_CyFunction_members[] = { #if !CYTHON_COMPILING_IN_LIMITED_API {(char *) 
"__module__", T_OBJECT, offsetof(PyCFunctionObject, m_module), 0, 0}, #endif #if CYTHON_USE_TYPE_SPECS {(char *) "__dictoffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_dict), READONLY, 0}, #if CYTHON_METH_FASTCALL #if CYTHON_BACKPORT_VECTORCALL {(char *) "__vectorcalloffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_vectorcall), READONLY, 0}, #else #if !CYTHON_COMPILING_IN_LIMITED_API {(char *) "__vectorcalloffset__", T_PYSSIZET, offsetof(PyCFunctionObject, vectorcall), READONLY, 0}, #endif #endif #endif #if PY_VERSION_HEX < 0x030500A0 || CYTHON_COMPILING_IN_LIMITED_API {(char *) "__weaklistoffset__", T_PYSSIZET, offsetof(__pyx_CyFunctionObject, func_weakreflist), READONLY, 0}, #else {(char *) "__weaklistoffset__", T_PYSSIZET, offsetof(PyCFunctionObject, m_weakreflist), READONLY, 0}, #endif #endif {0, 0, 0, 0, 0} }; static PyObject * __Pyx_CyFunction_reduce(__pyx_CyFunctionObject *m, PyObject *args) { CYTHON_UNUSED_VAR(args); #if PY_MAJOR_VERSION >= 3 Py_INCREF(m->func_qualname); return m->func_qualname; #else return PyString_FromString(((PyCFunctionObject*)m)->m_ml->ml_name); #endif } static PyMethodDef __pyx_CyFunction_methods[] = { {"__reduce__", (PyCFunction)__Pyx_CyFunction_reduce, METH_VARARGS, 0}, {0, 0, 0, 0} }; #if PY_VERSION_HEX < 0x030500A0 || CYTHON_COMPILING_IN_LIMITED_API #define __Pyx_CyFunction_weakreflist(cyfunc) ((cyfunc)->func_weakreflist) #else #define __Pyx_CyFunction_weakreflist(cyfunc) (((PyCFunctionObject*)cyfunc)->m_weakreflist) #endif static PyObject *__Pyx_CyFunction_Init(__pyx_CyFunctionObject *op, PyMethodDef *ml, int flags, PyObject* qualname, PyObject *closure, PyObject *module, PyObject* globals, PyObject* code) { #if !CYTHON_COMPILING_IN_LIMITED_API PyCFunctionObject *cf = (PyCFunctionObject*) op; #endif if (unlikely(op == NULL)) return NULL; #if CYTHON_COMPILING_IN_LIMITED_API op->func = PyCFunction_NewEx(ml, (PyObject*)op, module); if (unlikely(!op->func)) return NULL; #endif op->flags = flags; __Pyx_CyFunction_weakreflist(op) = NULL; #if !CYTHON_COMPILING_IN_LIMITED_API cf->m_ml = ml; cf->m_self = (PyObject *) op; #endif Py_XINCREF(closure); op->func_closure = closure; #if !CYTHON_COMPILING_IN_LIMITED_API Py_XINCREF(module); cf->m_module = module; #endif op->func_dict = NULL; op->func_name = NULL; Py_INCREF(qualname); op->func_qualname = qualname; op->func_doc = NULL; #if PY_VERSION_HEX < 0x030900B1 || CYTHON_COMPILING_IN_LIMITED_API op->func_classobj = NULL; #else ((PyCMethodObject*)op)->mm_class = NULL; #endif op->func_globals = globals; Py_INCREF(op->func_globals); Py_XINCREF(code); op->func_code = code; op->defaults_pyobjects = 0; op->defaults_size = 0; op->defaults = NULL; op->defaults_tuple = NULL; op->defaults_kwdict = NULL; op->defaults_getter = NULL; op->func_annotations = NULL; op->func_is_coroutine = NULL; #if CYTHON_METH_FASTCALL switch (ml->ml_flags & (METH_VARARGS | METH_FASTCALL | METH_NOARGS | METH_O | METH_KEYWORDS | METH_METHOD)) { case METH_NOARGS: __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_NOARGS; break; case METH_O: __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_O; break; case METH_METHOD | METH_FASTCALL | METH_KEYWORDS: __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD; break; case METH_FASTCALL | METH_KEYWORDS: __Pyx_CyFunction_func_vectorcall(op) = __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS; break; case METH_VARARGS | METH_KEYWORDS: __Pyx_CyFunction_func_vectorcall(op) = NULL; break; default: 
PyErr_SetString(PyExc_SystemError, "Bad call flags for CyFunction"); Py_DECREF(op); return NULL; } #endif return (PyObject *) op; } static int __Pyx_CyFunction_clear(__pyx_CyFunctionObject *m) { Py_CLEAR(m->func_closure); #if CYTHON_COMPILING_IN_LIMITED_API Py_CLEAR(m->func); #else Py_CLEAR(((PyCFunctionObject*)m)->m_module); #endif Py_CLEAR(m->func_dict); Py_CLEAR(m->func_name); Py_CLEAR(m->func_qualname); Py_CLEAR(m->func_doc); Py_CLEAR(m->func_globals); Py_CLEAR(m->func_code); #if !CYTHON_COMPILING_IN_LIMITED_API #if PY_VERSION_HEX < 0x030900B1 Py_CLEAR(__Pyx_CyFunction_GetClassObj(m)); #else { PyObject *cls = (PyObject*) ((PyCMethodObject *) (m))->mm_class; ((PyCMethodObject *) (m))->mm_class = NULL; Py_XDECREF(cls); } #endif #endif Py_CLEAR(m->defaults_tuple); Py_CLEAR(m->defaults_kwdict); Py_CLEAR(m->func_annotations); Py_CLEAR(m->func_is_coroutine); if (m->defaults) { PyObject **pydefaults = __Pyx_CyFunction_Defaults(PyObject *, m); int i; for (i = 0; i < m->defaults_pyobjects; i++) Py_XDECREF(pydefaults[i]); PyObject_Free(m->defaults); m->defaults = NULL; } return 0; } static void __Pyx__CyFunction_dealloc(__pyx_CyFunctionObject *m) { if (__Pyx_CyFunction_weakreflist(m) != NULL) PyObject_ClearWeakRefs((PyObject *) m); __Pyx_CyFunction_clear(m); __Pyx_PyHeapTypeObject_GC_Del(m); } static void __Pyx_CyFunction_dealloc(__pyx_CyFunctionObject *m) { PyObject_GC_UnTrack(m); __Pyx__CyFunction_dealloc(m); } static int __Pyx_CyFunction_traverse(__pyx_CyFunctionObject *m, visitproc visit, void *arg) { Py_VISIT(m->func_closure); #if CYTHON_COMPILING_IN_LIMITED_API Py_VISIT(m->func); #else Py_VISIT(((PyCFunctionObject*)m)->m_module); #endif Py_VISIT(m->func_dict); Py_VISIT(m->func_name); Py_VISIT(m->func_qualname); Py_VISIT(m->func_doc); Py_VISIT(m->func_globals); Py_VISIT(m->func_code); #if !CYTHON_COMPILING_IN_LIMITED_API Py_VISIT(__Pyx_CyFunction_GetClassObj(m)); #endif Py_VISIT(m->defaults_tuple); Py_VISIT(m->defaults_kwdict); Py_VISIT(m->func_is_coroutine); if (m->defaults) { PyObject **pydefaults = __Pyx_CyFunction_Defaults(PyObject *, m); int i; for (i = 0; i < m->defaults_pyobjects; i++) Py_VISIT(pydefaults[i]); } return 0; } static PyObject* __Pyx_CyFunction_repr(__pyx_CyFunctionObject *op) { #if PY_MAJOR_VERSION >= 3 return PyUnicode_FromFormat("<cyfunction %U at %p>", op->func_qualname, (void *)op); #else return PyString_FromFormat("<cyfunction %s at %p>", PyString_AsString(op->func_qualname), (void *)op); #endif } static PyObject * __Pyx_CyFunction_CallMethod(PyObject *func, PyObject *self, PyObject *arg, PyObject *kw) { #if CYTHON_COMPILING_IN_LIMITED_API PyObject *f = ((__pyx_CyFunctionObject*)func)->func; PyObject *py_name = NULL; PyCFunction meth; int flags; meth = PyCFunction_GetFunction(f); if (unlikely(!meth)) return NULL; flags = PyCFunction_GetFlags(f); if (unlikely(flags < 0)) return NULL; #else PyCFunctionObject* f = (PyCFunctionObject*)func; PyCFunction meth = f->m_ml->ml_meth; int flags = f->m_ml->ml_flags; #endif Py_ssize_t size; switch (flags & (METH_VARARGS | METH_KEYWORDS | METH_NOARGS | METH_O)) { case METH_VARARGS: if (likely(kw == NULL || PyDict_Size(kw) == 0)) return (*meth)(self, arg); break; case METH_VARARGS | METH_KEYWORDS: return (*(PyCFunctionWithKeywords)(void*)meth)(self, arg, kw); case METH_NOARGS: if (likely(kw == NULL || PyDict_Size(kw) == 0)) { #if CYTHON_ASSUME_SAFE_MACROS size = PyTuple_GET_SIZE(arg); #else size = PyTuple_Size(arg); if (unlikely(size < 0)) return NULL; #endif if (likely(size == 0)) return (*meth)(self, NULL); #if CYTHON_COMPILING_IN_LIMITED_API py_name =
__Pyx_CyFunction_get_name((__pyx_CyFunctionObject*)func, NULL); if (!py_name) return NULL; PyErr_Format(PyExc_TypeError, "%.200S() takes no arguments (%" CYTHON_FORMAT_SSIZE_T "d given)", py_name, size); Py_DECREF(py_name); #else PyErr_Format(PyExc_TypeError, "%.200s() takes no arguments (%" CYTHON_FORMAT_SSIZE_T "d given)", f->m_ml->ml_name, size); #endif return NULL; } break; case METH_O: if (likely(kw == NULL || PyDict_Size(kw) == 0)) { #if CYTHON_ASSUME_SAFE_MACROS size = PyTuple_GET_SIZE(arg); #else size = PyTuple_Size(arg); if (unlikely(size < 0)) return NULL; #endif if (likely(size == 1)) { PyObject *result, *arg0; #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS arg0 = PyTuple_GET_ITEM(arg, 0); #else arg0 = __Pyx_PySequence_ITEM(arg, 0); if (unlikely(!arg0)) return NULL; #endif result = (*meth)(self, arg0); #if !(CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS) Py_DECREF(arg0); #endif return result; } #if CYTHON_COMPILING_IN_LIMITED_API py_name = __Pyx_CyFunction_get_name((__pyx_CyFunctionObject*)func, NULL); if (!py_name) return NULL; PyErr_Format(PyExc_TypeError, "%.200S() takes exactly one argument (%" CYTHON_FORMAT_SSIZE_T "d given)", py_name, size); Py_DECREF(py_name); #else PyErr_Format(PyExc_TypeError, "%.200s() takes exactly one argument (%" CYTHON_FORMAT_SSIZE_T "d given)", f->m_ml->ml_name, size); #endif return NULL; } break; default: PyErr_SetString(PyExc_SystemError, "Bad call flags for CyFunction"); return NULL; } #if CYTHON_COMPILING_IN_LIMITED_API py_name = __Pyx_CyFunction_get_name((__pyx_CyFunctionObject*)func, NULL); if (!py_name) return NULL; PyErr_Format(PyExc_TypeError, "%.200S() takes no keyword arguments", py_name); Py_DECREF(py_name); #else PyErr_Format(PyExc_TypeError, "%.200s() takes no keyword arguments", f->m_ml->ml_name); #endif return NULL; } static CYTHON_INLINE PyObject *__Pyx_CyFunction_Call(PyObject *func, PyObject *arg, PyObject *kw) { PyObject *self, *result; #if CYTHON_COMPILING_IN_LIMITED_API self = PyCFunction_GetSelf(((__pyx_CyFunctionObject*)func)->func); if (unlikely(!self) && PyErr_Occurred()) return NULL; #else self = ((PyCFunctionObject*)func)->m_self; #endif result = __Pyx_CyFunction_CallMethod(func, self, arg, kw); return result; } static PyObject *__Pyx_CyFunction_CallAsMethod(PyObject *func, PyObject *args, PyObject *kw) { PyObject *result; __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *) func; #if CYTHON_METH_FASTCALL __pyx_vectorcallfunc vc = __Pyx_CyFunction_func_vectorcall(cyfunc); if (vc) { #if CYTHON_ASSUME_SAFE_MACROS return __Pyx_PyVectorcall_FastCallDict(func, vc, &PyTuple_GET_ITEM(args, 0), (size_t)PyTuple_GET_SIZE(args), kw); #else (void) &__Pyx_PyVectorcall_FastCallDict; return PyVectorcall_Call(func, args, kw); #endif } #endif if ((cyfunc->flags & __Pyx_CYFUNCTION_CCLASS) && !(cyfunc->flags & __Pyx_CYFUNCTION_STATICMETHOD)) { Py_ssize_t argc; PyObject *new_args; PyObject *self; #if CYTHON_ASSUME_SAFE_MACROS argc = PyTuple_GET_SIZE(args); #else argc = PyTuple_Size(args); if (unlikely(argc < 0)) return NULL; #endif new_args = PyTuple_GetSlice(args, 1, argc); if (unlikely(!new_args)) return NULL; self = PyTuple_GetItem(args, 0); if (unlikely(!self)) { Py_DECREF(new_args); #if PY_MAJOR_VERSION > 2 PyErr_Format(PyExc_TypeError, "unbound method %.200S() needs an argument", cyfunc->func_qualname); #else PyErr_SetString(PyExc_TypeError, "unbound method needs an argument"); #endif return NULL; } result = __Pyx_CyFunction_CallMethod(func, self, new_args, kw); Py_DECREF(new_args); } else { result =
__Pyx_CyFunction_Call(func, args, kw); } return result; } #if CYTHON_METH_FASTCALL static CYTHON_INLINE int __Pyx_CyFunction_Vectorcall_CheckArgs(__pyx_CyFunctionObject *cyfunc, Py_ssize_t nargs, PyObject *kwnames) { int ret = 0; if ((cyfunc->flags & __Pyx_CYFUNCTION_CCLASS) && !(cyfunc->flags & __Pyx_CYFUNCTION_STATICMETHOD)) { if (unlikely(nargs < 1)) { PyErr_Format(PyExc_TypeError, "%.200s() needs an argument", ((PyCFunctionObject*)cyfunc)->m_ml->ml_name); return -1; } ret = 1; } if (unlikely(kwnames) && unlikely(PyTuple_GET_SIZE(kwnames))) { PyErr_Format(PyExc_TypeError, "%.200s() takes no keyword arguments", ((PyCFunctionObject*)cyfunc)->m_ml->ml_name); return -1; } return ret; } static PyObject * __Pyx_CyFunction_Vectorcall_NOARGS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames) { __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func; PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml; #if CYTHON_BACKPORT_VECTORCALL Py_ssize_t nargs = (Py_ssize_t)nargsf; #else Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); #endif PyObject *self; switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, kwnames)) { case 1: self = args[0]; args += 1; nargs -= 1; break; case 0: self = ((PyCFunctionObject*)cyfunc)->m_self; break; default: return NULL; } if (unlikely(nargs != 0)) { PyErr_Format(PyExc_TypeError, "%.200s() takes no arguments (%" CYTHON_FORMAT_SSIZE_T "d given)", def->ml_name, nargs); return NULL; } return def->ml_meth(self, NULL); } static PyObject * __Pyx_CyFunction_Vectorcall_O(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames) { __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func; PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml; #if CYTHON_BACKPORT_VECTORCALL Py_ssize_t nargs = (Py_ssize_t)nargsf; #else Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); #endif PyObject *self; switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, kwnames)) { case 1: self = args[0]; args += 1; nargs -= 1; break; case 0: self = ((PyCFunctionObject*)cyfunc)->m_self; break; default: return NULL; } if (unlikely(nargs != 1)) { PyErr_Format(PyExc_TypeError, "%.200s() takes exactly one argument (%" CYTHON_FORMAT_SSIZE_T "d given)", def->ml_name, nargs); return NULL; } return def->ml_meth(self, args[0]); } static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames) { __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func; PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml; #if CYTHON_BACKPORT_VECTORCALL Py_ssize_t nargs = (Py_ssize_t)nargsf; #else Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); #endif PyObject *self; switch (__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, NULL)) { case 1: self = args[0]; args += 1; nargs -= 1; break; case 0: self = ((PyCFunctionObject*)cyfunc)->m_self; break; default: return NULL; } return ((__Pyx_PyCFunctionFastWithKeywords)(void(*)(void))def->ml_meth)(self, args, nargs, kwnames); } static PyObject * __Pyx_CyFunction_Vectorcall_FASTCALL_KEYWORDS_METHOD(PyObject *func, PyObject *const *args, size_t nargsf, PyObject *kwnames) { __pyx_CyFunctionObject *cyfunc = (__pyx_CyFunctionObject *)func; PyMethodDef* def = ((PyCFunctionObject*)cyfunc)->m_ml; PyTypeObject *cls = (PyTypeObject *) __Pyx_CyFunction_GetClassObj(cyfunc); #if CYTHON_BACKPORT_VECTORCALL Py_ssize_t nargs = (Py_ssize_t)nargsf; #else Py_ssize_t nargs = PyVectorcall_NARGS(nargsf); #endif PyObject *self; switch 
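/* Editor's note: __Pyx_CyFunction_Vectorcall_CheckArgs returns a tri-state
   that every vectorcall thunk, including this one, dispatches on: 1 means
   "unbound cclass method, take self from args[0] and shift the argument
   window"; 0 means "m_self already holds the bound object"; -1 means an
   error was set (missing self, or unexpected keyword arguments for thunks
   that accept none). */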
(__Pyx_CyFunction_Vectorcall_CheckArgs(cyfunc, nargs, NULL)) { case 1: self = args[0]; args += 1; nargs -= 1; break; case 0: self = ((PyCFunctionObject*)cyfunc)->m_self; break; default: return NULL; } return ((__Pyx_PyCMethod)(void(*)(void))def->ml_meth)(self, cls, args, (size_t)nargs, kwnames); } #endif #if CYTHON_USE_TYPE_SPECS static PyType_Slot __pyx_CyFunctionType_slots[] = { {Py_tp_dealloc, (void *)__Pyx_CyFunction_dealloc}, {Py_tp_repr, (void *)__Pyx_CyFunction_repr}, {Py_tp_call, (void *)__Pyx_CyFunction_CallAsMethod}, {Py_tp_traverse, (void *)__Pyx_CyFunction_traverse}, {Py_tp_clear, (void *)__Pyx_CyFunction_clear}, {Py_tp_methods, (void *)__pyx_CyFunction_methods}, {Py_tp_members, (void *)__pyx_CyFunction_members}, {Py_tp_getset, (void *)__pyx_CyFunction_getsets}, {Py_tp_descr_get, (void *)__Pyx_PyMethod_New}, {0, 0}, }; static PyType_Spec __pyx_CyFunctionType_spec = { __PYX_TYPE_MODULE_PREFIX "cython_function_or_method", sizeof(__pyx_CyFunctionObject), 0, #ifdef Py_TPFLAGS_METHOD_DESCRIPTOR Py_TPFLAGS_METHOD_DESCRIPTOR | #endif #if (defined(_Py_TPFLAGS_HAVE_VECTORCALL) && CYTHON_METH_FASTCALL) _Py_TPFLAGS_HAVE_VECTORCALL | #endif Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, __pyx_CyFunctionType_slots }; #else static PyTypeObject __pyx_CyFunctionType_type = { PyVarObject_HEAD_INIT(0, 0) __PYX_TYPE_MODULE_PREFIX "cython_function_or_method", sizeof(__pyx_CyFunctionObject), 0, (destructor) __Pyx_CyFunction_dealloc, #if !CYTHON_METH_FASTCALL 0, #elif CYTHON_BACKPORT_VECTORCALL (printfunc)offsetof(__pyx_CyFunctionObject, func_vectorcall), #else offsetof(PyCFunctionObject, vectorcall), #endif 0, 0, #if PY_MAJOR_VERSION < 3 0, #else 0, #endif (reprfunc) __Pyx_CyFunction_repr, 0, 0, 0, 0, __Pyx_CyFunction_CallAsMethod, 0, 0, 0, 0, #ifdef Py_TPFLAGS_METHOD_DESCRIPTOR Py_TPFLAGS_METHOD_DESCRIPTOR | #endif #if defined(_Py_TPFLAGS_HAVE_VECTORCALL) && CYTHON_METH_FASTCALL _Py_TPFLAGS_HAVE_VECTORCALL | #endif Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, 0, (traverseproc) __Pyx_CyFunction_traverse, (inquiry) __Pyx_CyFunction_clear, 0, #if PY_VERSION_HEX < 0x030500A0 offsetof(__pyx_CyFunctionObject, func_weakreflist), #else offsetof(PyCFunctionObject, m_weakreflist), #endif 0, 0, __pyx_CyFunction_methods, __pyx_CyFunction_members, __pyx_CyFunction_getsets, 0, 0, __Pyx_PyMethod_New, 0, offsetof(__pyx_CyFunctionObject, func_dict), 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, #if PY_VERSION_HEX >= 0x030400a1 0, #endif #if PY_VERSION_HEX >= 0x030800b1 && (!CYTHON_COMPILING_IN_PYPY || PYPY_VERSION_NUM >= 0x07030800) 0, #endif #if __PYX_NEED_TP_PRINT_SLOT 0, #endif #if PY_VERSION_HEX >= 0x030C0000 0, #endif #if PY_VERSION_HEX >= 0x030d00A4 0, #endif #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX >= 0x03090000 && PY_VERSION_HEX < 0x030a0000 0, #endif }; #endif static int __pyx_CyFunction_init(PyObject *module) { #if CYTHON_USE_TYPE_SPECS __pyx_CyFunctionType = __Pyx_FetchCommonTypeFromSpec(module, &__pyx_CyFunctionType_spec, NULL); #else CYTHON_UNUSED_VAR(module); __pyx_CyFunctionType = __Pyx_FetchCommonType(&__pyx_CyFunctionType_type); #endif if (unlikely(__pyx_CyFunctionType == NULL)) { return -1; } return 0; } static CYTHON_INLINE void *__Pyx_CyFunction_InitDefaults(PyObject *func, size_t size, int pyobjects) { __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func; m->defaults = PyObject_Malloc(size); if (unlikely(!m->defaults)) return PyErr_NoMemory(); memset(m->defaults, 0, size); m->defaults_pyobjects = pyobjects; m->defaults_size = size; return m->defaults; } 
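/* Editor's note: argument-default storage. __Pyx_CyFunction_InitDefaults above
   attaches one raw blob of `size` bytes to the function and records how many
   PyObject* slots it holds, so that traverse/clear can visit them; the setters
   below cache the tuple/dict views that Python observes as __defaults__ and
   __kwdefaults__. A rough usage sketch (hypothetical struct and `func`/`value`
   names, not verbatim generated code):

       struct my_defaults { PyObject *x; };              // one boxed default value
       struct my_defaults *d = (struct my_defaults *)
           __Pyx_CyFunction_InitDefaults(func, sizeof(struct my_defaults), 1);
       if (d != NULL) { Py_INCREF(value); d->x = value; }  // func now owns the reference
*/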
static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsTuple(PyObject *func, PyObject *tuple) { __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func; m->defaults_tuple = tuple; Py_INCREF(tuple); } static CYTHON_INLINE void __Pyx_CyFunction_SetDefaultsKwDict(PyObject *func, PyObject *dict) { __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func; m->defaults_kwdict = dict; Py_INCREF(dict); } static CYTHON_INLINE void __Pyx_CyFunction_SetAnnotationsDict(PyObject *func, PyObject *dict) { __pyx_CyFunctionObject *m = (__pyx_CyFunctionObject *) func; m->func_annotations = dict; Py_INCREF(dict); } /* CythonFunction */ static PyObject *__Pyx_CyFunction_New(PyMethodDef *ml, int flags, PyObject* qualname, PyObject *closure, PyObject *module, PyObject* globals, PyObject* code) { PyObject *op = __Pyx_CyFunction_Init( PyObject_GC_New(__pyx_CyFunctionObject, __pyx_CyFunctionType), ml, flags, qualname, closure, module, globals, code ); if (likely(op)) { PyObject_GC_Track(op); } return op; } /* CLineInTraceback */ #ifndef CYTHON_CLINE_IN_TRACEBACK static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line) { PyObject *use_cline; PyObject *ptype, *pvalue, *ptraceback; #if CYTHON_COMPILING_IN_CPYTHON PyObject **cython_runtime_dict; #endif CYTHON_MAYBE_UNUSED_VAR(tstate); if (unlikely(!__pyx_cython_runtime)) { return c_line; } __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); #if CYTHON_COMPILING_IN_CPYTHON cython_runtime_dict = _PyObject_GetDictPtr(__pyx_cython_runtime); if (likely(cython_runtime_dict)) { __PYX_PY_DICT_LOOKUP_IF_MODIFIED( use_cline, *cython_runtime_dict, __Pyx_PyDict_GetItemStr(*cython_runtime_dict, __pyx_n_s_cline_in_traceback)) } else #endif { PyObject *use_cline_obj = __Pyx_PyObject_GetAttrStrNoError(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback); if (use_cline_obj) { use_cline = PyObject_Not(use_cline_obj) ? 
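/* Editor's note: `cython_runtime.cline_in_traceback` is a user-visible switch,
   assignable from Python, that controls whether C source line numbers are
   injected into tracebacks; the ternary below normalizes whatever was assigned
   to a canonical Py_True/Py_False. When the attribute is absent, the code
   falls back to disabling C lines and caches Py_False for the next lookup. */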
Py_False : Py_True; Py_DECREF(use_cline_obj); } else { PyErr_Clear(); use_cline = NULL; } } if (!use_cline) { c_line = 0; (void) PyObject_SetAttr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback, Py_False); } else if (use_cline == Py_False || (use_cline != Py_True && PyObject_Not(use_cline) != 0)) { c_line = 0; } __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); return c_line; } #endif /* CodeObjectCache */ #if !CYTHON_COMPILING_IN_LIMITED_API static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line) { int start = 0, mid = 0, end = count - 1; if (end >= 0 && code_line > entries[end].code_line) { return count; } while (start < end) { mid = start + (end - start) / 2; if (code_line < entries[mid].code_line) { end = mid; } else if (code_line > entries[mid].code_line) { start = mid + 1; } else { return mid; } } if (code_line <= entries[mid].code_line) { return mid; } else { return mid + 1; } } static PyCodeObject *__pyx_find_code_object(int code_line) { PyCodeObject* code_object; int pos; if (unlikely(!code_line) || unlikely(!__pyx_code_cache.entries)) { return NULL; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if (unlikely(pos >= __pyx_code_cache.count) || unlikely(__pyx_code_cache.entries[pos].code_line != code_line)) { return NULL; } code_object = __pyx_code_cache.entries[pos].code_object; Py_INCREF(code_object); return code_object; } static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object) { int pos, i; __Pyx_CodeObjectCacheEntry* entries = __pyx_code_cache.entries; if (unlikely(!code_line)) { return; } if (unlikely(!entries)) { entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Malloc(64*sizeof(__Pyx_CodeObjectCacheEntry)); if (likely(entries)) { __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = 64; __pyx_code_cache.count = 1; entries[0].code_line = code_line; entries[0].code_object = code_object; Py_INCREF(code_object); } return; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if ((pos < __pyx_code_cache.count) && unlikely(__pyx_code_cache.entries[pos].code_line == code_line)) { PyCodeObject* tmp = entries[pos].code_object; entries[pos].code_object = code_object; Py_DECREF(tmp); return; } if (__pyx_code_cache.count == __pyx_code_cache.max_count) { int new_max = __pyx_code_cache.max_count + 64; entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Realloc( __pyx_code_cache.entries, ((size_t)new_max) * sizeof(__Pyx_CodeObjectCacheEntry)); if (unlikely(!entries)) { return; } __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = new_max; } for (i=__pyx_code_cache.count; i>pos; i--) { entries[i] = entries[i-1]; } entries[pos].code_line = code_line; entries[pos].code_object = code_object; __pyx_code_cache.count++; Py_INCREF(code_object); } #endif /* AddTraceback */ #include "compile.h" #include "frameobject.h" #include "traceback.h" #if PY_VERSION_HEX >= 0x030b00a6 && !CYTHON_COMPILING_IN_LIMITED_API #ifndef Py_BUILD_CORE #define Py_BUILD_CORE 1 #endif #include "internal/pycore_frame.h" #endif #if CYTHON_COMPILING_IN_LIMITED_API static PyObject *__Pyx_PyCode_Replace_For_AddTraceback(PyObject *code, PyObject *scratch_dict, PyObject *firstlineno, PyObject *name) { PyObject *replace = NULL; if (unlikely(PyDict_SetItemString(scratch_dict, "co_firstlineno", firstlineno))) return NULL; if (unlikely(PyDict_SetItemString(scratch_dict, "co_name", name))) return NULL; replace = PyObject_GetAttrString(code, "replace"); if 
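/* Editor's note: limited-API traceback fabrication. With no PyFrame_New
   available, __Pyx_AddTraceback below compiles the expression "_getframe()"
   under the desired filename, rewrites the resulting code object's
   co_firstlineno/co_name (via code.replace(), attempted here), then evaluates
   it with sys._getframe bound into the globals dict; the frame object that
   comes back carries the wanted location and is handed to PyTraceBack_Here.
   The sorted __pyx_code_cache above avoids re-creating code objects for lines
   that already raised once (binary search plus insertion, grown in blocks of
   64 entries). */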
(likely(replace)) { PyObject *result; result = PyObject_Call(replace, __pyx_empty_tuple, scratch_dict); Py_DECREF(replace); return result; } PyErr_Clear(); #if __PYX_LIMITED_VERSION_HEX < 0x030780000 { PyObject *compiled = NULL, *result = NULL; if (unlikely(PyDict_SetItemString(scratch_dict, "code", code))) return NULL; if (unlikely(PyDict_SetItemString(scratch_dict, "type", (PyObject*)(&PyType_Type)))) return NULL; compiled = Py_CompileString( "out = type(code)(\n" " code.co_argcount, code.co_kwonlyargcount, code.co_nlocals, code.co_stacksize,\n" " code.co_flags, code.co_code, code.co_consts, code.co_names,\n" " code.co_varnames, code.co_filename, co_name, co_firstlineno,\n" " code.co_lnotab)\n", "<dummy>", Py_file_input); if (!compiled) return NULL; result = PyEval_EvalCode(compiled, scratch_dict, scratch_dict); Py_DECREF(compiled); if (!result) PyErr_Print(); Py_XDECREF(result); result = PyDict_GetItemString(scratch_dict, "out"); if (result) Py_INCREF(result); return result; } #else return NULL; #endif } static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename) { PyObject *code_object = NULL, *py_py_line = NULL, *py_funcname = NULL, *dict = NULL; PyObject *replace = NULL, *getframe = NULL, *frame = NULL; PyObject *exc_type, *exc_value, *exc_traceback; int success = 0; if (c_line) { (void) __pyx_cfilenm; (void) __Pyx_CLineForTraceback(__Pyx_PyThreadState_Current, c_line); } PyErr_Fetch(&exc_type, &exc_value, &exc_traceback); code_object = Py_CompileString("_getframe()", filename, Py_eval_input); if (unlikely(!code_object)) goto bad; py_py_line = PyLong_FromLong(py_line); if (unlikely(!py_py_line)) goto bad; py_funcname = PyUnicode_FromString(funcname); if (unlikely(!py_funcname)) goto bad; dict = PyDict_New(); if (unlikely(!dict)) goto bad; { PyObject *old_code_object = code_object; code_object = __Pyx_PyCode_Replace_For_AddTraceback(code_object, dict, py_py_line, py_funcname); Py_DECREF(old_code_object); } if (unlikely(!code_object)) goto bad; getframe = PySys_GetObject("_getframe"); if (unlikely(!getframe)) goto bad; if (unlikely(PyDict_SetItemString(dict, "_getframe", getframe))) goto bad; frame = PyEval_EvalCode(code_object, dict, dict); if (unlikely(!frame) || frame == Py_None) goto bad; success = 1; bad: PyErr_Restore(exc_type, exc_value, exc_traceback); Py_XDECREF(code_object); Py_XDECREF(py_py_line); Py_XDECREF(py_funcname); Py_XDECREF(dict); Py_XDECREF(replace); if (success) { PyTraceBack_Here( (struct _frame*)frame); } Py_XDECREF(frame); } #else static PyCodeObject* __Pyx_CreateCodeObjectForTraceback( const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = NULL; PyObject *py_funcname = NULL; #if PY_MAJOR_VERSION < 3 PyObject *py_srcfile = NULL; py_srcfile = PyString_FromString(filename); if (!py_srcfile) goto bad; #endif if (c_line) { #if PY_MAJOR_VERSION < 3 py_funcname = PyString_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); if (!py_funcname) goto bad; #else py_funcname = PyUnicode_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); if (!py_funcname) goto bad; funcname = PyUnicode_AsUTF8(py_funcname); if (!funcname) goto bad; #endif } else { #if PY_MAJOR_VERSION < 3 py_funcname = PyString_FromString(funcname); if (!py_funcname) goto bad; #endif } #if PY_MAJOR_VERSION < 3 py_code = __Pyx_PyCode_New( 0, 0, 0, 0, 0, 0, __pyx_empty_bytes, /*PyObject *code,*/ __pyx_empty_tuple, /*PyObject *consts,*/ __pyx_empty_tuple, /*PyObject *names,*/ __pyx_empty_tuple, /*PyObject *varnames,*/
__pyx_empty_tuple, /*PyObject *freevars,*/ __pyx_empty_tuple, /*PyObject *cellvars,*/ py_srcfile, /*PyObject *filename,*/ py_funcname, /*PyObject *name,*/ py_line, __pyx_empty_bytes /*PyObject *lnotab*/ ); Py_DECREF(py_srcfile); #else py_code = PyCode_NewEmpty(filename, funcname, py_line); #endif Py_XDECREF(py_funcname); return py_code; bad: Py_XDECREF(py_funcname); #if PY_MAJOR_VERSION < 3 Py_XDECREF(py_srcfile); #endif return NULL; } static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyFrameObject *py_frame = 0; PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject *ptype, *pvalue, *ptraceback; if (c_line) { c_line = __Pyx_CLineForTraceback(tstate, c_line); } py_code = __pyx_find_code_object(c_line ? -c_line : py_line); if (!py_code) { __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); py_code = __Pyx_CreateCodeObjectForTraceback( funcname, c_line, py_line, filename); if (!py_code) { /* If the code object creation fails, then we should clear the fetched exception references and propagate the new exception */ Py_XDECREF(ptype); Py_XDECREF(pvalue); Py_XDECREF(ptraceback); goto bad; } __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); __pyx_insert_code_object(c_line ? -c_line : py_line, py_code); } py_frame = PyFrame_New( tstate, /*PyThreadState *tstate,*/ py_code, /*PyCodeObject *code,*/ __pyx_d, /*PyObject *globals,*/ 0 /*PyObject *locals*/ ); if (!py_frame) goto bad; __Pyx_PyFrame_SetLineNumber(py_frame, py_line); PyTraceBack_Here(py_frame); bad: Py_XDECREF(py_code); Py_XDECREF(py_frame); } #endif #if PY_MAJOR_VERSION < 3 static int __Pyx_GetBuffer(PyObject *obj, Py_buffer *view, int flags) { __Pyx_TypeName obj_type_name; if (PyObject_CheckBuffer(obj)) return PyObject_GetBuffer(obj, view, flags); obj_type_name = __Pyx_PyType_GetName(Py_TYPE(obj)); PyErr_Format(PyExc_TypeError, "'" __Pyx_FMT_TYPENAME "' does not have the buffer interface", obj_type_name); __Pyx_DECREF_TypeName(obj_type_name); return -1; } static void __Pyx_ReleaseBuffer(Py_buffer *view) { PyObject *obj = view->obj; if (!obj) return; if (PyObject_CheckBuffer(obj)) { PyBuffer_Release(view); return; } if ((0)) {} view->obj = NULL; Py_DECREF(obj); } #endif /* CIntFromPyVerify */ #define __PYX_VERIFY_RETURN_INT(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 0) #define __PYX_VERIFY_RETURN_INT_EXC(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 1) #define __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, exc)\ {\ func_type value = func_value;\ if (sizeof(target_type) < sizeof(func_type)) {\ if (unlikely(value != (func_type) (target_type) value)) {\ func_type zero = 0;\ if (exc && unlikely(value == (func_type)-1 && PyErr_Occurred()))\ return (target_type) -1;\ if (is_unsigned && unlikely(value < zero))\ goto raise_neg_overflow;\ else\ goto raise_overflow;\ }\ }\ return (target_type) value;\ } /* Declarations */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #ifdef __cplusplus static CYTHON_INLINE __pyx_t_float_complex __pyx_t_float_complex_from_parts(float x, float y) { return ::std::complex< float >(x, y); } #else static CYTHON_INLINE __pyx_t_float_complex __pyx_t_float_complex_from_parts(float x, float y) { return x + y*(__pyx_t_float_complex)_Complex_I; } #endif #else static CYTHON_INLINE __pyx_t_float_complex __pyx_t_float_complex_from_parts(float x, float y) { __pyx_t_float_complex z; z.real = x; z.imag = y; 
return z; } #endif /* Arithmetic */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #else static CYTHON_INLINE int __Pyx_c_eq_float(__pyx_t_float_complex a, __pyx_t_float_complex b) { return (a.real == b.real) && (a.imag == b.imag); } static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_sum_float(__pyx_t_float_complex a, __pyx_t_float_complex b) { __pyx_t_float_complex z; z.real = a.real + b.real; z.imag = a.imag + b.imag; return z; } static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_diff_float(__pyx_t_float_complex a, __pyx_t_float_complex b) { __pyx_t_float_complex z; z.real = a.real - b.real; z.imag = a.imag - b.imag; return z; } static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_prod_float(__pyx_t_float_complex a, __pyx_t_float_complex b) { __pyx_t_float_complex z; z.real = a.real * b.real - a.imag * b.imag; z.imag = a.real * b.imag + a.imag * b.real; return z; } #if 1 static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_quot_float(__pyx_t_float_complex a, __pyx_t_float_complex b) { if (b.imag == 0) { return __pyx_t_float_complex_from_parts(a.real / b.real, a.imag / b.real); } else if (fabsf(b.real) >= fabsf(b.imag)) { if (b.real == 0 && b.imag == 0) { return __pyx_t_float_complex_from_parts(a.real / b.real, a.imag / b.imag); } else { float r = b.imag / b.real; float s = (float)(1.0) / (b.real + b.imag * r); return __pyx_t_float_complex_from_parts( (a.real + a.imag * r) * s, (a.imag - a.real * r) * s); } } else { float r = b.real / b.imag; float s = (float)(1.0) / (b.imag + b.real * r); return __pyx_t_float_complex_from_parts( (a.real * r + a.imag) * s, (a.imag * r - a.real) * s); } } #else static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_quot_float(__pyx_t_float_complex a, __pyx_t_float_complex b) { if (b.imag == 0) { return __pyx_t_float_complex_from_parts(a.real / b.real, a.imag / b.real); } else { float denom = b.real * b.real + b.imag * b.imag; return __pyx_t_float_complex_from_parts( (a.real * b.real + a.imag * b.imag) / denom, (a.imag * b.real - a.real * b.imag) / denom); } } #endif static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_neg_float(__pyx_t_float_complex a) { __pyx_t_float_complex z; z.real = -a.real; z.imag = -a.imag; return z; } static CYTHON_INLINE int __Pyx_c_is_zero_float(__pyx_t_float_complex a) { return (a.real == 0) && (a.imag == 0); } static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_conj_float(__pyx_t_float_complex a) { __pyx_t_float_complex z; z.real = a.real; z.imag = -a.imag; return z; } #if 1 static CYTHON_INLINE float __Pyx_c_abs_float(__pyx_t_float_complex z) { #if !defined(HAVE_HYPOT) || defined(_MSC_VER) return sqrtf(z.real*z.real + z.imag*z.imag); #else return hypotf(z.real, z.imag); #endif } static CYTHON_INLINE __pyx_t_float_complex __Pyx_c_pow_float(__pyx_t_float_complex a, __pyx_t_float_complex b) { __pyx_t_float_complex z; float r, lnr, theta, z_r, z_theta; if (b.imag == 0 && b.real == (int)b.real) { if (b.real < 0) { float denom = a.real * a.real + a.imag * a.imag; a.real = a.real / denom; a.imag = -a.imag / denom; b.real = -b.real; } switch ((int)b.real) { case 0: z.real = 1; z.imag = 0; return z; case 1: return a; case 2: return __Pyx_c_prod_float(a, a); case 3: z = __Pyx_c_prod_float(a, a); return __Pyx_c_prod_float(z, a); case 4: z = __Pyx_c_prod_float(a, a); return __Pyx_c_prod_float(z, z); } } if (a.imag == 0) { if (a.real == 0) { return a; } else if ((b.imag == 0) && (a.real >= 0)) { z.real = powf(a.real, b.real); z.imag = 0; return z; } else if (a.real > 0) { r = a.real; theta = 0; } else { r = -a.real; theta = atan2f(0.0, -1.0); } } 
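/* Editor's note: general case of complex pow, in polar form. With
   a = r*e^(i*theta) and b = br + i*bi, using ln(a) = ln(r) + i*theta:
       a**b = exp(b*ln(a))
            = exp(lnr*br - theta*bi) * (cos(theta*br + lnr*bi) + i*sin(theta*br + lnr*bi))
   which is exactly the z_r/z_theta pair computed below with expf/cosf/sinf.
   The branches above fast-path small integer exponents (0..4, as repeated
   products) and real, non-negative bases (plain powf) to avoid the log/exp
   round trip. */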
else { r = __Pyx_c_abs_float(a); theta = atan2f(a.imag, a.real); } lnr = logf(r); z_r = expf(lnr * b.real - theta * b.imag); z_theta = theta * b.real + lnr * b.imag; z.real = z_r * cosf(z_theta); z.imag = z_r * sinf(z_theta); return z; } #endif #endif /* Declarations */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #ifdef __cplusplus static CYTHON_INLINE __pyx_t_double_complex __pyx_t_double_complex_from_parts(double x, double y) { return ::std::complex< double >(x, y); } #else static CYTHON_INLINE __pyx_t_double_complex __pyx_t_double_complex_from_parts(double x, double y) { return x + y*(__pyx_t_double_complex)_Complex_I; } #endif #else static CYTHON_INLINE __pyx_t_double_complex __pyx_t_double_complex_from_parts(double x, double y) { __pyx_t_double_complex z; z.real = x; z.imag = y; return z; } #endif /* Arithmetic */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #else static CYTHON_INLINE int __Pyx_c_eq_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { return (a.real == b.real) && (a.imag == b.imag); } static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_sum_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { __pyx_t_double_complex z; z.real = a.real + b.real; z.imag = a.imag + b.imag; return z; } static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_diff_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { __pyx_t_double_complex z; z.real = a.real - b.real; z.imag = a.imag - b.imag; return z; } static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_prod_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { __pyx_t_double_complex z; z.real = a.real * b.real - a.imag * b.imag; z.imag = a.real * b.imag + a.imag * b.real; return z; } #if 1 static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_quot_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { if (b.imag == 0) { return __pyx_t_double_complex_from_parts(a.real / b.real, a.imag / b.real); } else if (fabs(b.real) >= fabs(b.imag)) { if (b.real == 0 && b.imag == 0) { return __pyx_t_double_complex_from_parts(a.real / b.real, a.imag / b.imag); } else { double r = b.imag / b.real; double s = (double)(1.0) / (b.real + b.imag * r); return __pyx_t_double_complex_from_parts( (a.real + a.imag * r) * s, (a.imag - a.real * r) * s); } } else { double r = b.real / b.imag; double s = (double)(1.0) / (b.imag + b.real * r); return __pyx_t_double_complex_from_parts( (a.real * r + a.imag) * s, (a.imag * r - a.real) * s); } } #else static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_quot_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { if (b.imag == 0) { return __pyx_t_double_complex_from_parts(a.real / b.real, a.imag / b.real); } else { double denom = b.real * b.real + b.imag * b.imag; return __pyx_t_double_complex_from_parts( (a.real * b.real + a.imag * b.imag) / denom, (a.imag * b.real - a.real * b.imag) / denom); } } #endif static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_neg_double(__pyx_t_double_complex a) { __pyx_t_double_complex z; z.real = -a.real; z.imag = -a.imag; return z; } static CYTHON_INLINE int __Pyx_c_is_zero_double(__pyx_t_double_complex a) { return (a.real == 0) && (a.imag == 0); } static CYTHON_INLINE __pyx_t_double_complex __Pyx_c_conj_double(__pyx_t_double_complex a) { __pyx_t_double_complex z; z.real = a.real; z.imag = -a.imag; return z; } #if 1 static CYTHON_INLINE double __Pyx_c_abs_double(__pyx_t_double_complex z) { #if !defined(HAVE_HYPOT) || defined(_MSC_VER) return sqrt(z.real*z.real + z.imag*z.imag); #else return hypot(z.real, z.imag); #endif } static CYTHON_INLINE 
__pyx_t_double_complex __Pyx_c_pow_double(__pyx_t_double_complex a, __pyx_t_double_complex b) { __pyx_t_double_complex z; double r, lnr, theta, z_r, z_theta; if (b.imag == 0 && b.real == (int)b.real) { if (b.real < 0) { double denom = a.real * a.real + a.imag * a.imag; a.real = a.real / denom; a.imag = -a.imag / denom; b.real = -b.real; } switch ((int)b.real) { case 0: z.real = 1; z.imag = 0; return z; case 1: return a; case 2: return __Pyx_c_prod_double(a, a); case 3: z = __Pyx_c_prod_double(a, a); return __Pyx_c_prod_double(z, a); case 4: z = __Pyx_c_prod_double(a, a); return __Pyx_c_prod_double(z, z); } } if (a.imag == 0) { if (a.real == 0) { return a; } else if ((b.imag == 0) && (a.real >= 0)) { z.real = pow(a.real, b.real); z.imag = 0; return z; } else if (a.real > 0) { r = a.real; theta = 0; } else { r = -a.real; theta = atan2(0.0, -1.0); } } else { r = __Pyx_c_abs_double(a); theta = atan2(a.imag, a.real); } lnr = log(r); z_r = exp(lnr * b.real - theta * b.imag); z_theta = theta * b.real + lnr * b.imag; z.real = z_r * cos(z_theta); z.imag = z_r * sin(z_theta); return z; } #endif #endif /* Declarations */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #ifdef __cplusplus static CYTHON_INLINE __pyx_t_long_double_complex __pyx_t_long_double_complex_from_parts(long double x, long double y) { return ::std::complex< long double >(x, y); } #else static CYTHON_INLINE __pyx_t_long_double_complex __pyx_t_long_double_complex_from_parts(long double x, long double y) { return x + y*(__pyx_t_long_double_complex)_Complex_I; } #endif #else static CYTHON_INLINE __pyx_t_long_double_complex __pyx_t_long_double_complex_from_parts(long double x, long double y) { __pyx_t_long_double_complex z; z.real = x; z.imag = y; return z; } #endif /* Arithmetic */ #if CYTHON_CCOMPLEX && (1) && (!0 || __cplusplus) #else static CYTHON_INLINE int __Pyx_c_eq_long__double(__pyx_t_long_double_complex a, __pyx_t_long_double_complex b) { return (a.real == b.real) && (a.imag == b.imag); } static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_sum_long__double(__pyx_t_long_double_complex a, __pyx_t_long_double_complex b) { __pyx_t_long_double_complex z; z.real = a.real + b.real; z.imag = a.imag + b.imag; return z; } static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_diff_long__double(__pyx_t_long_double_complex a, __pyx_t_long_double_complex b) { __pyx_t_long_double_complex z; z.real = a.real - b.real; z.imag = a.imag - b.imag; return z; } static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_prod_long__double(__pyx_t_long_double_complex a, __pyx_t_long_double_complex b) { __pyx_t_long_double_complex z; z.real = a.real * b.real - a.imag * b.imag; z.imag = a.real * b.imag + a.imag * b.real; return z; } #if 1 static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_quot_long__double(__pyx_t_long_double_complex a, __pyx_t_long_double_complex b) { if (b.imag == 0) { return __pyx_t_long_double_complex_from_parts(a.real / b.real, a.imag / b.real); } else if (fabsl(b.real) >= fabsl(b.imag)) { if (b.real == 0 && b.imag == 0) { return __pyx_t_long_double_complex_from_parts(a.real / b.real, a.imag / b.imag); } else { long double r = b.imag / b.real; long double s = (long double)(1.0) / (b.real + b.imag * r); return __pyx_t_long_double_complex_from_parts( (a.real + a.imag * r) * s, (a.imag - a.real * r) * s); } } else { long double r = b.real / b.imag; long double s = (long double)(1.0) / (b.imag + b.real * r); return __pyx_t_long_double_complex_from_parts( (a.real * r + a.imag) * s, (a.imag * r - a.real) * s); } } #else 
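/* Editor's note: this disabled (#else) variant divides by the naive formula
   a/b = a*conj(b) / (b.real^2 + b.imag^2), which is shorter but can overflow
   or underflow when |b| is near the extremes of the type. The enabled branch
   above instead scales by the larger of |b.real| and |b.imag| (the approach
   commonly attributed to Smith), keeping the intermediate quotients near unit
   magnitude. */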
static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_quot_long__double(__pyx_t_long_double_complex a, __pyx_t_long_double_complex b) { if (b.imag == 0) { return __pyx_t_long_double_complex_from_parts(a.real / b.real, a.imag / b.real); } else { long double denom = b.real * b.real + b.imag * b.imag; return __pyx_t_long_double_complex_from_parts( (a.real * b.real + a.imag * b.imag) / denom, (a.imag * b.real - a.real * b.imag) / denom); } } #endif static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_neg_long__double(__pyx_t_long_double_complex a) { __pyx_t_long_double_complex z; z.real = -a.real; z.imag = -a.imag; return z; } static CYTHON_INLINE int __Pyx_c_is_zero_long__double(__pyx_t_long_double_complex a) { return (a.real == 0) && (a.imag == 0); } static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_conj_long__double(__pyx_t_long_double_complex a) { __pyx_t_long_double_complex z; z.real = a.real; z.imag = -a.imag; return z; } #if 1 static CYTHON_INLINE long double __Pyx_c_abs_long__double(__pyx_t_long_double_complex z) { #if !defined(HAVE_HYPOT) || defined(_MSC_VER) return sqrtl(z.real*z.real + z.imag*z.imag); #else return hypotl(z.real, z.imag); #endif } static CYTHON_INLINE __pyx_t_long_double_complex __Pyx_c_pow_long__double(__pyx_t_long_double_complex a, __pyx_t_long_double_complex b) { __pyx_t_long_double_complex z; long double r, lnr, theta, z_r, z_theta; if (b.imag == 0 && b.real == (int)b.real) { if (b.real < 0) { long double denom = a.real * a.real + a.imag * a.imag; a.real = a.real / denom; a.imag = -a.imag / denom; b.real = -b.real; } switch ((int)b.real) { case 0: z.real = 1; z.imag = 0; return z; case 1: return a; case 2: return __Pyx_c_prod_long__double(a, a); case 3: z = __Pyx_c_prod_long__double(a, a); return __Pyx_c_prod_long__double(z, a); case 4: z = __Pyx_c_prod_long__double(a, a); return __Pyx_c_prod_long__double(z, z); } } if (a.imag == 0) { if (a.real == 0) { return a; } else if ((b.imag == 0) && (a.real >= 0)) { z.real = powl(a.real, b.real); z.imag = 0; return z; } else if (a.real > 0) { r = a.real; theta = 0; } else { r = -a.real; theta = atan2l(0.0, -1.0); } } else { r = __Pyx_c_abs_long__double(a); theta = atan2l(a.imag, a.real); } lnr = logl(r); z_r = expl(lnr * b.real - theta * b.imag); z_theta = theta * b.real + lnr * b.imag; z.real = z_r * cosl(z_theta); z.imag = z_r * sinl(z_theta); return z; } #endif #endif /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_unsigned_int(unsigned int value) { #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic push #pragma GCC diagnostic ignored "-Wconversion" #endif const unsigned int neg_one = (unsigned int) -1, const_zero = (unsigned int) 0; #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic pop #endif const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(unsigned int) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(unsigned int) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(unsigned int) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(unsigned int) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(unsigned int) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { unsigned char *bytes = (unsigned char *)&value; #if !CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX >= 0x030d00A4 if (is_unsigned) { return 
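/* Editor's note: byte-level fallback for C ints that do not fit the direct
   PyInt/PyLong constructors tried above. Three tiers, cheapest available
   first: the PyLong_From*NativeBytes API on CPython >= 3.13 (this branch),
   the private _PyLong_FromByteArray on older non-limited builds, and finally
   a pure Python-API path that calls int.from_bytes(value_bytes, byteorder,
   signed=...) through attribute lookup on PyLong_Type. Endianness is probed
   at run time with the `one`/`little` trick used below. */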
PyLong_FromUnsignedNativeBytes(bytes, sizeof(value), -1); } else { return PyLong_FromNativeBytes(bytes, sizeof(value), -1); } #elif !CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030d0000 int one = 1; int little = (int)*(unsigned char *)&one; return _PyLong_FromByteArray(bytes, sizeof(unsigned int), little, !is_unsigned); #else int one = 1; int little = (int)*(unsigned char *)&one; PyObject *from_bytes, *result = NULL; PyObject *py_bytes = NULL, *arg_tuple = NULL, *kwds = NULL, *order_str = NULL; from_bytes = PyObject_GetAttrString((PyObject*)&PyLong_Type, "from_bytes"); if (!from_bytes) return NULL; py_bytes = PyBytes_FromStringAndSize((char*)bytes, sizeof(unsigned int)); if (!py_bytes) goto limited_bad; order_str = PyUnicode_FromString(little ? "little" : "big"); if (!order_str) goto limited_bad; arg_tuple = PyTuple_Pack(2, py_bytes, order_str); if (!arg_tuple) goto limited_bad; if (!is_unsigned) { kwds = PyDict_New(); if (!kwds) goto limited_bad; if (PyDict_SetItemString(kwds, "signed", __Pyx_NewRef(Py_True))) goto limited_bad; } result = PyObject_Call(from_bytes, arg_tuple, kwds); limited_bad: Py_XDECREF(kwds); Py_XDECREF(arg_tuple); Py_XDECREF(order_str); Py_XDECREF(py_bytes); Py_XDECREF(from_bytes); return result; #endif } } /* CIntFromPy */ static CYTHON_INLINE unsigned int __Pyx_PyInt_As_unsigned_int(PyObject *x) { #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic push #pragma GCC diagnostic ignored "-Wconversion" #endif const unsigned int neg_one = (unsigned int) -1, const_zero = (unsigned int) 0; #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic pop #endif const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if ((sizeof(unsigned int) < sizeof(long))) { __PYX_VERIFY_RETURN_INT(unsigned int, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (unsigned int) val; } } #endif if (unlikely(!PyLong_Check(x))) { unsigned int val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (unsigned int) -1; val = __Pyx_PyInt_As_unsigned_int(tmp); Py_DECREF(tmp); return val; } if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS if (unlikely(__Pyx_PyLong_IsNeg(x))) { goto raise_neg_overflow; } else if (__Pyx_PyLong_IsCompact(x)) { __PYX_VERIFY_RETURN_INT(unsigned int, __Pyx_compact_upylong, __Pyx_PyLong_CompactValueUnsigned(x)) } else { const digit* digits = __Pyx_PyLong_Digits(x); assert(__Pyx_PyLong_DigitCount(x) > 1); switch (__Pyx_PyLong_DigitCount(x)) { case 2: if ((8 * sizeof(unsigned int) > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) >= 2 * PyLong_SHIFT)) { return (unsigned int) (((((unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0])); } } break; case 3: if ((8 * sizeof(unsigned int) > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) >= 3 * PyLong_SHIFT)) { return (unsigned int) (((((((unsigned int)digits[2]) << PyLong_SHIFT) | (unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0])); } } break; case 4: if ((8 * sizeof(unsigned int) > 3 * PyLong_SHIFT)) { if ((8 * 
sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) >= 4 * PyLong_SHIFT)) { return (unsigned int) (((((((((unsigned int)digits[3]) << PyLong_SHIFT) | (unsigned int)digits[2]) << PyLong_SHIFT) | (unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0])); } } break; } } #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A7 if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (unsigned int) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if ((sizeof(unsigned int) <= sizeof(unsigned long))) { __PYX_VERIFY_RETURN_INT_EXC(unsigned int, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if ((sizeof(unsigned int) <= sizeof(unsigned PY_LONG_LONG))) { __PYX_VERIFY_RETURN_INT_EXC(unsigned int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS if (__Pyx_PyLong_IsCompact(x)) { __PYX_VERIFY_RETURN_INT(unsigned int, __Pyx_compact_pylong, __Pyx_PyLong_CompactValue(x)) } else { const digit* digits = __Pyx_PyLong_Digits(x); assert(__Pyx_PyLong_DigitCount(x) > 1); switch (__Pyx_PyLong_SignedDigitCount(x)) { case -2: if ((8 * sizeof(unsigned int) - 1 > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) - 1 > 2 * PyLong_SHIFT)) { return (unsigned int) (((unsigned int)-1)*(((((unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0]))); } } break; case 2: if ((8 * sizeof(unsigned int) > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) - 1 > 2 * PyLong_SHIFT)) { return (unsigned int) ((((((unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0]))); } } break; case -3: if ((8 * sizeof(unsigned int) - 1 > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) - 1 > 3 * PyLong_SHIFT)) { return (unsigned int) (((unsigned int)-1)*(((((((unsigned int)digits[2]) << PyLong_SHIFT) | (unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0]))); } } break; case 3: if ((8 * sizeof(unsigned int) > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) - 1 > 3 * PyLong_SHIFT)) { return (unsigned int) ((((((((unsigned int)digits[2]) << PyLong_SHIFT) | (unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0]))); } } break; case -4: if ((8 * sizeof(unsigned int) - 1 > 3 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, long, 
-(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) - 1 > 4 * PyLong_SHIFT)) { return (unsigned int) (((unsigned int)-1)*(((((((((unsigned int)digits[3]) << PyLong_SHIFT) | (unsigned int)digits[2]) << PyLong_SHIFT) | (unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0]))); } } break; case 4: if ((8 * sizeof(unsigned int) > 3 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(unsigned int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(unsigned int) - 1 > 4 * PyLong_SHIFT)) { return (unsigned int) ((((((((((unsigned int)digits[3]) << PyLong_SHIFT) | (unsigned int)digits[2]) << PyLong_SHIFT) | (unsigned int)digits[1]) << PyLong_SHIFT) | (unsigned int)digits[0]))); } } break; } } #endif if ((sizeof(unsigned int) <= sizeof(long))) { __PYX_VERIFY_RETURN_INT_EXC(unsigned int, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if ((sizeof(unsigned int) <= sizeof(PY_LONG_LONG))) { __PYX_VERIFY_RETURN_INT_EXC(unsigned int, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { unsigned int val; int ret = -1; #if PY_VERSION_HEX >= 0x030d00A6 && !CYTHON_COMPILING_IN_LIMITED_API Py_ssize_t bytes_copied = PyLong_AsNativeBytes( x, &val, sizeof(val), Py_ASNATIVEBYTES_NATIVE_ENDIAN | (is_unsigned ? Py_ASNATIVEBYTES_UNSIGNED_BUFFER | Py_ASNATIVEBYTES_REJECT_NEGATIVE : 0)); if (unlikely(bytes_copied == -1)) { } else if (unlikely(bytes_copied > (Py_ssize_t) sizeof(val))) { goto raise_overflow; } else { ret = 0; } #elif PY_VERSION_HEX < 0x030d0000 && !(CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API) || defined(_PyLong_AsByteArray) int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; ret = _PyLong_AsByteArray((PyLongObject *)x, bytes, sizeof(val), is_little, !is_unsigned); #else PyObject *v; PyObject *stepval = NULL, *mask = NULL, *shift = NULL; int bits, remaining_bits, is_negative = 0; int chunk_size = (sizeof(long) < 8) ? 
30 : 62; if (likely(PyLong_CheckExact(x))) { v = __Pyx_NewRef(x); } else { v = PyNumber_Long(x); if (unlikely(!v)) return (unsigned int) -1; assert(PyLong_CheckExact(v)); } { int result = PyObject_RichCompareBool(v, Py_False, Py_LT); if (unlikely(result < 0)) { Py_DECREF(v); return (unsigned int) -1; } is_negative = result == 1; } if (is_unsigned && unlikely(is_negative)) { Py_DECREF(v); goto raise_neg_overflow; } else if (is_negative) { stepval = PyNumber_Invert(v); Py_DECREF(v); if (unlikely(!stepval)) return (unsigned int) -1; } else { stepval = v; } v = NULL; val = (unsigned int) 0; mask = PyLong_FromLong((1L << chunk_size) - 1); if (unlikely(!mask)) goto done; shift = PyLong_FromLong(chunk_size); if (unlikely(!shift)) goto done; for (bits = 0; bits < (int) sizeof(unsigned int) * 8 - chunk_size; bits += chunk_size) { PyObject *tmp, *digit; long idigit; digit = PyNumber_And(stepval, mask); if (unlikely(!digit)) goto done; idigit = PyLong_AsLong(digit); Py_DECREF(digit); if (unlikely(idigit < 0)) goto done; val |= ((unsigned int) idigit) << bits; tmp = PyNumber_Rshift(stepval, shift); if (unlikely(!tmp)) goto done; Py_DECREF(stepval); stepval = tmp; } Py_DECREF(shift); shift = NULL; Py_DECREF(mask); mask = NULL; { long idigit = PyLong_AsLong(stepval); if (unlikely(idigit < 0)) goto done; remaining_bits = ((int) sizeof(unsigned int) * 8) - bits - (is_unsigned ? 0 : 1); if (unlikely(idigit >= (1L << remaining_bits))) goto raise_overflow; val |= ((unsigned int) idigit) << bits; } if (!is_unsigned) { if (unlikely(val & (((unsigned int) 1) << (sizeof(unsigned int) * 8 - 1)))) goto raise_overflow; if (is_negative) val = ~val; } ret = 0; done: Py_XDECREF(shift); Py_XDECREF(mask); Py_XDECREF(stepval); #endif if (unlikely(ret)) return (unsigned int) -1; return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to unsigned int"); return (unsigned int) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to unsigned int"); return (unsigned int) -1; } /* FormatTypeName */ #if CYTHON_COMPILING_IN_LIMITED_API static __Pyx_TypeName __Pyx_PyType_GetName(PyTypeObject* tp) { PyObject *name = __Pyx_PyObject_GetAttrStr((PyObject *)tp, __pyx_n_s_name); if (unlikely(name == NULL) || unlikely(!PyUnicode_Check(name))) { PyErr_Clear(); Py_XDECREF(name); name = __Pyx_NewRef(__pyx_n_s__10); } return name; } #endif /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value) { #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic push #pragma GCC diagnostic ignored "-Wconversion" #endif const long neg_one = (long) -1, const_zero = (long) 0; #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic pop #endif const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(long) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(long) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(long) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { unsigned char *bytes = (unsigned char *)&value; #if !CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX >= 0x030d00A4 if (is_unsigned) { return PyLong_FromUnsignedNativeBytes(bytes, sizeof(value), -1); } else { 
return PyLong_FromNativeBytes(bytes, sizeof(value), -1); } #elif !CYTHON_COMPILING_IN_LIMITED_API && PY_VERSION_HEX < 0x030d0000 int one = 1; int little = (int)*(unsigned char *)&one; return _PyLong_FromByteArray(bytes, sizeof(long), little, !is_unsigned); #else int one = 1; int little = (int)*(unsigned char *)&one; PyObject *from_bytes, *result = NULL; PyObject *py_bytes = NULL, *arg_tuple = NULL, *kwds = NULL, *order_str = NULL; from_bytes = PyObject_GetAttrString((PyObject*)&PyLong_Type, "from_bytes"); if (!from_bytes) return NULL; py_bytes = PyBytes_FromStringAndSize((char*)bytes, sizeof(long)); if (!py_bytes) goto limited_bad; order_str = PyUnicode_FromString(little ? "little" : "big"); if (!order_str) goto limited_bad; arg_tuple = PyTuple_Pack(2, py_bytes, order_str); if (!arg_tuple) goto limited_bad; if (!is_unsigned) { kwds = PyDict_New(); if (!kwds) goto limited_bad; if (PyDict_SetItemString(kwds, "signed", __Pyx_NewRef(Py_True))) goto limited_bad; } result = PyObject_Call(from_bytes, arg_tuple, kwds); limited_bad: Py_XDECREF(kwds); Py_XDECREF(arg_tuple); Py_XDECREF(order_str); Py_XDECREF(py_bytes); Py_XDECREF(from_bytes); return result; #endif } } /* CIntFromPy */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *x) { #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic push #pragma GCC diagnostic ignored "-Wconversion" #endif const long neg_one = (long) -1, const_zero = (long) 0; #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic pop #endif const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if ((sizeof(long) < sizeof(long))) { __PYX_VERIFY_RETURN_INT(long, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (long) val; } } #endif if (unlikely(!PyLong_Check(x))) { long val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (long) -1; val = __Pyx_PyInt_As_long(tmp); Py_DECREF(tmp); return val; } if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS if (unlikely(__Pyx_PyLong_IsNeg(x))) { goto raise_neg_overflow; } else if (__Pyx_PyLong_IsCompact(x)) { __PYX_VERIFY_RETURN_INT(long, __Pyx_compact_upylong, __Pyx_PyLong_CompactValueUnsigned(x)) } else { const digit* digits = __Pyx_PyLong_Digits(x); assert(__Pyx_PyLong_DigitCount(x) > 1); switch (__Pyx_PyLong_DigitCount(x)) { case 2: if ((8 * sizeof(long) > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) >= 2 * PyLong_SHIFT)) { return (long) (((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 3: if ((8 * sizeof(long) > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) >= 3 * PyLong_SHIFT)) { return (long) (((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 4: if ((8 * sizeof(long) > 3 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) >= 4 * 
PyLong_SHIFT)) { return (long) (((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; } } #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A7 if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (long) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if ((sizeof(long) <= sizeof(unsigned long))) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if ((sizeof(long) <= sizeof(unsigned PY_LONG_LONG))) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS if (__Pyx_PyLong_IsCompact(x)) { __PYX_VERIFY_RETURN_INT(long, __Pyx_compact_pylong, __Pyx_PyLong_CompactValue(x)) } else { const digit* digits = __Pyx_PyLong_Digits(x); assert(__Pyx_PyLong_DigitCount(x) > 1); switch (__Pyx_PyLong_SignedDigitCount(x)) { case -2: if ((8 * sizeof(long) - 1 > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) { return (long) (((long)-1)*(((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 2: if ((8 * sizeof(long) > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) { return (long) ((((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -3: if ((8 * sizeof(long) - 1 > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) { return (long) (((long)-1)*(((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 3: if ((8 * sizeof(long) > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) { return (long) ((((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -4: if ((8 * sizeof(long) - 1 > 3 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) - 1 > 4 * PyLong_SHIFT)) { return (long) (((long)-1)*(((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 4: if ((8 * sizeof(long) > 3 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << 
PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(long) - 1 > 4 * PyLong_SHIFT)) { return (long) ((((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; } } #endif if ((sizeof(long) <= sizeof(long))) { __PYX_VERIFY_RETURN_INT_EXC(long, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if ((sizeof(long) <= sizeof(PY_LONG_LONG))) { __PYX_VERIFY_RETURN_INT_EXC(long, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { long val; int ret = -1; #if PY_VERSION_HEX >= 0x030d00A6 && !CYTHON_COMPILING_IN_LIMITED_API Py_ssize_t bytes_copied = PyLong_AsNativeBytes( x, &val, sizeof(val), Py_ASNATIVEBYTES_NATIVE_ENDIAN | (is_unsigned ? Py_ASNATIVEBYTES_UNSIGNED_BUFFER | Py_ASNATIVEBYTES_REJECT_NEGATIVE : 0)); if (unlikely(bytes_copied == -1)) { } else if (unlikely(bytes_copied > (Py_ssize_t) sizeof(val))) { goto raise_overflow; } else { ret = 0; } #elif PY_VERSION_HEX < 0x030d0000 && !(CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API) || defined(_PyLong_AsByteArray) int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; ret = _PyLong_AsByteArray((PyLongObject *)x, bytes, sizeof(val), is_little, !is_unsigned); #else PyObject *v; PyObject *stepval = NULL, *mask = NULL, *shift = NULL; int bits, remaining_bits, is_negative = 0; int chunk_size = (sizeof(long) < 8) ? 30 : 62; if (likely(PyLong_CheckExact(x))) { v = __Pyx_NewRef(x); } else { v = PyNumber_Long(x); if (unlikely(!v)) return (long) -1; assert(PyLong_CheckExact(v)); } { int result = PyObject_RichCompareBool(v, Py_False, Py_LT); if (unlikely(result < 0)) { Py_DECREF(v); return (long) -1; } is_negative = result == 1; } if (is_unsigned && unlikely(is_negative)) { Py_DECREF(v); goto raise_neg_overflow; } else if (is_negative) { stepval = PyNumber_Invert(v); Py_DECREF(v); if (unlikely(!stepval)) return (long) -1; } else { stepval = v; } v = NULL; val = (long) 0; mask = PyLong_FromLong((1L << chunk_size) - 1); if (unlikely(!mask)) goto done; shift = PyLong_FromLong(chunk_size); if (unlikely(!shift)) goto done; for (bits = 0; bits < (int) sizeof(long) * 8 - chunk_size; bits += chunk_size) { PyObject *tmp, *digit; long idigit; digit = PyNumber_And(stepval, mask); if (unlikely(!digit)) goto done; idigit = PyLong_AsLong(digit); Py_DECREF(digit); if (unlikely(idigit < 0)) goto done; val |= ((long) idigit) << bits; tmp = PyNumber_Rshift(stepval, shift); if (unlikely(!tmp)) goto done; Py_DECREF(stepval); stepval = tmp; } Py_DECREF(shift); shift = NULL; Py_DECREF(mask); mask = NULL; { long idigit = PyLong_AsLong(stepval); if (unlikely(idigit < 0)) goto done; remaining_bits = ((int) sizeof(long) * 8) - bits - (is_unsigned ? 
0 : 1); if (unlikely(idigit >= (1L << remaining_bits))) goto raise_overflow; val |= ((long) idigit) << bits; } if (!is_unsigned) { if (unlikely(val & (((long) 1) << (sizeof(long) * 8 - 1)))) goto raise_overflow; if (is_negative) val = ~val; } ret = 0; done: Py_XDECREF(shift); Py_XDECREF(mask); Py_XDECREF(stepval); #endif if (unlikely(ret)) return (long) -1; return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to long"); return (long) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to long"); return (long) -1; } /* CIntFromPy */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *x) { #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic push #pragma GCC diagnostic ignored "-Wconversion" #endif const int neg_one = (int) -1, const_zero = (int) 0; #ifdef __Pyx_HAS_GCC_DIAGNOSTIC #pragma GCC diagnostic pop #endif const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if ((sizeof(int) < sizeof(long))) { __PYX_VERIFY_RETURN_INT(int, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (int) val; } } #endif if (unlikely(!PyLong_Check(x))) { int val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (int) -1; val = __Pyx_PyInt_As_int(tmp); Py_DECREF(tmp); return val; } if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS if (unlikely(__Pyx_PyLong_IsNeg(x))) { goto raise_neg_overflow; } else if (__Pyx_PyLong_IsCompact(x)) { __PYX_VERIFY_RETURN_INT(int, __Pyx_compact_upylong, __Pyx_PyLong_CompactValueUnsigned(x)) } else { const digit* digits = __Pyx_PyLong_Digits(x); assert(__Pyx_PyLong_DigitCount(x) > 1); switch (__Pyx_PyLong_DigitCount(x)) { case 2: if ((8 * sizeof(int) > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) >= 2 * PyLong_SHIFT)) { return (int) (((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 3: if ((8 * sizeof(int) > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) >= 3 * PyLong_SHIFT)) { return (int) (((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 4: if ((8 * sizeof(int) > 3 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) >= 4 * PyLong_SHIFT)) { return (int) (((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; } } #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030C00A7 if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (int) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if ((sizeof(int) <= sizeof(unsigned long))) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef 
HAVE_LONG_LONG } else if ((sizeof(int) <= sizeof(unsigned PY_LONG_LONG))) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS if (__Pyx_PyLong_IsCompact(x)) { __PYX_VERIFY_RETURN_INT(int, __Pyx_compact_pylong, __Pyx_PyLong_CompactValue(x)) } else { const digit* digits = __Pyx_PyLong_Digits(x); assert(__Pyx_PyLong_DigitCount(x) > 1); switch (__Pyx_PyLong_SignedDigitCount(x)) { case -2: if ((8 * sizeof(int) - 1 > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) { return (int) (((int)-1)*(((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 2: if ((8 * sizeof(int) > 1 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 2 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) { return (int) ((((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -3: if ((8 * sizeof(int) - 1 > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) { return (int) (((int)-1)*(((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 3: if ((8 * sizeof(int) > 2 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 3 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) { return (int) ((((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -4: if ((8 * sizeof(int) - 1 > 3 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) - 1 > 4 * PyLong_SHIFT)) { return (int) (((int)-1)*(((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 4: if ((8 * sizeof(int) > 3 * PyLong_SHIFT)) { if ((8 * sizeof(unsigned long) > 4 * PyLong_SHIFT)) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if ((8 * sizeof(int) - 1 > 4 * PyLong_SHIFT)) { return (int) ((((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; } } #endif if ((sizeof(int) <= sizeof(long))) { __PYX_VERIFY_RETURN_INT_EXC(int, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if ((sizeof(int) <= sizeof(PY_LONG_LONG))) { __PYX_VERIFY_RETURN_INT_EXC(int, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { int val; int ret = -1; #if PY_VERSION_HEX >= 0x030d00A6 && !CYTHON_COMPILING_IN_LIMITED_API 
Py_ssize_t bytes_copied = PyLong_AsNativeBytes( x, &val, sizeof(val), Py_ASNATIVEBYTES_NATIVE_ENDIAN | (is_unsigned ? Py_ASNATIVEBYTES_UNSIGNED_BUFFER | Py_ASNATIVEBYTES_REJECT_NEGATIVE : 0)); if (unlikely(bytes_copied == -1)) { } else if (unlikely(bytes_copied > (Py_ssize_t) sizeof(val))) { goto raise_overflow; } else { ret = 0; } #elif PY_VERSION_HEX < 0x030d0000 && !(CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_LIMITED_API) || defined(_PyLong_AsByteArray) int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; ret = _PyLong_AsByteArray((PyLongObject *)x, bytes, sizeof(val), is_little, !is_unsigned); #else PyObject *v; PyObject *stepval = NULL, *mask = NULL, *shift = NULL; int bits, remaining_bits, is_negative = 0; int chunk_size = (sizeof(long) < 8) ? 30 : 62; if (likely(PyLong_CheckExact(x))) { v = __Pyx_NewRef(x); } else { v = PyNumber_Long(x); if (unlikely(!v)) return (int) -1; assert(PyLong_CheckExact(v)); } { int result = PyObject_RichCompareBool(v, Py_False, Py_LT); if (unlikely(result < 0)) { Py_DECREF(v); return (int) -1; } is_negative = result == 1; } if (is_unsigned && unlikely(is_negative)) { Py_DECREF(v); goto raise_neg_overflow; } else if (is_negative) { stepval = PyNumber_Invert(v); Py_DECREF(v); if (unlikely(!stepval)) return (int) -1; } else { stepval = v; } v = NULL; val = (int) 0; mask = PyLong_FromLong((1L << chunk_size) - 1); if (unlikely(!mask)) goto done; shift = PyLong_FromLong(chunk_size); if (unlikely(!shift)) goto done; for (bits = 0; bits < (int) sizeof(int) * 8 - chunk_size; bits += chunk_size) { PyObject *tmp, *digit; long idigit; digit = PyNumber_And(stepval, mask); if (unlikely(!digit)) goto done; idigit = PyLong_AsLong(digit); Py_DECREF(digit); if (unlikely(idigit < 0)) goto done; val |= ((int) idigit) << bits; tmp = PyNumber_Rshift(stepval, shift); if (unlikely(!tmp)) goto done; Py_DECREF(stepval); stepval = tmp; } Py_DECREF(shift); shift = NULL; Py_DECREF(mask); mask = NULL; { long idigit = PyLong_AsLong(stepval); if (unlikely(idigit < 0)) goto done; remaining_bits = ((int) sizeof(int) * 8) - bits - (is_unsigned ? 
0 : 1); if (unlikely(idigit >= (1L << remaining_bits))) goto raise_overflow; val |= ((int) idigit) << bits; } if (!is_unsigned) { if (unlikely(val & (((int) 1) << (sizeof(int) * 8 - 1)))) goto raise_overflow; if (is_negative) val = ~val; } ret = 0; done: Py_XDECREF(shift); Py_XDECREF(mask); Py_XDECREF(stepval); #endif if (unlikely(ret)) return (int) -1; return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to int"); return (int) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to int"); return (int) -1; } /* FastTypeChecks */ #if CYTHON_COMPILING_IN_CPYTHON static int __Pyx_InBases(PyTypeObject *a, PyTypeObject *b) { while (a) { a = __Pyx_PyType_GetSlot(a, tp_base, PyTypeObject*); if (a == b) return 1; } return b == &PyBaseObject_Type; } static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b) { PyObject *mro; if (a == b) return 1; mro = a->tp_mro; if (likely(mro)) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(mro); for (i = 0; i < n; i++) { if (PyTuple_GET_ITEM(mro, i) == (PyObject *)b) return 1; } return 0; } return __Pyx_InBases(a, b); } static CYTHON_INLINE int __Pyx_IsAnySubtype2(PyTypeObject *cls, PyTypeObject *a, PyTypeObject *b) { PyObject *mro; if (cls == a || cls == b) return 1; mro = cls->tp_mro; if (likely(mro)) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(mro); for (i = 0; i < n; i++) { PyObject *base = PyTuple_GET_ITEM(mro, i); if (base == (PyObject *)a || base == (PyObject *)b) return 1; } return 0; } return __Pyx_InBases(cls, a) || __Pyx_InBases(cls, b); } #if PY_MAJOR_VERSION == 2 static int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject* exc_type2) { PyObject *exception, *value, *tb; int res; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ErrFetch(&exception, &value, &tb); res = exc_type1 ? 
PyObject_IsSubclass(err, exc_type1) : 0; if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } if (!res) { res = PyObject_IsSubclass(err, exc_type2); if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } } __Pyx_ErrRestore(exception, value, tb); return res; } #else static CYTHON_INLINE int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject *exc_type2) { if (exc_type1) { return __Pyx_IsAnySubtype2((PyTypeObject*)err, (PyTypeObject*)exc_type1, (PyTypeObject*)exc_type2); } else { return __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type2); } } #endif static int __Pyx_PyErr_GivenExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; assert(PyExceptionClass_Check(exc_type)); n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; i= 0x030B00A4 return Py_Version & ~0xFFUL; #else const char* rt_version = Py_GetVersion(); unsigned long version = 0; unsigned long factor = 0x01000000UL; unsigned int digit = 0; int i = 0; while (factor) { while ('0' <= rt_version[i] && rt_version[i] <= '9') { digit = digit * 10 + (unsigned int) (rt_version[i] - '0'); ++i; } version += factor * digit; if (rt_version[i] != '.') break; digit = 0; factor >>= 8; ++i; } return version; #endif } static int __Pyx_check_binary_version(unsigned long ct_version, unsigned long rt_version, int allow_newer) { const unsigned long MAJOR_MINOR = 0xFFFF0000UL; if ((rt_version & MAJOR_MINOR) == (ct_version & MAJOR_MINOR)) return 0; if (likely(allow_newer && (rt_version & MAJOR_MINOR) > (ct_version & MAJOR_MINOR))) return 1; { char message[200]; PyOS_snprintf(message, sizeof(message), "compile time Python version %d.%d " "of module '%.100s' " "%s " "runtime version %d.%d", (int) (ct_version >> 24), (int) ((ct_version >> 16) & 0xFF), __Pyx_MODULE_NAME, (allow_newer) ? 
"was newer than" : "does not match", (int) (rt_version >> 24), (int) ((rt_version >> 16) & 0xFF) ); return PyErr_WarnEx(NULL, message, 1); } } /* InitStrings */ #if PY_MAJOR_VERSION >= 3 static int __Pyx_InitString(__Pyx_StringTabEntry t, PyObject **str) { if (t.is_unicode | t.is_str) { if (t.intern) { *str = PyUnicode_InternFromString(t.s); } else if (t.encoding) { *str = PyUnicode_Decode(t.s, t.n - 1, t.encoding, NULL); } else { *str = PyUnicode_FromStringAndSize(t.s, t.n - 1); } } else { *str = PyBytes_FromStringAndSize(t.s, t.n - 1); } if (!*str) return -1; if (PyObject_Hash(*str) == -1) return -1; return 0; } #endif static int __Pyx_InitStrings(__Pyx_StringTabEntry *t) { while (t->p) { #if PY_MAJOR_VERSION >= 3 __Pyx_InitString(*t, t->p); #else if (t->is_unicode) { *t->p = PyUnicode_DecodeUTF8(t->s, t->n - 1, NULL); } else if (t->intern) { *t->p = PyString_InternFromString(t->s); } else { *t->p = PyString_FromStringAndSize(t->s, t->n - 1); } if (!*t->p) return -1; if (PyObject_Hash(*t->p) == -1) return -1; #endif ++t; } return 0; } #include static CYTHON_INLINE Py_ssize_t __Pyx_ssize_strlen(const char *s) { size_t len = strlen(s); if (unlikely(len > (size_t) PY_SSIZE_T_MAX)) { PyErr_SetString(PyExc_OverflowError, "byte string is too long"); return -1; } return (Py_ssize_t) len; } static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char* c_str) { Py_ssize_t len = __Pyx_ssize_strlen(c_str); if (unlikely(len < 0)) return NULL; return __Pyx_PyUnicode_FromStringAndSize(c_str, len); } static CYTHON_INLINE PyObject* __Pyx_PyByteArray_FromString(const char* c_str) { Py_ssize_t len = __Pyx_ssize_strlen(c_str); if (unlikely(len < 0)) return NULL; return PyByteArray_FromStringAndSize(c_str, len); } static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject* o) { Py_ssize_t ignore; return __Pyx_PyObject_AsStringAndSize(o, &ignore); } #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT #if !CYTHON_PEP393_ENABLED static const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { char* defenc_c; PyObject* defenc = _PyUnicode_AsDefaultEncodedString(o, NULL); if (!defenc) return NULL; defenc_c = PyBytes_AS_STRING(defenc); #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII { char* end = defenc_c + PyBytes_GET_SIZE(defenc); char* c; for (c = defenc_c; c < end; c++) { if ((unsigned char) (*c) >= 128) { PyUnicode_AsASCIIString(o); return NULL; } } } #endif *length = PyBytes_GET_SIZE(defenc); return defenc_c; } #else static CYTHON_INLINE const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { if (unlikely(__Pyx_PyUnicode_READY(o) == -1)) return NULL; #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII if (likely(PyUnicode_IS_ASCII(o))) { *length = PyUnicode_GET_LENGTH(o); return PyUnicode_AsUTF8(o); } else { PyUnicode_AsASCIIString(o); return NULL; } #else return PyUnicode_AsUTF8AndSize(o, length); #endif } #endif #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject* o, Py_ssize_t *length) { #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT if ( #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII __Pyx_sys_getdefaultencoding_not_ascii && #endif PyUnicode_Check(o)) { return __Pyx_PyUnicode_AsStringAndSize(o, length); } else #endif #if (!CYTHON_COMPILING_IN_PYPY && !CYTHON_COMPILING_IN_LIMITED_API) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE)) if (PyByteArray_Check(o)) { *length = PyByteArray_GET_SIZE(o); return 
PyByteArray_AS_STRING(o); } else #endif { char* result; int r = PyBytes_AsStringAndSize(o, &result, length); if (unlikely(r < 0)) { return NULL; } else { return result; } } } static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject* x) { int is_true = x == Py_True; if (is_true | (x == Py_False) | (x == Py_None)) return is_true; else return PyObject_IsTrue(x); } static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject* x) { int retval; if (unlikely(!x)) return -1; retval = __Pyx_PyObject_IsTrue(x); Py_DECREF(x); return retval; } static PyObject* __Pyx_PyNumber_IntOrLongWrongResultType(PyObject* result, const char* type_name) { __Pyx_TypeName result_type_name = __Pyx_PyType_GetName(Py_TYPE(result)); #if PY_MAJOR_VERSION >= 3 if (PyLong_Check(result)) { if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1, "__int__ returned non-int (type " __Pyx_FMT_TYPENAME "). " "The ability to return an instance of a strict subclass of int is deprecated, " "and may be removed in a future version of Python.", result_type_name)) { __Pyx_DECREF_TypeName(result_type_name); Py_DECREF(result); return NULL; } __Pyx_DECREF_TypeName(result_type_name); return result; } #endif PyErr_Format(PyExc_TypeError, "__%.4s__ returned non-%.4s (type " __Pyx_FMT_TYPENAME ")", type_name, type_name, result_type_name); __Pyx_DECREF_TypeName(result_type_name); Py_DECREF(result); return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x) { #if CYTHON_USE_TYPE_SLOTS PyNumberMethods *m; #endif const char *name = NULL; PyObject *res = NULL; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x) || PyLong_Check(x))) #else if (likely(PyLong_Check(x))) #endif return __Pyx_NewRef(x); #if CYTHON_USE_TYPE_SLOTS m = Py_TYPE(x)->tp_as_number; #if PY_MAJOR_VERSION < 3 if (m && m->nb_int) { name = "int"; res = m->nb_int(x); } else if (m && m->nb_long) { name = "long"; res = m->nb_long(x); } #else if (likely(m && m->nb_int)) { name = "int"; res = m->nb_int(x); } #endif #else if (!PyBytes_CheckExact(x) && !PyUnicode_CheckExact(x)) { res = PyNumber_Int(x); } #endif if (likely(res)) { #if PY_MAJOR_VERSION < 3 if (unlikely(!PyInt_Check(res) && !PyLong_Check(res))) { #else if (unlikely(!PyLong_CheckExact(res))) { #endif return __Pyx_PyNumber_IntOrLongWrongResultType(res, name); } } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_TypeError, "an integer is required"); } return res; } static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject* b) { Py_ssize_t ival; PyObject *x; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_CheckExact(b))) { if (sizeof(Py_ssize_t) >= sizeof(long)) return PyInt_AS_LONG(b); else return PyInt_AsSsize_t(b); } #endif if (likely(PyLong_CheckExact(b))) { #if CYTHON_USE_PYLONG_INTERNALS if (likely(__Pyx_PyLong_IsCompact(b))) { return __Pyx_PyLong_CompactValue(b); } else { const digit* digits = __Pyx_PyLong_Digits(b); const Py_ssize_t size = __Pyx_PyLong_SignedDigitCount(b); switch (size) { case 2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return (Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return -(Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return (Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | 
(size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return (Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; } } #endif return PyLong_AsSsize_t(b); } x = PyNumber_Index(b); if (!x) return -1; ival = PyInt_AsSsize_t(x); Py_DECREF(x); return ival; } static CYTHON_INLINE Py_hash_t __Pyx_PyIndex_AsHash_t(PyObject* o) { if (sizeof(Py_hash_t) == sizeof(Py_ssize_t)) { return (Py_hash_t) __Pyx_PyIndex_AsSsize_t(o); #if PY_MAJOR_VERSION < 3 } else if (likely(PyInt_CheckExact(o))) { return PyInt_AS_LONG(o); #endif } else { Py_ssize_t ival; PyObject *x; x = PyNumber_Index(o); if (!x) return -1; ival = PyInt_AsLong(x); Py_DECREF(x); return ival; } } static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b) { return b ? __Pyx_NewRef(Py_True) : __Pyx_NewRef(Py_False); } static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t ival) { return PyInt_FromSize_t(ival); } /* #### Code section: utility_code_pragmas_end ### */ #ifdef _MSC_VER #pragma warning( pop ) #endif /* #### Code section: end ### */ #endif /* Py_PYTHON_H */ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/fit_statistics_cython.pyx0000644000175100001770000000664414721316200022376 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst # cython: language_level=3 import numpy as np cimport numpy as np cimport cython cdef extern from "math.h": float log(float x) global TRUNCATION_VALUE TRUNCATION_VALUE = 1e-25 @cython.cdivision(True) @cython.boundscheck(False) def cash_sum_cython(np.ndarray[np.float_t, ndim=1] counts, np.ndarray[np.float_t, ndim=1] npred): """Summed cash fit statistics. Parameters ---------- counts : `~numpy.ndarray` Counts array. npred : `~numpy.ndarray` Predicted counts array. """ cdef np.float_t sum = 0 cdef np.float_t npr, lognpr cdef unsigned int i, ni cdef np.float_t trunc = TRUNCATION_VALUE cdef np.float_t logtrunc = log(TRUNCATION_VALUE) ni = counts.shape[0] for i in range(ni): npr = npred[i] if npr > trunc: lognpr = log(npr) else: npr = trunc lognpr = logtrunc sum += npr if counts[i] > 0: sum -= counts[i] * lognpr return 2 * sum @cython.cdivision(True) @cython.boundscheck(False) def f_cash_root_cython(np.float_t x, np.ndarray[np.float_t, ndim=1] counts, np.ndarray[np.float_t, ndim=1] background, np.ndarray[np.float_t, ndim=1] model): """Function to find root of. Described in Appendix A, Stewart (2009). Parameters ---------- x : float Model amplitude. counts : `~numpy.ndarray` Count image slice, where model is defined. background : `~numpy.ndarray` Background image slice, where model is defined. model : `~numpy.ndarray` Source template (multiplied with exposure). """ cdef np.float_t sum = 0 cdef unsigned int i, ni ni = counts.shape[0] for i in range(ni): if model[i] > 0: if counts[i] > 0: sum += model[i] * (1 - counts[i] / (x * model[i] + background[i])) else: sum += model[i] # 2 is required to maintain the correct normalization of the # derivative of the likelihood function. It doesn't change the result of # the fit. 
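    # With this factor the returned expression is exactly the derivative of the
    # Cash statistic with respect to the amplitude x, so its root in x is the
    # best-fit (maximum-likelihood) amplitude. Callers can bracket that root
    # with `norm_bounds_cython` below and solve it with a one-dimensional root
    # finder such as `scipy.optimize.brentq`.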
return 2 * sum @cython.cdivision(True) @cython.boundscheck(False) def norm_bounds_cython(np.ndarray[np.float_t, ndim=1] counts, np.ndarray[np.float_t, ndim=1] background, np.ndarray[np.float_t, ndim=1] model): """Compute bounds for the root of `_f_cash_root_cython`. Parameters ---------- counts : `~numpy.ndarray` Counts image background : `~numpy.ndarray` Background image model : `~numpy.ndarray` Source template (multiplied with exposure). """ cdef np.float_t s_model = 0, s_counts = 0, sn, sn_min = 1e14, c_min = 1 cdef np.float_t b_min, b_max, sn_min_total = 1e14 cdef unsigned int i, ni ni = counts.shape[0] for i in range(ni): if counts[i] > 0: s_counts += counts[i] if model[i] > 0: sn = background[i] / model[i] if sn < sn_min: sn_min = sn c_min = counts[i] if model[i] > 0: s_model += model[i] sn = background[i] / model[i] if sn < sn_min_total: sn_min_total = sn b_min = c_min / s_model - sn_min b_max = s_counts / s_model - sn_min return b_min, b_max, -sn_min_total ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.240642 gammapy-1.3/gammapy/stats/tests/0000755000175100001770000000000014721316215016352 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/tests/__init__.py0000644000175100001770000000010014721316200020444 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/tests/test_counts_statistic.py0000644000175100001770000002053514721316200023364 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from gammapy.stats import CashCountsStatistic, WStatCountsStatistic ref_array = np.ones((3, 2, 4)) values = [ (1, 2, [-1.0, -0.78339367, 0.216698]), (5, 1, [4.0, 2.84506224, 2.220137e-3]), (10, 5, [5.0, 1.96543726, 0.024682]), (100, 23, [77.0, 11.8294207, 1.37e-32]), (1, 20, [-19, -5.65760863, 7.5e-09]), (5 * ref_array, 1 * ref_array, [4.0, 2.84506224, 2.220137e-3]), ] @pytest.mark.parametrize(("n_on", "mu_bkg", "result"), values) def test_cash_basic(n_on, mu_bkg, result): stat = CashCountsStatistic(n_on, mu_bkg) excess = stat.n_sig sqrt_ts = stat.sqrt_ts p_value = stat.p_value assert_allclose(excess, result[0]) assert_allclose(sqrt_ts, result[1], atol=1e-5) assert_allclose(p_value, result[2], atol=1e-5) values = [ (0, 2, [0, 0.5, 0, 2]), (1, 2, [0.69829, 1.35767667, 0.947531, 3.505241]), (5, 1, [1.915916, 2.581106, 3.250479, 5.893762]), (10, 5, [2.838105, 3.504033, 5.067606, 7.722498]), (100, 23, [9.669482, 10.336074, 18.689488, 21.354971]), (1, 20, [0.69829, 1.357677, 0.947531, 3.505241]), (5 * ref_array, 1 * ref_array, [1.915916, 2.581106, 3.250479, 5.893762]), ] @pytest.mark.parametrize(("n_on", "mu_bkg", "result"), values) def test_cash_errors(n_on, mu_bkg, result): stat = CashCountsStatistic(n_on, mu_bkg) errn = stat.compute_errn() errp = stat.compute_errp() assert_allclose(errn, result[0], atol=1e-5) assert_allclose(errp, result[1], atol=1e-5) errn = stat.compute_errn(2) errp = stat.compute_errp(2) assert_allclose(errn, result[2], atol=1e-5) assert_allclose(errp, result[3], atol=1e-5) values = [ (1, 2, [5.517193]), (5, 1, [13.98959]), (10, 5, [17.696064]), (100, 23, [110.07206]), (1, 20, [-12.482807]), (5 * ref_array, 1 * ref_array, [13.98959]), ] @pytest.mark.parametrize(("n_on", "mu_bkg", "result"), 
values) def test_cash_ul(n_on, mu_bkg, result): stat = CashCountsStatistic(n_on, mu_bkg) ul = stat.compute_upper_limit() assert_allclose(ul, result[0], atol=1e-5) values = [ (100, 5, 54.012755), (100, -5, -45.631273), pytest.param(1, -2, np.nan, marks=pytest.mark.xfail), ([1, 2], 5, [8.327276, 10.550546]), ] @pytest.mark.parametrize(("mu_bkg", "significance", "result"), values) def test_cash_excess_matching_significance(mu_bkg, significance, result): stat = CashCountsStatistic(0, mu_bkg) excess = stat.n_sig_matching_significance(significance) assert_allclose(excess, result, atol=1e-3) values = [ (1, 2, 1, [-1.0, -0.5829220133009171, 0.279973]), (5, 1, 1, [4.0, 1.7061745691234782, 0.043988]), (10, 5, 0.3, [8.5, 3.5853812867949024, 1.68293e-4]), (10, 23, 0.1, [7.7, 3.443415522820395, 2.87208e-4]), (1, 20, 1.0, [-19, -4.590373638528086, 2.2122e-06]), ( 5 * ref_array, 1 * ref_array, 1 * ref_array, [4.0, 1.7061745691234782, 0.043988], ), ] @pytest.mark.parametrize(("n_on", "n_off", "alpha", "result"), values) def test_wstat_basic(n_on, n_off, alpha, result): stat = WStatCountsStatistic(n_on, n_off, alpha) excess = stat.n_sig sqrt_ts = stat.sqrt_ts p_value = stat.p_value assert_allclose(excess, result[0], rtol=1e-4) assert_allclose(sqrt_ts, result[1], rtol=1e-4) assert_allclose(p_value, result[2], rtol=1e-4) values = [ (5, 1, 1, 3, [1, 0.422261, 0.336417, 0.178305]), (5, 1, 1, 1, [3.0, 1.29828, 0.097095, 1.685535]), (5, 1, 1, 6, [-2, -0.75585, 0.224869, 0.571311]), ] @pytest.mark.parametrize(("n_on", "n_off", "alpha", "mu_sig", "result"), values) def test_wstat_with_musig(n_on, n_off, alpha, mu_sig, result): stat = WStatCountsStatistic(n_on, n_off, alpha, mu_sig) excess = stat.n_sig sqrt_ts = stat.sqrt_ts p_value = stat.p_value del_ts = stat.ts assert_allclose(excess, result[0], rtol=1e-4) assert_allclose(sqrt_ts, result[1], rtol=1e-4) assert_allclose(p_value, result[2], rtol=1e-4) assert_allclose(del_ts, result[3], rtol=1e-4) values = [ (1, 2, 1, [1.942465, 1.762589]), (5, 1, 1, [2.310459, 2.718807]), (10, 5, 0.3, [2.932472, 3.55926]), (10, 23, 0.1, [2.884366, 3.533279]), (1, 20, 1.0, [4.897018, 4.299083]), (5 * ref_array, 1 * ref_array, 1 * ref_array, [2.310459, 2.718807]), ] @pytest.mark.parametrize(("n_on", "n_off", "alpha", "result"), values) def test_wstat_errors(n_on, n_off, alpha, result): stat = WStatCountsStatistic(n_on, n_off, alpha) errn = stat.compute_errn() errp = stat.compute_errp() assert_allclose(errn, result[0], atol=1e-5) assert_allclose(errp, result[1], atol=1e-5) values = [ (1, 2, 1, [6.075534]), (5, 1, 1, [14.222831]), (10, 5, 0.3, [21.309229]), (10, 23, 0.1, [20.45803]), (1, 20, 1.0, [-7.078228]), (5 * ref_array, 1 * ref_array, 1 * ref_array, [14.222831]), ] @pytest.mark.parametrize(("n_on", "n_off", "alpha", "result"), values) def test_wstat_ul(n_on, n_off, alpha, result): stat = WStatCountsStatistic(n_on, n_off, alpha) ul = stat.compute_upper_limit() assert_allclose(ul, result[0], rtol=1e-5) values = [ ([10, 20], [0.1, 0.1], 5, [9.82966, 12.0384229]), ([10, 10], [0.1, 0.3], 5, [9.82966, 16.664516]), ([10], [0.1], 3, [4.818497]), ( [[10, 20], [10, 20]], [[0.1, 0.1], [0.1, 0.1]], 5, [[9.82966, 12.129523], [9.82966, 12.129523]], ), ] @pytest.mark.parametrize(("n_off", "alpha", "significance", "result"), values) def test_wstat_excess_matching_significance(n_off, alpha, significance, result): stat = WStatCountsStatistic(0, n_off, alpha) excess = stat.n_sig_matching_significance(significance) assert_allclose(excess, result, rtol=1e-2) def test_cash_sum(): on = [1, 2, 3] bkg = 
[0.5, 0.7, 1.3] stat = CashCountsStatistic(on, bkg) stat_sum = stat.sum() assert stat_sum.n_on == 6 assert stat_sum.n_bkg == 2.5 new_size = (2, 10, 3) on = np.resize(on, new_size) bkg = np.resize(bkg, new_size) stat = CashCountsStatistic(on, bkg) stat_sum = stat.sum(axis=(2)) assert stat_sum.n_on.shape == (2, 10) assert_allclose(stat_sum.n_on, 6) assert_allclose(stat_sum.n_bkg, 2.5) stat_sum = stat.sum(axis=(0, 1)) assert stat_sum.n_on.shape == (3,) assert_allclose(stat_sum.n_on, (20, 40, 60)) assert_allclose(stat_sum.n_bkg, (10, 14, 26)) def test_wstat_sum(): on = [1, 2, 3] off = [5, 14, 8] alpha = [0.1, 0.05, 0.1625] stat = WStatCountsStatistic(on, off, alpha) stat_sum = stat.sum() assert stat_sum.n_on == 6 assert stat_sum.n_off == 27 assert stat_sum.n_bkg == 2.5 assert_allclose(stat_sum.alpha, 0.0925925925925925) new_size = (2, 10, 3) off = np.resize(off, new_size) on = np.resize(on, new_size) alpha = np.resize(alpha, new_size) stat = WStatCountsStatistic(on, off, alpha) stat_sum = stat.sum(axis=(2)) assert stat_sum.n_on.shape == (2, 10) assert_allclose(stat_sum.n_on, 6) assert_allclose(stat_sum.n_bkg, 2.5) assert_allclose(stat_sum.alpha, 0.0925925925925925) stat_sum = stat.sum(axis=(0, 1)) assert stat_sum.n_on.shape == (3,) assert_allclose(stat_sum.n_on, (20, 40, 60)) assert_allclose(stat_sum.n_bkg, (10, 14, 26)) def test_CountStatistic_str(): cash = CashCountsStatistic(n_on=4, mu_bkg=2) assert "Predicted background counts" in str(cash) assert "CashCountsStatistic" in str(cash) assert "Total significance" in str(cash) wstat = WStatCountsStatistic(n_on=5, n_off=4, alpha=0.2, mu_sig=2) assert "Off counts" in str(wstat) assert "alpha " in str(wstat) assert "Total counts " in str(wstat) assert "WStatCountsStatistic" in str(wstat) def test_counts_statistic_infodict(): c1 = CashCountsStatistic(n_on=[3, 6], mu_bkg=[2, 1]) info_dict = c1.sum().info_dict() assert_allclose(info_dict["n_on"], 9) assert_allclose(info_dict["significance"], 2.788, rtol=1e-3) w1 = WStatCountsStatistic(n_on=[3, 6], n_off=[2, 1], alpha=[1, 2]) info_dict = w1.info_dict() assert_allclose(info_dict["n_off"], [2, 1]) assert_allclose(info_dict["significance"], [0.44872, 1.14942], rtol=1e-3) info_dict = w1.sum().info_dict() assert_allclose(info_dict["excess"], 5.0) assert_allclose(info_dict["significance"], 1.288731, rtol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/tests/test_fit_statistics.py0000644000175100001770000001110614721316200023010 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from gammapy import stats @pytest.fixture def test_data(): """Test data for fit statistics tests""" test_data = dict( mu_sig=[ 0.59752422, 9.13666449, 12.98288095, 5.56974565, 13.52509804, 11.81725635, 0.47963765, 11.17708176, 5.18504894, 8.30202394, ], n_on=[0, 13, 7, 5, 11, 16, 0, 9, 3, 12], n_off=[0, 7, 4, 0, 18, 7, 1, 5, 12, 25], alpha=[ 0.83746243, 0.17003354, 0.26034507, 0.69197751, 0.89557033, 0.34068848, 0.0646732, 0.86411967, 0.29087245, 0.74108241, ], ) test_data["staterror"] = np.sqrt(test_data["n_on"]) return test_data @pytest.fixture def reference_values(): """Reference values for fit statistics test. 
Produced using sherpa stats module in dev/sherpa/stats/compare_wstat.py """ return dict( wstat=[ 1.19504844, 0.625311794002, 4.25810886127, 0.0603765381044, 11.7285002468, 0.206014834301, 1.084611, 2.72972381792, 4.60602990838, 7.51658734973, ], cash=[ 1.19504844, -39.24635098872072, -9.925081055136996, -6.034002586236575, -30.249839537105466, -55.39143500383233, 0.9592753, -21.095413867175516, 0.49542219758430406, -34.19193611846045, ], cstat=[ 1.19504844, 1.4423323052792387, 3.3176610316373925, 0.06037653810442922, 0.5038564644586838, 1.3314041078406706, 0.9592753, 0.4546285248764317, 1.0870959295929628, 1.4458234764515652, ], ) def test_wstat(test_data, reference_values): statsvec = stats.wstat( n_on=test_data["n_on"], mu_sig=test_data["mu_sig"], n_off=test_data["n_off"], alpha=test_data["alpha"], extra_terms=True, ) assert_allclose(statsvec, reference_values["wstat"]) def test_cash(test_data, reference_values): statsvec = stats.cash(n_on=test_data["n_on"], mu_on=test_data["mu_sig"]) assert_allclose(statsvec, reference_values["cash"]) def test_cstat(test_data, reference_values): statsvec = stats.cstat(n_on=test_data["n_on"], mu_on=test_data["mu_sig"]) assert_allclose(statsvec, reference_values["cstat"]) def test_cash_sum_cython(test_data): counts = np.array(test_data["n_on"], dtype=float) npred = np.array(test_data["mu_sig"], dtype=float) stat = stats.cash_sum_cython(counts=counts, npred=npred) ref = stats.cash(counts, npred).sum() assert_allclose(stat, ref) def test_cash_bad_truncation(): with pytest.raises(ValueError): stats.cash(10, 10, 0.0) def test_cstat_bad_truncation(): with pytest.raises(ValueError): stats.cstat(10, 10, 0.0) def test_wstat_corner_cases(): """test WSTAT formulae for corner cases""" n_on = 0 n_off = 5 mu_sig = 2.3 alpha = 0.5 actual = stats.wstat(n_on=n_on, mu_sig=mu_sig, n_off=n_off, alpha=alpha) desired = 2 * (mu_sig + n_off * np.log(1 + alpha)) assert_allclose(actual, desired) actual = stats.get_wstat_mu_bkg(n_on=n_on, mu_sig=mu_sig, n_off=n_off, alpha=alpha) desired = n_off / (alpha + 1) assert_allclose(actual, desired) # n_off = 0 and mu_sig < n_on * (alpha / alpha + 1) n_on = 9 n_off = 0 mu_sig = 2.3 alpha = 0.5 actual = stats.wstat(n_on=n_on, mu_sig=mu_sig, n_off=n_off, alpha=alpha) desired = -2 * (mu_sig * (1.0 / alpha) + n_on * np.log(alpha / (1 + alpha))) assert_allclose(actual, desired) actual = stats.get_wstat_mu_bkg(n_on=n_on, mu_sig=mu_sig, n_off=n_off, alpha=alpha) desired = n_on / (1 + alpha) - (mu_sig / alpha) assert_allclose(actual, desired) # n_off = 0 and mu_sig > n_on * (alpha / alpha + 1) n_on = 5 n_off = 0 mu_sig = 5.3 alpha = 0.5 actual = stats.wstat(n_on=n_on, mu_sig=mu_sig, n_off=n_off, alpha=alpha) desired = 2 * (mu_sig + n_on * (np.log(n_on) - np.log(mu_sig) - 1)) assert_allclose(actual, desired) actual = stats.get_wstat_mu_bkg(n_on=n_on, mu_sig=mu_sig, n_off=n_off, alpha=alpha) assert_allclose(actual, 0) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/tests/test_utils.py0000644000175100001770000000104514721316200021115 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from numpy.testing import assert_allclose from gammapy.stats.utils import sigma_to_ts, ts_to_sigma def test_sigma_ts_conversion(): sigma_ref = 3 ts_ref = 9 df = 1 ts = sigma_to_ts(3, df=df) assert_allclose(ts, ts_ref) sigma = ts_to_sigma(ts, df=df) assert_allclose(sigma, sigma_ref) df = 2 ts_ref = 11.829158 ts = sigma_to_ts(3, df=df) assert_allclose(ts, ts_ref) sigma 
= ts_to_sigma(ts, df=df) assert_allclose(sigma, sigma_ref) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/tests/test_variability.py0000644000175100001770000002274014721316200022301 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from astropy.table import Column, Table from astropy.time import Time from gammapy.estimators import FluxPoints from gammapy.stats.variability import ( TimmerKonig_lightcurve_simulator, compute_chisq, compute_flux_doubling, compute_fpp, compute_fvar, structure_function, discrete_correlation, ) from gammapy.utils.testing import assert_quantity_allclose @pytest.fixture(scope="session") def lc_table(): meta = dict(TIMESYS="utc") table = Table( meta=meta, data=[ Column(Time(["2010-01-01", "2010-01-03"]).mjd, "time_min"), Column(Time(["2010-01-03", "2010-01-10"]).mjd, "time_max"), Column([1e-11, 3e-11], "flux", unit="cm-2 s-1"), Column([0.1e-11, 0.3e-11], "flux_err", unit="cm-2 s-1"), Column([np.nan, 3.6e-11], "flux_ul", unit="cm-2 s-1"), Column([False, True], "is_ul"), ], ) return table @pytest.fixture(scope="session") def lc(): meta = dict(TIMESYS="utc", SED_TYPE="flux") table = Table( meta=meta, data=[ Column(Time(["2010-01-01", "2010-01-03", "2010-01-07"]).mjd, "time_min"), Column(Time(["2010-01-03", "2010-01-07", "2010-01-10"]).mjd, "time_max"), Column([[1.0, 2.0], [1.0, 2.0], [1.0, 2.0]], "e_min", unit="TeV"), Column([[2.0, 5.0], [2.0, 5.0], [2.0, 5.0]], "e_max", unit="TeV"), Column( [[1e-11, 4e-12], [3e-11, np.nan], [1e-11, 1e-12]], "flux", unit="cm-2 s-1", ), Column( [[0.1e-11, 0.4e-12], [0.3e-11, np.nan], [0.1e-11, 0.1e-12]], "flux_err", unit="cm-2 s-1", ), Column( [[np.nan, np.nan], [3.6e-11, 1e-11], [1e-11, 1e-12]], "flux_ul", unit="cm-2 s-1", ), Column([[False, False], [True, True], [True, True]], "is_ul"), Column([[True, True], [True, True], [True, True]], "success"), ], ) return FluxPoints.from_table(table=table, format="lightcurve") def test_lightcurve_fvar(): flux = np.array([[1e-11, 4e-12], [3e-11, np.nan], [1e-11, 1e-12]]) flux_err = np.array([[0.1e-11, 0.4e-12], [0.3e-11, np.nan], [0.1e-11, 0.1e-12]]) time_id = 0 fvar, fvar_err = compute_fvar(flux, flux_err, axis=time_id) assert_allclose(fvar, [0.68322763, 0.84047606]) assert_allclose(fvar_err, [0.06679978, 0.08285806]) def test_lightcurve_fpp(): flux = np.array([[1e-11, 4e-12], [3e-11, np.nan], [1e-11, 1e-12]]) flux_err = np.array([[0.1e-11, 0.4e-12], [0.3e-11, np.nan], [0.1e-11, 0.1e-12]]) fpp, fpp_err = compute_fpp(flux, flux_err) assert_allclose(fpp, [1.19448734, 0.11661904]) assert_allclose(fpp_err, [0.06648574, 0.10099505]) flux_1d = np.array([1e-11, 3e-11, 1e-11]) flux_err_1d = np.array([0.1e-11, 0.3e-11, 0.1e-11]) fpp_1d, fpp_err_1d = compute_fpp(flux_1d, flux_err_1d) assert_allclose(fpp_1d, [1.19448734]) assert_allclose(fpp_err_1d, [0.06648574]) def test_lightcurve_chisq(lc_table): flux = lc_table["flux"].astype("float64") chi2, pval = compute_chisq(flux) assert_quantity_allclose(chi2, 1e-11) assert_quantity_allclose(pval, 0.999997476867478) def test_lightcurve_flux_doubling(): flux = np.array( [ [1e-11, 4e-12], [3e-11, np.nan], [1e-11, 1e-12], [0.8e-11, 0.8e-12], [1e-11, 1e-12], ] ) flux_err = np.array( [ [0.1e-11, 0.4e-12], [0.3e-11, np.nan], [0.1e-11, 0.1e-12], [0.08e-11, 0.8e-12], [0.1e-11, 0.1e-12], ] ) time = ( np.array( [6.31157019e08, 6.31160619e08, 6.31164219e08, 6.31171419e08, 
6.31178419e08] ) * u.s ) time_id = 0 dtime_dict = compute_flux_doubling(flux, flux_err, time, axis=time_id) dtime = dtime_dict["doubling"] dtime_err = dtime_dict["doubling_err"] assert_allclose( dtime, [2271.34711286, 21743.98603654] * u.s, ) assert_allclose(dtime_err, [425.92375713, 242.80234065] * u.s) def test_tk_function(): time_series, time_axis = TimmerKonig_lightcurve_simulator( lambda x: x ** (-3), 20, 1 * u.s ) def temp(x, norm, index): return norm * x ** (-index) params = {"norm": 1.5, "index": 3} time_series2, time_axis2 = TimmerKonig_lightcurve_simulator( temp, 15, 1 * u.h, power_spectrum_params=params ) assert len(time_series) == 20 assert isinstance(time_axis, u.Quantity) assert time_axis.unit == u.s assert len(time_series2) == 15 def test_tk_nchunks(): time_series, time_axis = TimmerKonig_lightcurve_simulator( lambda x: x ** (-1.5), 21, 1 * u.s, nchunks=100 ) time_series2, time_axis2 = TimmerKonig_lightcurve_simulator( lambda x: x ** (-1.5), 21, 1 * u.s, nchunks=20 ) with pytest.raises(TypeError): TimmerKonig_lightcurve_simulator(lambda x: x ** (-3), 20, 1 * u.s, nchunks=0.5) with pytest.raises(TypeError): TimmerKonig_lightcurve_simulator(lambda x: x ** (-3), 20.5, 1 * u.s) assert len(time_series) == len(time_series2) == 21 def test_tk_mean(): time_series, time_axis = TimmerKonig_lightcurve_simulator( lambda x: x ** (-1.5), 2000, 1 * u.s, mean=2.5, std=0.5 ) time_series2, time_axis2 = TimmerKonig_lightcurve_simulator( lambda x: x ** (-1.5), 2000, 1 * u.s, mean=1e-7 * (u.cm**-2), std=5e-8 * (u.cm**-2), ) time_series3, time_axis3 = TimmerKonig_lightcurve_simulator( lambda x: x ** (-1.5), 2000, 1 * u.s, mean=2.5, std=0.5, poisson=True ) with pytest.raises(Warning): TimmerKonig_lightcurve_simulator( lambda x: x ** (-3), 20, 1 * u.s, mean=0.5, poisson=True ) assert_allclose(time_series.mean(), 2.5) assert_allclose(time_series.std(), 0.5) assert_allclose(time_series2.mean(), 1e-7 * (u.cm**-2)) assert_allclose(time_series2.std(), 5e-8 * (u.cm**-2)) assert_allclose(time_series3.mean(), 2.5, rtol=0.1) assert_allclose(time_series3.std(), 1, rtol=1) def test_structure_function(): flux = np.array( [ [1e-11, 4e-12], [3e-11, 2.5e-12], [1e-11, 1e-12], [0.8e-11, 0.8e-12], [1e-11, 1e-12], ] ) flux_err = np.array( [ [0.1e-11, 0.4e-12], [0.3e-11, 0.2e-12], [0.1e-11, 0.1e-12], [0.08e-11, 0.08e-12], [0.1e-11, 0.1e-12], ] ) time = ( np.array( [6.31157019e08, 6.31160619e08, 6.31164219e08, 6.31171419e08, 6.31178419e08] ) * u.s ) sf, distances = structure_function(flux, flux_err, time) assert_allclose( sf, [ [4.00000000e-22, 2.25000000e-24], [4.00000000e-24, 4.00000000e-26], [2.00000000e-24, 4.52000000e-24], [4.84000000e-22, 2.89000000e-24], [0.00000000e00, 0.00000000e00], [4.00000000e-24, 1.02400000e-23], [4.00000000e-22, 2.25000000e-24], [0.00000000e00, 9.00000000e-24], ], ) assert_allclose( distances, [3600.0, 7000.0, 7200.0, 10800.0, 14200.0, 14400.0, 17800.0, 21400.0] * u.s, ) def test_dcf(): flux = np.array( [ [1e-11, 4e-12], [3e-11, 2.5e-12], [1e-11, 1e-12], [0.8e-11, 0.8e-12], [1e-11, 1e-12], ] ) flux2 = np.array( [ [1e-11, 3.5e-12], [4e-11, 1.5e-12], [1.7e-11, 2e-12], [0.4e-11, 1.8e-12], [1e-11, 1.5e-12], [1.1e-11, 0.7e-12], ] ) flux_err = np.array( [ [0.1e-11, 0.4e-12], [0.3e-11, 0.2e-12], [0.1e-11, 0.1e-12], [0.08e-11, 0.08e-12], [0.1e-11, 0.1e-12], ] ) flux_err2 = np.array( [ [0.2e-11, 0.24e-12], [0.4e-11, 0.21e-12], [0.07e-11, 0.14e-12], [0.12e-11, 0.18e-12], [0.1e-11, 0.13e-12], [0.4e-11, 0.2e-12], ] ) time = ( np.array( [6.31157019e08, 6.31160619e08, 6.31164219e08, 6.31171419e08, 
6.31178419e08] ) * u.s ) time2 = ( np.array( [ 6.31156919e08, 6.31160519e08, 6.31163819e08, 6.31171419e08, 6.31178219e08, 6.31178819e08, ] ) * u.s ) bins, dcf, dcf_err = discrete_correlation( flux, flux_err, flux2, flux_err2, time, time2, tau=1.5e4 * u.s ) assert_allclose( dcf, [ [-0.332361, -1.009638], [-0.065639, 0.313054], [0.215329, 0.133132], [-0.373906, -0.56788], ], rtol=1e-5, ) assert_allclose( dcf_err, [ [0.34881, 0.553113], [0.236548, 0.174538], [0.401313, 0.361757], [0.820525, 1.204655], ], rtol=1e-5, ) assert_allclose( bins, u.Quantity([-22500.0, -7500.0, 7500.0, 22500.0], unit="s"), ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/utils.py0000644000175100001770000000262414721316200016720 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from scipy.stats import chi2 def sigma_to_ts(n_sigma, df=1): """Convert number of sigma to delta TS according to Wilks' theorem. The theorem is valid only if: - the two hypotheses tested can be defined in the same parameter space - the true value is not at the boundary of this parameter space. Parameters ---------- n_sigma : float Significance in number of sigma. df : int Number of degrees of freedom. Returns ------- ts : float Test statistic. References ---------- Wilks' theorem: https://en.wikipedia.org/wiki/Wilks%27_theorem """ p_value = chi2.sf(n_sigma**2, df=1) return chi2.isf(p_value, df=df) def ts_to_sigma(ts, df=1): """Convert delta TS to number of sigma according to Wilks' theorem. The theorem is valid only if: - the two hypotheses tested can be defined in the same parameter space - the true value is not at the boundary of this parameter space. Parameters ---------- ts : float Test statistic. df : int Number of degrees of freedom. Returns ------- n_sigma : float Significance in number of sigma. References ---------- Wilks' theorem: https://en.wikipedia.org/wiki/Wilks%27_theorem """ p_value = chi2.sf(ts, df=df) return np.sqrt(chi2.isf(p_value, df=1)) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/stats/variability.py0000644000175100001770000004252014721316200020076 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np import scipy.stats as stats import astropy.units as u from gammapy.utils.random import get_random_state __all__ = [ "compute_fvar", "compute_fpp", "compute_chisq", "compute_flux_doubling", "structure_function", "TimmerKonig_lightcurve_simulator", "discrete_correlation", ] def compute_fvar(flux, flux_err, axis=0): r"""Calculate the fractional excess variance. The fractional excess variance :math:`F_{var}`, an intrinsic variability estimator, is given by: .. math:: F_{var} = \sqrt{\frac{S^{2} - \bar{\sigma^{2}}}{\bar{x}^{2}}} It is the excess variance after accounting for the measurement errors on the light curve :math:`\sigma`. :math:`S` is the variance. It is important to note that the errors on the flux must be Gaussian. Parameters ---------- flux : `~astropy.units.Quantity` The measured fluxes. flux_err : `~astropy.units.Quantity` The error on measured fluxes. axis : int, optional Axis along which the excess variance is computed. Default is 0. Returns ------- fvar, fvar_err : `~numpy.ndarray` Fractional excess variance. References ----------
.. [Vaughan2003] "On characterizing the variability properties of X-ray light curves from active galaxies", Vaughan et al. (2003) https://ui.adsabs.harvard.edu/abs/2003MNRAS.345.1271V """ flux_mean = np.nanmean(flux, axis=axis) n_points = np.count_nonzero(~np.isnan(flux), axis=axis) s_square = np.nansum((flux - flux_mean) ** 2, axis=axis) / (n_points - 1) sig_square = np.nansum(flux_err**2, axis=axis) / n_points fvar = np.sqrt(np.abs(s_square - sig_square)) / flux_mean sigxserr_a = np.sqrt(2 / n_points) * sig_square / flux_mean**2 sigxserr_b = np.sqrt(sig_square / n_points) * (2 * fvar / flux_mean) sigxserr = np.sqrt(sigxserr_a**2 + sigxserr_b**2) fvar_err = sigxserr / (2 * fvar) return fvar, fvar_err def compute_fpp(flux, flux_err, axis=0): r"""Calculate the point-to-point excess variance. F_pp is a quantity strongly related to the fractional excess variance F_var implemented in `~gammapy.stats.compute_fvar`; F_pp probes the variability on a shorter timescale. For white noise, F_pp and F_var give the same value. However, for red noise, F_var will be larger than F_pp, as the variations will be larger on longer timescales. It is important to note that the errors on the flux must be Gaussian. Parameters ---------- flux : `~astropy.units.Quantity` The measured fluxes. flux_err : `~astropy.units.Quantity` The error on measured fluxes. axis : int, optional Axis along which the excess variance is computed. Default is 0. Returns ------- fpp, fpp_err : `~numpy.ndarray` Point-to-point excess variance. References ---------- .. [Edelson2002] "X-Ray Spectral Variability and Rapid Variability of the Soft X-Ray Spectrum Seyfert 1 Galaxies Arakelian 564 and Ton S180", Edelson et al. (2002), equation 3, https://iopscience.iop.org/article/10.1086/323779 """ flux_mean = np.nanmean(flux, axis=axis) n_points = np.count_nonzero(~np.isnan(flux), axis=axis) flux = flux.swapaxes(0, axis).T s_square = np.nansum((flux[..., 1:] - flux[..., :-1]) ** 2, axis=-1) / ( n_points.T - 1 ) sig_square = np.nansum(flux_err**2, axis=axis) / n_points fpp = np.sqrt(np.abs(s_square.T - sig_square)) / flux_mean sigxserr_a = np.sqrt(2 / n_points) * sig_square / flux_mean**2 sigxserr_b = np.sqrt(sig_square / n_points) * (2 * fpp / flux_mean) sigxserr = np.sqrt(sigxserr_a**2 + sigxserr_b**2) fpp_err = sigxserr / (2 * fpp) return fpp, fpp_err def compute_chisq(flux): r"""Calculate the chi-square test for a light curve. The chi-square test is a variability estimator: it computes deviations from the expected value, here the mean value. Parameters ---------- flux : `~astropy.units.Quantity` The measured fluxes. Returns ------- chisq, pval : tuple of float or `~numpy.ndarray` Tuple of chi-square and P-value. """ yexp = np.mean(flux) yobs = flux.data chi2, pval = stats.chisquare(yobs, yexp) return chi2, pval def compute_flux_doubling(flux, flux_err, coords, axis=0): r"""Compute the minimum characteristic flux doubling and halving over a certain coordinate axis for a series of measurements. Computing the flux doubling can give the doubling time in a lightcurve displaying significant temporal variability, e.g. an AGN flare. The variable is computed as: .. math:: doubling = \min\left(\frac{t_{i+1} - t_i}{\log_2(f_{i+1}/f_i)}\right) where :math:`f_i` and :math:`f_{i+1}` are the fluxes measured at subsequent coordinates :math:`t_i` and :math:`t_{i+1}`. The error is obtained by propagating the relative errors on the flux measures. Parameters ---------- flux : `~astropy.units.Quantity` The measured fluxes. flux_err : `~astropy.units.Quantity` The error on measured fluxes.
coords : `~astropy.units.Quantity` The coordinates at which the fluxes are measured. axis : int, optional Axis along which the value is computed. Returns ------- doubling_dict : dict Dictionary containing the characteristic flux doubling, halving and errors, with coordinates at which they were found. """ flux = np.atleast_2d(flux).swapaxes(0, axis).T flux_err = np.atleast_2d(flux_err).swapaxes(0, axis).T axes = np.diff(coords) / np.log2(flux[..., 1:] / flux[..., :-1]) axes_err_1 = ( np.diff(coords) * np.log(2) / flux[..., 1:] * np.log(flux[..., 1:] / flux[..., :-1]) ** 2 ) axes_err_2 = ( np.diff(coords) * np.log(2) / flux[..., :-1] * np.log(flux[..., 1:] / flux[..., :-1]) ** 2 ) axes_err = np.sqrt( (flux_err[..., 1:] * axes_err_1) ** 2 + (flux_err[..., :-1] * axes_err_2) ** 2 ) imin = np.expand_dims( np.argmin( np.where( np.logical_and(np.isfinite(axes), axes > 0), axes, np.inf * coords.unit ), axis=-1, ), axis=-1, ) imax = np.expand_dims( np.argmax( np.where( np.logical_and(np.isfinite(axes), axes < 0), axes, -np.inf * coords.unit ), axis=-1, ), axis=-1, ) index = np.concatenate([imin, imax], axis=-1) coord = np.take_along_axis(coords, index.flatten(), axis=0).reshape(index.shape) doubling = np.take_along_axis(axes, index, axis=-1) doubling_err = np.take_along_axis(axes_err, index, axis=-1) doubling_dict = { "doubling": doubling.T[0], "doubling_err": doubling_err.T[0], "doubling_coord": coord.T[0], "halving": np.abs(doubling.T[1]), "halving_err": doubling_err.T[1], "halving_coord": coord.T[1], } return doubling_dict def structure_function(flux, flux_err, time, tdelta_precision=5): """Compute the discrete structure function for a variable source. Parameters ---------- flux : `~astropy.units.Quantity` The measured fluxes. flux_err : `~astropy.units.Quantity` The error on measured fluxes. time : `~astropy.units.Quantity` The time coordinates at which the fluxes are measured. tdelta_precision : int, optional The number of decimal places to check to separate the time deltas. Default is 5. Returns ------- sf, distances : `~numpy.ndarray`, `~astropy.units.Quantity` Discrete structure function and array of time distances. References ---------- .. [Emmanoulopoulos2010] "On the use of structure functions to study blazar variability: caveats and problems", Emmanoulopoulos et al. (2010) https://academic.oup.com/mnras/article/404/2/931/968488 """ dist_matrix = (time[np.newaxis, :] - time[:, np.newaxis]).round( decimals=tdelta_precision ) distances = np.unique(dist_matrix) distances = distances[distances > 0] shape = distances.shape + flux.shape[1:] factor = np.zeros(shape) norm = np.zeros(shape) for i, distance in enumerate(distances): indexes = np.array(np.where(dist_matrix == distance)) for index in indexes.T: f = (flux[index[1], ...] - flux[index[0], ...]) ** 2 w = (flux[index[1], ...] / flux_err[index[1], ...]) * ( flux[index[0], ...] / flux_err[index[0], ...] ) f = np.nan_to_num(f) w = np.nan_to_num(w) factor[i] = factor[i] + f * w norm[i] = norm[i] + w sf = factor / norm return sf, distances def discrete_correlation(flux1, flux_err1, flux2, flux_err2, time1, time2, tau, axis=0): """Compute the discrete correlation function for a variable source. Parameters ---------- flux1, flux_err1: `~astropy.units.Quantity` The first set of measured fluxes and associated error. flux2, flux_err2 : `~astropy.units.Quantity` The second set of measured fluxes and associated error. time1, time2 : `~astropy.units.Quantity` The time coordinates at which the fluxes are measured. 
tau : `~astropy.units.Quantity` Size of the bins to compute the discrete correlation. axis : int, optional Axis along which the correlation is computed. Default is 0. Returns ------- bincenters: `~astropy.units.Quantity` Array of discrete time bins. discrete_correlation: `~numpy.ndarray` Array of discrete correlation function values for each bin. discrete_correlation_err : `~numpy.ndarray` Error associated to the discrete correlation values. References ---------- .. [Edelson1988] "THE DISCRETE CORRELATION FUNCTION: A NEW METHOD FOR ANALYZING UNEVENLY SAMPLED VARIABILITY DATA", Edelson et al. (1988) https://ui.adsabs.harvard.edu/abs/1988ApJ...333..646E/abstract """ flux1 = np.rollaxis(flux1, axis, 0) flux2 = np.rollaxis(flux2, axis, 0) if np.squeeze(flux1).shape[1:] != np.squeeze(flux2).shape[1:]: raise ValueError( "flux1 and flux2 must have the same squeezed shape, apart from the chosen axis." ) tau = tau.to(time1.unit) time2 = time2.to(time1.unit) mean1, mean2 = np.nanmean(flux1, axis=0), np.nanmean(flux2, axis=0) sigma1, sigma2 = np.nanstd(flux1, axis=0), np.nanstd(flux2, axis=0) udcf1 = (flux1 - mean1) / np.sqrt((sigma1**2 - np.nanmean(flux_err1, axis=0) ** 2)) udcf2 = (flux2 - mean2) / np.sqrt((sigma2**2 - np.nanmean(flux_err2, axis=0) ** 2)) udcf = np.empty(((flux1.shape[0],) + flux2.shape)) dist = u.Quantity(np.empty(((flux1.shape[0], flux2.shape[0]))), unit=time1.unit) for i, x1 in enumerate(udcf1): for j, x2 in enumerate(udcf2): udcf[i, j, ...] = x1 * x2 dist[i, j] = time1[i] - time2[j] maxfactor = np.floor(np.amax(dist) / tau).value + 1 minfactor = np.floor(np.amin(dist) / tau).value bins = ( np.linspace( minfactor, maxfactor, int(np.abs(maxfactor) + np.abs(minfactor) + 1) ) * tau ) bin_indices = np.digitize(dist, bins).flatten() udcf = np.reshape(udcf, (udcf.shape[0] * udcf.shape[1], -1)) discrete_correlation = np.array( [np.nanmean(udcf[bin_indices == i], axis=0) for i in range(1, len(bins))] ) discrete_correlation_err = [] for i in range(1, len(bins)): terms = (discrete_correlation[i - 1] - udcf[bin_indices == i]) ** 2 num = np.sqrt(np.nansum(terms, axis=0)) den = len(udcf[bin_indices == i]) - 1 discrete_correlation_err.append(num / den) bincenters = (bins[1:] + bins[:-1]) / 2 return bincenters, discrete_correlation, np.array(discrete_correlation_err) def TimmerKonig_lightcurve_simulator( power_spectrum, npoints, spacing, nchunks=10, random_state="random-seed", power_spectrum_params=None, mean=0.0, std=1.0, poisson=False, ): """Implementation of the Timmer-Koenig algorithm to simulate a time series from a power spectrum. Parameters ---------- power_spectrum : function Power spectrum used to generate the time series. It is expected to be a function mapping the input frequencies to the periodogram. npoints : int Number of points in the output time series. spacing : `~astropy.units.Quantity` Sample spacing, inverse of the sampling rate. The units are inherited by the resulting time axis. nchunks : int, optional Factor by which to multiply the length of the time series to avoid red noise leakage. Default is 10. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is "random-seed". power_spectrum_params : dict, optional Dictionary of parameters to be provided to the power spectrum function. mean : float, `~astropy.units.Quantity`, optional Desired mean of the final series. Default is 0. 
std : float, `~astropy.units.Quantity`, optional Desired standard deviation of the final series. Default is 1. poisson : bool, optional Whether to apply Poissonian noise to the final time series. Default is False. Returns ------- time_series : `~numpy.ndarray` Simulated time series. time_axis : `~astropy.units.Quantity` Time axis of the series in the same units as 'spacing'. It will be defined with length 'npoints', from 0 to 'npoints'*'spacing'. Examples -------- To pass the function to be used in the simulation one can use either the 'lambda' keyword or an extended definition. Parameters of the function can be passed using the 'power_spectrum_params' keyword. For example, these are three ways to pass a power law (red noise) with index 2: >>> from gammapy.stats import TimmerKonig_lightcurve_simulator >>> import astropy.units as u >>> def powerlaw(x): ... return x**(-2) >>> def powerlaw_with_parameters(x, i): ... return x**(-i) >>> ts, ta = TimmerKonig_lightcurve_simulator(lambda x: x**(-2), 20, 1*u.h) >>> ts2, ta2 = TimmerKonig_lightcurve_simulator(powerlaw, 20, 1*u.h) >>> ts3, ta3 = TimmerKonig_lightcurve_simulator(powerlaw_with_parameters, ... 20, 1*u.h, power_spectrum_params={"i":2}) References ---------- .. [Timmer1995] "On generating power law noise", J. Timmer and M. Koenig, section 3 https://ui.adsabs.harvard.edu/abs/1995A%26A...300..707T/abstract """ if not callable(power_spectrum): raise ValueError( "The power spectrum has to be provided as a callable function." ) if not isinstance(npoints * nchunks, int): raise TypeError("npoints and nchunks must be integers") if poisson: if isinstance(mean, u.Quantity): wmean = mean.value * spacing.value else: wmean = mean * spacing.value if wmean < 1.0: raise Warning( "Poisson noise was requested but the target mean is too low - resulting counts will likely be 0." ) random_state = get_random_state(random_state) npoints_ext = npoints * nchunks frequencies = np.fft.fftfreq(npoints_ext, spacing.value) # To obtain real data only the positive or negative part of the frequency is necessary.
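# For even npoints_ext, np.fft.fftfreq yields npoints_ext // 2 strictly negative
# frequencies, the largest in magnitude being the Nyquist frequency; for odd
# lengths there is no exact Nyquist bin. This is why the last Fourier
# coefficient is treated separately below.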
real_frequencies = np.sort(np.abs(frequencies[frequencies < 0])) if power_spectrum_params: periodogram = power_spectrum(real_frequencies, **power_spectrum_params) else: periodogram = power_spectrum(real_frequencies) real_part = random_state.normal(0, 1, len(periodogram) - 1) imaginary_part = random_state.normal(0, 1, len(periodogram) - 1) # Nyquist frequency component handling if npoints_ext % 2 == 0: idx0 = -2 random_factor = random_state.normal(0, 1) else: idx0 = -1 random_factor = random_state.normal(0, 1) + 1j * random_state.normal(0, 1) fourier_coeffs = np.concatenate( [ np.sqrt(0.5 * periodogram[:-1]) * (real_part + 1j * imaginary_part), np.sqrt(0.5 * periodogram[-1:]) * random_factor, ] ) fourier_coeffs = np.concatenate( [fourier_coeffs, np.conjugate(fourier_coeffs[idx0::-1])] ) fourier_coeffs = np.insert(fourier_coeffs, 0, 0) time_series = np.fft.ifft(fourier_coeffs).real ndiv = npoints_ext // (2 * nchunks) setstart = npoints_ext // 2 - ndiv setend = npoints_ext // 2 + ndiv if npoints % 2 != 0: setend = setend + 1 time_series = time_series[setstart:setend] time_series = (time_series - time_series.mean()) / time_series.std() time_series = time_series * std + mean if poisson: time_series = ( random_state.poisson( np.where(time_series >= 0, time_series, 0) * spacing.value ) / spacing.value ) time_axis = np.linspace(0, npoints * spacing.value, npoints) * spacing.unit return time_series, time_axis ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.240642 gammapy-1.3/gammapy/tests/0000755000175100001770000000000014721316215015214 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/tests/__init__.py0000644000175100001770000000033614721316200017321 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Gammapy integration and system tests. This package can be used for tests that involved several Gammapy sub-packages, or that don't fit anywhere else. """ ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.244642 gammapy-1.3/gammapy/utils/0000755000175100001770000000000014721316215015212 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/__init__.py0000644000175100001770000000047314721316200017321 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """ Utility functions and classes used throughout Gammapy. You have to import sub-modules of `gammapy.utils` directly, the `gammapy.utils` namespace is empty. Examples:: from gammapy.utils.interpolation import ScaledRegularGridInterpolator """ ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/array.py0000644000175100001770000000777014721316200016707 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utility functions to deal with arrays and quantities.""" import numpy as np import scipy.ndimage import scipy.signal from astropy.convolution import Gaussian2DKernel __all__ = [ "array_stats_str", "round_up_to_even", "round_up_to_odd", "shape_2N", "shape_divisible_by", "symmetric_crop_pad_width", ] def is_power2(n): """Check if an integer is a power of 2.""" return (n > 0) & ((n & (n - 1)) == 0) def array_stats_str(x, label=""): """Make a string summarising some stats for an array. Parameters ---------- x : array-like Array. 
label : str, optional Label. Returns ------- stats_str : str String with array stats. """ x = np.asanyarray(x) ss = "" if label: ss += f"{label:15s}: " min = x.min() max = x.max() size = x.size fmt = "size = {size:5d}, min = {min:6.3f}, max = {max:6.3f}\n" ss += fmt.format(**locals()) return ss def shape_2N(shape, N=3): """ Round a given shape to values that are divisible by 2^N. Parameters ---------- shape : tuple Input shape. N : int, optional Exponent of two. Default is 3. Returns ------- new_shape : tuple New shape extended to integers divisible by 2^N. """ shape = np.array(shape) new_shape = shape + (2**N - np.mod(shape, 2**N)) return tuple(new_shape) def shape_divisible_by(shape, factor): """ Round a given shape to values that are divisible by factor. Parameters ---------- shape : tuple Input shape. factor : int Divisor. Returns ------- new_shape : tuple New shape extended to integers divisible by factor. """ shape = np.array(shape) # Add the smallest non-negative offset that reaches the next multiple of ``factor``. new_shape = shape + (-shape % factor) return tuple(new_shape) def round_up_to_odd(f): """Round float to odd integer. Parameters ---------- f : float Float value. Returns ------- int : int Odd integer. """ return (np.ceil(f) // 2 * 2 + 1).astype(int) def round_up_to_even(f): """Round float to even integer. Parameters ---------- f : float Float value. Returns ------- int : int Even integer. """ return (np.ceil(f + 1) // 2 * 2).astype(int) def symmetric_crop_pad_width(shape, new_shape): """ Compute symmetric crop or pad width. To obtain a new shape from a given old shape of an array. Parameters ---------- shape : tuple Old shape. new_shape : tuple or str New shape. """ xdiff = abs(shape[1] - new_shape[1]) ydiff = abs(shape[0] - new_shape[0]) if (np.array([xdiff, ydiff]) % 2).any(): raise ValueError( "For symmetric crop / pad width, difference to new shape " "must be even in all axes." ) ywidth = (ydiff // 2, ydiff // 2) xwidth = (xdiff // 2, xdiff // 2) return ywidth, xwidth def _fftconvolve_wrap(kernel, data): # wrap gaussian filter as a special case, because the gain in # performance is factor ~100 if isinstance(kernel, Gaussian2DKernel): width = kernel.model.x_stddev.value norm = kernel.array.sum() return norm * scipy.ndimage.gaussian_filter(data, width) else: return scipy.signal.fftconvolve( data.astype(np.float32), kernel.array, mode="same" ) def scale_cube(data, kernels): """ Compute scale space cube. Compute scale space cube by convolving the data with a set of kernels and stack the resulting images along the third axis. Parameters ---------- data : `~numpy.ndarray` Input data. kernels : list of `~astropy.convolution.Kernel` List of convolution kernels. Returns ------- cube : `~numpy.ndarray` Array of shape ``data.shape + (len(kernels),)``. """ return np.dstack([_fftconvolve_wrap(kernel, data) for kernel in kernels]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/check.py0000644000175100001770000000323514721316200016636 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging import os import yaml from gammapy.scripts.download import RELEASE, cli_download_datasets from gammapy.scripts.info import cli_info log = logging.getLogger(__name__) def check_tutorials_setup(download_datasets_path="./gammapy-data"): """Check tutorials setup and download data if not available. Parameters ---------- download_datasets_path : str, optional Path to download the data. Default is "./gammapy-data".
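Examples
--------
A minimal usage sketch; the download is skipped when the ``GAMMAPY_DATA``
environment variable is already set (the target path is illustrative)::

    from gammapy.utils.check import check_tutorials_setup

    check_tutorials_setup(download_datasets_path="./gammapy-data")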
""" if "GAMMAPY_DATA" not in os.environ: log.info( "Missing example datasets, downloading to {download_datasets_path} now..." ) cli_download_datasets.callback(out=download_datasets_path, release=RELEASE) os.environ["GAMMAPY_DATA"] = download_datasets_path cli_info.callback(system=True, version=True, dependencies=True, envvar=True) def yaml_checksum(yaml_content): """Compute a MD5 checksum for a given yaml string input.""" import hashlib # Calculate the MD5 checksum checksum = hashlib.md5(yaml_content.encode("utf-8")).hexdigest() return checksum def verify_checksum(yaml_content, checksum): """Compare MD5 checksum for yaml_content with input checksum.""" return yaml_checksum(yaml_content) == checksum def add_checksum(yaml_str, sort_keys=False, indent=4, width=50): """Append a checksum at the end of the yaml string.""" checksum = {"checksum": yaml_checksum(yaml_str)} checksum = yaml.dump( checksum, sort_keys=True, indent=4, # indent, width=80, # width, default_flow_style=False, ) return yaml_str + checksum ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/cluster.py0000644000175100001770000001213114721316200017235 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utilities for hierarchical/agglomerative clustering.""" import numpy as np import scipy.cluster.hierarchy as sch __all__ = ["standard_scaler", "hierarchical_clustering"] def standard_scaler(features): r"""Compute standardized features by removing the mean and scaling to unit variance. Calculated through: .. math:: f_\text{scaled} = \frac{f-\text{mean}(f)}{\text{std}(f)} . Parameters ---------- features : `~astropy.table.Table` Table containing the features. Returns ------- scaled_features : `~astropy.table.Table` Table containing the scaled features (dimensionless). Examples -------- Compute standardized features of a cluster observations based on their IRF quantities:: >>> from gammapy.data.utils import get_irfs_features >>> from gammapy.data import DataStore >>> from gammapy.utils.cluster import standard_scaler >>> from astropy.coordinates import SkyCoord >>> import astropy.units as u >>> data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") >>> obs_ids = data_store.obs_table["OBS_ID"][:3] >>> obs = data_store.get_observations(obs_ids) >>> position = SkyCoord(329.716 * u.deg, -30.225 * u.deg, frame="icrs") >>> names = ["edisp-res", "psf-radius"] >>> features_irfs = get_irfs_features( ... obs, ... energy_true="1 TeV", ... position=position, ... names=names ... ) >>> scaled_features_irfs = standard_scaler(features_irfs) >>> print(scaled_features_irfs) edisp-res obs_id psf-radius ------------------- ------ -------------------- -0.1379190199428797 20136 -0.18046952655570045 1.2878662980210884 20137 1.3049664466089965 -1.1499472780781963 20151 -1.1244969200533408 """ scaled_features = features.copy() for col in scaled_features.columns: if col not in ["obs_id", "dataset_name"]: data = scaled_features[col].data scaled_features[col] = (data - data.mean()) / data.std() return scaled_features def hierarchical_clustering(features, linkage_kwargs=None, fcluster_kwargs=None): """Hierarchical clustering using given features. Parameters ---------- features : `~astropy.table.Table` Table containing the features. linkage_kwargs : dict, optional Arguments forwarded to `scipy.cluster.hierarchy.linkage`. Default is None, which uses method="ward" and metric="euclidean". 
fcluster_kwargs : dict, optional Arguments forwarded to `scipy.cluster.hierarchy.fcluster`. Default is None, which uses criterion="maxclust" and t=3. Returns ------- features : `~astropy.table.Table` Table containing the features and an extra column for the groups labels. Examples -------- Cluster features into t=2 groups with a corresponding label for each group:: >>> from gammapy.data.utils import get_irfs_features >>> from gammapy.data import DataStore >>> from gammapy.utils.cluster import standard_scaler, hierarchical_clustering >>> from astropy.coordinates import SkyCoord >>> import astropy.units as u >>> data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/") >>> obs_ids = data_store.obs_table["OBS_ID"][13:20] >>> obs = data_store.get_observations(obs_ids) >>> position = SkyCoord(329.716 * u.deg, -30.225 * u.deg, frame="icrs") >>> names = ["edisp-res", "psf-radius"] >>> features_irfs = get_irfs_features( ... obs, ... energy_true="1 TeV", ... position=position, ... names=names ... ) >>> scaled_features_irfs = standard_scaler(features_irfs) >>> features = hierarchical_clustering(scaled_features_irfs, fcluster_kwargs={"t": 2}) >>> print(features) edisp-res obs_id psf-radius labels ------------------- ------ ------------------- ------ -1.3020791585772495 20326 -1.2471938975366008 2 -1.3319831545301117 20327 -1.4586649826004114 2 -0.7763307219821931 20339 -0.6705024680435898 2 0.9677107409819438 20343 0.9500979841335693 1 0.820562952023891 20344 0.8160964882165554 1 0.7771617763704126 20345 0.7718272408581743 1 0.8449575657133206 20346 0.8383396349722769 1 """ features = features.copy() features_array = np.array( [ features[col].data for col in features.columns if col not in ["obs_id", "dataset_name"] ] ).T default_linkage_kwargs = dict(method="ward", metric="euclidean") if linkage_kwargs is not None: default_linkage_kwargs.update(linkage_kwargs) pairwise_distances = sch.distance.pdist(features_array) linkage = sch.linkage(pairwise_distances, **default_linkage_kwargs) default_fcluster_kwargs = dict(criterion="maxclust", t=3) if fcluster_kwargs is not None: default_fcluster_kwargs.update(fcluster_kwargs) labels = sch.fcluster(linkage, **default_fcluster_kwargs) features["labels"] = labels return features ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/compat.py0000644000175100001770000000045214721316200017042 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from astropy.utils import minversion __all__ = [ "COPY_IF_NEEDED", ] # This is copied from astropy.utils.compat NUMPY_LT_2_0 = not minversion(np, "2.0.0.dev") COPY_IF_NEEDED = False if NUMPY_LT_2_0 else None ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.244642 gammapy-1.3/gammapy/utils/coordinates/0000755000175100001770000000000014721316215017524 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/coordinates/__init__.py0000644000175100001770000000074714721316200021637 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Astronomical coordinate calculation utility functions.""" from .fov import fov_to_sky, sky_to_fov from .other import ( D_SUN_TO_GALACTIC_CENTER, cartesian, galactic, motion_since_birth, polar, velocity_glon_glat, ) __all__ = [ "cartesian", "D_SUN_TO_GALACTIC_CENTER", "fov_to_sky", "galactic", "motion_since_birth", "polar", 
"sky_to_fov", "velocity_glon_glat", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/coordinates/fov.py0000644000175100001770000000412214721316200020661 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from astropy.coordinates import SkyCoord, SkyOffsetFrame __all__ = ["fov_to_sky", "sky_to_fov"] def fov_to_sky(lon, lat, lon_pnt, lat_pnt): """Transform field-of-view coordinates to sky coordinates. Parameters ---------- lon, lat : `~astropy.units.Quantity` Field-of-view coordinate to be transformed. lon_pnt, lat_pnt : `~astropy.units.Quantity` Coordinate specifying the pointing position. (i.e. the center of the field of view.) Returns ------- lon_t, lat_t : `~astropy.units.Quantity` Transformed sky coordinate. """ # Create a frame that is centered on the pointing position center = SkyCoord(lon_pnt, lat_pnt) fov_frame = SkyOffsetFrame(origin=center) # Define coordinate to be transformed. # Need to switch the sign of the longitude angle here # because this axis is reversed in our definition of the FoV-system target_fov = SkyCoord(-lon, lat, frame=fov_frame) # Transform into celestial system (need not be ICRS) target_sky = target_fov.icrs return target_sky.ra, target_sky.dec def sky_to_fov(lon, lat, lon_pnt, lat_pnt): """Transform sky coordinates to field-of-view coordinates. Parameters ---------- lon, lat : `~astropy.units.Quantity` Sky coordinate to be transformed. lon_pnt, lat_pnt : `~astropy.units.Quantity` Coordinate specifying the pointing position. (i.e. the center of the field of view.) Returns ------- lon_t, lat_t : `~astropy.units.Quantity` Transformed field-of-view coordinate. """ # Create a frame that is centered on the pointing position center = SkyCoord(lon_pnt, lat_pnt) fov_frame = SkyOffsetFrame(origin=center) # Define coordinate to be transformed. target_sky = SkyCoord(lon, lat) # Transform into FoV-system target_fov = target_sky.transform_to(fov_frame) # Switch sign of longitude angle since this axis is # reversed in our definition of the FoV-system return -target_fov.lon, target_fov.lat ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/coordinates/other.py0000644000175100001770000000554314721316200021220 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Other coordinate and distance-related functions.""" import numpy as np from astropy.units import Quantity, Unit __all__ = [ "cartesian", "D_SUN_TO_GALACTIC_CENTER", "galactic", "motion_since_birth", "polar", "velocity_glon_glat", ] # TODO: replace this with the default from the Galactocentric frame in astropy.coordinates D_SUN_TO_GALACTIC_CENTER = Quantity(8.5, "kpc") """Default assumed distance from the Sun to the Galactic center (`~astropy.units.Quantity`)""" def cartesian(r, theta): """Convert polar coordinates to cartesian coordinates.""" x = r * np.cos(theta) y = r * np.sin(theta) return x, y def polar(x, y): """Convert cartesian coordinates to polar coordinates.""" r = np.sqrt(x**2 + y**2) theta = np.arctan2(y, x) return r, theta def galactic(x, y, z, obs_pos=None): """Compute galactic coordinates lon, lat and distance. For given position in cartesian coordinates (kpc). 
""" obs_pos = obs_pos or [D_SUN_TO_GALACTIC_CENTER, 0, 0] y_prime = y + D_SUN_TO_GALACTIC_CENTER d = np.sqrt(x**2 + y_prime**2 + z**2) glon = np.arctan2(x, y_prime).to("deg") glat = np.arcsin(z / d).to("deg") return d, glon, glat def velocity_glon_glat(x, y, z, vx, vy, vz): """ Compute projected angular velocity in galactic coordinates. Parameters ---------- x, y, z : `~astropy.units.Quantity` Position in x, y, z direction. vx, vy, vz : `~astropy.units.Quantity` Velocity in x, y, z direction. Returns ------- v_glon, v_glat : `~astropy.units.Quantity` Projected velocity in Galactic sky coordinates. """ y_prime = y + D_SUN_TO_GALACTIC_CENTER d = np.sqrt(x**2 + y_prime**2 + z**2) r = np.sqrt(x**2 + y_prime**2) v_glon = (-y_prime * vx + x * vy) / r**2 v_glat = vz / (np.sqrt(1 - (z / d) ** 2) * d) - np.sqrt( vx**2 + vy**2 + vz**2 ) * z / (np.sqrt(1 - (z / d) ** 2) * d**2) return v_glon * Unit("rad"), v_glat * Unit("rad") def motion_since_birth(v, age, theta, phi): """ Compute motion of an astrophysical object with a given velocity, direction and age. Parameters ---------- v : `~astropy.units.Quantity` Absolute value of the velocity. age : `~astropy.units.Quantity` Age of the source. theta, phi : `~astropy.units.Quantity` Angular direction of the velocity. Returns ------- dx, dy, dz : `~astropy.units.Quantity` Displacement in x, y, z direction. vx, vy, vz : `~astropy.units.Quantity` Velocity in x, y, z direction. """ vx = v * np.cos(phi) * np.sin(theta) vy = v * np.sin(phi) * np.sin(theta) vz = v * np.cos(theta) # Compute new positions dx = vx * age dy = vy * age dz = vz * age return dx, dy, dz, vx, vy, vz ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.244642 gammapy-1.3/gammapy/utils/coordinates/tests/0000755000175100001770000000000014721316215020666 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/coordinates/tests/__init__.py0000644000175100001770000000010014721316200022760 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/coordinates/tests/test_fov.py0000644000175100001770000000477114721316200023074 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from numpy.testing import assert_allclose import astropy.units as u from gammapy.utils.coordinates import fov_to_sky, sky_to_fov def test_fov_to_sky(): # test some simple cases az, alt = fov_to_sky(1 * u.deg, 1 * u.deg, 0 * u.deg, 0 * u.deg) assert_allclose(az.value, 359) assert_allclose(alt.value, 1) az, alt = fov_to_sky(-1 * u.deg, 1 * u.deg, 180 * u.deg, 0 * u.deg) assert_allclose(az.value, 181) assert_allclose(alt.value, 1) az, alt = fov_to_sky(1 * u.deg, 0 * u.deg, 0 * u.deg, 60 * u.deg) assert_allclose(az.value, 358, rtol=1e-3) assert_allclose(alt.value, 59.985, rtol=1e-3) # these are cross-checked with the # transformation as implemented in H.E.S.S. 
fov_altaz_lon = [0.7145614, 0.86603433, -0.05409698, 2.10295248] fov_altaz_lat = [-1.60829115, -1.19643974, 0.45800984, 3.26844192] az_pointing = [52.42056255, 52.24706061, 52.06655505, 51.86795724] alt_pointing = [51.11908203, 51.23454751, 51.35376141, 51.48385814] az, alt = fov_to_sky( fov_altaz_lon * u.deg, fov_altaz_lat * u.deg, az_pointing * u.deg, alt_pointing * u.deg, ) assert_allclose(az.value, [51.320575, 50.899125, 52.154053, 48.233023]) assert_allclose(alt.value, [49.505451, 50.030165, 51.811739, 54.700102]) def test_sky_to_fov(): # test some simple cases lon, lat = sky_to_fov(1 * u.deg, 1 * u.deg, 0 * u.deg, 0 * u.deg) assert_allclose(lon.value, -1) assert_allclose(lat.value, 1) lon, lat = sky_to_fov(269 * u.deg, 0 * u.deg, 270 * u.deg, 0 * u.deg) assert_allclose(lon.value, 1) assert_allclose(lat.value, 0, atol=1e-7) lon, lat = sky_to_fov(1 * u.deg, 60 * u.deg, 0 * u.deg, 60 * u.deg) assert_allclose(lon.value, -0.5, rtol=1e-3) assert_allclose(lat.value, 0.003779, rtol=1e-3) # these are cross-checked with the # transformation as implemented in H.E.S.S. az = [51.320575, 50.899125, 52.154053, 48.233023] alt = [49.505451, 50.030165, 51.811739, 54.700102] az_pointing = [52.42056255, 52.24706061, 52.06655505, 51.86795724] alt_pointing = [51.11908203, 51.23454751, 51.35376141, 51.48385814] lon, lat = sky_to_fov( az * u.deg, alt * u.deg, az_pointing * u.deg, alt_pointing * u.deg ) assert_allclose( lon.value, [0.7145614, 0.86603433, -0.05409698, 2.10295248], rtol=1e-5 ) assert_allclose( lat.value, [-1.60829115, -1.19643974, 0.45800984, 3.26844192], rtol=1e-5 ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/coordinates/tests/test_other.py0000644000175100001770000000056314721316200023416 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from astropy.units import Quantity from gammapy.utils.coordinates import galactic def test_galactic(): x = Quantity(0, "kpc") y = Quantity(0, "kpc") z = Quantity(0, "kpc") reference = (Quantity(8.5, "kpc"), Quantity(0, "deg"), Quantity(0, "deg")) assert galactic(x, y, z) == reference ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/deprecation.py0000644000175100001770000000300714721316200020053 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst __all__ = [ "GammapyDeprecationWarning", "deprecated", "deprecated_renamed_argument", "deprecated_attribute", ] class GammapyDeprecationWarning(Warning): """The Gammapy deprecation warning.""" def deprecated(since, **kwargs): """ Use to mark a function or class as deprecated. Reuses Astropy's deprecated decorator. Check arguments and usage in `~astropy.utils.decorator.deprecated`. Parameters ---------- since : str The release at which this API became deprecated. This is required. """ from astropy.utils import deprecated kwargs["warning_type"] = GammapyDeprecationWarning return deprecated(since, **kwargs) def deprecated_renamed_argument(old_name, new_name, since, **kwargs): """Deprecate a _renamed_ or _removed_ function argument. Check arguments and usage in `~astropy.utils.decorator.deprecated_renamed_argument`. """ from astropy.utils import deprecated_renamed_argument kwargs["warning_type"] = GammapyDeprecationWarning return deprecated_renamed_argument(old_name, new_name, since, **kwargs) def deprecated_attribute(name, since, **kwargs): """ Use to mark a public attribute as deprecated. 
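For example, in a class body (a sketch; by convention the actual value is stored under a private name such as ``_lazy``)::

    lazy = deprecated_attribute("lazy", "1.3")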
This creates a property that will warn when the given attribute name is accessed. """ from astropy.utils import deprecated_attribute kwargs["warning_type"] = GammapyDeprecationWarning return deprecated_attribute(name, since, **kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/docs.py0000644000175100001770000001465214721316200016516 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Gammapy documentation generator utilities. This is for the Sphinx docs build of Gammapy itself; it is used from ``docs/conf.py``. It should not be imported or used from other scripts or packages, because we will change it for our use case whenever we like. Docutils / Sphinx is notoriously hard to understand / learn about. Here are some good resources with working examples: - https://doughellmann.com/blog/2010/05/09/defining-custom-roles-in-sphinx/ - http://docutils.sourceforge.net/docs/howto/rst-directives.html - https://github.com/docutils/docutils/blob/master/docutils/docutils/parsers/rst/directives/images.py - https://github.com/sphinx-doc/sphinx/blob/master/sphinx/directives/other.py - https://github.com/bokeh/bokeh/tree/master/src/bokeh/sphinxext """ import os from pathlib import Path from docutils.parsers.rst import Directive from docutils.parsers.rst.directives import register_directive from docutils.parsers.rst.directives.body import CodeBlock from docutils.parsers.rst.directives.images import Image from docutils.parsers.rst.directives.misc import Include, Raw from sphinx.util import logging from gammapy.analysis import AnalysisConfig try: gammapy_data_path = Path(os.environ["GAMMAPY_DATA"]) HAS_GP_DATA = True except KeyError: HAS_GP_DATA = False log = logging.getLogger(__name__) class AccordionHeader(Directive): """Insert HTML code to open an accordion box in the How To.""" option_spec = {"id": str, "title": str, "link": str} def run(self): raw = f"""
{self.options["title"]}
""" if self.options.get("link", None): raw += f""" Straight to tutorial… """ raw += f"""
""" include_lines = raw.splitlines() c = Raw( self.name, ["html"], self.options, include_lines, # content self.lineno, self.content_offset, self.block_text, self.state, self.state_machine, ) return c.run() class AccordionFooter(Directive): """Insert HTML code to close an accordion box in the How To.""" def run(self): raw = """
""" include_lines = raw.splitlines() c = Raw( self.name, ["html"], self.options, include_lines, # content self.lineno, self.content_offset, self.block_text, self.state, self.state_machine, ) return c.run() class HowtoHLI(Include): """Directive to insert how-to for high-level interface.""" def run(self): raw = "" section = self.arguments[0] doc = AnalysisConfig._get_doc_sections() for keyword in doc.keys(): if section in ["", keyword]: raw += doc[keyword] include_lines = raw.splitlines() codeblock = CodeBlock( self.name, [], self.options, include_lines, # content self.lineno, self.content_offset, self.block_text, self.state, self.state_machine, ) return codeblock.run() class DocsImage(Image): """Directive to add optional images from gammapy-data.""" def run(self): filename = self.arguments[0] if HAS_GP_DATA: path = gammapy_data_path / "figures" / filename if not path.is_file(): msg = "Error in {} directive: File not found: {}".format( self.name, path ) raise self.error(msg) # Usually Sphinx doesn't support absolute paths # But passing a POSIX string of the absolute path # with an extra "/" at the start seems to do the trick self.arguments[0] = "/" + path.absolute().as_posix() else: self.warning( "GAMMAPY_DATA not available. " "Missing image: name: {!r} filename: {!r}".format(self.name, filename) ) self.options["alt"] = self.arguments[1] if len(self.arguments) > 1 else "" return super().run() class SubstitutionCodeBlock(CodeBlock): """Similar to CodeBlock but replaces placeholders with variables.""" def run(self): """Replace placeholders with given variables.""" app = self.state.document.settings.env.app new_content = [] self.content = self.content existing_content = self.content for item in existing_content: for pair in app.config.substitutions: original, replacement = pair item = item.replace(original, replacement) new_content.append(item) self.content = new_content return list(CodeBlock.run(self)) def gammapy_sphinx_ext_activate(): if HAS_GP_DATA: log.info(f"*** Found GAMMAPY_DATA = {gammapy_data_path}") log.info("*** Nice!") else: log.info("*** gammapy-data *not* found.") log.info("*** Set the GAMMAPY_DATA environment variable!") log.info("*** Docs build will be incomplete.") # Register our directives and roles with Sphinx register_directive("gp-image", DocsImage) register_directive("gp-howto-hli", HowtoHLI) register_directive("accordion-header", AccordionHeader) register_directive("accordion-footer", AccordionFooter) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/fits.py0000644000175100001770000001451614721316200016532 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import logging import sys import astropy.units as u from astropy.coordinates import AltAz, Angle, EarthLocation, SkyCoord from astropy.io import fits from astropy.units import Quantity from .scripts import make_path log = logging.getLogger(__name__) __all__ = ["earth_location_from_dict", "LazyFitsData", "HDULocation"] class HDULocation: """HDU localisation, loading and Gammapy object mapper. This represents one row in `HDUIndexTable`. It's more a helper class, that is wrapped by `~gammapy.data.Observation`, usually those objects will be used to access data. See also :ref:`gadf:hdu-index`. 
""" def __init__( self, hdu_class, base_dir=".", file_dir=None, file_name=None, hdu_name=None, cache=True, format=None, ): self.hdu_class = hdu_class self.base_dir = base_dir self.file_dir = file_dir self.file_name = file_name self.hdu_name = hdu_name self.cache = cache self.format = format def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def info(self, file=None): """Print some summary information to stdout.""" if not file: file = sys.stdout print(f"HDU_CLASS = {self.hdu_class}", file=file) print(f"BASE_DIR = {self.base_dir}", file=file) print(f"FILE_DIR = {self.file_dir}", file=file) print(f"FILE_NAME = {self.file_name}", file=file) print(f"HDU_NAME = {self.hdu_name}", file=file) def path(self, abs_path=True): """Full filename path. Include ``base_dir`` if ``abs_path`` is True. """ path = make_path(self.base_dir) / self.file_dir / self.file_name if abs_path and path.exists(): return path else: return make_path(self.file_dir) / self.file_name def get_hdu(self): """Get HDU.""" filename = self.path(abs_path=True) # Here we're intentionally not calling `with fits.open` # because we don't want the file to remain open. hdu_list = fits.open(str(filename), memmap=False) return hdu_list[self.hdu_name] def load(self): """Load HDU as appropriate class.""" from gammapy.irf import IRF_REGISTRY hdu_class = self.hdu_class filename = self.path() hdu = self.hdu_name if hdu_class == "events": from gammapy.data import EventList return EventList.read(filename, hdu=hdu) elif hdu_class == "gti": from gammapy.data.gti import GTI return GTI.read(filename, hdu=hdu) elif hdu_class == "map": from gammapy.maps import Map return Map.read(filename, hdu=hdu, format=self.format) elif hdu_class == "pointing": # FIXME: support loading the pointing table from gammapy.data import FixedPointingInfo return FixedPointingInfo.read(filename, hdu=hdu) else: cls = IRF_REGISTRY.get_cls(hdu_class) return cls.read(filename, hdu=hdu) class LazyFitsData(object): """A lazy FITS data descriptor. Parameters ---------- cache : bool Whether to cache the data. """ def __init__(self, cache=True): self.cache = cache def __set_name__(self, owner, name): self.name = name def __get__(self, instance, objtype): if instance is None: # Accessed on a class, not an instance return self try: return instance.__dict__[self.name] except KeyError: hdu_loc = instance.__dict__[f"_{self.name}_hdu"] try: value = hdu_loc.load() except KeyError: value = None log.warning(f"HDU '{hdu_loc.hdu_name}' not found") if self.cache and hdu_loc.cache: instance.__dict__[self.name] = value return value def __set__(self, instance, value): if isinstance(value, HDULocation): instance.__dict__[f"_{self.name}_hdu"] = value else: instance.__dict__[self.name] = value def earth_location_from_dict(meta): """Create `~astropy.coordinates.EarthLocation` from FITS header dictionary.""" lon = Angle(meta["GEOLON"], "deg") lat = Angle(meta["GEOLAT"], "deg") if "GEOALT" in meta: height = Quantity(meta["GEOALT"], "meter") elif "ALTITUDE" in meta: height = Quantity(meta["ALTITUDE"], "meter") else: raise KeyError("The GEOALT or ALTITUDE header keyword must be set") return EarthLocation(lon=lon, lat=lat, height=height) def earth_location_to_dict(location): """Convert `~astropy.coordinates.EarthLocation` to FITS header dictionary.""" return { "GEOLON": location.lon.deg, "GEOLAT": location.lat.deg, "ALTITUDE": location.height.to_value(u.m), } def skycoord_from_dict(header, frame="icrs", ext="PNT"): """Create `~astropy.coordinates.SkyCoord` from a dictionary of FITS keywords. Parameters ---------- header : dict The input dictionary. frame : {"icrs", "galactic", "altaz"} The frame to use. Default is 'icrs'. ext: str, optional The keyword extension to apply to the keywords names. Default is 'PNT'. Returns ------- skycoord : `~astropy.coordinates.skycoord` The input SkyCoord. 
""" ext = "_" + ext if ext != "" else "" if frame == "altaz": alt = header.get("ALT" + ext, None) az = header.get("AZ" + ext, None) return ( AltAz(alt=alt * u.deg, az=az * u.deg) if (alt is not None and az is not None) else None ) elif frame == "icrs": coords = header.get("RA" + ext, None), header.get("DEC" + ext, None) elif frame == "galactic": coords = header.get("GLON" + ext, None), header.get("GLAT" + ext, None) else: raise ValueError( f"Unsupported frame {frame}. Select in 'icrs', 'galactic', 'altaz'." ) if coords[0] is not None and coords[1] is not None: return SkyCoord(coords[0], coords[1], unit="deg", frame=frame) else: return None ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/gauss.py0000644000175100001770000002241414721316200016703 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Multi-Gaussian distribution utilities (Gammapy internal).""" import html import numpy as np from astropy import units as u from gammapy.utils.roots import find_roots class Gauss2DPDF: """2D symmetric Gaussian PDF. Reference: http://en.wikipedia.org/wiki/Multivariate_normal_distribution#Bivariate_case Parameters ---------- sigma : float Gaussian width. """ def __init__(self, sigma=1): self.sigma = sigma def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" @property def _sigma2(self): """Sigma squared as a float.""" return self.sigma * self.sigma @property def amplitude(self): """PDF amplitude at the center as a float.""" return self.__call(0, 0) def __call__(self, x, y=0): """dp / (dx dy) at position (x, y). Parameters ---------- x : `~numpy.ndarray` x coordinate. y : `~numpy.ndarray`, optional y coordinate. Returns ------- dpdxdy : `~numpy.ndarray` dp / (dx dy). """ theta2 = x * x + y * y amplitude = 1 / (2 * np.pi * self._sigma2) exponent = -0.5 * theta2 / self._sigma2 return amplitude * np.exp(exponent) def dpdtheta2(self, theta2): """dp / dtheta2 at position theta2 = theta ^ 2. Parameters ---------- theta2 : `~numpy.ndarray` Offset squared. Returns ------- dpdtheta2 : `~numpy.ndarray` dp / dtheta2. """ amplitude = 1 / (2 * self._sigma2) exponent = -0.5 * theta2 / self._sigma2 return amplitude * np.exp(exponent) def containment_fraction(self, rad): """Containment fraction. Parameters ---------- rad : `~numpy.ndarray` Offset. Returns ------- containment_fraction : `~numpy.ndarray` Containment fraction. """ return 1 - np.exp(-0.5 * rad**2 / self._sigma2) def containment_radius(self, containment_fraction): """Containment angle for a given containment fraction. Parameters ---------- containment_fraction : `~numpy.ndarray` Containment fraction. Returns ------- containment_radius : `~numpy.ndarray` Containment radius. """ return self.sigma * np.sqrt(-2 * np.log(1 - containment_fraction)) def gauss_convolve(self, sigma): """Convolve with another Gaussian 2D PDF. Parameters ---------- sigma : `~numpy.ndarray` or float Gaussian width of the new Gaussian 2D PDF to convolve with. Returns ------- gauss_convolve : `~gammapy.modeling.models.Gauss2DPDF` Convolution of both Gaussians. """ new_sigma = np.sqrt(self._sigma2 + sigma**2) return Gauss2DPDF(new_sigma) class MultiGauss2D: """Sum of multiple 2D Gaussians. Parameters ---------- sigmas : `~numpy.ndarray` Widths of the Gaussians to add. norms : `~numpy.ndarray`, optional Normalizations of the Gaussians to add. Notes ----- * This sum is no longer a PDF, it is not normalized to 1. * The "norm" of each component represents the 2D integral, not the amplitude at the origin. """ def __init__(self, sigmas, norms=None): # If no norms are given, you have a PDF. self.components = [Gauss2DPDF(sigma) for sigma in sigmas] if norms is None: self.norms = np.ones(len(self.components)) else: self.norms = norms def __call__(self, x, y=0): """dp / (dx dy) at position (x, y). Parameters ---------- x : `~numpy.ndarray` x coordinate. y : `~numpy.ndarray`, optional y coordinate. Default is 0. Returns ------- total : `~numpy.ndarray` dp / (dx dy). """ values = [] for norm, component in zip(self.norms, self.components): values.append(norm * component(x, y)) return np.stack(values).sum(axis=0) @property def n_components(self): """Number of components as an integer.""" return len(self.components) @property def sigmas(self): """Array of Gaussian widths as an `~numpy.ndarray`.""" return u.Quantity([_.sigma for _ in self.components]) @property def integral(self): """Integral as sum of norms as an `~numpy.ndarray`.""" return np.nansum(self.norms, axis=0) @property def amplitude(self): """Amplitude at the center as a float.""" return self.__call__(0, 0) @property def max_sigma(self): """Largest Gaussian width as a float.""" return self.sigmas.max() @property def eff_sigma(self): r"""Effective Gaussian width for single-Gauss approximation as a float. Notes ----- The effective Gaussian width is given by: .. 
.. math:: \sigma_\mathrm{eff} = \sqrt{\sum_i N_i \sigma_i^2} where ``N`` is normalization and ``sigma`` is width. """ sigma2s = [component._sigma2 for component in self.components] return np.sqrt(np.sum(self.norms * sigma2s)) def dpdtheta2(self, theta2): """dp / dtheta2 at position theta2 = theta ^ 2. Parameters ---------- theta2 : `~numpy.ndarray` Offset squared. Returns ------- dpdtheta2 : `~numpy.ndarray` dp / dtheta2. """ values = [] # Actually this is only a PDF if sum(norms) == 1 for norm, component in zip(self.norms, self.components): values.append(norm * component.dpdtheta2(theta2)) return np.sum(values, axis=0) def normalize(self): """Normalize the sum of norms to unity, in place.""" with np.errstate(divide="ignore", invalid="ignore"): self.norms = np.nan_to_num(self.norms / self.integral) def containment_fraction(self, rad): """Containment fraction. Parameters ---------- rad : `~numpy.ndarray` Offset. Returns ------- containment_fraction : `~numpy.ndarray` Containment fraction. """ values = [] for norm, component in zip(self.norms, self.components): values.append(norm * component.containment_fraction(rad)) return np.sum(values, axis=0) def containment_radius(self, containment_fraction): """Containment angle for a given containment fraction. Parameters ---------- containment_fraction : `~numpy.ndarray` Containment fraction. Returns ------- containment_radius : `~numpy.ndarray` Containment radius. """ rad_max = 1e3 def f(rad): # positive if theta too large return self.containment_fraction(rad * u.deg) - containment_fraction roots, res = find_roots(f, lower_bound=0, upper_bound=rad_max, nbin=1) if np.isnan(roots[0]) or np.allclose(roots[0], rad_max): rad = np.inf else: rad = roots[0] return rad * u.deg def match_sigma(self, containment_fraction): """Compute equivalent Gauss width. Find the sigma of a single-Gaussian distribution that approximates this one, such that theta matches for a given containment. Parameters ---------- containment_fraction : `~numpy.ndarray` Containment fraction. Returns ------- sigma : `~numpy.ndarray` Equivalent Gaussian width. """ theta1 = self.containment_radius(containment_fraction) theta2 = Gauss2DPDF(sigma=1).containment_radius(containment_fraction) return theta1 / theta2 def gauss_convolve(self, sigma, norm=1): """Convolve with another Gaussian. Compute new norms and sigmas of all the components such that the new distribution represents the convolved old distribution by a Gaussian of width sigma and then multiplied by norm. This MultiGauss2D is unchanged, a new one is created and returned. This is useful if you need to e.g. compute theta for one PSF and many sigmas. Parameters ---------- sigma : `~numpy.ndarray` or float Gaussian width of the new Gaussian 2D PDF to convolve with. norm : `~numpy.ndarray` or float Normalization of the new Gaussian 2D PDF to convolve with. Returns ------- new_multi_gauss_2d : `~gammapy.modeling.models.MultiGauss2D` Convolution as new MultiGauss2D.
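Examples
--------
Smearing a two-component model with an extra Gaussian of width 0.1
(a sketch; the numbers are illustrative and unitless)::

    psf = MultiGauss2D(sigmas=[0.05, 0.2], norms=[0.7, 0.3])
    smeared = psf.gauss_convolve(sigma=0.1)
    print(smeared.sigmas)  # component widths add in quadrature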
""" sigmas, norms = [], [] for ii in range(self.n_components): sigmas.append(self.components[ii].gauss_convolve(sigma).sigma) norms.append(self.norms[ii] * norm) return MultiGauss2D(sigmas, norms) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/integrate.py0000644000175100001770000000255014721316200017542 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from .interpolation import LogScale __all__ = ["trapz_loglog"] def trapz_loglog(y, x, axis=-1): """Integrate using the composite trapezoidal rule in log-log space. Integrate `y` (`x`) along given axis in loglog space. Parameters ---------- y : `~numpy.ndarray` Input array to integrate. x : `~numpy.ndarray` Independent variable to integrate over. axis : int, optional Specify the axis. Default is -1. Returns ------- trapz : float Definite integral as approximated by trapezoidal rule in loglog space. """ from gammapy.modeling.models import PowerLawSpectralModel as pl # see https://stackoverflow.com/a/56840428 x, y = np.moveaxis(x, axis, 0), np.moveaxis(y, axis, 0) energy_min, energy_max = x[:-1], x[1:] vals_energy_min, vals_energy_max = y[:-1], y[1:] # log scale has the built-in zero clipping log = LogScale() with np.errstate(invalid="ignore", divide="ignore"): index = -log(vals_energy_min / vals_energy_max) / log(energy_min / energy_max) index[np.isnan(index)] = np.inf return pl.evaluate_integral( energy_min=energy_min, energy_max=energy_max, index=index, reference=energy_min, amplitude=vals_energy_min, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/interpolation.py0000644000175100001770000001775314721316200020462 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Interpolation utilities.""" import html from itertools import compress import numpy as np import scipy.interpolate from astropy import units as u from .compat import COPY_IF_NEEDED __all__ = [ "interpolate_profile", "interpolation_scale", "ScaledRegularGridInterpolator", ] INTERPOLATION_ORDER = {None: 0, "nearest": 0, "linear": 1, "quadratic": 2, "cubic": 3} class ScaledRegularGridInterpolator: """Thin wrapper around `scipy.interpolate.RegularGridInterpolator`. The values are scaled before the interpolation and back-scaled after the interpolation. Dimensions of length 1 are ignored in the interpolation of the data. Parameters ---------- points : tuple of `~numpy.ndarray` or `~astropy.units.Quantity` Tuple of points passed to `RegularGridInterpolator`. values : `~numpy.ndarray` Values passed to `RegularGridInterpolator`. points_scale : tuple of str Interpolation scale used for the points. values_scale : {'lin', 'log', 'sqrt'} Interpolation scaling applied to values. If the values vary over many magnitudes a 'log' scaling is recommended. axis : int or None Axis along which to interpolate. method : {"linear", "nearest"} Default interpolation method. Can be overwritten when calling the `ScaledRegularGridInterpolator`. **kwargs : dict Keyword arguments passed to `RegularGridInterpolator`. 
""" def __init__( self, points, values, points_scale=None, values_scale="lin", extrapolate=True, axis=None, **kwargs, ): if points_scale is None: points_scale = ["lin"] * len(points) self.scale_points = [interpolation_scale(scale) for scale in points_scale] self.scale = interpolation_scale(values_scale) self.axis = axis self._include_dimensions = [len(p) > 1 for p in points] values_scaled = self.scale(values) points_scaled = self._scale_points(points=points) kwargs.setdefault("bounds_error", False) if extrapolate: kwargs.setdefault("fill_value", None) method = kwargs.get("method", None) if not np.any(self._include_dimensions): if method != "nearest": raise ValueError( "Interpolating scalar values requires using " "method='nearest' explicitly." ) if np.any(self._include_dimensions): values_scaled = np.squeeze(values_scaled) if axis is None: self._interpolate = scipy.interpolate.RegularGridInterpolator( points=points_scaled, values=values_scaled, **kwargs ) else: self._interpolate = scipy.interpolate.interp1d( points_scaled[0], values_scaled, axis=axis ) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def _scale_points(self, points): points_scaled = [scale(p) for p, scale in zip(points, self.scale_points)] if np.any(self._include_dimensions): points_scaled = compress(points_scaled, self._include_dimensions) return tuple(points_scaled) def __call__(self, points, method=None, clip=True, **kwargs): """Interpolate data points. Parameters ---------- points : tuple of `~numpy.ndarray` or `~astropy.units.Quantity` Tuple of coordinate arrays of the form (x_1, x_2, x_3, ...). Arrays are broadcast internally. method : {None, "linear", "nearest"} Linear or nearest neighbour interpolation. Default is None, which is `method` defined on init. clip : bool Clip values at zero after interpolation. """ points = self._scale_points(points=points) if self.axis is None: points = np.broadcast_arrays(*points) points_interp = np.stack([_.flat for _ in points]).T values = self._interpolate(points_interp, method, **kwargs) values = self.scale.inverse(values.reshape(points[0].shape)) else: values = self._interpolate(points[0]) values = self.scale.inverse(values) if clip: values = np.clip(values, 0, np.inf) return values def interpolation_scale(scale="lin"): """Interpolation scaling. Parameters ---------- scale : {"lin", "log", "sqrt"} Choose interpolation scaling. """ if scale in ["lin", "linear"]: return LinearScale() elif scale == "log": return LogScale() elif scale == "sqrt": return SqrtScale() elif scale == "stat-profile": return StatProfileScale() elif isinstance(scale, InterpolationScale): return scale else: raise ValueError(f"Not a valid value scaling mode: '{scale}'.") class InterpolationScale: """Interpolation scale base class.""" def __call__(self, values): if hasattr(self, "_unit"): values = u.Quantity(values, copy=COPY_IF_NEEDED).to_value(self._unit) else: if isinstance(values, u.Quantity): self._unit = values.unit values = values.value return self._scale(values) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def inverse(self, values): values = self._inverse(values) if hasattr(self, "_unit"): return u.Quantity(values, self._unit, copy=COPY_IF_NEEDED) else: return values class LogScale(InterpolationScale): """Logarithmic scaling.""" tiny = np.finfo(np.float32).tiny def _scale(self, values): values = np.clip(values, self.tiny, np.inf) return np.log(values) @classmethod def _inverse(cls, values): output = np.exp(values) return np.where(abs(output) - cls.tiny <= cls.tiny, 0, output) class SqrtScale(InterpolationScale): """Square root scaling.""" @staticmethod def _scale(values): sign = np.sign(values) return sign * np.sqrt(sign * values) @classmethod def _inverse(cls, values): return np.power(values, 2) class StatProfileScale(InterpolationScale): """Square root profile scaling.""" def __init__(self, axis=0): self.axis = axis def _scale(self, values): values = np.sign(np.gradient(values, axis=self.axis)) * values sign = np.sign(values) return sign * np.sqrt(sign * values) @classmethod def _inverse(cls, values): return np.power(values, 2) class LinearScale(InterpolationScale): """Linear scaling.""" @staticmethod def _scale(values): return values @classmethod def _inverse(cls, values): return values def interpolate_profile(x, y, interp_scale="sqrt", extrapolate=False): """Helper function to interpolate one-dimensional profiles. Parameters ---------- x : `~numpy.ndarray` Array of x values. y : `~numpy.ndarray` Array of y values. interp_scale : {"sqrt", "lin"} Interpolation scale applied to the profile. If the profile is of parabolic shape, a "sqrt" scaling is recommended. In other cases or for fine sampled profiles a "lin" can also be used. Default is "sqrt". extrapolate : bool Extrapolate or not if the evaluation value is outside the range of x values. Default is False. Returns ------- interp : `interp1d` Interpolator. 
""" method_dict = {"sqrt": "quadratic", "lin": "linear"} kwargs = dict(kind=method_dict[interp_scale]) if extrapolate: kwargs["bounds_error"] = False kwargs["fill_value"] = "extrapolate" return scipy.interpolate.interp1d(x, y, **kwargs) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/metadata.py0000644000175100001770000002232214721316200017337 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Metadata base container for Gammapy.""" import json from typing import ClassVar, Literal, Optional, get_args import astropy.units as u from astropy.coordinates import SkyCoord from astropy.time import Time import yaml from pydantic import BaseModel, ConfigDict, Field, ValidationError, field_validator from gammapy.utils.fits import skycoord_from_dict from gammapy.utils.time import ( time_ref_from_dict, time_ref_to_dict, time_relative_to_ref, ) from gammapy.version import version from .types import AltAzSkyCoordType, ICRSSkyCoordType, SkyCoordType, TimeType __all__ = [ "MetaData", "CreatorMetaData", "ObsInfoMetaData", "PointingInfoMetaData", "TimeInfoMetaData", "TargetMetaData", ] METADATA_FITS_KEYS = { "creator": { "creator": "CREATOR", "date": { "input": lambda v: v.get("CREATED"), "output": lambda v: {"CREATED": v.iso}, }, "origin": "ORIGIN", }, "obs_info": { "telescope": "TELESCOP", "instrument": "INSTRUME", "observation_mode": "OBS_MODE", "obs_id": "OBS_ID", }, "pointing": { "radec_mean": { "input": lambda v: skycoord_from_dict(v, frame="icrs", ext="PNT"), "output": lambda v: {"RA_PNT": v.ra.deg, "DEC_PNT": v.dec.deg}, }, "altaz_mean": { "input": lambda v: skycoord_from_dict(v, frame="altaz", ext="PNT"), "output": lambda v: {"ALT_PNT": v.alt.deg, "AZ_PNT": v.az.deg}, }, }, "target": { "name": "OBJECT", "position": { "input": lambda v: skycoord_from_dict(v, frame="icrs", ext="OBJ"), "output": lambda v: {"RA_OBJ": v.ra.deg, "DEC_OBJ": v.dec.deg}, }, }, } class MetaData(BaseModel): """Base model for all metadata classes in Gammapy.""" model_config = ConfigDict( arbitrary_types_allowed=True, validate_assignment=True, extra="forbid", validate_default=True, use_enum_values=True, ) @property def tag(self): """Returns MetaData tag.""" return self._tag def to_header(self, format="gadf"): """Export MetaData to a FITS header. Conversion is performed following the definition in the METADATA_FITS_EXPORT_KEYS. Parameters ---------- format : {'gadf'} Header format. Default is 'gadf'. Returns ------- header : dict The header dictionary. """ if format != "gadf": raise ValueError(f"Metadata to header: format {format} is not supported.") hdr_dict = {} fits_export_keys = METADATA_FITS_KEYS.get(self.tag) if fits_export_keys is None: # TODO: Should we raise an exception or simply a warning and return empty dict? raise TypeError(f"No FITS export is defined for metadata {self.tag}.") for key, item in fits_export_keys.items(): value = self.model_dump().get(key) if value is not None: if not isinstance(item, str): # Not a one to one conversion hdr_dict.update(item["output"](value)) else: hdr_dict[item] = value extra_keys = set(self.model_fields.keys()) - set(fits_export_keys.keys()) for key in extra_keys: entry = getattr(self, key) if isinstance(entry, MetaData): hdr_dict.update(entry.to_header(format)) return hdr_dict @classmethod def from_header(cls, header, format="gadf"): """Import MetaData from a FITS header. Conversion is performed following the definition in the METADATA_FITS_EXPORT_KEYS. 
Parameters ---------- header : dict The header dictionary. format : {'gadf'} Header format. Default is 'gadf'. """ # TODO: implement storage of optional metadata if format != "gadf": raise ValueError(f"Metadata from header: format {format} is not supported.") fits_export_keys = METADATA_FITS_KEYS.get(cls._tag) if fits_export_keys is None: raise TypeError(f"No FITS export is defined for metadata {cls._tag}.") kwargs = {} for key, item in fits_export_keys.items(): if not isinstance(item, str): # Not a one to one conversion kwargs[key] = item["input"](header) else: kwargs[key] = header.get(item) extra_keys = set(cls.model_fields.keys()) - set(fits_export_keys.keys()) for key in extra_keys: value = cls.model_fields[key] args = get_args(value.annotation) try: if issubclass(args[0], MetaData): kwargs[key] = args[0].from_header(header, format) except TypeError: pass try: return cls(**kwargs) except ValidationError: return cls.model_construct(**kwargs) def to_yaml(self): """Dump metadata content to yaml.""" meta = {"metadata": json.loads(self.model_dump_json())} return yaml.dump( meta, sort_keys=False, indent=4, width=80, default_flow_style=False ) class CreatorMetaData(MetaData): """Metadata containing information about the object creation. Parameters ---------- creator : str The software used to create the data contained in the parent object. Default is the used Gammapy version. date : `~astropy.time.Time` or str The creation date. Default is the current date. origin : str The organization at the origin of the data. """ _tag: ClassVar[Literal["creator"]] = "creator" creator: Optional[str] = f"Gammapy {version}" date: Optional[TimeType] = Field(default_factory=Time.now) origin: Optional[str] = None class ObsInfoMetaData(MetaData): """General metadata information about the observation. Parameters ---------- obs_id : str or int The observation identifier. telescope : str, optional The telescope/observatory name. instrument : str, optional The specific instrument used. sub_array : str, optional The specific sub-array used. observation_mode : str, optional The observation mode. """ _tag: ClassVar[Literal["obs_info"]] = "obs_info" obs_id: int telescope: Optional[str] = None instrument: Optional[str] = None sub_array: Optional[str] = None observation_mode: Optional[str] = None class PointingInfoMetaData(MetaData): """General metadata information about the pointing. Parameters ---------- radec_mean : `~astropy.coordinates.SkyCoord`, optional Mean pointing position of the observation in `icrs` frame. altaz_mean : `~astropy.coordinates.SkyCoord`, or `~astropy.coordinates.AltAz`, optional Mean pointing position of the observation in local AltAz frame. """ _tag: ClassVar[Literal["pointing"]] = "pointing" radec_mean: Optional[ICRSSkyCoordType] = None altaz_mean: Optional[AltAzSkyCoordType] = None class TimeInfoMetaData(MetaData): """General metadata information about the time range of an observation. Parameters ---------- time_start : `~astropy.time.Time` or str The observation start time. time_stop : `~astropy.time.Time` or str The observation stop time. 
reference_time : `~astropy.time.Time` or str The observation reference time.""" _tag: ClassVar[Literal["time_info"]] = "time_info" time_start: Optional[TimeType] = None time_stop: Optional[TimeType] = None reference_time: Optional[TimeType] = None def to_header(self, format="gadf"): if self.reference_time is None: return {} result = time_ref_to_dict(self.reference_time) if self.time_start is not None: result["TSTART"] = time_relative_to_ref(self.time_start, result).to_value( "s" ) if self.time_stop is not None: result["TSTOP"] = time_relative_to_ref(self.time_stop, result).to_value("s") return result @classmethod def from_header(cls, header, format="gadf"): kwargs = {} try: time_ref = time_ref_from_dict(header) except KeyError: return cls() kwargs["reference_time"] = time_ref if "TSTART" in header: kwargs["time_start"] = time_ref + header["TSTART"] * u.s if "TSTOP" in header: kwargs["time_stop"] = time_ref + header["TSTOP"] * u.s return cls(**kwargs) class TargetMetaData(MetaData): """General metadata information about the target. Parameters ---------- name : str, optional The target name. position : `~astropy.coordinates.SkyCoord`, optional Position of the observation in `icrs` frame. """ _tag: ClassVar[Literal["target"]] = "target" name: Optional[str] = None position: Optional[SkyCoordType] = None @field_validator("position", mode="after") def validate_radec_mean(cls, v): if isinstance(v, SkyCoord): return v.icrs ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/observers.py0000644000175100001770000001000314721316200017562 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Location of gamma-ray observatories.""" import astropy.units as u from astropy.coordinates import EarthLocation __all__ = ["observatory_locations"] observatory_locations = {} """Gamma-ray observatory locations (dict). This is a dict with observatory names as keys and values of type `~astropy.coordinates.EarthLocation`. 
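A quick sketch of reading coordinates back in geodetic form (using the same
``gammapy.data`` import path as the Examples section below):

>>> from gammapy.data import observatory_locations
>>> loc = observatory_locations['hess']
>>> print(f"{loc.lon.deg:.2f} {loc.lat.deg:.2f}")
16.50 -23.27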
Not that with ``EarthLocation`` the orientation of angles is as follows: - longitude is east for positive values and west for negative values - latitude is north for positive values and south for negative values Available observatories (alphabetical order): - ``cta_south`` and ``cta_north`` for CTA, see `Website `__ and `Wikipedia `__ - ``hawc`` for HAWC, see `Website `__ and `Wikipedia `__ - ``hegra`` for HEGRA, see `Wikipedia `__ - ``hess`` for HESS, see `Website `__ and `Wikipedia `__ - ``magic`` for MAGIC, see `Website `__ and `Wikipedia `__ - ``milagro`` for MILAGRO, see `Wikipedia `__) - ``veritas`` for VERITAS, see `Website `__ and `Wikipedia `__ - ``whipple`` for WHIPPLE, see `Wikipedia `__ Examples -------- >>> from gammapy.data import observatory_locations >>> observatory_locations['hess'] >>> list(observatory_locations.keys()) """ # Values from https://www.cta-observatory.org/about/array-locations/chile/ # Latitude: 24d41m0.34s South, Longitude: 70d18m58.84s West, Height: not given # Email from Gernot Maier on Sep 8, 2017, stating what they use in the CTA MC group: # lon=-70.31634499364885d, lat=-24.68342915473787d, height=2150m observatory_locations["cta_south"] = EarthLocation( lon="-70d18m58.84s", lat="-24d41m0.34s", height="2150m" ) # Values from https://www.cta-observatory.org/about/array-locations/la-palma/ # Latitude: 28d45m43.7904s North, Longitude: 17d53m31.218s West, Height: 2200 m # Email from Gernot Maier on Sep 8, 2017, stating what they use in the CTA MC group for MST-1: # lon=-17.891571d, lat=28.762158d, height=2147m observatory_locations["cta_north"] = EarthLocation( lon="-17d53m31.218s", lat="28d45m43.7904s", height="2147m" ) # HAWC location taken from https://arxiv.org/pdf/1108.6034v2.pdf observatory_locations["hawc"] = EarthLocation( lon="-97d18m34s", lat="18d59m48s", height="4100m" ) # https://en.wikipedia.org/wiki/HEGRA observatory_locations["hegra"] = EarthLocation( lon="28d45m42s", lat="17d53m27s", height="2200m" ) # Precision position of HESS from the HESS software (slightly different from Wikipedia) observatory_locations["hess"] = EarthLocation( lon="16d30m00.8s", lat="-23d16m18.4s", height="1835m" ) observatory_locations["magic"] = EarthLocation( lon="-17d53m24s", lat="28d45m43s", height="2200m" ) observatory_locations["milagro"] = EarthLocation( lon="-106.67625d", lat="35.87835d", height="2530m" ) observatory_locations["veritas"] = EarthLocation( lon="-110d57m07.77s", lat="31d40m30.21s", height="1268m" ) # WHIPPLE coordinates taken from the Observatory Wikipedia page: # https://en.wikipedia.org/wiki/Fred_Lawrence_Whipple_Observatory observatory_locations["whipple"] = EarthLocation( lon="-110d52m42s", lat="31d40m52s", height="2606m" ) # communication with ASTRI Project Manager observatory_locations["astri"] = EarthLocation( lon="-16d30m20.99s", lat="28d18m00.0s", height="2370m" ) # coordinates from fact-tools (based on google earth) observatory_locations["fact"] = EarthLocation( lat=28.761647 * u.deg, lon=-17.891116 * u.deg, height=2200 * u.m, ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/parallel.py0000644000175100001770000002330414721316200017354 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Multiprocessing and multithreading setup.""" import importlib import logging from enum import Enum from gammapy.utils.pbar import progress_bar log = logging.getLogger(__name__) __all__ = [ "multiprocessing_manager", "run_multiprocessing", 
"BACKEND_DEFAULT", "N_JOBS_DEFAULT", "POOL_KWARGS_DEFAULT", "METHOD_DEFAULT", "METHOD_KWARGS_DEFAULT", ] class ParallelBackendEnum(Enum): """Enum for parallel backend.""" multiprocessing = "multiprocessing" ray = "ray" @classmethod def from_str(cls, value): """Get enum from string.""" if value == "ray" and not is_ray_available(): log.warning("Ray is not installed, falling back to multiprocessing backend") value = "multiprocessing" return cls(value) class PoolMethodEnum(Enum): """Enum for pool method.""" starmap = "starmap" apply_async = "apply_async" BACKEND_DEFAULT = ParallelBackendEnum.multiprocessing N_JOBS_DEFAULT = 1 ALLOW_CHILD_JOBS = False POOL_KWARGS_DEFAULT = dict(processes=N_JOBS_DEFAULT) METHOD_DEFAULT = PoolMethodEnum.starmap METHOD_KWARGS_DEFAULT = {} def get_multiprocessing(): """Get multiprocessing module.""" import multiprocessing return multiprocessing def get_multiprocessing_ray(): """Get multiprocessing module for ray backend.""" import ray.util.multiprocessing as multiprocessing log.warning( "Gammapy support for parallelisation with ray is still a prototype and is not fully functional." ) return multiprocessing def is_ray_initialized(): """Check if ray is initialized.""" try: from ray import is_initialized return is_initialized() except ModuleNotFoundError: return False def is_ray_available(): """Check if ray is available.""" try: importlib.import_module("ray") return True except ModuleNotFoundError: return False class multiprocessing_manager: """Context manager to update the default configuration for multiprocessing. Only the default configuration will be modified, if class arguments like `n_jobs` and `parallel_backend` are set they will overwrite the default configuration. Parameters ---------- backend : {'multiprocessing', 'ray'} Backend to use. pool_kwargs : dict Keyword arguments passed to the pool. The number of processes is limited to the number of physical CPUs. method : {'starmap', 'apply_async'} Pool method to use. 
method_kwargs : dict Keyword arguments passed to the method Examples -------- :: import gammapy.utils.parallel as parallel from gammapy.estimators import FluxPointsEstimator fpe = FluxPointsEstimator(energy_edges=[1, 3, 10] * u.TeV) with parallel.multiprocessing_manager( backend="multiprocessing", pool_kwargs=dict(processes=2), ): fpe.run(datasets) """ def __init__(self, backend=None, pool_kwargs=None, method=None, method_kwargs=None): global BACKEND_DEFAULT, POOL_KWARGS_DEFAULT, METHOD_DEFAULT, METHOD_KWARGS_DEFAULT, N_JOBS_DEFAULT self._backend = BACKEND_DEFAULT self._pool_kwargs = POOL_KWARGS_DEFAULT self._method = METHOD_DEFAULT self._method_kwargs = METHOD_KWARGS_DEFAULT self._n_jobs = N_JOBS_DEFAULT if backend is not None: BACKEND_DEFAULT = ParallelBackendEnum.from_str(backend).value if pool_kwargs is not None: POOL_KWARGS_DEFAULT = pool_kwargs N_JOBS_DEFAULT = pool_kwargs.get("processes", N_JOBS_DEFAULT) if method is not None: METHOD_DEFAULT = PoolMethodEnum(method).value if method_kwargs is not None: METHOD_KWARGS_DEFAULT = method_kwargs def __enter__(self): pass def __exit__(self, type, value, traceback): global BACKEND_DEFAULT, POOL_KWARGS_DEFAULT, METHOD_DEFAULT, METHOD_KWARGS_DEFAULT, N_JOBS_DEFAULT BACKEND_DEFAULT = self._backend POOL_KWARGS_DEFAULT = self._pool_kwargs METHOD_DEFAULT = self._method METHOD_KWARGS_DEFAULT = self._method_kwargs N_JOBS_DEFAULT = self._n_jobs class ParallelMixin: """Mixin class to handle parallel processing.""" _n_child_jobs = 1 @property def n_jobs(self): """Number of jobs as an integer.""" # TODO: this is somewhat unusual behaviour. It deviates from a normal default value handling if self._n_jobs is None: return N_JOBS_DEFAULT return self._n_jobs @n_jobs.setter def n_jobs(self, value): """Number of jobs setter as an integer.""" if not isinstance(value, (int, type(None))): raise ValueError( f"Invalid type: {value!r}, and integer or None is expected." ) self._n_jobs = value if ALLOW_CHILD_JOBS: self._n_child_jobs = value def _update_child_jobs(self): """needed because we can update only in the main process otherwise global ALLOW_CHILD_JOBS has default value""" if ALLOW_CHILD_JOBS: self._n_child_jobs = self.n_jobs else: self._n_child_jobs = 1 @property def _get_n_child_jobs(self): """Number of allowed child jobs as an integer.""" return self._n_child_jobs @property def parallel_backend(self): """Parallel backend as a string.""" if self._parallel_backend is None: return BACKEND_DEFAULT return self._parallel_backend @parallel_backend.setter def parallel_backend(self, value): """Parallel backend setter (str)""" if value is None: self._parallel_backend = None else: self._parallel_backend = ParallelBackendEnum.from_str(value).value def run_multiprocessing( func, inputs, backend=None, pool_kwargs=None, method=None, method_kwargs=None, task_name="", ): """Run function in a loop or in Parallel. Notes ----- The progress bar can be displayed for this function. Parameters ---------- func : function Function to run. inputs : list List of arguments to pass to the function. backend : {'multiprocessing', 'ray'}, optional Backend to use. Default is None. pool_kwargs : dict, optional Keyword arguments passed to the pool. The number of processes is limited to the number of physical CPUs. Default is None. method : {'starmap', 'apply_async'} Pool method to use. Default is "starmap". method_kwargs : dict, optional Keyword arguments passed to the method. Default is None. task_name : str, optional Name of the task to display in the progress bar. Default is "". 
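# A minimal sketch for run_multiprocessing above, assuming a top-level
# (picklable) function; on platforms using the "spawn" start method this
# should run under an ``if __name__ == "__main__":`` guard.
from gammapy.utils.parallel import run_multiprocessing

def add(a, b):
    return a + b

results = run_multiprocessing(
    add,
    inputs=[(1, 2), (3, 4)],
    pool_kwargs=dict(processes=2),
    task_name="adding",
)
print(results)  # [3, 7]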
""" if backend is None: backend = BACKEND_DEFAULT if method is None: method = METHOD_DEFAULT if method_kwargs is None: method_kwargs = METHOD_KWARGS_DEFAULT if pool_kwargs is None: pool_kwargs = POOL_KWARGS_DEFAULT try: method_enum = PoolMethodEnum(method) except ValueError as e: raise ValueError(f"Invalid method: {method}") from e processes = pool_kwargs.get("processes", N_JOBS_DEFAULT) backend = ParallelBackendEnum.from_str(backend) multiprocessing = PARALLEL_BACKEND_MODULES[backend]() if backend == ParallelBackendEnum.multiprocessing: cpu_count = multiprocessing.cpu_count() if processes > cpu_count: log.info(f"Limiting number of processes from {processes} to {cpu_count}") processes = cpu_count if multiprocessing.current_process().name != "MainProcess": # with multiprocessing subprocesses cannot have childs (but possible with ray) processes = 1 if processes == 1: return run_loop( func=func, inputs=inputs, method_kwargs=method_kwargs, task_name=task_name ) if backend == ParallelBackendEnum.ray: address = "auto" if is_ray_initialized() else None pool_kwargs.setdefault("ray_address", address) log.info(f"Using {processes} processes to compute {task_name}") with multiprocessing.Pool(**pool_kwargs) as pool: pool_func = POOL_METHODS[method_enum] results = pool_func( pool=pool, func=func, inputs=inputs, method_kwargs=method_kwargs, task_name=task_name, ) return results def run_loop(func, inputs, method_kwargs=None, task_name=""): """Loop over inputs and run function.""" results = [] callback = method_kwargs.get("callback", None) for arguments in progress_bar(inputs, desc=task_name): result = func(*arguments) if callback is not None: result = callback(result) results.append(result) return results def run_pool_star_map(pool, func, inputs, method_kwargs=None, task_name=""): """Run function in parallel.""" return pool.starmap(func, progress_bar(inputs, desc=task_name), **method_kwargs) def run_pool_async(pool, func, inputs, method_kwargs=None, task_name=""): """Run function in parallel async.""" results = [] for arguments in progress_bar(inputs, desc=task_name): result = pool.apply_async(func, arguments, **method_kwargs) results.append(result) # wait async run is done [result.wait() for result in results] return results POOL_METHODS = { PoolMethodEnum.starmap: run_pool_star_map, PoolMethodEnum.apply_async: run_pool_async, } PARALLEL_BACKEND_MODULES = { ParallelBackendEnum.multiprocessing: get_multiprocessing, ParallelBackendEnum.ray: get_multiprocessing_ray, } ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/pbar.py0000644000175100001770000000201614721316200016501 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utilities for progress bar display.""" import logging log = logging.getLogger(__name__) try: from tqdm.auto import tqdm except ImportError: class tqdm: def __init__(self, iterable, disable=True, **kwargs): self.disable = disable self._iterable = iterable if not self.disable: log.info( "Tqdm is currently not installed. 
Visit https://tqdm.github.io/" ) def update(self, x): pass def __iter__(self): return self._iterable.__iter__() def __next__(self): return self._iterable.__next__() SHOW_PROGRESS_BAR = False def progress_bar(iterable, desc=None): # Necessary because iterable may be a zip iterable_to_list = list(iterable) total = len(iterable_to_list) return tqdm( iterable_to_list, total=total, disable=not SHOW_PROGRESS_BAR, desc=desc, ) ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.244642 gammapy-1.3/gammapy/utils/random/0000755000175100001770000000000014721316215016472 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/random/__init__.py0000644000175100001770000000100514721316200020571 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Random probability distribution helpers.""" from .inverse_cdf import InverseCDFSampler from .utils import ( draw, get_random_state, normalize, pdf, sample_powerlaw, sample_sphere, sample_sphere_distance, sample_times, ) __all__ = [ "draw", "get_random_state", "InverseCDFSampler", "normalize", "pdf", "sample_powerlaw", "sample_sphere", "sample_sphere_distance", "sample_times", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/random/inverse_cdf.py0000644000175100001770000000541114721316200021326 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html import numpy as np from .utils import get_random_state __all__ = ["InverseCDFSampler"] class InverseCDFSampler: """Inverse CDF sampler. It determines a set of random numbers and calculate the cumulative distribution function. Parameters ---------- pdf : `~gammapy.maps.Map` Map of the predicted source counts. axis : int Axis along which sampling the indexes. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. """ def __init__(self, pdf, axis=None, random_state=0): self.random_state = get_random_state(random_state) self.axis = axis if axis is not None: self.cdf = np.cumsum(pdf, axis=self.axis) self.cdf /= self.cdf[:, [-1]] else: self.pdf_shape = pdf.shape pdf = pdf.ravel() / pdf.sum() self.sortindex = np.argsort(pdf, axis=None) self.pdf = pdf[self.sortindex] self.cdf = np.cumsum(self.pdf) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" def sample_axis(self): """Sample along a given axis. Returns ------- index : tuple of `~numpy.ndarray` Coordinates of the drawn sample. """ choices = self.random_state.uniform(high=1, size=len(self.cdf)) shape_cdf = self.cdf.shape cdf_all = np.insert(self.cdf, 0, 0, axis=1) edges = np.arange(shape_cdf[1] + 1) - 0.5 pix_coords = [] for cdf, choice in zip(cdf_all, choices): pix = np.interp(choice, cdf, edges) pix_coords.append(pix) return np.array(pix_coords) def sample(self, size): """Draw sample from the given PDF. Parameters ---------- size : int Number of samples to draw. Returns ------- index : tuple of `~numpy.ndarray` Coordinates of the drawn sample. """ # pick numbers which are uniformly random over the cumulative distribution function choice = self.random_state.uniform(high=1, size=size) # find the indices corresponding to this point on the CDF index = np.searchsorted(self.cdf, choice) index = self.sortindex[index] # map back to multi-dimensional indexing index = np.unravel_index(index, self.pdf_shape) index = np.vstack(index) index = index + self.random_state.uniform(low=-0.5, high=0.5, size=index.shape) return index ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.248642 gammapy-1.3/gammapy/utils/random/tests/0000755000175100001770000000000014721316215017634 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/random/tests/__init__.py0000644000175100001770000000010014721316200021726 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/random/tests/test_inverse_cdf.py0000644000175100001770000000332114721316200023525 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np import scipy.stats as stats from numpy.testing import assert_allclose from gammapy.utils.random import InverseCDFSampler def uniform_dist(x, a, b): return np.select([x <= a, x >= b], [0, 0], 1 / (b - a)) def gauss_dist(x, mu, sigma): return stats.norm.pdf(x, mu, sigma) def test_uniform_dist_sampling(): n_sampled = 1000 x = np.linspace(-2, 2, n_sampled) a, b = -1, 1 pdf = uniform_dist(x, a=a, b=b) sampler = InverseCDFSampler(pdf=pdf, random_state=0) idx = sampler.sample(int(1e4)) x_sampled = np.interp(idx, np.arange(n_sampled), x) assert_allclose(np.mean(x_sampled), 0.5 * (a + b), atol=0.01) assert_allclose( np.std(x_sampled), np.sqrt(1 / 3 * (a**2 + a * b + b**2)), rtol=0.01 ) def test_norm_dist_sampling(): n_sampled = 1000 x = np.linspace(-2, 2, n_sampled) mu, sigma = 0, 0.1 pdf = gauss_dist(x=x, mu=mu, sigma=sigma) sampler = InverseCDFSampler(pdf=pdf, random_state=0) idx = sampler.sample(int(1e5)) x_sampled = np.interp(idx, np.arange(n_sampled), x) assert_allclose(np.mean(x_sampled), mu, atol=0.01) assert_allclose(np.std(x_sampled), sigma, atol=0.005) def test_axis_sampling(): n_sampled = 1000 x = np.linspace(-2, 2, n_sampled) a, b = -1, 1 pdf_uniform = uniform_dist(x, a=a, b=b) mu, sigma = 0, 0.1 pdf_gauss = gauss_dist(x=x, mu=mu, sigma=sigma) pdf = np.vstack([pdf_gauss, pdf_uniform]) sampler = InverseCDFSampler(pdf, random_state=0, axis=1) idx = sampler.sample_axis() x_sampled = np.interp(idx, np.arange(n_sampled), x) assert_allclose(x_sampled, [0.012266, 0.43081], rtol=1e-4) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 
gammapy-1.3/gammapy/utils/random/tests/test_random.py0000644000175100001770000001302614721316200022521 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from numpy.testing import assert_allclose from astropy import units as u from astropy.coordinates import Angle from gammapy.utils.random import ( sample_powerlaw, sample_sphere, sample_sphere_distance, sample_times, ) from gammapy.utils.testing import assert_quantity_allclose def test_sample_sphere(): random_state = np.random.RandomState(seed=0) # test general case lon, lat = sample_sphere(size=2, random_state=random_state) assert_quantity_allclose(lon, Angle([3.44829694, 4.49366732], "radian")) assert_quantity_allclose(lat, Angle([0.20700192, 0.08988736], "radian")) # test specify a limited range lon_range = Angle([40.0, 45.0], "deg") lat_range = Angle([10.0, 15.0], "deg") lon, lat = sample_sphere( size=10, lon_range=lon_range, lat_range=lat_range, random_state=random_state ) assert ((lon_range[0] <= lon) & (lon < lon_range[1])).all() assert ((lat_range[0] <= lat) & (lat < lat_range[1])).all() # test lon within (-180, 180) deg range lon_range = Angle([-40.0, 0.0], "deg") lon, lat = sample_sphere(size=10, lon_range=lon_range, random_state=random_state) assert ((lon_range[0] <= lon) & (lon < lon_range[1])).all() lat_range = Angle([-90.0, 90.0], "deg") assert ((lat_range[0] <= lat) & (lat < lat_range[1])).all() # test lon range explicitly (0, 360) deg lon_range = Angle([0.0, 360.0], "deg") lon, lat = sample_sphere(size=100, lon_range=lon_range, random_state=random_state) # test values in the desired range lat_range = Angle([-90.0, 90.0], "deg") assert ((lon_range[0] <= lon) & (lon < lon_range[1])).all() assert ((lat_range[0] <= lat) & (lat < lat_range[1])).all() # test if values are distributed along the whole range nbins = 4 lon_delta = (lon_range[1] - lon_range[0]) / nbins lat_delta = (lat_range[1] - lat_range[0]) / nbins for i in np.arange(nbins): assert ( (lon_range[0] + i * lon_delta <= lon) & (lon < lon_range[0] + (i + 1) * lon_delta) ).any() assert ( (lat_range[0] + i * lat_delta <= lat) & (lat < lat_range[0] + (i + 1) * lat_delta) ).any() # test lon range explicitly (-180, 180) deg lon_range = Angle([-180.0, 180.0], "deg") lon, lat = sample_sphere(size=100, lon_range=lon_range, random_state=random_state) # test values in the desired range lat_range = Angle([-90.0, 90.0], "deg") assert ((lon_range[0] <= lon) & (lon < lon_range[1])).all() assert ((lat_range[0] <= lat) & (lat < lat_range[1])).all() # test if values are distributed along the whole range nbins = 4 lon_delta = (lon_range[1] - lon_range[0]) / nbins lat_delta = (lat_range[1] - lat_range[0]) / nbins for i in np.arange(nbins): assert ( (lon_range[0] + i * lon_delta <= lon) & (lon < lon_range[0] + (i + 1) * lon_delta) ).any() assert ( (lat_range[0] + i * lat_delta <= lat) & (lat < lat_range[0] + (i + 1) * lat_delta) ).any() # test box around Galactic center lon_range = Angle([-5.0, 5.0], "deg") lon, lat = sample_sphere(size=10, lon_range=lon_range, random_state=random_state) # test if values are distributed along the whole range nbins = 2 lon_delta = (lon_range[1] - lon_range[0]) / nbins for i in np.arange(nbins): assert ( (lon_range[0] + i * lon_delta <= lon) & (lon < lon_range[0] + (i + 1) * lon_delta) ).any() # test box around Galactic anticenter lon_range = Angle([175.0, 185.0], "deg") lon, lat = sample_sphere(size=10, lon_range=lon_range, random_state=random_state) # test if values are distributed along the whole range 
nbins = 2 lon_delta = (lon_range[1] - lon_range[0]) / nbins for i in np.arange(nbins): assert ( (lon_range[0] + i * lon_delta <= lon) & (lon < lon_range[0] + (i + 1) * lon_delta) ).any() def test_sample_sphere_distance(): random_state = np.random.RandomState(seed=0) x = sample_sphere_distance( distance_min=0.1, distance_max=42, size=2, random_state=random_state ) assert_allclose(x, [34.386731, 37.559774]) x = sample_sphere_distance( distance_min=0.1, distance_max=42, size=int(1e3), random_state=random_state ) assert x.min() >= 0.1 assert x.max() <= 42 def test_sample_powerlaw(): random_state = np.random.RandomState(seed=0) x = sample_powerlaw(x_min=0.1, x_max=10, gamma=2, size=2, random_state=random_state) assert_allclose(x, [0.14886601, 0.1873559]) def test_random_times(): # An example without dead time. rate = u.Quantity(10, "s^-1") time = sample_times(size=100, rate=rate, random_state=0) assert_allclose(time[0].sec, 0.07958745081631101) assert_allclose(time[-1].sec, 9.186484131475076) # An example with `return_diff=True` rate = u.Quantity(10, "s^-1") time = sample_times(size=100, rate=rate, return_diff=True, random_state=0) assert_allclose(time[0].sec, 0.07958745081631101) assert_allclose(time[-1].sec, 0.00047065345706976753) # An example with dead time. rate = u.Quantity(10, "Hz") dead_time = u.Quantity(0.1, "second") time = sample_times(size=100, rate=rate, dead_time=dead_time, random_state=0) assert np.min(time) >= u.Quantity(0.1, "second") assert_allclose(time[0].sec, 0.1 + 0.07958745081631101) assert_allclose(time[-1].sec, 0.1 * 100 + 9.186484131475076) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/random/utils.py0000644000175100001770000002147714721316200020211 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Helper functions to work with distributions.""" import numbers import numpy as np import scipy.integrate from astropy.coordinates import Angle from astropy.time import TimeDelta __all__ = [ "draw", "get_random_state", "normalize", "pdf", "sample_powerlaw", "sample_sphere", "sample_sphere_distance", "sample_times", ] def normalize(func, x_min, x_max): """Normalize a 1D function over a given range.""" def f(x): return func(x) / scipy.integrate.quad(func, x_min, x_max)[0] return f def pdf(func): """One-dimensional PDF of a given radial surface density.""" def f(x): return x * func(x) return f def draw(low, high, size, dist, random_state="random-seed", *args, **kwargs): """Allow drawing of random numbers from any distribution.""" from .inverse_cdf import InverseCDFSampler n = 1000 x = np.linspace(low, high, n) pdf = dist(x) sampler = InverseCDFSampler(pdf=pdf, random_state=random_state) idx = sampler.sample(size) x_sampled = np.interp(idx, np.arange(n), x) return np.squeeze(x_sampled) def get_random_state(init): """Get a `numpy.random.RandomState` instance. The purpose of this utility function is to have a flexible way to initialise a `~numpy.random.RandomState` instance, a.k.a. a random number generator (``rng``). See :ref:`dev_random` for usage examples and further information. 
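# A minimal sketch of the three initialisation modes accepted by
# get_random_state (documented just below):
from gammapy.utils.random import get_random_state

rng = get_random_state(42)                   # seeded, reproducible
rng_fresh = get_random_state("random-seed")  # freshly seeded
rng_same = get_random_state(rng)             # pass-through, returns rng
print(rng_same is rng, rng.uniform())        # True 0.3745...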
Parameters ---------- init : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`} Available options to initialise the RandomState object: * ``int`` -- new RandomState instance seeded with this integer (calls `~numpy.random.RandomState` with ``seed=init``) * ``'random-seed'`` -- new RandomState instance seeded in a random way (calls `~numpy.random.RandomState` with ``seed=None``) * ``'global-rng'``, return the RandomState singleton used by ``numpy.random``. * `~numpy.random.RandomState` -- do nothing, return the input. Returns ------- random_state : `~numpy.random.RandomState` RandomState instance. """ if isinstance(init, (numbers.Integral, np.integer)): return np.random.RandomState(init) elif init == "random-seed": return np.random.RandomState(None) elif init == "global-rng": return np.random.mtrand._rand elif isinstance(init, np.random.RandomState): return init else: raise ValueError( "{} cannot be used to seed a numpy.random.RandomState" " instance".format(init) ) def sample_sphere(size, lon_range=None, lat_range=None, random_state="random-seed"): """Sample random points on the sphere. Reference: http://mathworld.wolfram.com/SpherePointPicking.html Parameters ---------- size : int Number of samples to generate. lon_range : `~astropy.coordinates.Angle`, optional Longitude range (min, max). lat_range : `~astropy.coordinates.Angle`, optional Latitude range (min, max). random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is "random-seed". Returns ------- lon, lat: `~astropy.coordinates.Angle` Longitude and latitude coordinate arrays. """ random_state = get_random_state(random_state) if lon_range is None: lon_range = Angle([0.0, 360.0], "deg") if lat_range is None: lat_range = Angle([-90.0, 90.0], "deg") # Sample random longitude u = random_state.uniform(size=size) lon = lon_range[0] + (lon_range[1] - lon_range[0]) * u # Sample random latitude v = random_state.uniform(size=size) z_range = np.sin(lat_range) z = z_range[0] + (z_range[1] - z_range[0]) * v lat = np.arcsin(z) return lon, lat def sample_sphere_distance( distance_min=0, distance_max=1, size=None, random_state="random-seed" ): """Sample random distances if the 3-dim space density is constant. This function uses inverse transform sampling (`Wikipedia `__) to generate random distances for an observer located in a 3-dim space with constant source density in the range ``(distance_min, distance_max)``. Parameters ---------- distance_min : float, optional Minimum distance. Default is 0. distance_max : float, optional Maximum distance. Default is 1. size : int or tuple of ints, optional Output shape. Default is one sample. Passed to `numpy.random.uniform`. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is "random-seed". Returns ------- distance : `~numpy.ndarray` Array of samples. 
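# A quick numerical check of the inverse-transform sampling implemented
# below: for distance_min=0 the median distance is 0.5**(1/3) * distance_max.
import numpy as np
from gammapy.utils.random import sample_sphere_distance

d = sample_sphere_distance(distance_min=0, distance_max=1, size=100_000, random_state=0)
print(np.median(d))  # ~0.794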
""" random_state = get_random_state(random_state) # Since the differential distribution is dP / dr ~ r ^ 2, # we have a cumulative distribution # P(r) = a * r ^ 3 + b # with P(r_min) = 0 and P(r_max) = 1 implying # a = 1 / (r_max ^ 3 - r_min ^ 3) # b = -a * r_min ** 3 a = 1.0 / (distance_max**3 - distance_min**3) b = -a * distance_min**3 # Now for inverse transform sampling we need to use the inverse of # u = a * r ^ 3 + b # which is # r = [(u - b)/ a] ^ (1 / 3) u = random_state.uniform(size=size) distance = ((u - b) / a) ** (1.0 / 3) return distance def sample_powerlaw(x_min, x_max, gamma, size=None, random_state="random-seed"): """Sample random values from a power law distribution. f(x) = x ** (-gamma) in the range x_min to x_max It is assumed that *gamma* is the **differential** spectral index. Reference: http://mathworld.wolfram.com/RandomNumber.html Parameters ---------- x_min : float Minimum x range. x_max : float Maximum x range. gamma : float Power law index. size : int, optional Number of samples to generate. Default is None. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is "random-seed". Returns ------- x : `~numpy.ndarray` Array of samples from the distribution. """ random_state = get_random_state(random_state) size = int(size) exp = -gamma base = random_state.uniform(x_min**exp, x_max**exp, size) x = base ** (1 / exp) return x def sample_times( size, rate, dead_time=TimeDelta(0, format="sec"), return_diff=False, random_state="random-seed", ): """Make random times assuming a Poisson process. This function can be used to test event time series, to have a comparison what completely random data looks like. Can be used in two ways (in either case the return type is `~astropy.time.TimeDelta`): * ``return_delta=False`` - Return absolute times, relative to zero (default) * ``return_delta=True`` - Return time differences between consecutive events. Parameters ---------- size : int Number of samples. rate : `~astropy.units.Quantity` Event rate. dead_time : `~astropy.units.Quantity` or `~astropy.time.TimeDelta`, optional Dead time after event. return_diff : bool Return time difference between events. Default False, return absolute times. random_state : {int, 'random-seed', 'global-rng', `~numpy.random.RandomState`}, optional Defines random number generator initialisation. Passed to `~gammapy.utils.random.get_random_state`. Default is "random-seed". Returns ------- time : `~astropy.time.TimeDelta` Time differences (second) after time zero. Examples -------- Example how to simulate 100 events at a rate of 10 Hz. As expected the last event occurs after about 10 seconds. 
>>> from astropy.units import Quantity >>> from gammapy.utils.random import sample_times >>> rate = Quantity(10, 'Hz') >>> times = sample_times(size=100, rate=rate, random_state=0) >>> times[-1] """ random_state = get_random_state(random_state) dead_time = TimeDelta(dead_time) scale = (1 / rate).to("s").value time_delta = random_state.exponential(scale=scale, size=size) time_delta += dead_time.to("s").value if return_diff: return TimeDelta(time_delta, format="sec") else: time = time_delta.cumsum() return TimeDelta(time, format="sec") ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/regions.py0000644000175100001770000001725714721316200017240 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Regions helper functions. Throughout Gammapy, we use `regions` to represent and work with regions. https://astropy-regions.readthedocs.io We might add in other conveniences and features here, e.g. sky coord contains without a WCS (see "sky and pixel regions" in PIG 10), or some HEALPix integration. TODO: before Gammapy v1.0, discuss what to do about ``gammapy.utils.regions``. Options: keep as-is, hide from the docs, or to remove it completely (if the functionality is available in ``astropy-regions`` directly. """ import operator import numpy as np from scipy.optimize import Bounds, minimize from astropy import units as u from astropy.coordinates import SkyCoord from regions import ( CircleAnnulusSkyRegion, CircleSkyRegion, CompoundSkyRegion, EllipseSkyRegion, RectangleSkyRegion, Regions, ) __all__ = [ "compound_region_to_regions", "make_concentric_annulus_sky_regions", "make_orthogonal_rectangle_sky_regions", "regions_to_compound_region", "region_to_frame", ] def compound_region_center(compound_region): """Compute center for a CompoundRegion. The center of the compound region is defined here as the geometric median of the individual centers of the regions. The geometric median is defined as the point the minimises the distance to all other points. Parameters ---------- compound_region : `CompoundRegion` Compound region. Returns ------- center : `~astropy.coordinates.SkyCoord` Geometric median of the positions of the individual regions. """ regions = compound_region_to_regions(compound_region) if len(regions) == 1: return regions[0].center positions = SkyCoord([region.center.icrs for region in regions]) def f(x, coords): """Function to minimize.""" lon, lat = x center = SkyCoord(lon * u.deg, lat * u.deg) return np.sum(center.separation(coords).deg) ra, dec = positions.ra.wrap_at("180d").deg, positions.dec.deg ub = np.array([np.max(ra), np.max(dec)]) lb = np.array([np.min(ra), np.min(dec)]) if np.all(ub == lb): bounds = None else: bounds = Bounds(ub=ub, lb=lb) result = minimize( f, x0=[np.mean(ra), np.mean(dec)], args=(positions,), bounds=bounds, method="L-BFGS-B", ) return SkyCoord(result.x[0], result.x[1], frame="icrs", unit="deg") def compound_region_to_regions(region): """Create list of regions from compound regions. Parameters ---------- region : `~regions.CompoundSkyRegion` or `~regions.SkyRegion` Compound sky region. Returns ------- regions : `~regions.Regions` List of regions. 
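# A minimal sketch: build the union of two circles with
# regions_to_compound_region (defined below) and split it back into parts.
import astropy.units as u
from astropy.coordinates import SkyCoord
from regions import CircleSkyRegion
from gammapy.utils.regions import (
    compound_region_to_regions,
    regions_to_compound_region,
)

circle_1 = CircleSkyRegion(SkyCoord(0, 0, unit="deg"), radius=1 * u.deg)
circle_2 = CircleSkyRegion(SkyCoord(2, 0, unit="deg"), radius=1 * u.deg)
compound = regions_to_compound_region([circle_1, circle_2])
print(len(compound_region_to_regions(compound)))  # 2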
""" regions = Regions([]) if isinstance(region, CompoundSkyRegion): if region.operator is operator.or_: regions_1 = compound_region_to_regions(region.region1) regions.extend(regions_1) regions_2 = compound_region_to_regions(region.region2) regions.extend(regions_2) else: raise ValueError("Only union operator supported") else: return Regions([region]) return regions def regions_to_compound_region(regions): """Create compound region from list of regions, by creating the union. Parameters ---------- regions : `~regions.Regions` List of regions. Returns ------- compound : `~regions.CompoundSkyRegion` or `~regions.CompoundPixelRegion` Compound sky region. """ region_union = regions[0] for region in regions[1:]: region_union = region_union.union(region) return region_union class SphericalCircleSkyRegion(CircleSkyRegion): """Spherical circle sky region. TODO: is this separate class a good idea? - If yes, we could move it to the ``regions`` package? - If no, we should implement some other solution. Probably the alternative is to add extra methods to the ``CircleSkyRegion`` class and have that support both planar approximation and spherical case? Or we go with the approach to always make a TAN WCS and not have true cone select at all? """ def contains(self, skycoord, wcs=None): """Defined by spherical distance.""" separation = self.center.separation(skycoord) return separation < self.radius def make_orthogonal_rectangle_sky_regions(start_pos, end_pos, wcs, height, nbin=1): """Utility returning an array of regions to make orthogonal projections. Parameters ---------- start_pos : `~astropy.regions.SkyCoord` First sky coordinate defining the line to which the orthogonal boxes made. end_pos : `~astropy.regions.SkyCoord` Second sky coordinate defining the line to which the orthogonal boxes made. height : `~astropy.quantity.Quantity` Height of the rectangle region. wcs : `~astropy.wcs.WCS` WCS projection object. nbin : int, optional Number of boxes along the line. Default is 1. Returns ------- regions : list of `~regions.RectangleSkyRegion` Regions in which the profiles are made. """ pix_start = start_pos.to_pixel(wcs) pix_stop = end_pos.to_pixel(wcs) points = np.linspace(start=pix_start, stop=pix_stop, num=nbin + 1).T centers = 0.5 * (points[:, :-1] + points[:, 1:]) coords = SkyCoord.from_pixel(centers[0], centers[1], wcs) width = start_pos.separation(end_pos).to("rad") / nbin angle = end_pos.position_angle(start_pos) - 90 * u.deg regions = [] for center in coords: reg = RectangleSkyRegion( center=center, width=width, height=u.Quantity(height), angle=angle ) regions.append(reg) return regions def make_concentric_annulus_sky_regions( center, radius_max, radius_min=1e-5 * u.deg, nbin=11 ): """Make a list of concentric annulus regions. Parameters ---------- center : `~astropy.coordinates.SkyCoord` Center coordinate. radius_max : `~astropy.units.Quantity` Maximum radius. radius_min : `~astropy.units.Quantity`, optional Minimum radius. Default is 1e-5 deg. nbin : int, optional Number of boxes along the line. Default is 11. Returns ------- regions : list of `~regions.RectangleSkyRegion` Regions in which the profiles are made. """ regions = [] edges = np.linspace(radius_min, u.Quantity(radius_max), nbin) for r_in, r_out in zip(edges[:-1], edges[1:]): region = CircleAnnulusSkyRegion( center=center, inner_radius=r_in, outer_radius=r_out, ) regions.append(region) return regions def region_to_frame(region, frame): """Convert a region to a different frame. 
Parameters ---------- region : `~regions.SkyRegion` Region to transform. frame : {"icrs", "galactic"} Frame to transform the region into. Returns ------- region_new : `~regions.SkyRegion` Region in the given frame. """ from gammapy.maps import WcsGeom wcs = WcsGeom.create(skydir=region.center, binsz=0.01, frame=frame).wcs region_new = region.to_pixel(wcs).to_sky(wcs) return region_new def region_circle_to_ellipse(region): """Convert a CircleSkyRegion to an EllipseSkyRegion. Parameters ---------- region : `~regions.CircleSkyRegion` Region to transform. Returns ------- region_new : `~regions.EllipseSkyRegion` Elliptical region with same major and minor axis. """ region_new = EllipseSkyRegion( center=region.center, width=region.radius, height=region.radius ) return region_new ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/registry.py0000644000175100001770000000153514721316200017432 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import html __all__ = ["Registry"] class Registry(list): """Registry class.""" def get_cls(self, tag): for cls in self: tags = getattr(cls, "tag", []) tags = [tags] if isinstance(tags, str) else tags if tag in tags: return cls raise KeyError(f"No object found with tag: {tag!r}") def __str__(self): info = "Registry\n" info += "--------\n\n" len_max = max([len(_.__name__) for _ in self]) for item in self: info += f"{item.__name__:{len_max}s}: {item.tag} \n" return info.expandtabs(tabsize=2) def _repr_html_(self): try: return self.to_html() except AttributeError: return f"
<pre>{html.escape(str(self))}</pre>
" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/roots.py0000644000175100001770000001122414721316200016724 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utilities to find roots of a scalar function within a given range.""" import numpy as np from scipy.optimize import RootResults, root_scalar import astropy.units as u from gammapy.utils.interpolation import interpolation_scale try: BAD_RES = RootResults( root=np.nan, iterations=0, function_calls=0, flag=0, method="brentq" ) except TypeError: BAD_RES = RootResults(root=np.nan, iterations=0, function_calls=0, flag=0) def find_roots( f, lower_bound, upper_bound, nbin=100, points_scale="lin", args=(), method="brentq", fprime=None, fprime2=None, xtol=None, rtol=None, maxiter=None, options=None, ): """Find roots of a scalar function within a given range. Parameters ---------- f : callable A function to find roots of. Its output should be unitless. lower_bound : `~astropy.units.Quantity` Lower bound of the search ranges to find roots. If an array is given search will be performed element-wise. upper_bound : `~astropy.units.Quantity` Upper bound of the search ranges to find roots. If an array is given search will be performed element-wise. nbin : int, optional Number of bins to sample the search range, ignored if bounds are arrays. Default is 100. points_scale : {"lin", "log", "sqrt"}, optional Scale used to sample the search range. Default is "lin". args : tuple, optional Extra arguments passed to the objective function and its derivative(s). method : str, optional Solver available in `~scipy.optimize.root_scalar`. Should be one of : - 'brentq' (default), - 'brenth', - 'bisect', - 'ridder', - 'toms748', - 'newton', - 'secant', - 'halley', fprime : bool or callable, optional If `fprime` is a boolean and is True, `f` is assumed to return the value of the objective function and of the derivative. `fprime` can also be a callable returning the derivative of `f`. In this case, it must accept the same arguments as `f`. fprime2 : bool or callable, optional If `fprime2` is a boolean and is True, `f` is assumed to return the value of the objective function and of the first and second derivatives. `fprime2` can also be a callable returning the second derivative of `f`. In this case, it must accept the same arguments as `f`. xtol : float, optional Tolerance (absolute) for termination. rtol : float, optional Tolerance (relative) for termination. maxiter : int, optional Maximum number of iterations. options : dict, optional A dictionary of solver options. See `~scipy.optimize.root_scalar` for details. Returns ------- roots : `~astropy.units.Quantity` The function roots. results : `~numpy.array` An array of `~scipy.optimize.RootResults` which is an object containing information about the convergence. If the solver failed to converge in a bracketing range the corresponding `roots` array element is NaN. 
""" kwargs = dict( args=args, method=method, fprime=fprime, fprime2=fprime2, xtol=xtol, rtol=rtol, maxiter=maxiter, options=options, ) if isinstance(lower_bound, u.Quantity): unit = lower_bound.unit lower_bound = lower_bound.value upper_bound = u.Quantity(upper_bound).to_value(unit) else: unit = 1 scale = interpolation_scale(points_scale) a = scale(lower_bound) b = scale(upper_bound) x = scale.inverse(np.linspace(a, b, nbin + 1)) if len(x) > 2: signs = np.sign([f(xk, *args) for xk in x]) ind = np.where(signs[:-1] != signs[1:])[0] else: ind = [0] nroots = max(1, len(ind)) roots = np.ones(nroots) * np.nan results = np.array(nroots * [BAD_RES]) for k, idx in enumerate(ind): bracket = [x[idx], x[idx + 1]] if method in ["bisection", "brentq", "brenth", "ridder", "toms748"]: kwargs["bracket"] = bracket elif method in ["secant", "newton", "halley"]: kwargs["x0"] = bracket[0] kwargs["x1"] = bracket[1] else: raise ValueError(f'Unknown solver "{method}"') try: res = root_scalar(f, **kwargs) results[k] = res if res.converged: roots[k] = res.root except (RuntimeError, ValueError): continue return roots * unit, results ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/scripts.py0000644000175100001770000001327414721316200017254 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utilities to create scripts and command-line tools.""" import codecs import os.path import warnings from base64 import urlsafe_b64encode from pathlib import Path from uuid import uuid4 import yaml from gammapy.utils.check import add_checksum, verify_checksum from gammapy.utils.deprecation import deprecated_renamed_argument __all__ = [ "from_yaml", "get_images_paths", "make_path", "read_yaml", "recursive_merge_dicts", "to_yaml", "write_yaml", ] PATH_DOCS = Path(__file__).resolve().parent / ".." / ".." / "docs" SKIP = ["_static", "_build", "_checkpoints", "docs/user-guide/model-gallery/"] YAML_FORMAT = dict(sort_keys=False, indent=4, width=80, default_flow_style=False) def get_images_paths(folder=PATH_DOCS): """Generator yields a Path for each image used in notebook. Parameters ---------- folder : str Folder where to search. """ for i in Path(folder).rglob("images/*"): if not any(s in str(i) for s in SKIP): yield i.resolve() def from_yaml(text, sort_keys=False, checksum=False): """Read YAML file. Parameters ---------- text : str yaml str sort_keys : bool, optional Whether to sort keys. Default is False. checksum : bool Whether to perform checksum verification. Default is False. Returns ------- data : dict YAML file content as a dictionary. """ data = yaml.safe_load(text) checksum_str = data.pop("checksum", None) if checksum: yaml_format = YAML_FORMAT.copy() yaml_format["sort_keys"] = sort_keys yaml_str = yaml.dump(data, **yaml_format) if not verify_checksum(yaml_str, checksum_str): warnings.warn("Checksum verification failed.", UserWarning) return data def read_yaml(filename, logger=None, checksum=False): """Read YAML file. Parameters ---------- filename : `~pathlib.Path` Filename. logger : `~logging.Logger` Logger. checksum : bool Whether to perform checksum verification. Default is False. Returns ------- data : dict YAML file content as a dictionary. """ path = make_path(filename) if logger is not None: logger.info(f"Reading {path}") text = path.read_text() return from_yaml(text, checksum=checksum) def to_yaml(dictionary, sort_keys=False): """dict to yaml Parameters ---------- dictionary : dict Python dictionary. 
sort_keys : bool, optional Whether to sort keys. Default is False. """ from gammapy.utils.metadata import CreatorMetaData yaml_format = YAML_FORMAT.copy() yaml_format["sort_keys"] = sort_keys text = yaml.safe_dump(dictionary, **yaml_format) creation = CreatorMetaData() return text + creation.to_yaml() @deprecated_renamed_argument("dictionary", "text", "v1.3") def write_yaml( text, filename, logger=None, sort_keys=False, checksum=False, overwrite=False ): """Write YAML file. Parameters ---------- text : str yaml str filename : `~pathlib.Path` Filename. logger : `~logging.Logger`, optional Logger. Default is None. sort_keys : bool, optional Whether to sort keys. Default is True. checksum : bool, optional Whether to add checksum keyword. Default is False. overwrite : bool, optional Overwrite existing file. Default is False. """ if checksum: text = add_checksum(text, sort_keys=sort_keys) path = make_path(filename) path.parent.mkdir(exist_ok=True) if path.exists() and not overwrite: raise IOError(f"File exists already: {path}") if logger is not None: logger.info(f"Writing {path}") path.write_text(text) def make_name(name=None): """Make a dataset name.""" if name is None: name = urlsafe_b64encode(codecs.decode(uuid4().hex, "hex")).decode()[:8] while name[0] == "_": name = urlsafe_b64encode(codecs.decode(uuid4().hex, "hex")).decode()[:8] if not isinstance(name, str): raise ValueError( "Name argument must be a string, " f"got '{name}', which is of type '{type(name)}'" ) return name def make_path(path): """Expand environment variables on `~pathlib.Path` construction. Parameters ---------- path : str, `pathlib.Path` Path to expand. """ # TODO: raise error or warning if environment variables that don't resolve are used # e.g. "spam/$DAMN/ham" where `$DAMN` is not defined # Otherwise this can result in cryptic errors later on if path is None: return None else: return Path(os.path.expandvars(path)) def recursive_merge_dicts(a, b): """Recursively merge two dictionaries. Entries in 'b' override entries in 'a'. The built-in update function cannot be used for hierarchical dicts, see: http://stackoverflow.com/questions/3232943/update-value-of-a-nested-dictionary-of-varying-depth/3233356#3233356 Parameters ---------- a : dict Dictionary to be merged. b : dict Dictionary to be merged. Returns ------- c : dict Merged dictionary. Examples -------- >>> from gammapy.utils.scripts import recursive_merge_dicts >>> a = dict(a=42, b=dict(c=43, e=44)) >>> b = dict(d=99, b=dict(c=50, g=98)) >>> c = recursive_merge_dicts(a, b) >>> print(c) {'a': 42, 'b': {'c': 50, 'e': 44, 'g': 98}, 'd': 99} """ c = a.copy() for k, v in b.items(): if k in c and isinstance(c[k], dict): c[k] = recursive_merge_dicts(c[k], v) else: c[k] = v return c ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/table.py0000644000175100001770000000436714721316200016657 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Table helper utilities.""" import numpy as np from astropy.table import Table from astropy.units import Quantity from .units import standardise_unit __all__ = [ "hstack_columns", "table_row_to_dict", "table_standardise_units_copy", "table_standardise_units_inplace", ] def hstack_columns(table, table_other): """Stack the column data horizontally. Parameters ---------- table : `~astropy.table.Table` Input table. table_other : `~astropy.table.Table` Other input table. Returns ------- stacked : `~astropy.table.Table` Stacked table. 
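Examples
--------
Minimal sketch: each table has a single row whose cells are arrays, and the
cell contents are concatenated along the last axis.

>>> import numpy as np
>>> from astropy.table import Table
>>> from gammapy.utils.table import hstack_columns
>>> t1 = Table({"counts": np.array([[1, 2]])})
>>> t2 = Table({"counts": np.array([[3, 4]])})
>>> hstack_columns(t1, t2)["counts"].data
array([[1, 2, 3, 4]])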
""" stacked = Table() for column in table.colnames: data = np.hstack([table[column].data[0], table_other[column].data[0]]) stacked[column] = data[np.newaxis, :] return stacked def table_standardise_units_copy(table): """Standardise units for all columns in a table in a copy. Calls `~gammapy.utils.units.standardise_unit`. Parameters ---------- table : `~astropy.table.Table` Input table (won't be modified). Returns ------- table : `~astropy.table.Table` Copy of the input table with standardised column units. """ # Note: we could add an `inplace` option (or variant of this function) # See https://github.com/astropy/astropy/issues/6098 table = Table(table) return table_standardise_units_inplace(table) def table_standardise_units_inplace(table): """Standardise units for all columns in a table in place.""" for column in table.columns.values(): if column.unit: column.unit = standardise_unit(column.unit) return table def table_row_to_dict(row, make_quantity=True): """Make one source data dictionary. Parameters ---------- row : `~astropy.table.Row` Row. make_quantity : bool, optional Make quantity values for columns with units. Default is True. Returns ------- data : dict Row data. """ data = {} for name, col in row.columns.items(): val = row[name] if make_quantity and col.unit: val = Quantity(val, unit=col.unit) data[name] = val return data ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/testing.py0000644000175100001770000001627514721316200017246 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Utilities for testing.""" import os import sys from numpy.testing import assert_allclose import astropy import astropy.units as u from astropy.coordinates import SkyCoord from astropy.time import Time from astropy.utils.introspection import minversion import matplotlib.pyplot as plt from .compat import COPY_IF_NEEDED __all__ = [ "assert_quantity_allclose", "assert_skycoord_allclose", "assert_time_allclose", "Checker", "mpl_plot_check", "requires_data", "requires_dependency", ] ASTROPY_LT_5_3 = minversion(astropy, "5.3.dev") # Cache for `requires_dependency` _requires_dependency_cache = {} def requires_dependency(name): """Decorator to declare required dependencies for tests. Examples -------- :: from gammapy.utils.testing import requires_dependency @requires_dependency('scipy') def test_using_scipy(): import scipy ... """ import pytest if name in _requires_dependency_cache: skip_it = _requires_dependency_cache[name] else: try: __import__(name) skip_it = False except ImportError: skip_it = True _requires_dependency_cache[name] = skip_it reason = f"Missing dependency: {name}" return pytest.mark.skipif(skip_it, reason=reason) def has_data(name): """Check if certain set of data available.""" if name == "gammapy-extra": return "GAMMAPY_EXTRA" in os.environ elif name == "gammapy-data": return "GAMMAPY_DATA" in os.environ elif name == "gamma-cat": return "GAMMA_CAT" in os.environ elif name == "fermi-lat": return "GAMMAPY_FERMI_LAT_DATA" in os.environ else: raise ValueError(f"Invalid name: {name}") def requires_data(name="gammapy-data"): """Decorator to declare required data for tests. Examples -------- :: from gammapy.utils.testing import requires_data @requires_data() def test_using_data_files(): filename = "$GAMMAPY_DATA/..." ... """ import pytest if not isinstance(name, str): raise TypeError( "You must call @requires_data with a name (str). 
" "Usually this: @requires_data()" ) skip_it = not has_data(name) reason = f"Missing data: {name}" return pytest.mark.skipif(skip_it, reason=reason) def run_cli(cli, args, exit_code=0): """Run Click command line tool. Thin wrapper around `click.testing.CliRunner` that prints info to stderr if the command fails. Parameters ---------- cli : click.Command Click command. args : list of str Argument list. exit_code : int, optional Expected exit code of the command. Default is 0. Returns ------- result : `click.testing.Result` Result. """ from click.testing import CliRunner result = CliRunner().invoke(cli, args, catch_exceptions=False) if result.exit_code != exit_code: sys.stderr.write("Exit code mismatch!\n") sys.stderr.write("Output:\n") sys.stderr.write(result.output) return result def assert_skycoord_allclose(actual, desired): """Assert all-close for `astropy.coordinates.SkyCoord` objects. - Frames can be different, aren't checked at the moment. """ assert isinstance(actual, SkyCoord) assert isinstance(desired, SkyCoord) assert_allclose(actual.data.lon.deg, desired.data.lon.deg) assert_allclose(actual.data.lat.deg, desired.data.lat.deg) def assert_time_allclose(actual, desired, atol=1e-3): """Assert all-close for `astropy.time.Time` objects. atol : Absolute tolerance in seconds. Default is 1e-3. """ assert isinstance(actual, Time) assert isinstance(desired, Time) assert actual.scale == desired.scale assert actual.format == desired.format dt = actual - desired assert_allclose(dt.sec, 0, rtol=0, atol=atol) def assert_quantity_allclose(actual, desired, rtol=1.0e-7, atol=None, **kwargs): """Assert all-close for `~astropy.units.Quantity` objects. Notes ----- Requires that ``unit`` is identical, not just that quantities are allclose taking different units into account. We prefer this kind of assert for testing, since units should only change on purpose, so this tests more behaviour. """ # TODO: change this later to explicitly check units are the same! # assert actual.unit == desired.unit args = _unquantify_allclose_arguments(actual, desired, rtol, atol) assert_allclose(*args, **kwargs) def _unquantify_allclose_arguments(actual, desired, rtol, atol): actual = u.Quantity(actual, subok=True, copy=COPY_IF_NEEDED) desired = u.Quantity(desired, subok=True, copy=COPY_IF_NEEDED) try: desired = desired.to(actual.unit) except u.UnitsError: raise u.UnitsError( "Units for 'desired' ({}) and 'actual' ({}) " "are not convertible".format(desired.unit, actual.unit) ) if atol is None: # by default, we assume an absolute tolerance of 0 atol = u.Quantity(0) else: atol = u.Quantity(atol, subok=True, copy=COPY_IF_NEEDED) try: atol = atol.to(actual.unit) except u.UnitsError: raise u.UnitsError( "Units for 'atol' ({}) and 'actual' ({}) " "are not convertible".format(atol.unit, actual.unit) ) rtol = u.Quantity(rtol, subok=True, copy=COPY_IF_NEEDED) try: rtol = rtol.to(u.dimensionless_unscaled) except Exception: raise u.UnitsError("`rtol` should be dimensionless") return actual.value, desired.value, rtol.value, atol.value def mpl_plot_check(): """Matplotlib plotting test context manager. Create a new figure on __enter__ and calls savefig for the current figure in __exit__. This will trigger a render of the Figure, which can sometimes raise errors if there is a problem. This is writing to an in-memory byte buffer, i.e. is faster than writing to disk. 
""" from io import BytesIO class MPLPlotCheck: def __enter__(self): plt.figure() def __exit__(self, type, value, traceback): plt.savefig(BytesIO(), format="png") plt.close() return MPLPlotCheck() class Checker: """Base class for checker classes in Gammapy.""" def run(self, checks="all"): if checks == "all": checks = self.CHECKS.keys() unknown_checks = sorted(set(checks).difference(self.CHECKS.keys())) if unknown_checks: raise ValueError(f"Unknown checks: {unknown_checks!r}") for check in checks: method = getattr(self, self.CHECKS[check]) yield from method() UNIT_REPLACEMENTS_ASTROPY_5_3 = { "cm2 s TeV": "TeV s cm2", "1 / (cm2 s)": "1 / (s cm2)", "erg / (cm2 s)": "erg / (s cm2)", } def modify_unit_order_astropy_5_3(expected_str): """Modify unit order for tests with astropy >= 5.3.""" if ASTROPY_LT_5_3: for old, new in UNIT_REPLACEMENTS_ASTROPY_5_3.items(): expected_str = expected_str.replace(old, new) return expected_str ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.248642 gammapy-1.3/gammapy/utils/tests/0000755000175100001770000000000014721316215016354 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/__init__.py0000644000175100001770000000010014721316200020446 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_array.py0000644000175100001770000000105314721316200021074 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from gammapy.utils.array import array_stats_str, shape_2N def test_array_stats_str(): actual = array_stats_str(np.pi, "pi") assert actual == "pi : size = 1, min = 3.142, max = 3.142\n" actual = array_stats_str([np.pi, 42]) assert actual == "size = 2, min = 3.142, max = 42.000\n" def test_shape_2N(): shape = (34, 89, 120, 444) expected_shape = (40, 96, 128, 448) assert expected_shape == shape_2N(shape=shape, N=3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_check.py0000644000175100001770000000120614721316200021033 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import yaml from gammapy.utils.check import add_checksum, verify_checksum, yaml_checksum def test_yaml_checksum(): data = { "a": 50, "b": 3.14e-12, } yaml_str = yaml.dump( data, sort_keys=False, indent=4, width=80, default_flow_style=False ) checksum = yaml_checksum(yaml_str) assert checksum == "47fd166725c49519c7c31c19f53b53dd" assert verify_checksum(yaml_str, "47fd166725c49519c7c31c19f53b53dd") yaml_with_checksum = add_checksum(yaml_str) assert "checksum: 47fd166725c49519c7c31c19f53b53dd" in yaml_with_checksum ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_deprecation.py0000644000175100001770000000400014721316200022246 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from ..deprecation import ( GammapyDeprecationWarning, deprecated, deprecated_attribute, deprecated_renamed_argument, ) @deprecated("v1.3", alternative="new_function") def deprecated_function(a, b): return a + b @deprecated(since="v1.3") class deprecated_class: def __init__(self): pass @deprecated_renamed_argument("a", "x", "v1.3") def 
deprecated_argument_function(x, y): return x + y @deprecated_renamed_argument("old", "new", "v1.3", arg_in_kwargs=True) def deprecated_argument_function_kwarg(new=1): return new class some_class: old_attribute = deprecated_attribute( "old_attribute", "v1.3", alternative="new_attribute" ) def __init__(self, value): self._old_attribute = value self._new_attribute = value @property def new_attribute(self): return self._new_attribute def test_deprecated_function(): assert "deprecated:: v1.3" in deprecated_function.__doc__ with pytest.warns(GammapyDeprecationWarning, match="Use new_function instead"): deprecated_function(1, 2) def test_deprecated_class(): assert "deprecated:: v1.3" in deprecated_class.__doc__ with pytest.warns( GammapyDeprecationWarning, match="The deprecated_class class is deprecated" ): deprecated_class() def test_deprecated_argument(): with pytest.warns(GammapyDeprecationWarning): res = deprecated_argument_function(a=1, y=2) assert res == 3 # this warns first and then raises with pytest.warns(GammapyDeprecationWarning): with pytest.raises(TypeError): deprecated_argument_function(a=1, x=2, y=2) with pytest.warns(GammapyDeprecationWarning): res = deprecated_argument_function_kwarg(old=2) assert res == 2 def test_deprecated_attibute(): object = some_class(1) with pytest.warns(GammapyDeprecationWarning): res = object.old_attribute assert res == 1 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_fits.py0000644000175100001770000000475614721316200020740 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose from astropy.io import fits from astropy.table import Column, Table from gammapy.utils.fits import earth_location_from_dict, earth_location_to_dict from gammapy.utils.scripts import make_path from gammapy.utils.testing import requires_data @pytest.fixture() def header(): filename = make_path( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" ) hdulist = fits.open(filename) return hdulist["EVENTS"].header @pytest.fixture() def table(): t = Table(meta={"version": 42}) t["a"] = np.array([1, 2], dtype=np.int32) t["b"] = Column(np.array([1, 2], dtype=np.int64), unit="m", description="Velocity") t["b"].meta["ucd"] = "spam" t["c"] = Column(["x", "yy"], "c") return t def test_table_fits_io_astropy(table): """Test `astropy.table.Table` FITS I/O in Astropy. Having these tests in Gammapy is to check / ensure that the features we rely on work properly for all Astropy versions we support in CI (currently Astropy 1.3 and up) This is useful, because Table FITS I/O was pretty shaky for a while and incrementally improved over time. These are the same examples that we have in the docstring at the top of `gammapy/utils/fits.py`. """ # Check Table -> BinTableHDU hdu = fits.BinTableHDU(table) assert hdu.header["TTYPE2"] == "b" assert hdu.header["TFORM2"] == "K" assert hdu.header["TUNIT2"] == "m" # Check BinTableHDU -> Table table2 = Table.read(hdu) assert isinstance(table2.meta, dict) assert table2.meta == {"VERSION": 42} assert table2["b"].unit == "m" # Note: description doesn't come back in older versions of Astropy # that we still support, so we're not asserting on that here for now. 
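# Illustrative sketch (not part of the original suite): the same unit
# round-trip as above, starting from a column built with a Quantity.
def test_table_fits_io_quantity_unit_roundtrip():
    import astropy.units as u  # local import; not in this module's header

    t = Table()
    t["energy"] = np.array([1.0, 2.0]) * u.TeV
    hdu = fits.BinTableHDU(t)
    assert hdu.header["TUNIT1"] == "TeV"
    assert Table.read(hdu)["energy"].unit == "TeV"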
@requires_data() def test_earth_location_from_dict(header): location = earth_location_from_dict(header) assert_allclose(location.lon.value, 16.50022, rtol=1e-4) assert_allclose(location.lat.value, -23.271777, rtol=1e-4) assert_allclose(location.height.value, 1834.999999, rtol=1e-4) @requires_data() def test_earth_location_to_dict(header): location = earth_location_from_dict(header) loc_dict = earth_location_to_dict(location) assert_allclose(loc_dict["GEOLON"], 16.50022, rtol=1e-4) assert_allclose(loc_dict["GEOLAT"], -23.271777, rtol=1e-4) assert_allclose(loc_dict["ALTITUDE"], 1834.999999, rtol=1e-4) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_gauss.py0000644000175100001770000000575414721316200021114 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from numpy.testing import assert_allclose from astropy import units as u from gammapy.utils.gauss import Gauss2DPDF, MultiGauss2D class TestGauss2DPDF: """Note that we test __call__ and dpdtheta2 by checking that their integrals as advertised are 1.""" def setup_method(self): self.gs = [ Gauss2DPDF(0.1 * u.deg), Gauss2DPDF(1 * u.deg), Gauss2DPDF(1 * u.deg), ] def test_call(self): # Check that value at origin matches the one given here: # http://en.wikipedia.org/wiki/Multivariate_normal_distribution#Bivariate_case for g in self.gs: actual = g(0 * u.deg, 0 * u.deg) desired = 1 / (2 * np.pi * g.sigma**2) assert_allclose(actual, desired) def test_containment(self): for g in self.gs: assert_allclose(g.containment_fraction(g.sigma), 0.39346934028736658) assert_allclose(g.containment_fraction(2 * g.sigma), 0.8646647167633873) def test_theta(self): for g in self.gs: assert_allclose(g.containment_radius(0.68) / g.sigma, 1.5095921854516636) assert_allclose(g.containment_radius(0.95) / g.sigma, 2.4477468306808161) def test_gauss_convolve(self): g = Gauss2DPDF(sigma=3 * u.deg).gauss_convolve(sigma=4 * u.deg) assert_allclose(g.sigma, 5 * u.deg) class TestMultiGauss2D: """Note that we test __call__ and dpdtheta2 by checking that their integrals.""" @staticmethod def test_integral_normalize(): m = MultiGauss2D(sigmas=[1, 2], norms=[3, 4]) assert_allclose(m.integral, 7) m.normalize() assert_allclose(m.integral, 1) @staticmethod def test_containment(): g, g2 = Gauss2DPDF(sigma=1), Gauss2DPDF(sigma=2) m = MultiGauss2D(sigmas=[1]) m2 = MultiGauss2D(sigmas=[1, 2], norms=[3, 4]) for theta in [0, 0.1, 1, 5]: assert_allclose( m.containment_fraction(theta), g.containment_fraction(theta) ) actual = m2.containment_fraction(theta) desired = 3 * g.containment_fraction(theta) + 4 * g2.containment_fraction( theta ) assert_allclose(actual, desired) @staticmethod def test_theta(): # Closure test m = MultiGauss2D(sigmas=[1, 2] * u.deg, norms=[3, 4]) for theta in [0, 0.1, 1, 5] * u.deg: c = m.containment_fraction(theta) t = m.containment_radius(c) assert_allclose(t, theta) @staticmethod def test_gauss_convolve(): # Convolution must add sigmas in square m = MultiGauss2D(sigmas=[3], norms=[5]) m2 = m.gauss_convolve(4, 6) assert_allclose(m2.sigmas, [5]) assert_allclose(m2.integral, 5 * 6) # Check that convolve did not change the original assert_allclose(m.sigmas, [3]) assert_allclose(m.norms, [5]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_integrate.py0000644000175100001770000000101514721316200021736 0ustar00runnerdocker# Licensed under a 3-clause BSD style 
license - see LICENSE.rst from astropy.units import Quantity from gammapy.modeling.models import PowerLawSpectralModel from gammapy.utils.integrate import trapz_loglog from gammapy.utils.testing import assert_quantity_allclose def test_trapz_loglog(): energy = Quantity([1, 10], "TeV") pwl = PowerLawSpectralModel(index=2.3) ref = pwl.integral(energy_min=energy[0], energy_max=energy[1]) val = trapz_loglog(pwl(energy), energy) assert_quantity_allclose(val, ref) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_interpolation.py0000644000175100001770000000102214721316200022641 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np from gammapy.utils.interpolation import LogScale from gammapy.utils.testing import assert_allclose def test_LogScale_inverse(): tiny = np.finfo(np.float32).tiny log_scale = LogScale() values = np.array([1, 1e-5, 1e-40]) log_values = log_scale(values) assert_allclose(log_values, np.array([0, np.log(1e-5), np.log(tiny)])) inv_values = log_scale.inverse(log_values) assert_allclose(inv_values, np.array([1, 1e-5, 0])) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_metadata.py0000644000175100001770000001666414721316200021554 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from typing import ClassVar, Literal, Optional import pytest from numpy.testing import assert_allclose from astropy.coordinates import AltAz, SkyCoord from astropy.io import fits from astropy.time import Time from pydantic import ValidationError from gammapy.utils.metadata import ( METADATA_FITS_KEYS, CreatorMetaData, MetaData, ObsInfoMetaData, PointingInfoMetaData, TargetMetaData, TimeInfoMetaData, ) from gammapy.utils.scripts import make_path from gammapy.utils.testing import requires_data @pytest.fixture() def hess_eventlist_header(): filename = make_path( "$GAMMAPY_DATA/hess-dl3-dr1/data/hess_dl3_dr1_obs_id_023523.fits.gz" ) hdulist = fits.open(filename) return hdulist["EVENTS"].header def test_creator(): default = CreatorMetaData(date="2022-01-01", creator="gammapy", origin="CTA") assert default.creator == "gammapy" assert default.origin == "CTA" assert_allclose(default.date.mjd, 59580) default.creator = "other" assert default.creator == "other" with pytest.raises(ValidationError): default.date = 3 def test_creator_to_header(): header = CreatorMetaData( date="2022-01-01", creator="gammapy", origin="CTA" ).to_header(format="gadf") assert header["CREATOR"] == "gammapy" assert header["ORIGIN"] == "CTA" assert header["CREATED"] == "2022-01-01 00:00:00.000" def test_creator_from_incorrect_header(): # Create header with a 'bad' date hdu = fits.PrimaryHDU() hdu.header["CREATOR"] = "gammapy" hdu.header["CREATED"] = "Tues 6 Feb" meta = CreatorMetaData.from_header(hdu.header) assert meta.date == hdu.header["CREATED"] assert meta.creator == hdu.header["CREATOR"] def test_subclass(): class TestMetaData(MetaData): _tag: ClassVar[Literal["tag"]] = "tag" name: str mode: Optional[SkyCoord] = None creation: Optional[CreatorMetaData] = None creator = CreatorMetaData() test_meta = TestMetaData(name="test", creation=creator) assert test_meta.tag == "tag" assert test_meta.name == "test" assert test_meta.creation.creator.split()[0] == "Gammapy" with pytest.raises(ValidationError): test_meta.mode = "coord" yaml_str = test_meta.to_yaml() assert "name: test" in yaml_str assert 
"creation:" in yaml_str def test_obs_info(): obs_info = ObsInfoMetaData(obs_id="23523") assert obs_info.telescope is None assert obs_info.obs_id == 23523 obs_info.obs_id = 23523 assert obs_info.obs_id == 23523 with pytest.raises(ValidationError): obs_info.obs_id = "ab" obs_info.instrument = "CTA-North" assert obs_info.instrument == "CTA-North" @requires_data() def test_obs_info_from_header(hess_eventlist_header): meta = ObsInfoMetaData.from_header(hess_eventlist_header, format="gadf") assert meta.telescope == "HESS" assert meta.obs_id == 23523 assert meta.observation_mode == "WOBBLE" assert meta.sub_array is None def test_obs_info_to_header(): obs_info = ObsInfoMetaData(obs_id=23523, telescope="CTA-South") header = obs_info.to_header("gadf") assert header["OBS_ID"] == 23523 assert header["TELESCOP"] == "CTA-South" assert "OBS_MODE" not in header def test_pointing_info(): position = SkyCoord(83.6287, 22.0147, unit="deg", frame="icrs") altaz = AltAz("20 deg", "45 deg") pointing = PointingInfoMetaData(radec_mean=position, altaz_mean=altaz) assert isinstance(pointing.altaz_mean, SkyCoord) assert_allclose(pointing.altaz_mean.alt.deg, 45.0) assert_allclose(pointing.radec_mean.ra.deg, 83.6287) pointing.radec_mean = position.galactic assert_allclose(pointing.radec_mean.ra.deg, 83.6287) with pytest.raises(ValidationError): pointing.radec_mean = altaz def test_pointing_info_to_header(): position = SkyCoord(83.6287, 22.0147, unit="deg", frame="icrs") altaz = AltAz("20 deg", "45 deg") header = PointingInfoMetaData(radec_mean=position, altaz_mean=altaz).to_header( "gadf" ) assert_allclose(header["RA_PNT"], 83.6287) assert_allclose(header["AZ_PNT"], 20.0) header = PointingInfoMetaData(radec_mean=position).to_header("gadf") assert "AZ_PNT" not in header.keys() with pytest.raises(ValueError): PointingInfoMetaData(radec_mean=position, altaz_mean=altaz).to_header("bad") @requires_data() def test_pointing_info_from_header(hess_eventlist_header): meta = PointingInfoMetaData.from_header(hess_eventlist_header, format="gadf") assert_allclose(meta.radec_mean.ra.deg, 83.633333) assert_allclose(meta.altaz_mean.alt.deg, 41.389789) meta = PointingInfoMetaData.from_header({}) assert meta.altaz_mean is None assert meta.radec_mean is None def test_target_metadata(): meta = TargetMetaData( name="center", position=SkyCoord(0.0, 0.0, unit="deg", frame="galactic") ) header = meta.to_header(format="gadf") assert meta.name == "center" assert_allclose(meta.position.ra.deg, 266.404988) assert header["OBJECT"] == "center" assert_allclose(header["RA_OBJ"], 266.404988) header = TargetMetaData(name="center").to_header("gadf") assert header["OBJECT"] == "center" assert "RA_OBJ" not in header.keys() @requires_data() def test_target_metadata_from_header(hess_eventlist_header): meta = TargetMetaData.from_header(hess_eventlist_header, format="gadf") assert meta.name == "Crab Nebula" assert_allclose(meta.position.ra.deg, 83.63333333) def test_time_info_metadata(): meta = TimeInfoMetaData( reference_time="2023-01-01 00:00:00", time_start="2024-01-01 00:00:00", time_stop="2024-01-01 12:00:00", ) assert isinstance(meta.reference_time, Time) delta = meta.time_stop - meta.time_start assert_allclose(delta.to_value("h"), 12) header = meta.to_header(format="gadf") assert header["MJDREFI"] == 59945 assert header["TIMESYS"] == "tt" assert_allclose(header["TSTART"], 31536000.000000257) @requires_data() def test_time_info_metadata_from_header(hess_eventlist_header): meta = TimeInfoMetaData.from_header(hess_eventlist_header, format="gadf") 
assert_allclose(meta.reference_time.mjd, 51910.00074287037) assert_allclose(meta.time_start.mjd, 53343.92234009259) def test_subclass_to_from_header(): class TestMetaData(MetaData): _tag: ClassVar[Literal["test"]] = "test" creation: Optional[CreatorMetaData] pointing: Optional[PointingInfoMetaData] METADATA_FITS_KEYS.update({"test": {}}) creator = CreatorMetaData(date="2022-01-01", creator="gammapy", origin="CTA") position = SkyCoord(83.6287, 22.0147, unit="deg", frame="icrs") altaz = AltAz("20 deg", "45 deg") pointing = PointingInfoMetaData(radec_mean=position, altaz_mean=altaz) test_meta = TestMetaData(pointing=pointing, creation=creator) header = test_meta.to_header() assert header["CREATOR"] == "gammapy" assert header["ORIGIN"] == "CTA" assert_allclose(header["RA_PNT"], 83.6287) assert_allclose(header["AZ_PNT"], 20.0) new = TestMetaData.from_header(header) assert new.creation.creator == "gammapy" assert_allclose(new.creation.date.decimalyear, 2022) assert_allclose(new.pointing.radec_mean.ra.deg, 83.6287) assert_allclose(new.pointing.altaz_mean.alt.deg, 45) # no new attributes allowed with pytest.raises(ValidationError): test_meta.extra = 3 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_parallel.py0000644000175100001770000001023014721316200021547 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import astropy.units as u import gammapy.utils.parallel as parallel from gammapy.estimators import FluxPointsEstimator from gammapy.utils.testing import requires_dependency def test_parallel_mixin(): p = parallel.ParallelMixin() with pytest.raises(ValueError): p.parallel_backend = "wrong_name" with pytest.raises(ValueError): p.n_jobs = "5 jobs" @requires_dependency("ray") def test_get_multiprocessing_ray(): assert parallel.is_ray_available() multiprocessing = parallel.get_multiprocessing_ray() assert multiprocessing.__name__ == "ray.util.multiprocessing" def test_run_multiprocessing_wrong_method(): def func(arg): return arg with pytest.raises(ValueError): parallel.run_multiprocessing( func, [True, True], method="wrong_name", pool_kwargs=dict(processes=2) ) def square(x): return x**2 class MyTask: def __init__(self): self.sum_squared = 0 def __call__(self, x): return x**2 def callback(self, result): self.sum_squared += result def test_run_multiprocessing_simple_starmap(): N = 10 inputs = [(_,) for _ in range(N + 1)] result = parallel.run_multiprocessing( func=square, inputs=inputs, method="starmap", pool_kwargs=dict(processes=2), ) assert sum(result) == N * (N + 1) * (2 * N + 1) / 6 def test_run_multiprocessing_simple_apply_async(): N = 10 inputs = [(_,) for _ in range(N + 1)] task = MyTask() _ = parallel.run_multiprocessing( func=task, inputs=inputs, method="apply_async", pool_kwargs=dict(processes=2), method_kwargs=dict(callback=task.callback), ) assert task.sum_squared == N * (N + 1) * (2 * N + 1) / 6 @requires_dependency("ray") def test_run_multiprocessing_simple_ray_starmap(): N = 10 inputs = [(_,) for _ in range(N + 1)] result = parallel.run_multiprocessing( func=square, inputs=inputs, method="starmap", pool_kwargs=dict(processes=2), backend="ray", ) assert sum(result) == N * (N + 1) * (2 * N + 1) / 6 @requires_dependency("ray") def test_run_multiprocessing_simple_ray_apply_async(): N = 10 inputs = [(_,) for _ in range(N + 1)] task = MyTask() _ = parallel.run_multiprocessing( func=task, inputs=inputs, method="apply_async", pool_kwargs=dict(processes=2), 
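# apply_async hands each result to the callback, which accumulates
# task.sum_squared as the workers finish.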
method_kwargs=dict(callback=task.callback), backend="ray", ) assert task.sum_squared == N * (N + 1) * (2 * N + 1) / 6 @requires_dependency("ray") def test_multiprocessing_manager(): with parallel.multiprocessing_manager( backend="ray", method="apply_async", pool_kwargs=dict(processes=2), method_kwargs=dict(callback=None), ): assert parallel.BACKEND_DEFAULT == "ray" assert parallel.POOL_KWARGS_DEFAULT["processes"] == 2 assert parallel.METHOD_DEFAULT == "apply_async" assert "callback" in parallel.METHOD_KWARGS_DEFAULT assert parallel.BACKEND_DEFAULT == parallel.ParallelBackendEnum.multiprocessing assert parallel.POOL_KWARGS_DEFAULT["processes"] == 1 assert parallel.METHOD_DEFAULT == parallel.PoolMethodEnum.starmap assert "callback" not in parallel.METHOD_KWARGS_DEFAULT fpe = FluxPointsEstimator(energy_edges=[1, 3, 10] * u.TeV) assert fpe.parallel_backend == parallel.ParallelBackendEnum.multiprocessing with parallel.multiprocessing_manager(backend="ray", pool_kwargs=dict(processes=2)): assert fpe.parallel_backend == "ray" assert fpe.n_jobs == 2 fpe = FluxPointsEstimator(energy_edges=[1, 3, 10] * u.TeV) assert fpe.parallel_backend == "ray" assert fpe.parallel_backend == parallel.ParallelBackendEnum.multiprocessing fpe = FluxPointsEstimator( energy_edges=[1, 3, 10] * u.TeV, n_jobs=2, parallel_backend="multiprocessing" ) with parallel.multiprocessing_manager(backend="ray", pool_kwargs=dict(processes=3)): assert fpe.parallel_backend == "multiprocessing" assert fpe.n_jobs == 2 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_regions.py0000644000175100001770000000664314721316200021436 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """ Here we test the functions in `gammapy.utils.regions`. We can also add tests for specific functionality and behaviour in https://astropy-regions.readthedocs.io that we rely on in Gammapy. That package is still work in progress and not fully developed and stable, so need to establish a bit what works and what doesn't. 
""" from numpy.testing import assert_allclose, assert_equal import astropy.units as u from astropy.coordinates import SkyCoord from regions import CircleSkyRegion, EllipseSkyRegion, Regions from gammapy.utils.regions import ( SphericalCircleSkyRegion, compound_region_center, region_circle_to_ellipse, region_to_frame, regions_to_compound_region, ) def test_compound_region_center(): regions_ds9 = ( "galactic;" "circle(1,1,0.1);" "circle(-1,1,0.1);" "circle(1,-1,0.1);" "circle(-1,-1,0.1);" ) regions = Regions.parse(regions_ds9, format="ds9") region = regions_to_compound_region(regions) center = compound_region_center(region) assert_allclose(center.galactic.l.wrap_at("180d"), 0 * u.deg, atol=1e-6) assert_allclose(center.galactic.b, 0 * u.deg, atol=1e-6) def test_compound_region_center_single(): region = Regions.parse("galactic;circle(1,1,0.1)", format="ds9")[0] center = compound_region_center(region) assert_allclose(center.galactic.l.wrap_at("180d"), 1 * u.deg, atol=1e-6) assert_allclose(center.galactic.b, 1 * u.deg, atol=1e-6) def test_compound_region_center_concentric(): regions_ds9 = "galactic;" "circle(0,0,0.1);" "circle(0,0,0.2);" regions = Regions.parse(regions_ds9, format="ds9") region = regions_to_compound_region(regions) center = compound_region_center(region) assert_allclose(center.galactic.l.wrap_at("180d"), 0 * u.deg, atol=1e-6) assert_allclose(center.galactic.b, 0 * u.deg, atol=1e-6) def test_compound_region_center_inomogeneous_frames(): region_icrs = Regions.parse("icrs;circle(1,1,0.1)", format="ds9")[0] region_galactic = Regions.parse("galactic;circle(1,1,0.1)", format="ds9")[0] regions = Regions([region_icrs, region_galactic]) region = regions_to_compound_region(regions) center = compound_region_center(region) assert_allclose(center.galactic.l.wrap_at("180d"), 28.01 * u.deg, atol=1e-2) assert_allclose(center.galactic.b, -37.46 * u.deg, atol=1e-2) def test_spherical_circle_sky_region(): region = SphericalCircleSkyRegion( center=SkyCoord(10 * u.deg, 20 * u.deg), radius=10 * u.deg ) coord = SkyCoord([20.1, 22] * u.deg, 20 * u.deg) mask = region.contains(coord) assert_equal(mask, [True, False]) def test_region_to_frame(): region = EllipseSkyRegion( center=SkyCoord(20, 17, unit="deg"), height=0.3 * u.deg, width=1.0 * u.deg, angle=30 * u.deg, ) region_new = region_to_frame(region, "galactic") assert_allclose(region_new.angle, 20.946 * u.deg, rtol=1e-3) assert_allclose(region_new.center.l, region.center.galactic.l, rtol=1e-3) def test_region_circle_to_ellipse(): region = CircleSkyRegion(center=SkyCoord(20, 17, unit="deg"), radius=1.0 * u.deg) region_new = region_circle_to_ellipse(region) assert_allclose(region_new.height, region.radius, rtol=1e-3) assert_allclose(region_new.width, region.radius, rtol=1e-3) assert_allclose(region_new.angle, 0.0 * u.deg, rtol=1e-3) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_roots.py0000644000175100001770000000263714721316200021135 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from numpy.testing import assert_allclose import astropy.units as u from gammapy.utils.roots import find_roots class TestFindRoots: lower_bound = -3 * np.pi * u.rad upper_bound = 0 * u.rad def f(self, x): return np.cos(x) def h(self, x): return x**3 - 1 def test_methods(self): methods = ["brentq", "secant"] for method in methods: roots, res = find_roots( self.f, lower_bound=self.lower_bound, upper_bound=self.upper_bound, 
method=method, ) assert roots.unit == u.rad assert_allclose(2 * roots.value / np.pi, np.array([-5.0, -3.0, -1.0])) assert np.all([sol.converged for sol in res]) roots, res = find_roots( self.h, lower_bound=self.lower_bound, upper_bound=self.upper_bound, method=method, ) assert np.isnan(roots[0]) assert res[0].iterations == 0 def test_invalid_method(self): with pytest.raises(ValueError, match='Unknown solver "xfail"'): find_roots( self.f, lower_bound=self.lower_bound, upper_bound=self.upper_bound, method="xfail", ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_scripts.py0000644000175100001770000000300014721316200021437 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import warnings import pytest import yaml from gammapy.utils.scripts import ( get_images_paths, make_path, read_yaml, recursive_merge_dicts, to_yaml, write_yaml, ) @pytest.mark.xfail def test_get_images_paths(): assert any("images" in str(p) for p in get_images_paths()) def test_recursive_merge_dicts(): old = dict(a=42, b=dict(c=43, e=44)) update = dict(d=99, b=dict(g=98, c=50)) new = recursive_merge_dicts(old, update) assert new["b"]["c"] == 50 assert new["b"]["e"] == 44 assert new["b"]["g"] == 98 assert new["a"] == 42 assert new["d"] == 99 def test_read_write_yaml_checksum(tmp_path): data = to_yaml({"b": 1234, "a": "other"}) path = make_path(tmp_path / "test.yaml") write_yaml(data, path, sort_keys=False, checksum=True) yaml_content = path.read_text() assert "checksum: " in yaml_content with warnings.catch_warnings(): warnings.simplefilter("error") res = read_yaml(path, checksum=True) assert "checksum" not in res.keys() yaml_content = yaml_content.replace("1234", "12345") bad = make_path(tmp_path) / "bad_checksum.yaml" bad.write_text(yaml_content) with pytest.warns(UserWarning): read_yaml(bad, checksum=True) res["checksum"] = "bad" yaml_str = yaml.dump(data, sort_keys=True, default_flow_style=False) path.write_text(yaml_str) with pytest.warns(UserWarning): read_yaml(bad, checksum=True) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_table.py0000644000175100001770000000173514721316200021054 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import astropy.units as u from astropy.table import Column, Table from gammapy.utils.table import table_row_to_dict, table_standardise_units_copy def test_table_standardise_units(): table = Table( [ Column([1], "a", unit="ph cm-2 s-1"), Column([1], "b", unit="ct cm-2 s-1"), Column([1], "c", unit="cm-2 s-1"), Column([1], "d"), ] ) table = table_standardise_units_copy(table) assert table["a"].unit == "cm-2 s-1" assert table["b"].unit == "cm-2 s-1" assert table["c"].unit == "cm-2 s-1" assert table["d"].unit is None @pytest.fixture() def table(): return Table( [Column([1, 2], "a"), Column([1, 2] * u.m, "b"), Column(["x", "yy"], "c")] ) def test_table_row_to_dict(table): actual = table_row_to_dict(table[1]) expected = {"a": 2, "b": 2 * u.m, "c": "yy"} assert actual == expected ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_time.py0000644000175100001770000000517014721316200020720 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from numpy.testing import assert_allclose from astropy.time import Time, TimeDelta import astropy.units as u 
from gammapy.utils.time import ( absolute_time, extract_time_info, time_to_fits, time_to_fits_header, time_ref_from_dict, time_ref_to_dict, time_relative_to_ref, unique_time_info, ) def test_time_to_fits(): time = Time("2001-01-01T00:00:00") fits_time_simple = time_to_fits(time) epoch = Time(10000, format="mjd", scale="tt") fits_time_epoch = time_to_fits(time, epoch) fits_time_unit = time_to_fits(time, unit=u.d) assert_allclose(fits_time_simple, 4.485024e09 * u.s) assert_allclose(fits_time_epoch, 3.621024e09 * u.s) assert_allclose(fits_time_unit, 51910.000743 * u.d) def test_time_to_fits_header(): time = Time("2001-01-01T00:00:00") fits_header_value, fits_header_unit = time_to_fits_header(time) assert_allclose(fits_header_value, 4.485024e09) assert fits_header_unit == "s" def test_time_ref_from_dict(): d = dict(MJDREFI=51910, MJDREFF=0.00074287036841269583) mjd = d["MJDREFF"] + d["MJDREFI"] time = time_ref_from_dict(d) assert time.format == "mjd" assert time.scale == "tt" assert_allclose(time.mjd, mjd) def test_time_ref_to_dict(): time = Time("2001-01-01T00:00:00") d = time_ref_to_dict(time) assert set(d) == {"MJDREFI", "MJDREFF", "TIMESYS"} assert d["MJDREFI"] == 51910 assert_allclose(d["MJDREFF"], 0.00074287036841269583) assert d["TIMESYS"] == "tt" def test_time_relative_to_ref(): time_ref_dict = dict(MJDREFI=500, MJDREFF=0.5) time_ref = time_ref_from_dict(time_ref_dict) delta_time_1sec = TimeDelta(1.0, format="sec") time = time_ref + delta_time_1sec delta_time = time_relative_to_ref(time, time_ref_dict) assert_allclose(delta_time.sec, delta_time_1sec.sec) def test_absolute_time(): time_ref_dict = dict(MJDREFI=51000, MJDREFF=0.5) time_ref = time_ref_from_dict(time_ref_dict) delta_time_1sec = TimeDelta(1.0, format="sec") time = time_ref + delta_time_1sec abs_time = absolute_time(delta_time_1sec, time_ref_dict) assert abs_time.value == time.utc.isot def test_extract_time_info(): dd = dict(MJDREFI=1, MJDREFF=2, TIMEUNIT=3, TIMESYS=4, TIMEREF=5, TELESCOPE="IACT") time_row = extract_time_info(dd) assert time_row["TIMESYS"] == 4 def test_check_time_info(): rows = [ dict(MJDREFI=1, MJDREFF=2, TIMEUNIT=3, TIMESYS=4, TIMEREF=5), dict(MJDREFI=1, MJDREFF=2, TIMEUNIT=5, TIMESYS=4, TIMEREF=5), ] assert unique_time_info(rows) is False ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/tests/test_units.py0000644000175100001770000000240214721316200021117 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np import astropy.units as u from gammapy.maps import MapAxis from gammapy.utils.units import energy_unit_format, standardise_unit def test_standardise_unit(): assert standardise_unit("ph cm-2 s-1") == "cm-2 s-1" assert standardise_unit("ct cm-2 s-1") == "cm-2 s-1" assert standardise_unit("cm-2 s-1") == "cm-2 s-1" axis = MapAxis.from_nodes([1e-1, 200, 3.5e3, 4.6e4], name="energy", unit="GeV") values = [ (1530 * u.eV, "1.53 keV"), (1530 * u.keV, "1.53 MeV"), (1530e4 * u.keV, "15.3 GeV"), (1530 * u.GeV, "1.53 TeV"), (1530.5e8 * u.keV, "153 TeV"), (1530.5 * u.TeV, "1.53 PeV"), ( np.array([1e3, 3.5e6, 400.4e12, 1512.5e12]) * u.eV, ("1.00 keV", "3.50 MeV", "400 TeV", "1.51 PeV"), ), ( [1.54e2 * u.GeV, 4300 * u.keV, 300.6e12 * u.eV], ("154 GeV", "4.30 MeV", "301 TeV"), ), (axis.center, ("100 MeV", "200 GeV", "3.50 TeV", "46.0 TeV")), ( [u.Quantity(x) for x in axis.as_plot_labels], ("100 MeV", "200 GeV", "3.50 TeV", "46.0 TeV"), ), ] @pytest.mark.parametrize("q, expect", values) 
def test_energy_unit_format(q, expect): assert energy_unit_format(q) == expect ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/time.py0000644000175100001770000001346514721316200016525 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Time related utility functions.""" import numpy as np import astropy.units as u from astropy.time import Time, TimeDelta __all__ = [ "absolute_time", "time_ref_from_dict", "time_ref_to_dict", "time_relative_to_ref", ] TIME_KEYWORDS = ["MJDREFI", "MJDREFF", "TIMEUNIT", "TIMESYS", "TIMEREF"] # TODO: implement and document this properly. # see https://github.com/gammapy/gammapy/issues/284 TIME_REF_FERMI = Time("2001-01-01T00:00:00") # Default time ref used for GTIs TIME_REF_DEFAULT = Time("2000-01-01T00:00:00", scale="tt") #: Default epoch gammapy uses for FITS files (MJDREF) #: 0 MJD, TT DEFAULT_EPOCH = Time(0, format="mjd", scale="tt") def time_to_fits(time, epoch=None, unit=u.s): """Convert time to fits format. Times are stored as duration since an epoch in FITS. Parameters ---------- time : `~astropy.time.Time` Time to be converted. epoch : `~astropy.time.Time`, optional Epoch to use for the time. The corresponding keywords must be stored in the same FITS header. Default is None, so the `DEFAULT_EPOCH` is used. unit : `~astropy.units.Unit`, optional Unit, should be stored as `TIMEUNIT` in the same FITS header. Default is u.s. Returns ------- time : astropy.units.Quantity Duration since epoch. """ if epoch is None: epoch = DEFAULT_EPOCH return (time - epoch).to(unit) def time_to_fits_header(time, epoch=None, unit=u.s): """Convert time to fits header format. Times are stored as duration since an epoch in FITS. Parameters ---------- time : `~astropy.time.Time` Time to be converted. epoch : `~astropy.time.Time`, optional Epoch to use for the time. The corresponding keywords must be stored in the same FITS header. Default is None, so `DEFAULT_EPOCH` is used. unit : `~astropy.units.Unit`, optional Unit, should be stored as `TIMEUNIT` in the same FITS header. Default is u.s. Returns ------- header_entry : tuple(float, string) A float / comment tuple with the time and unit. """ if epoch is None: epoch = DEFAULT_EPOCH time = time_to_fits(time, epoch) return time.to_value(unit), unit.to_string("fits") def time_ref_from_dict(meta, format="mjd", scale="tt"): """Calculate the time reference from metadata. Parameters ---------- meta : dict FITS time standard header information. format: str, optional Format of the `~astropy.time.Time` information. Default is 'mjd'. scale: str, optional Scale of the `~astropy.time.Time` information. Default is 'tt'. Returns ------- time : `~astropy.time.Time` Time object with ``format='MJD'``. """ scale = meta.get("TIMESYS", scale).lower() # some files seem to have MJDREFF as string, not as float mjdrefi = float(meta["MJDREFI"]) mjdreff = float(meta["MJDREFF"]) return Time(mjdrefi, mjdreff, format=format, scale=scale) def time_ref_to_dict(time=None, scale="tt"): """Convert the epoch to the relevant FITS header keywords. Parameters ---------- time : `~astropy.time.Time`, optional The reference epoch for storing time in FITS. Default is None, so 'DEFAULT_EPOCH' is used. scale: str, optional Scale of the `~astropy.time.Time` information. Default is "tt". Returns ------- meta : dict FITS time standard header information. 
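Examples
--------
Illustrative conversion of the 2001-01-01 epoch (the fractional MJD part
is the 64.184 s TT - UTC offset):

>>> from astropy.time import Time
>>> from gammapy.utils.time import time_ref_to_dict
>>> d = time_ref_to_dict(Time("2001-01-01T00:00:00"))
>>> int(d["MJDREFI"]), d["TIMESYS"]
(51910, 'tt')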
""" if time is None: time = DEFAULT_EPOCH mjd = Time(time, scale=scale).mjd i = np.floor(mjd).astype(np.int64) f = mjd - i return dict(MJDREFI=i, MJDREFF=f, TIMESYS=scale) def time_relative_to_ref(time, meta): """Convert a time using an existing reference. The time reference is built as MJDREFI + MJDREFF in units of MJD. The time will be converted to seconds after the reference. Parameters ---------- time : `~astropy.time.Time` Time to be converted. meta : dict Dictionary with the keywords ``MJDREFI`` and ``MJDREFF``. Returns ------- time_delta : `~astropy.time.TimeDelta` Time in seconds after the reference. """ time_ref = time_ref_from_dict(meta) return TimeDelta(time - time_ref, format="sec") def absolute_time(time_delta, meta): """Convert a MET into human-readable date and time. Parameters ---------- time_delta : `~astropy.time.TimeDelta` Time in seconds after the MET reference. meta : dict Dictionary with the keywords ``MJDREFI`` and ``MJDREFF``. Returns ------- time : `~astropy.time.Time` Absolute time with ``format='ISOT'`` and ``scale='UTC'``. """ time = time_ref_from_dict(meta) + time_delta return Time(time.utc.isot) def extract_time_info(row): """Extract the timing metadata from an event file header. Parameters ---------- row : dict Dictionary with all the metadata of an event file. Returns ------- time_row : dict Dictionary with only the time metadata. """ time_row = {} for name in TIME_KEYWORDS: time_row[name] = row[name] return time_row def unique_time_info(rows): """Check if the time information are identical between all metadata dictionaries. Parameters ---------- rows : list List of dictionaries with a list of time metadata from different observations. Returns ------- status : bool True if the time metadata are identical between observations. 
""" if len(rows) <= 1: return True first_obs = rows[0] for row in rows[1:]: for name in TIME_KEYWORDS: if first_obs[name] != row[name] or row[name] is None: return False return True ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/types.py0000644000175100001770000001214714721316200016727 0ustar00runnerdockerimport json from pathlib import Path from typing import Annotated from astropy import units as u from astropy.coordinates import AltAz, Angle, EarthLocation, SkyCoord from astropy.time import Time from pydantic import PlainSerializer from pydantic.functional_validators import AfterValidator, BeforeValidator from .observers import observatory_locations from .scripts import make_path __all__ = [ "AngleType", "EnergyType", "QuantityType", "TimeType", "PathType", "EarthLocationType", "SkyCoordType", ] # TODO: replace by QuantityType and pydantic TypeAdapter class JsonQuantityEncoder(json.JSONEncoder): """Support for quantities that JSON default encoder""" def default(self, obj): if isinstance(obj, u.Quantity): return obj.to_string() return json.JSONEncoder.default(self, obj) # TODO: replace by QuantityType and pydantic TypeAdapter class JsonQuantityDecoder(json.JSONDecoder): """Support for quantities that JSON default encoder""" def __init__(self, *args, **kwargs): super().__init__(object_hook=self.object_hook, *args, **kwargs) @staticmethod def object_hook(data): for key, value in data.items(): try: data[key] = u.Quantity(value) except TypeError: continue return data def json_encode_earth_location(v): """JSON encoder for `~astropy.coordinates.EarthLocation`.""" return ( f"lon: {v.lon.value} {v.lon.unit}, " f"lat : {v.lat.value} {v.lat.unit}, " f"height : {v.height.value} {v.height.unit}" ) def json_encode_sky_coord(v): """JSON encoder for `~astropy.coordinates.SkyCoord`.""" return f"lon: {v.spherical.lon.value} {v.spherical.lon.unit}, lat: {v.spherical.lat.value} {v.spherical.lat.unit}, frame: {v.frame.name} " def json_encode_time(v): """JSON encoder for `~astropy.time.Time`.""" if v.isscalar: return v.value return v.value.tolist() def validate_angle(v): """Validator for `~astropy.coordinates.Angle`.""" return Angle(v) def validate_scalar(v): """Validator for scalar values.""" if not v.isscalar: raise ValueError(f"A scalar value is required: {v!r}") return v def validate_energy(v): """Validator for `~astropy.units.Quantity` with unit "energy".""" v = u.Quantity(v) if v.unit.physical_type != "energy": raise ValueError(f"Invalid unit for energy: {v.unit!r}") return v def validate_quantity(v): """Validator for `~astropy.units.Quantity`.""" return u.Quantity(v) def validate_time(v): """Validator for `~astropy.time.Time`.""" return Time(v) def validate_earth_location(v): """Validator for `~astropy.coordinates.EarthLocation`.""" if isinstance(v, EarthLocation): return v if isinstance(v, str) and v in observatory_locations: return observatory_locations[v] try: return EarthLocation(v) except TypeError: raise ValueError(f"Invalid EarthLocation: {v!r}") def validate_sky_coord(v): """Validator for `~astropy.coordinates.SkyCoord`.""" return SkyCoord(v) def validate_sky_coord_icrs(v): """Validator for `~astropy.coordinates.SkyCoord` in icrs.""" try: return SkyCoord(v).icrs except AttributeError: raise ValueError(f"Cannot convert '{v!r}' to icrs") def validate_altaz_coord(v): """Validator for `~astropy.coordinates.AltAz`.""" if isinstance(v, AltAz): return SkyCoord(v) return SkyCoord(v).altaz SERIALIZE_KWARGS = { "when_used": 
"json-unless-none", "return_type": str, } scalar_validator = AfterValidator(validate_scalar) AngleType = Annotated[ Angle, PlainSerializer(lambda v: f"{v.value} {v.unit}", **SERIALIZE_KWARGS), BeforeValidator(validate_angle), scalar_validator, ] EnergyType = Annotated[ u.Quantity, PlainSerializer(lambda v: f"{v.value} {v.unit}", **SERIALIZE_KWARGS), BeforeValidator(validate_energy), scalar_validator, ] QuantityType = Annotated[ u.Quantity, PlainSerializer(lambda v: f"{v.value} {v.unit}", **SERIALIZE_KWARGS), BeforeValidator(validate_quantity), scalar_validator, ] TimeType = Annotated[ Time, PlainSerializer(json_encode_time, when_used="json-unless-none"), BeforeValidator(validate_time), ] EarthLocationType = Annotated[ EarthLocation, PlainSerializer(json_encode_earth_location, **SERIALIZE_KWARGS), BeforeValidator(validate_earth_location), scalar_validator, ] SkyCoordType = Annotated[ SkyCoord, PlainSerializer(json_encode_sky_coord, **SERIALIZE_KWARGS), BeforeValidator(validate_sky_coord), scalar_validator, ] ICRSSkyCoordType = Annotated[ SkyCoord, PlainSerializer(json_encode_sky_coord, **SERIALIZE_KWARGS), BeforeValidator(validate_sky_coord_icrs), scalar_validator, ] AltAzSkyCoordType = Annotated[ SkyCoord, PlainSerializer(json_encode_sky_coord, **SERIALIZE_KWARGS), BeforeValidator(validate_altaz_coord), scalar_validator, ] PathType = Annotated[ Path, PlainSerializer(lambda p: str(p), **SERIALIZE_KWARGS), BeforeValidator(make_path), ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/utils/units.py0000644000175100001770000000505214721316200016722 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Units and Quantity related helper functions.""" import logging from math import floor import numpy as np import astropy.units as u __all__ = ["standardise_unit", "unit_from_fits_image_hdu"] log = logging.getLogger(__name__) def standardise_unit(unit): """Standardise unit. Changes applied by this function: * Drop "photon" == "ph" from the unit * Drop "count" == "ct" from the unit Parameters ---------- unit : `~astropy.units.Unit` or str Any old unit. Returns ------- unit : `~astropy.units.Unit` Shiny new, standardised unit. Examples -------- >>> from gammapy.utils.units import standardise_unit >>> standardise_unit('ph cm-2 s-1') Unit("1 / (s cm2)") >>> standardise_unit('ct cm-2 s-1') Unit("1 / (s cm2)") >>> standardise_unit('cm-2 s-1') Unit("1 / (s cm2)") """ unit = u.Unit(unit) bases, powers = [], [] for base, power in zip(unit.bases, unit.powers): if str(base) not in {"ph", "ct"}: bases.append(base) powers.append(power) return u.CompositeUnit(scale=unit.scale, bases=bases, powers=powers) def unit_from_fits_image_hdu(header): """Read unit from a FITS image HDU. - The ``BUNIT`` key is used. - `astropy.units.Unit` is called. If the ``BUNIT`` value is invalid, a log warning is emitted and the empty unit is used. - `standardise_unit` is called """ unit = header.get("BUNIT", "") try: u.Unit(unit) except ValueError: log.warning(f"Invalid value BUNIT={unit!r} in FITS header. Setting empty unit.") unit = "" return standardise_unit(unit) def energy_unit_format(E): """Format energy quantities to a string representation that is more comfortable to read. This is done by switching to the most relevant unit (keV, MeV, GeV, TeV) and changing the float precision. Parameters ---------- E: `~astropy.units.Quantity` Quantity or list of quantities. 
Returns ------- str : str A string or tuple of strings with energy unit formatted. """ try: iter(E) except TypeError: pass else: return tuple(map(energy_unit_format, E)) i = floor(np.log10(E.to_value(u.eV)) / 3) # a new unit every 3 decades unit = (u.eV, u.keV, u.MeV, u.GeV, u.TeV, u.PeV)[i] if i < 5 else u.PeV v = E.to_value(unit) i = floor(np.log10(v)) prec = (2, 1, 0)[i] if i < 3 else 0 return f"{v:0.{prec}f} {unit}" ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615308.0 gammapy-1.3/gammapy/version.py0000644000175100001770000000051714721316214016113 0ustar00runnerdocker# Note that we need to fall back to the hard-coded version if either # setuptools_scm can't be imported or setuptools_scm can't determine the # version, so we catch the generic 'Exception'. try: from setuptools_scm import get_version version = get_version(root='..', relative_to=__file__) except Exception: version = '1.3' ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.252642 gammapy-1.3/gammapy/visualization/0000755000175100001770000000000014721316215016753 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/__init__.py0000644000175100001770000000133014721316200021053 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from .cmap import colormap_hess, colormap_milagro from .datasets import plot_npred_signal, plot_spectrum_datasets_off_regions from .heatmap import annotate_heatmap, plot_heatmap from .panel import MapPanelPlotter from .utils import ( add_colorbar, plot_contour_line, plot_distribution, plot_map_rgb, plot_theta_squared_table, ) __all__ = [ "annotate_heatmap", "colormap_hess", "colormap_milagro", "MapPanelPlotter", "add_colorbar", "plot_contour_line", "plot_heatmap", "plot_map_rgb", "plot_spectrum_datasets_off_regions", "plot_theta_squared_table", "plot_npred_signal", "plot_distribution", ] ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/cmap.py0000644000175100001770000001074214721316200020243 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Helper functions and functions for plotting gamma-ray images.""" from matplotlib.colors import LinearSegmentedColormap __all__ = ["colormap_hess", "colormap_milagro"] def colormap_hess(transition=0.5, width=0.1): """Colormap often used in H.E.S.S. collaboration publications. This colormap goes black -> blue -> red -> yellow -> white. A sharp blue -> red -> yellow transition is often used for significance images with a value of red at ``transition ~ 5`` or ``transition ~ 7`` so that the following effect is achieved: - black, blue: non-significant features, not well visible - red: features at the detection threshold ``transition`` - yellow, white: significant features, very well visible The transition parameter is defined between 0 and 1. To calculate the value from data units an `~astropy.visualization.mpl_normalize.ImageNormalize` instance should be used (see example below). Parameters ---------- transition : float Value of the transition to red (between 0 and 1). Default is 0.5. width : float Width of the blue-red color transition (between 0 and 1). Default is 0.5. Returns ------- colormap : `matplotlib.colors.LinearSegmentedColormap` Colormap. 
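Notes
-----
The blue control point is placed at ``transition - width`` (falling back
to ``0.1 * transition`` when ``width`` exceeds ``transition``) and the
yellow one at ``transition + 2 / 3 * (1 - transition)``, which produces
the black -> blue -> red -> yellow -> white progression.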
Examples -------- >>> from gammapy.visualization import colormap_hess >>> from astropy.visualization.mpl_normalize import ImageNormalize >>> from astropy.visualization import LinearStretch >>> normalize = ImageNormalize(vmin=-5, vmax=15, stretch=LinearStretch()) >>> transition = normalize(5) >>> cmap = colormap_hess(transition=transition) """ # Compute normalised values (range 0 to 1) that # correspond to red, blue, yellow. red = float(transition) if width > red: blue = 0.1 * red else: blue = red - width yellow = 2.0 / 3.0 * (1 - red) + red black, white = 0, 1 # Create custom colormap # List entries: (value, (R, G, B)) colors = [ (black, "k"), (blue, (0, 0, 0.8)), (red, "r"), (yellow, (1.0, 1.0, 0)), (white, "w"), ] return LinearSegmentedColormap.from_list(name="hess", colors=colors) def colormap_milagro(transition=0.5, width=0.0001, huestart=0.6): """Colormap often used in Milagro collaboration publications. This colormap is gray below ``transition`` and similar to the jet colormap above. A sharp gray -> color transition is often used for significance images with a transition value of ``transition ~ 5`` or ``transition ~ 7``, so that the following effect is achieved: - gray: non-significant features are not well visible - color: significant features at the detection threshold ``transition`` Note that this colormap is often criticised for over-exaggerating small differences in significance below and above the gray - color transition threshold. The transition parameter is defined between 0 and 1. To calculate the value from data units an `~astropy.visualization.mpl_normalize.ImageNormalize` instance should be used (see example below). Parameters ---------- transition : float Transition value (below: gray, above: color). Default is 0.5. width : float Width of the transition. Default is 0.0001. huestart : float Hue of the color at ``transition``. Default is 0.6. Returns ------- colormap : `~matplotlib.colors.LinearSegmentedColormap` Colormap. Examples -------- >>> from gammapy.visualization import colormap_milagro >>> from astropy.visualization.mpl_normalize import ImageNormalize >>> from astropy.visualization import LinearStretch >>> normalize = ImageNormalize(vmin=-5, vmax=15, stretch=LinearStretch()) >>> transition = normalize(5) >>> cmap = colormap_milagro(transition=transition) """ from colorsys import hls_to_rgb # Compute normalised red, blue, yellow values transition = float(transition) # Create custom colormap # List entries: (value, (H, L, S)) colors = [ (0, (1, 1, 0)), (transition - width, (1, 0, 0)), (transition, (huestart, 0.4, 0.5)), (transition + width, (huestart, 0.4, 1)), (0.99, (0, 0.6, 1)), (1, (0, 1, 1)), ] # Convert HLS values to RGB values rgb_colors = [(val, hls_to_rgb(*hls)) for (val, hls) in colors] return LinearSegmentedColormap.from_list(name="milagro", colors=rgb_colors) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/datasets.py0000644000175100001770000001370114721316200021131 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import matplotlib.pyplot as plt __all__ = [ "plot_spectrum_datasets_off_regions", "plot_npred_signal", ] def plot_spectrum_datasets_off_regions( datasets, ax=None, legend=None, legend_kwargs=None, **kwargs ): """Plot the off regions of spectrum datasets. Parameters ---------- datasets : `~gammapy.datasets.Datasets` or list of `~gammapy.datasets.SpectrumDatasetOnOff` List of spectrum on-off datasets. 
ax : `~astropy.visualization.wcsaxes.WCSAxes`, optional Axes object to plot on. Default is None. legend : bool, optional Whether to add/display the labels of the off regions in a legend. By default True if ``len(datasets) <= 10``. Default is None. legend_kwargs : dict, optional Keyword arguments used in `matplotlib.axes.Axes.legend`. The ``handler_map`` cannot be overridden. Default is None. **kwargs : dict Keyword arguments used in `gammapy.maps.RegionNDMap.plot_region`. Can contain a `~cycler.Cycler` in a ``prop_cycle`` argument. Notes ----- Properties from the ``prop_cycle`` have maximum priority, except ``color``, ``edgecolor``/``color`` is selected from the sources below in this order: ``kwargs["edgecolor"]``, ``kwargs["prop_cycle"]``, ``matplotlib.rcParams["axes.prop_cycle"]`` ``matplotlib.rcParams["patch.edgecolor"]``, ``matplotlib.rcParams["patch.facecolor"]`` is never used. Examples -------- Plot without legend and with thick circles:: plot_spectrum_datasets_off_regions(datasets, ax, legend=False, linewidth=2.5) Plot to visualise the overlap of off regions:: plot_spectrum_datasets_off_regions(datasets, ax, alpha=0.3, facecolor='black') Plot that cycles through colors (``edgecolor``) and line styles together:: plot_spectrum_datasets_off_regions(datasets, ax, prop_cycle=plt.cycler(color=list('rgb'), ls=['--', '-', ':'])) # noqa: E501 Plot that uses a modified `~matplotlib.rcParams`, has two legend columns, static and dynamic colors, but only shows labels for ``datasets1`` and ``datasets2``. Note that ``legend_kwargs`` only applies if it's given in the last function call with ``legend=True``:: plt.rc('legend', columnspacing=1, fontsize=9) plot_spectrum_datasets_off_regions(datasets1, ax, legend=True, edgecolor='cyan') plot_spectrum_datasets_off_regions(datasets2, ax, legend=True, legend_kwargs=dict(ncol=2)) plot_spectrum_datasets_off_regions(datasets3, ax, legend=False, edgecolor='magenta') """ from matplotlib.legend_handler import HandlerPatch, HandlerTuple from matplotlib.patches import CirclePolygon, Patch if ax is None: ax = plt.subplot(projection=datasets[0].counts_off.geom.wcs) legend = legend or legend is None and len(datasets) <= 10 legend_kwargs = legend_kwargs or {} handles, labels = [], [] prop_cycle = kwargs.pop("prop_cycle", plt.rcParams["axes.prop_cycle"]) for props, dataset in zip(prop_cycle(), datasets): plot_kwargs = kwargs.copy() plot_kwargs["facecolor"] = "None" plot_kwargs.setdefault("edgecolor") plot_kwargs.update(props) dataset.counts_off.plot_region(ax, **plot_kwargs) # create proxy artist for the custom legend if legend: handle = Patch(**plot_kwargs) handles.append(handle) labels.append(dataset.name) if legend: legend = ax.get_legend() if legend: handles = legend.legendHandles + handles labels = [text.get_text() for text in legend.texts] + labels handles = [(handle, handle) for handle in handles] tuple_handler = HandlerTuple(ndivide=None, pad=0) def patch_func( legend, orig_handle, xdescent, ydescent, width, height, fontsize ): radius = width / 2 return CirclePolygon((radius - xdescent, height / 2 - ydescent), radius) patch_handler = HandlerPatch(patch_func) legend_kwargs.setdefault("handletextpad", 0.5) legend_kwargs["handler_map"] = {Patch: patch_handler, tuple: tuple_handler} ax.legend(handles, labels, **legend_kwargs) return ax def plot_npred_signal( dataset, ax=None, model_names=None, region=None, **kwargs, ): """ Plot the energy distribution of predicted counts of a selection of models assigned to a dataset. 
The background and the sum af all the considered models predicted counts are plotted on top of individual contributions of the considered models. Parameters ---------- dataset : an instance of `~gammapy.datasets.MapDataset` The dataset from which to plot the npred_signal. ax : `~matplotlib.axes.Axes`, optional Axis object to plot on. Default is None. model_names : list of str, optional The list of models for which the npred_signal is plotted. Default is None. If None, all models are considered. region : `~regions.Region` or `~astropy.coordinates.SkyCoord`, optional Region used to reproject predicted counts. Default is None. If None, use the full dataset geometry. **kwargs : dict Keyword arguments to pass to `~gammapy.maps.RegionNDMap.plot`. Returns ------- axes : `~matplotlib.axes.Axes` Axis object. """ npred_not_stack = dataset.npred_signal( model_names=model_names, stack=False ).to_region_nd_map(region) npred_background = dataset.npred_background().to_region_nd_map(region) if ax is None: ax = plt.gca() npred_not_stack.plot(ax=ax, axis_name="energy", **kwargs) if npred_not_stack.geom.axes["models"].nbin > 1: npred_stack = npred_not_stack.sum_over_axes(["models"]) npred_stack.plot(ax=ax, label="stacked models") npred_background.plot(ax=ax, label="background", **kwargs) ax.set_ylabel("Predicted counts") ax.legend() return ax ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/heatmap.py0000644000175100001770000001116114721316200020736 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import numpy as np import matplotlib.pyplot as plt # taken from the matploltlib documentation # https://matplotlib.org/3.1.0/gallery/images_contours_and_fields/image_annotated_heatmap.html#sphx-glr-gallery-images-contours-and-fields-image-annotated-heatmap-py __all__ = [ "annotate_heatmap", "plot_heatmap", ] def plot_heatmap( data, row_labels, col_labels, ax=None, cbar_kw=None, cbarlabel="", **kwargs ): """ Create a heatmap from a numpy array and two lists of labels. Parameters ---------- data : `~numpy.ndarray` Data array. row_labels : list or `~numpy.ndarray` List or array of labels for the rows. col_labels : list or `~numpy.ndarray` List or array of labels for the columns. ax : `matplotlib.axes.Axes`, optional Axis instance to which the heatmap is plotted. Default is None. If None, the current one is used. cbar_kw : dict, optional A dictionary with arguments to `matplotlib.Figure.colorbar`. Default is None. cbarlabel : str, optional The label for the color bar. Default is "". **kwargs : dict, optional Other keyword arguments forwarded to `matplotlib.axes.Axes.imshow`. """ if ax is None: ax = plt.gca() if cbar_kw is None: cbar_kw = {} # Plot the heatmap im = ax.imshow(data, **kwargs) # Create colorbar cbar = ax.figure.colorbar(im, ax=ax, **cbar_kw) cbar.ax.set_ylabel(cbarlabel, rotation=-90, va="bottom") # We want to show all ticks... ax.set_xticks(np.arange(data.shape[1])) ax.set_yticks(np.arange(data.shape[0])) # ... and label them with the respective list entries. ax.set_xticklabels(col_labels) ax.set_yticklabels(row_labels) # Let the horizontal axes labeling appear on top. ax.tick_params(top=True, bottom=False, labeltop=True, labelbottom=False) # Rotate the tick labels and set their alignment. plt.setp(ax.get_xticklabels(), rotation=-30, ha="right", rotation_mode="anchor") # Turn spines off and create white grid. 
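# The white grid is drawn on minor ticks placed at the cell boundaries
# (offset by 0.5 from the cell centers), while the major ticks keep the
# row/column labels; the minor tick marks themselves are hidden.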
for edge, spine in ax.spines.items(): spine.set_visible(False) ax.set_xticks(np.arange(data.shape[1] + 1) - 0.5, minor=True) ax.set_yticks(np.arange(data.shape[0] + 1) - 0.5, minor=True) ax.grid(which="minor", color="w", linestyle="-", linewidth=1.5) ax.tick_params(which="minor", bottom=False, left=False) return im, cbar def annotate_heatmap( im, data=None, valfmt="{x:.2f}", textcolors=("black", "white"), threshold=None, **textkw, ): """ A function to annotate a heatmap. Parameters ---------- im The AxesImage to be labeled. data : `~numpy.ndarray`, optional Data used to annotate. Default is None. If None, the image's data is used. valfmt : str format or `matplotlib.ticker.Formatter`, optional The format of the annotations inside the heatmap. This should either use the string format method, e.g. "$ {x:.2f}" or be a `matplotlib.ticker.Formatter` instance. Default is "{x:.2f}". textcolors : list or `~numpy.ndarray`, optional Two color specifications. The first is used for values below a threshold, the second for those above. Default is ["black", "white"]. threshold : float, optional Value in data units according to which the colors from textcolors are applied. Default is None. If None the middle of the colormap is used as separation. **kwargs : dict, optional Other keyword arguments forwarded to each call to `text` used to create the text labels. """ import matplotlib if not isinstance(data, (list, np.ndarray)): data = im.get_array() # Normalize the threshold to the images color range. if threshold is not None: threshold = im.norm(threshold) else: threshold = im.norm(data.max()) / 2.0 # Set default alignment to center, but allow it to be # overwritten by textkw. kw = dict(horizontalalignment="center", verticalalignment="center") kw.update(textkw) # Get the formatter in case a string is supplied if isinstance(valfmt, str): valfmt = matplotlib.ticker.StrMethodFormatter(valfmt) # Loop over the data and create a `Text` for each "pixel". # Change the text's color depending on the data. texts = [] for i in range(data.shape[0]): for j in range(data.shape[1]): kw.update(color=textcolors[int(im.norm(data[i, j]) > threshold)]) text = im.axes.text(j, i, valfmt(data[i, j], None), **kw) texts.append(text) return texts ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/panel.py0000644000175100001770000000721514721316200020423 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst """Helper functions and functions for plotting gamma-ray images.""" import numpy as np from astropy.coordinates import Angle from matplotlib.gridspec import GridSpec __all__ = ["MapPanelPlotter"] __doctest_requires__ = {("colormap_hess", "colormap_milagro"): ["matplotlib"]} class MapPanelPlotter: """ Map panel plotter class. Given a `~matplotlib.pyplot.Figure` object this class creates axes objects using `~matplotlib.gridspec.GridSpec` and plots a given sky map onto these. Parameters ---------- figure : `~matplotlib.pyplot.figure.` Figure instance. xlim : `~astropy.coordinates.Angle` Angle object specifying the longitude limits. ylim : `~astropy.coordinates.Angle` Angle object specifying the latitude limits. npanels : int Number of panels. **kwargs : dict Keyword arguments passed to `~matplotlib.gridspec.GridSpec`. 
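Examples
--------
A minimal sketch, mirroring the package's own test of this class; the
figure size is illustrative::

    import matplotlib.pyplot as plt
    from astropy.coordinates import Angle
    from gammapy.maps import Map
    from gammapy.visualization import MapPanelPlotter

    fig = plt.figure(figsize=(12, 4))
    plotter = MapPanelPlotter(
        figure=fig, xlim=Angle([-5, 5], "deg"), ylim=Angle([-2, 2], "deg"), npanels=2
    )
    map_image = Map.create(width=(180, 10), binsz=1)
    axes = plotter.plot(map_image)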
""" def __init__(self, figure, xlim, ylim, npanels=4, **kwargs): self.figure = figure self.parameters = {"xlim": xlim, "ylim": ylim, "npanels": npanels} self.grid_spec = GridSpec(nrows=npanels, ncols=1, **kwargs) def _get_ax_extend(self, ax, panel): """Get width and height of the axis in world coordinates.""" p = self.parameters # compute aspect ratio of the axis aspect = ax.bbox.width / ax.bbox.height # compute width and height in world coordinates height = np.abs(p["ylim"].diff()[0]) width = aspect * height left, bottom = p["xlim"][0].wrap_at("180d"), p["ylim"][0] width_all = np.abs(p["xlim"].wrap_at("180d").diff()[0]) xoverlap = ((p["npanels"] * width) - width_all) / (p["npanels"] - 1.0) if xoverlap < 0: raise ValueError( "No overlap between panels. Please reduce figure " "height or increase vertical space between the panels." ) left = left - panel * (width - xoverlap) return left, bottom, width, height def _set_ax_fov(self, ax, panel): left, bottom, width, height = self._get_ax_extend(ax, panel) # set fov xlim = Angle([left, left - width]) ylim = Angle([bottom, bottom + height]) xlim_pix, ylim_pix = ax.wcs.wcs_world2pix(xlim.deg, ylim.deg, 1) ax.set_xlim(*xlim_pix) ax.set_ylim(*ylim_pix) return ax def plot_panel(self, map, panel=1, panel_fov=None, **kwargs): """ Plot sky map on one panel. Parameters ---------- map : `~gammapy.maps.WcsNDMap` Map to plot. panel : int Which panel to plot on (counted from top). Default is 1. panel_fov : int, optional Field of view of the panel. Default is None. kwargs : dict Keyword arguments to be passed to `~map.plot`. """ if panel_fov is None: panel_fov = panel spec = self.grid_spec[panel] ax = self.figure.add_subplot(spec, projection=map.geom.wcs) try: ax = map.plot(ax=ax, **kwargs) except AttributeError: ax = map.plot_rgb(ax=ax, **kwargs) ax = self._set_ax_fov(ax, panel_fov) return ax def plot(self, map, **kwargs): """ Plot sky map on all panels. Parameters ---------- map : `~gammapy.maps.WcsNDMap` Map to plot. 
""" p = self.parameters axes = [] for panel in range(p["npanels"]): ax = self.plot_panel(map, panel=panel, **kwargs) axes.append(ax) return axes ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.252642 gammapy-1.3/gammapy/visualization/tests/0000755000175100001770000000000014721316215020115 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/tests/__init__.py0000644000175100001770000000010014721316200022207 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/tests/test_cmap.py0000644000175100001770000000245714721316200022450 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from numpy.testing import assert_allclose from gammapy.visualization import colormap_hess, colormap_milagro def _check_cmap_rgb_vals(vals, cmap, vmin=0, vmax=1): """Helper function to check RGB values of color images""" from matplotlib.cm import ScalarMappable from matplotlib.colors import Normalize norm = Normalize(vmin, vmax) sm = ScalarMappable(norm=norm, cmap=cmap) for val, rgb_expected in vals: rgb_actual = sm.to_rgba(val)[:-1] assert_allclose(rgb_actual, rgb_expected, atol=1e-5) def test_colormap_hess(): transition = 0.5 cmap = colormap_hess(transition=transition) vals = [ (0, (0.0, 0.0, 0.0)), (0.25, (0.0, 0.0, 0.50196078)), (0.5, (1.0, 0.0058823529411764722, 0.0)), (0.75, (1.0, 0.75882352941176501, 0.0)), (1, (1.0, 1.0, 1.0)), ] _check_cmap_rgb_vals(vals, cmap) def test_colormap_milagro(): transition = 0.5 cmap = colormap_milagro(transition=transition) vals = [ (0, (1.0, 1.0, 1.0)), (0.25, (0.4979388, 0.4979388, 0.4979388)), (0.5, (0.00379829, 0.3195442, 0.79772102)), (0.75, (0.51610773, 0.25806707, 0.49033536)), (1.0, (1.0, 1.0, 1.0)), ] _check_cmap_rgb_vals(vals, cmap) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/tests/test_datasets.py0000644000175100001770000000521114721316200023327 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest from numpy.testing import assert_allclose import matplotlib from packaging import version from gammapy.datasets.tests.test_map import MapDataset from gammapy.modeling.models import ( FoVBackgroundModel, GaussianSpatialModel, PowerLawSpectralModel, SkyModel, ) from gammapy.utils.testing import mpl_plot_check, requires_data from gammapy.visualization import plot_npred_signal, plot_spectrum_datasets_off_regions @pytest.fixture def sky_model(): spatial_model = GaussianSpatialModel( lon_0="0.2 deg", lat_0="0.1 deg", sigma="0.2 deg", frame="galactic" ) spectral_model = PowerLawSpectralModel( index=3, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV" ) return SkyModel( spatial_model=spatial_model, spectral_model=spectral_model, name="test-model" ) @pytest.mark.skipif( version.parse(matplotlib.__version__) < version.parse("3.5"), reason="Requires matplotlib 3.5 or higher", ) def test_plot_spectrum_datasets_off_regions(): from gammapy.datasets import SpectrumDatasetOnOff from gammapy.maps import Map, RegionNDMap counts_off_1 = RegionNDMap.create("icrs;circle(0, 0.5, 0.2);circle(0.5, 0, 0.2)") counts_off_2 = RegionNDMap.create("icrs;circle(0.5, 0.5, 0.2);circle(0, 0, 0.2)") counts_off_3 = RegionNDMap.create("icrs;point(0.5, 
0.5);point(0, 0)") m = Map.from_geom(geom=counts_off_1.geom.to_wcs_geom()) ax = m.plot() dataset_1 = SpectrumDatasetOnOff(counts_off=counts_off_1) dataset_2 = SpectrumDatasetOnOff(counts_off=counts_off_2) dataset_3 = SpectrumDatasetOnOff(counts_off=counts_off_3) plot_spectrum_datasets_off_regions( ax=ax, datasets=[dataset_1, dataset_2, dataset_3] ) actual = ax.patches[0].get_edgecolor() assert_allclose(actual, (0.121569, 0.466667, 0.705882, 1.0), rtol=1e-2) actual = ax.patches[2].get_edgecolor() assert_allclose(actual, (1.0, 0.498039, 0.054902, 1.0), rtol=1e-2) assert ax.lines[0].get_color() in ["green", "C0"] @requires_data() def test_plot_npred_signal(sky_model): dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") pwl = PowerLawSpectralModel() gauss = GaussianSpatialModel( lon_0="0.0 deg", lat_0="0.0 deg", sigma="0.5 deg", frame="galactic" ) model1 = SkyModel(pwl, gauss, name="m1") bkg = FoVBackgroundModel(dataset_name=dataset.name) dataset.models = [bkg, sky_model, model1] with mpl_plot_check(): plot_npred_signal(dataset) with mpl_plot_check(): plot_npred_signal(dataset, model_names=[sky_model.name, model1.name]) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/tests/test_panel.py0000644000175100001770000000105114721316200022614 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst from astropy.coordinates import Angle import matplotlib.pyplot as plt from gammapy.maps import Map from gammapy.utils.testing import mpl_plot_check from gammapy.visualization import MapPanelPlotter def test_map_panel_plotter(): fig = plt.figure() plotter = MapPanelPlotter( figure=fig, xlim=Angle([-5, 5], "deg"), ylim=Angle([-2, 2], "deg"), npanels=2 ) map_image = Map.create(width=(180, 10), binsz=1) with mpl_plot_check(): plotter.plot(map_image) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/tests/test_utils.py0000644000175100001770000001007014721316200022656 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import pytest import numpy as np from scipy.stats import norm import astropy.units as u from astropy.table import Table import matplotlib.pyplot as plt from gammapy.maps import Map, MapAxis, WcsNDMap from gammapy.utils.random import get_random_state from gammapy.utils.testing import mpl_plot_check, requires_data from gammapy.visualization import ( add_colorbar, plot_contour_line, plot_distribution, plot_map_rgb, plot_theta_squared_table, ) @requires_data() def test_add_colorbar(): map_ = Map.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") fig, ax = plt.subplots() with mpl_plot_check(): img = ax.imshow(map_.sum_over_axes().data[0, :, :]) add_colorbar(img, ax=ax, label="Colorbar label") axes_loc = {"position": "left", "size": "2%", "pad": "15%"} fig, ax = plt.subplots() with mpl_plot_check(): img = ax.imshow(map_.sum_over_axes().data[0, :, :]) add_colorbar(img, ax=ax, axes_loc=axes_loc) kwargs = {"use_gridspec": False, "orientation": "horizontal"} fig, ax = plt.subplots() with mpl_plot_check(): img = ax.imshow(map_.sum_over_axes().data[0, :, :]) cbar = add_colorbar(img, ax=ax, **kwargs) assert cbar.orientation == "horizontal" def test_map_panel_plotter(): t = np.linspace(0.0, 6.1, 10) x = np.cos(t) y = np.sin(t) fig = plt.figure() ax = fig.add_subplot(1, 1, 1) with mpl_plot_check(): plot_contour_line(ax, x, y) x = np.append(x, x[0]) y = np.append(y, y[0]) with 
mpl_plot_check(): plot_contour_line(ax, x, y) def test_plot_theta2_distribution(): table = Table() table["theta2_min"] = [0, 0.1] table["theta2_max"] = [0.1, 0.2] for column in [ "counts", "counts_off", "excess", "excess_errp", "excess_errn", "sqrt_ts", ]: table[column] = [1, 1] # open a new figure to avoid plt.figure() plot_theta_squared_table(table=table) @requires_data() def test_plot_map_rgb(): map_ = Map.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") with pytest.raises(ValueError): plot_map_rgb(map_) with pytest.raises(ValueError): plot_map_rgb(map_.sum_over_axes(keepdims=False)) axis = MapAxis([0, 1, 2, 3], node_type="edges") map_allsky = WcsNDMap.create(binsz=10 * u.deg, axes=[axis]) # Astropy 7 does not support uniformly 0 maps map_allsky.data += 1 with mpl_plot_check(): plot_map_rgb(map_allsky) axis_rgb = MapAxis.from_energy_edges( [0.1, 0.2, 0.5, 10], unit=u.TeV, name="energy", interp="log" ) map_ = map_.resample_axis(axis_rgb) kwargs = {"stretch": 0.5, "Q": 1, "minimum": 0.15} with mpl_plot_check(): plot_map_rgb(map_, **kwargs) def test_plot_distribution(): random_state = get_random_state(0) array = random_state.normal(0, 1, 10000) array_2d = array.reshape(1, 100, 100) energy_axis = MapAxis.from_energy_edges([1, 10] * u.TeV) map_ = WcsNDMap.create(npix=(100, 100), axes=[energy_axis]) map_.data = array_2d energy_axis_10 = MapAxis.from_energy_bounds(1 * u.TeV, 10 * u.TeV, 10) map_empty = WcsNDMap.create(npix=(100, 100), axes=[energy_axis_10]) def fit_func(x, mu, sigma): return norm.pdf(x, mu, sigma) with mpl_plot_check(): axes, res = plot_distribution( wcs_map=map_, func=fit_func, kwargs_hist={"bins": 40} ) assert axes.shape == (1,) assert "info_dict" in res[0] assert ["fvec", "nfev", "fjac", "ipvt", "qtf"] == list( (res[0].get("info_dict").keys()) ) assert res[0].get("param") is not None assert res[0].get("covar") is not None axes, res = plot_distribution(map_empty) assert res == [] assert axes.shape == (4, 3) axes, res = plot_distribution( wcs_map=map_, func="norm", kwargs_hist={"bins": 40} ) fig, ax = plt.subplots(nrows=1, ncols=1) plot_distribution(map_empty, ax=ax) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/gammapy/visualization/utils.py0000644000175100001770000003163214721316200020464 0ustar00runnerdocker# Licensed under a 3-clause BSD style license - see LICENSE.rst import logging as log import numpy as np from scipy.interpolate import CubicSpline from scipy.optimize import curve_fit from scipy.stats import norm from astropy.visualization import make_lupton_rgb import matplotlib.axes as maxes import matplotlib.pyplot as plt from mpl_toolkits.axes_grid1 import make_axes_locatable __all__ = [ "add_colorbar", "plot_contour_line", "plot_map_rgb", "plot_theta_squared_table", "plot_distribution", ] ARTIST_TO_LINE_PROPERTIES = { "color": "markeredgecolor", "edgecolor": "markeredgecolor", "ec": "markeredgecolor", "facecolor": "markerfacecolor", "fc": "markerfacecolor", "linewidth": "markeredgewidth", "lw": "markeredgewidth", } def add_colorbar(img, ax, axes_loc=None, **kwargs): """ Add colorbar to a given axis. Parameters ---------- img : `~matplotlib.image.AxesImage` The image to plot the colorbar for. ax : `~matplotlib.axes.Axes` Matplotlib axes. axes_loc : dict, optional Keyword arguments passed to `~mpl_toolkits.axes_grid1.axes_divider.AxesDivider.append_axes`. kwargs : dict, optional Keyword arguments passed to `~matplotlib.pyplot.colorbar`. Returns ------- cbar : `~matplotlib.pyplot.colorbar` The colorbar. 
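Notes
-----
Unless overridden via ``axes_loc``, the colorbar axis is appended to the
right of ``ax`` with a relative size of 5% and a padding of 2%.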
Examples -------- .. testcode:: from gammapy.maps import Map from gammapy.visualization import add_colorbar import matplotlib.pyplot as plt map_ = Map.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") axes_loc = {"position": "right", "size": "2%", "pad": "10%"} kwargs_colorbar = {'label':'Colorbar label'} # Example outside gammapy fig = plt.figure(figsize=(6, 3)) ax = fig.add_subplot(111) img = ax.imshow(map_.sum_over_axes().data[0,:,:]) add_colorbar(img, ax=ax, axes_loc=axes_loc, **kwargs_colorbar) # `add_colorbar` is available for the `plot` function here: fig = plt.figure(figsize=(6, 3)) ax = fig.add_subplot(111) map_.sum_over_axes().plot(ax=ax, add_cbar=True, axes_loc=axes_loc, kwargs_colorbar=kwargs_colorbar) # doctest: +SKIP """ kwargs.setdefault("use_gridspec", True) kwargs.setdefault("orientation", "vertical") axes_loc = axes_loc or {} axes_loc.setdefault("position", "right") axes_loc.setdefault("size", "5%") axes_loc.setdefault("pad", "2%") axes_loc.setdefault("axes_class", maxes.Axes) divider = make_axes_locatable(ax) cax = divider.append_axes(**axes_loc) cbar = plt.colorbar(img, cax=cax, **kwargs) return cbar def plot_map_rgb(map_, ax=None, **kwargs): """ Plot RGB image on matplotlib WCS axes. This function is based on the `~astropy.visualization.make_lupton_rgb` function. The input map must contain 1 non-spatial axis with exactly 3 bins. If this is not the case, the map has to be resampled before using the `plot_map_rgb` function (e.g. as shown in the code example below). Parameters ---------- map_ : `~gammapy.maps.WcsNDMap` WCS map. The map must contain 1 non-spatial axis with exactly 3 bins. ax : `~astropy.visualization.wcsaxes.WCSAxes`, optional WCS axis object to plot on. **kwargs : dict Keyword arguments passed to `~astropy.visualization.make_lupton_rgb`. Returns ------- ax : `~astropy.visualization.wcsaxes.WCSAxes` WCS axis object. Examples -------- >>> from gammapy.visualization import plot_map_rgb >>> from gammapy.maps import Map, MapAxis >>> import astropy.units as u >>> map_ = Map.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> axis_rgb = MapAxis.from_energy_edges( ... [0.1, 0.2, 0.5, 10], unit=u.TeV, name="energy", interp="log" ... 
) >>> map_ = map_.resample_axis(axis_rgb) >>> kwargs = {"stretch": 0.5, "Q": 1, "minimum": 0.15} >>> plot_map_rgb(map_.smooth(0.08*u.deg), **kwargs) #doctest: +SKIP """ geom = map_.geom if len(geom.axes) != 1 or geom.axes[0].nbin != 3: raise ValueError( "One non-spatial axis with exactly 3 bins is needed to plot an RGB image" ) data = [data_slice / np.nanmax(data_slice.flatten()) for data_slice in map_.data] data = make_lupton_rgb(*data, **kwargs) ax = map_._plot_default_axes(ax=ax) ax.imshow(data) if geom.is_allsky: ax = map_._plot_format_allsky(ax) else: ax = map_._plot_format(ax) # without this the axis limits are changed when calling scatter ax.autoscale(enable=False) return ax def plot_contour_line(ax, x, y, **kwargs): """Plot smooth curve from contour points.""" xf = x yf = y # close contour if not (x[0] == x[-1] and y[0] == y[-1]): xf = np.append(x, x[0]) yf = np.append(y, y[0]) # curve parametrization must be strictly increasing # so we use the cumulative distance of each point from the first one dist = np.sqrt(np.diff(xf) ** 2.0 + np.diff(yf) ** 2.0) dist = [0] + list(dist) t = np.cumsum(dist) ts = np.linspace(0, t[-1], 50) # 1D cubic spline interpolation cs = CubicSpline(t, np.c_[xf, yf], bc_type="periodic") out = cs(ts) # plot if "marker" in kwargs.keys(): marker = kwargs.pop("marker") else: marker = "+" if "color" in kwargs.keys(): color = kwargs.pop("color") else: color = "b" ax.plot(out[:, 0], out[:, 1], "-", color=color, **kwargs) ax.plot(xf, yf, linestyle="", marker=marker, color=color) def plot_theta_squared_table(table): """Plot the theta2 distribution of counts, excess and significance. Take the table containing the ON counts, the OFF counts, the acceptance, the off acceptance and the alpha (normalisation between ON and OFF) for each theta2 bin. Parameters ---------- table : `~astropy.table.Table` Required columns: theta2_min, theta2_max, counts, counts_off and alpha """ from gammapy.maps import MapAxis from gammapy.maps.axes import UNIT_STRING_FORMAT from gammapy.maps.utils import edges_from_lo_hi theta2_edges = edges_from_lo_hi( table["theta2_min"].quantity, table["theta2_max"].quantity ) theta2_axis = MapAxis.from_edges(theta2_edges, interp="lin", name="theta_squared") ax0 = plt.subplot(2, 1, 1) x = theta2_axis.center.value x_edges = theta2_axis.edges.value xerr = (x - x_edges[:-1], x_edges[1:] - x) ax0.errorbar( x, table["counts"], xerr=xerr, yerr=np.sqrt(table["counts"]), linestyle="None", label="Counts", ) ax0.errorbar( x, table["counts_off"], xerr=xerr, yerr=np.sqrt(table["counts_off"]), linestyle="None", label="Counts Off", ) ax0.errorbar( x, table["excess"], xerr=xerr, yerr=(table["excess_errn"], table["excess_errp"]), fmt="+", linestyle="None", label="Excess", ) ax0.set_ylabel("Counts") ax0.set_xticks([]) ax0.set_xlabel("") ax0.legend() ax1 = plt.subplot(2, 1, 2) ax1.errorbar(x, table["sqrt_ts"], xerr=xerr, linestyle="None") ax1.set_xlabel(f"Theta [{theta2_axis.unit.to_string(UNIT_STRING_FORMAT)}]") ax1.set_ylabel("Significance") def plot_distribution( wcs_map, ax=None, ncols=3, func=None, kwargs_hist=None, kwargs_axes=None, kwargs_fit=None, ): """ Plot the 1D distribution of data inside a map as an histogram. If the dimension of the map is smaller than 2, a unique plot will be displayed. Otherwise, if the dimension is 3 or greater, a grid of plot will be displayed. Parameters ---------- wcs_map : `~gammapy.maps.WcsNDMap` A map that contains data to be plotted. ax : `~matplotlib.axes.Axes` or list of `~matplotlib.axes.Axes` Axis object to plot on. 
If a list of Axis is provided it has to be the same length as the length of _map.data. ncols : int Number of columns to plot if a "plot grid" was to be done. func : function object or str The function used to fit a map data histogram or "norm". Default is None. If None, no fit will be performed. If "norm" is given, `scipy.stats.norm.pdf` will be passed to `scipy.optimize.curve_fit`. kwargs_hist : dict Keyword arguments to pass to `matplotlib.pyplot.hist`. kwargs_axes : dict Keyword arguments to pass to `matplotlib.axes.Axes`. kwargs_fit : dict Keyword arguments to pass to `scipy.optimize.curve_fit` Returns ------- axes : `~numpy.ndarray` of `~matplotlib.pyplot.Axes` Array of Axes. result_list : list of dict List of dictionnary that contains the results of `scipy.optimize.curve_fit`. The number of elements in the list correspond to the dimension of the non-spatial axis of the map. The dictionnary contains: * `axis_edges` : the edges of the non-spatial axis bin used * `param` : the best-fit parameters of the input function `func` * `covar` : the covariance matrix for the fitted parameters `param` * `info_dict` : the `infodict` return of `scipy.optimize.curve_fit` Examples -------- >>> from gammapy.datasets import MapDataset >>> from gammapy.estimators import TSMapEstimator >>> from scipy.stats import norm >>> from gammapy.visualization import plot_distribution >>> dataset = MapDataset.read("$GAMMAPY_DATA/cta-1dc-gc/cta-1dc-gc.fits.gz") >>> tsmap_est = TSMapEstimator().run(dataset) >>> axs, res = plot_distribution(tsmap_est.sqrt_ts, func="norm", kwargs_hist={'bins': 75, 'range': (-10, 10), 'density': True}) >>> # Equivalently, one can do the following: >>> func = lambda x, mu, sig : norm.pdf(x, loc=mu, scale=sig) >>> axs, res = plot_distribution(tsmap_est.sqrt_ts, func=func, kwargs_hist={'bins': 75, 'range': (-10, 10), 'density': True}) """ from gammapy.maps import WcsNDMap # import here to avoid circular import if not isinstance(wcs_map, WcsNDMap): raise TypeError( f"map_ must be an instance of gammapy.maps.WcsNDMap, given {type(wcs_map)}" ) kwargs_hist = kwargs_hist or {} kwargs_axes = kwargs_axes or {} kwargs_fit = kwargs_fit or {} kwargs_hist.setdefault("density", True) kwargs_fit.setdefault("full_output", True) cutout, mask = wcs_map.cutout_and_mask_region() idx_x, idx_y = np.where(mask) data = cutout.data[..., idx_x, idx_y] if ax is None: n_plot = len(data) cols = min(ncols, n_plot) rows = 1 + (n_plot - 1) // cols width = 12 figsize = (width, width * rows / cols) fig, axes = plt.subplots( nrows=rows, ncols=cols, figsize=figsize, ) cells_in_grid = rows * cols else: axes = np.array([ax]) cells_in_grid = axes.size if not isinstance(axes, np.ndarray): axes = np.array([axes]) result_list = [] for idx in range(cells_in_grid): axe = axes.flat[idx] if idx > len(data) - 1: axe.set_visible(False) continue d = data[idx][np.isfinite(data[idx])] n, bins, _ = axe.hist(d, **kwargs_hist) if func is not None: kwargs_plot_fit = {"label": "Fit"} centers = 0.5 * (bins[1:] + bins[:-1]) if func == "norm": def func_to_fit(x, mu, sigma): return norm.pdf(x, mu, sigma) pars, cov, infodict, message, _ = curve_fit( func_to_fit, centers, n, **kwargs_fit ) mu, sig = pars[0], pars[1] err_mu, err_sig = np.sqrt(cov[0][0]), np.sqrt(cov[1][1]) label_norm = ( r"$\mu$ = {:.2f} ± {:.2E}\n$\sigma$ = {:.2f} ± {:.2E}".format( mu, err_mu, sig, err_sig ) ).replace(r"\n", "\n") kwargs_plot_fit["label"] = label_norm else: func_to_fit = func pars, cov, infodict, message, _ = curve_fit( func_to_fit, centers, n, **kwargs_fit ) 
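# curve_fit ran with full_output=True (set via kwargs_fit above), so it
# also returned an info dict and a status message; the info dict is
# stored in result_list below and the message is logged.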
axis_edges = ( wcs_map.geom.axes[-1].edges[idx], wcs_map.geom.axes[-1].edges[idx + 1], ) result_dict = { "axis_edges": axis_edges, "param": pars, "covar": cov, "info_dict": infodict, } result_list.append(result_dict) log.info(message) xmin, xmax = kwargs_hist.get("range", (np.min(d), np.max(d))) x = np.linspace(xmin, xmax, 1000) axe.plot(x, func_to_fit(x, *pars), lw=2, color="black", **kwargs_plot_fit) axe.set(**kwargs_axes) axe.legend() return axes, result_list ././@PaxHeader0000000000000000000000000000003300000000000010211 xustar0027 mtime=1732615309.252642 gammapy-1.3/gammapy.egg-info/0000755000175100001770000000000014721316215015544 5ustar00runnerdocker././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615308.0 gammapy-1.3/gammapy.egg-info/PKG-INFO0000644000175100001770000000664114721316214016647 0ustar00runnerdockerMetadata-Version: 2.1 Name: gammapy Version: 1.3 Summary: A Python package for gamma-ray astronomy Home-page: https://gammapy.org Author: The Gammapy developers Author-email: gammapy-coordination-l@in2p3.fr License: BSD 3-Clause Platform: any Classifier: Intended Audience :: Science/Research Classifier: License :: OSI Approved :: BSD License Classifier: Operating System :: OS Independent Classifier: Programming Language :: C Classifier: Programming Language :: Cython Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Programming Language :: Python :: 3.12 Classifier: Programming Language :: Python :: 3.13 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Topic :: Scientific/Engineering :: Astronomy Requires-Python: >=3.9 Description-Content-Type: text/x-rst License-File: LICENSE.rst Requires-Dist: numpy>=1.21 Requires-Dist: scipy!=1.10,>=1.5 Requires-Dist: astropy>=5.0 Requires-Dist: regions>=0.5.0 Requires-Dist: pyyaml>=5.3 Requires-Dist: click>=7.0 Requires-Dist: pydantic>=2.5.0 Requires-Dist: iminuit>=2.8.0 Requires-Dist: matplotlib>=3.4 Provides-Extra: all Requires-Dist: naima; extra == "all" Requires-Dist: sherpa; platform_system != "Windows" and extra == "all" Requires-Dist: healpy; platform_system != "Windows" and extra == "all" Requires-Dist: requests; extra == "all" Requires-Dist: tqdm; extra == "all" Requires-Dist: ipywidgets; extra == "all" Requires-Dist: ray[default]>=2.9; extra == "all" Provides-Extra: all-no-ray Requires-Dist: naima; extra == "all-no-ray" Requires-Dist: sherpa; platform_system != "Windows" and extra == "all-no-ray" Requires-Dist: healpy; platform_system != "Windows" and extra == "all-no-ray" Requires-Dist: requests; extra == "all-no-ray" Requires-Dist: tqdm; extra == "all-no-ray" Requires-Dist: ipywidgets; extra == "all-no-ray" Provides-Extra: cov Requires-Dist: naima; extra == "cov" Requires-Dist: sherpa; platform_system != "Windows" and extra == "cov" Requires-Dist: healpy; platform_system != "Windows" and extra == "cov" Requires-Dist: requests; extra == "cov" Requires-Dist: tqdm; extra == "cov" Requires-Dist: ipywidgets; extra == "cov" Provides-Extra: test Requires-Dist: pytest-astropy; extra == "test" Requires-Dist: pytest-xdist; extra == "test" Requires-Dist: pytest; extra == "test" Requires-Dist: docutils; extra == "test" Requires-Dist: sphinx; extra == "test" Provides-Extra: docs Requires-Dist: astropy<5.3,>=5.0; extra == "docs" Requires-Dist: numpy<2.0; extra == "docs" Requires-Dist: sherpa<4.17; extra == 
"docs" Requires-Dist: sphinx-astropy; extra == "docs" Requires-Dist: sphinx; extra == "docs" Requires-Dist: sphinx-click; extra == "docs" Requires-Dist: sphinx-gallery; extra == "docs" Requires-Dist: sphinx-design; extra == "docs" Requires-Dist: sphinx-copybutton; extra == "docs" Requires-Dist: pydata-sphinx-theme; extra == "docs" Requires-Dist: nbformat; extra == "docs" Requires-Dist: docutils; extra == "docs" * Webpage: https://gammapy.org * Documentation: https://docs.gammapy.org * Code: https://github.com/gammapy/gammapy Gammapy is an open-source Python package for gamma-ray astronomy built on Numpy, Scipy and Astropy. It is the base library for the science analysis tools of the Cherenkov Telescope Array Observatory and can also be used to analyse data from existing gamma-ray telescopes. ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615309.0 gammapy-1.3/gammapy.egg-info/SOURCES.txt0000644000175100001770000006140414721316215017435 0ustar00runnerdocker.codacy.yml .gitignore .mailmap .pre-commit-config.yaml CITATION CITATION.cff CODE_OF_CONDUCT.md CONTRIBUTING.md LICENSE.rst LONG_DESCRIPTION.rst MANIFEST.in Makefile README.rst codemeta.json environment-dev.yml pyproject.toml setup.cfg setup.py tox.ini .github/PULL_REQUEST_TEMPLATE.md .github/dependabot.yml .github/ISSUE_TEMPLATE/bug_report.md .github/ISSUE_TEMPLATE/feature_request.md .github/workflows/cffconvert.yml .github/workflows/ci.yml .github/workflows/dispatch.yml .github/workflows/dispatch_benchmark.yml .github/workflows/greetings.yml .github/workflows/release.yml dev/README.rst dev/authors.py dev/codemeta.py dev/github_summary.py dev/ipynb_to_gallery.py dev/prepare-release.py dev/codespell/exclude-file.txt dev/codespell/ignore-words.txt docs/Makefile docs/conf.py docs/index.rst docs/make.bat docs/nitpick-exceptions docs/references.txt docs/serve.py docs/sphinxext.py docs/_static/1d-analysis-image.png docs/_static/2d-analysis-image.png docs/_static/3d-analysis-image.png docs/_static/DC1_3d.png docs/_static/atom.png docs/_static/box.png docs/_static/custom.css docs/_static/data-flow-gammapy.png docs/_static/galleryicon.png docs/_static/gammapy_banner.png docs/_static/gammapy_logo.ico docs/_static/gammapy_logo.png docs/_static/gammapy_logo_nav.png docs/_static/gammapy_maps.png docs/_static/gears.png docs/_static/glossaryicon.png docs/_static/hgps_map_background_estimation.png docs/_static/hgps_spectrum_background_estimation.png docs/_static/index_api.svg docs/_static/index_contribute.svg docs/_static/index_getting_started.svg docs/_static/index_user_guide.svg docs/_static/install.png docs/_static/matomo.js docs/_static/upgrade.png docs/_static/using.png docs/_templates/custom-footer.html docs/_templates/autosummary/base.rst docs/_templates/autosummary/class.rst docs/_templates/autosummary/module.rst docs/api-reference/analysis.rst docs/api-reference/astro.rst docs/api-reference/catalog.rst docs/api-reference/data.rst docs/api-reference/datasets.rst docs/api-reference/estimators.rst docs/api-reference/index.rst docs/api-reference/irf.rst docs/api-reference/makers.rst docs/api-reference/maps.rst docs/api-reference/modeling.rst docs/api-reference/scripts.rst docs/api-reference/stats.rst docs/api-reference/utils.rst docs/api-reference/visualization.rst docs/binder/requirements.txt docs/binder/runtime.txt docs/development/dependencies.rst docs/development/dev_howto.rst docs/development/doc_howto.rst docs/development/index.rst docs/development/intro.rst docs/development/release.rst 
docs/development/setup.rst docs/development/pigs/index.rst docs/development/pigs/pig-001.rst docs/development/pigs/pig-002.rst docs/development/pigs/pig-003.rst docs/development/pigs/pig-004.rst docs/development/pigs/pig-005.rst docs/development/pigs/pig-006-class-diagram.png docs/development/pigs/pig-006.rst docs/development/pigs/pig-007.rst docs/development/pigs/pig-008.rst docs/development/pigs/pig-009.rst docs/development/pigs/pig-010.rst docs/development/pigs/pig-011.rst docs/development/pigs/pig-012.rst docs/development/pigs/pig-013.rst docs/development/pigs/pig-014.rst docs/development/pigs/pig-016-gammapy-package-organisation-proposal.png docs/development/pigs/pig-016-gammapy-package-organisation-status.png docs/development/pigs/pig-016.rst docs/development/pigs/pig-018.rst docs/development/pigs/pig-019.rst docs/development/pigs/pig-020.rst docs/development/pigs/pig-021.rst docs/development/pigs/pig-022.rst docs/development/pigs/pig-023.rst docs/development/pigs/pig-024.rst docs/development/pigs/pig-025.rst docs/development/pigs/pig-026.rst docs/getting-started/environments.rst docs/getting-started/index.rst docs/getting-started/install.rst docs/getting-started/quickstart.rst docs/getting-started/troubleshooting.rst docs/getting-started/usage.rst docs/release-notes/index.rst docs/release-notes/v0.1.rst docs/release-notes/v0.10.rst docs/release-notes/v0.11.rst docs/release-notes/v0.12.rst docs/release-notes/v0.13.rst docs/release-notes/v0.14.rst docs/release-notes/v0.15.rst docs/release-notes/v0.16.rst docs/release-notes/v0.17.rst docs/release-notes/v0.18.1.rst docs/release-notes/v0.18.2.rst docs/release-notes/v0.18.rst docs/release-notes/v0.19.rst docs/release-notes/v0.2.rst docs/release-notes/v0.20.1.rst docs/release-notes/v0.20.rst docs/release-notes/v0.3.rst docs/release-notes/v0.4.rst docs/release-notes/v0.5.rst docs/release-notes/v0.6.rst docs/release-notes/v0.7.rst docs/release-notes/v0.8.rst docs/release-notes/v0.9.rst docs/release-notes/v1.0.1.rst docs/release-notes/v1.0.2.rst docs/release-notes/v1.0.rst docs/release-notes/v1.1.rst docs/release-notes/v1.2.rst docs/release-notes/v1.3.rst docs/release-notes/v2.0.rst docs/user-guide/catalog.rst docs/user-guide/dl3.rst docs/user-guide/estimators.rst docs/user-guide/hli.rst docs/user-guide/howto.rst docs/user-guide/index.rst docs/user-guide/modeling.rst docs/user-guide/package.rst docs/user-guide/references.rst docs/user-guide/utils.rst docs/user-guide/astro/index.rst docs/user-guide/astro/darkmatter/index.rst docs/user-guide/astro/population/index.rst docs/user-guide/astro/population/plot_radial_distributions.py docs/user-guide/astro/population/plot_spiral_arm_models.py docs/user-guide/astro/population/plot_spiral_arms.py docs/user-guide/astro/population/plot_velocity_distributions.py docs/user-guide/astro/source/index.rst docs/user-guide/astro/source/plot_pulsar_spindown.py docs/user-guide/astro/source/plot_pwn_evolution.py docs/user-guide/astro/source/plot_snr_brightness_evolution.py docs/user-guide/astro/source/plot_snr_radius_evolution.py docs/user-guide/astro/source/pulsar.rst docs/user-guide/astro/source/pwn.rst docs/user-guide/astro/source/snr.rst docs/user-guide/datasets/index.rst docs/user-guide/datasets/plot_stack.py docs/user-guide/irf/aeff.rst docs/user-guide/irf/bkg.rst docs/user-guide/irf/edisp.rst docs/user-guide/irf/index.rst docs/user-guide/irf/plot_aeff.py docs/user-guide/irf/plot_aeff_param.py docs/user-guide/irf/plot_bkg_3d.py docs/user-guide/irf/plot_edisp.py docs/user-guide/irf/plot_edisp_kernel.py 
docs/user-guide/irf/plot_edisp_kernel_param.py docs/user-guide/irf/plot_fermi_psf.py docs/user-guide/irf/plot_psf.py docs/user-guide/irf/psf.rst docs/user-guide/makers/create_region.py docs/user-guide/makers/fov.rst docs/user-guide/makers/index.rst docs/user-guide/makers/make_rectangular_reflected_background.py docs/user-guide/makers/make_reflected_regions.py docs/user-guide/makers/reflected.rst docs/user-guide/makers/ring.rst docs/user-guide/maps/hpxmap.rst docs/user-guide/maps/index.rst docs/user-guide/maps/regionmap.rst docs/user-guide/scripts/index.rst docs/user-guide/scripts/significance.py docs/user-guide/stats/fit_statistics.rst docs/user-guide/stats/index.rst docs/user-guide/stats/plot_cash_errors.py docs/user-guide/stats/plot_cash_significance.py docs/user-guide/stats/plot_wstat_errors.py docs/user-guide/stats/plot_wstat_significance.py docs/user-guide/stats/wstat_derivation.rst docs/user-guide/visualization/colormap_example.py docs/user-guide/visualization/index.rst examples/README.rst examples/models/README.txt examples/models/spatial/README.rst examples/models/spatial/plot_constant.py examples/models/spatial/plot_disk.py examples/models/spatial/plot_gauss.py examples/models/spatial/plot_gen_gauss.py examples/models/spatial/plot_piecewise_norm_spatial.py examples/models/spatial/plot_point.py examples/models/spatial/plot_shell.py examples/models/spatial/plot_shell2.py examples/models/spatial/plot_template.py examples/models/spectral/README.rst examples/models/spectral/plot_absorbed.py examples/models/spectral/plot_broken_powerlaw.py examples/models/spectral/plot_compound.py examples/models/spectral/plot_constant_spectral.py examples/models/spectral/plot_exp_cutoff_powerlaw.py examples/models/spectral/plot_exp_cutoff_powerlaw_3fgl.py examples/models/spectral/plot_exp_cutoff_powerlaw_norm_spectral.py examples/models/spectral/plot_gauss_spectral.py examples/models/spectral/plot_logparabola.py examples/models/spectral/plot_logparabola_norm_spectral.py examples/models/spectral/plot_naima.py examples/models/spectral/plot_piecewise_norm_spectral.py examples/models/spectral/plot_powerlaw.py examples/models/spectral/plot_powerlaw2.py examples/models/spectral/plot_powerlaw_norm_spectral.py examples/models/spectral/plot_smooth_broken_powerlaw.py examples/models/spectral/plot_super_exp_cutoff_powerlaw_3fgl.py examples/models/spectral/plot_super_exp_cutoff_powerlaw_4fgl.py examples/models/spectral/plot_super_exp_cutoff_powerlaw_4fgl_dr1.py examples/models/spectral/plot_template_spectral.py examples/models/temporal/README.rst examples/models/temporal/plot_constant_temporal.py examples/models/temporal/plot_expdecay_temporal.py examples/models/temporal/plot_gaussian_temporal.py examples/models/temporal/plot_generalized_gaussian_temporal.py examples/models/temporal/plot_linear_temporal.py examples/models/temporal/plot_powerlaw_temporal.py examples/models/temporal/plot_sine_temporal.py examples/models/temporal/plot_template_phase_temporal.py examples/models/temporal/plot_template_temporal.py examples/tutorials/README.rst examples/tutorials/analysis-1d/README.rst examples/tutorials/analysis-1d/cta_sensitivity.py examples/tutorials/analysis-1d/ebl.py examples/tutorials/analysis-1d/extended_source_spectral_analysis.py examples/tutorials/analysis-1d/sed_fitting.py examples/tutorials/analysis-1d/spectral_analysis.py examples/tutorials/analysis-1d/spectral_analysis_hli.py examples/tutorials/analysis-1d/spectral_analysis_rad_max.py examples/tutorials/analysis-1d/spectrum_simulation.py 
examples/tutorials/analysis-2d/README.rst examples/tutorials/analysis-2d/detect.py examples/tutorials/analysis-2d/modeling_2D.py examples/tutorials/analysis-2d/ring_background.py examples/tutorials/analysis-3d/README.rst examples/tutorials/analysis-3d/analysis_3d.py examples/tutorials/analysis-3d/analysis_mwl.py examples/tutorials/analysis-3d/cta_data_analysis.py examples/tutorials/analysis-3d/energy_dependent_estimation.py examples/tutorials/analysis-3d/event_sampling.py examples/tutorials/analysis-3d/event_sampling_nrg_depend_models.py examples/tutorials/analysis-3d/flux_profiles.py examples/tutorials/analysis-3d/simulate_3d.py examples/tutorials/analysis-time/README.rst examples/tutorials/analysis-time/light_curve.py examples/tutorials/analysis-time/light_curve_flare.py examples/tutorials/analysis-time/light_curve_simulation.py examples/tutorials/analysis-time/pulsar_analysis.py examples/tutorials/analysis-time/time_resolved_spectroscopy.py examples/tutorials/analysis-time/variability_estimation.py examples/tutorials/api/README.rst examples/tutorials/api/astro_dark_matter.py examples/tutorials/api/catalog.py examples/tutorials/api/datasets.py examples/tutorials/api/estimators.py examples/tutorials/api/fitting.py examples/tutorials/api/irfs.py examples/tutorials/api/makers.py examples/tutorials/api/maps.py examples/tutorials/api/mask_maps.py examples/tutorials/api/model_management.py examples/tutorials/api/models.py examples/tutorials/api/observation_clustering.py examples/tutorials/api/priors.py examples/tutorials/data/README.rst examples/tutorials/data/cta.py examples/tutorials/data/fermi_lat.py examples/tutorials/data/hawc.py examples/tutorials/data/hess.py examples/tutorials/scripts/README.rst examples/tutorials/scripts/survey_map.py examples/tutorials/scripts/survey_map.rst examples/tutorials/starting/README.rst examples/tutorials/starting/analysis_1.py examples/tutorials/starting/analysis_2.py examples/tutorials/starting/overview.py gammapy/__init__.py gammapy/__main__.py gammapy/_astropy_init.py gammapy/conftest.py gammapy/version.py gammapy.egg-info/PKG-INFO gammapy.egg-info/SOURCES.txt gammapy.egg-info/dependency_links.txt gammapy.egg-info/entry_points.txt gammapy.egg-info/not-zip-safe gammapy.egg-info/requires.txt gammapy.egg-info/top_level.txt gammapy/analysis/__init__.py gammapy/analysis/config.py gammapy/analysis/core.py gammapy/analysis/config/docs.yaml gammapy/analysis/config/example-1d.yaml gammapy/analysis/config/example-3d.yaml gammapy/analysis/config/model-1d.yaml gammapy/analysis/config/model.yaml gammapy/analysis/tests/__init__.py gammapy/analysis/tests/test_analysis.py gammapy/analysis/tests/test_config.py gammapy/astro/__init__.py gammapy/astro/darkmatter/__init__.py gammapy/astro/darkmatter/profiles.py gammapy/astro/darkmatter/spectra.py gammapy/astro/darkmatter/utils.py gammapy/astro/darkmatter/tests/__init__.py gammapy/astro/darkmatter/tests/test_profiles.py gammapy/astro/darkmatter/tests/test_spectra.py gammapy/astro/darkmatter/tests/test_utils.py gammapy/astro/population/__init__.py gammapy/astro/population/simulate.py gammapy/astro/population/spatial.py gammapy/astro/population/velocity.py gammapy/astro/population/tests/__init__.py gammapy/astro/population/tests/test_simulate.py gammapy/astro/population/tests/test_spatial.py gammapy/astro/population/tests/test_velocity.py gammapy/astro/source/__init__.py gammapy/astro/source/pulsar.py gammapy/astro/source/pwn.py gammapy/astro/source/snr.py gammapy/astro/source/tests/__init__.py 
gammapy/astro/source/tests/test_pulsar.py gammapy/astro/source/tests/test_pwn.py gammapy/astro/source/tests/test_snr.py gammapy/catalog/__init__.py gammapy/catalog/core.py gammapy/catalog/fermi.py gammapy/catalog/gammacat.py gammapy/catalog/hawc.py gammapy/catalog/hess.py gammapy/catalog/lhaaso.py gammapy/catalog/tests/__init__.py gammapy/catalog/tests/test_core.py gammapy/catalog/tests/test_fermi.py gammapy/catalog/tests/test_gammacat.py gammapy/catalog/tests/test_hawc.py gammapy/catalog/tests/test_hess.py gammapy/catalog/tests/test_lhaaso.py gammapy/catalog/tests/data/2fhl_j0822.6-4250e.txt gammapy/catalog/tests/data/2fhl_j1445.1-0329.txt gammapy/catalog/tests/data/2hwc_j0534+220.txt gammapy/catalog/tests/data/2hwc_j0631+169.txt gammapy/catalog/tests/data/2pc_j0034-0534.txt gammapy/catalog/tests/data/2pc_j1112-6103.txt gammapy/catalog/tests/data/3fgl_J0000.1+6545.txt gammapy/catalog/tests/data/3fgl_J0001.4+2120.txt gammapy/catalog/tests/data/3fgl_J0023.4+0923.txt gammapy/catalog/tests/data/3fgl_J0835.3-4510.txt gammapy/catalog/tests/data/3fhl_j2301.9+5855e.txt gammapy/catalog/tests/data/3pc_j0834-4159.txt gammapy/catalog/tests/data/3pc_j0835-4510.txt gammapy/catalog/tests/data/3pc_j0940-5428.txt gammapy/catalog/tests/data/4fgl-dr4_J0534.5+2200.txt gammapy/catalog/tests/data/4fgl_J0000.3-7355.txt gammapy/catalog/tests/data/4fgl_J0001.5+2113.txt gammapy/catalog/tests/data/4fgl_J0002.8+6217.txt gammapy/catalog/tests/data/4fgl_J1409.1-6121e.txt gammapy/catalog/tests/data/gammacat_hess_j1813-178.txt gammapy/catalog/tests/data/gammacat_hess_j1848-018.txt gammapy/catalog/tests/data/gammacat_vela_x.txt gammapy/catalog/tests/data/hess_j1713-397.txt gammapy/catalog/tests/data/hess_j1825-137.txt gammapy/catalog/tests/data/hess_j1930+188.txt gammapy/catalog/tests/data/make.py gammapy/data/__init__.py gammapy/data/data_store.py gammapy/data/event_list.py gammapy/data/filters.py gammapy/data/gti.py gammapy/data/hdu_index_table.py gammapy/data/ivoa.py gammapy/data/metadata.py gammapy/data/obs_table.py gammapy/data/observations.py gammapy/data/pointing.py gammapy/data/simulate.py gammapy/data/utils.py gammapy/data/tests/__init__.py gammapy/data/tests/test_data_store.py gammapy/data/tests/test_event_list.py gammapy/data/tests/test_filters.py gammapy/data/tests/test_gti.py gammapy/data/tests/test_hdu_index_table.py gammapy/data/tests/test_ivoa.py gammapy/data/tests/test_metadata.py gammapy/data/tests/test_obs_table.py gammapy/data/tests/test_observations.py gammapy/data/tests/test_observers.py gammapy/data/tests/test_pointing.py gammapy/datasets/__init__.py gammapy/datasets/actors.py gammapy/datasets/core.py gammapy/datasets/evaluator.py gammapy/datasets/flux_points.py gammapy/datasets/io.py gammapy/datasets/map.py gammapy/datasets/metadata.py gammapy/datasets/simulate.py gammapy/datasets/spectrum.py gammapy/datasets/utils.py gammapy/datasets/tests/__init__.py gammapy/datasets/tests/test_datasets.py gammapy/datasets/tests/test_evaluator.py gammapy/datasets/tests/test_flux_points.py gammapy/datasets/tests/test_io.py gammapy/datasets/tests/test_map.py gammapy/datasets/tests/test_metadata.py gammapy/datasets/tests/test_simulate.py gammapy/datasets/tests/test_spectrum.py gammapy/datasets/tests/test_utils.py gammapy/estimators/__init__.py gammapy/estimators/core.py gammapy/estimators/energydependentmorphology.py gammapy/estimators/flux.py gammapy/estimators/metadata.py gammapy/estimators/parameter.py gammapy/estimators/profile.py gammapy/estimators/utils.py gammapy/estimators/map/__init__.py 
gammapy/estimators/map/asmooth.py gammapy/estimators/map/core.py gammapy/estimators/map/excess.py gammapy/estimators/map/ts.py gammapy/estimators/map/tests/__init__.py gammapy/estimators/map/tests/test_asmooth.py gammapy/estimators/map/tests/test_core.py gammapy/estimators/map/tests/test_excess.py gammapy/estimators/map/tests/test_ts.py gammapy/estimators/points/__init__.py gammapy/estimators/points/core.py gammapy/estimators/points/lightcurve.py gammapy/estimators/points/profile.py gammapy/estimators/points/sed.py gammapy/estimators/points/sensitivity.py gammapy/estimators/points/tests/__init__.py gammapy/estimators/points/tests/test_core.py gammapy/estimators/points/tests/test_lightcurve.py gammapy/estimators/points/tests/test_profile.py gammapy/estimators/points/tests/test_sed.py gammapy/estimators/points/tests/test_sensitivity.py gammapy/estimators/tests/__init__.py gammapy/estimators/tests/test_core.py gammapy/estimators/tests/test_energydependentmorphology.py gammapy/estimators/tests/test_flux.py gammapy/estimators/tests/test_metadata.py gammapy/estimators/tests/test_parameter_estimator.py gammapy/estimators/tests/test_profile.py gammapy/estimators/tests/test_utils.py gammapy/extern/__init__.py gammapy/extern/xmltodict.py gammapy/irf/__init__.py gammapy/irf/background.py gammapy/irf/core.py gammapy/irf/effective_area.py gammapy/irf/io.py gammapy/irf/rad_max.py gammapy/irf/edisp/__init__.py gammapy/irf/edisp/core.py gammapy/irf/edisp/kernel.py gammapy/irf/edisp/map.py gammapy/irf/edisp/tests/__init__.py gammapy/irf/edisp/tests/test_core.py gammapy/irf/edisp/tests/test_kernel.py gammapy/irf/edisp/tests/test_map.py gammapy/irf/psf/__init__.py gammapy/irf/psf/core.py gammapy/irf/psf/kernel.py gammapy/irf/psf/map.py gammapy/irf/psf/parametric.py gammapy/irf/psf/table.py gammapy/irf/psf/tests/__init__.py gammapy/irf/psf/tests/test_kernel.py gammapy/irf/psf/tests/test_map.py gammapy/irf/psf/tests/test_parametric.py gammapy/irf/psf/tests/test_table.py gammapy/irf/psf/tests/data/psf_info.txt gammapy/irf/tests/__init__.py gammapy/irf/tests/test_background.py gammapy/irf/tests/test_core.py gammapy/irf/tests/test_effective_area.py gammapy/irf/tests/test_gadf.py gammapy/irf/tests/test_io.py gammapy/irf/tests/test_rad_max.py gammapy/makers/__init__.py gammapy/makers/core.py gammapy/makers/map.py gammapy/makers/reduce.py gammapy/makers/safe.py gammapy/makers/spectrum.py gammapy/makers/utils.py gammapy/makers/background/__init__.py gammapy/makers/background/fov.py gammapy/makers/background/phase.py gammapy/makers/background/reflected.py gammapy/makers/background/ring.py gammapy/makers/background/tests/__init__.py gammapy/makers/background/tests/test_fov.py gammapy/makers/background/tests/test_phase.py gammapy/makers/background/tests/test_reflected.py gammapy/makers/background/tests/test_ring.py gammapy/makers/tests/__init__.py gammapy/makers/tests/test_asymm_irf.py gammapy/makers/tests/test_core.py gammapy/makers/tests/test_map.py gammapy/makers/tests/test_reduce.py gammapy/makers/tests/test_safe.py gammapy/makers/tests/test_spectrum.py gammapy/makers/tests/test_utils.py gammapy/maps/__init__.py gammapy/maps/axes.py gammapy/maps/coord.py gammapy/maps/core.py gammapy/maps/geom.py gammapy/maps/io.py gammapy/maps/maps.py gammapy/maps/measure.py gammapy/maps/utils.py gammapy/maps/hpx/__init__.py gammapy/maps/hpx/core.py gammapy/maps/hpx/geom.py gammapy/maps/hpx/io.py gammapy/maps/hpx/ndmap.py gammapy/maps/hpx/utils.py gammapy/maps/hpx/tests/__init__.py gammapy/maps/hpx/tests/test_geom.py 
gammapy/maps/hpx/tests/test_ndmap.py gammapy/maps/region/__init__.py gammapy/maps/region/geom.py gammapy/maps/region/ndmap.py gammapy/maps/region/tests/__init__.py gammapy/maps/region/tests/test_geom.py gammapy/maps/region/tests/test_ndmap.py gammapy/maps/tests/__init__.py gammapy/maps/tests/test_axes.py gammapy/maps/tests/test_coord.py gammapy/maps/tests/test_core.py gammapy/maps/tests/test_counts.py gammapy/maps/tests/test_maps.py gammapy/maps/tests/test_measure.py gammapy/maps/wcs/__init__.py gammapy/maps/wcs/core.py gammapy/maps/wcs/geom.py gammapy/maps/wcs/io.py gammapy/maps/wcs/ndmap.py gammapy/maps/wcs/tests/__init__.py gammapy/maps/wcs/tests/test_geom.py gammapy/maps/wcs/tests/test_ndmap.py gammapy/modeling/__init__.py gammapy/modeling/covariance.py gammapy/modeling/fit.py gammapy/modeling/iminuit.py gammapy/modeling/likelihood.py gammapy/modeling/parameter.py gammapy/modeling/scipy.py gammapy/modeling/selection.py gammapy/modeling/sherpa.py gammapy/modeling/models/__init__.py gammapy/modeling/models/core.py gammapy/modeling/models/cube.py gammapy/modeling/models/prior.py gammapy/modeling/models/spatial.py gammapy/modeling/models/spectral.py gammapy/modeling/models/spectral_cosmic_ray.py gammapy/modeling/models/spectral_crab.py gammapy/modeling/models/temporal.py gammapy/modeling/models/utils.py gammapy/modeling/models/tests/__init__.py gammapy/modeling/models/tests/test_core.py gammapy/modeling/models/tests/test_cube.py gammapy/modeling/models/tests/test_io.py gammapy/modeling/models/tests/test_management.py gammapy/modeling/models/tests/test_prior.py gammapy/modeling/models/tests/test_spatial.py gammapy/modeling/models/tests/test_spectral.py gammapy/modeling/models/tests/test_spectral_cosmic_ray.py gammapy/modeling/models/tests/test_spectral_crab.py gammapy/modeling/models/tests/test_temporal.py gammapy/modeling/models/tests/test_utils.py gammapy/modeling/models/tests/data/example2.yaml gammapy/modeling/models/tests/data/examples.yaml gammapy/modeling/models/tests/data/make.py gammapy/modeling/tests/__init__.py gammapy/modeling/tests/test_covariance.py gammapy/modeling/tests/test_fit.py gammapy/modeling/tests/test_iminuit.py gammapy/modeling/tests/test_parameter.py gammapy/modeling/tests/test_scipy.py gammapy/modeling/tests/test_selection.py gammapy/modeling/tests/test_sherpa.py gammapy/scripts/__init__.py gammapy/scripts/analysis.py gammapy/scripts/check.py gammapy/scripts/download.py gammapy/scripts/info.py gammapy/scripts/main.py gammapy/scripts/tests/__init__.py gammapy/scripts/tests/test_analysis.py gammapy/scripts/tests/test_download.py gammapy/scripts/tests/test_info.py gammapy/scripts/tests/test_main.py gammapy/stats/__init__.py gammapy/stats/counts_statistic.py gammapy/stats/fit_statistics.py gammapy/stats/fit_statistics_cython.c gammapy/stats/fit_statistics_cython.pyx gammapy/stats/utils.py gammapy/stats/variability.py gammapy/stats/tests/__init__.py gammapy/stats/tests/test_counts_statistic.py gammapy/stats/tests/test_fit_statistics.py gammapy/stats/tests/test_utils.py gammapy/stats/tests/test_variability.py gammapy/tests/__init__.py gammapy/utils/__init__.py gammapy/utils/array.py gammapy/utils/check.py gammapy/utils/cluster.py gammapy/utils/compat.py gammapy/utils/deprecation.py gammapy/utils/docs.py gammapy/utils/fits.py gammapy/utils/gauss.py gammapy/utils/integrate.py gammapy/utils/interpolation.py gammapy/utils/metadata.py gammapy/utils/observers.py gammapy/utils/parallel.py gammapy/utils/pbar.py gammapy/utils/regions.py gammapy/utils/registry.py 
gammapy/utils/roots.py gammapy/utils/scripts.py gammapy/utils/table.py gammapy/utils/testing.py gammapy/utils/time.py gammapy/utils/types.py gammapy/utils/units.py gammapy/utils/coordinates/__init__.py gammapy/utils/coordinates/fov.py gammapy/utils/coordinates/other.py gammapy/utils/coordinates/tests/__init__.py gammapy/utils/coordinates/tests/test_fov.py gammapy/utils/coordinates/tests/test_other.py gammapy/utils/random/__init__.py gammapy/utils/random/inverse_cdf.py gammapy/utils/random/utils.py gammapy/utils/random/tests/__init__.py gammapy/utils/random/tests/test_inverse_cdf.py gammapy/utils/random/tests/test_random.py gammapy/utils/tests/__init__.py gammapy/utils/tests/test_array.py gammapy/utils/tests/test_check.py gammapy/utils/tests/test_deprecation.py gammapy/utils/tests/test_fits.py gammapy/utils/tests/test_gauss.py gammapy/utils/tests/test_integrate.py gammapy/utils/tests/test_interpolation.py gammapy/utils/tests/test_metadata.py gammapy/utils/tests/test_parallel.py gammapy/utils/tests/test_regions.py gammapy/utils/tests/test_roots.py gammapy/utils/tests/test_scripts.py gammapy/utils/tests/test_table.py gammapy/utils/tests/test_time.py gammapy/utils/tests/test_units.py gammapy/visualization/__init__.py gammapy/visualization/cmap.py gammapy/visualization/datasets.py gammapy/visualization/heatmap.py gammapy/visualization/panel.py gammapy/visualization/utils.py gammapy/visualization/tests/__init__.py gammapy/visualization/tests/test_cmap.py gammapy/visualization/tests/test_datasets.py gammapy/visualization/tests/test_panel.py gammapy/visualization/tests/test_utils.py
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615308.0
gammapy-1.3/gammapy.egg-info/dependency_links.txt0000644000175100001770000000000114721316214021611 0ustar00runnerdocker
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615308.0
gammapy-1.3/gammapy.egg-info/entry_points.txt0000644000175100001770000000006514721316214021042 0ustar00runnerdocker
[console_scripts]
gammapy = gammapy.scripts.main:cli
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615308.0
gammapy-1.3/gammapy.egg-info/not-zip-safe0000644000175100001770000000000114721316214017771 0ustar00runnerdocker
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615308.0
gammapy-1.3/gammapy.egg-info/requires.txt0000644000175100001770000000121314721316214020140 0ustar00runnerdocker
numpy>=1.21
scipy!=1.10,>=1.5
astropy>=5.0
regions>=0.5.0
pyyaml>=5.3
click>=7.0
pydantic>=2.5.0
iminuit>=2.8.0
matplotlib>=3.4

[all]
naima
requests
tqdm
ipywidgets
ray[default]>=2.9

[all:platform_system != "Windows"]
sherpa
healpy

[all_no_ray]
naima
requests
tqdm
ipywidgets

[all_no_ray:platform_system != "Windows"]
sherpa
healpy

[cov]
naima
requests
tqdm
ipywidgets

[cov:platform_system != "Windows"]
sherpa
healpy

[docs]
astropy<5.3,>=5.0
numpy<2.0
sherpa<4.17
sphinx-astropy
sphinx
sphinx-click
sphinx-gallery
sphinx-design
sphinx-copybutton
pydata-sphinx-theme
nbformat
docutils

[test]
pytest-astropy
pytest-xdist
pytest
docutils
sphinx
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615308.0
gammapy-1.3/gammapy.egg-info/top_level.txt0000644000175100001770000000001014721316214020264 0ustar00runnerdocker
gammapy
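Note: the requires.txt sections above encode pip extras via environment markers (e.g. the [all:platform_system != "Windows"] block gates sherpa and healpy off Windows). A minimal sketch of inspecting the resolved requirement strings at runtime with the standard library, assuming a gammapy install is present in the current environment:

# Sketch: list gammapy's core and extra requirements from installed metadata.
# Assumes gammapy is installed; otherwise requires() raises PackageNotFoundError.
from importlib.metadata import requires

for req in requires("gammapy"):
    # Extra-gated entries carry a marker, e.g.
    # 'sherpa; platform_system != "Windows" and extra == "all"'
    print(req)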
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/pyproject.toml0000644000175100001770000000253614721316200015333 0ustar00runnerdocker
[build-system]
requires = [
    "setuptools >= 65",
    "Cython",
    "numpy >= 2.0.0rc1",
    "setuptools_scm[toml] >= 8",
]
build-backend = "setuptools.build_meta"

[tool.pytest.ini_options]
filterwarnings = [
    "error::astropy.utils.exceptions.AstropyDeprecationWarning",
    "error::gammapy.utils.deprecation.GammapyDeprecationWarning",
    "error::matplotlib.MatplotlibDeprecationWarning",
]

[tool.setuptools_scm]
version_file = "gammapy/version.py"
version_file_template = """
# Note that we need to fall back to the hard-coded version if either
# setuptools_scm can't be imported or setuptools_scm can't determine the
# version, so we catch the generic 'Exception'.
try:
    from setuptools_scm import get_version
    version = get_version(root='..', relative_to=__file__)
except Exception:
    version = '{version}'
"""

[tool.ruff]
exclude = ["docs", "dev"]
# Like black
line-length = 88
indent-width = 4

[tool.ruff.lint]
ignore = ["E741"]
extend-ignore = ["E203", "E712"]

[tool.ruff.lint.per-file-ignores]
"examples/*" = ["E402"]

[tool.ruff.format]
# Like Black, use double quotes for strings.
quote-style = "double"
# Like Black, indent with spaces, rather than tabs.
indent-style = "space"
# Like Black, respect magic trailing commas.
skip-magic-trailing-comma = false
# Like Black, automatically detect the appropriate line ending.
line-ending = "auto"
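Note: the version_file_template above is rendered into gammapy/version.py at build time, with setuptools_scm substituting the {version} placeholder. A small illustration of that substitution step using a hypothetical "1.3" release string (this mimics the rendering only; it is not setuptools_scm's actual code):

# Illustration only: how the '{version}' placeholder in the template
# becomes a hard-coded fallback in the generated gammapy/version.py.
template = "    version = '{version}'\n"
print(template.format(version="1.3"))  # ->     version = '1.3'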
././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1732615309.2566419
gammapy-1.3/setup.cfg0000644000175100001770000000740214721316215014243 0ustar00runnerdocker
[metadata]
name = gammapy
description = A Python package for gamma-ray astronomy
author = The Gammapy developers
author_email = gammapy-coordination-l@in2p3.fr
license = BSD 3-Clause
license_files = LICENSE.rst
url = https://gammapy.org
url_docs = https://docs.gammapy.org/dev/
long_description = file: LONG_DESCRIPTION.rst
long_description_content_type = text/x-rst
edit_on_github = False
github_project = gammapy/gammapy
url_raw_github = https://raw.githubusercontent.com/gammapy/gammapy/master/
platforms = any
classifiers =
    Intended Audience :: Science/Research
    License :: OSI Approved :: BSD License
    Operating System :: OS Independent
    Programming Language :: C
    Programming Language :: Cython
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.9
    Programming Language :: Python :: 3.10
    Programming Language :: Python :: 3.11
    Programming Language :: Python :: 3.12
    Programming Language :: Python :: 3.13
    Programming Language :: Python :: Implementation :: CPython
    Topic :: Scientific/Engineering :: Astronomy

[options]
zip_safe = False
packages = find:
python_requires = >=3.9
setup_requires = setuptools_scm
install_requires =
    numpy>=1.21
    scipy>=1.5,!=1.10
    astropy>=5.0
    regions>=0.5.0
    pyyaml>=5.3
    click>=7.0
    pydantic>=2.5.0
    iminuit>=2.8.0
    matplotlib>=3.4

[options.entry_points]
console_scripts =
    gammapy = gammapy.scripts.main:cli

[options.extras_require]
all =
    naima
    sherpa; platform_system != "Windows"
    healpy; platform_system != "Windows"
    requests
    tqdm
    ipywidgets
    ray[default]>=2.9
all_no_ray =
    naima
    sherpa; platform_system != "Windows"
    healpy; platform_system != "Windows"
    requests
    tqdm
    ipywidgets
cov =
    naima
    sherpa; platform_system != "Windows"
    healpy; platform_system != "Windows"
    requests
    tqdm
    ipywidgets
test =
    pytest-astropy
    pytest-xdist
    pytest
    docutils
    sphinx
docs =
    astropy>=5.0,<5.3
    numpy<2.0
    sherpa<4.17
    sphinx-astropy
    sphinx
    sphinx-click
    sphinx-gallery
    sphinx-design
    sphinx-copybutton
    pydata-sphinx-theme
    nbformat
    docutils

[options.package_data]
gammapy.analysis = config/*
gammapy.irf.psf.tests = data/*
gammapy.modeling.models.tests = data/*
gammapy.catalog.tests = data/*

[bdist_wheel]
universal = true

[tool:pytest]
minversion = 3.0
norecursedirs = build dev examples docs/_build docs/_static gammapy/extern catalog/tests/data
addopts = -p no:warnings --remote-data=any
remote_data_strict = True
astropy_header = true
text_file_format = rst

[coverage:run]
omit =
    *tests*
    gammapy/extern/*
    gammapy/conftest.py
    gammapy/scripts/jupyter.py
    gammapy/utils/notebooks_links.py
    gammapy/utils/notebooks_process.py
    gammapy/utils/notebooks_test.py
    gammapy/utils/docs.py
    gammapy/_astropy_init*
    gammapy/conftest.py
    gammapy/*setup_package*
    gammapy/tests/*
    gammapy/*/tests/*
    gammapy/extern/*
    gammapy/version*
    */gammapy/_astropy_init*
    */gammapy/conftest.py
    */gammapy/*setup_package*
    */gammapy/tests/*
    */gammapy/*/tests/*
    */gammapy/extern/*
    */gammapy/version*

[coverage:report]
exclude_lines =
    pragma: no cover
    except ImportError
    raise AssertionError
    raise NotImplementedError
    def main\(.*\):
    pragma: py{ignore_python_version}
    def _ipython_key_completions_

[tool:isort]
profile = "black"
sections = STDLIB,PYTEST,NUMPY,ASTROPY,THIRDPARTY,FIRSTPARTY,LOCALFOLDER
no_lines_before = STDLIB,PYTEST,NUMPY,ASTROPY,THIRDPARTY,FIRSTPARTY,LOCALFOLDER
known_pytest = pytest
known_numpy = numpy,scipy
known_astropy = astropy,regions
known_first_party = gammapy
multi_line_output = 3
include_trailing_comma = True
force_grid_wrap = 0
use_parentheses = True
line_length = 88

[codespell]
skip = ./.git
ignore-words = dev/codespell/ignore-words.txt
exclude-file = dev/codespell/exclude-file.txt

[flake8]
ignore = W503,E501
exclude = extern,conftest.py,__init__.py
extend-ignore = E203

[egg_info]
tag_build =
tag_date = 0
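Note: the console_scripts entry point above is what installs the `gammapy` command; the same click group can be driven from Python. A minimal sketch using click's test runner with the `info` subcommand (present as gammapy/scripts/info.py in the SOURCES.txt listing earlier); assumes gammapy and click are installed:

# Sketch: invoke the `gammapy` CLI entry point in-process via click.
from click.testing import CliRunner

from gammapy.scripts.main import cli

runner = CliRunner()
result = runner.invoke(cli, ["info"])  # equivalent to running `gammapy info`
print(result.output)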
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/setup.py0000755000175100001770000000415214721316200014130 0ustar00runnerdocker
#!/usr/bin/env python
# Licensed under a 3-clause BSD style license - see LICENSE.rst
# NOTE: The configuration for the package, including the name, version, and
# other information are set in the setup.cfg file.
import sys

# First provide helpful messages if contributors try and run legacy commands
# for tests or docs.

TEST_HELP = """
Note: running tests is no longer done using 'python setup.py test'. Instead
you will need to run:

    tox -e test

If you don't already have tox installed, you can install it with:

    pip install tox

If you only want to run part of the test suite, you can also use pytest
directly with::

    pip install -e .[test]
    pytest

For more information, see:

  http://docs.astropy.org/en/latest/development/testguide.html#running-tests
"""

if "test" in sys.argv:
    print(TEST_HELP)
    sys.exit(1)

DOCS_HELP = """
Note: building the documentation is no longer done using
'python setup.py build_docs'. Instead you will need to run:

    tox -e build_docs

If you don't already have tox installed, you can install it with:

    pip install tox

You can also build the documentation with Sphinx directly using::

    pip install -e .[docs]
    cd docs
    make html

For more information, see:

  http://docs.astropy.org/en/latest/install.html#builddocs
"""

if "build_docs" in sys.argv or "build_sphinx" in sys.argv:
    print(DOCS_HELP)
    sys.exit(1)

# imports here so that people get the nice error messages above without needing
# build dependencies
import numpy as np  # noqa: E402
from Cython.Build import cythonize  # noqa: E402
from setuptools import Extension, setup  # noqa: E402

kwargs = dict(
    include_dirs=[np.get_include()],
    define_macros=[
        # fixes a warning when compiling
        ("NPY_NO_DEPRECATED_API", "NPY_1_7_API_VERSION"),
        # defines the oldest numpy we want to be compatible with
        ("NPY_TARGET_VERSION", "NPY_1_21_API_VERSION"),
    ],
)

extensions = [
    Extension(
        "gammapy.stats.fit_statistics_cython",
        sources=["gammapy/stats/fit_statistics_cython.pyx"],
        **kwargs,
    ),
]

setup(
    ext_modules=cythonize(extensions),
)
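Note: the single Cython extension built above compiles gammapy/stats/fit_statistics_cython.pyx, which backs the fit statistics exposed by gammapy.stats. A short sketch of the public API that relies on it; `cash` is the Cash statistic function, with the exact signature here quoted from memory and worth double-checking against the docs:

# Sketch: gammapy.stats.cash is backed by the compiled Cython helpers.
import numpy as np

from gammapy.stats import cash

n_on = np.array([10.0, 4.0, 7.0])   # observed counts per bin
mu_on = np.array([9.5, 5.0, 6.0])   # predicted counts per bin
print(cash(n_on, mu_on).sum())      # total Cash statistic over bins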
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/tox.ini0000644000175100001770000000713014721316200013725 0ustar00runnerdocker
[tox]
minversion = 2.0
envlist =
    py{39,310,311,312,313}-test{,-alldeps,-devdeps}{,-cov}
    py{39,310,311,312,313}-test-numpy{121,122,123,124}
    py{39,310,311,312,313}-test-astropy{50,lts}
    build_docs
    linkcheck
    codestyle
    devdeps
requires =
    setuptools >= 30.3.0
    pip >= 19.3.1
isolated_build = true

[testenv]
# Suppress display of matplotlib plots generated during docs build
setenv =
    MPLBACKEND = agg
    devdeps: PIP_EXTRA_INDEX_URL = https://pypi.anaconda.org/scientific-python-nightly-wheels/simple https://pypi.anaconda.org/astropy/simple https://pypi.anaconda.org/liberfa/simple

# Pass through the following environment variables which may be needed for the CI
passenv = HOME WINDIR LC_ALL LC_CTYPE CC CI GAMMAPY_DATA PKG_CONFIG_PATH

# Run the tests in a temporary directory to make sure that we don't import
# this package from the source tree
changedir = .tmp/{envname}

# tox environments are constructed with so-called 'factors' (or terms)
# separated by hyphens, e.g. test-devdeps-cov. Lines below starting with factor:
# will only take effect if that factor is included in the environment name. To
# see a list of example environments that can be run, along with a description,
# run:
#
#     tox -l -v
#
description =
    run tests
    alldeps_noray: with all optional dependencies but ray
    alldeps: with all optional dependencies
    devdeps: with the latest developer version of key dependencies
    oldestdeps: with the oldest supported version of key dependencies
    cov: and test coverage
    numpy121: with numpy 1.21.*
    numpy122: with numpy 1.22.*
    numpy123: with numpy 1.23.*
    numpy124: with numpy 1.24.*
    astropy50: with astropy 5.0.*

# The following provides some specific pinnings for key packages
deps =
    cov: coverage
    numpy121: numpy==1.21.*
    numpy122: numpy==1.22.*
    numpy123: numpy==1.23.*
    numpy124: numpy==1.24.*
    astropy50: astropy==5.0.*

    oldestdeps: numpy==1.21.*
    oldestdeps: matplotlib==3.4.*
    oldestdeps: scipy==1.5.*
    oldestdeps: pyyaml==5.3.*
    oldestdeps: astropy==5.0.*
    oldestdeps: click==7.0.*
    oldestdeps: regions==0.5.*
    oldestdeps: pydantic==1.4.*
    oldestdeps: iminuit==2.8.*

    devdeps: scipy>=0.0.dev0
    devdeps: numpy>=0.0.dev0
    devdeps: matplotlib>=0.0.dev0
    devdeps: astropy>=0.0.dev0
    devdeps: pyerfa>=0.0.dev0
    devdeps: git+https://github.com/scikit-hep/iminuit.git#egg=iminuit

#    build_docs: sphinx-gallery<0.13

# The following indicates which extras_require from setup.cfg will be installed
extras =
    test
    alldeps_noray: all_no_ray
    alldeps: all
    cov: cov

commands =
    # Force numpy reinstall to work around upper version limits dependencies put on numpy
    devdeps: pip install -U --pre --extra-index-url https://pypi.anaconda.org/scientific-python-nightly-wheels/simple numpy

    pip freeze
    !cov: pytest --pyargs gammapy {posargs}
    cov: pytest --doctest-rst --pyargs gammapy {toxinidir}/docs --cov gammapy --cov-config={toxinidir}/setup.cfg {posargs}
    cov: coverage xml -o {toxinidir}/coverage.xml

[testenv:build_docs]
changedir = docs
description = invoke sphinx-build to build the HTML docs
extras = docs, all
commands =
    pip freeze
    sphinx-build -b html . _build/html {posargs}

[testenv:linkcheck]
changedir = docs
description = check the links in the HTML docs
extras = docs, all
commands =
    pip freeze
    sphinx-build -b linkcheck . _build/html

[testenv:codestyle]
skip_install = true
changedir = .
description = check code style, e.g. with flake8
deps = flake8
commands = flake8 gammapy --count --max-line-length=100
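Note: tox composes environments from hyphen-separated factors, so an env name like py310-test-alldeps-cov picks up every cov:- and alldeps:-prefixed line above. A pure-Python sketch of that selection rule (illustrative only; negated factors such as !cov: are omitted, and this is not tox's real implementation):

# Mimics tox factor selection: "cov: coverage" contributes "coverage"
# only when "cov" appears among the env name's hyphen-separated factors.
def select(lines, envname):
    factors = set(envname.split("-"))
    selected = []
    for line in lines:
        prefix, sep, value = line.partition(": ")
        if not sep:
            selected.append(line)   # unconditional line always applies
        elif prefix in factors:
            selected.append(value)  # factor present -> line applies
    return selected

deps = ["cov: coverage", "numpy121: numpy==1.21.*", "astropy50: astropy==5.0.*"]
print(select(deps, "py310-test-alldeps-cov"))  # -> ['coverage']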

[binary image data omitted: the preceding file's tar header was garbled and its PNG contents are not representable as text]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/DC1_3d.png0000644000175100001770000055203714721316200016446 0ustar00runnerdocker
[binary PNG image data omitted: contents of docs/_static/DC1_3d.png]
Kt-l\pqO1x|EbJsj}]ghKr9(4$jk(xE[8|%BGB'lޜOlF9 Ua mt" 0Σ EN'B,eWb=Bb`ɯ*VvyA_RϯS#ŴHM}wm;~e+ Z1ŵJMk]3C)5QlLmrxx'n*3ZD c6m lż_Rccg'ff9m-&eM6Tp41t"r{ OA|Wuh)|p;:8]2G=eu6儓%tiyߒGt_SS `ս)w) Kg)e˸6*Ez $hَ5+8.-WW93_ojiMr~g4naE7Emf`xU3稹 AA$pb1Gk,  Z<6]r:ADR#hAA-H^ [r:ȸzX22lQd!ĝ[fhrP仓\];|Q@pM^ NA0{U%  9NM t"P}(`DfZ6\qf2ԋF^;14˲*gX |;M)s>StBW^ZZp)-O,ʐA!OyO B6ݨO)#=KtRΜW5 U; AA,,e6zZX" PDBiөLLL8v8099#G`rr7o̙38y$FGGJ}}}3g* tddǏٳgy088;wǎyaZjϨՖ@/RgulkKDsN~azzz+.\~xwטފ`#<[1A}ݸ{B'ƍo&:;;q};ܹsq!Z<3=44?jMd*{sm9QF95<j;N s7wWU W^z%,XO>$*YcccxWqEK/E?n[oǏǧ>)߿o&ؾ};~_K_|XE/ '׏NZ>b]GO̭2 i:]Yq+ 6=ؙxLs~###7>?ū6<3qqz Wƺu0{pӦMpA<3{m6lݺK,ދ`ǎxqw}brr2GLYʵܣ\5ӶeY8<ȀN66rME+K[i4$X[s:ENqq߿1vmݻ~ضm֯_vlݺ˱`̟?I ؽ{7|;#G/Ɔ p!غuk׎yg>ypE sQDa֏FЯαb0my=i˖{L\/Z/k&e;~Usw͐C7LOOѣطop뭷/?_{m݆CѣGb ,^/F[[`9r7oƾ}0==͛7cll ֭ mz.Cgggx| qۅ 2/!,oOnЕ ?@08.g۰\891y׆}RTQɖX9Y%cR!ۆ̱ssBN~… )ݫVoߎyСCƹs044vattñcocrr˗/yxw>֭[?>z{{qA;v ]]]Xz5+%Ia#ejéY֜+q2_-lLJlI: 21řp>[rًgs*'L׮?6hj{nT*,X5kwŞ={0w\ٳǎ R OơCpqرa…֭[ߏիW /FOOΞ=SN]v/^\2d{l\L[Lt7cU,+v 7ckfڸ,~2S8…Mi6gZ.k+P9b9y$* сQq{ٳ;ׇ!q{~g?;q]wO‰'pe{-| &P4zZDhqqttt7f(z!;_ox'с/| '>q ?̷gϞř3gh"|K_* % EN& +W׿uVUe˖_2~op Ww i/*A)c \T`ڵx>̙:`ڵWO?4&''s2ܫA)B1өAbzzgϞ ۋ9sH+b g?n f™3g022wbb1+og};w< HVPNkiQ) ׮$9b2W)MBvQklqa>;xm6-_7GƑDuhKe=MfYz Sn(:dy^umT}#N׻ןG}-[ ЇpW[5Q9&3O%y155ߏ~YFz|m?T*<xq7YMli*O6m /}^~eW>)CUSTiLZkrAFqS.V}5EIt I4χdUKczMd]ϖKG7 ^_=.䷡ #eܹѣCgg'8\|J$8tww= pB={{^:cV;66CaϞ=ӧOcn_DlNZ5T>*W\}~@-bjty&[1o6.y<3Ogl\q)CL}j+N B3j'؉'~fF?*!ra,CR… q9;v ޽{Voibb1tttf͚)?~aҥ8~8{=[.088t:(iO Ϭ]Tm"=I|?fNwrAa嘉CF6iS >r%Աsqߊ:]k*gZՐI֊ 9rH_^{ IDATcE.j!_\Qtj\p:;;{nڵ s֭[1>>/\2ʱc0::y@ҥKy.ݻ6mߎm۶attgώ<\y啸+~xg&'MAdßɟ44chkkg c˖-Xj.2Gt͛qpE|xokm099a)q7GԨADo|뮻] tjaÆ ƗeE][o\prFƍ裏ѣ<]]]X~=>O ^z)<Ɲw 6l؀O~eA1S^"舦R;#<pcXh'|O<N8Ӄo~;pWb׮]xGc<|Ć :5 b(.r:5իW k֬Dze˔ꫯ0FGGCgg'-Z::;;qW'tL\˗/e<6lqF];V_Y,v4r=ƑJj_ שB.ͦsjɈSRE@rŀP'9qiö:TDLxޖ(u̢X z >#H7IXkdq:Ix j4K K/Egg'>sŋ3T\{XhCWW/_:| ,W袋dɒN jC*\ko0wEy,amv嘖*QT&`ѱqӬ5a񐒣sv2r]4 '~f]DP[r:5 `ڵǬZ VҦY`,Xz@)f D:.h0Z"ݍ 3VQ\5Gc竺f]NK}DݲKF~3^|ڇ!(TfN*y>SKD144:%\* l2,[,ouWHC)GY,OKI+@{zJ3i:R0e"{eje}c+ !p8tᰚAӉT @.d3X-YxN:׎4JEWflᵈeRˈfx-W 2:!YEQH>6nIgJnb2 W'rD1n`x(`t"%1 gPٗV8[e,Ś)b.YQbbߛ6{~+%Dž-tF?ֆzLp 9-L\vsh\Ӈ1qʂRK!;9KjCMK3#T\)R'>wQ")3}@}N\ ۲6Z*{ribFiDh3Q;dRWaiխ6?"O.dj6 4Iik/ uZd7ˮ?$5ϚWI{[iC\}>&e,GiYhA](N(6a6{wcft8N<ό]mp-ӹ[/ mjfybih t" V&ח0  uFӉPùSBƲxS..-#!/UsСi)铫ֶM˒e*dEB3pSͥ,_iuLAhꜼ OBGHڿ}=Y'вn,6%=h5^G吵uP붘r,gpi.R}:WR5U˚֦.E5TkleL#iJy-߻,3ȍ Uˣ ¯ːꕛZk=)j c"/ʼn3!@v<܏L CKb/tfa<~<]ꋮk lBcvxN;\Bն^D-|dUL%ݎ` qh!&r42g?ۿjbXmU|&w!ǟ,Z;X9ةInM|UHBq@q\QFĽ`'-ШG)/R&CD?gOCcCt" %x5`пBAZP -v9 Ua9gW_o 0ưrJ\r%Xt)b߾}n&̝;J%  2Ust28}={/كI0pEaڵ7oFFFSOȑ#Eᦛn¬Y:w9%.];$d H%5",3NSBbjN"]F5:%d.ֆzƖ!,ciM4\µ1RET7.Ӥ5_Dj0aVTv]4idæniN>>BE%IVgu(M]jLqqYܹ7oƩSӃ;wgѣGֆsΡׯG %g`lZ>j{DjêR QJV揂Ms~] +5ʸi9smP|ڵkc`֬Yڵk144d_?GӉQ:f@]ˉ#Rf j]2=4%Lն~ZV῜˧F4ɬ U*j< R=~)9dNsс۶ {ZcTosSR&xYU}f`RTpu(._:^z%,X?O1w\s=Xd ~C__9Ɉ/yk/~nQLwkKhsԴQƆGY4K=Z=Y:!oFÉ e9s`ll _kajj O?4V\O~gxG044Nϵ" q/c Xd ,X`tww?/V0nFرAADalkk/_ٳgR .yصk6mڄ}CXh-Z|xqرgt" V^g تT*68u9! c߾}<Ò%Kp X֍   jS$YRmmmhk \7Ν m%K`֬YxpB͛bll̾n!q~Jx`)#1 ,㵅AabY;uSO#0EuyϠq]nYː?[y[ ]{ wY`,e. TSdD1x,vϑkCۂ6f61HL)̳>8y$zpvgg'-[%K;q OSO=XlfϞQݍs΅1m&[VAYI8by>ti"$U:aTRDknif\ʎغkF77/aC9&&&_>Ў}>|O>$:;;tRp1>>^GG˗9 "X%_F5ck{w_ph5R< SNQ-S$jBn~$`Sj'3d[iP+I#Hx? 
YP2 1<(nǞ,p@]T"nTӔhaIC%hz>E_ZjrLMOpmappPtoo馛pcjj Jsp4< $&qd꟱ Cn7.띖)K;M,#LӚn;"M~*u0WnQ_Lx_ki6GYn === өSe?>9\yx7QT>xm}GRiNAxZᱣ+Ww],X@]愊>^|E|>lذ]tmۆٳgٳEGG?Yf+AAC :188xW\qE̎?'x~;6l؀%K`ĩS0118s X# sx`*rΙ~4c^ cmE>{wO1w09vdv0e&~Kd(iE)`, p.lH" 0D uxf &?%#E0'iWe.I h'ug Xxeg11CRYۛs=E;%Kyzzz};vŋ1|׿CgA,_`L~,}S4ń*䟰bۮWjM.v$ tS3K9EҥOW2¸cq*rll:{#!aSCɽ0c ݩvsXb>OBCCC+/3x \qQ7wӉPR eēNj;9jP9ɖǍ$EM.-$ {{x}S f:{28 wڜH`.KRrCS{?^Ieu'*?tF \d8gUVDqHGƍqF,Y6m֭[h"̟?wy'^{5<۱~zXкDQOgdN%g1}]JH~,20TEq>؆M6NDu*ӛJ{jѫ;]6u9sѣ/~+Wo[^nr-xq ÇaÆpGN' heLǏcddc||<+z+vڅ-[`rr/Ƶ^e˖Ս  Pw:MMMѣǏǣ> ;;曱eK/7|3fϞm_7Ӊ  Zv_CCR/ƃ>Mggv  ( dkCFvTww7֭[twQ0t"p _'7#Ґ0u0LELc6 9 Z|!]isdWtU^d]]7 RTit{bU@7(j2c: % A' .\T:xbJl%BN' hit" 8(^|NENm#qU6 g Œ(]%>SI)i.]ȗ8ԭpa;e6i]LHʫ6J{DT,Pa8?k.0y_5%Ͻ0h,ls HړK\'cdkk]TNRAQ Y:]ytE`Z۲BIZl$eAnD"'RJh- Uͣ sx#?'҇*({a/ 2Q]]}~|Bdy#STo&_SZ-jLj; Uv,Gh7"rۑJ LeWJϾz'LaN f@C4(3CD31 ^9yg2pO g8'ѶRm3A'_"vZ"[̿W t2NWm86x򔭞uÄhZЧtv x# ߘ4byD].nr-~$',qGU} Vg:5(r:xb:X*$Vc̠Wc?am33"PNW+ϏoWgCbHxNmVMOL^ubW!FfqQuD&CfL=kFX`K*ܪ~r&Î`X&fNKWȳ ;Lc-Ϋ)b4nf̣NQq'Ǹ(P1冲쯐\2gY%- ۩eF7Xq$X ctQ6Ie}9,C$-9wO6oWDt5`0"$.uR(ӯ%_leu@ߞΑxg9|#3%8 FSs fwUC3c3CxThLg{4&Ii%Y+jd툢Cl;:CiAA#x-.SQtR055Q9s48hkkK">OQM'V .,chi9mYϓצ2ÊWKB"T/RYNT m2ԱaύyYⵈ&$ E]:%Sl_LX`>GLO" P[$h3xOU/3uNI$f s" j-֎+|qQ@5f65h|,'4$Fv|+@N' 8|0vڅ{l2:u ҥKjO:JC%&7O'[YD f?P8 fD!bh2 | ]QsDC2kTc(}uV &cQM'QCGtPia4h64.Ge$WK#yQ2L{?g8jְرcصkߏK.֭<9s`Μ9fpxUQ*\TwEL.bA0 1 MӉ!B?dT剺6twj[;m|{݄ u)ө Fߟ,ٲe &/[l8^u9^FW  P}bZX"c?o66n܈IݻwqFW  rӆʳz N^u//xbgӧqܹFW  j_-DdwźuގN,\㘘ht "?yl̳8g.^'+կ IDAT===`R=#h 831O7_7F ~T[lZ,;ݎ0AF4ixHQ*QU%bFH/h:ر1 `v]oն]y__}1m+5c\pm#qT\1d4bN'ĝsNhfi4Dd(ҵZ,Ȥ0S$"zUbYf;{\{xkKD6ӃgtuuU,4q'B1n+ YcYcCN' ^{-Ν{Gu?9=}J+%a%l[6]6`)_H E$UTTHJE @ 2  d,Xek%>fWw}]m=ӏ3=3gN [n+ 6~7nV\xᅽ¢;n]ݻzjg?Ö-[pI8p|;q뭷b֭¢st5R022K.r^z)yfl޼Ӂ5s9p~d$BtU%4 ? Cf}&7QZ}մ^.SkA9rTMc~i% =HʙB_x$M.S6Ys܈JYp%tyH{GX='l@z5n28]Zvhp^DvVd}vmۆj~c۶m﵈ 8"'RW4. b9M Y(9Oܤ7C|Q)SWEp8~iF' &S7rF{ֵYZғw[/iDP@g|P`bbKr$ymoÆ Y^AE< qEQ\5&q 4M{:E3vf'o)a_# a$C؋)]s"uL4鰟sЌlf[.hr4v,W*MV~}FF4' ^<,&ZX<ۇ󳳳x/cؾ}"KΨ/%Ic~9 dYHĜJ:h8m]knf9s7h@|jm!Rxt)_B`3.d֐2g5by -$MrA':TX~%=zgADV h:uQ@D?2Nlz&sGl 祿|!!AT u9N, &|;DIO4!`=?AgQΛu$(([gSzxhx2'W)^&_ϣH0WdSSSҗ'N^Z8@(ޙH P/; 2%mA+,G5,D;jE}pji l߾x' /R[n 9~Jׯﱴ)^ Ò*H6GF%"(pU:B8ND'N bP^Aˉ "l٩3Bpˬ&k:CϧԊAƩ fZџ.I?<$y ndfD'"p:mpB28[ܞ9XJDJ 0~,$gݱc =ڵkO} Z /x M7݄xGQTyK fH;Sf0Q6;J/=HhiO 5O'/T'$T݀kKPZ2fݠj/)0:`,]e(3s#C:TTkt 044ݻw|I a8/7oƿۿaff."<\<@3:hg>*M# iO+gS]Λ )!an̰!%HYr̳c%Sfo]y -klZmOF3 rNo_TSE-Ի*BG诣`Nz,L^l 1jH+\ڴ26{UN3әU34/=Բ%g)u.*LK}oL6ޑ^Qc&Y ZAʜ,,ţ\To( @ѼmA'W218ndsuCJ-z-0 uSq Z2GȬ+CRaNYV]a~3*c%Rp-u]|3AZr۶m>1\r%qq =^ ~XqH 4ܸA4{p4u*|@ThT.|P8#Ms?]H)ow˓Ӫ,yUOX+Ty6 ˜~5R '͔n ­<h뵈 b !]Cs3H%H;/7$ X8D@N ,b]s*n$DRDL_mעZLq"OKݓqi#2fyZHr<3ar5%ᔆNV#eN)شikldtZv-lق^Cx4x 2CR6>l髋en^ tmMf_=BƇ{$iK~A~9iLiAQ {]bSݨr`)R+VVall r .LOOu]8իWc˖-XzuE\1N)WN\:3EMrK Hg86wjn) o{zQ]K8hߪYsWfsey]~V`جU=RpQ=zK`fLLL`bbwƺuz,EvN<_|=JJ )%pܹ6m걔x!XS :]ak?Ʋ CYwCF[jY$ 0I2%PT4=u]\вX91#/["F cxxX;.Μ9SNYy͕6ktHcT*ajjjEx:YXXXX`ޥ+蔂W^BZ ǎc  R&nk73:BWFN+@\f]4F:s5P@Hhi T4mGBG HNNe@`Rԍps lgPДs-4weN2s)YYrNF^3l,̫nEri32/̼MHn<3@0&!AB6~(T47ݮrvZϭ>i$HoU˅?bqq!>KOjRw"OD| 9MdCTi. 
&&Df-ΧV+[JgG#mI͹h_ ˯XS ~ӟ[V444~ ZGT@,Avru 9D`\y؄Dҡ;+sXVf3jacҴ>̐.Nk8 "]dJEm2[OںjNjܪI%0:-2bxVBTJ9z{ 9sLx׃g_mtNu(aN)xqIovܹsw]v}Wk]Jsvg D( $͍``@eX2J}**rr2ucw~!G160QMqJ+a0p-,/Ϙ6cTB wb^Ŋ|Η)~3+ĔG0쐺@73ʹ:euןus"18L Ugx7EMϦV5:h48{,Ξ=^z Ns=gf9rΝCCU,,,,,,+ȷIўҳRh)%Ξ= oAzXdrCY WuM@,YkRO8F BA7k-HǤRׄʽѺ^$ 9z>pUzN./!XMvc_C;Q9ڑC}w2P@zNCNmv!'Ae-, ~뷰gv^kF&ݙ [ vB^ϛ[sj=ٕQAOQ0ee]Loy[044M6K.AT5ӎw`Rd@$)sB]݅7_7[KFcvSE( J Aupۉ*|Y 쓉+⧱3c[ѱ"+_tƼ =û"&s#[Of5d7-Fys?P\*^|o3#BO2N,4LP [qO?krSSSطo_jw]5⼀0eɢ9:~_a׮]pzOͻg\r%,Ew@YFNƒ>-[`߾}72>>nNTNֱ,!+?QǭbY;ԡ:V)֭[77=܃[orYh4ؼys\dxޜ U]Ȫ+swC6Ѐƽ pҙ2D#vcNُrJĩD%/29Ruk{9Ww&uDEq<NaVZWñ0&^L!HEOQ'I YLlgD5nY: G7! @`:dVB0di3)&Dl$irr& H78ed Cb3ܳ`N+CYh5k֬Y-[+@\Vf뺸 z("e~X]jQY1;SsNĜOWg1U G  >}pEg@%V׀Kq @#@肗c#7[ ^`0iKx#SBgI0%H8A]E BQ2⍷34MSP9 vy#d0)yUS:1nN"cҁ099~Z'Ok_nvZb..``!22:ué2IWkq] >!K嵗Rxc)u"qȓ)EiwbG!Vs ahȽ\d6i2~"4JQҰ@lv^rE,%.".48XXǸ\&|fE\ 7Zyq|CB___Vk_vލ^8H'4RlS  `@<qE|)Nlը_pC45|M+3 b|5o!_EY'۳pZwi.3טn(ッ VMVXSfF^G^s?~zkt^JÇUW]"™).ғǓ$ғj:lHȆb{]MEϓÚqJ&yH]YBY(2EKFGG}_|q e:d^g~߿o&_5209!r-ؽ{w1У)UpbÆ 0??k&:/6mڄJjaaaaaٝ켲2%T*ؽ{7FFF{n;v {6e!"T*lڴ 6l豴{)1uR)@?v؁;v`Ν8wp9 & g^@+ *DhZ^!#aa*A2qT74fnJ yV$Į›rw3O4H+xBPR%G NY˥znc$)~!^1y<`e'QfOw˩?5!iD{h"LY4(K0AD%jrI㚢&|29R=·n݊[.144ijH:s VZ6d8(:#{r.%Jv8ZI߽NAiU $΀QVt0ch%2Fz!|UDҕbk[ɛ6\h3N N͘l!.i28:dnX>q3-,W`-BЍ=1a)K:oVeN)ɓ'_/#/%*k׮4*%y68m scW@r) Wf? MJ5@y+E g|/ 7.Nξ}Ր'u [dҙ>(?Ok!Sn֚! 5::ߏR/":E?ۉ(IH@)A(ܓR_ҥ2edx A(N()JgU NRgrh/_I`kەw_o*k(CHV| ~dr`+蔂cǎGG8tw`޽xCYy k 8N:ӧO011 q-ࡇF' eU u+jJ{֯_/}}}wT^x)r"%–zʝFf%Đ0> $A\ D` PqPlDJ ϨRsczJ (KZXӤ -Dt'H! x>$F ΑO c!PJ؂,U4&!Ї>5 5N'ABx£8}.V:!NNgF%Χaiџ4YxZX1S|o(%o^ekJ¨o ijmҚ!sHO'.|y[WzVd:W_}50::;wbӦMx1;;kASK@ C_b/J:C.ʫQZc3Cv279\)AP31U4+QX:(E$8TM%9`\P*Td~bf%@kD 0scTu+.yهͮ']BVf/z\Hr8e]Czvdm/AvCktJU022)%lقcǎGs +#(o&@"΋_XT pR $ a }0 A5L'OLҸ("O|4QOQb2N'x:B`'VRE3BN?X8 IHIH{AHI<7.2~ds,놭Ƞa)uSS@P\E|մTߑz7 "wڇ3,Hy>c#M价U&TV Ek(<Æ 099GyB4G!B%HJ@Tr=pU#,Gth%C A(@i8F a PVPN gjr6tr &Ogb5,hI-nb%oDFΓi7IkԢsؒ92gKUd֓ Ե &j[ ˯XS .b ^;˿ӟ4֭[;ccc¢+ m^Y $6n܈ݻwV~7?ׇ;[l鵈!6 Q'F\p]Wƍ7ވ]varrZ sssXn]EDA vލ;v`ddo{۰}v|G^{-Eg"7: u+V~(Mm۶0*;֕#on)c 0,1o mQ/KAŲnBᦒB6> Jxۨ d9UdBbDn '$CB(sDV6@FjX(:Ȑt%N>/JU6!FiHp^B'SpYBb#螒̙4Fak}I=g%YhcQ- #DLa˚^lh#P̨WNA8.ϰJ+}YayjZf%\kh׽{(Kappa ᢋ. >@ϼΞ=^z <j6oތ-k055瞃yÍ7ވ;wbff?8B@!ޢ$奊 t >|C/ScI<2jb`o*`p(J˾~p9A:eH)JQJYّAFOYie.Jb*w0x|vp$O\ZDpU( yߦPH@{pEQ܇ٸÔE=&ktngJE9ѩR _vm=7=>|̌Gضm^y߿>,/G k׮SO=RSH)fyFi*~۽YshVϚP>2j(ty3*+\0k/{2D8͉< yd䱥fh)DJݒ~ 7p^D 3<|;x'W_}5^\pq jx-ecMq LLL`ӦM06oތ+O>$x \سgmo{z! q )De'28U֖HWÍ@BƒV`z6QF\eҪ؃E x͊%'oAA '0n8[)!XoOI<f 0>Uf7׍"8ge';@$4噅I4Otb$;hH}:$P܏Jϔ` ǩF@"W~QB<Tfn^n8,H3:o%Ş|כ펨B}. dyY|xG }>VI)gΜg>'|W_}5.袶dX^ Ro&fgg#jpp;voG}4z/Rn&|ǡCpW}}}{?Ο8)&AA) :hq'Wt tvCx<7hxC '?K*Sޚy. 2#oem YI-U:G׭~H"c<.׿=s==_5;FOȔzZ7FtF<#vNt(|RJQBY;%VH2jkJiIZAh9j f_Rjo*Ëtb*7Yi ko12W*V'es:y-%D@Jcǎazz֭>/2>}=5:-sLOO2TUk(YIxu֡\.022իW^9 IDAT\8&''zjE\. 
#|Ӂu$DRF!ya-zrQHXS`F`4ǀ&i_^,nV_*t--lٚ7=ZABա|3NM+qOd}a[*x''5*_]@NzIDk~w/tT5@oTV^D4^ѣVؾ};.H/lNzǏZ03*wtzu188Y*PT077ǪjB`׮]=b)O&R7_CC>֭[1==W_}-<éSpL=<-oi·@`f D2eeqȋ000P#"\x:t*P64:att4ڨc8x =vkt 066{Pޝ;w.48Fċ p ո0==SNavvpi<xRT&I:p۹-pUCTR5x 93dp:1^N[ODW4|i9HaonoXEs%O}6t i|nPo瞋CtM"iRJTU:tDs3cjj D'q\A~jMʫ-4(:y<@Jgf !DSٳ|N  qt%3/'F(NR9XFlJp8%nRENtt^Hk3Js@-=ͮ<?N͊>q=Й }MOqQz=jZ:Q;ҬDN^W_}Uӡ~zmz|/dz#k֬իp9- VZfߗ2)pă "sqȸTV133GYRJ!p9Qӗ'?y?LMMݣCGr$X2ñR&*%PWvkxp~8ۚq%ҩNp ј}d v#Ώ,bs~   G'eO#H$Yuf~p7Rƃx_W{Ň?aNdLA.^}U_ua>:: v.¸&*J_~yңVZ!Ξ==sssjRbbbRI3/~O?4ܹswk̝ĞǥƷD҃P)CRr&T;1oW@j>-p*1LR,$xGc#L`+U2E Y+Й{RZcS=6y8/*Of^"6J1$@Q2@\j~q-v ;rXq1i;5hM7݄??-v&:r|$>Oߟ*JGرog97,uwhűcpq 7 /{G3ꪫ_PXݻWW| _C 2o eu-$rH#T7U4)y|!T|@30*O+h}ϝ{=sb6M5SLI޼S*mD4YRl~%E:T{uy]0$ic!{wuWŖ-[:Ô]ŶT*aӦM'>W*k܁᪝yOMMqͺ!ic s.tMKsc6v#:BCl0u]bn{1oTz70NI}]s o꤅ӉWbb uB>:DV'g(HQQ_8͛7[fE/oc~~;w 7܀7[o<PwnJ%v>eH5@F4 i\;mnt IQNcO /wh1Œ{a"]W%ߪϩEN9E{ͤȻN-D_NOU5j<}I3'<|ඍ,#(Og:T /NlJH-d 7pIEyD ?Υ]3e 'y)3'CYM7Hp2]db]6Gg@ȉ!OSld RJO<@ {ˑ~M9_@m#Wf&Wg SQ'<34N-YFaުe8~4?y# DE 43&y8e/8FaB}g}2h/|1sw-|l !p7ĉ˿KH)q뭷o}+j¢UlذwqĹ}q;Lp'qnk .ll725h}~xx~SU 3x m-e7#IRLխfIMa$i^\ڏyo$rT2Czq3Od+.?VJ)w2> Y0ӇȈnerwCk!.|_'?IxoW_}TaN)w'OCADp+쵘m#X$*֭[q}o3cڵ[- <{1w}Xv-ǁy8{,/o;n^iaaaaazCڵ 3cݺuچ-K蔂Çرc(ˑZ =X+Ax "=ѭTx}TD,5!{$Pg3:`k, f? g%bxb$║UUXDFf{f. GOwMA^O$|PV{ >)n}E_sZ79.B}QV#o#6<.//87:sޅp%o[zq}e:(<=PZ7f:%O^cWȪw?b8)@il+y Ex7BT'϶בo4pIhIDZiu[dSOZz[X/6ryP1Tp:(֓ sN3E3%`Mu6ӥ-Ҿ$ڳWQFЗ.kh֜6 3# zNt|Q cΝmKXS :`fPGyy[o*,bpDJ@ Uу<ƱYHv^>GЍ=nv93_+%[;3l?%):yF@4dM&ĎS߉% pþFQ%ݐ5UB NV4~5{>"=]5Hj6JE˛jfʜc1f w2԰ތ.gJ*_78O.jٳ}}}8y$&''˂!¢ Eg8qK.###įks=سgOEV5:`;0>>9H)k.úuz-!T'(|,$4a'FKi.-)j ԰<Ff#&Qah)EkIq?%3&WZN #c!eP, 8,t( 9N jW$pDA k HJوz}E'u&3vS8 p8]hSoT X9ljOKJ3w[)BZ63o(=.$Wk!)Gocte/ ĚƗs7 NZF*oip:ib |K}_ G5 !zqXaVdr/\M,uC7SV9jO {j>u4$)(ǓϘD\yԔөI;#[ihljs f=3cff&ғ֭[n ǎ<iӦGyIfT~G6|Q҄" 9)F'6Pza:zZS0qUsHo4MlMɑ=n 8Û_=I錌ؘ7E%WeFe],̿v F^ϙO]Z̿U#_ΰF333xgqUWرc8tv~nn'NݻwlOhaaaaavQqAl߾SSSxsss\yضm[/Ĵ "ԩS׿7b߾}w~zktEY|wߍCHGDG?jN\r V`Æ ?Clذr .* usICDQavT$z 3K]tsjIC̐=Jp3!JCr6j: N8 T}Y#rdhР,.4]AzP@m(%k#/8ʨ0n N' z'=?4'<<0TS{=.eqDF8y2Fzߴ!S3"=-{qo"+FbZXƽދM6aڵԧ>R<;v衤 fH$y> #G o1ØL]]6tc nnșDq0cF)RC7YB e*wp36ah qGޔ\!q 93oW૕f(Zzf\s1Eܺ0T:ѡ+`xx8Mٳgc:m'weٴ&&bB(bIpl PlN H N'@G!v\۰ 'LU%Q)QD4˒{[̹3B|̽oڙs{oltMR!ٳgV(K>81*a U^^~nWxq#.~E^v58#G*> 9#h#nɜN>Ux'r5P!G"q:Y~!8c 6ĩyp۩3:t(N(Q|׾]ӗå@ D]}ki mzfUrD ظq#ԩSLJ?ayJكFY_HLۂ8U[E( \M<ԩ+g}QKtJVK6sJ2/$]h 7$C\N- 3ʠɘ|DؠR B OyoR5r\3htʢCF2F'sJ%J%ӧquI.կbX|y500000*HT%*?[ bӣ^x<ؾ};̙ѣFFFo|<6l`i MCG2F'R .\q9 ѣGbrrvܰ&GٱCgJa;wgΜ8}]tvvauϟ*JtM /|^ctr166_~?<~`jj Y.|@#5 {}+h-. 2Ynx]A6eܭ*I' [bCnp+Frb G= ձL%_D\U.9Z m8cR\oFØVJ3ź(ٜ al>]BvZKS<%n*T*S t>GoyАŰy$deIC% (])ד'؈+L?_zpg,C6&8PcMb;sssO,٠3:tO?4݋'N|~Lqwꫯnbl1DYuN_3QRDe9«u\>ǂWCȕjU]Ͳ Kxup#SѰ7I` b> 2 (X|R*'&2千1.aNӫ4ͯC90FYYbݘ94g `ҥ > IDAT`Ac |{ŤK}UM\0q;qx/%p.1bd$'.SoM>) p>A #9R4J'5l@*wddbv@ZV7\xq]>JUTnWy!'@w75hh\R^0NT<!, CY2FXnf̟?NF7Xz[wG$ ^tran?Ɨ&\cl]sGߏVF,ltvkF5 :{A6b }yޝ AgaWW2o+iwRSuIu(,1:Ć a̟?}es hx9Un ʒA2:u B{{; OP@GG<'O6XB,`gv*ߥ QT:۶1331 ]- JPT92&&&03?R*l@ ctR`8y$كUVxױl2tvv6ZAũb2Nc\m/^q0 @ㅭ1(B4!\pk NْÄ8ӄ FFc*I[Q\tB5̶J,"cY3v Nx%ו*%\cvQ)*rIi䂜4rUzcO)cӆۅԗ?ϒ%L'[9===;w.|IY]]]os碧"_S5`N%_8W =u.ݝS-Uu:.(s<1'y%WQ+dwik$Ah TZ&V);*/;ʥjA1d%IBR#!K'%XT_ڑirVYɪ&qk%\ұǹA)Hf]a#50H_?۷c…/oʕ+-bu9رkiIbŋc׮]ꪫ`61oUx# ż"Qo]'NN20q\r:~i^Xm'*;έ q t]Tp_q:!PbH-ØאFrD19]{6"tߴ $!s~+B-s_BkkQdQe (&''q)kpfff]X,b׮]شiSŬ)e{v64&bAKIa*b#8Đ?ƹ|MxDǣ'zQJUVNPW$IsʖcL"˼,=}uhl"ln=) PNÙx뭷bػw/.\v ֭[7+NN9LϮTdɞ^ oRAtB8‰46\qXhQ%4000009]( m݆$~__Ggg'>O`ňR51x:˴"ȼI|4[B=׹䛧e?-qa|ttZxr=\;v _b'? 
6X@,N'!\6y!Hiq"U8_S'[rXZJ&_ -SlB|)B-~QOW`Rp]HI,Nfua-踛a5ƝOs4`8q|\,kO1 a|k_pdޏ|# U!{_V+]鞈\YZ7`x-49Qy겖&Qxv9ב\`A *Y_N LJ+Z*/5j RJ9y5=ŒNMStEyd`ܣu'%',,9GՓP輣ur󁨺8GUՕUӯi#l{٠0 ݠyݍ(ˑy{{{ QAWZ=.UN-{gUCk`n:ʅ) 2}C&Tsr+ q$ӨW;D*rE1WHSr3ty_CY-z1:hmmŲeb-Q`@[DN-$8|}JCBl1:qDCz,!GUDVөz) coE%klYt*>EDS 4 p%^%GzadYD-1 mH޳@[𬴘|SoIAt*țd sr6PIxkkM"v]'eu C00BT٠G`ݵ3Q4TCԷ3 oA9.0 DY泺{2$i+:'QT 0F!1tr9<xw8$'[^~`7]dG%F 0F't98p{ fff`V|_Ķm)j@C!8$yMjYLxR6\t}6F)NYAă~_܎VX9]=BE7e Y܏q~%YA )]%c {ȑ#O~'Objj Js]w`iM4HͱfR%u\\K4rǬbHն׾6=qZUO5(g"-(ELI׸jx\ Sz^{P Nz<Ξ&˦ =Ih:_~---ؿ?2zzz0::+O\N4;੧BXÇ122 b||OXE400000:lO'_Kur\^k |0wE !-Kmqּ4#lxܑ\"9"rC>=ّmwy3NZķ:eȌSL+Q.Ipж9S]:Ud#\MWwnAl}0:siL-y&5k"bbmc``Dž 0111T*Cڐ`b b---(Xl}|_Gkk+cpg`````p}1PѣIk( oCX=K Uo itBwx82a` o!)%ztz]HPyx(D%EljX&uqtGnZ+8ntttܔq IĥzU^XqȂP*ɼ{j × "͎:dXf N vm8|0}Eڵ W^yeE s2`Axx@j; Bz}m$ z;k~Se&z|KSSNkX /3)U0wXA$Տ~L6yiU=7?uaz͢C5JXs۷oGTB[[N,X>۱cǎYECxz|1ԅ[u-/"dBrZvѩꈙ,ZFDj_ c6RQ&S՞!}LwW4!t(P,j\詑\ΧPhN7 2|!|")"3A}UIm6eփ&A`@[[n&,XGA\Ν;h뀈 5+E2pw43S2ĽYo IDAT*'QAOЭmFy+`H]'%y?n<DuFƄl@Z$D6]EgS]oT-^5#LkِC1]va 0F'gϞ=|ܹsmѣꪫn:aϞ=X|0:1朸u-B)҉ \aտ:SUȍ/%,ymk<O`͛]v[nѣG ^NTHPm=U  6A&p pEהo)XQÓ>B86#SkO 1%Q0Pk?t^[*K^G~wZSިDqRc5L<7";VDg0^2A0y?dNrԯA4J9nI я~W>ӡ3|8ZX5XLMqksiѴ] FQuի/Ue!V-Iۭ?y'ν/ ssFphgKR`||\2(ӟ-#q1)Q~$m(` IP-oTZ' -ts2NEe{BlUh:p$~>`r9ylNʅ%-[JB7o-((VZHDWKؽ{79k׮Eoo/,N<Çc۶mXzuŭKrGWWUL> Y`=>}b;)^T?p$mWuvET(zu:Y-޽K SM.GQ\_j4DxUxs~ S%UO\Mbdi+V/l:`N}}}>~طo^{5󘜜> R萚G>܂: b$lJ7 :YvMłIh4#ǘIA]UD HƍCB8!Nsdjq7HxY~q1hxp-^W N<6Y }}}yfpڊFUWH2p (EB@.,lIf(gS8'L3Ti^>:X'بv#5@-QAv[^Q=ZI$yM|P7&mTa) |"l 9[ְ|5KCyN9RCa FTExxĂF'6I$@WoI0$bF= &Sx, CAD@r6ȆB WjH]72&-p 2$K++*J-SQ6ЯK hkkC[[[h8o@kB4K>HywYN OE`<ў!^kb\59ٽQ sOiujo4 w;6DCpm$S)3at$ӟ S4@#^TyBv&!DD S6Xiy5Dm輯B=}RRFi:-A`a7 DVeI}t=6-Є$/SU>ɔW?W=Gi(Kfo*;N;iQ:j=4c?􋑍äoXjL#Gtq+'4|X"O'ˆNz|Q'`8 5TrGy>)@ƀV.F"p:B: McqrW Y$}ۜNԴS`@Hb F3h\[-)*7: 'qFN6x{!Kw\rD\hC\iZIK-aB4Fa-IHk$Zzi0(LĵX6)FW_["=Q"8eb#KWLX}a@{k)Z?oxNuw,Ӊs0bt]Qm0Av ׶ K vE0ۨK(ct2*,4~iOㄛIJbpBPm/96FXAt\Oo~Q+O.RYS|#F>IN\jXX=d@N }fx޸G?>jxT9#kU!.0 q"Qӷ^A81C;ig㴚lj.Ow:7Ռmuђdc7Ku7\PN'q:qÉ۰슐l+W/d֡r~<iffP(ى9sz4&''1553g:;;Ғ0f`````0KRYYzOرchmmŋ;@{{;gŞ={p <ø[1000xYTcO>$~_7DXĢE{nɟ :;;O<{²,뮻vڜ f18V[xb}< :soٳgsϡ<FGG3Ϡ+Vw jQ!szΊԂ p뭷{EX /w}O?4>ܹsطoFFF/} x饗g\y5ӣ@-xz(x)jV d* /nDszׅ)R?2N@k4:I@Z 0nNnj##8y8:o#\A+2Nb~#-8!GRN)M]֭[,X3338w|M;v ۷oǑ#G0==w8~_ɓp,X@94ELXuA5yKK*y$K$u'y^Ft][10U!wI{ }˩R'O*?m͉yt.e-s? !y5Dfʯ$?Z +WĒ%K8ߏGy`uA8qϟ… kbx2F'=nlg`0% oR:rB(|CTO&rKV&gVH"o5N]kUtuyC k?cXj:nqW^q\o )xUW<6?ݭ'@t+d,,Z䓇&}gPxz<c8ʃ"&-'/_CWWZZZ9 x ,^+WDkk+ϟ%KT*ɓdp n\Qu);||-RuQf-=Le IB4=E,c{i~Tv3EtIzN'\IѲz&sAixoyw)1NitN'Q!D& 9 !T*\p;w022RCi ddաjm{W144 6ptbN0w\`xx7N d1.S q|oxmflfɉCxI_! h<)|yd/+ C%{BXp;g5 ̆UK;8n@C'33z&8k4aKaa(z{{---[p7o|/꜈6<زe -Z!DZ~zܹ|GyzU-(ct2Ѓ!`1!t!yua IHP5FWqeAXs1XdQ KbeoO9QIUW0ZæospfI{@LCasQg$)H4c:;;}ctvvbҥhkk ]1"Ν xWcΝغu+1<<9s`ffF2\MLL?y@7TY6%:ƅ mDE^qZo⥢i鸿eEF5QtYR{۩Ph1Jq1N"TpS9NbVui|r8Ie{+ehbZZZn:T,|rtttHyW\rԣZZZdchhǯk_7p͛ц099闝F\FWW!7000000Vaԧ>Kj󵴴{oll ^xtuuaXr֊+oٳm333|r,\0]' R +yw}a۶myo6vmy&&&ogc v͛KbppOm(Jp2`Ya_2F'PyͿ""n*FNs=DfhB҂+zu!Yi;-7ay=5d,jqLء ;]UL۫k̼E=x ~ܹsxGӃi ]w~i書py bXbEd}dژu=\{T=Im0MֽU(Yܳ4oFQE"*_݊3yRs;H]+a״~Yַz O<8ҥKqET*aӦMxk>qibժUdPpƀb8VHÝȅ&K/>TNJقn ?|LGBڲlV2@۲^A\2FQZb0'Q%FYGfrlR?㌴|N'wzL|ͥ U3u> OjdؔIGh_y"Gl[n&G s:p> 38ٶ^ @3l~{R_" -[;scfX"KGlqP/Y̊$W'jQgh{d|'AfU(?L,in)%]Qj:UZ>6]w݅k ƚ5kC[[n,[ gΜeYk<O'r#:D栀 Q$K_.ύ&Uυ (˞^̔ k%G<wxOA0QtQܒ8),xDV]vazTWW߶mۆn8q1_k֬Wct200000^t]wEYfM͕#PS3ڵ v ͳbŊSA(Cu]TTd9a IDAT2o-;~ѓ*yCed`\-y;X뽋$DzOuQ^{^ΥsWx)) E%tAMAht6/e`0[PG:zZ[Xz\"m9R >.aQqʤhQaE%JrIt%j wKsJӤѺl_7N ar!O{WT]% oHT7U=+2_ ݫNTV =z1:8+! 
40C L¦}9,-OjD)IZN!K^GlGG z$gb/D)ߒ./O(OwMCdPc?OQ.b\s P9c%O J%r\="~&eěӐ j;|uUJE;Nk0mڂeU_=ob93 GZN< !mU~]"^9,c.ڜW]m؄s'pqR2pYڙ%ʒA`tt7sٳЇ>ۿ5\.;w=z)lڴ ?n݊Rƞ={P.QTpM7Ǯ]o}[ORիWdTg7ěo;ׯ~b7x#o^z%|( }v\.G?ۇ 6`ʕx衇i&'?#G0o<\{x"* r@HRSD!18"ۈ<'d7:572& .@";-irA:YtiQ&T 65z5zcp9xW?/}KXf .\w}~֭[Q,я~~Uo~ /_ŋCT>9,[ ?Op!̛7}}}F{{;*"e>~-F_S1~ݥС1cժe\|vb|8 وץi|*1dwPpQ C/݅Lx]<S1227͛uV{N>)`ll 8}4{=eY+P,133gp&&&9ǡC҂;Їp8po6oތVE ҲǏoQgZ*>-u*EPdQ{)qGOڻC'L"%C)nA6d+)N,yř#XPȠдzktbEd XҎ,r%P᳓hKBar~΃مx3̎l$&-k`P/={²e˰e[]]]wsΡ333ºup ptիWcհ, 333׿!LNNT*a߾}ضmnv̙3x;`hmmEKKKLJ .MOKe$Gf3_:%vPUoݛƂs1A4 \}ot Nwp3NNu$ɫq[V&j%<ϧgӉ_KWO&"%.?K(ct3&&&p!xXd :::l2 cbb---166^|P0=1]]](8{,VX+W6l؀رc, +V@Pye ܊XiBђAtP2Ѣh{B-c`O@(ۻG%'H:xXmshȲh"A޺\vaCO_^XQŻg!.J]ÃŶl'>hT'$͉+xX۶oQa%ʒAs`hhn&aΜ9Xhc4"[oajj s  رcG\Fww7:::0333gΠ W\qJ6oތԩShmmŪU%ՙ *@Z I_LADK$rۙ7Sj$7ěGqg[ғ 3ݲ i+gWyh*'ǘ3$ƫDsyCP4,y^^ۖ _vŗQ-_)bSPWєkJ0F:cffΝCWWOBiYϟR)twwcʕ7KOOs߿_p r-x衇000Eò^ض퓒s= 鷁A"3K%4FFFۋv@XĂ 099bѢExGe|cs_'Oq+ɓ'ֆ60P(|l||xu&hX0F&c 1̟?=~E+1;w>G'bFg````` znȲDY2h~x:ҥKԧ>^'uV6~b͚5z (PѣѩƘ믿Ç… Gww7qE( :::P,ׇ>es133ÇĉxqףR`ff a `3Txg|{bIgvT٥TKPCO *BHr%ޤ?YtB=jx-PmGEۯr%s:1/=G%RQ$Dv!#:NH( ke ìTބ A [_bH=uq:uu``;wĶmu@@e jk8|0&&&ptvvb||333JQK/ŋuΛ7(Jx0117nDTBT/ W /'N^yغ>ۗx k4M|uZ1-7Q)>;Yjq˦kg=;Ӥn3T(sRpbbh[ άzMC 믐9Q'Bl7M7+k._^? aj ۶1>>gٳ(Xf qchh9sue(pBP@XDkkp͙3R ΝâE|.K955A?~s$N4N"%2IJ N\,T#aMjʯ$歐MQCjr":#9_ FooTL& ϔ󈗯 ƼEN,rN@[FHx=8yJА.@{{;'EY,4~388\q8v*axxn `+J^'˲|Q:$ގwymmmXpa8}NLL@t4KofuJՇ-<9xr mJqĠ{@qo:0}.12j{1iCױ۴r\lA9ɓAq IHWs9?`jjJ b$kHaN5ܹsqw; xpa㭷hѢPittT*ŋ;`rr'ꪫp) QV\??^bݻw?)?㓟dovdp)۶mË/o}[W%g>~m9۶1<.$[-.!Nq﷬M^|HvJ TBtRqjS2?^8W2!nLfh8NC,< t"tOCX[ 1:57nLs6gl۶-kׯχ 1p94 ?s%19lDLᯩ3*+nq?a>{L9D DQ2a*CbpbI^Ǔ50V"/WnBEjEQ 7oNTƲ,`ǎ:;;yk![/E]h6Q>]v"yZ\.?QdgՒ_}7-\`K_!` wvʓV Ԩli$$u dĠ ~jis *߮]W)pq 9ygF!Ap:͑.͠iX|!1݄c1%:iFÉSNM-z1:#O#MPZTȘQ[4IQKY끺ɥi2Cް1Fx J7tWj?<s!w}=9&zrhS$23-+0erYӉGbST}QåձDaIfU>h? 4^Q}KvmfR{a2˜c*lc8&h fWI;qT@M'~ ^W̷WӉp!E3$0y.v gr;خbJ6 Ц>P2R*C=F#9DAxIt.VQ/&{.-`*2-VUDStLz+ZD ™x;M/Ӵ)71EK!|T.v# 1D8* ˯ P}YHۜCN |ǽj.:U@j{b}SVS?/fj{^2)_\ϵ&?.-W!C%kqa3`缉ٵYDndi.in(&p΁D¾D)R<5Wփ*\32/j6.bZDT)҉ "{ ~8Bq Hp*-WfIn4YT/wpqxA5i:L)5klpv< jFV0M*4$gӴ: eұ $ |7vAAMȃH' (A3DAa TӉp^鐓wa%Ƥe;x 9iV˘cQbtr0NX W`s_ss\u9h5q-ՏˤMdMѬ? 
IDAT5P~^܋&Gr7}IK11\f+Q+Ʋ:M$U8E$C 7SfF&+#t.An8=_?w~0\Z=aM6 CwKZ=|'Rsӿdh+yMlGF7nt'VLeG7O '݈4쾬*b<.N N80oCΓe?Wmk']6-4hTq٣k2&{<{ZCJ]45ڼI91IV1ǨvBWmk1ח]gu->:]\iqf'fsXkBvȁ& W;1%TB.Œ4O6S+ fM(j eLː5k<)HP,Rc`,m ]e7S cV<%#LFpk DŢDE9=[\5=x6Vꎨga_;AI}1of9vd\z2Ul1yWt9%=)mt3zyADA%.\S-  J~ cCdBDpbNmsT>a c\vNNfG浌 c^ʎ"\빋b,Ϲwߜ𮁠Lr d0>@Z33 ys^=uHo}Dg!jz\zy"T7w^z\OXVEt5-}xr4N-6ƼF66P:v9R=i|r;sq>FQ^z 򲨱v>3?pfikșvMwX\^K8ZpkEf#(0kOzL [Z$2D]%Ua`QTڝY#7E j haޜs$Ie@ڴfuv$۫*lʬ+A-14alBDx4'/ۚVOfWnbf[fm`Y%v(<!Ꭹ ps vd>۝/ȋ#ҰDؿ0mʳlw.I7 +FNmg[^y{Ǣa ZG6Y#rܗA RˀpL6ωgNB[vqĶ8.a{Zn,  op4{AD `YY5שD]`n[N-8 ?hz~1Ôu_1uj[I\] f4to m(K66eX&kD i:Ӊps 껬qnFDDen8uq6}s<%N+Δlk)̆AT1+C"U!0ռ+NX6U.fr< 94ZU/auD"E@NQx=Ċ?8~k(a8v@.O"coRj fi$p%0,38-QG5WεT;c3θdwW` bxL/0'؉&!G:)޾1C#i:!EN' (Q&{$AAgÔS Ӊ  JZ=at   VxH BDIMӉp% 8M|:g(d4 g[q .%N۪W7m{F-C!BiwpHSV&Ƅw1r si_23"fQR [gL{?C"8q).k*3 BוA ?|s)O4 r4UK$D}%ZӶvJkr΢W5M1ڗ]83eJ49uo7ޓնi߬}'W- 2R1& D,Zu[I.5.U(8&syt†{7Q AQڰ/9 (1`H8ANAQp GP*AApdl4WBR"DE2Q<]Ǟ-)gѡ(yZ iZL> .3_;'q;k\\WpM$޷nEУ~YH7Sqhp]W@J.UͺPLedȆu3γ3# C7 ad/+­~"^3nVPod2GEB,[:4'\5\w{;ݸqIk:yi5Eye14KY3d+KfYYCR[z5*nxy46K.O2s2"^)y3A97ww!Ⴓ27P0k0fXM]Y݃ǥ9\>y:pZ^^G6]5AI0gsΥ1c3ϩv/´l45ge 88fx@Bt:YfP{922q"/a8q6Qb@pˇ}ɩ j'$.mv~IlDw ıF6ޱxh?%zCp7VV~l3f#vB:JŎ"AA*!Je  BC% 8AAPQ%2BGA!* ;"_},z¶>/[Okw.ƫNm'p$=ҝ\gƬSN,^^)yBM^ s*\~bh ͨ˘&0BųM{ E9ơHS I^s)bxLwʃ YDݐVkibM{b- #6*M[ J}(-byZE[N?g\\UlQ)LonL =MCh TۢZEѾ} ΆBD:ӉLl6 sz:YVO6b2+\8yqauǗU}(atiYR gH*Gr#0) cȚhwe~YnrLat8Wct2 mK-zn {}vbWhpl&z]bCrvPfGDsr0i3gT*{3/'#. EN'ᶉ٧0 oaSG%n[{_g,-NO+6^RسeX\ubl\\gV.ǩkgNTPA^=~¿g(ٰ8Ye%ZZ _ =(yi㭆#4Bp`CR7m&y[߬_r/vac5#f߻r:.`qDh9}9PV|/P,ⵏ?r]H~H{LI[^VMW-SqHa k \Qm9#jb6gT00a(q[.,3o"HӉ J0^W-t\P-Z]i]ns-q\BY,c+c^.Sm(.ѥs(z/LI>!,{88PCPZ)9x>r:Lb^W^sY֣ dKEk۽KqKb {փU$$41} GLg3ώgT2nm}8S\:ƕ+W?H$ׇ[ȑ#8n߾G}-*t B"[ /D-[U_ʳo%z7.A^&KǶa#%eSP*nܸ^{ ?|VB*B__vڅ۷v˗裏bڵNo8ZdOk ,$⴩"{lluG>Vyq6ùUOݽ <f3xmurrMy4YLX6oّ$2PDsmUK 'W893B\A>$LMC;88/ȑ#4"FGGqa={֭?q=z7QQQAN'ȒT(<.k5?{(נNTm*.6jwOW* AΔGǬ&zE_fd3.p10Ad9Lg#522K.XwI:u g__ѣ_=VSZ  ZW>a޽ػw/>O}ttG[[.\Jttt1^E   duIl۶ />w1aŊH$7o***ۋآDA% E0q{ŋPUk׮EuunbbL(:TWWc||NAA" riF\v 5k(z*1c 0PSSpqΝ؜N^G0p;/J)#>j8e!Ls'OgY.&{'M9(_qӰ sq_SqәcLKElO`^xbb$M'wfU^A'=FH?vލ&@*Œ30m4kۥ8vz{{111a)s[o,]---H&f@*BEE t _$]_NPkȭ o:J~?P|i>#̋Mm ~gVj3Ȧ7- ë& MP LTM[Lsmr}uL7\;#rСC(bQSN֎J$3gۇK.aŊ>}a F|"1IңCb9ϪUy[Ud2-*iݶs3=8T(\ӼbѰR#f1sP.3.餪20&k<#J\4TS[t 841>W. Qcc#}Y>Q u7Y;N;NAQFLdgK/SM2SL1ğɟ`xxp PU a``_-[@UU>|/_(y444`ddƾhmm#  w:%SLٟG]]'?I|#8Ɩ-[G(5_; IDAT8vR&&&F:Fooo?P__EG}NAQpʡL #5n) t?c1aܼy+V?z,_'NٳgpB\t ccc!AAy# H$0k,̟?uiӦaڴip~|SիW +Z<¢Epa;v <.]!TUU!h'r:X^P!bӇR~&ۍqfrкҖ`Q824THkZٝ+2\?&!9k.JQӁ9xc e2ooT\ERj\0i[/@M8Q((++ߏlݺvByy9>bʕn.ADksWFX]dr?ub_Q~[mnM n_EQy M&8 ÏLxʈ>gk~xWԹ~^uSq0s4c) %ZEf3=ZS<:.(sPuVyn"<^*Mv3o&V99 lvE~=G{KҲܰ_(p^~C),msp&LN.zqK˙d21}f).tRNC5 9ӬDٗy<'--ĮUu[.?=`< _X.M,07HQ G t$LLӘfFZ6P:v9 OAJX" jNAALf!4JV" 0`ziXM'r:D~JB+.Ż&]~ݕr奍:FW:[sK]ە$a^ujdٵD_Dq[gVoNv-sdk`H^IGe'ER8lHHZ>ˌzs&CiaY78Zޞpppt* c ÏVO!Պ {Ml(a]\̐`#{z(~D,c:_ƅsŘbMˤ܉k(z9N:tB:&r:9_/D LRq}Sљq> 8bq1\oFz g?mƦMmX\!ݦL:\~vW]) %kuli;ƓT-j4(fQίQFCyDN'TA S=B9QrTo=K)殐a4xVs&lCje$~%oy:qCX" L0D%  M3 t"cLVcqNCG\:18 [ƤE9nw_?=:v6x/n>ǩ7ީ_5Fa۲lymf R1"x;n_qF>&KN.̅Ovzt}PXFp<U{BVD7S=n 7*H~KU0hf 0Oi?Hvbe3(P PCt"I@qQ@u 4/)e[%Nڱ m'}Fi=oAěRUXr:T<;ks8B'_r:&*l)**Wp>/T +~E7O0d2\0G<)MfYügPIpŗ! vA"AAAA9DA% W>lAAD)9xvJh2؉>{}Qi4fBPxLBN' (Et-`jR$ 49B AA*<6JFǀ  ExH'EӉpj<Xf ***8s :D"%K`ٲe;wԩSؾ};&&&0c ,]Vѣxׇ:_K,A"cp-477cɒ%;NCm3kUϡ&wANjnMV b)5\x88lH;Eުf6ޮ|v¼y(+yQzk*6l=XD-6|a6a<$85:lHmh%)++ϣ W^lׇcǎ_*v܉W6Ԏ;0227oϟɓ'flٲxvZ8}4裏BQlbR5? 
*K@'l7?N V9"9by&#R}bl/\ZKQroBDI2DCC̙&$ ͆M__z{{JpBR),^---8t4N< UUvZڵk JTcTL=ڔr9h1Bfםg= U13/^~& AT*&b<¬twA__jkk1w\$ YT GE:ÇQWWŋ6l Ο?ƘaClk?MVZd#ϽI݇PgJޒm.7Yx+ [" >Rpիӟb7C]]4jnnFee%*nݺ ZZZ9۷L&vZ,YUUUZzt|sGbhJPde; -4L2c{'跈R-xu .)f՞qۧl Yu9LjXzzam(RNEHMM 6n܈NCH7BE-N&ya&V+e GGp=zXWUUF444n`N X:LAy Ae||H@mm-ZZZ x P`PzBQLLLZnܾ}>eyޜrZm1?runvan KT'?mp>*w[s}ڏbivץNvϲ=ܴ<Meoi뵁lt*ͨ DRAc,}**3^qxMf3~_/^ĢEWWhll]֔)S 'N?P__jjjC__]޷owGyy,|tv/NtBM_b)4k2o/; W^=c͚5k / ݎr^XD0M$Zkg(' ܼy*^uܺu ޷ ( 1>>;w+] Nc``555sDNI[;>l;11_MG,W6?mRc`2PF$-:? UMc`p_W}}Æ ?~먛R UPTSLqL L+jkki&\###Acc'Jzա]]]3g=_d~:ك'|{A"@GGk+VK/d`/2]^z)qtvv_"aҥnΤ+_ Z[[~Mlٲo} 555nΤc``_?l^WW+Ǎŋcbb"EQPYYh rԄg}=؈2DA__Ο?{{V\d2+VX|9z-TWWcΜ9oܸC?_:o6^y|k_%Axi&CD=tvv ":Ν;/~?S_3gΌ5kVo:s΍] 9bF?4կ{}}}Ї>E9z:$-[A|ESSq=`ҥH$Xv-~m8s Ο?ٳgpmgCC bƍѝ @_|9yB7gR҂6cҥK?YTNoo/0w:rzEM*=܃{wMl޼w1e\v =ReeeXz58@CCrJttt {nl޼{ с%K0B IDATŘ>(ºuVLJ0oB7@NIӧctwwOWܹhhh>!TUU1Wb˖-qN<ӧΝ;(z*x ?'N`hh*xbcvW\AYY6l؀öm0|,[_pn޼GbcܹXt).\^ aݸsNM6aq>Z̛7ͅ>XٳPUbܹX|9***[oʕ+PUD555xֆd2Doo/TUE[[6l؀&$IL:x1o<,^Aܼy3.\00c _gb۶m8}4PWW'|fߏ?2bb||/^Ď;pTVVbҥXjՅ7xΝC*Bcc#֯_EQp!bԩXhQ &%.\0D"u PMM 6mڄc[ O>o޾}{ʼn'0<<3f`Æ Fg9r'&CEUU\x{ٳgJс+V`ʔ)x"|MTWWcʕXd!Յ'Nƍw{ .DE6e;pIܼMCSOCyy9]m۶aݺuXpaBazaor1|^fB__ٳ{i&ŋ8s jkk`P"st:A:FYYΝUVall ۷o˗L&( Z[[cؾ}1ܹs 1̚5 /^7 ,ZK,̙3100^{ ===CYYf̘~3g΄͛q%F!c֭HR?>8K.sw}hmm0QVVիxwYszzzk. _<4::_8wƐL&q9cԩ8|0_!TWWcڴi!"xbtuuhooǍ7oƍ(// x lڴ ~:~aڴi%tڳgy !J᭷c 1߿s… >}::::pUݻӦM' l>ÆéSP^^UUobddsjjjPSSt: x"ك} tCCC`ᩧ2Ν;QSSiӦ!N}}}(=:TUUAQ>|&&&H$piL>VB?=k׮m:8BRVV͛73fC0uTpq)\~fܹsݍNs=?>99&!mmm-oo'&&cttt`ڵFRss3nݺÇW~[CCCBOOزe >яܘm9oߎ|K_Bcc#/ԩSΝ;|n޼.477cժUXf?Ɓw^,YGťKxb|ӟFWW eeeXbcƇy)3::CڵkW&|_lj'0g76l؀㨨/Çƽ:cYY>@Q$ILLLsN:e߰a{1{¾gϞŮ]`|cCMM mۆoXv-p tww^y\~cccć?a$ R) ॗ^BWWfΜv?GUU~w .~8pXx>t>^{5|3MӃ_|O˗[oaؾ};Z[['!baxQVV߿xpe;wxܹUUUxԄ۷?1/^1>}.\??bڴiزe ^y<35ΞϷ>X_} .p>`Å vZ<#xGQVV~G?BWW0c ߿===ظq#~i9sӧcΜ9`M 9FGGo#LԄ/8ucOo66n܈!|Ν;1e!]_4{=|sCWWЀ|6l!vvvرcXrqkG}T .]0K_n޼qR$.\^z 'OD[[g?:l޼[nESS'QU8zRVVO}SBww7v܉gy=:;;}v4%&9*F&innQ[[k[gv89ʕ+|2|I466bܹ9s&9Q9sx衇PYYFL>UUU8y$PYYY8}4Rf̘faʕlj'l4cHӆO\`$z{{EQP]]]^/bll 3gĴiPQQE o6<3gbܹhllēO>˗/ŋH$4T*2ybb|Ν;Xr%jjj0ex166F/WTT J!z9 ߏ6,\Hӯ((//GeeԋEww71c 444q%ݻx衇PSS (Rq~ FtttCuL`ї{zzpq<9s&fΜ?~x1EO>Foo/QSS3?pYaƌhjjBMM ֯_?xO硢¸D.;#磽耪sL&QSSc8Ν;g JaٲeH$ؽ{7:̜9---Xv-?$ɒˑL& [&Nϻoؘa766 ׯ_ǭ[,6*++z'&&0>>nlۋ;wbٲe{҂7gPևdϞ=t֭[fcعs'QVVFvTHIa $z!G[[***PQQah sA"04`Ntxꩧ044TC+t===)SNׯ_G"> 0|ctٳرc;˗/cժUXnTUEOO T*ZA5m4Lss3|>:;;d2(BWWh4J*bb1N8A}}=UUUbGJ @բhԿ;xgl}[Y*" ]zzjQQEEE7#KtGE!''7d2VzEQf=-p}n~d2(v*++?}WVVxmٲv9ŋg t"챸8e=g{ /ݻŠsqaf3DBWrn:֭[wξo 6~ Yp]wqw288??p,Xp8Hܹ]vuV.\HMM Tϳ{n\.p|u[II %%%}W| 6M?UUUiDb^*+]Cr/=S#%C\USv N:&X,j]I8 ԚFFvC~tV٥WO:(ǙQ!J+ c].!J$wI$ DZX,F`fB.K-YQQڵkzt: ~~@ @nn.f77==MOO(Pd2:Vg4Q~vʡ`PGU8.frv4U\V+ ~,p8.:p\$Cv%?Nw͵Af3~;@sΩSBPH]wlv+h4\.6w:t8N-[4^5ZSSCCCgϞ%b0hhh_*_?G?Qu۵NKWfllt:A(xߏf7Sz=!LI p8~U7DJKKٶm;wCcc#&%K`طoZ W7zb^I$GE׿ԭYZZZD"AWWX !IKTVVdtth4ɓ'`ڵ477x8wl۶ViUe q槧P(nWLLLp)n7D"A4EcZb``K^DD"3gΐL&D+W… 9sy8NW]~^}UXjv7|`0ș3geʕ6fPXXȲe8pӧpPQQAYY{b||yoXz"_mza&all`0\Lutt+~l6֬YCWW[wi2 NA~-^ŋIRx^uմvRk֬ӧOx,[D*bzzZMi|JX,fffNxK? ,`0lj^FFFp\5"bd2B(¼;8qϟgtt^{m^}?Yאh4^C9NuIIȅ ؿ??~Á墹+L)))nc˖-l޼NG( ??/^n^@ժgWt:.n Cl… Z8H311\!:06>V IDAT0@EEMMMX,V\IGG|>XdPFpHggNLLpm%0>>ƍ)))@$ x B(B8FP]]M2gUc$wΞ=˱cjLMMa4{1WRl߾X,F! 
Xjf6n޽{`0044D[[1/dk-\"2 *jgƍe $W[O'N+&:;;'NР^ Z-uuu]}NinnJ8e)x7d2^e˖mo磧GKss3WFP^^Nss3333;v ??OvsPE=N tFVZEYY??DPUUEcc b]]]j]VXV`0ífݺuѣj+W׃w288HooZV5fΟ?vbŊ[nࡇb͜|'N?͒%K>O( ~;H{NY|9MMM,ZɓO>I__:ZZZUL&Pd2FVXAss3SSStww Օذa&P(GEQ۸;߱I" R&:Z[[)--%NaN"֮];ck4YVwA]]h49{,Z[[U9%(B!B!NB!B!bII!B!B9I: !B!B9'I'!B!B1$$B!B!$B!B!ĜB!B!s!#N$( :˅fC{KB*bzzOee%ͻ['IfffX, 4 L1L&VsX,F  PZZnq.\@YYxhۍdV !❤ifff|b1EA`0p:w;$SSSb1,XړH$BX,z=H$ncZ+F̐J(..~Om{7B㔕( XT*fCQwH$B("QPP^V !k_nt#P(ıcwÏ~#^z%oξ}lav~_~}{< wݞ v܉ll6c0F<A5vٱc/"6l .ggbX룽w,`Νl۶ PPP0ǭB!; k<,[nxO7fa2~Nb\zzzx9x mmmmѣGַwA"sQ^^;Ξ=o[mFcc#v}[-V2Iyⷿ-dɒ%lܸN"P{N:& d2=SSSl߾\n76 t^RXp8nկ~ ߿Vrrr>SxH$B* !Zlݺ'Ob >Oa4A$wG:&s{ؽ{78Nf3.{NR)(H=ݨq: q)^/~DqIIy`xx'O233'> jjj0$ 9<X, {%LN)**bŊWe 8{,'N HP]]MSSn^Ξ= P\\Laa!Buu5P{RQQAoo/۷oĉnZZZG0>>NKK ~^y-[Fyy9Cgg'dx<8t84bb7xP(D2@~dB1w$tvvh{AŘbppD$8pd2I&+P@vդUSSjٳg9}4(Byy9L("NSZZJkk+zIIyp8Lii)z<cǎL&bD{.o,p1X!4 mmm8#GAղh"(SSSX,VZK/ʕ+q\ivӧ`0`2Xvu%4СCjjZDQ~zj\.`20\.N<Ɏ;8EaQQQ(Oĺu|x<f133 OB|>?<DZZD"^z%0(B8p0>>N{{;WkF9N w.뽜&B!DvW<z9HqqZiffY1x1T4V j,_[oE8F $Ÿny~M*W^ԩSq %%%_NP={pqnFQƈbXV֮]F_~Ndff_~+WOww7yghh ?VerrIhjjRNկtl6o~v]z9~8 H$#Hp}B'I'!恑F#%%%ohf ~;v˗_԰l|U^/?Oo~wAgg'裡d2 h4twwSSS>9֭[GII `Pix</x̌٠gtsq,DZV|A/~AGGnJ~_RPP?y\.v駟&]|>^}UqeWmg$ԩS<477{n]mu mƣ>JCC۷o6tB!P2`Zmf3Wf͚5 /~~3֯_vG}O|({{cǎ~VZŧ?iF#333F9›oɟɟ|rn7꾣(|[駟f՘fuMˎ.O>O|H$?ΡCfBllڴ K/ݻ)..&ꫯRWW﫝vO?:JRmmmB!Ο??j{^`Gr1|I yկ~~޽{ŧ>)Bl޼gyKNB$$QXp!999hZ*hbܙL6^{5.\&Xd Fz:::LOOs/Rv;+W$77V{;cccLOO|r0,X@9v $PP(D*z7Bqmy:(:ӧO_C4eppŢNT*bΝٳK*zP(ѣGYx1˖-S"׺(cccRQQZVٻT8fdd͂ `0po2<UWW}_˅$''~YKqq1XZzJJJxꩧX,XV1͘f.]>1 xR?Kss3ZzCŰSQQ!I'!}i&vΝ;jd2v>NG?:Aڰa?Oyg*;bJ?vRW~kmmeƍs=O~6mD[[oDmm-_җxo j񥆇ٺu+ǎSkll3 GݻgV=vqsQ d2\.Rzڵk{$77*++_ 6 IYY;svI2tf$$H\-.$ fzzEQԞ tގVnd#E>a;wFfzz̝w \,8008Gyy9C(&=J^^f( A-}i8}$餹T*( 999PPP@$W_REEAAUUU:;;!322Co|+wMAAH$g\.?W DQ/^LQQzM__tZMLe2,YA("H`4q,]t0w!B̍D"륯ߏ^GөdzEQ(,,d…l6<$IuDcǨV8gΜ!jq8\4~N>g%XlPKmm-n ^up\R)&&&bɒ%099Igg'XX,Fww7< /"uuux^=wߍb… ={D"AQQ :::Vkt,^|8{͆l&Ѐ``bb3gΨL&Yt$xII!A"ɞP(.\illMB!Љbq 4>!<_gFBܚdzBd2dzh4\.gzKB!bx<կh4?+NBzdB\t:*( _B!A&!JEA`4eJ$B!B!Ĝ!B!B[$B!B!ĜB!B!stB!B!sNNB!B!bII!B!B9I: !B!B9'I'!B!B1$$B!B!$B!B!ĜB!B!stB!B!sNNB!B!bII!B!B9I: !B!B9'I'!B!B1$$B!B!$B!B!ĜB!B!stB!B!sNNB!B!bII!B!B9I: !B!B9'I'!B!B1$$B!B!$B!B!ĜB!B!stB!B!sNw ޽~n,ˍnBd2|>'XznB!Ct=zÊ+ntSBw%Hk.֭['I'񁑎;!7NN71x"B+`'N`4otS<"wB!nv7Kǝ$B!ćZ&!355E<jv B!8t:X,hڷݟt !,wtB!jh~_200ի/ IDAT'?ѣGٿ?.\h4҂MB!tB!i&&&x'il6˖-㮻=Ǯ]ػw/XV>0kJp#3 y衇.{c6!NS^^N&Yeҥv8tPVK~~>z)))O3B!I: !aRSSSnxzrssq:%*#[`0p^z%XhѬcllD"ACCzP((ݬZ 5=jjj(((`bb3g\ֆP(ٳgK_N~ C~~>Ν^CMMUI6rw('!-O:[N"?!7D*b||ۍhiLe2"xG~~nҼ( & d2vSx'ͥ۷{no^tZ`DQ4())KJ8q?\;v  Juuu^!nFq7? w7gB2 P˅nGKyLMMFopD:?{%b -[vY2P(D(hlldhh^QYYy===(䨏v SSS髾6LyٱcZ˅``޽DQov_qN:ɓ'9yBtJǝ$BLT*^-* Z-XH(BCCKWW`x$ 4&mFjeլ^zs6jyjSX"FH2K>[߭q'I'A$"MM(1a)$IѨzC%2!|699sȢŋ.7ӧbjbaÆ {X-VԨd6l oل(t:\. 3::JUUլm9qԐNٶmuuujzĦ'?7I 4FuӂV rB>iq'wtq&&&8p,^uCWWt:V^MUUidݓQYhNSG&$& łl`0'B\Ǚ3gطoׯW qJJKkٹs'ZIl r===}q*US&''(b(**B>IEE`sα{nV\)I@<' jX,($bƍ}l߾^zcǎcjCUUnp\ttt055Ń>xiqj'³?m/r@nP@+b~;鸻N}}}9sDkk|a===?χ`nSo0twwd(,,⺂^$֭SKRA|M\.tD a'Ü GO)Miee`17BqqVZEss3LF2\pv b1V+82 SSS:uJ)s,Z `rz9qfFnn.2==lfzzF^4N2.\x͟w…lذ%W|̐H$`߾}9r۷S__OYYs}NۼyYh͔>R\\?Lcc#999hZf3W|^{s)zzz~3gXf ˗/`08?8Fh4Jcc#7<{j>8 oȇ\'K nj(.MF? !nyq'wp'~?w'? 
/,N'Х2 D_?/LWWH,/́h4~zz!2 >B_9998NJJJشi@P(``ҥ,]Tܲe HdՕeM:Mqa4Lgd*ICÒ!:rl&9رZt:Z !%LēioSPxtf=vf 0Nłnjj9z(7hkk#l|gD"?Ζ-[F$ JKKYnwuP}sN"Lf,YBOO=zh4ŋX,'?3<3W+((px+x o###lٲa֭['I1Vh`0d2n:jkkq8ZڊBQjbۯ>:|fEH$oA&a崶^7pء%Nu'L^tw1BA8.B[t]4;nڤӁ藺 p8L"h45=/@AAeL8p>ovwˑ#Ghii!NY>AU6 }9Šeq GVs~4Q~0X}< 6VBjkj,_6p|d2ɞ={xg(++z۶mG?+Vd>b6I$d2^y6mg>scjj #n,A:A:fժUWp1^/'fcppcǎq!>],//tłP(D~~>UUUF8w'N;G%z]qqN۳gzZ;U@h04,,b֦Ӆ%J2Lzw΁E9Z* E ⴙZ,XV@|B\t&C4&OIH"E2΁``qboCCC|;ᡇbڵL&VZEYYz/~ BCCCj`NEFnn.zT*E:&h0/&l6nG0<<Ç Jqil6 \qh]JIVj*K׫@V6\`4t`%V\=FhۻjE1>>/L]]###\xod(P(xVLOO322^S(\pv 'N@Qz-$k׮/,,066xn}˯פP2a<ԻU#IF"i&KL/\?cx>Ǖ ~[SG@׌bh0\x\A8G=Dl,|nR`,l68@ss3333pΞ9K Jӊ1 zn7x\.G2P(uVEA$tmvρ@`{MQ\`0p8l+3gЀ,..NEc [z_*l۶pA7Iq+̻]a,z .]Om S0Pf3`6dFx`@z.]Kp$ &'UH3lo-~0*A>D$D~NL)~m}Y***~͛7x7Ȳb^V$ k~9Eя 2 +*ILNN4d+Po&j8vZ Od:Kx1Ktޙ,ÑsKBKT;Ҭ96\V 8L*ܾ@P~|>X3gΰ[VJz=|s#XZZ"͖ XDQJk36)ƕFF׾ 9+vWV^K@b$ A۬pZT\VՄƉMŮWg'a`z )Wa4 RxAKdH܉lU0.]l6S]]]j1jjjxg8y$?[WͣhpݼKKƫ /@YY LOOS__OUU;vV4 lv%ص3umh6R:oWA)xGsanhm? H]aLܭSYY=+MIR~~~ӧOsa/PYY.]vcǎ[`0FaddyL&DQGedYFP,LF\v+4Xs)&cbESq*]:jLqYMO˥ L]]gVSjZl66mpؼyʒm۶aZZرNR.n0|t:f3wr(tߝGUFT8 L**l$yRW{E8u𳟔*Ͱili-`XeXL$0&Vշ~ܱcVfX,F8P(pA|>]]]d2.]D[[uAa"pY, `rN'@!:bappAKKt0Ih4h4 Bk6Ӗ"NJ I3IgN&YQ1t[-8-ztr\Ahߥ*466ްg.9auM6iӦkzmq׹e˖>9׾q< ǛHP[E"$3ΏeY@iMeQ6,mV,&#D .AXfTBMXf|2\s)&2,$sLD$0F-.#N>W]涭X Y,sa,=H0; PS C[VN=rN&m$w%{nUx9pJ*++ +Sà q\\$2hPZV,48 F ehK?@or#k|Nf V{RFA{A>),a7jZn!G6W 30d`&@hE9=LmS[fȚV6cp"\G:eذ<@cX?zL,ߪ: ?IV %|XP$4ӱ%g e螌S]BBˈäǠWW>D  PX,R(ݤk"t0ȲXb&a2f"bn1K=q,z v]GE_ by# >8Na8+p_ ub}*Ytt:v;>o@G \H=f(t n-&mTx&z-ZEEIAr aRqJ EE$^H3I1I1IeO,1I1IӖzB*]N& iu#8b@/5%.@zㅺ {?Bi IDATWݏϡﺂ% FP+/Â*{`&S >Y`4A q.3:VP2:݊,CKuEFf\N1[x,7RaQ4  /IYa淐_|hr1z&O-Grd  &xkT1]߼ y?VJ5c`:p*x  tH4.f}8P(p8h8x4|zb)w1 dEM4?lh4"Z)//_n4bFuu5۷oG\400@WWb1/S[[f B=z۷'Iioog9A$IBdEb9W, V*Gָ's-fO3:d|>Ed1( *+.7;*.H#]F:W!S8~6p@}3lmg ppߒe^^if\df~ީ.'N15cԺU6Th x8M:lF+ B!cx( EH$ر.^X,&hc4[xzz .p)N:ƍgA%&&&7vRg?"$!IP5Fn,u'/ds"%B ci± XXy .2D)k]5^zጄo. ©Spj@7BmUANwO A;tV UUQUNcuGLF<yΏ99NsʾF'}T $Izo{n,?#*\g}X,s N[[d^])/}={WW w$I 6r6 "B4#$c)y"vK95;uJ#:1\27upE%(gv nw)e6>P;A'a2 |Zlv;-5if鞈ry<@( C0ĩ+34օaƠSŴ;A>5F㥗^??hoox<֮]K}}ߏln,--}lUf23\ LP(ݻq\q4 O?4C2>>oAGG$t:y'366ƿDZMf$Pük2??`׮]۷ 3$* u^u2bD6OL. G[Ys(` 6<<[&?s~x'iY_x>6-Ax ĝ Nª$IFQ0U\Vu6&M12o&X4d4IH} Ԕ+3Xn!趠Q1NS!8t&a1uW[ac'Ue 6l^#D|2N:f& tz[y0ϟl6{n9q&;իW)++'NڊNw}BhG?F#I;LMML&fgg/Iuu5. .BDMǎ:LH<\C$h__ n!-mDH65#}(G.4'pRSu3;7V0E3rAx@×A'a[n@n00mYXk|EB)Ҽ?(.+m4 z&:/!Ld8\pΟoz>U J=UUr$ Ig%[o*mmmݻh4,\.;3<tt:Ξ=!p!cbY4gN" QW_#s%FGGEVQp$H`k05Jg4'-.fg[2M3M3K3Xb$W|1/znZ<C(C8@98*6z0AN,7'w"qwwGe=F`kLG\9ӜL~iMUzT0%EB+ %`4?Z޽X,oBB_,IRu:ݧ d&}177G" LR^^'ɰw^9$vT*Emm-nV91fggtttthȲL$Y pi|6tW,  -26"M3˪G/&hL``;0%*aϺ;*O|~{DIxiZn;.͕ 9;ǥ89=cɯg=t YIҫ`b D^㡪yfffnO2.t:׿W45e$hV9,Q,o$"i&nJfL&#&Ivݨ\n Y:'tL_`pj^o R)6쏎x,PNzM>ҶǟdI4"n"q%Nt#(2F^Ӣghc̦83p8{1jʌ4X_iSS *嘟_&Nn:f3?).\ߏ(D'dΝh4N'FS]7X҇*4%$((FF#pZ-Oj0 FGG)//'rE2 . իWټy3>%", 7 ZYXH瘈2{"ND/8eVMH&v C& =(ػ[~H}""qwwhᡱ|SSZvB M3ݓ gR%<d)<&N~U/ C(q r,,,M]][nEJmm-og%*l߾}ePlbb!Ο?'N P[[K[ۧ+]_&2;w xt?F0L۷W) ٰa -[`2nX>::[oӧz:u)v}]IQv;>ˌWUUzW2wV2_. WU^y= G|>7M~ݍ^8ZV :YywP̎;p\vjkkEJQdlAKT -d 'M2>bzT<@ʞ={>uQ$Itj<j<ϐl@,Dd8q͜3IUmWLі }w1O`~0_ c<طm ` :1O;/^_X,z2L2D L/rq8B"ՠR1P3NKЎbx(J$vf Zr;,H$Eo|~\c_X Er"|XrpKQڇ NĘ /"疨(&ّɩl|m4̂6Aq>qZ(q8[V,YZZ"ϣ( JNm$IBUU l|>O:aNP,WQ.ˑf17 1ԃJߓ$IZֈj3~ 4= \\`t6eT{4mTX Zq-Hu>{\h-'0bP4,ICx)bnnX, jvl6۽Ar3/Ovg`P0^.so8*[W >v$~ǃA'AK*]fʬMd9xw=Gi&v48衾̂"-Nvq\tX9IeY *p*2Nz';, '8?P5 cjgq }4u3wZj>'sm&w!q'Nph"cipU BKlEc LF2R5R J>F| ,2ktN*'M3owϓ&d׹_?7~z=l삺{rH=ĝ ]&KSTM,VKo{rPpݓ tOi(w8L:;WA~%IFU*،E&J'3kX{&.crzubZjBrݰe#4Ԃr_;a5wQX$DfL&\.$ ,$a1Ypk$Ibs7̩y&#z&x4eFxi X XZz ,z?  
$}2t(T6ӿ-,M1?@B4e(O1\G MBKTb Ax(]Nl߾??ܹsOH}JU/  A$!I (-,z7UCld-=:ib; v]{`6K <Vէr??BUUΞ=˯~+FGG&4??O(biikעjI$˶mp:ל,W^,,,vQx< 0<<|~6ƛiff'xBT9 (2\<㋌ϥ:aVXegC56f-NAAVY05*rA+7C-Oo!;4J|7{ `|7+zz9V@K2e1NUkUxrtuuѣ?uA td|>Z[[󴷷/o|Ç͛ihh YZ ϲq9݊l6 pM(F,dYMe@SSfEQh4}Y<ñR @IxoXP(JwE* .o[φ#Iz'tO-9c`&(e&4[XjdIA.Sd Q#k\l.7QIN}+8\*uAupIxdx={A:%ɛN]F|EQ>rGy]vNWWf5ۚfL~?>r `iiB^эeYd2}vo~s6l Q IDATfhhh_l vpf4<%IZBB֥á. seb$] \1N[2rQE/*AA YP5ERXF;7ۿP#c;ONS}Ho56 J /x*)$IBeX,7fH$4 N&m0a) 477'r8DQ"$Fd2x<z?r {6u^0@VJ'R,^))$^w*ʯ;8c4ف8&M<.f4F$D =t'w ܿۢmQ]_Hh;ӤN_D:Gai*!l1oWBuTW)f/p߸A~r>r~Mn|"@QF#$177ӧyg[mjjC_ۿsa_/OzjjjPU^&''q:\|H$O?}iq5n 8}:O@?UP(઄ַšVSO,sy:Fxosi01Nsȣ-n6zqY "$pĝ $I8L* 5ҏa 4L2}4ue")P?`4NEhJ(A{G 4 {e޽t?M["_|>OX$RWWdž mN)//[͆( z{{oxcǎqI?W^eΝlڴ UUٸq#p|;t:4֭S|^8jaf  Qq9'`섍kY$3$*t7Ae ,rix$p+J +;fE wL Z*g-Qzxt7~Ͷcx3GᄭsO-w <444u֏vttX,v*tEAӡ*Lbf޽455aZd2͛q:HD0fa2n h4x<>b0yN$ZZZHRE6m͛;a1~K`!c`tbJ\z 6T(Mųy\ZJE=-wY 6"U)|FLN249'NY[ 1t(uKA3ngNmQZhvvLYw/!#g`#ӻ61 w:=ɐ(Ȳ($ITVV7=ho۷t$ ʾ}nρ8p⩧Xu4p:J?3@KKWzJ\s p$$KAy5FhY`qLF0K(Quhcv䙍%ygC1Rq+B[esmunX ,-F=+( ׹;A,U!PtɵT0:aMXpp/G߇>hoP5uUb.]qWyFGG%b4f}䄻H_ݰ{stGN`*L ǔRO(YMuCIyE?pi2gygbMe ƥue*{8kC/VJA#ngN$Ih::o+MžHgߜ$pڑ_؂g wDž찂Nw_ ;tF?388NC"2|d2d;vaÆ;}I§\+I|Ya0[QH`afB08 =CꨡW^fhP5U`]VZtrSφͤcs]E.1@(t4̩E}F,lsⲚP1N;AnFܻPZC84Ä^9Jk~\]Tv/~O͆ }G}~3A!:HH%j\K.IXvz7>{mډ7V-GID^(H   d5`y ~x`0y5ӡCbP\\l6*x 癟 g !>ey}fcݺuRtQUܤ|> l#8:{~Y>_<$ JN㴲ʶt/MR>̋9œvvl;V4ۇCV. !ߏ㮻HnNBe6hd椒0F:F8x15e ;P!ض}wOBI =(fj! h_ttwwsA(--tF%W2yܰn-QO:-w \r t)|(.PZt&ɓ'`"9C}0v# 'S=\d˃P2ZdU5( &^a&k&'`"'B4t\ąqMS6ɚ5+6^FA(!RB,'A#ܽ>:Z<8 ?ǸXS>-e-AX466F]]Յdb֭'?:)  8LN68Mr6y> apJKք! v| nWX9]a53: rٞ)zd̓}fy8L8,ӗB\M wBi*Kvjnm&"MןF;݂61rѢb6,&#@+:)DQZZZFw}l޼^x& ʛx=>BOt5%>y(7 PRV.sJ@S'i.yG'9y}͗8ya#mMbs^ʲx]: MU."!)K233!33QFFFBT߅M vn&v(1 'Q|QzzXشX85; }TU> !bhfF ذaDJME܌T5unmoG!6 'pKc2`.u֕@VHOM4^'n5Q9ʑ1N)̐bcʖ5n\v:݊= !uZ#HH$PU^^&7Xx( >|.z={/555xKt:yǨ9===<,..qFno[m2< L&jjj 3y Tpds L\.g5JZ9?M@ց#t:0EKp4'Y~+!Ո`wB[ε*Fn݊jE?O8jbۗ;77:Ѹ !n עpzڵ ߏ`< 055uU422Bww7Hm۶a4Fv+?? {0>>Ngg'N`֭C4֯_"6 ͆zkx<ŋZVkH\ z0豮FE;eĺz SN/ 2};7 !nZ:q.^ 8N|>Uzzz!$OK'{B[|;=ot̍w?-@zdcslHu&Y-ɕVx`(JrۇSa'/bd|qwN?K(o^T4k3䧹zF,&O@ !D"ǎ̃>H @}iF&&&"-=If ٤h4}pŕ\CP[[!?HBa(+Ak;FM{uBo#c%j0hå,{411/[o$[lt3<_rOGAVFryBw/8o=/@X 7MPZef( ft@ L58ʩ xy}1ZxyXؔ!?݇@'!nP׺pikk^cqqݻwS^^ύFs([oEFF`<5%%EQx"h^ Hy&eMc.l!ӯl<~-ae7e3044Ķm|={z z!V+W+MAP8~| 09 ttN5rY B'*, t^i:8;Y.3<>(EiͰ6Ál6V ! Zڵ;;44޽{/~Om.##@?|>R> q[͸Z_L0~:=Mh?c\+L5l՞XЩBֲwfE! xw8v.ޫ>q67/pd8u, I!5) !n%z +ibꇿDd^8  D?9R!lT4EI$unC+M*bveee%ӹSƠ(a4&ӑ\4?^;a|ԵBzk? "(/lz~3nBW>thIU1uLz\+k'9;(Sm0J85BIh␝LՊ`㴅J wB[)+ ÷WaI"jO%34oߣSY"q=[);;pSSSq^rOEiX_ FǠÉVX脱K0t[oZa zޔ!(hld2uX lO;M,q&p1Bkvr2|6_$+I wB[ףz638蜩$Z2a%8 }LlBl6RvlDס7gZ!nt+vL89r v !HE>orTA"oS P[>x Ƨa* *[*`0 EQ{=nN14A)zfqqӔ.!NnIﲡi%ܲ[p'թF s֠zBd-wnf|=0fK> qYЩfǎTWWr]fp僕S0w^8?= uprq P_ گbYVf ~ŒikC4ˡIFJr:%!ĊB)=0mL֞A`?~ /2S0)C=$+MKZ 9vxŽ}ZG'b:;sCz?:[P IbFJJJBl⦥']0Z`W tA_\smpnCo&{>P^ Ep}EA44MCc3IOsQ4Qzy}o4r3JY4ՙTf:=f\=F!>9l*`])hgbNl'^R0dD~OB\3+:eeeqt:U 0rssWj*B ;`f֟¥Pw^o={ho leJ[bFχ!%euc\To]QNhz *3S % `f5Fh脉8*XW 9-x7@\^*:c3H:HOfM`֡چtϳ9BSYfF( 8m6FPB%;!vb/3=#<'IhZXi||3gΰ{n\.H$bLMMVj:B(`6CVFrܾYhn&;.ga7ºd,eBFx`2!^Gc6qۭdz-L<0M}]#Q}9 : zX&F|B\E wBq(b2`*GOH|zkazy;+n{Л@!> :::֭[qK+^}UB6lX!V Uca"3ppxFK6" !0ͯcaXX,x\nrTg28ʼnhr/Jn,d\RXF zj IN!@0ѢFxBeC4\"'"^Bڤp'HQ00|f';63ॗ|ķw}^wfdQuwTɊNWN^ijjh4zb455177(Ke( (ASr%(Z _>? 
oAC}v(] 5[`ceru||O`1u8vR}9?8M]clu0ƚ`iֆ{8mwB[beV3߽Uo5x/'!G5 QX ٽ{7o6ǏbK6!nY6J2@eTa.z8}Nk&ٙ`6%wQNNh4b y P08Is3ts3B,/LQ9MAF׆nbH'!n1RBw峕b2bP?4>3,ъ1?Y֢^OB| Bx'ill$*.ݻwSQQRSB\t:p9r]rԞ[gr-* i~zTkB*& Ʉn'm(hgdg5:ChIVd t:OBRRRضm {Q:.YD) Ũʞ$F.=IN!VjƱǎ ^~o;I?W]ݎ՞םez{{9p:TRSSZhF,c||~&''TTTH${ZB^~t@^LMC:ؼZζ@K4CqHBfTTun_ ᯄVT`5CSORS=tDNS>MIFώnl6K'!nRB돯f- )w V~]efC'eddnF# fO4^fffx$n8VYכ!-P2r -!u J'шj紒b8mi3X,mbtMQ&'$+ʼnݒ)OBؤp'łs-JJ{(L;$|0v3I_<4AӖ-[ŋsYbXVعs'sss$ *YXX@Qf{z,..ljb@T^Ubg?j}Q8|0]]]z^jjjx<qK~ß9{~߀7_ ~v~w^"EQ\dq,ktS59b Ĩ뉲>cٗ="7nf&o3ϰd "<3H$ħ i6c߀nx}/ ?Qt]{$Oлbd2z NR=JG4 y ϱ)s4;Aۍv%HB0uN!ĵ4R/NF/%~ȩQ,&kR4 ٌlT1lݺ5Y'cZKϝcaaNh\z|~~Qn{®`0͛ ٳ٫H$8q|K_fqqjkk)(( Hp! 1͸\.f3Hddb``66mڄ!MԮ|Z놠)pF-mp$px ݺIqcV;!Gg}l߃_l'b>ꗸO1^btB܈Rtx^vڅ`0pq~i*t=m6F#tuuȮ]HIIgw}d~~|0477?kkoox<ζmp::BB%@Y1LMAK;) p5 PTYPV}ݕO͊f"g!?8i?@],1. L&-BnA߁bZ:N*nB\UN!Pb”ijk+ _0>ӵwG}v\bE-{F4C}z۽|qqqI"`aaOLLPWWɓ'Z[[ٿ?TUU't l6&''fxhhh~}H$㌎2::J4exxEQxWD"TTT`>p+L<', i*'(p@UErAsks/9L{"+ Um>*xp8g(IBǻ"c8J`(eiTdX }NdϺ+7@ !B|rNÑJ<YD)DS`f%fnT]r&oCn\.~_;G[[)))b1FGGdž ~U+v;z_n~~~o> fF=F?kBJ ]h뀷˯o+—>{7ɞQMӰlXV<^/Y)Tgqgw.Dy(Ρ sԄ#lκD^ȃ㖕B\eN!txk1ɢ88cG}mTp'nn)رc455( @BIMMT8Ga޽<#a U4<^x ^z%JKKyGz"p+V>Nvu֥> ~zll6\.^P]]M^^/^>B\+l%|pas |=Bs'?"mUv~4)c DhR3Qc]4%qn'6*p'bvvm`gP~?zOh_2bn'nn:,mkooUU1ͤ Ey;& os\UU4QfffFbO|b`fff鱹9ZA\+c6 BB!<4bl΄*()J8 9h}} @E%[ i!JQehf1p[ NS08IZݶiچO6{TnVeXv!YB*AŚ#0UE˛ozgW:u/}d\B,e6o͛Ν;D4ddd'Mx6"w令}FønPK0dx~?p%(z!fff(((%BlT59Jc:8qZ7vݻa]B 6+Ҋ+ M&VT3CS霦o8-#QzcMQtp:1W|BP(tUvz{{1L&~9V!cz]a6E4`o#>=189rXBbX#//xy&&&رc?O댏UUٵkwq>whh{_'x%%~jOWkfՖhdƍD>?8 IDATO7t+OUUPhxrrrØfp\ս/[oESS sX-uF^yyO;zBEEַZgtUVVp8θ>t:1MӉ(Av;.ΐd޼yձdLӤ¡kjkku*4TWWK7U.Q c`XX0†}?'fN4&O#3I'N߫:ya4<==`G='"xm-}33lX&}a({eanR1X #4u8fTѥnP ">JN!EUl^Ƹ>/ePnq]Hr39Ź+IX,6t& pbXr46K=}9 2 P,&hڰt?\.N'~n&uquGewKs:L)sPU#/axB!:oT^%}c*|/5!=ӡEsP֓NK.r1qD***};!8ho-x귰n%is`l-b!/u3==kxΡ(]"!-!>aRB eyLjy1x1-Cw>%.-B1a*ćuׯ76f|B Et>y' `x8~ gi0wTWe>G EQ[ z؟ˎ H[aFq8!OB|p'UPy>}&LIIޚGėvW.5,ct=z4HGݻ9q@P(DII x3 !yq] =P3ֽ vFx}G7 .ρ #Պi6.+U!;%^7{|=~OE8زA*<<B|9US_zf#FV lm(ǑԩS:u*H'OoqF֮]K:zL˜1cBt=3t|bh<+_g#-w ̇!/|ޜ}RUłbelQ7Gzyp(lo1=ĢnFy(/{uUUek y0$OW)޺wOgfxP^ NH~\Hun7̟?sRTTij>3<(\r%弱rJN'3gPe:’k~"$Ӱi^t@UyZ}GUUV+|*BL=Mso1vb1hahxŹ!2eF9UӁi|CB\26ӘVNKy)'i)#6opLדӾ!W#n֭[9vTUUQ__τ jf;!8(JbU]yt·![l  0]&@i)|QiQR[džl:hG־0:vQ]親Oj`{߮ ߵDss3---200(7 jjjPBdu?BTeG3M~^@ #6m$;!gpMEtL/n9مhQSN!|_t:9tmmmx<ƎK?X ł)Y!x_` 8z|4U/pBXwNCSUAq&]-l9cGs8 cL0ϋf4V!vRBJNJ3eD+>n|0_P\yp)v^{5oNcc#&L`̘1 b ˲Bl6,W EO۷;g`SfʖM:aXYL |N*{vd167Ehꌳ9̤Aj ]8N${p'O1u+pVU3P~3B^̍P/'ۉa'N`ٲe;0 6oLyy$0 a\>vNа=2 &U UU. ?|r\X,+5}%BKwA4`tq"͊a|4) !S4UuME2$ܫlhvLgrt줡K=*>t:M,U(Bqx`41[gᥗE6n 3 MIq˅K„^64j kFu"ƤSQ.40/@4͡2l\\p'Q„Er,И<ŏ<ŌCzL t=YIQ @Q?4hd20PT_9btAӴ7V qvwSeAvxW}m1TD&ut:N뒮'q!B*e2yXB_T1󟃿G:S\@rt:=oN<g͜8qSRWWP`N0j mwvp'AUEfx_uՊjRR%ck} ?NmcK"=Nl6{) !tMQ9WrءгE&}OOP[3ܡ D֓N---8ꪫضm`6, 7tuuul"@'`pmp0~ە|riN'>}a=:c}(ʮ8]1)9( v9X,CSBϤp'5änh[BI}a\f@:Ee=t}1m44Mc„ ՑN3wBۡvwa 2{G99=ttؼ^5Fp4K;r<ƌ "B>'+yK wB!>*EQ3T;amv#67:ކh ŗPzҩp8.EQ5{\5ofײod7(mE'!>aYO: o><ϟnܸqtB\;4`|x 0hl0j4TU zv4 ݎffᲙ{n`~O%'`4Ay LyEq˅i2Ip'ⓠ( P]B]8C~;A=M!7.gfm('$MHVP;( .3fd;!f*ୗ`bařI~nj| =e[sq32A26a#?/%I'qNBOZU1zؒ04m#^x`LSۉOF֓N>sr5׼uB!6,Əu.$z? TUÁa8.T) }/2X{4m=aƙY` uvqh6}OsBdC׊{r9-rj,޼gp}p w'%^9gKs= FET2t:q:.'qΐBlnԛ/bGesvn2ڏA͈St!///۷B e%Ǵzp>xQhkP? Ə-sE0 <łj!0k`KS$$m=1*EH$vLĐZ9@ wB!ESvEE0Fg'p#p+aYO:-^WRl6!gwសʕ=鯠*3гNQl6ۍFKgTHcQro=mqfU%H$x\$Ol%;!ԍ* ff#&\Oѳ/`MvdyO3;>,[n@8p8p Q ! 
S {; 6/ xNBQBQQe-qh;_ Xp}d#455H?d2'q Q,^tm*"JBj0id>uo-5C'H^?@,dJ;áCػw^{InJBQ(Jfv;,FV3/fNᦛzDV+ct]pPiv1^4Rx2zOW^2asP֓N'Kb1q8K!q:TS%E0z Ľ# p&Ni֬daZ1t Ok%ʁ8NHPPŰliYSJ wB!rEULM$`gchҼ,XkIġ/ 7-ʬP'>'nv>еiYJ!'ft;ӷ`NxxIx1xa ܲ,fUYZAl66[^{csl͍QgvnPדi$B䒢(X 6tc޴Qk6dݧMOOzҩ)Uf!8GoWCK^{n\fT~('>4hyB_cgkw zݭqfVZ*M"F__>@ '܉a';!bhԗ{qXܼIo{0w$$W֓Nr B\@tN;r'̞ y6m_ _; ?r)il6 2I([V]eO%x@c H& rx<)ObXHN!pUB]^/x5 2mo~̉B "ⓥvA(+g>6ſtH`,(N'g:6==d z)ƩFDDQŒcp'b(n ֩sYkH308LQ޳S:&H X EQd+BoXr onG uuPX{VCX,fCTA{ rT QN &)& D\A$B(Bʬ5Fbr&;H#@:w,UMD&|'DE:ĐKKK馛~믿Ngg'0|.jkksBz*+Wp/ZZatYNJbX0M0u$x,bҌ/LQKzrmCɼ'IB(h T;ODI;P}x;U?@AHf< I'UUZ=1| C!BQ2뮻 Ok =µ_~uYX,ݻx', +S)z-.>22lF9K:pBjkk*6+`0U(B!FI>Տ<4`X;hn&XpIf˝Œ02'Á躎~|6 Mc_kMM1z#ii*:DxND"8~8 E!Ĺ@Q2 FG/ H3?_73)+adF0 [o/]]`1y.b|ɤ]]]D"n7^0P'> ) !8Xuʐo\QͿ,PT7?3]#giqO;??8ǏYB!8AI)Cn O> Z\q%LY(łib38- AƎ{b;?E]ɨ+"HljF\.v;,ͥ) !8dکl85jehSRp/>8,=T*tttxrBsC^Ax_@)ٳ85/K ]q8ix *6v9F@XdDP'`TjH$B$r0MUU"() !8Wݢ3mTU(<0yK8ө.a罜%N:źux.{n=JuuuBBq>FVvX ZfK^g=ibhoo'K[:;58FXgyV.RHR288H?CTB!RBq V4u%y(L?/n6IO%:;;H:>f~ &*!fM3úwͯ] SdNzΡa㝝qF Vκ1ҕj+cS!HK4zYW{p'\pL+dH "ɤ N?+w"rt:}i*Nr9yE!ćc 9X2:,΁ڑ?*-Mat:1 0BĪ+*JCK$0A}IWLI$$ɡYOn]eи";!*1Y'ʃ$p-,uȬy+IÇ(((״J(,,v8B!'F7eeZ)x{š.Fݖa( ===h 6M1XhhqRTD"q"dÁbAu#p'✥k*ɥuy$Si֥ROgQF,V XL4lWdA֓N˗/h;w.K,v8B!WEaLn:<~FUgf=e)( a Zۋ݀EΦ(kDx`$ GY  Ha0~ߏxӅG wB!A;sF$<ߗKN굨V+P="S$畬'1M}1B!. *੟f<M?܏SIRѫ*;N+djI]PHRDQRDۍjJ";!R]VE,|,Qݥ?ͬᮿ;L zرcRUUE2dǎ }k!IWS/@_zP?L* Nhj*}}}8$5AⰨ4q3A$f ftAajňC˕yMM@ wB!'`hPr%ͦ;qn4T>s;&tz衇⋩"sJI!D}0{ւSݫpfN ׳vhZL|oK)y>#9;IL]v=l' $tB(ʔ /q^ t #m=nx}Vs]Stzd}ݝ ! ߇QpWC__CPUNII >oh˔<ٙWc%JA^3ގ8߿H$V"H롇bC} sTB!ǣ( ]c^mb?ϻ޹7nk p+>w: l2$%8B!rEQ2 \/X /KV◠9{aALӤ~tvYeBvuQv$菤8ޝd^ tx' 9rk׮%Jlj'9:%I'!g > ,0Üٙ J X,x^Pb1\IE&F0O1ȤofbJRaIwv0PUy>B󝩫-\9>S}Q]'rNlO> .,Y)stwNl"B^y|kP0>=X=zKJ0xX}qVL%}ݔ؆@I1XeszVpMB!rj{ᇡa3}f/+PQvn00 :;;F !/Lu:3H[_Uv C}/q)|n,'O wB!.$* pNW wW/K_ҢS|HY_R!JӠ V^Y>I`$prsV+~]ׇtRx,*T,BÉ{[?w0ŔRA%.NFϘvlR9GIN!ąçgu,Nn '\#*;L!H T!ąMQbs=>v+D0u Y AUUV+%I$:Ԇ Eaݑ([Lj%P[SH&ňxӦ]= 9 RPP@0СC޽GraZZZp\8JKK}{!S↤ֿ/03eivT-S:Рܧc([`ඨcH$H$\!~:::2e o&3f̐B\ M:c8x' B/#H][;̝(bYO:޽NZZZxill$ JI!j¸*{<$;&=k USl:C]JPޜbƫ9؞7@<͸BM~?)ND"  躎*B\vb& UƐsew7WQFCDUWW\q_~9ӧOgL:ŋ3{lZ!Tʊ៿_aspWa/Y0 X,3:yM<5#&K&pqN:E~K#`R1sYt2 n_ذaht!⬦(N&|.xexe%|_ŋ3rLIu\=NJL&Q]<Ƽ* ʚCQ63frɤ"M9Ժt:M4ԩSDQ^/. MӲ}+ǃngԩ̘1M!)qR‘q3R /CY \,p*d~JJJ=X,mBOdMU9XW (,Z~?I(ax<E>MPq*NʺQv&荤PhⲨheS*"H$b1\.E۝p]]]l۶E[n!//þ}I&1jԨY!',S4Ks\VO@Ƹn7knnx~(-ѣ2Y#'Ϝ93B! bQXlv7B!0ZQ4M( tē**s*-†O+GPACH$!H$pج6TM!\(% ֮]f]{CyXF6lSx<9qp8kGu6lŋ|`aƍtB(˜b7V+) pnbpϿƌeCpJeB!'yA~~Cxr)> ?.Bt xx;x 8'RrZY$oOyHo*鎳BBQ swd<<tvµBh]ױjf\oo/Hd6Cأ`5, MQvM1¨ӢR2$L&Ƣ;o>%K7y_b;d$ao)CKg`BaOga]5LiKKaV!) ,$NdY7.i,+?>9?7^=xO2͌?v QWWGNNx^2333g#77%K0}^p8HNNx7Z n9tvmo3O0_Ogýw0Nmmm HIIbyV#G`Z|Qq:atHMV߀pd`ջrf#V\INNa`٘2e)sAoξ} 3c5\,Iw @jx%(e71je۶mkw?P(4SQQRDDDbnq7Au*K'>/!<%fb>駟fX,yꫯ`dffdɒX#""? P8Aw'|N(v;XVN (,&(˰2P7w9ύ%.V)F v19w[lӧSVVFvvȦ c-$N}~8&Ĥv9иKJJ"++z͛7s 7 '""rf^ȡv7Ec}EL/s:|hӰydZ Bt:1L[oų>˾}(--u)"""d ǿ ^y ־~Ô2zbzp8nXD"nqdc3HK0|g/kkt- IDATx8iq>#qٶm{a=z7xK/())zZZZ0L"""e33%˂R+e17^5Hs|^n;d|"<:;;>a{QQwu^{-6X"""2L&BA1:CIV+$/YYYlSΟ4qiv1 SO(3|q 6n&ƍdz>ݻ q3 믳k.v;/6ltqa^z%5\CQQ.nugVfQ>!j2cr`xt`61tw`6IKK;ad"99" 逅sf_{o@,[繠bRbvEKK ==='x2N &Y6ƃAv?vJm̟qO6.wn_qo9>^"h0lXVf3fbڴiD"L&MfZinsK|tZ^6 sV-ӷO0JLwFC>RSSzD"jjjHOOb6رcZADDD.N9<Ug!%9 \>]]]'OfdPaj6zA:QeI4HjܝnBpłNL&>s'p!oL29Op5tƻQ!mذ}ï׭[G]] yغukKÀXv yW;¯~ W@뱘n)99u[I>+s)ɲC}T&;p\j:::wj܉|BJi[g`ZYW:u7ݳx0Lw0Xs CZ2\6=1_.)) H$rҪvB3smحRy܃](dw"""0 f;*­8_ ߏ߁e`ƻRCny^x(`_<@. QZZR>S__/2oMMMO g7n駟dee1|/A8/L&,Xc=6xZZ唗s1֯_Oee%7pðĉ;>muO6ÿ>tPjlP(tQ3 fyv\&WkeN16Rw"""d3%SY&/-$wn9a.' 
~FOEQc]gZ̛7ӧyf}Ǣ( ?dڴiٳ_W\ztuuc/_~>yz]&ߦŋDDvR07 wKCAAuuuSOAI+dSFҸ B__ˮC;q_Ɠ PN1)###ևf:;;Oz, cǎP(Dqq1դ3uKHH !!ᄎ_aڵQ\\LQQQLODD.@Ǜ0( D![]A1.aV|>mmm'jlv $X40); jb/u9Y|=] aQlFN#aDNEKK $%%azL0FBeee7B:t;vž={x|}#DDddy6W >?\s<ϵ[ fCs ҸЙ  k⣚chI/#o7} lj!82( III V+^\Njv܉ng466Ν;رc8p`pSihh`Æ <<߿!=W_*]p˝t~?%l6IKKn+X|455] # /ɕNqRێl-"Nwށ~Z@+t:\'$$0| 0VՊa|_;d2a20?vRƏ?4};\]ODD._+0,🿂 ]o0 "á{m68qbKatoE)4um#_ OBo |.)+gWNgtvimm |8v xvd(uvAK &L)q %&&ikk7,Xrl.IDDe3sdZ{Br>:ҏk̾Sxex`xzQPt^/Boo/mmml߾sjbZ%p5Isc:׀dH_O21<ݪ""r2 $S&퇻+Lcdmwı0&oK΋tH$ؾ};555SQQD"dee®]xwپ};mmm)tg2{`Rh=XG4 M8z]+"""nlAK5_[+[鼨'@+VP[[箻[oz{jk?~:y/GR} lZBRG$$""2P2 >X9/<o"l6vx&{k`߁x9")tIl6(_{n\Շ`Nvf11(||hF&6YR$""2vS/e7âpt'< lHB""""2w|*&`Ձ^N;.uDD"""#QZ*\1gයzxwK %""""gl2p\7-c]}4uA'ܼ};0>| !7ЋF:Tt1}/X"j% '"""ry1.ʚvey|HN5kUr,e DDDF2_*\Z:߂xW&"""2ũ4XW]O; *@o ޥ FV6nO<={B.IDD0@'Fhjwu""""#NQzJsǭGs3}\9`:0V$rrrHHHdҏPDdT3[Ű} <8l$"""r6&nn<a͉9p`6sO;w<%#Lbb",Y0k?'X5џB{IDDbf~nÛV9-0:-,ƸL7Z;3صVx{P$""r3 !W`Fh=DDDDFĘTWLLi3`.&̀ÝpFB'"\_kpoƻ*0 L&Eӹ0vg;LCASQ+B'dc`ܵP_ow7Ż2#3$ 3ݬĞnow+4wC;]Ah@Y 7ʄ/@awu""""<`rn" K|X-fo=eab@&9RNh%W?֊+""""hgژ$SIE}7}߀whE.7n:6f3a>xAؽ$"""r:d.j6v=f!:m:쯁5k`AՂ:0ͬ]{JB_DDΕ;n3xcDFșJvYYX(7ݬB7跿π-;๗!wqiq:0ydRRR0#sd@Y),[j"ޕ7_Mqz5ݼUNOQ1L [90'ޥƅvS\\̼yl6ǻ$<(-/} `G-hj~$"""i 0 x6& `Ŷ&D.vvs;)tR`U +ބ?^89)n&QfN>:paޕ/ox::Ȁ< ߆wU"""")^' l,lMy#TUB'03&[} >خEDDDNm01'.Ie KfWB'`̙p˭P4VkP]+L9I`cc|Kd8M̀-`.uX)t?KI2,j t""""gPeEɴu`; ;?\ca@?VDDDN/;nþǷrǻ2 0HJrd'9U]ǎ%{s]R$"""'2 [<6~M Ż: b/^!(I`]R$"""'2 p:`t6((n VS)vs&[.$'j=DΊc<bZ]\\63c|.OI3]ְĻྛ_ +lJv|>>׋ɤޫĖl"=˲9z,@sgIx6l:Y, @Q"""""岙 %> 2 DDD)l9cfA—`fB'9sP -l:Nijj|.IDDDDDDD>1Jwf‹@WW{eݺu444]I4iIOOgЀMHcVՊŢ\t{ 9N""""""""2:ȐS$""""""""CN3Q`]]]۷h4zD"a1 c;{P:bs}lLA, f^wfϼh4n?vPh4:VF!w}D"1}0LC(C4z۹vBaYT " DDDD¢i8q"hWgg'w怜!۷sN*++)++#11$++zOSS쌂;w :uYg;p`'ƻ3 ٺu+ŤĥÇK^^ސ/ RYYIaayoZÇ蠬^wlʔ)g;V̙sV9h4ʥ^JFFyKLqq'ݹQN B졇}UWW3d=O?4w}9Nuu5>(?0缟^}Q*fΜyV=?|F '?}{gu_/ޥ6~rws%ĥ˗SWWg?\s5C6}Qn~ε˗S]]}ֿ=#g;gu w?wr:jܝ5Ը(t3gN.&L0⎓ļyHJJ:X,s8`̙C˧7n.8̙Cjjjj(((SL 3g\k+((rwpR.QNNGsƝwN@bb"3f̈8̙3:t0:M2s֯=IOOnkw3p82Ίngƌq<+--vӧw ^[^^Ywϟe DΞwF;95΍wj]h: 0Rs CyNnny0syr E1fs?7=Cy[VV֐\k;l?҆lHe-IDATHݹQNNGsƝwN"rFL28ᦈn&M"---e\Ը;7jqwnԸSBIDHiiiK kD;9n46P+""""""'9~ls%"oph42z(t!gw""""""""rQ$""""""""CnH("C* yf֯_lK.cƻ4fٳ5k:uj8S$2;{n|K_|a;磴K/ZILLdTTTx:`+V`Ϟ=ttt% bƍ9rCII 3gfyioo󙈈\ԸFCN( aΝ֒vBoImm-QWWG~~>555B!O̙3ٸq#ǎɈy;x ;wL&nЩ˗SSSjw^(,,o=5DԸt DF1ܹs)//f̓ajjj~0m4֯_/KϟOSS d0x ٳgsem6~ nD"466SOo~%KP]]<ŋIII!--- /5DԸtH\d픕8ᱦ&'77K~~>3f`۶mB!L&mmmDx I&:a{kk+"++\HOOgTUUQWWEDxK,9O6nv%KGQ]]MSSHD;ٳ;OݬY7/i&KGGG*> DF9͆ld:㠧^|>vDBB鴴ngݺuX,8 >Lkk+SL %Hׇbt0L\ ٳ@Ȩp.#GPRRB `ݺu۷O;4w+Wl%''3eʔ8S$"d61 P(4؁D"cXl]T"rj& D("P(btR\\Lqqq+?5D85:ȧp\8ZZZ? $&&b\ Ӊ㡵`0 L܌tƹB w"rwID>Ejj*)))sQ9r[neҤI$''ǻD&dggsQꢵSXXxQvDDΕw"rw$2~֭[Ν;ٸq#'? 
ӦMcĉ3.+Va:::HKKdpx^;wyfjkkyǙ9s&'O&''+ K02qD]/wrںu+~w"g5QѸ3~Y;vݍ餹4"--m۶q7oӦMhzjoNKK x%++tٳgH px]ȰZUV~zjjj%xZ=zzmFUUfkLfsOADիYz5֭c׮]Az{{xկnl۶G?Oez!n&ƌ3,"""qԸ0)t /|ap[RRC~h4J4dڻ <\veiݻ{EIOO~zBXVؽ{7ƍRVVF( 6xa&"""?5NƝȅIčd Xgg'/IIIASSHSPP餷>#GӃngԩ ٿ?uuuB!Biii|scӦMKBB0( {n͛#pvɄ ?~I&$$0k,l7Ñ#Gl'coo/UUU|Gf3\ydeeB[[cZr222addd S"""2:qw25D.L D$n(l޼yp[JJ >N~2n82339z(MMM8v;TWW@00 k1 իW"77RV\I}}=Vή]7ogϦgycǒJGG7ndSNg{ lذ;v '##$&OLaa!lڴ󪭭n#%%|zԸSNdP$"q xWٸqEqm vN']w%%%kl߾T ॗ^b̘1|͌;կRPPĉ ٳ|ry[naɒ%ڵtl6G@ @[[oK.pJ֮]~bÆ <|ߥz֯__O(JJJz]\jܩq'2R(tZ,[/˃ێ_0utt0aƍn͛7MWW|Z EOOݍdԩddd`9z({\l6>ɓ' @bb" ,~?v$33ϹZ>#y===`X藺*}Y:;;رc\~RDDDԸSNdP$"qc999\ve'=vRSS B8pSv3i$/_Nqq1cǎ* 0 LRRlݺ=܃MIIAee`]4773~x>YfDDDƝw"N"7P^x?pp[QQK,K.feeq=?``կ~up'%%%1|vɋ/ȳ>KRRdffvfΜISSpcYwb0k,O<~~Yd yyy8RRR1c7ɓ'EDDƝw"#B'o}[tttp8%%%QPP@JJ /Bqq1v0+_d,[+ߏa8Np\\uU\z饃f9s&}}}tvvOvINN49p*,,dٲeL1w\|>}}}a\.[,|>ILLdرT\|ԸSNdP$"qx||ҥ'|œ9slL2S_i+tww C^^tvv{e1cghX;̞=1c [z 7N^^invnᔏHE_["""rRN;B*PS]]M0tr50ydG套^"3w\>srJHKK2O?ppO^tM;5DF N"28NoFfb8G8aYp!W^y%h>l+̛7ٳgD"""2:q'"ˆFx!""""""S6L&H 4l6g+  Oh_-/B}}}'4"B'rUDDDDDDDDB'r DDDDDDDDd)t!IDDDDDDDDB'r DDDDDDDDd)t!IDDDDDDDDB'r DDDDDDDDd)t!IDDDDDDDDɻ38^ IENDB`././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0 gammapy-1.3/docs/_static/atom.png0000644000175100001770000006124514721316200016445 0ustar00runnerdockerPNG  IHDRx pHYs  bWIDATxKMQZ#GJbD^Iʝxe LL$b &&d e(0 0$($u܎g[g3ޫ@CelrR5/|GrC. 9(;{c@TdFM [O=ruZ>YվK{}g@ϨjaW!+Jyw9⧬r@~ʦJW _t/s@ME* ,V>?ɩ?qQzTd5 >?p2wehf*>E>;ų4+ 9g,@H;]?4w<F18?՝@Hn?qW3acer ]rVH&^q`h*rL#!}dKձ!=Cau p]H}AwWÚq9dtLK/Z@\&JZkr<;3sf(+ʬ=<#_s#!=БMJdw@Qe"i~gmljdWv˗I~|A\F0ػˬ[!!y9l:{(yJcn$W,q) (\,p>P*VAqr3Dt"_G[yyJbjo_|z*P4?,]H_[{AIue;JVK;pd>T Yԕ*YM >"ʠ-'Ȃ}'"(ΦODCVP P_MMqQ^NyɬDI Yb1ŊFj f4MJYH$5I&RYL<;n;_w]|֞n=s*p`1eAnPw:%8O,FŭeoD솣" @ġߟg>Ž4͊*E"ߪeެWh|*hp ĽtU+՟M*Ȫ0{ (yh|][z,f<{5gУ,@;βO,V [54[ i;J>,]\^./رlKΩPf B2b[mJH%p$zonԢgFe6)0nT+SimA#au |M䇳ڷlBi#SY N*W'q IpHݧ@V*w'!nHwW[{ᙺ?B>"!%,IMB1f*oqxo _ H6_tSVR}`4oѫS1{vJve"PC{WV^.rʠP|@-'Dݥ͖ ̨)Jߵ:uz>0hQH!Pj71Xu[vtAG@ JK.oHݤ M/TWR^,+&]zҠʨ?B>%Q4,U7R~*gA1ޮ^oOݤm])e;у]NH%(S;o{ѣ\ʑ)_"76{0kиL݁TͫMat;"##-E(e3Ffn)(QdD^fە{-R`PTF-5V]w8/^> uX1E^P7 00&YDyLI]ܘ:C y@ ~P 00z"2Ͱ޺UYZᦳ,"\kXn:7 7y@n{[W3U7 (A^Xïj#owTP ':.66zrc:4y@Nc6u]cuSo_iBy@.cCu]fΓEF? 
[binary PNG image data omitted]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/box.png0000644000175100001770000002737014721316200016276 0ustar00runnerdocker
[binary PNG image data omitted: docs/_static/box.png]
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/custom.css0000644000175100001770000002032114721316200017011 0ustar00runnerdocker
html {
  /* Font features used in this theme */
  --pst-color-surface: #f3f4f5;
  --pst-color-border: #d1d5da;

  /* Base font size - applied at body/html level */
  --pst-font-size-base: 14px;

  /* Heading font sizes based on bootstrap sizing */
  --pst-font-size-h1: 2.5rem;
  --pst-font-size-h2: 2rem;
  --pst-font-size-h3: 1.75rem;
  --pst-font-size-h4: 1.5rem;
  --pst-font-size-h5: 1.25rem;
  --pst-font-size-h6: 1.1rem;

  /* Smaller than heading font sizes */
  --pst-font-size-milli: 0.9rem;

  /* Sidebar styles */
  --pst-sidebar-font-size: 0.9rem;
  --pst-sidebar-font-size-mobile: 1.1rem;
  --pst-sidebar-header-font-size: 1.2rem;
  --pst-sidebar-header-font-weight: 600;

  /* Admonition styles */
  --pst-admonition-font-weight-heading: 600;

  /* Font weights */
  --pst-font-weight-caption: 300;
  --pst-font-weight-heading: 400;

  /* Font family */
  /* These are adapted from https://systemfontstack.com/ */
  --pst-font-family-base-system: -apple-system, BlinkMacSystemFont, Segoe UI, "Helvetica Neue", Arial, sans-serif, Apple Color Emoji, Segoe UI Emoji, Segoe UI Symbol;
  --pst-font-family-monospace-system: "SFMono-Regular", Menlo, Consolas, Monaco, Liberation Mono, Lucida Console, monospace;
  --pst-font-family-base: var(--pst-font-family-base-system);
  --pst-font-family-heading: var(--pst-font-family-base-system);
  --pst-font-family-monospace: var(--pst-font-family-monospace-system);
}

:root {
  --gammapy-primary: rgb(224, 10, 52);
  --gammapy-secondary: #1e254a;
  --gammapy-lightgray: #f2f2f2;
}

body {
  color: rgb(30, 37, 74);
  font-family: "Fira Sans", sans-serif;
  font-size: 15px;
  font-weight: 400;
  line-height: 20px;
  /* override variables of sphinx-design */
  --sd-color-secondary: var(--gammapy-secondary);
  --sd-color-secondary-highlight: #666;
  --pst-color-secondary: var(--gammapy-secondary);
}

p {
  color: rgb(30, 37, 74);
  font-family: "Fira Sans", sans-serif;
}

/* navbar */
.navbar {
  background-color: var(--gammapy-secondary) !important;
  color: white;
}

.navbar-light .navbar-nav li a.nav-link {
  color: var(--gammapy-lightgray);
  font-size: 16px;
}

.navbar-light .navbar-nav li a.nav-link:hover,
.navbar-light .navbar-nav > .active > .nav-link {
  color: rgba(255, 255, 255, 0.5);
}

#navbar-icon-links i.fa-github-square::before,
#navbar-icon-links i.fa-slack::before {
  color: white;
}
.navbar-light .navbar-toggler-icon {
  background-image: url("data:image/svg+xml;charset=utf-8,%3Csvg xmlns='http://www.w3.org/2000/svg' width='30' height='30'%3E%3Cpath stroke='white' stroke-linecap='round' stroke-miterlimit='10' stroke-width='2' d='M4 7h22M4 15h22M4 23h22'/%3E%3C/svg%3E");
}

/* This styles the 'version' dropdown; the font color is white */
.dropdown-toggle {
  color: white !important;
  font-size: 14px;
}

.col-lg-9 {
  -ms-flex: 0 0 77.25%;
  flex: 0 0 77.25%;
  flex-basis: 77.25%;
  max-width: 77.25%;
}

.bd-sidebar,
.bd-toc {
  position: relative;
}

/* pre */
.pre {
  font-size: 12px;
}

.docutils > .pre,
.external > .pre,
.descname > .pre {
  font-weight: bold;
  color: #425bab;
}

.sig-paren {
  font-size: 12px;
}

/* links */
a,
.prev-next-area a p.prev-next-title,
a > .docutils > .pre {
  color: var(--gammapy-primary);
}

a:hover,
.prev-next-area a:hover p.prev-next-title,
a > .docutils > .pre {
  color: rgb(0, 91, 129);
}

/* admonitions */
.admonition.note {
  background: var(--gammapy-lightgray);
}

div.admonition.note > .admonition-title {
  color: var(--gammapy-lightgray);
  background-color: var(--gammapy-secondary);
}

.admonition.note,
div.admonition.note {
  border-color: var(--gammapy-secondary);
}

.admonition.note > .admonition-title::before {
  color: var(--gammapy-lightgray);
}

.admonition.warning {
  background: var(--gammapy-lightgray);
}

div.admonition.warning > .admonition-title {
  color: var(--gammapy-lightgray);
  background-color: var(--gammapy-secondary);
}

.admonition.warning,
div.admonition.warning {
  border-color: var(--gammapy-secondary);
}

.admonition.warning > .admonition-title::before {
  color: var(--gammapy-lightgray);
}

.admonition.seealso {
  background: var(--gammapy-lightgray);
}

div.admonition.seealso > .admonition-title {
  color: var(--gammapy-lightgray);
  background-color: var(--gammapy-secondary);
}

.admonition.seealso,
div.admonition.seealso {
  border-color: var(--gammapy-secondary);
}

.admonition.seealso > .admonition-title::before {
  color: var(--gammapy-lightgray);
}

/* smaller font size and striped rows in notebook tables */
.output_area {
  font-size: 12px;
}

.output_area tr:nth-child(even) {
  background-color: var(--gammapy-lightgray);
}

th {
  background-color: #f2f2f2;
  padding-right: 8px;
  text-align: left;
}

/* smaller font size and non-striped rows in API tables */
table.docutils {
  font-size: 14px;
}

.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,
.wy-table-striped tr:nth-child(2n-1) td,
.wy-table-odd td {
  background-color: #fff;
}

/* main index page overview cards */
.intro-card {
  background: #fff;
  border-radius: 0;
  padding: 30px 10px 20px 10px;
  margin: 10px 0px;
}

.intro-card p.sd-card-text {
  margin: 0px;
}

.intro-card .sd-card-img-top {
  height: 52px;
  width: 52px;
  margin-left: auto;
  margin-right: auto;
}

.intro-card .sd-card-header {
  border: none;
  background-color: white;
  color: #150458 !important;
  font-size: var(--pst-font-size-h5);
  font-weight: bold;
  padding: 2.5rem 0rem 0.5rem 0rem;
}

.intro-card .sd-card-footer {
  border: none;
  background-color: white;
}

.intro-card .sd-card-footer p.sd-card-text {
  max-width: 220px;
  margin-left: auto;
  margin-right: auto;
}

.custom-button {
  background-color: #dcdcdc;
  border: none;
  color: #484848;
  text-align: center;
  text-decoration: none;
  display: inline-block;
  font-size: 0.9rem;
  border-radius: 0.5rem;
  max-width: 120px;
  padding: 0.5rem 0rem;
}

.custom-button a {
  color: #484848;
}

.custom-button p {
  margin-top: 0;
  margin-bottom: 0rem;
  color: #484848;
}

.btn-group-sm > .btn,
.btn-sm {
  padding: 0 5px;
}
/* intro to tutorial collapsed cards */
.tutorial-accordion {
  margin-top: 20px;
  margin-bottom: 20px;
}

.tutorial-card .card-header.card-link .btn {
  margin-right: 12px;
}

.tutorial-card .card-header.card-link .btn:after {
  content: "-";
}

.tutorial-card .card-header.card-link.collapsed .btn:after {
  content: "+";
}

.tutorial-card-header-1 {
  justify-content: space-between;
  align-items: center;
}

.tutorial-card-header-2 {
  justify-content: flex-start;
  align-items: center;
  font-size: 1.3rem;
}

.tutorial-card .sd-card-header {
  cursor: pointer;
  background-color: white;
}

.tutorial-card .sd-card-body {
  background-color: #f0f0f0;
}

.tutorial-card .badge {
  background-color: #130654;
  margin: 10px 10px 10px 10px;
  padding: 5px;
}

.tutorial-card .gs-badge-link p {
  margin: 0px;
}

.tutorial-card .gs-badge-link a {
  color: white;
  text-decoration: none;
}

.tutorial-card .badge:hover {
  background-color: grey;
}

.sphx-glr-footer {
  text-align: left !important;
}

.sphx-glr-download a {
  background-color: #d9edf7 !important;
  border: 1px solid #bce8f1 !important;
  background-image: none !important;
}

.sphx-glr-download code {
  color: #3a87ad !important;
}

.sphx-glr-download a:hover {
  background-color: #d9edf7 !important;
}

.search-button__default-text {
  color: white;
}

#pst-version-switcher-button-2 {
  color: white;
}

.nav-link {
  color: rgb(119, 117, 122) !important;
}

.nav-item {
  color: rgb(119, 117, 122) !important;
}

a.dropdown-item {
  color: black !important;
}

/* Change the color of the font in the navbar */
.nav-internal {
  color: rgb(209, 213, 218) !important;
}

.show {
  color: white !important;
}

.btn {
  color: white;
}

.svg-inline--fa {
  color: lightgrey;
}

/* This changes the color of the dropdown itself on smaller screens */
.fa-bars {
  color: white !important;
}

#pst-page-navigation-heading-2 {
  color: red;
}

.bd-page-width {
  max-width: 100rem;
}

.sphinx-bs {
  color: white !important;
}

.search-button-field {
  align-items: center;
  background-color: var(--pst-color-surface);
  border: 1px solid var(--pst-color-border);
  border-radius: 1.5em;
  /*color: var(--pst-color-text-muted);*/
  display: inline-flex;
  padding: .5em;
  background: var(--gammapy-secondary);
}
././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1732615296.0
gammapy-1.3/docs/_static/data-flow-gammapy.png0000644000175100001770000101444414721316200021014 0ustar00runnerdocker
[binary PNG image data omitted: docs/_static/data-flow-gammapy.png, a matplotlib 3.0.2 figure]
& ܌B9QVGTtr|eee?tˁ iss3;>>JV w1T0_"4?~9[[[#dz8m5Q:eY.a7P?P,#VVV"e?sS^3Z-a ^Mh͔^H$7n -FNRr| u',Ǔ'OnKp k 'tF݌ \\GA?4M-bءn Q:U.ϓ$b8+`P&jnn.$UCDpt?fWMГۏFT |h 1Jg8V]mqqq\A'F?eYfɈvFT ׍Q:WVcoomC_: =6Uo4M;st]`"t * zv֭_x1JfvQ%\WF NRi;3GX:Jhw~zz:JfnY& 3~q3n裏FX 82`*Jhu}NRA%\.a第`VT*^1F\]\4M/V,{j_0>Nx\/]ӥP(tLU*7B4?D\?O77.!3Jn3C/DQ,׿u_˂j*JDeY}6jZ$II͛7Ν;Q,]*\}za0j].ǏGT0'fKtpAۥ JEx58r9,mrߏ|>?Ra`&;q7oFTUTŋq||Z?~$0JrjnY=.KxuuYF߳Kj8<jdYZ㷳F50ןQ:6\,2 OyŋEV(gT/\v'4+F}I `܄'/teY>Vu'<L;D'iÇɓ'L/+?Q`(xU.+2JѥR5<CQOwI0r&KeèV.zb 3^ւ&'tiiv`yYj8 B,,,|eYx".O}?8>}:Šr4A_z9H 10<1pO_KR$I2jef`BO>n/0 BRt~U2$X]]mO?4={6ªGxbrYY&$VW'<4]T>&NNN^\9KX,vIg4$z#`D __BtBQ:;88,n6+ٟů?o~~6-WP(tߎ/]ӣ[Jύ*J/byyy __Ŀ6:.gfs^ZKL??뿎]w9W%w͛UgݒzRi4w܉Bu;9L aaVn LdrwjWøMГ,˺tܹ3j):\NNN\ OXbs^eq||v{:L;W_q{%/a\&ɗ_~q{Ppp $xCo&2Jn0Zo~믇Z׃ׅmoߎ=4*"biiiDp֍7:n7Qw014?O]afX,ƿ7fe\ Kp迚~uC?KXYjrUծGT Nh tOJ_|ElnntT:agNo߸qchb֏? K0k_]Ne$srrq{.riXR4@Rrrydǥۉ'I%,q}EXYrj}:~ov}TB4IӴ}و*}Dtɓ'#Iu~&a(*WZ-cggghVq||9 C?\ GGGy?vôw}?xi4M  cLa f˲dH4byyy(=IXBDDVWDښ.jxA<~xhǪAX dL>a fiOJx!˲8<<Ihbƥi]g-liiiUMZvÇcmmmlwwG}4}闿e,,,K\CFd_įka fK44=󍍍_ua &ČJ4]\2z{]ގ4Mc}}} 7&2 K`ƻ;2`_ N߳Rkmma!i7|^)W91鯗/_ƍ7۷pYB!bjo޼, I="(e9_M^44M#EX[nEP|>|p?6y$fGgjM$IKO?'> hU?\р(f:JuyU(-L "⧓u N$i9I? 8>>nm,p} ML]`p|}W r9666777T* ދ^(wA\8 u{a^mmm@FZO?޳uk===r2`mmmE\n<.m.cqgT_~bHN IDATӞvj5#M񍈉>OS\/ߏr3$˲/FDS{T* ui97)h_(t4/_FRiI۷ې>r/^hz׺hPk$Q(baaa FߋI20j~?8"%˲؈Ӷԁjww7}݁4/1FDlll4 N$677omhh$I<{ϻ&:j>l{50ZRFr9n~\w_tV*[Jmu8]\.WSR뱷ǍaYT/hj$I|gmg u0j~T~w-R(9}'R)VVVz}AF݆;kmjztt{{{]k|qms멫fks\|'\ ]\\5w0:/vmgT*wCɓ7mfk'mD {BEeAֵԔ<>>h]dl'b|ᇑ$ITըT*qppTkeqtttf.kz9k^wƍs?k LJW1;g\~qzztU7ʀnO,buu͎QoV՞Խjׇ]g0T*ö["β,>|tlئF|>FX,tw}Og_}Օ1vYjC\w}wnn;Z->|SY!^izөC gMKIZPf24eYӛX,Ư~xw۷ǫWo?uoo/^86%r\t LNtQ-IX^^'?3d8u&zM=vb 7.[Or\|7M`8{.:eY_|񅛬@GrIHduVwuu\c~=\M`Ұ<rxnkꍍx{xa(4Yg/Poӳmz;ZIxu6aI{Vڥ{rrXN^.#9:@Pxw(JB!$GZyviiw3ɓZ&VWWVEX Ӄ=jZz?qإ\kZ0ӧO{J\^t.(ǏS׹Z>è>UOJQ$DZX,vpY__o M~׽J?;IbP(ȃ z m]󱴴wk AEkq.kjZ]Fdy5eJ4Ceqxx$IO^] "bnn.#Mގ_W}U?M0[ Mދ[vZ].5Z_SHZ6M;ۦ-=eߋA L\?wVO!ݻw:JR)?~vΝ;n?~S|}*{u w^_~g //^Tt۷GVo;N`|> B|o޼4VWW7Mlmm5Ul=2A2ߺz"yn7 ||gmGa ˨|YO=z[giX,W_}멏nOԎcX,Ƴg>i? vA!uFyLӧ677udw}ݳ}Lٵ?5EtvvvZvϴ񻙄E20yڧݵ MT+377kkkeم_j5{zM+g}3(DDܿ;~êX,}G}KRNw7olzz[JxYF:>>w|Fl>M4(&Ɲ;wjd4xQ<{,={+++M휋Vzmui97{/R*zz|gMkZ&߸HRij7غ$Gk?Yf>ϟwP>>>x_u8<sXoR^Ek#ϻ r_|$izot`)Mvtt&.:Y8^L|>M'ۋ;w{ni`2|a37MX]]:zW fYvagܤI5<.cj󱴴ԸO{n˴vZ 0vE݆1־^j裏z~ -,,4FΤ5 v.zZ&n߾}7;wԿX֭[ptt}cyyyoԓs?}q\[kZ&֠:ć]W$Q(_[ cklRI^H$ciiQɓ'}(ǰjzכt6uNښ=z3Ǭ>soRB$_~\.sIRiz||/_{In;kܟN LkwYɲJp[V=Q4hRFN w26)u*y&8NL%DZ#%|>vvvݻ %41` 1oNڍI~mׯ_vFfST:իsLqy氎ߤ[n5-ߩ[nRh㢾Ij;_$|10ypUhLIYMH)y駟^jϟO %b8u&΅"ZguI=$I<{,qyfhSdDt}R,nR Βa[oL,i3Mokz_NiVVVBV:~|Lzz:&9}/ ui; L 3n޼?u߿s2lVVVJF'=-eD$FUW1k=Y]#˵ LqZIx"4=7K0+JM~̇&ZrwTKC^v(LC*7-ѽ}vذ41 e&.`;iCOOOcqqq% ܽ{:כ>]otN?ӺҸP˗/_VnwzMq|>JRlnnW_}O<9>Suh]unsQwZonMuh\.JRN<{,?viaց3LB[eXQcҋnjh~]d$|nawT0QL33nܸo޼/2JҵK-..[+'|Kv\$7} NʴX,6}ɜYl"">h:z'\ 8f0tywTV5Q'(\]]mZP(N|WQ,#Ic2 5@'ר'a$pYê}I/^~xP5ZEiiw33/Gh\.o߾KTL{ŸJiOOOo\5FU~Zo^;kRZYYiz|v4Ak&. 
cdr +ڡ>QY.^T*ӧO.7P#tw5=ٶ8IOai?&dY4 EάrrrkEmiibܟA007B-oY2~aII$$3L%x3qR]_պd.6B; Imoq/ZziK2.$Ʌm20ut_4=n=~Wυ0s΅ϛ6~$|΄&Ψi]#I)˲,vwwQՕ$IӉ%Ms%41*fm&(˱ֆ,Vwaaq?krQyv^G b-~LC\_iݻwԁ>M1q`M1-eY72q4Mbv뤷{ LP LLh Y MD\LG/)Z׳m(޽^Jcl("Mlt'ae,%+JjOkA^Ě"sbx-ﻬaKҹ^GEgȹꨝ~טGPW؈K?{u6e3vڋi97MǤZ^WTjzrSZ>긟Q[8WG70y|&Ψj3?nk^/|>n ֍؈4Mvww7~6F kѨ:{V5>DZnjVk ,4Mckk+޽{.Soi>A#XVa4PŢ/t` *Jlmmos#: B쌩5oM5q; 2 î͛M[í.5Q9;8޽[[[= ݽpvN[@=/ˍ1|4cv[Zsg5=xܲ,6-x޽븟Q[?BY=[İ?&O_I2QW?影O>b1vvvk*mQU*νe7z{U*/Z$"AIKQj\I4MeYSnPLvRKKK177|>jT*oqŋMׄQ*baa!#GRs!Qv|A6Ǜ7obyyql僃Kϴq֭u{=G5^˗q||I Fq7߼y3y睸qF}qߺx޽!|<~8>|^x"|2^|ٴ/cXQceYƠ,ˢ\.n˚FcBh]Ԧ- ѣU~,#y&j7c{5eg&iB!3kJ4 L>Oh4Mcmmme]XƗS}Yloo4 LPhiu--- 6,..[jDLqRIrIa}}={yJ$IbggK@GZVb1>裡ݜ.}qwku͛7#_au֧=Zu\c" ?Fzܹs'εjZϳh577O޽Eiii8<<$I"˝4S_n70y2M78nV}wd`'''ƒ'I|,,,bxcV\.+++}-O16rk>N$VVVT*u]b"z<~?8"&nϽQ4rׯW,caa)L[.om<Qֹ^jyXHkI`M"iI: +(ntn^۷o:T"HD{""$nrt(aq[x;C5G:,TB6~_.\0&ڷo=6?X^﫫Ӯ]Z*[4o[c:~a5VӧO?b1]|Y/_֛o"E(K@`YI@ 4U}ĵs='|Rv]XLHD|LO]2 joo_RKK\.$=ߚZ$@4 IDATU&:;;|ɓ'5::ӧO-@6~_v~/y Q J HD/_6=ݭﭯŋקE uuuzp8rZGSS5;;[0, .\0-y'?> o]h4mŋLPI ZXXкut:a ~nWKK˲C bl{Y|4UkhhH`XաC,'Xzjnn,\+晙}_/ȈO-&ݻW[l1^)cyr%DLǢHs_DÇGI }NSr:jhhX=ܹctW^. /Nooa?PR&njQ$X zzzLMMMjjj(EjvvV422")׫m۶Y61`L ڰafggD"KݻEntttQ^{͢hI]oxHH6;;+߯ݻW @<7hnn5}aab.˴F+]O__irޢhIqڿqݽ{W3u255>~T䖈*l6l6C$6nW,3;}:;;- iŢHIcٴm6m۶MpX $N(& $K.jG… ƲulttTn"b\.k.|đh4?kb׭[D$I/^̺H$^~$.ǣ&|EAS$Q$1JF}`F߯@ `IJQr\ڵkgTFFGGDݽdtqv9Nv566jjjjZvkIv+TuW/6l`at@fմH655e<^ ?P>ۧ?TmѣG6Dե@ oHD~_v]s#襗^2ۧ'Of}_GG~'dlQskk񸯯O.\0lIhTO6m?Y,S P Ѕ ֖u@@XҭǩU&FGGuҥqOOy%'Ѩ~.˗/io.P dOVX,1?ѣGϜη~h$ҽ'1]|Yv]mmm'T\߿_7o6=wu֖jnP.?Y544dzXpm166=jooWOOϢjK/-9,CO.ȶC]]]2աC(OW_g%]}}}9o9Rn/討;>H$cǎtlJXg{{:bpBKr4{b1}79PiP1^xFY  qI-dJ)qVijjRgg:::=2s=]v-N@ vs>OvR]]\.q{r~EѴ VJ$NթQO>d7XTUr\FF֭[466&Ix#c9 Q6¨ϧ3gΤm,-\.|͜;DQSbn{キh>S}߯*&x}>NS5H$v?>& Mp8tS;.EќW wwvvfM 8\'ڛt߿?S+dڿC ~%Uʕ+K&#$;~)T1k<7|x˱9::jZ>rHKJD)eՂ @ٶmi9 Z Jpq]rEW\Q[[iR4˗/_3::j*֖s;Ԓ'ohԨ =]nedÇgݿC6' )[;bgR|ΝԶ+bu}&ЙRVH>If=n9Nύk˖-Fs aÆȈfff4;;+fTKMH/7wѽ{ 5;;x<.өZmݺזV8V0񸱟nhgYfffe+-I{FFFFFFftj֭c%)㸚Ջ/ӿ;@KommU47߯%K_tɴ]q|>ckIPMMM9 I VLG}G58U3-ST?iX7V1g.Fv`_fLJJ=~^!iPnFFFp8`S<7ojc~O/(ݻwx:u]M333x<.IϴyfUWW[if(b` x㺠PI񟼯$=82vs:455a566Hgvv :C---FrMwܙ=EE]صki@`Y)^{-޺u˴Ku\ ֙EHR$z7r\ J]#ňNl>c>pv{LI.i"I ޽{i:w%1R!y{^;w矛tNguz333円$>3}}KuVNB3577k>F;]R,X>38N8p ye0PuqfN)tΪWJN 9N\[zoն^j{믿jhh(YjkkMx<TĪwq=0=.j71::jZ^djcciy``,&v{m $s|Z2dDLcccLFsNU,VrF!RjTOc{nu&'פVHp8o>S%.]pA;wXc{"iPjkkM˙&''G-wfhG}L*!tRrQ2`899E$>ӱcNJY~J9(]E ηN:\Ν;η^l٢ 6hffFqsnQL###FL zV<䤂iwvvVwztںujkkU]]vhaaA/BΟT]tt:z2Q- &NjGYT"Ycw9fe;nݺ555% gfftucے(6ϷhoۢoִduZ JHmwD]͡I҃ W"b"B!-RKܺu43$^JuI X,~(˥F3&)ݝ333iOȷljhh6%( h~^y啜uV:R)АN۽`kpu0_&?Guu_=f%-?VrPXV2l6߿?j+{zРk׮R_yajf3Rݻ.`}"Kr{|Xm%Z%KW fi۶mںukڪ|Ő|.ZnIҩl6ۢ?\+j=-O|LHu!u%浵&S>J 7B&X-@% ݽ{w|(u}=zhӣִ2p/4_ɓ'sZ/HKM')Mm%R_sNl6DiAb :bKdb ڲd!=Hz 0U_r𯔃RqZIf_f,HM(Drh퐯DB%~Vz=11d&v|>t]rE|:;;KS9Xg /_ֳ>W_}U/^z%Su6555V|izzz^ZG`MԁRU$S|Jvkw qx#Y9TAdXJvv/bNT˹HN}]9Tc"Rwæ\yfc9[BIur)G yK \nH~ I o&mNJC1B% kT__h/Kѷn[WW3gO?UggZ[[$%aqeu}F"sՙ666('Wp󩳳S~.\X6"&+u`tdKU8R9Ęd]-R?G)z` 㦁e){ծ'l޼9j6͸DLy䑜^ .\lݺմ)TJr9Fe(npHw5-N¦VHE{sy9H,Nniôϓ/'i95VoZŋ)KLx\}ǮnW]]'G6:sgz\3WVrxO]~ -r駟+M;Z S/+sb8yp8za``jfft455>,'7[ʮPts~)նr@¨ <6l`$||C5C`%"ȊZ]DQӄZ>á:#I qWy>3-ܹs1*e;FS/pcc9~v VKTH>^]X r.m- 2%Wc~z˥'O.7ikkS<66VV4JB ̇1b\KM~Z  U1Ԋ5PM1J)/J#VV[JjRݻ_XXXZB ^κS]m~7z}&_'gkZ'(x<@K*8Ӧ{.=jZΧ$~ 0%6455tPrɑ#GB{wL"x7.z.eDj;,uS,c)~dR&)-* I_3A033p8q&Kb5aB*j0LMԻyJ9(LwJryO'H<_*TJn*2+ .ځqmmm^@IL^vttyOf.vgs甬Faz.CBDhTw0UD":|^H҃ߟWˁıC鹱1}Fjoo7U HH>rMYiD5\Yc:}>c㞞!|K걶j;k9)͛7p8=iT13Q]2`9*5 UssN׵s1˫ӕB$于Rn fCCCZXXPss񚙙~5]߁J|{o>>{ϟ?˗/kkkx;Ccǎ\=E$Q{{Ç-u&#%N<@ `|˗/ZuE"7Rޞq&K.KǏ_Q}}:;;M@@Rgggmzzz*i֭[ZZt˹Fd~Iuuuڵkעc(ϴݦ&*M䉤 @6-U>3c1f'VNS֭[pUBQ2`Z>G)Vk$LHҮ]_~?uO󓓓k>i"u,gdrrҴz[m[n233c3֚k 
w^mٲdrIܹS.\X4yr-kĤqlb3&9s&'W|._,%ݮH$b`OE_R__nSؘi,%1 Nb?%'#j! Ԥ+Wt[nk틞O,Iَ|u.tBJcFr\9'J$Kİo߾8H'dv;ydz֑4(zp>kLQ[[ +Xe0DXM·lڹs6Jf)+=N2Zzͱ_iy뜖 IDATRmff4eddD҃cСCQ0T8o֭[PQc @6MMMx~N&ɓ'Ozpv]mmmYX$o.Kmmm|JJ}}\nѣG&$r]o]]J4!=_Hr'OX&hJdEKKK*.Ki]Jc>F:v؊+e$[&NR<[l1阚ئvS<_6myL^ܹcZNWz'L 0XR5G߂jܹʐ| _~޽P(dnrب]v,^/^T 5;;X,&e2HĔ+WfLvy<mݺuQ;wn73M:;$&nGGGu- 0^۷ok_$ϴ =JKX4ɞri׮]9Ñxؾ}0Fua9ݭӧOmCŌqkO6}ӑ#Glh266( HH8"~B"t:ؘ f$M*Аi"fi޽ߓ:qK2B#TB+Ur0̵ڂ6%KW\IRz9iO666rNPO܉v3^Ôj[[n5]X;~I[(bkjj*JǏƐ$/קmgPr=GM:;;1.G!~$Çgpnmmӧ*tҥ%wa rܹ3{rڜ+P 1Trfb!N#b0תӅ?ϑ4Zhӑ,^Buu6ol,UkffbCe9Nc޽{x>!fl{c4{G}d,_~=HZΠ4rLMM{K~z844͛7/H05~6oެm۶iÆ yFFFVZ\lذA>t pun544V֭̌FFF%֦r[^Ww5]jںuQmrr2vV 6x\ze4;;qtf)zŶmopz駹6 Tbc^pL$MJfrrRW^]wvt:w޼.Lwahvv֘lDq0\ JVJՊ5[CiD%xϖl^x<.әsDgohhxck/AC__ZZZZG45!I1"?$MJ&/Bf1$B$eJ#?W+X)UX0ܶmE0T04(OU 6_4%NLMMTt.j#X+IDtIn[/~N缉եD큁ץ6MO?K-ͦ_|Qׯ_ϸ ͦ;wj˖-f۶mYo2p:zu rux7 n"$)ĉQZ[[ $[x;˶ M̹JLBoذA[lY]'x4iܜS￟JRŸ\xTLpgff0L~d%mDl٢S%cǎ4'hrɂ>0s>3%]£>ZPn$*af֖-[\p8q>q:VCCC嶭DոI}'uV599i, ox\n2Z,8/zz ׻OBԔܹъD Pj:v鹺:,<׷&Ç 9 iV9*ehhHƅ"1VnۘOXHuP+ŎcxS+;(999i &Wp:ڰare_)wkFjrn|H `z cqBbDD2.%ID"i_#ImmmT($M*@%X;ZŵX#r+%W+t:zW cquԤ^{ME ˜1'Z.ǖ-[ %)kJֲǏ/4::j.at:UWW&ܹS⨑ Vx<^.7Yavx\w{x\###  @9Ė-[dٌJ~_m۶?99^s^1߿ wYi>!` E:裏L֪ap8;w޽{^k!pV Xk1xXk%*F$*N$lr:ٴ抨2uO?o1=p8tR_FM=կVo &`c`ZА`^qz駵aÆ"EUX\w._}^~e$i&(՛ܬf+krrT]ffVn[[nf8j!iUn˖-ڲea@5J( csd-?k{]}m^MH@YxEv[^W')ݿͫtK iI$M5 &4$&DXHkI`M"iI$M5 &B۷Muuu*{'&&t)IݻuĈ:weQr1cP>_* i~~^ׯWMMjjji&ر#n޼)IQgggBHp%;v0P"^TP(En߾k׮񨵵U555ErCք۷okzz5z188cF{B!aI,,剤 jw6JĉGRsڱcaM 1sssvqƴygi&IʩV&=FP(%'MT7nO8!dzkڪ={,A[Pl0_tHJ($_>cD2" z^Wn[VkzzZ4??oq$d$M`M8x񸧧HB"J-WVQ(^ CӚn<yUUU-މ ={Vty^MLLhppPÚWMMjjjtmܸѴ7nM'8x`NW_x<ڳg abbBN$=S:vXݸqC׮]$jǎ|/^$رCw9Z[oe:6֒FaI2IS?X7ok}]]]icW^HqcRڵkNֹsu1y<%דh+077vR6q)z۷ۺ}vI9={֘hNt^8k׮ippP555ڳgO 'NX g ͛7uM#q'+/JWHqF=S}1Q?sssFFy<ԨJ`P׮]Ν;U#<vء*D"1ҥKT(SO=FIW_}eJvN8v;N2֗VMM&&&.^$Ora՝{7oڵkY4yXT*!K߅3mۆnc94SO-9ម B)nVrgU2Ν;'wA+C,K;voӉ'*ɦ\u$WgϞ5Zرc_y֜mn޼bJnaknn.kjj׿8ђcu$Op l;v0ڀTwﮨD'NwU86%GU-H<࠮]&ATww~lݹOkWHSUUݻw͛k׮\d\&rܶ.m<)o{e2TUUiϞ=x<:uꔤ:BX\F믧}F֤jppPԞ={077P(JӚ7Z*j9n03/k[(;vjlܸє(511 ˯O~卤 Y?1)ƍ;I5558" ǦJMM -V`ӦM$%ٳrkk6nܸ7&fٳG_5??۷o*Kn<|IUUUFUUU:w,6mڤp8鬭B` ;wNꫯJ.^/ĉ+n+V%kkkgj~~^҃ ٪+o[`/eppHhmmohƍƄh9OM^ꫯ&ݻzzuIҹs411Q8+Żロsܜ׿˭cJjmm~卤 i;vPMM-sssj9ظqQ`ppP7n`02%*۷3wnnNN<)gϞ=zꩧ$=tP maibbBpXxNLLC޽{Y588(Aˍ_=+ojCv'xrk/jFk9}j9yWu)MOOڵkVcc6n(Ą5??Vo,t#+&q) ?κD2DMMі#ٱcnj0Ν[of[=$'ܾ}[oi&nIX+SSj2J&&&Lh<+ɿ~勤 yGn;555ڽ{n޼)Iy85˪@UUz-]xQPHpx[SSÄmSO$ 8n6mڴdBMww}8qbd_]4==g.Ч,^WqㆱO3VٱcGN4ILIȥOggP|4H:rN:kjuXk@pݵTKK("B4{||SﵯOo,:+ОTX,fu%ooZ~a4*V<:lza P(_B,,,X IbaaA߷: V &*6@4PA6@4PA~q`U iPm iG@:*I觟~+D@ZXX:*I~?luT,&*!PH`x\aP~euXUUUYFQb1C ^+Ç߿ wYi>!` E(zHzPVZZZOohhP߿4 ,,,䇤 U_~Q<: *ʯ ͦzP ^SUUU^Ebӟdz>PuuEfff/{|Վ U֏?GyP p8V]]7V{E%=*2??ouT &V~I?aPHXe@*?: I!PHX~4J-,,Xe UjaaA߷: I6@$MbT`i$MbaPHX6@z:?Q<աA} BUSSmڴI;vPUUUutwwۦ纺O&&&t)Iݻu;wBP_^6mՎ;@z}}}$uvvϿy IDATY 5`~~ B5??oz~~~^pXpXoֵkxڪ%;vag5??P(P(?X?S8J%Kv]V'߯P(X,&.ǣU1_)*IkO?Qnu(egppP===rMM<+ )Kz`1776n߾-russsV(_Wڵk[oSrDF5::@ /Ie,PGGGo(b  r\:s\.W/JNQHX#HH1==mJxWzӾvnnH:ظqcN߽{n޼)Iщ'VtqFmܸQ{1چLOOٳzKK1<OAu:uJtڵ&MLOOhG|x7ؘH$H$~uvv=q^Kŝ_K7Y3?S8mt&=$Ep-M$ bK,Ų?De[d߼_3Rt! ԩSFCk֬y˵j*uww+ #bBWz{{u9uwwvܹ3.uuuFpvkʕjii1nd777f\۩X~B`bh֭FHG=<ϔk{-E= ڪ6cGyD%%%FH %hxxX*,,4kЫ jiiɓ'sN\7o6 GHh#aDt:yϹXٳǸɽk׮`fV}}^}Y555U괵r-\ygm˥Ƅjjjɤ{-]#z)ljhh$'0BH_1ytj~;wt GB!'?!UFP"ѣiםZ'8 :1nn{~߯}I***0*~jDk׮pL\K:,KҀKւu )??R..itEGG;&i4ܬݻwhM6!N:7xƵnڴI߿_>O{@q ۷oؚS.kv KW]]. 
;\uvFE3k)֥"V7Z1 %,BqqeOJ&9z6lЩS$IǎKDmۦ]v稬X(~IzSzMxN1>hϞ=-[hǎs^B IJZFC=== "s]|YCCCf kÆ yfI:::fXd&]Ƃp7/^Lk7#?=jT$ө.|>3gjll4G1VQQnf#`Fu9Xݞ!"]ڿnэv__߄IJڅ'P]]8ӧOk 7~Dk׮pTpب9֍g򜱛]t544NhࡳSl joo40d֦jʕ [,xr8 4IBeg/f啕jllɓ'!i4$1Yәv]eee PZZ 6ԩSFt޽{V9z>xx9sfq%%%jnn?.׫$]jf$0n`P`~S:jL|eٴk.۷Oil;F& O{-nX,mݺu  HFFF488"K3զMեwyG@@/^T4UQQJKKe۵f͚YC'm޼Y BƘY?|VRR&y^S`PŸn:#,bn[DdZUUU%-&ct"&k>z#8ϫafo̞?`I.S",\ٺk5Ov?H ~'{^¾V]{&ULHufycȖ-[t}%u"?ץ[ )e+Hvؑ40!ٱc}sRbFh Dr`Pc?Ɗ,q…Yi)!4W\Аep8$ImmmSw^c{ճVRkvѨ.]vi۶mD"UQQI(:;;jt19BR^^٥fZ /5۷oWmm@R&`I`PǏ׻ᆱ`0`0("ժe˖rv]BhI]tIW\QNN٥`ZcXR.W45f LhppPf 4I ]&500@ Dheq&0]EhS`!4m@J\! cr. G4U~~/ٹ0;}.\0dRϐu[a=`Q.뮻0/l2e|O>Dn;fU^{es`Z. 4ir%&0-EEEf@F]lyO5Ee*tE"G?J_J˗/7"d… 's]} PYYYsv~*))an,_\^{e \1)Raae1&2l#4e ͭ%O6L8BT~~rrr.#4IeX򔛛kv BPaa%0kM ]***RVVe0kM`l]*0..򸢢"|ͪ𸮮.IRcc\.@h TXXhvs*ѣڼysn7S ( IL94H5R]]]Jk/6`PbQcc,Su u!uvv* bjvvg|x<|D"X,r,.cmڴIB!B!V4UKKz{{:HRqqvdꚴ3EncCL؎mL[PH)c  zx<4og,D"h߾}jiiQSSVkzGDƝjf|XM R2QZZRv駟V4UGGDž"*++4:^Ch"D쵫V 6ԩSܹsʵf$Su/xb֭[UQQH$7xCmmm Ç:W04E֭ի'߯6|P&8e]揂eg/[H -=:c;HL$YbӦMƾU4$UUUMzƍ>O)նY,\.Ԙ]ʴd#IZzU[[+!˥cH04})B/rvРÇ8>aq0OD"7 ??b]LgUVV@ h4DqqqϏ VX B7cXX]Sٴi<(I:tv=k3g$OPh!x*ϧH$"ժ*=C3۩t;X,D" $lٲ/^0ԡCn\?Z~;Y{-.ݮZl6uvv7ސ7FF,ULVh4a… AutthÆ &V3SXXhv FWWhAlF5n Ǿ5?UV&,{mYYYJ9mڴIO>$رc&&裏Y  ֦FF8LS|Tn[ִ_ʕ+^hϞ= xTҽ3^W^7k|>#4zw}3ѽkRE^zInĪ`zrrrgvKD] qDgggowwwǎ}m칮.UWW'uԸpR?.a>7S?c]'f#DUWW'itDEuvRj"c]*vt72\ kxxؗu);9-D^0>a__ `Raaeә֍doP:F_@@5*++u1I7@c(& \X6lЩS$IǎB`}SzMƴ4=l:>F~<#IڲevؑuFgDע4?'Msw_җb *B&\wuZjU·>[;L &FXjWv]&{. %|}w$v(//O͛7ChTȏd˓p$i޽jkkXaSX;M{=#HF\_s:At!ǣ`0(?nDG{{{±  IDAT06a_II/T2i:{l_bkg>k eb6UUUwyGv>L5^cժU:uꔢѨBJKKE]&mܸjqQcLRoZuaY|o{);FxTTTdXb RkvE`P>{1UUU)Fg׮]3Z1#4̡ &[vLkתw%꫺{L +((`,Tď߈R >TUU3|>*++1:q™Ѩ 2BeٌqpxZ0-|N*x<c;Sa={񨢢BO=ԬKFɺ;*k1&+"1$^MMnj\Ȋ^h1z>OSfEyyoH={V_)&PTTdv n$+4^,n.̔͛7---]{!y|+Yjjj$ďF8>o}}<\.<8+ )+Tb4ĵ$em߾aZrDg&90444/fs ¬r:*((H+hhhȤ Q~~rrr.cQ 7c|>#41Xp:*--wfpH|Xx7ח﮻5\cREM׾ӧO~tXDUVV7:;::}6?35`Z߿_pB~kAO<:A8p@On~+h׮]3ɺ3lZڵkۧH$o~ںuj-ˤkrg ^<3S TCC$:;;U[[+ͦp8I1^6Mmmm:p^V\bэ7(ÑV0e1"4̲P(7|3aʕ+uT?y|>c$fӊ+L R\neZqq}Hv\\cy$q…Yi!ah0 KgTVV65"}*jwM&&S-NȄk~Mەq:ʍ7ިn-a_kfSaa!]&`ukkk{۫W :Mv%9s&a_aaN8?QP"_6)u]7/Zyo-~os~~u]_h4j;s7YB̂]vi۶mD"UQQa Tkkљ&*B@W8N}MEEE3ZǏ*d&JK*=(8 c{y Μ:uwEZvX 2j /\^WGgl߾]sSW??ϲlL bG XJJJԤ`0ǏwU0T0bjղerv.w^!4d522bڵk1ܳoǏO y曬,\.=|]Jp8O}jZdE,0qRy'fG~nK9 2,_\WNIXf: pX}SReeײ~`bزLWn([oӕʦ}NIʚŠFg< %]wu ?o2|0 si+֭k~M9&=fزLU]^u.N 8$rξW_Vٷ</NTk#24EKANN֭[7ƾ˗/񨦦f\`"t`r&4{_nj:E>46 97Uw;5xw~@ ^:2/3>:O^]&bn} _Пgc_oo}]UTTX"//O `20883g$+..]w5u:NLfrC""L+012E˺|mW#F˷ަ+{W}LbRXXhv {&4wS>9UY>aߪW/;)32EQ[;>ߐC&"\ tw'FzLb<`#4￟0ZA}nZ3lY#;i9?hJ\MvF;fd2Xr-?V04 BQQ.yAV \rE rssru*;N&|mkIJ,#NfȹQФk~"h7 y51x^o1ڵkuy ۵vZ+[ZrrrrJ2";;[f@h7|S?_-^aX˷ަ#7-C΍ƷU_#}~=9gVٷ%vθҮu*))Wuttرc&V\s5ַev@Fef7Lx\ZZ/~Zr\L7]}wFA_uZ_wؖV}K˿룏>2 XVV .#"% M> .P 2)g-ϐ0 (''gZk9xu=2]s2c";;N*QMFXUvv֭[#Ghddr,Pď~#K,x.  ;;~ljxeY˚etjjK|##-_VZ~{ꃑa]t2[l]&m&wҝrn489DwLX뮻t]w]Ƣ۫V2@fON`uH2oXM2 |؈(ύoXJ.N}YmX.}nݯ;SzmVO/]}N˘<򈊋"2Ka"~%կ~˗T2… z.|>Ŏ0"} e|m=&_߁Xs~*))1 duך]2 99[[oӕ[o3 ؎y*GߪhMd B `l>J&76c'ү}̑BK`"v̡5hCC΍&W3#:rξ=eW#0ά<`"4̡2Ef5\ Mߓ&r_WvѸrC.3}G 588f)((_;S_XDev,X&$~]ؾ:oWW:;;tHbA=Z|%Xov,h&$us9gߙ𸜳o)&7/@(,,i"4 2 [);gH%gF,˦}^{M]]]3s),,4@VV> .uno'=&㤱=@ݴq%+V]E.d r$&_co(ӕ[oǏW 0qog4xRq0ۃ{Zk 裏f^b ~;>PPPl @&0k'v9rFv\)K/MXlʴb -_\d3ǏOrTVVr=<⮻-2ub, ӊ.E p1S^K^nvUVVN;h088h&$?4o&B|] )(͑j@ ^{m[nEwu׌ (p_@ %ϧP(P($I*--5~9NO<_QQnfUVVtTcss\T7|S^Z[[o>IRcc\.!m&LhruY,566]Nҩ; СCT0bj힥b~o? 
`RÖeʎ)烫_4/8 ]-[/#/}d*4A `a'K"ե&*--Uu9|>EQB!;)&6mڔ4߯P($ϧ'N(رc}OvmmmBz{{٩ncN{Jkw`PpX~_^WG3Uw|׃H$۷O---jjjjPsg17/&LjIJ\(ےF8{lB c^h⣏>24Kɓ'u1UCM*ѣGUZZֹU^^rUWWc7B~i޽{kFTVV_O?yڹsgZ6TX,mݺUD"z7֦`0z>|8#+a6 KK6lA d$0!2b L j8N4tuu%&6mڤ)08NG?Қ5k2ZO]],##kF󩷷7#kbrR%#IZzU[[+!˥cG04]{DhF,%vVO/xJc92a"&S+Mߟ0t&1;M$c<|>_FמkgΜљ3g/QW6d{~'rqG`pFε0?(Gu\LWnm?Ä.+a?0!qM7xIJeҪ 0w(**Rmm\U^^nl_x1kǯ7̍6Il6ۄǹnc^W&,IXv(8r۸C S_6oq-˄^wD&;a7o6D 2lB nglm̝`0H$"IXalwvvj]\ pĺLHS=@ׯx=ohAVN{QYYiR% :۫~ZhTm6gd񨵵U>OHDVUUUUz衇dZ.cl|X,D" { :tۍ렢Bׯ7F*kE^RvfSo~Vk¸!IeE.w4r… +((}ݧ5v4G:?01&`;w]VV6/PH BFi7~B!B!8qTTT:L zGx jkkSccoϖ`0hlrj.d{x<ڳgϸ<JT0Ԟ={ƽ&}B#B;#8wFG}Nł RfG`LvsJ~ܹs6ڹsgJkecʴf97nV\)߯vssl6^ J5pX%%%RW&~B`bh֭FHG=<ϔk{ME= ڪ6cGyxpx}#4`RY;*ItYc /6BY?_GGG؍tj!}b.B" Ş={ۻv6lrݪճ>&J5D"v&|o>rؘPMMMB "tVkSO=e*l6$Immmzz'P0-C_S#sC@ a4M7ݔz3"cŋ&VrUmm:AlذAUUUB!;wN>O@@@@ǎTmmm&J^b#ћ7555L :.0SRRkO|&X%i mt7cc2B&z4l2!4ڹw[;6c,ϧO|cc6mڤq/'iLXBR{liiSGG|>﫸xF\ ,l߾}ckjj6/nRǏng0:;HJ+Pk*֥n'}j5f\+"4`RّĿrn42X>ԩSF ضmv%I:v*++ge|Fiivܩutt( ڹsgϵPľ/I>`Jao c}&zo4:cҽ@Z>] IDATPDOOτ#@~±H ~?1Pp9smhi| 2KoqwuuM8n~¾N١"ŪUKK$YwN{݉) Mnғpa޽{ q8jjjQmƐ{4Y-~t\S1CG`P~܈c>B&u1WiR%Wuuu%<馛Z>JxLh 6&N<9ahBҸEEEtK>O@@'OL:#Sv:::$IK64AoZuaI]Mt;8ؘY+zohalP!UnZZZ 裏SUUᰎ?nubj>B&'<ZsI7Ns\Zki7P()oΖZٳG鐤/f$/l61!O@CCfЏ+1f1vcdR5HBՈ.}2XX[oWVV֚>KKqqmf<ɓ'MT6m2777yz{{m<^]&$%t>pL_MMћၱZ[[o3[|بĺ[$k*Ib˖-ھ}V.KMMMv+B&!+bٲeiueB g`ᩬL+;vLZ0`lLG&|>=c9k,&!iFT7=~\6zqǴJ\.q}39#8 wޤy^#ԐL&K/_Z>|XgΜÇha<(;g+ul޼Y BƘWZZ2UUU2.%%%jjjU{{:;; eX֭cjkkrtСv\.n%Θvp]"Vvexn c@ĤsMק\o,PrMd!4 7gl_$[XBi@ZkX7t sqccp8$麭V뢽an{Zs\r\)5kg1KlٲEw_~_O?/iE,ԁ6OeG ojfg4<4188t+F;vHFGرCƾ9q1"4` y*@pC&B̶`0(icwp¬Դ0Nn W7#Y )%0FyOy޽{իWZ]]{qWX sڵgbۺ>NK3aLBt1CPYe K`v<.076 *IWi,`1݅ 4)`1trLI-{!󄇤GD|Fys4^|U]pA`P'OTWW$mUUԩSV#4&ʄ$mta$fޫYdRn{ԏ~z5uǣڵkS(R(*ȈΝ;?|A`?fm˛2g]סxŤR)B8phBE'R)MLLHښ~Dp8F駟n;a+~xtO>u-,,hbby5MSZ\\{e'D%&&& ijvv9/Ty% I[~\.LT<ndR)-..jaaATJNS@@^Wa(LjeeE YFT{{\.Wca魷*{ HBdg@E|rEC(_O;'ʵꘙm_"Ljzz:$Gڤ{J!,!׫a]CV(gΜ&].~N>]0DJ%[5"0 >}Zo|>51Ο?okiRZC$)`WooX0 C:sLm3EG ۭu=zTձB%DGGmrGɤDv!.KgΜ &ph![J@v. lQB~;U -t JVoQl>UM?0 r4U2X{=qIrk0 9^Tpa &d+WٖKDnUrmm1Ѩm "QX񸭽DG. KD%6ȗ_O #4B&%Z&H'%YmRԶۖsC']v&쾾]/}H=TrO;v̶\*8{+ڑN*AAh8d+HT ZYY^`QLggmX bA%jE"uGGGm= ð)P=q=Cy+W8(Z=ɯPzV (%NBD"{iڪ4,RJ2M' }oiZ,BrU~"42m7zbvd}WDHR%[V^v+4QF~Ǐ|<^= jIzNżh-)צijaaaugff5o*MصZB… uq!o%fgg@ `L& l=zԶ_%b'+I"T"LR%!G2-/,,haaA2MRG !B{U8E<-1;c=ȒɤVVV$mȕH$ ~*MZ~5h4Z>Fvҿ|ӧO%M[ 4`G|>NH$Rv;45==m{7ah4Z1+w\谖pU@ ȽՄUÄ>Fu̙:hwkyzzZ|EO&i{~_n{cB ޶MG$Q({t:ݶ %LZN ~r| BLӬɶ; ^{zצijqq}p8l- zpp|>aUBފ㚘k:FVMlk9 )gD4VVVl~[ !4q:}&&&DBPHPHatJRN@ @ h&n.\Y-,,>3MSDbm^x &+&  D"Fa0 Say@eM||PS@= hH&@C"4Rk&&&TaP&pj`ssCT! ۴^ ---`h АMDh4$B! hH&@C"4 АMDh4$B! 
bK_*p_=CgnnN5??X,U99N9uwwĉ *p5e2L;~m '?I@]w~]ݻjSNŋi?y睢롴X,>@E?|yT[ZZү~}?PِC(bXEt8:u{=`ittSNʕ+r,~VQ *MH&NT钁 Iq㆜N灩N577/H IDAT555U>WWWu=ݻwᔙ&&''k?4&Bp8/~`0%]vͪ0>>~`BhA7믿.áUb1}ך);P(TN j/46B8<>]c4X,fpy<IRww>#]pZݭ˗/o,xÇ |ǣRTs֭Xsu---Z^^$uuum~yaͮLMp8lF}N:e&eMONND[[[m,?ؖGMK,ӽ{lݾ}{[wnܸQk NkTm_*x1vabbBx|G|> X˳ZXXk2ƽ;^IzѾL۷m>}Zn{WcEkgppP>z$v_tFzLӔar:r:z#0>MS$'gMg[u{˗/W~Rwww*tww+YAfffb}#I`v kkk2Ms۾HѨɤ\.Wۆ=R4toVVV4;;+׫`0{^`rCJur([p8lPj,CV]&rKWWWA&zx4\p8@ Pv{0jVekkkkFJ$; 3Nnb*_pkV^nhbyyYXlWA'OM5ޮoc/ ʎ&ѨRm?8ɤ-0QŃiU8>)* NK.g---Ys(`/&:;;4ZtLLLŋ;ҒA`W=[\.|>|>^ou"~_ꗿ>p{7MSHmOV*`60!mjjXhbnnNr0|9?~\F:[ub'rLx<Zs`M50$og$~OOϞ F%m]=[xv/¿X%7nX=|\I~`nnnۊA]]]܎333StNP$DBdmᰤIR ppd[OY-g ׯۂ/_!xmmmr8_^^EGVnRPj!QZx1x^!4dghooW"Պ".xp?y<+ۆݻ u{i||ZBu.^XQutʕmUtZSSSS,Yá^{M`w޵3̌Ŵ*ǣ7|S^ARC[r8r:r8Ӊ'*p@q=x@Vi$m$tusumDN2GQZN[qaEQR)!өN0L&H$"4t:ޮΪ'#/+߿kDc% ڃarݗki&BR*>;5;;kG>OhT{d2O?TJp//X,gZ˧NhBի B??&MC^j-{<};Н;wj2\N__5ŔN533c ~8u>3[u{i~~^ޡ044d ̄BC:U,mk%׿ʶݻwGqPm2+޽{\8tZ׮]+duu:ݻ'áڙl 2MSH$‚nc}ZYYgfgg VB(JIRJRZYY&1MSo}aa  A=f4U4$}UZ~ oH[qaaAz֟ZU tZ8 dgH[y0 522bMb1]zUcccm?::j[%0!Iݶ Iޮ.ݼy}.--m߾Cill:.//[a~+ m@299Yqh"G%U .lᰅb.\7oV]lV .'.\`koS~o4&I&J&[S>Gp8YIp^?_7o;엪*4Mݿߪޮ\.x>n[? wߵw:ZB!jcn4&YYYhBRFvѣ(OJ$J&EIikB7L |>YV|=^wtt護޲}0ΝD./>ϯv۫ EQ=zH=eFn|>@ŁZ0 CgΜpAEik> iaaA;jP.0!m?=i64mW!UVe"χ^~hwyд= )w:ujK{<[U.ݹs---YO  uM]zh{l{.X2<>^Q\hۍn;wMG6@mmm5>8H=/g9O>_S(::: 7 Cr~9}9΂I}jCL& *A\F}%*Zo'ՙ3gjxxتH+JU:&&&&!lbjgv/F~%lՁ'yk%?xvH i:;Swbխ[atx<겖s+Hlgbb¶\iKR>a IxZԝ;wJV(V5>+Xoiiɶ=v6h?/ 4Ԉׇ~XџzLpqv29WDk̋r\.:::$jc+:BV)5/}}ܕN𛦩O>D  zu۹0M:l%ikRɽr>d%IzW{nsPr/ؽ6ݼy6;::t:m-i[ǣ>h_Ǚ5::jkM mM玷ov]xQ>sܻwOׯ_ߧUɓX,V_Zk = ؖR`E9]]]UorPs===S'ֳ>/IO<^W/lM?yĶͿ[T~Z6# VkT*Yvuvvj|/!߯Ǐ&hsB[XXP"P<]hF3?{+;mpXy<ittTքktMIҥK C7oT[[۾1NҥKv v`nn֖@[¹st9ihnn`] tww{:uV ÇVbXZZUTwxWm*V[[y[뒇B"mP(TQ F 5rܧMӴ&}wlrr[E䷹ȝ=vXcɪDbΟ?_t2On[#7 QL6 }+j[!"Ngdj_/_tu]~TtZ.\ԩSutΝ`F֦!wy睂unܸ*fkGQEnQb?oZ1>>^Q{F {GhTDBdR.K?>a4MONӚO&UW Svϟ>*9f"*ZijbbBk{^+`ԃd~̙3%1 CZYYQ8hW|?{@ 믿Jϟt|O.b1} ˗/[rx :F9sssw}F#~q]xQU$}gxbFe744dX]]\{vffz/-q~Emmm:uTt {uÎJ&aO V6hoo}/WrcJRUD %ϧ?^.\ AnJ Qg+jD":::*PA 6.^h+ݟuԩ}oaN 7o޴122b[nn'sLtuumv` d_i4- BuiŬV'wjm9:ȕ+WVիW\gϞ׵}*;hUIdkٰI*;ǭaȝt^XX(RՌ ð**H;kPW|45??o-wvv\?خSooP0Mf7U&"vnll6xtʕ}ǵklcccmmm"tLNNڪLfaX,Vuk1Zt<|zյgmYt-g?Ξ=Q]zU B{rb^yr*7o,y.bt]~c|ܒŞ(?hOg'aS%mM&D's=$x^k_Tj}Eђ'MZG",,,XUɽB<W"Ч~/o~cP࠵t{<`FV ӹ|Xe{crrVb:+IKKKViE CCC경>"W:֍7lDu yIzɯ{i^TOݳA)z@(w::vx ڶ&Jm.K^Whz*OOdᰢѨammMD61SpNIve"HE-/u}R)촮W*A$i O?~XDBar]i*J*9?~`Z^&+++ZYYarw5M Wdy^0 2MS%Ks{^9NR){D|?{@---BYW^۷-`_ڶnؘΞ=k-///E+H\vzp8 &2ݶj|́ twwX/>|hnrGWWkY-4??_2SK|mmuϝ;s)Nk~~^>ܜ-H%mUҸ{Ν;'c>MHv<;aEOOnCa󚞞V4U*@ Prv+ Z1JluRΆa̙3֘Ķעڪ"I=|z`%c===p U4B4==mM Aý .]d-g㒤K.Ν;{ W~Rx<|-155'NتSܺu/~񋒓^nb644dݷ+5G0_]]]ɵ쫯-W %W}qB 42o5t:rt:*kPlh4E% R)!)׫+׫Y=yDiZV$ښ-$D nɤբ";N߿r\P Ǐu4el\~ gN|>!4@jҥKdܜ>=KnhА455e[ݻwv>PlT# Z i߳P&w#٪DVTwwn޼ׯ~βv&]8sLv]/;Y]Bn{ \tr}f[[˥zk}>_v2rr^]lVq0 ^RUT"}Ƌr?{@* znp裏>>yΞ=X,X,Qݼyf-s}7k׮7ߴ&`"*NmЄQWWuO>϶̨J.J~ݶ& M\{ FѮ6$QjWŕ{ ?|nbnnNׯ_yl˹|G X,f Ltuu 077g[F7w9[Ibff8uTMdC** b1[A.0Z -m*ChxEQIyx%iU0t6U# `n]|Zݻw:V5[g䎱m`x<}ǻڡlz_^uUI[pޫk&74j nVh4= ɋ~#Nnɤ%.c=Qt-]xѶN:g}fkCQNvR5pZplʕ+ZZZ&cGGGuOBK/~ [cjjJKKKzե6b1}7/7n䤆-ǣt:t:%innNwi'触455%á~+zWt:JsjwޱUs8 {1aԔ;wNֽ:77wV]ᣘX,ۿȈPH}YU4;uɓ'mmmV5I[#{\"48?~{TPCCCӽ{xp8l'=~5XtYkҥK~+A]|V`yyv3[J,//o[Ew h|| jFFFJV%9(rYRj( _6QlTWWVWW*`A*+rE-F* Z5*s_~YNZ[oUhdΝ+$w,5x466VQ/ IDATx۷722m?rƪjxtΝݚޚ2LރoCC.`.Dnݲp8z٩'OM(Çl;G:qJN.--irrb$|+~B!ǏW_Ya+P\'NXAl[R7N?  D,ĄKR)7pUal{JX̪dp8t:ק'N4NP ?8HJ&p0ў4$B! 
hH&@C"4 АMZ"hkk~zUhH&@C"4RkHk{mmmjn9`JӶ^}UтP{~~ӟ{{XhH&@C"4RkHR[[[{kc=LfSMjnjUSS63lϒ[ʑVڎOmj]SFZ?Z뵶@?<Үfi}?E-M$)M=$j휆W2?̑Yڔudߔ6.;cG~#/H֟i6MeRJl}SIRSSZI̬K66嘚Ԭ6nKQiZlOE_}6#?>MGtDMyr.6:9mjliS:F=]muHZRM׍3ic~m}&CMȬk}455+T5^jժ#9XϘh-,]+MoY*H|xZ$tze{65~NUDy~4dmwZ{~[oljcs18Sh=cVs_-u47nAvG[rui1fZ5=ZW7CzY6M?Ӻ6LO7??yn~~X5u_6Eg27ֹ.hSq:9f5o]ÖzOn~vabbBx?lVi``&FO>-۽}%I}駒v9s_|0 CNSn[X4Mݾ}[<"TbqqQPH488(We׫/>]-Ur.v#$H6դͦ MS6OeL og}=͍fmvXmdLy륥ͭI?jss쾟6=ՑLFzX_/fkiS뙌Z2MmҦiSkSOSsե͗Ԕ1tL-cj![gdZwy[ô}?+xL5y(~͒[[jc=ljCϞOY6?~6qjkKy<#mju~&Ș22-R7O|֦M5kk2iJ_:clP6T:'KϞ*j d a455Yf]MM%uo|-@M?F~ ZԚ16Ԥ̦6T~-f]I붡̴v=zFY}4e6|gdM#g(}u#-m2Zji:fewhC6e[roUZNsSZIGѤLf]//jȆZF밾fֵ=sn3l mi]MMjlLS)EBh؅5fDC֊aX[oh_pz׷}D"DbkRѣG#%*LrUm _(|ri]Md6E.|2rsQ57'ȳ֞.!ɚ|&C|b%e2zlO5U7zQ81|4{#?تIVZmMYrmdU-!ߑhqŶ޻4q随_"#3&HDu.5)1mٌ5WdTfJٝ_U*)ZZlL\f=&OC}$.pH 7B%=4DF_S^Prv})4oPhVد8:jpu3jyӚ~XPIOCS59GWN:$ ,C8ʍO]g(ѹp'b19Eqi"Fԡb\}yȎBcT5+顴7\[ `C)Ճ,f8nb^q|>`M?J(N 2W{B,tOO̍Rƾ<1=3B3Ok#65(.W; YBOc1s. ̍EP\LϹ>,# P*0f^`:ETnQEvX^+nHsHU`L눁R,sn"?~) ƔfWG١#^?huVeqK8օY:o`k9C̘4oPV1fD46 -Э @ dGĵ'Uzf v~z U+RώKmow]@\@IcCD    #,˹˗//-WXԮ>7cvvve}}Z/W>.(^:UKrظvAN3eo s M{吷n;@Ģ~VI:LNC4nR.J:asEw1eDEc[)_F{!B.\r$(  =YüP#YkeA()ZY8e1Fz%ӱ I0}ojPvkw5}iK:hJcq j8U3UcACMx!cSZJin}hH|py>43w?5OF 9HJLF54TA(@֥dgC/]CVz??1Ԙ(]E& ZZ4^7"!E\nN?4Y`dLJf&tX20idVA^VQ^(lIfQ+;'uY&KkB)E'J/38?D4!    ?" &{lssAx7a`1?`ŚƴN mQ%6XPmD1c5V s'IbPߢ P*ױ) Z,.H14šOfB8dQ%%<>:BxA-8_z6]vb\0WZSV0XuM"ePU&n縄Dzj~X)S6 B/jl 1Xݧ<u"}v^=Nɦ86FW Uܵ{4 "h l]m2m4~:W;q!5uǥv=ѱ* .(ņ:{O ©Y̰NOǽᨩAvŜS'eAAAAA񲾾h4;YN_/^ᅨdQ/:cBlooQtqw AAᇉ8M    ->3?p׈T Fv(_eYRU/^^"lD0!:''oz,:loӨ&[;~,6mI=Bޞ:MT~bTVSSK.ZOO0?`V(B/rE*sē>NŠ D |ps>Š2Ǥs1S5 U q䰸}Zz9j+O}k-NU"Eo֨uZ ¬6^S(*6fܨ6_Ѹ@z~Ɣf Fm ͵ 욢 rh5 ڟI/JѪnihFq=sUǿi8p ߠS .:Lwۛhe먤;jHx~׳k*Q N591$h4]]<1X ՔuNJټCxպ#?q:?vD4!    ?0vwwfggxLYF#>677@ܿ?& y>|x5]GwjIvwwy)UO?>K1񘪪s10mZ0#Plp5^cT8p`T7lx<$0U\hnjwHĵk$`4"q`pغ`< M@_Ip֚K}?0+s`"ο=gP]_E87kzfD/W)Mzz58^D.ݧf)g#ͪ[Bhg#-7V^(PDwIGѳ7Mh\4Ο+"AAAAykUUϟ?_X>fvv>W4qUUxɓ'v2>Ϟ=O$"NE~ucO{eѻ>Lx:KO.-zgϞͽY4pަ*^zu Ν;_Y1/5Rvg!Z^꜠;붴М+a},ժ7Ch, :ľ]G)MiFJ xሷ2[*6Vx<5S&!rcmȶhOet&h c)9-BEy|a.l2Q4D*|hbL?W<֪ԫj,jG}bE"jjڨ ckv It .\? HuȢv.-%봳+k =gזɄP5\hc: g\g}6jԗ諪ʧݻш,y5ϟ?*>}csb\|W[Z*HI,y19$HdwwO?4'ǻ^Gŋ/llld7&QhΉʲY~\[[Ν;>IxuuE=i^FQnx<ٳg׿@DQBD+sPxo##Twj\Ji*2*kh̉zz#1h.fTG*<%=Eb_ሉgҾ թ{AAhl4# =T<ɟfiOدuAOgdf1&b,sկ+b{{R &Rt'O3)"q%0s9FYq^GELCuuܿOyzBO$Da:qGL7X@ϬUA>£l\@zTazՎA,iDi!Xǚc M>92A1Ч"|c[1w?UraHķsGAՂ^ =eVж4&q@q -~;x.XgBM+\hR:sEP* p"=!ߑؐx2Gw֤m0 IDATtј~fG,ωpk' g"aqRGKI:} ̹tιoYҺ*qfCe{*1cc̀aVqM{rY=c8uvq!#H{OjRg5XK1 r!AAAAA$t޽H|g޷+YQUU\%r`kk_ק>C ϟ綹,K>|TD-~嗧~ҽ>裥*˒ >}ÇgF,rᘞH]ѣG nFBLϻHܻw/".+,yݻK I(s  Ÿ;4!    Yg|w׌' ۧ_\ًF |gdw 瑄ɝ{*YF:uO/"M`2Yl]'O,j]hH?C?9ɭcD677y<~xw]eE5/5]7>677}Q +.Li29І%عTM7DNhcGǁ,BhWV%"F}X]G)=ORٱ@o+2s1Oap̜&:/4jP Kzj8Zr|6g,f2P@BB}VpƇ"AxU 5R4Wzw 86F5p{Q ɾu׵;'$'E (;fIkbuHW{ 1LB5S&f/s~J,槆&AAAA{p~\/ϼ<X,HY>S<{ʖ']A_}ϋ(HQݽ=Ϛr+Ijfc]G$DCWWqcy B\5D~C5eP:& eST4e FPĨ ]QU%IQ NCqfFǺXpsfB4wȠM!Er$QAF)S\^, bª¬m#T.C>pmb[m+448 0 GʽITѥh*fbNeE21XF,HZŵzrlM965T?*}sxB (T 1>$_rlgꏢ)>:(Y+&oD4!    3EFQ<,\p˞ ?fggx*t:eUgэjHϞ=?>s666tqx{/;wwޕ^ws|Qhas}r]r]cLsr4Ŕ%FP3WWsKU-%X}eP?*bQgSP*PUA/- U'v( 5+ չk-7vgJL]Rq:TLC"$Dì|TMߍ?(hhb:sNL+9_ѷ7mއ: 0*ۛSռI*j&LN,X))z tAEƻ% U%EhC`hN8$JuggBڬ+g qx({=zVSp7zж@Ŗ%2)TɑSދ&΢qheѪ헎@(*@Eb ,F(Wޯ?&D4!    3(nb]ayӲ,?H-_gYlll DWE/^+n'|ᇌF#>w}߿/^0y)?fccdvG] j}_$\&;;;%dmmm.XE\׼\XF#>+CORa79!B4h^W10b1~Bs Cᨳkjwx9swX\tm\DrpSw؞:o#(!iO-F0jN1MУsZ~Y>n%:TLSf#*ZLj(`QJ=+q /Qᨩ۶V㨩 bx9p\vH1OS8/("Q4^ЕbTGW0q PRK4bǔ墉ђ]gOA⨙rpͷs&pa԰ iA'%A1CEQl̕⧂&AAAA{f}}=[<Ѝ8} "&@.oll\=}q666r(˒OE tsX^Gɓ'yN=zg}ϟ??w*>;I677T_wQ5gq޽,rx5[[[y\Սb}}=DCèJ4:FD&#L[LnrqYU*4o/]6LT՝)ϴǚOC`sKłmռK|? 
آ.RU[]zSX,Vhu4yHm:(XGar$JWpQbA;X7 +[G:R稘2)݇f;I\Q-     |t)Jc[兢,_zN ںJ/hd_| Ə=⣏>ÇlllgEt;9=?q/7R}kkFܽ{'Own_z䫯5+F,"w%˗/k2AA) AAAAAw^.駟.|c^݂vFW*_>}dmmmETU5w$kkk(\UO>=˹t&?0 8={[[[|?/w0OxivvvP]Ig<qsΙ,k^k,>WիAa{&|fKiEu44 exqICJ3:,0+f q3տt;1ӳkZ4uj~Vg*&nWLPC=;%!~-C H.S5 c 4 :N\mG]u8fgPD1[~ΰMϬdgX@(ʶ)Ъ fAv;{qz32s mMVFYzwG¬0o9=ZIUEW9n~TnL4QO Y/X%zIߏ7Tဉ:X1i*5fSqM\W4NYܢ,ny}\g?տsP¨B( 2?W(0Xz nH<     կ~o~b{{ݻ:ɄBe߿g}0p'I?9ݻw,tϞ=ckk+GsTUuǏeoo~߰ 1_}U.ommq޽ ,Kܧ-שW^]`b766t5~!o</> ۴w_zEUUޅ`h16)jh\}yf<\?GC4~Lbeu?=M    ,y;;;'>|xSծUUϲ~e/\u>|E/^e"emxĤ\:*˒wfQI!$qׯUu=zs<Zsi_D7&2E\(˒>(goooh4Qӄ*^A.;̤k\ebK x}hhoq > |pfT4Hɳ}\>^ yidn]ct.?F&"WhxW./C&T(] {JϕVetyi5Rjp_%UNJ zFG!RYsUsL: -zfCъ{.6^Z݋B'= T|ݚus=\kϡAq=(GC?p81# AAAAA vvvx%{{{9Wh˲u4UuQFw>PÇF#}F/^૯xdww7ewqN ϟ?E../,*?Bri U׹w2DtX|嗹MEիW=Wtw^v=r]Ƣ۞W^#ݹ uJWD_ƨXE ~qPШrxHqA9َn;Pl[Mn1υ6fy#F qWLJi,ܞG,S6~ëؗ4 <׎\l[,p~U,w\30ExBɪAXN>|m1L|DžD"~ff޻*?tI㎙ޠ4#z ;S#zz_^bpЧ ߖ:(jU깭1uO{eT00QhuX~v $iURsg9AKrnbiˏBMAsA4AA^-"yyvbsss`wwW^9׿SpIwx ?3M\q==[Kp'_X^.=5ǔ#NBgvqgQQpQ5ў`oMA[_|?J1b-p3\B~ոE@ p6?#E6ihU"@όD4Pڛhrx !Q P1 o9(F 6aX蒾ieڼeXVjM8 {^xp|@:,|x/FJn_d׆,M u} 6CԪc\۩zvQ.Mp92q,E }u#=L8hμfPܦ1c`* C~uI$y75_1??r74!    pz >|}4'|_~) AQXmT3bv1s_ĵ`yvXUVUeAƬQ#->?߇+P9C"9eK01(co1(iV#f7^^-w&ǀ8fVh VGC* ژ7#($>ϢI(hD."8+:!9Pc"m+b5(,~BhTUR!yc5*ABSLhA{Bq7Y81 =Ǡ‹iΓ1xDPٽF3{j@ieИNL4#U,U%n!S(epZeZ6?=_,3`"     d</IHAAA4ӄ     5lnn駟]+ 5nAYd?wVX~"p|%7g =Xf. pjҌ|E !4mvZ0N!^;:G}񡹠saTI €Z.CNW>f6`屦ϴ¬0j@ `؆?B&;MtY|,XJzjbڨ>? t]#_a~Mc`\}9eɅèlv?q-\YЃ5SYTɴ٧1phWFwl J)ׇڽQ<hPD׈ѳks13<B^-0Н̜;J9:1Jwpf_L) AKb @[ls/n37)*6T}fGhUw'Ɲ.FHG6RAϊz0XJtۢt,6J F8 ERAR^׆ô0CNүV" T~X>ɂj5uP;bj0ݦ.ۑHk%e(CIDA(P]I03JѮO o %v-;B{31Ok:BhU0*2Gc+:_7qyk4],\pD:(̠iHm,o6eBrNb] I y^YIZ֔uzz )! AAAAAkdmm?gϞׯy<|?Q A~*NS\+KO )B,:bz|.)nTbtҌ&qs v;9s$m"s{T\L4 ݊jф     5S%O!JEL&qqfVYɱeX`e)C^$VT ?j:FkY>`L|%SwQ5ڽ6`#61 xq 5? LT:9Sw׍nl糨7s4H}Bٮ٫$ig<;=P!"AAAAA5>|}7CGˢuhrr`OVx 8qsv0sY03sm8izv}v㕧}}3רt~^*p+I3ߚA>ZMi {k6ƹRǼ\?dq Z SmY59R|hqzO3u<GL+o폲@Mh *Z? mԩ;jo9~^Jivuri7~0ECA_Ng(pfׇ )ǔ䂐0vp`pdg7Bb{#wJlE;W59&T_eh"%J 3™&&T8<7`=Q5oE~@gJ6'^wjfKE5ʮ6MLn_-}vAAAAAAAA~ӄ      !Q 5+N'7lGO4/|:}d;uB}RyhUɉ@)='qxjʿX/f%?q kюm QV; 6GLQ5Eo&97B&ͩb\>T|9_Ѹc9keVXx5hU%Mg IUxs-!x5Z5\6M86bNƣ؇6`BjhŐAuUX%JauvC1fąߖk&h "9N(exi V&x15Wi:JiBpsLOuxzJ z~4։?wy">: J9fgѵ:GqNOϹ4^TPӨs@D      T_dQ# cqYM*(( ,]ЖgrU`Džh->·xps x\h_+p4u[LZMh|^U„g} xVkIhQb z>xJ85/QsgTvn1 Ձ9q@\1CaNAjPj7Y:}{׮ӑDh&錫i:( !v☴Bv5Z*(Jl 9RZGhtdTxoNhḅ6h b0PU腢t=V%ZsF ZM6nfjύкGόr̪~2 rT ф      gG;m޺-op~>\Ե*C1j|ut=Bs'u[,=@ޓ*ib:g{Ypc( <*7^s_1GUЄ,Q=f Jzj@b 6l+M4M      B 0Jmu ߫07r!N8i{j>]sYR8 Bi*TjV.J-)j{ѩ_v~o@"xr[ @fۙB O\8k2j0B;7 5JMq^MRDomƢTth$m.Y9u}q~2AD)ĈQ(Lx/Pd^F ( Y¿Bé6 Α~pQ"9O$D>=xZWJ3Yr^B) r&X+V=YbB%m2u@ ELL8":'0cT7zHh9:; >4Ia +}zaM{Bw/PS35{-9Voi|q-w Cw~?/XJlhK 8#Gم*&ZK.)I`gޝI<}>w̬48 [ǎ@]@i(W`)R4P5Hf{hh3]U~9_2=mDTFx-rLݺ-snmq";>ϐ8+X:zl$&\9vZǷDbUɂEN6\~EgV37d瓔a  EQEQEQEQEQEQ^[?jN8X.n((&EQEQEQEQEQEym>n{/ $yCot '{͐OsȮᖽ5u!vrC0203:4`e]jϏ)H^3Zծ;TUa0yNѧܷŊ?_o[CHf7LqwH0SvFH蒘'ѕ9xdH~6]5'ۈdškBd~vLi]"}6k]9[ 1E(2sHDpuq{tXӷ{5y]rX,NHK܊K&out,e`!kUvT ܃~>wӿ0j%YM)`ĵ{KAzl@`bbQ!snײ{d"b,.ٳ A{׬EtX36֤ߟwLsF3m)N"󵨮5r۱1iFqOl ,Dn9ӯG_A_T4((((((n999CwB6cɺEkQYYr {KG`bb{ 8,9YcZ4I$q2GH߳jv}$ (宅z"]jB(-aBN9z:{-bg1] 9d /↘&6ӧ/{rѶ?ȓR["U41xvS ɷ"yH㝢Ex ttK;Gб&,Ȃ%9.Žs7 g>fq|-9Cj;A&b ƫE497qNw=#(#&,#Jv̈ɗs^g ]g{`u&{1{ ElEC(`cT[JtfF>csS UkJ2|3n/Z,[\~6nLۃ,rvk흪MLq). 
tOs Jv;"1-7ӗjkR_蚃4Cxw>Vop8΄%ͱ'iG*;451G`U&nU4((((((q3#!4q9ޞ+R8pO)˲?ruRNݭXx~%#Y"~vmh)U0Qu<#;E1"<Q9)'&0 )0.Lso=ló"1]E@^#!=kNE0pt) P.wR'.YbeJ-#=!&jcqm9_,zrM@p2wribxdI*PEQEQEQEQEQEQl6, DxF\(|JѸLE3ooi/%Ɓb)zZI %b˸XrӛW_$1Mp`I ./]EgOi`_DVm&,{׈)gώڪ({7lŽB Ӣ6zn.qZѿOcmb-6$kҘ#2J4ibY;fĖޝ6r@ 嘒9\NۭynLd`eқލ$$]EBFm_0"g&6!K,}4.X-qk,p+W_);42+|upfcJÚ gբ"S]NRi.pY[1؉bvB)&cr}`qѠ:_-9b9Xgb\1}9j&0Akw{"y-}9c+b=:Wv%;ktex24'im F% mz4WsHSx+^7k:YSϙ;o3ʆ]ڰK7L%&cNvbZ K{84~13q}g͒(9||]Qф((((((kFn0 /5fs՘>tXӷb6^qCN[},5")nH\qyܔbeoפ~ D̀nkRMMڮf]y^I_$;8@4g$I 9j?H)9m2pɶɌMaJ;,0D`@iBt9"K|daA߈m",㞋<ͣ)dHy&L "95ʤQpLP#:^F:{vEp}xSPф((((((kvDebi iEM`]߹S{ ;%bpŵ ݽb9y;gr8.wQRpx<]L՝OV6bۉZom | >I ',ӊ.9!D]ر!0] ^#.Є)Pr0dR). 9,]  YetD``bns`D@Y8pK~/p0x\>&u $,8).ER7[%td` LS۪7r(&~RN5҄T\nk"|EU9'o8Rdpum{H$;dSu_{UY4݉FE((((((k=]׽7OXGy@s\\cؚK|gSv~K4O"baٽʾM'\wCiI%k἞U]Wc{b+T ) c#B+V>N!G qK{*@p60 F<髳FEުuݫmƜzҾnu& Il'+c2PQ&S_0ak  qQfq H*1~g#J|!FlWqt:6x״Qb3. N+ iQV)ee5??/}I{$6׽nHS"1uKo+ա# m MtwEusSe:!WV: EQEQEQEQEQEQהfƋ&N6qbf%@دҊA䋹4Xdef!v?/(EhgD5kW.*ˉ靅ܟ17S7l0ұp@s '·oЧ%wP; |"_:LMS,/rOHp[HPxսKgM=>"Pm1ha..n5 IDAT A ={R$8#gbñ v;0_g؜`WKIy<o1("Z^Bi$kMU\q`8ޥ6]b1Q?/]& #)[]ЙUvWq]v11=Nfߑ۾6hBQEQEQEQEQEQ5fG%־9'錔 ]SmJH'cJ}#cd ޝxX xut1g0 Y|P 闞!8Fv~$Z肻p2>р=T&˂Ajq]4M1kGA&&vd1P vz&G}Ln_,0љU[W(qfhSJ njX"@`S+q桉f"(4́cץ~t=߉#po"9^]-~ 1nd[JMdEͽsjT|Hp.\qtĔ&%JGoNa"SBqC`j3P,>]%=3K:otm&(hqCu4=aĵj7LESSߧ&BY3yTz x -cb/]WU>&t&b x͋gWl@E((((((9햓7ե,.YTK]cOKO#h>Ս`"+XЛ5N.p9ۗ'pQ‡ zлSt=#1FԂ{L0 iʘ= H.`SB+OidS[k8lg*\eTFÓ }Z5g "38NM8Q~nqga2@gD;v+V:SynlBùU#A_gm;ɢ #7kr(pпč]ZD|mqxp)~LL & w,]{"0Ϯ&Ps{m42 sMLv>ټ8ա+DCyw\t,_*ogbT`FQ;_؄WK,.p}_sQф((((GӟѣG)(r1:%A $ ^_ 8\.Z@52U RO9Kɷi-Vc>×rŔS/?@X7s؏7 r S+ P˱ 5@rC-V}r$#FjA.K}nOL['.a5,[@:lOSd#ˆ!KsS,!@} :~ ;+{q`ɳ&M Sv0I뒣Lmj@ K=+K!8" g#Ak7E_ P]& omSۭ`}BvBBNrLP=a!J![-a'k`X)Fx" :7:?V,ҙ*0R#EG?c) q{K,1.|^T7D(/b Kb &63'^4= ' GTGL5]$ExW-jJp(DH#S݃peiq{BNY6ئKBp%*@QN6 ,Hr]50-jc/H'IDA}׳ H"jBqNckLi&|v_"@T}q1_T4((((((lX,/~D|?G@T ۊ,w" [oJ2pHl)6ӫ/l..IsGLtfae>pXt.W]&:{` "!M@G1$0Rd,\Α)F ML1^Ϛ@gsČwoC4VvMJ= >ّ% Lw24y<] sx&vEYQOZtEؒLG#Gdyx nsR181;[Xp%^%DLg$CgV >nڵy m89.sFZ2E{NOO'=إM@޷YhSMJ Gssά,۾r[8p9|G(),ٽKsW*HL[LUC ^9".;yt =uhREQEQEQEQEQERb?O䃢((4(((((_!}?ӧ>9????Qۿ=S?~̓'O')8??LQEyl6 FMdK.ݰ,@WiIBK ?b:dg{grL$ۃuOdqѢ$ ̂,^V#EO!߯Ś{Dk1Gt@H9:tf kcZ̀IIR@lTp u>'$"8|؂N{haUشHl^? Ȯ8_=q1k|ܐ\p١ W0KVȱFJ9eϨkx~%kL' {gl%c03,W_3%%Jɮ35Cba1}Sd W9֤F̝&դr5}6$y׽=w)\Ӄ{>-q2;L֗WO5wuY;lV!euBE((((0#/>䓃_^^c?~C~~Ǐ~yy%O<yg+(|v _ӧ >]cEsL# kb`euPଅ ҽeN@" O="Ej.<5CQLE 1 1y=5ݰls\eT V@L`ex)/?)7ãGwǏ?]S4a>h]^^_3`EQ取vKo;L>Bi3`qAOKz@%b lqm!!=k'(p{t>G0tD+>mkʉ9O߅v⚫O#;\bN&j__<\@S[_[w/B9q#. \-XћU냗Dw.n# WMX3Е7g'TǁP$$$`d D(2"({?|qH N{ھeT]#򥢉:U|Q[} 6um= dDJ̯,KVF>DgHJ2,[sG؃ ;椉k{%!I02-,Kupu7gOXس⸳l;&&H)~/uiOU$SpǛ&EQEQEQEQK &/ ѣG|{x Oć~>b~?Ϛ`wgG?~|$PEQ\xON#}QtٙPD ٖPIăBku%*Rc5 ] oئ3{7D$ȡJ`bV~Sa 3A &VGv;wh^j?7ӧMDr{ެ#SzVtvq0þH.\qfqpOHS|3;r#فŠCfBKv!Gk΋/~JE8M0-Õ8s%8EĂ3,IYTRfAr{Ď[L#'&e;4Z$h((((0>'O|g/Ֆ(f2MEQuE&EQEQEQEQK00+S<(5[֗jKQEysn?t7~/lX=}%'{s( Qbs:4{bq)bAo|sdd4WS!ܾ`ib i"(eۜ:ՑϤggOޱ/pv QIpt,[\F!x"5$+.L@vDK,sfq2;p>|A{8qv.^eGQc%f1 vML}6'w8]X4/}xD6r$ZDb\g\ћ5,3i"Gtlbl<$ $sP3D SܻnSAIGb(x@$iå4qC8t̉~EQ7qY.Xk_~k\X bf&]pPL&$]@o(PlJɱw{c݊q jQS {ɅAt2`SwP,^DgO2[[b3*tOϲh٘g1Xq83ГEFlH6 YcKH);n{׫{-Zra@ `xԄ%tC1K3YXQRⅹ&0~Wɉ{WOD|ۺ{n3@ ؞L{XX6ӧެ;X<05b ҇Œ("KJ^D8g&hQ-b"`&!IUk+")c E+yL%mD >Amg % RzlXyLӕIUqѳ9%GzԕEo2*PEQEQEQEQ/0g.=y(n999Cw'83Rd)m` GwkE]kKdJ޴Sñ/ȷ% {­-!b;m.& ,${ 꾈[5JUp[41سVH'kA:,L6 M`[`.pK ܾ{XuYd(5AKS+,\lb{SiKp1]PS+s pPU]% ,l.cLi$Jhx5Do]J0+u ZR9":{r4VS\dwҔ\A~B"DmV  Bmь1]"?^,bLswi"ێsb825CbEZ1WTDŋ'@HSK.e"A|[sd@51Ooy|*3saCĎINv$$9w/ wVyQ#B*/`8LdQ EQEQEQEQEKql"'ny&EQ{n,K:S , S%Vv%v"0D4fE/+. 
[09}9}4XrA b\#`Ϙ)bvn{1ޜ1t,Z{"30 R#f F]sqz"NW1DNYG-ǰY >`-0CTl8XGϾ*451~K>Jlɧ#&+Xtfu )MnsDlGl\Ȁ}`m_TȐ=pMdbn}|xznU@g}).^eI(Ws# 'T4(((((_>Ǐb\ IDAT_ !>~_OO|{TEQ\jL?רDqH)lV1yŁeEgS15cRq=nE1Mtұ(Bs'ϔq,eqtt=lrN3 &D K -å2q3ﭐ&L)w1%N_a1U1l#Us&Ǒ?.oQ(׃fb@<;6aerc?]HL'C4(, E-N EPNOq-I]l򠟽;oEn_%~S>{yy/K&~5*PEyCI)1@OQEQ&EQEQEQEQK2 ?яOʓ'OOO. n>G'?ᗿ%8Ȏwvv9G(Il6ax-&ԓ&M_Y[r)'MQ1rںU q/o2Stv"dhCzμ5h1!sWg\:ޖoJkL|5`G$9臮3.}#!ML(nb{]*9g f$k:IF-xELSv`%Ч[$%dy@յbn6cOw0 m%"by H`jȻ=C0 7Mw{`bM*k]nb+^jRd :5es)a(Q)[##Å=.!Z~<%{JcV8SæEtfٜ4[m{^#CQ'f';9ӣ6 (~Ytެlv.7}yikKK,}0l 9gCxxhhq s6imtLirI-H/C`Z|m\rCĶ脛ތChExuԷJ$K8 7^8f“GΞޞt9Nb!S 051B$d@y^'&wEQDJ6a0YQ R AI;D-zfip mB%bKmSOo^PRX3X%t*?#rdEַZT"*tō;rXXh'Jlu/+p_$t)\3D([E4"vYۄL@~?F{%Á'~׹K)΢kqrM򦣢 EQEQEQEQE y_(xUxyyѝ/??>}(vm"Ǜ {l3L)#҂! t(Hd";#M,1?.dMفϭXLOk""rqt6"8{SN UDm0tX;NYCqGLQ~P #)x0IJϋApi_,'ydE,%W0>^K7bgs\$H GBHS+xWKd7/:r?[<0sL0홁 -b;&BSukOES"tE`Y,+ ¼Iŀ'rfcٽu3D{{ /EW_MH0ṑ [bk[v iB$a^$%6 kzY2 njQdk=w8enx6@h)!nmbpfhgb{p##]#GWló#lО3M9eI!-6*^|+'Qx xtfņ]N/䈜ޝd׋ S/Dl{T([6ru 'w ϊBAsjq@1NǝDYY.g÷yK2e1Ea}?M1]qG,wXGO=Cb.uDY1zn]ab[wMFE((((((_6k'ئ焔m Ykja3hbb(x. >\s6+fyDO`wbU.d[~rt0UwhDd= cB;YJf3#Z$v~-?q[|s5ZPQXq 6NG9z@ mAc}62NϲO6Dam"'nm&u^L‡1$b(;K:7{!p`/}6 Lh'n-va+ٮ%e#DġŨ^5WkEsYwIBU2GYc '("ᔒɬ홲ٖ !Ks[39ƯX˱}uGЙCPNds1'ҟm#[DQXe1jWOH=a`n.?Zqф888888zx<¿ YK|_ct$ms樂gG= EvX]6nx$9#GlC Q]Df%R,mRE]x^'ǘ_ HU,RǢ>]L`" $$"h"NCMwVaW^]VVY)&zG/pE}^&zy֬wZLBIH"ȌY h@DOsD=ty.-/U,`uk$#2"~bZ终ڼdb/cws樴F$X!B D?أ\E+v%dvqI?q닭k#DwaZ|v-s)uSSL#"a (39.w0#&'z\ T6V0"ghTXz䑃=dyR ÕD(a':WJ!ǹE_.pqqqqq1|8g k(!~^?Vx]˕@-2N|MimQ->F(2C-V&2lE[At}mk: F= nC3sb 㮟НpdgL>DYOd=Eb#DZ]$6[yޟ!vQ, ':&Βyi5JbMhB-wB.W6QDdss$c>`#qyZ7EUV+/~O?#C8ql<#̤s@^(!ː.WV\0#U.SD# aB"AS8 Di}Ϻ$:I̚Ϥ8b@3f&9r<~S8 sZh"|PAPH23ig䵰~,Q(?﹔-eC$Q&6=@ 4\n8#Jh34,!^#_䃎888888ο2{x88_4888888bf\W/ݕ?,p[;~V)|$qº[/zA Lrbpz} }E,1I(*NF=)؈(:Eqjv~;XoD (\'C%9NVc>vb"wN+y8j+w.-bc陋98ɦIaCGwiXZIJ=uwsLv1-֢Y9]1QVEpcUŦ§zՇV] Ĥd[9;Ckm.h }o~69oxD.-C r?h7ˈ`5d'fRlsfVw/M%jkw3+7{7R=,3C(I6s6_Nçm/bbI1qD\$"W1m#k/F1+v\uykԏoޯ5Ep[TԵʅ,$ U~ˉ׎&qqqqqqP3<#כaC,R V"QHhʭx2Hl~mUk~~M@Uk? ({EfeZ|Ͷ8S2"Drfmb˫ j1f4"zTڟ`5.!m$.c`]Ѹd#JEdS l6abG+1$Q,!0GN5l"}G E'V]t.蒅.nX=D:@ʭVgeez4>B`SDk9 1KdW-Bh%E&zTK}o{7Xb*ړT ah@4y;am3sJ9k(ܽ]@dLTbQ5%-"JYu)|TL/w=Cd:G./PcA2[2J@&iko-C@&tą̗&qqqqqqPJ)n7y!N!k3Ol"kW YTj|/䁭}DR%be_esjC IDATb, f$z$QA#p' b?L0"m@=q;GhdY-߹qSN2ہEQQd.U. 
Clы\/ɲRBr*8SHHDba G dS-Λۉ}5%1x$Upw4'чb6Ɠ-ڡy{;dpVw".eۤ $+qLm ko5]0х{7l~kX$aRƳ6DMԲ9m3 Ŷ9B@i E +m-2dQz/2UT[~yc&t&܄7YjSֻJ"*g > 6x2wA$GPE_܋ozUN4czXc2G,\4888888s\mbAU&P 3 K𭸿gK=An^{A#I›xomM4w=?^w$U] +)l)OEÑNūln 5:" QĬgqnVg "foԢm&evddbv;_?(넠̥l4B4Q؟+m[wg=ǝtFSeV%HhX,X-S^ܷTR8u=f;2Csو]KDhUt0U0(70֊NYɲTQT)ý% aA\;E4gcs # #6uHaMj1NNѦ&qjK{֞$&imhsk'5*?SEƼ)]Fdˬ\3m G@Nk~?"WrI)<80Tޏ1@}fLrNʍ,Yn5]̰~<2s _I[̆8֠I$&2ŝ{hq{l,-{U.gs'{׈&qqqqqqp_hBM@f/r̲+՞(P,pHb#bpY%kqbGzbr '{B@jPtApra850՛D; .vWX,alp'3u\P}<9tTmgw.;'h5fe0J VIqn?w U؋@]߹9漑i:OV4u"X>֡=ne8lztU>P5jD{ڳ$77 Ap'\%~̟$q[~|5Qm ~-s'agw5/k1/2z}|ffˑD$~OQGž9Kߙ'~RnoKE888888H~{ZF%9^{uEtEV.V +UjA,dVczAwх E ֡L}=%\7"@x($(+5Bi␭wq5ΣX-!R/Z,F)5V# Ƒȣ~Cl!]["|8}q}1CdWsp#I0j29Hj[ `F.pkV^=z{Bi&H=a®~`e[~}God2;ț.F 0z_Z, FN܋.֦ (BSl$鉨MQlŬ`V(p[?úMNǔP})ϛZX1|$dsL$I٭beia=JRd"64V&7?wח&qqqqqqM<<<ݸc)PV "$&2'gl8Y ܝN6`wX1r-J 6g*n 0~$' M>,(D $=$qxϑQd5bu0JlnhDtDmքԶ&rjƃH69SfV[rQ ;7rWsH3*~NeW0Oĺ^onwfPt'B &9sJ-p/Ps[]41FLDL@b5seQx!ل{Ǿ(9삉:9}\]c/<DL e'c錭>['̬3>ڕl rԚϬ"qkC經{~RΨ( T2K9s1[ ͬxaojI.b: e2;}&JvNv}s4\4888888U4q<QksABҭ$`USg \47lڦq/ zl2Ym IN OSpcZpӯ/X *cVHr}4UF R6aíń(FV\Rl1wqb{]E~&5je7d*n?1 )CtFk}Qs(lNoQě%.(?[߳&}m6ZUdRgŽ*McC?#ZMĝ G)fO~}K?ٶ)d33,lG{} !aIbZNț-#*yQg$/G@$2ɉ˱@&Ӆ4'N!0u[bGѣ_] [A^EqMu͒YwP qqqqqq<ϣxKra e%P 뽔*Ch3'&Hl{z!jX Vl{eڗjWm?rFwI6d/mu6ڊWj?9ZF7j|!lRN P̻.Rlb6"2Vf6ElD3ڗ^Xh@{т#2uf%f[Z͌"eLokSTx- EoA[Fmk."$"*2(KpmkS#X6ȽEZډ&}ZE*5PB OZy~X+}*ZO7-o *b5F]|ks3w3+(hhy,{aP9=q?|{Lй_ʬf!-%[4iKQ~*r"G-Ϸxf\4888888QJv1/.r$  ᬰ/6N~L⨁I{bf,KX%c/@$Z !BbяlaN_ 'VVOI$|@ 8 B-‹l0X, (OǓ@:{I.fYK}\ w\JP{B~j" |i&,X66MIX /kٌ jTэ vxY4" /9EIZ6L;Ho;Vfޜ>fUhs݀ͅB.Bh0+0>Tf(hV εd{_\ ܸJubly#VN,M,th m) !6Ц,YX7G Nvɦ翊9;|_/.pqqqqqq^q\wXd$ 8E3[[W'B=~b&QDI="qu #EHvyH=DЊXXPƩ:/ cLf,Br/GD M#Ha51E}D:+{Mp+2\CmMZ{ [khp?g(&M-LMY,r: +;%::8 =C9b4G}9D_ ->%Fx*>xu0ewlJ =Xibm5/UDM)Q;\Ye',X\?E TN!䑯//p3Hq!B&4OYIz%sx'**p"ZbMޱp#1PxADҺNQ_Z4ZsȲ9d3Vw ɤ"g7#r*% qqqqqqg]Wu%6c82sAu|Vv%03T jPJ[M,8NeϤf_OJSKݻQ"DĶhd%bƓ"Eyk܊ GLQQP3iPB: \V|'׳h.w{5*Bz~jF s0jB[sV>ZtVa) /EsEML`QR;#W,;Q Hc/}Eᢾ%xm57);7B L^jBL 3%bsm{ws,¶Aò"MVY12]Yʭ#RJS;@gE{LrXhG |Y,MQP Wqf=duDoD $|g@F\WPEDP'~%ܚh⒫yA*)MH[ "END&&fIK888888p>q{{+?8D}^YwNy8Mv Di1 &RRXx -VhLXMjN‰*Y l1{q˸*^>KuF X-l%P,#Ȑ#cQ0rVnQhͶ? ʘ c^,-f'+W{ ޫԸ …b+y?xJ邧#lք8)bB&j!Ja5֓*ȹd\Y֣oLha쵾f@mߣv\4888888Y. 
Ƚ^ܖwSA6GkElky-d*#pܮ-Gq8o'QTOX̬ :Y(G 7-r& ܉B*![}?PrQ (;U׊E nWxpEBsȁ:Cl",&M !k$sAN+7BstQo"69K܊Pu!ݝct1Fo?~f˪fd(xGPvA c?!x|xZ6 +_H5QUĊeADh@0f8|Kcf[EXS~k{ƾ2Lr}{=V;~l:Y)VfO|78Zmz0,3bu;f9)df !&6 9o%Hl␽h w}b҄E888888Οrp<Q՟_^ע xjdεGKd+Zð;Wܝ.JEۗm^UQƝ.U$<1'wRǭe(rgQj#c/@b$1l}n9p+ EGDJՑcZǂ6O7gNI mm5rYj8P/VX!`m(jX?w`b Q^,Yb ڥHkl6}ml)(A8RJF1Tj<7Hgg35(l\LƜ  Gғ* c$>|[XԱFBWrMn\E=UP0V(%?՟d' ^_mIE,\O㚧ʻ֟06'(1!UC۾Q#m.ER.ѿ `(MhqqqqqqYzL/E6C[x{AeArˬX;S]41v^P#qw*>h;}~iB$6G-v7_A$N!p[{_=\ߔVO;Dj #4Dj'E 3AkAm&b[~=157c\8W0/H+KSLyqHUk$aQy%tjFYR!>pٹJWK:e0105͚[s<-u?-­twM&h+5>A3(sɯ-ƣ pVbI\@vx_*B5%MYMY=R#lY9܉tQI~(Uxpg7Ǟm>]c,LDWYh;3쑯쑃RuwPl'~tF%b;r!^B~`U94fCLڊM%3k᠅:(2!W?C2Ilvgޚxks:8888888_d1quN888888bf\Wÿ}fhc{^XIzdK;aq~+:kjJvr\MnGP?w/w'ѓR-D5|Pq Cs ػF+яq}w,Hj́Fq]N BoP#x`.wj@uhƼߋ !M!slµ(fzNߣT W}>P81<ątcn#1drQ.K׺tM|w ӄ5:F}\?t-0H*ZP?3:/=A}5؛6!Gc1V6lH$KlmTQJ~xZN_"9o;M,噳86^)LfT\suf0J"2q;#u<*5W9įZΔr< ۻ(Vҟ= ^%NL`MQ&D J1kQ8.,ȾHM8888888y}fdc)(51{a{)Zz K剋|f |@.0Qp`V^,VdS&L5>pV#NR-c7aDIk; h0.(-7q/EA6 P =%*JD1V̄l"ԾOEV䢔5YP-#ifleb(j I}C ^ާP^k>1&9h!|}83_=~O^'>=Z(Qbx_>5FecQ` ±9`v=jcV )=#km,EL\t٭H]P :3ܢ=r/94ZdVrf+}֏oF{}Ό=S@Vu b٬3Af̜im‰dTqR=xAwEXI;,u)OȯM8888888nRH*L zOj݆ F&ca&XuRXwgUlEl-NdvX)8io VDhnbQd~g{MxBf5f-L0y{h˪ε־sB hmؤWPԘNH!#B|?&Dֶ5!&FitU4QLCx!ADES眽^kcι.`Tjs{ޞ:3g;e2R 9hiرg wd'R,o^S0:}кd.Ԝ &\!\G(ךΕbh`´bEwTKh;]^gȂQӌw+|;3JNɨ(WSZ/Z;qdRxG :]mxŎ$܈B 8$@!G&"#9VdِŒbN`.6'uy+Q,) 35Vŋ,h)bm6xv^lP:.CN¹%~MM$D"H$D"H$D"H$~`)హM̤D%èA:N56kbݠ8::[gV/m`(cn4+AU1kU@r=aF1"A0c6&ZD u&h7c  ޙ!۰9nؗ!E*\Τ0.E$5:2u0um_7iB,|(4BMXB(0gE񋫜{81x T d`An`mǚYVDsr u+)e&E` )u&՞^-]#/ XZaBQT0^EeVLPZ ]2RJD”!^q[޼fSdf1&̍7>yx\q%/aΝ.%Hݻヵ5%.%H$s>pw̻A/?k撉D"H$ġc26w]Tf08UH$$8@ʲ/wGd2x{sCs7s? G4_N:yH$r W^yHʥ/i]>b5"vSxJftRWwYb,)t%up!g%yd׼wGtej-qX 1Z4Dmu\3fRplɞH&6;G(R;e愩V83pOϜa_2PS&לZj*f\qX7ùNdN4G3I4q|O뮻:N$t:kwLj]v?dY/N$#k-]tQ:J?^r '̻D"H$D`4ͻ 8:dH(zQNsuoژ Hmoe"%fU[0 Xs%˩CB[+ˆuθQ!SrA-6Ԏݍ0ױan~{6M=QL#61M#0֙`¸4 N @&EꜵYlQF 4l׼  6n24^ ĸ8Fe:ٶ x얓0c,/c$iMU츇l}4^gqm u{aJgp| siM p+й<MԆ}e>[ٯN)߮VR9kDXu(zAS~nyd,m L<a0R0ʶ1@11-s)JJ&w7)d+K u?kPpZ {K-V8Rc SyΜ0012ef}a jeTQII%pN(M c$8?썝q\xs(q f3~~|K93YZZceıu]d2鍽/ Oxœ*Jr 7pC={馛83XU"H<6|3Fos~TQ@[U.2~wwU%D"H$Ga;F? yFZQߔXXr3 P`0"8U*I1LD 0WGKDW:Y0P2nu4?-(pMCݨ9ah cd馾w?LQP GN+p=т?~[Sܮ+tK+:0( ePAek<~P-_O&>:"KlO$>]>Gwe""kLQP @67 1#̈})-('TyMnL08/ V¼Ķvׁ;kTzC׵KBDDw[ ޱaBkS*?8E(08*U -{vaFPh (A8p eD&:tµ20 eq耬4F̠Rc3%Wj ċ=f>f=*)5'QD&F;xJ J NL V>WkrV"lDUcue|4D? 17Fe\xᅜvis,q Cz׻XU"qrwo~7=3HY;8nVf_2??̶mXY"H<:>>N971TU@8җ?fl׮]+K$D"H$}hNH hAjU%3o"&zJq:BF⛵q%{5HeJFFNFAfqdR`0Amiv[lGX2,JMfC<W hb5rQr5 qq{lv8/np mЊ-T3eD|]968@*MdA[1E<`B#Nhu+Zh'ΠRd6EU疂;Gj _o+pھkGt.I# 96 G\8Gq^qhll:AxA_V9eUx6 q: 1'wfu 8ra |>5tqdF#/Ta^3r2q󑑷N*bʍJeרg\ム 5#'' 3G`XWqF%S5*ڲ\9(߯F^,1SwL  HJ\s3Q"b(v#4|nn˟Ğ*&>9կ=$x񲗽*ӟ4/y39ck-vEYg5Ces9|lƬ޽׼5ωD"Xr饗{ߝ33- IDAT,//7c{G?lH$D"H$ 96RUT'@F$4Szטl&b-яr,f'0Tjܨ +`6Cꀭrٰh3r2-: tqELEFSŪ6M<#6b E\7<6u*QjHBxB}.~-"?3^6ƭt5:xZ4sTI_cٞȶA=/17v?2\SCt٨(dxbq(]O>qU&yNwU*Ž+j.Hpn(.Lcx)קDqd=K#tPޡBFaè鉙߸"VːqIgL 5q(Nm!Q1G%湁0a_㨤nX9?p#8GNW4&01OtG;I4g7vI'qyͩ"" oxeY./483,55F(n<"ƷDKHG(+bM.TonልV lYOTĸ h]&Z0F$6mVǦ~ Xon<qL:ٴب"a"F~}3/q_qB&ka9$QAaRyt̾m,yp_>hMDc7$ Hx> 2dD꼦U;̚c[l-^lU4bxLiS.@ !c]d m"ӞAE ?&c{b_Mbjo~7hN% '|2wÕW^9c|{|{c^{-tN%%\۶mo}[TQW<9*J$D"H$ߤH"H$6ҲGK__{c??s9Ux,y/r-4cG8s8XY"qڵ Zp9`L=^ uY\uUd2e/{+K$/덽mocs(h.;6H$D"Hh.ۧ%hXd *]1IoktaC1T]oKMmJmC 22AGk}ͳ#9rmiW 4+CDXɟ"-6DJYpJ]T6cx !"q >m!gc:j,^7kt񹮳B*#Fh0J~UuAp8&qF#F{3sB6zt0D x1+dy6M̬ol<7mIwΣHבBUp?K4>$\6ņuBH2ADbtJra` LAppr'!F 33D6 M\3WޅһLw3#2A_X|8$_aɤ?LڨDFMpuDjW2UՊEXBJaZ KEg(~EBNL28Hz){zcKKK( s.B4geY/[%7;{c?c? '0vO}S[o~<IOz+K$o}[|Sꍝy晼SENJSO=__+hƾoO}s=w%D"H$/2q.' 